10-715 Fall 2019: Advanced Introduction to Machine Learning

The rapid improvement of sensing techniques and processor speeds, together with the availability of inexpensive massive digital storage, has led to a growing demand for systems that can automatically comprehend and mine massive, complex data from diverse sources. Machine learning is becoming the primary mechanism by which information is extracted from Big Data, and a primary pillar on which Artificial Intelligence is built. This course is designed for Ph.D. students whose primary field of study is machine learning, or who intend to make machine learning methodological research a main focus of their thesis. It will give students a thorough grounding in the algorithms, mathematics, theory, and insights needed to do in-depth research and applications in machine learning. The topics of this course will in part parallel those covered in the general graduate machine learning course (10-701), but with a greater emphasis on depth in theory and algorithms. The course will also include additional advanced topics such as fairness in machine learning.

Time and location: The class meets Monday, Wednesday, and Friday, 10:30am to 11:50am, in MM A14. The Friday slot will usually be used for recitations, but will also occasionally be used for make-up classes (due to instructor travel on Mondays/Wednesdays).

Units: 12

Instructor: Nihar Shah

Syllabus: The topics covered will be similar to those covered last year.

Textbook: [SB] Understanding Machine Learning: From Theory to Algorithms by Shai Shalev-Shwartz and Shai Ben-David (available online)

Course staff and contact details:
Nihar Shah: nihars at cs dot cmu dot edu      Office hours by appointment (GHC 8211). To set up a meeting, please send Nihar an email with your availability, as well as the topics you would like to discuss (e.g., specific lectures or project content).

Charvi Rastogi: crastogi at andrew.cmu.edu      Office hours Friday 2-3 pm in the common area outside GHC 8011
Leqi Liu: leqil at andrew.cmu.edu      Office hours Tuesday 2-3pm in the common area outside GHC 8011
Juyong Kim: juyongk at cs.cmu.edu      Office hours Thursday 3-4pm in the common area outside GHC 8011

Tentative schedule (subject to change) and references: Note that the lectures will be taught on the board, and hence there are no "slides".
Date | Topic | References
Aug 26 | Introduction, logistics, terminology, basics | Slides from past offering (ignore "logistics"), SB chapter 2
Aug 28 | Perceptrons: Hope, hopelessness, and hope again | SB chapter 9
Aug 30 | Optimization for ML | Notes
Sept 2 | No class (Labor Day)
Sept 4 | Class canceled due to flooding of the classroom | No, this is not a prank
Sept 9 | Support vector machines | SB chapter 15
Sept 11 | Kernel methods I | SB chapter 16
Sept 13 | Recitation: Optimization
Sept 16 | Kernel methods II | SB chapter 16
Sept 18 | Decision trees, random forests, cross-validation, bagging, bootstrapping, interpretability, explainability | SB chapter 18, Paper
Sept 20 | Recitation: Tail bounds
Sept 23 | (continued) Decision trees, random forests, cross-validation, bagging, bootstrapping, interpretability, explainability; Learning theory I | SB Chapters 2 - 5
Sept 25 | Learning theory II | SB Chapters 2 - 6
Sept 30 | Learning theory III | SB Chapters 2 - 6
Oct 2 | Learning theory IV | SB Chapters 2 - 6
Oct 7 | No class | There was a make-up class on Aug 30
Oct 9 | Midterm (in class) | All material covered until Oct 4
Oct 14 | Learning theory V, Interpolation regime | SB Chapters 6 - 7, Paper
Oct 16 | Neural networks I | SB Chapter 20
Oct 21 | Neural networks II | SB Chapter 20, Paper
Oct 23 | Unsupervised learning: Clustering | SB Chapter 22
Oct 28 | Guest lecture: Yuanzhi Li, "Separation between Neural networks and Kernels"
Nov 1 | Guest lecture: Andrej Risteski, "Better understanding of modern paradigms in generative models"
Nov 4 | Dimensionality reduction | SB Chapter 23
Nov 6 | Boosting | SB Chapter 10
Nov 8 | Recitation: MLE and MAP
Nov 11 | Online learning | SB Chapter 21
Nov 13 | Semi-supervised learning, Active learning, Multi-armed bandits | Transductive SVM, Active learning, Multi-armed bandits, Ranking via MABs
Nov 15 | Recitation: Linear regression
Nov 18 | Reinforcement learning | Survey
Nov 20 | Graphical models | Graphical models
Nov 22 | Recitation: Rademacher Complexity
Nov 25 | Fairness | Hiring example, Paper 1, Paper 2, In peer review
Nov 27 | No class (Thanksgiving)
Dec 2 | Project presentations (in class)
Dec 4 | Project presentations (in class)

Evaluation: Major homeworks (30%), mini homeworks (10%), midterm (25%), final project (25%), class participation (10%).
To audit the course, a student must complete all of the above except the final project and earn a passing grade.
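As a minimal illustration of how the weights above combine, here is a hypothetical sketch (the component names and the 0-100 scoring scale are assumptions; only the weights come from the evaluation scheme above):

```python
# Hypothetical sketch of the course grade weighting.
# Assumes each component is scored on a 0-100 scale; only the
# weights themselves are taken from the evaluation scheme above.
WEIGHTS = {
    "major_homeworks": 0.30,
    "mini_homeworks": 0.10,
    "midterm": 0.25,
    "final_project": 0.25,
    "participation": 0.10,
}

def final_grade(scores):
    """Weighted average of component scores (each out of 100)."""
    return sum(WEIGHTS[k] * scores[k] for k in WEIGHTS)
```

Since the weights sum to 1, full marks on every component yields a final grade of 100.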

Project: One of the course requirements is to do a project in a team of 3 or 4. Your class project is an opportunity for you to explore an interesting machine learning problem of your choice, whether empirically or theoretically. You may conduct an experiment (pick some datasets, and apply an existing or new machine learning algorithm) or work on a theoretical question in machine learning. The timeline for projects is as follows:
Sept 26, 5pm | Form teams
Oct 15, 5pm | Proposal
Dec 2, 10am | Final report
Dec 2 and 4 (last two lectures; in class) | Presentations

Homeworks: Homeworks will be released on Diderot.

Collaboration policy for homeworks