15-859(B) MACHINE LEARNING THEORY

Instructor: Avrim Blum
Time: TR 3:00-4:20
Place: We've moved to Wean 4623
Course description: This course will focus on theoretical aspects of machine learning. We will examine questions such as: What kinds of guarantees can one prove about learning algorithms? What are good algorithms for achieving certain types of goals? Can we devise models that are both amenable to mathematical analysis and make sense empirically? What can we say about the inherent ease or difficulty of learning problems? Addressing these questions will require pulling in notions and ideas from statistics, complexity theory, cryptography, game theory, and empirical machine learning research.

Text: An Introduction to Computational Learning Theory by Michael Kearns and Umesh Vazirani, plus papers and notes for topics not in the book.

Office hours: Just stop by my office or make an appointment.


The graded finals are now available in Dorothy's office (Wean 4116). Feel free to stop by and pick yours up.

Handouts

Lecture Notes

  1. 01/15: Introduction, basic definitions, the consistency model.
  2. 01/17: The mistake-bound model, its relation to the consistency model, and the halving and Standard Optimal algorithms.
  3. 01/22: The weighted majority algorithm & applications. (A short sketch appears after this list.)
  4. 01/24: The Perceptron and Winnow algorithms. (A Perceptron sketch appears after this list.)
  5. 01/29: Intro to PAC model, Occam's razor.
  6. 01/31: Chernoff/Hoeffding bounds, MB->PAC. See also this handout on probabilistic inequalities. (The basic bound is restated after this list.)
  7. 02/05: Finish MB->PAC; intro to VC-dimension. Slides and Notes.
  8. 02/07: Today we will go to John Langford's thesis defense in NSH 3305.
  9. 02/12: VC-dimension: the proofs. Slides and Notes.
  10. 02/14: Boosting I: weak vs strong learning, Schapire's original method.
  11. 02/19: Boosting II: AdaBoost.
  12. 02/21: Learning from random classification noise. The Statistical Query model.
  13. 02/26: Learning with noise and SQ model contd. Begin Fourier analysis. Notes and Slides.
  14. 02/28: Characterizing SQ learnability with Fourier analysis, contd (see notes from last time). One more SQ algorithm.
  15. 03/05: Membership queries: some basic algorithms (monotone DNF, functions of log(n) relevant variables); the KM/GL algorithm for finding large Fourier coefficients.
  16. 03/12: Membership queries II: finish KM/GL. Bshouty's algorithm for learning XOR of terms (and decision trees).
  17. 03/14: Finish Bshouty's algorithm. Cryptographic hardness results for learning.
  18. 03/19: Finish crypto hardness. Angluin's algorithm for learning finite state automata. Notes, slides, more notes.
  19. 03/21: Learning finite state environments without a reset. Notes, and slides.
  20. 03/26: Maximum likelihood, EM, and hidden Markov models. Notes, slides, and more slides.
  21. 03/28: Strong-learning DNF over the uniform distribution [presentation by Shuchi and Nikhil]
  22. 04/09: Learning monotone boolean functions. Notes, and a paper.
  23. 04/11: Kedar Dhamdhere: random projection
  24. 04/16: Umut Acar: boosting and hard-core set construction.
  25. 04/18: Maximum entropy learning, connection to maximum likelihood and Winnow.
  26. 04/23: John Langford: PAC-Bayes bounds
  27. 04/25: Allison Bruce and Benoit Hudson: Q-learning
  28. 04/30: Si Luo / Rong Jin
  29. 05/02: TBD
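
Illustrative Sketches

A minimal sketch of the deterministic weighted majority algorithm from lecture 3, assuming n experts making 0/1 predictions. The function name, data format, and the penalty parameter beta are illustrative choices, not taken from the lecture notes.

    # Weighted majority: keep one weight per expert, predict with the
    # weighted vote, and multiply each wrong expert's weight by beta.
    def weighted_majority(expert_preds, outcomes, beta=0.5):
        """expert_preds[t][i] is expert i's 0/1 prediction in round t;
        outcomes[t] is the correct 0/1 label. Returns #mistakes made."""
        n = len(expert_preds[0])
        weights = [1.0] * n
        mistakes = 0
        for preds, y in zip(expert_preds, outcomes):
            vote1 = sum(w for w, p in zip(weights, preds) if p == 1)
            vote0 = sum(w for w, p in zip(weights, preds) if p == 0)
            if (1 if vote1 >= vote0 else 0) != y:
                mistakes += 1
            # Penalize the experts that were wrong this round.
            weights = [w * beta if p != y else w
                       for w, p in zip(weights, preds)]
        return mistakes

With beta = 1/2, the standard analysis bounds the algorithm's mistakes by about 2.41(m + lg n), where m is the number of mistakes of the best expert in hindsight.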
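
A similarly minimal sketch of the Perceptron algorithm from lecture 4, for labels in {-1, +1}; the function name, the fixed number of passes, and the data format are hypothetical.

    # Perceptron: start at the zero vector; on each mistake (an example
    # with nonpositive margin), update w <- w + y*x.
    def perceptron(examples, passes=10):
        """examples: list of (x, y) pairs with x a list of floats and
        y in {-1, +1}. Returns the learned weight vector."""
        d = len(examples[0][0])
        w = [0.0] * d
        for _ in range(passes):
            for x, y in examples:
                if y * sum(wi * xi for wi, xi in zip(w, x)) <= 0:
                    w = [wi + y * xi for wi, xi in zip(w, x)]
        return w

The classic guarantee: if some unit-length weight vector separates the data with margin gamma and every example has norm at most R, the Perceptron makes at most (R/gamma)^2 mistakes in total.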
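
For lecture 6, here is the form of the Hoeffding bound typically used in PAC-style arguments (a standard statement, written out for reference rather than copied from the handout). For n independent flips of a coin of bias p, with \hat{p} the empirical fraction of heads:

    \Pr\big[\, |\hat{p} - p| \ge \epsilon \,\big] \;\le\; 2e^{-2n\epsilon^2}

So n >= ln(2/delta) / (2 epsilon^2) samples suffice to estimate p to within epsilon with probability at least 1 - delta.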

Additional Readings

(Related papers that go into more detail on topics discussed in class)