Instructor: Avrim Blum
Time: TR 1:30-2:50
Place: Wean 5409
Course description: This course will focus on theoretical aspects of machine learning. We will examine questions such as: What kinds of guarantees can one prove about learning algorithms? What are good algorithms for achieving certain types of goals? Can we devise models that are both amenable to mathematical analysis and make sense empirically? What can we say about the inherent ease or difficulty of learning problems? Addressing these questions will require pulling in notions and ideas from statistics, complexity theory, information theory, cryptography, game theory, and empirical machine learning research.

Text: An Introduction to Computational Learning Theory by Michael Kearns and Umesh Vazirani, plus papers and notes for topics not in the book.

Office hours: Just stop by my office or make an appointment.

Note: This is the 2004 version of the Machine Learning Theory course; a 2007 version is also available.


Lecture Notes

  1. 01/13: Introduction, basic definitions, the consistency model.
  2. 01/15: The mistake-bound model, its relation to the consistency model, and the halving and Standard Optimal algorithms.
  3. 01/20: The weighted majority algorithm (combining "expert" advice) & applications.
  4. 01/22: The Perceptron and Winnow algorithms.
  5. 01/27: The PAC model: basic results and Occam's razor.
  6. 01/29: Uniform convergence, Chernoff/Hoeffding bounds, MB => PAC, greedy set-cover. Handout on tail inequalities.
  7. 02/03: VC-dimension. Proof of Sauer's lemma. slides and notes.
  8. 02/05: VC-dimension, continued. Proof of the main theorem.
  9. 02/10: Boosting I: weak vs strong learning, basic issues, Schapire's original algorithm.
  10. 02/12: Boosting II: Adaboost; using boosting to prove Hoeffding bounds.
  11. 02/17: Learning from noisy data. SQ model intro.
  12. 02/19: SQ => PAC + noise. SQ-dimension. Begin hardness results for SQ algorithms.
  13. 02/24: Characterizing SQ learnability with Fourier analysis, continued. (Same notes as last time.)
  14. 02/26: Weak-learning of monotone functions.
  15. 03/02: Membership queries I: basic algorithms (monotone DNF, functions of log n relevant variables); begin the KM algorithm for finding large Fourier coefficients.
  16. 03/04: Membership queries II: finish the KM algorithm. Fourier spectrum of decision trees and DNF.
  17. 03/16: Membership queries III: Bshouty's algorithm for decision trees and XOR-of-terms. Cryptographic lower bounds.
  18. 03/18: Angluin's algorithm for learning DFAs (ppt). More notes.
  19. 03/23: Learning finite-state environments without a reset. Also slides.
  20. 03/25: MDPs and reinforcement learning.
  21. 03/30: More on MDPs. (short class today)
  22. 04/01: Kalai-Vempala algorithm for online optimization.
  23. 04/06: Maxent and maximum-likelihood exponential models. Connection to Winnow.
  24. 04/08: Sham Kakade guest lecture: Deterministic Calibration and Nash Equilibrium
  25. 04/13: Linear separators, Margins, & Kernels.
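To give a flavor of the algorithms covered early in the course, here is a minimal sketch of the deterministic weighted majority algorithm from lecture 3 (combining "expert" advice). It assumes binary {0,1} predictions and a weight-penalty parameter beta in [0,1); the function and variable names are illustrative, not taken from the course notes.

```python
def weighted_majority(expert_preds, outcomes, beta=0.5):
    """Predict by weighted vote over experts; multiply the weight of
    each expert that errs by beta. Returns the algorithm's mistake count.

    expert_preds: list of rounds, each a list of n binary predictions.
    outcomes: the true binary outcome for each round.
    """
    n = len(expert_preds[0])      # number of experts
    weights = [1.0] * n           # all experts start with weight 1
    mistakes = 0
    for preds, outcome in zip(expert_preds, outcomes):
        # weighted vote for each label (ties broken toward 1)
        vote_one = sum(w for w, p in zip(weights, preds) if p == 1)
        vote_zero = sum(w for w, p in zip(weights, preds) if p == 0)
        guess = 1 if vote_one >= vote_zero else 0
        if guess != outcome:
            mistakes += 1
        # penalize every expert that predicted incorrectly
        weights = [w * beta if p != outcome else w
                   for w, p in zip(weights, preds)]
    return mistakes
```

With beta = 1/2, the standard analysis bounds the algorithm's mistakes by roughly 2.41(m + log2 n), where m is the number of mistakes made by the best single expert, which is the kind of guarantee this model is after.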

Additional Readings

(Related papers that go into more detail on topics discussed in class)

Online Learning

PAC model

Boosting

Fourier analysis, weak learning, SQ learning

Web sites on maxent