Machine Learning, 10-701 and 15-781
Tom M. Mitchell & Andrew W. Moore
School of Computer Science, Carnegie Mellon University
Fall 2002
It is hard to imagine anything more fascinating than
automated systems that improve their own
performance. The study of learning from data is
commercially and scientifically important. This
course is designed to give a graduate-level student a
thorough grounding in the methodologies,
technologies, mathematics and algorithms currently
needed by people who do research in learning and data
mining or who may need to apply learning or data
mining techniques to a target problem.
The topics of the course draw from classical
statistics, from machine learning, from data mining,
from Bayesian statistics and from statistical
algorithmics.
Students entering the class with a pre-existing
working knowledge of probability, statistics and
algorithms will be at an advantage, but the class has
been designed so that anyone with a strong numerate
background can catch up and fully participate.
Class lectures: Tues & Thurs 10:30-11:50, Wean Hall 7500
Optional recitation section: Mondays 5:00-6:30, beginning Sept 23, Newell-Simon Hall 1305
Instructors:
Teaching Assistants:
Textbook:
Course Website (this page):
- http://www.cs.cmu.edu/afs/cs.cmu.edu/project/theo-20/www/mlc/index.html
Grading:
- Final grades will be based on midterm (25%), six homeworks (35%), and final exam (40%)
Policy on late homework:
- Homework is worth full credit at the beginning of class on the due date.
- It is worth half credit for the next 48 hours.
- It is worth zero credit after that.
- You must turn in at least 4 of the 6 assignments, even if for zero credit, in order to pass the course.
- Free exemption: We will ignore your lowest homework grade for the semester.
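As a hypothetical illustration (not official course code), the grading weights, the drop-lowest exemption, and the late-credit schedule above could be sketched in Python; the function names and the 0-100 score scale are assumptions:

```python
# Hypothetical sketch of the grading rules stated above; names
# and the 0-100 scale are assumptions, not official course code.

def late_credit(hours_late):
    """Full credit at the start of class on the due date,
    half credit for the next 48 hours, zero after that."""
    if hours_late <= 0:
        return 1.0
    if hours_late <= 48:
        return 0.5
    return 0.0

def homework_average(scores):
    """Average of the six homework scores after dropping the lowest
    (the 'free exemption')."""
    kept = sorted(scores)[1:]   # ignore the lowest grade
    return sum(kept) / len(kept)

def final_grade(midterm, homework_avg, final_exam):
    """Weighted course grade: midterm 25%, homeworks 35%, final 40%."""
    return 0.25 * midterm + 0.35 * homework_avg + 0.40 * final_exam
```

Under this weighting, a student scoring 80 on the midterm, the homework average, and the final ends up with a course grade of 80.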
Policy on collaboration:
- You may wish to discuss the homework with other
students. If you like, you may form groups of two or
three students and turn in one homework solution with
up to three names on the assignment. (Of course
collaboration on exams is cheating and grounds for
immediate failure and worse!)
Homework assignments
- HW1: Decision trees. Available in PDF, PostScript, or TeX. Out Sept 17, due Sept 26. Solutions and a histogram of grades (before extra credit).
- HW2: Probabilistic methods, Neural nets. Available in PDF, PostScript, or Compressed TeX and EPS source. Out Sept 26, due Oct 8. Solutions in PDF and PostScript, and a histogram of grades.
- HW3: PAC Learning, SVMs, Cross validation, kNN. Available in PDF, PostScript, or Compressed TeX and EPS source. Out Oct 10, due Oct 29. Solutions in PDF and PostScript. 12/3: Solutions updated to include answer to question 2.
- HW4: Bayes nets. Available in PDF. Useful C++ code. (Also the bike.bn sample file separately.) Out Oct 31, due Nov 19. Coyote.bn file and solutions in PDF. Updated 12/9 3:30 pm
- HW5: Clustering, HMMs. Available in PDF, PostScript, or Compressed TeX and EPS source. Data files for problem 4. Out Nov 19, due Nov 26. Solutions in PDF and PostScript. Updated 12/9 2:30 pm
- HW6: Markov processes and Reinforcement learning. Out Nov 26, due Dec 6 in Allison's office by 5pm. Available in PDF, PostScript, or Compressed TeX and EPS source. Solutions in PDF and PostScript. Updated 12/9 2:30 pm
Lecture schedule (and online slides if available)
- Sept 12. Decision trees (Tom) Reading: Ch. 3 of Machine Learning
- Sept 17. Decision trees (Tom) Reading: Ch. 3 of Machine Learning
- Sept 19. Probabilistic Machine Learning (Andrew) Lecture slides on Probability for Data Mining, Lecture slides on Probability Density Functions. Reading: Ch. 6.1 through 6.4, Machine Learning
- Sept 24. Class cancelled.
- Sept 26. MLE, MAP and Bayes Classifiers (Andrew) Lecture slides on Gaussians, Lecture slides on Maximum Likelihood Estimation, Lecture slides on Gaussian Bayes Classifiers. Reading: Ch. 6.6 through 6.10, Machine Learning
- Oct 1. Linear Regression (Andrew) Lecture slides on Linear Regression and Neural Nets Reading: Ch. 4.4, Machine Learning
- Oct 3. Neural Nets (Andrew) Reading: Remainder of Ch. 4, Machine Learning
- Oct 8. Naive Bayes and text classification; PAC Learning (Tom). Reading: Ch. 6.10 and Ch. 7, Machine Learning
- Oct 10. PAC Learning, VC Dimension (Tom - same slides as Oct 8) and Cross Validation (Andrew) Reading: Ch. 7, Machine Learning. Lecture slides on Cross-Validation for preventing Overfitting.
- Oct 15. Mistake Bounds (Tom - same slides as Oct 8)
- Oct 17. Support Vector Machines (Tom) slides, Burges' SVM tutorial, Müller et al. tutorial (PDF), especially sections I, II, III, IV up to IV.A, and VII-A. See also Andrew's SVM Tutorial Slides.
- Oct 22. Midterm exam (open book) (Midterm solutions).
- Oct 24. K-NN and Instance-based learning (Andrew) Lecture slides on K-NN and Instance-Based Learning. Reading: Ch. 8.1 through 8.4, Machine Learning
- Oct 29. Boosting (Tom) (Tom's lecture slides on Boosting, Avrim Blum's notes on Boosting, Schapire's overview of boosting)
- Oct 31. Bayesian Networks (Andrew) Lecture slides on Bayesian Networks. Lecture slides on Bayesian Network Inference (which we will cover only if there's time). Reading: Ch. 6.11, Machine Learning
- Nov 5. Bayes Net Structure Learning (Andrew) Lecture slides on Learning Bayesian Networks.
- Nov 7, 12, 14. Gaussian Mixture Models, K-means and Hierarchical Clustering (Andrew) Lecture slides on Gaussian Mixture Models. Lecture slides on K-means and Hierarchical Clustering. Reading: Ch. 6.12, Machine Learning
- Nov 19,21. Unlabelled data for supervised learning: (Tom) EM , preventing overfitting , cotraining
- Nov 26. Hidden Markov Models (Andrew) Lecture slides on HMMs
- Dec 3. Markov Decision Processes, Reinforcement Learning (Tom) Lecture slides on MDPs, Value iteration, Policy iteration
- Dec 5. MDPs and Reinforcement Learning continued (Tom) Lecture slides on Reinforcement Learning, Q learning Reading: Ch. 13, Machine Learning
- Dec 10. FINAL EXAM 8:30am-11:30am. Location: WeH 7500. (Solutions available in PDF or PostScript. There was a problem with a couple of figures, so they are missing from the solution. The figure for the RL problem with the answers written on it is available separately in PDF or PostScript.)
Here are some example questions for studying for the final. Note that these are exams from earlier years and contain some topics that will not appear in this year's final, and some topics will appear this year that do not appear in these examples.
Review Sessions (Mondays, 5:00pm-6:20pm, NSH 1305)
The "Review of topics to date" sessions will be run by Andrew or Tom, who will bring questions from recent exams and use them as starting points to discuss material covered in class so far. Students are very strongly encouraged to ask about anything they feel they need to know more about, would like to know more about, or saw in class and would like repeated in slow motion.
The assignment sessions will be similar, except preference will be given to questions relevant to understanding the assignment. In some cases the TA might show examples of solving questions that are similar to those on the assignment.
- 9/23 (Andrés) Assignment 1
- 9/30 (Andrés) Assignment 2
- 10/7 Tom: Review of topics to date
- 10/14 Andrew: Review of topics to date
- 10/21 (Allison) Assignment 3
- 10/28 Tom: Review of topics to date
- 11/4 (Andrés) Assignment 4
- 11/11 Andrew: Review of topics to date
- 11/18 (Allison) Assignment 5
- 11/25 Andrew: Review of topics to date
- 12/2 (Allison) Assignment 6
- 12/9 Tom: Review of topics to date
Note to people outside CMU
Feel free to use the slides and materials available online here. Please email the instructors with any corrections or improvements. Additional slides are available at the Machine Learning textbook homepage and at Andrew Moore's tutorials page. Past homework exercises and exams are available at Mitchell's Fall 1998 course homepage.