Machine Learning, 15-681, Fall 1997
Professor Tom M. Mitchell
School of Computer Science, Carnegie Mellon University
Machine Learning is concerned with computer programs that
automatically improve their performance through experience. This
course covers the theory and practice of machine learning from a
variety of perspectives. We cover topics such as learning decision
trees, neural network learning, statistical learning methods, genetic
algorithms, Bayesian learning methods, explanation-based learning, and
reinforcement learning. The course covers theoretical concepts such as
inductive bias, the PAC and Mistake-bound learning frameworks, minimum
description length principle, and Occam's Razor. Programming
assignments include hands-on experiments with various learning
algorithms. Typical assignments include neural network learning for
face recognition, and decision tree learning from credit databases.
Time and Place: Tues & Thurs 12:00-1:20, Porter Hall 125C
Course staff:
Tom Mitchell, Wean Hall 5309, x8-2611. Office hours: Wed 3:00-4:00
Dellaert, Smith Hall 212, x8-6880. Office hours: Thursday 3:30-5:00
Belinda Thom, Wean Hall 4610, x8-3608. Office hours: Tuesday 3:30-5:00
Harpley, Wean Hall 5313, x8-3802
Textbook: Machine Learning, Tom Mitchell, McGraw Hill, 1997.
This course is a combination upper-level undergraduate and introductory graduate course.
Ph.D. students in CS may obtain one core credit unit by arranging an
extra course project.
Grading: Based on homeworks (35%), a midterm (30%), and a final exam.
Policy on late homework:
Homework is worth full credit at the beginning of class on the due date,
It is worth half credit for the next 48 hours,
It is worth zero credit after that.
You must turn in all but two assignments, even if for zero credit.
Free exemption: We will ignore your lowest homework grade for the semester.
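The late policy above is a simple step function of lateness. A minimal sketch, assuming lateness is measured in hours past the start of class on the due date (the 48-hour window and credit fractions come from the policy; the function name and hour-based measure are illustrative):

```python
def late_credit_fraction(hours_late: float) -> float:
    """Fraction of homework credit earned under the late policy.

    hours_late: hours past the start of class on the due date
    (hypothetical measure chosen for this sketch).
    """
    if hours_late <= 0:
        return 1.0   # turned in by the start of class: full credit
    if hours_late <= 48:
        return 0.5   # within the next 48 hours: half credit
    return 0.0       # after that: zero credit
```

For example, a homework turned in one day late earns half credit; three days late earns none.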
Homework assignments (postscript)
Assignment 1. Due Sept 11, 1997. Concept learning, sample complexity.
Assignment 2. Due Sept 25. Decision tree learning, credit analysis dataset.
Assignment 3. Due Oct 16. Neural network learning, face recognition dataset.
Assignment 4. Due Oct 28. Confidence intervals and Bayesian reasoning.
Assignment 5. Due Nov 13. Genetic algorithms, inductive logic programming.
Assignment 6. Due Dec 2. Naive Bayes for text classification.
Lecture Notes (postscript)
The course syllabus.
Aug 26, 1997. Overview and design of a checkers learner. (Chapter 1)
Aug 28. No lecture today.
Sept 2 and 4. Learning concepts, version spaces, inductive bias. (Chapter 2)
Sept 9. Sample complexity, PAC framework. (Chapter 7, sections 7.2 and the first part of 7.3)
Sept 11. PAC learning, VC dimension. (Remainder of 7.3 except one subsection; read 7.4 through 7.4.3 and 7.5 through 7.5.2)
Sept 16. Decision trees. (Chapter 3 through 3.6)
Sept 18. Decision trees II. (Chapter 3.7 through 3.8)
Sept 23. Neural networks. (Chapter 4)
Sept 25. Neural networks II. (Chapter 4)
Sept 30. Neural networks III. (Chapter 4)
Oct 2. Review for midterm.
Oct 7. Midterm exam, in class, open notes, open book.
Oct 9. Estimation and confidence intervals (Chapter 5). Guest lecture.
Oct 14. Estimation and confidence intervals II (Chapter 5), and relation to the PAC model.
Oct 16. Bayesian learning methods (Chapter 6).
Oct 21. Bayesian learning methods II (Chapter 6).
Oct 23. Bayesian learning methods III, and text classification (Chapter 6).
Oct 28. Instance-based learning (Chapter 8). Guest lecture.
Oct 30. Genetic algorithms, genetic programming (Chapter 9).
Nov 4. Genetic algorithms II (Chapter 9).
Nov 6. Learning sets of rules, ILP (Chapter 10).
Nov 11. Explanation-based learning (Chapter 11 through 11.3).
Nov 13. Combining inductive and analytical learning (Chapter 12 through 12.3).
Nov 18. Bayesian belief networks, EM (Chapter 6, sections 6.11 and 6.12).
Nov 20. Hidden Markov Models. Guest lecture: Prof. Rosenfeld
Nov 25. Reinforcement learning (Chapter 13).
Dec 2. Reinforcement learning II (Chapter 13); wrapup and course summary.
Dec 4. No lecture. Optional review session for final.
Dec 9. Final exam. 5:30-8:30 pm, Wean 7500, open notes, open book.
Note to people outside CMU
Feel free to use the slides and materials available online here. Please
email Tom.Mitchell@cmu.edu with
any corrections or improvements.
See also the Fall 1996 version of this course, co-taught with Prof. Avrim Blum.