Course schedule. Each entry below lists the module, the material covered, and the class details, online material, and homework.
Module 1: Basics (1 Lecture)

Material covered:
  - What is learning?
  - Version spaces
  - Sample complexity
  - Training set / test set split
  - Point estimation (see the sketch below)
    - MLE
    - Bayesian
    - MAP
  - Bias-variance trade-off

Class details, online material, and homework:
  - Mon., Jan. 16: ** No class. MLK Birthday **
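
Illustrative note (not course material): the point-estimation topics above (MLE, Bayesian, MAP) can be shown with a minimal Python sketch. The coin-flip counts and the Beta(2, 2) prior below are hypothetical choices.

    # Minimal sketch: MLE vs. MAP for a Bernoulli parameter (made-up data).
    # Assumes a Beta(alpha, beta) prior on the heads probability theta.

    heads, tails = 7, 3            # hypothetical coin-flip counts
    alpha, beta = 2.0, 2.0         # hypothetical Beta prior pseudo-counts

    # MLE: maximizes the likelihood alone.
    theta_mle = heads / (heads + tails)

    # MAP: maximizes likelihood * prior; the Beta prior acts like extra counts.
    theta_map = (heads + alpha - 1) / (heads + tails + alpha + beta - 2)

    print(f"MLE estimate: {theta_mle:.3f}")   # 0.700
    print(f"MAP estimate: {theta_map:.3f}")   # 0.667

Because the prior behaves like pseudo-counts, the MAP estimate is pulled toward 0.5 relative to the MLE.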
Module 2: Linear models (3 Lectures)

Material covered:
  - Linear regression [Applet: http://www.mste.uiuc.edu/users/exner/java.f/leastsquares/] (see the sketch below)
  - Bias-variance tradeoff
  - Overfitting
  - Bayes optimal classifier
  - Naive Bayes
  - Logistic regression
  - Discriminative vs. generative models

Class details, online material, and homework:
  - Mon., Jan. 23: Homework #1 out. Assignment: [PDF]; Prob. #2 Chess dataset [zip]
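
Illustrative note (not course material): a minimal least-squares sketch for the linear regression topic above, assuming synthetic 1-D data rather than the course's chess dataset.

    import numpy as np

    # Minimal sketch: ordinary least squares on synthetic 1-D data.
    rng = np.random.default_rng(0)
    x = rng.uniform(0, 10, size=50)
    y = 3.0 * x + 2.0 + rng.normal(0, 1, size=50)   # true slope 3, intercept 2, plus noise

    # Design matrix with a bias column, then solve the least-squares problem.
    X = np.column_stack([x, np.ones_like(x)])
    (slope, intercept), *_ = np.linalg.lstsq(X, y, rcond=None)

    print(f"fitted slope ~ {slope:.2f}, intercept ~ {intercept:.2f}")

Solving with np.linalg.lstsq avoids forming the normal equations explicitly, which is numerically better conditioned.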
Module 3: Non-linear models; Model selection (5 Lectures)

Material covered:
  - Decision trees
  - Overfitting, again
  - Regularization
  - MDL
  - Cross-validation (see the sketch below)
  - Boosting [Adaboost Applet: www.cse.ucsd.edu/~yfreund/adaboost]
  - Instance-based learning [Applet: www.site.uottawa.ca/~gcaron/applets.htm]
    - K-nearest neighbors
    - Kernels
  - Neural nets [CMU Course: www.cs.cmu.edu/afs/cs/academic/class/15782-s04/]

Class details, online material, and homework:
  - Wed., Feb. 8: Lecture: Boosting, Cross Validation, Simple Model Selection, Regularization, MDL [Slides] [Annotated]
      EXTENSION: Homework #1 due (beginning of class). Homework #2 out; updated assignment with more hints: [PDF]; Voting dataset [zip]
  - Mon., Feb. 13: Lecture: Cross Validation, Simple Model Selection, Regularization, MDL, Neural Nets [Slides] [Annotated]
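
Illustrative note (not course material): the cross-validation and K-nearest-neighbors topics above combine naturally into one small sketch. The two-blob synthetic data, fold count, and choice of k are illustrative assumptions, not the course's voting dataset or homework setup.

    import numpy as np

    # Minimal sketch: k-fold cross-validation for a k-nearest-neighbor classifier.

    def knn_predict(train_X, train_y, test_X, k=1):
        """Predict by majority vote among the k nearest training points."""
        preds = []
        for x in test_X:
            dists = np.linalg.norm(train_X - x, axis=1)
            nearest = train_y[np.argsort(dists)[:k]]
            preds.append(np.bincount(nearest).argmax())
        return np.array(preds)

    def cross_val_accuracy(X, y, k_neighbors=1, n_folds=5):
        """Average held-out accuracy over n_folds splits."""
        idx = np.random.default_rng(0).permutation(len(X))
        folds = np.array_split(idx, n_folds)
        accs = []
        for i in range(n_folds):
            test_idx = folds[i]
            train_idx = np.concatenate([folds[j] for j in range(n_folds) if j != i])
            preds = knn_predict(X[train_idx], y[train_idx], X[test_idx], k_neighbors)
            accs.append(np.mean(preds == y[test_idx]))
        return float(np.mean(accs))

    # Two Gaussian blobs as toy classes.
    rng = np.random.default_rng(1)
    X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(3, 1, (50, 2))])
    y = np.array([0] * 50 + [1] * 50)
    print(f"5-fold CV accuracy: {cross_val_accuracy(X, y, k_neighbors=3):.2f}")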
Module 4: Margin-based approaches (2 Lectures)

Material covered:
  - SVMs [Applets: www.site.uottawa.ca/~gcaron/applets.htm]
  - Kernel trick (see the sketch below)
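
Illustrative note (not course material): a minimal sketch of the kernel trick named above. For the degree-2 polynomial kernel, the kernel value equals an inner product under an explicit feature map, so algorithms written purely in terms of dot products can operate in that space implicitly. The 2-D inputs and the feature map phi are illustrative choices.

    import numpy as np

    # Kernel trick sketch: k(x, z) = (x . z + 1)**2 equals an inner product
    # in an explicit higher-dimensional feature space (shown for 2-D inputs).

    def poly_kernel(x, z):
        return (np.dot(x, z) + 1.0) ** 2

    def phi(x):
        """Explicit feature map for the degree-2 polynomial kernel on 2-D input."""
        x1, x2 = x
        return np.array([x1**2, x2**2,
                         np.sqrt(2) * x1 * x2,
                         np.sqrt(2) * x1,
                         np.sqrt(2) * x2,
                         1.0])

    x = np.array([1.0, 2.0])
    z = np.array([3.0, -1.0])
    print(poly_kernel(x, z))          # kernel evaluation: (1*3 + 2*(-1) + 1)^2 = 4
    print(np.dot(phi(x), phi(z)))     # same value via the explicit feature map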
Module 5: Learning theory (3 Lectures)

Material covered:
  - Sample complexity
  - PAC learning [Applets: www.site.uottawa.ca/~gcaron/applets.htm]
  - Error bounds
  - VC-dimension
  - Margin-based bounds
  - Large-deviation bounds
    - Hoeffding's inequality, Chernoff bound (see the sketch below)
  - Mistake bounds
  - No Free Lunch theorem

Class details, online material, and homework:
  - Wed., Mar. 1: Lecture: SVMs - The Kernel Trick, Learning Theory [Slides] [Annotated]
      Homework #3 due (beginning of class). Project out.
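
Illustrative note (not course material): the Hoeffding's inequality topic above lends itself to a small worked example. The sample size m, tolerance eps, and confidence delta below are arbitrary illustrative values.

    import math

    # Hoeffding's inequality for a single fixed hypothesis:
    # P(|error_true - error_train| > eps) <= 2 * exp(-2 * m * eps**2)

    def hoeffding_bound(m, eps):
        """Upper bound on the probability that the empirical mean of m samples
        bounded in [0, 1] deviates from its true mean by more than eps."""
        return 2.0 * math.exp(-2.0 * m * eps**2)

    def samples_needed(eps, delta):
        """Smallest m so that the deviation bound holds with probability >= 1 - delta."""
        return math.ceil(math.log(2.0 / delta) / (2.0 * eps**2))

    print(hoeffding_bound(m=1000, eps=0.05))     # ~0.013
    print(samples_needed(eps=0.05, delta=0.01))  # 1060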
Mid-term Exam

Material covered:
  - All material thus far

Class details, online material, and homework:
  - Wed., Mar. 8: Mid-term exam (in class)
Spring break

Class details, online material, and homework:
  - Mon., Mar. 13: ** No class **
  - Wed., Mar. 15: ** No class **
Module 6: Structured models (4 Lectures)

Material covered:
  - HMMs
    - Forward-backward
    - Viterbi (see the sketch below)
    - Supervised learning
  - Graphical models
    - Representation
    - Inference
    - Learning
    - BIC

Class details, online material, and homework:
  - Wed., Mar. 22: Lecture: Bayes nets - Representation (cont.), Inference [Slides] [Annotated]
      Homework #4 out. Project proposal due (beginning of class).
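
Illustrative note (not course material): a minimal sketch of the Viterbi algorithm listed above, assuming a made-up two-state HMM with three observation symbols; the transition and emission tables are illustrative.

    import numpy as np

    # Viterbi: most likely hidden-state sequence via log-space dynamic programming.

    def viterbi(obs, start_p, trans_p, emit_p):
        n_states, T = len(start_p), len(obs)
        logd = np.full((T, n_states), -np.inf)   # best log-prob ending in each state
        back = np.zeros((T, n_states), dtype=int)
        logd[0] = np.log(start_p) + np.log(emit_p[:, obs[0]])
        for t in range(1, T):
            for s in range(n_states):
                scores = logd[t - 1] + np.log(trans_p[:, s])
                back[t, s] = int(np.argmax(scores))
                logd[t, s] = scores[back[t, s]] + np.log(emit_p[s, obs[t]])
        # Backtrack from the best final state.
        path = [int(np.argmax(logd[-1]))]
        for t in range(T - 1, 0, -1):
            path.append(back[t, path[-1]])
        return path[::-1]

    start = np.array([0.6, 0.4])
    trans = np.array([[0.7, 0.3], [0.4, 0.6]])
    emit = np.array([[0.5, 0.4, 0.1], [0.1, 0.3, 0.6]])
    print(viterbi([0, 1, 2], start, trans, emit))   # -> [0, 0, 1]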
Module 7: Unsupervised and semi-supervised learning (4 Lectures)

Material covered:
  - K-means (see the sketch below)
  - Expectation Maximization (EM)
    - mixture of Gaussians
    - for training Bayes nets
    - for training HMMs
  - Combining labeled and unlabeled data
    - EM
    - reweighting labeled data
    - Co-training
    - unlabeled data and model selection
  - Dimensionality reduction
  - Feature selection

Class details, online material, and homework:
  - Mon., Apr. 3: Lecture: Bayes nets - Structure Learning; Clustering - K-means & Gaussian mixture models [Slides] [Annotated]
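
Illustrative note (not course material): a short sketch of Lloyd's algorithm for the K-means topic above; the synthetic two-blob data and the choice K = 2 are illustrative assumptions.

    import numpy as np

    def kmeans(X, k, n_iters=100, seed=0):
        """Lloyd's algorithm: alternate nearest-center assignment and mean update."""
        rng = np.random.default_rng(seed)
        centers = X[rng.choice(len(X), size=k, replace=False)]
        for _ in range(n_iters):
            # Assignment step: each point goes to its nearest center.
            dists = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2)
            labels = dists.argmin(axis=1)
            # Update step: each center moves to the mean of its assigned points.
            new_centers = np.array([X[labels == j].mean(axis=0) for j in range(k)])
            if np.allclose(new_centers, centers):
                break
            centers = new_centers
        return centers, labels

    rng = np.random.default_rng(1)
    X = np.vstack([rng.normal(0, 0.5, (50, 2)), rng.normal(4, 0.5, (50, 2))])
    centers, labels = kmeans(X, k=2)
    print(centers)   # two centers near (0, 0) and (4, 4)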
Module 8: Learning to make decisions (3 Lectures)

Material covered:
  - Markov decision processes (see the sketch below)
  - Reinforcement learning

Class details, online material, and homework:
  - Wed., Apr. 19: Learning from text data (lecture by Tom Mitchell)
      Homework #5 due (beginning of class)
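
Illustrative note (not course material): a minimal value-iteration sketch for the Markov decision processes topic above. Value iteration is one standard solution method; the five-state corridor MDP, reward, and discount below are made-up illustrative choices.

    import numpy as np

    # Tiny MDP: a 1-D corridor of 5 states; the agent moves left/right (clipped
    # at the ends) and receives reward +1 for reaching the rightmost state.

    n_states, gamma = 5, 0.9
    actions = [-1, +1]
    V = np.zeros(n_states)

    for _ in range(500):                    # Bellman optimality updates
        V_new = np.zeros(n_states)
        for s in range(n_states):
            q_values = []
            for a in actions:
                s_next = min(max(s + a, 0), n_states - 1)
                reward = 1.0 if s_next == n_states - 1 else 0.0
                q_values.append(reward + gamma * V[s_next])
            V_new[s] = max(q_values)        # greedy over actions
        if np.max(np.abs(V_new - V)) < 1e-8:
            break
        V = V_new

    print(np.round(V, 3))   # values grow toward the rewarding right end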
Module 9: Advanced topics (3 Lectures)

Material covered:
  - Text data
  - Hierarchical Bayesian models
  - Tackling very large datasets
  - Active learning
  - Overview of follow-up classes
Project Poster Session

Class details, online material, and homework:
  - Fri., May 5: Newell-Simon Hall Atrium, 2:00-5:00 pm
Project Paper

Class details, online material, and homework:
  - Mon., May 8: Project paper due
Final Exam

Material covered:
  - All material thus far

Class details, online material, and homework:
  - Friday, May 12th, 1-4 p.m. Location TBD