Module | Material covered | Class details, online material, and homework

Module 1: Basics
(1 Lecture)

 What is learning?
 Version spaces
 Sample complexity
 Training set/test set split
 Point estimation
 MLE
 Bayesian
 MAP
 Bias-variance trade-off
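The point-estimation topics above (MLE, MAP) admit a tiny worked example. The sketch below is illustrative only and not part of the course materials; `mle_mean` and `map_mean` are hypothetical helper names, and the closed-form MAP expression assumes Gaussian data with known variance and a Gaussian prior on the mean.

```python
# Minimal sketch: MLE vs. MAP for the mean of Gaussian data.
# Assumes known noise variance sigma2 and a Gaussian prior N(mu0, tau2).
def mle_mean(xs):
    """MLE of a Gaussian mean is simply the sample mean."""
    return sum(xs) / len(xs)

def map_mean(xs, sigma2=1.0, mu0=0.0, tau2=1.0):
    """MAP estimate with a Gaussian prior: a precision-weighted average
    of the prior mean and the sample mean (shrinkage toward mu0)."""
    n = len(xs)
    xbar = sum(xs) / n
    return (mu0 / tau2 + n * xbar / sigma2) / (1.0 / tau2 + n / sigma2)

data = [2.0, 2.5, 3.0, 3.5, 4.0]
print(mle_mean(data))  # sample mean: 3.0
print(map_mean(data))  # pulled toward the prior mean 0: 2.5
```

With more data the MAP estimate approaches the MLE, since the likelihood term dominates the prior.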

Mon., Jan. 15:

** No Class. MLK Birthday **



Module 2: Linear models
(3 Lectures)

 Linear regression [Applet]
http://www.mste.uiuc.edu/users/exner/java.f/leastsquares/
 Bias-variance trade-off
 Overfitting
 Bayes optimal classifier
 Naive Bayes [Applet]
http://www.cs.technion.ac.il/~rani/LocBoost/
 Logistic regression [Applet]
 Discriminative vs. generative models [Applet]
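The linear-regression topic above has a simple closed form in one dimension. A minimal sketch (not from the course materials; `fit_line` is a hypothetical helper) of the least-squares fit behind the applet:

```python
# Minimal sketch: ordinary least squares for y = w*x + b in one dimension,
# using the closed-form solution slope = cov(x, y) / var(x).
def fit_line(xs, ys):
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    w = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    b = my - w * mx  # line passes through the mean point (mx, my)
    return w, b

w, b = fit_line([0, 1, 2, 3], [1, 3, 5, 7])  # data lies exactly on y = 2x + 1
print(w, b)  # 2.0 1.0
```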

Mon., Jan. 22:
 Lecture: Gaussians, Linear Regression, Bias-Variance Tradeoff
[Slides]
[Annotated]
 Readings: Bishop 1.1 to 1.4, Bishop 3.1, 3.1.1, 3.1.4, 3.1.5, 3.2, 3.3, 3.3.1, 3.3.2


Module 3: Nonlinear models; model selection
(5 Lectures)

 Decision trees [Applet]
 Overfitting, again
 Regularization
 MDL
 Cross-validation
 Boosting [Adaboost Applet]
www.cse.ucsd.edu/~yfreund/adaboost
 Instance-based learning [Applet]
www.site.uottawa.ca/~gcaron/applets.htm
 K-nearest neighbors
 Kernels
 Neural nets [CMU Course]
www.cs.cmu.edu/afs/cs/academic/class/15782s04/
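Instance-based learning and K-nearest neighbors, listed above, are the simplest learners in this module: training is just storing the data. A minimal 1-nearest-neighbor sketch (illustrative only; `nn_predict` and the toy points are not from the course):

```python
# Minimal sketch: 1-nearest-neighbor classification.
# "Training" just stores (feature_vector, label) pairs.
def nn_predict(train, query):
    """Return the label of the training point closest to the query."""
    def dist2(a, b):
        # squared Euclidean distance (monotone in distance, so no sqrt needed)
        return sum((ai - bi) ** 2 for ai, bi in zip(a, b))
    _, label = min(train, key=lambda p: dist2(p[0], query))
    return label

train = [((0.0, 0.0), "neg"), ((1.0, 1.0), "pos"), ((0.9, 1.2), "pos")]
print(nn_predict(train, (1.0, 0.9)))  # "pos"
```

K-nearest neighbors generalizes this by taking a majority vote over the k closest stored points.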

Mon., Feb. 12:

 Lecture: Cross Validation, Simple Model Selection, Regularization, MDL, Neural Nets
[Slides]
[Annotated]
 Readings: (Bishop 1.3) Model Selection / Cross Validation
 (Bishop 3.1.4) Regularized least squares
 (Bishop 5.1) Feedforward Network Functions


Wed., Feb. 14:

 Lecture: Neural Nets, Instance-based Learning
[Slides]
[Annotated]
 Readings: (Bishop 5.1) Feedforward Network Functions
 (Bishop 5.2) Network Training
 (Bishop 5.3) Error Backpropagation



Module 4: Margin-based approaches
(2 Lectures)

 SVMs [Applets]
www.site.uottawa.ca/~gcaron/applets.htm
 Kernel trick


Module 5: Learning theory
(3 Lectures)

 Sample complexity
 PAC learning [Applets]
www.site.uottawa.ca/~gcaron/applets.htm
 Error bounds
 VC-dimension
 Margin-based bounds
 Large-deviation bounds
 Hoeffding's inequality, Chernoff bound
 Mistake bounds
 No Free Lunch theorem
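Hoeffding's inequality, listed above, gives a concrete sample-complexity calculation. A worked sketch (not from the course notes; the function names are hypothetical), assuming i.i.d. samples bounded in [0, 1], where P(|empirical mean - true mean| >= eps) <= 2 exp(-2 n eps^2):

```python
import math

# Minimal sketch: Hoeffding's inequality for [0, 1]-bounded random variables.
def hoeffding_bound(n, eps):
    """Upper bound on P(|empirical mean - true mean| >= eps) for n samples."""
    return 2.0 * math.exp(-2.0 * n * eps ** 2)

def samples_needed(eps, delta):
    """Smallest n making the Hoeffding bound at most delta
    (solve 2 exp(-2 n eps^2) <= delta for n)."""
    return math.ceil(math.log(2.0 / delta) / (2.0 * eps ** 2))

print(samples_needed(0.1, 0.05))  # 185 samples for eps=0.1, delta=0.05
```

Inverting a concentration bound this way is exactly how PAC-style sample-complexity results are obtained.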


Midterm Exam

All material thus far


Spring break


Mon., Mar. 12:

** No class **


Wed., Mar. 14:

** No class **



Module 6: Structured models
(4 Lectures)

 HMMs
 Forward-Backward
 Viterbi
 Supervised learning
 Graphical Models
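The Viterbi algorithm listed above finds the most likely HMM state sequence by dynamic programming. A minimal sketch (the weather/activity toy numbers are illustrative assumptions, not from the course):

```python
# Minimal sketch: the Viterbi algorithm for the most likely HMM state path.
def viterbi(obs, states, start, trans, emit):
    """Return the most probable state sequence for the observation list."""
    # best[s] = (probability, path) of the best path ending in state s
    best = {s: (start[s] * emit[s][obs[0]], [s]) for s in states}
    for o in obs[1:]:
        # extend every stored path by one step, keeping the best per end-state
        best = {
            s: max(((p * trans[prev][s] * emit[s][o], path + [s])
                    for prev, (p, path) in best.items()),
                   key=lambda t: t[0])
            for s in states
        }
    return max(best.values(), key=lambda t: t[0])[1]

states = ["Rainy", "Sunny"]
start = {"Rainy": 0.6, "Sunny": 0.4}
trans = {"Rainy": {"Rainy": 0.7, "Sunny": 0.3},
         "Sunny": {"Rainy": 0.4, "Sunny": 0.6}}
emit = {"Rainy": {"walk": 0.1, "shop": 0.4, "clean": 0.5},
        "Sunny": {"walk": 0.6, "shop": 0.3, "clean": 0.1}}
print(viterbi(["walk", "shop", "clean"], states, start, trans, emit))
```

The Forward-Backward algorithm has the same recursive structure but sums over predecessor states instead of maximizing.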


Module 7: Unsupervised and semi-supervised learning
(6 Lectures)

 K-means
 Expectation Maximization (EM)
  for mixtures of Gaussians
  for training Bayes nets
  for training HMMs
 Combining labeled and unlabeled data
  EM
  reweighting labeled data
  Co-training
  unlabeled data and model selection
 Dimensionality reduction
 Feature selection
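K-means, listed above, alternates between assigning points to their nearest center and recomputing each center as its cluster mean (Lloyd's algorithm). A minimal 1-D sketch (toy data and the `kmeans_1d` name are illustrative assumptions):

```python
# Minimal sketch: Lloyd's algorithm for K-means on 1-D data.
def kmeans_1d(xs, centers, iters=20):
    """Alternate nearest-center assignment and mean updates."""
    for _ in range(iters):
        clusters = [[] for _ in centers]
        for x in xs:
            # assignment step: put x in the cluster of its nearest center
            i = min(range(len(centers)), key=lambda j: abs(x - centers[j]))
            clusters[i].append(x)
        # update step: move each center to its cluster mean (keep it if empty)
        centers = [sum(c) / len(c) if c else centers[i]
                   for i, c in enumerate(clusters)]
    return sorted(centers)

print(kmeans_1d([1.0, 1.2, 0.8, 9.0, 9.5, 8.5], [0.0, 5.0]))  # approx [1.0, 9.0]
```

EM for a mixture of Gaussians follows the same alternating pattern, with soft (probabilistic) assignments in place of hard ones.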

Mon., Apr. 2:

 Lecture: Bayes nets - Structure Learning; Clustering - K-means & Gaussian mixture models
[Slides]
[Annotated]
 Readings: (Bishop 9.1, 9.2) K-means, Mixtures of Gaussians



Module 8: Learning to make decisions
(3 Lectures)

 Markov decision processes
 Reinforcement learning
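Markov decision processes, listed above, are typically solved by value iteration, which repeatedly applies the Bellman update V(s) <- max_a sum_s' P(s'|s,a)[R(s,a) + gamma V(s')]. A minimal sketch on a made-up two-state MDP (all names and numbers here are illustrative assumptions, not course material):

```python
# Minimal sketch: value iteration on a toy two-state MDP.
def value_iteration(states, actions, P, R, gamma=0.9, iters=200):
    """P[s][a] maps next-state -> probability; R[s][a] is the reward."""
    V = {s: 0.0 for s in states}
    for _ in range(iters):
        # Bellman backup: best action value under the current estimate V
        V = {s: max(sum(p * (R[s][a] + gamma * V[s2])
                        for s2, p in P[s][a].items())
                    for a in actions)
             for s in states}
    return V

states = ["A", "B"]
actions = ["stay", "go"]
P = {"A": {"stay": {"A": 1.0}, "go": {"B": 1.0}},
     "B": {"stay": {"B": 1.0}, "go": {"A": 1.0}}}
R = {"A": {"stay": 0.0, "go": 1.0},
     "B": {"stay": 0.0, "go": 0.0}}
V = value_iteration(states, actions, P, R)
print(V)  # V("A") converges to 1 / (1 - 0.81), about 5.26
```

Reinforcement learning tackles the same objective when P and R are unknown and must be learned from experience.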


Module 9: Advanced topics
(3 Lectures)

 Text data
 Hierarchical Bayesian models
 Tackling very large datasets
 Active learning
 Overview of follow-up classes


Project Poster Session


Fri., May 4:
Newell-Simon Hall Atrium
2:00-5:00pm







Project Paper


Thur., May 10:
Project paper due







Final Exam

All material thus far

Tuesday, May 15th, 1-4 p.m.
Location: Baker Hall, Room A51
