VC dimension

Tutorial Slides by Andrew Moore

This tutorial concerns a well-known piece of Machine Learning Theory. If you've got a learning algorithm in one hand and a dataset in the other, to what extent can you decide whether the learning algorithm is in danger of overfitting or underfitting? If you want to put some formal analysis into the fascinating question of how overfitting can happen, then this is the tutorial for you. In addition to getting a good understanding of the overfitting phenomenon, you will also end up with a method for estimating how well an algorithm will perform on future data, based solely on its training-set error and a property of the learning algorithm called the VC dimension. VC dimension thus gives an alternative to cross-validation, called Structural Risk Minimization (SRM), for choosing classifiers. We'll discuss that. We'll also very briefly compare both CV and SRM to two other model selection methods: AIC and BIC.
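To make that flavor of result concrete, here is a minimal Python sketch (not taken from the slides) of the classic Vapnik generalization bound and an SRM-style selection rule built on it. The helper names `vc_bound` and `srm_select`, and the candidate models with their training errors and VC dimensions, are illustrative assumptions rather than material from the tutorial:

    import math

    def vc_bound(train_err, vc_dim, n, delta=0.05):
        """Vapnik's upper bound on true error: with probability >= 1 - delta,
        true_err <= train_err + sqrt((h*(ln(2n/h) + 1) + ln(4/delta)) / n),
        where h is the VC dimension and n the number of training examples."""
        penalty = math.sqrt((vc_dim * (math.log(2 * n / vc_dim) + 1)
                             + math.log(4 / delta)) / n)
        return train_err + penalty

    def srm_select(models, n, delta=0.05):
        """Structural Risk Minimization: among candidate models, given as
        (name, train_err, vc_dim) tuples, pick the smallest bound."""
        return min(models, key=lambda m: vc_bound(m[1], m[2], n, delta))

    # Hypothetical example: three classifiers of increasing complexity,
    # trained on n = 1000 examples. More complex models fit the training
    # set better but pay a larger complexity penalty in the bound.
    candidates = [("linear", 0.15, 3),
                  ("quadratic", 0.10, 6),
                  ("degree-10", 0.02, 66)]
    name, err, h = srm_select(candidates, n=1000)
    print(name, "bound = %.3f" % vc_bound(err, h, n=1000))

Note the trade-off the bound encodes: SRM picks not the model with the lowest training error, but the one whose training error plus complexity penalty is smallest.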

Download Tutorial Slides (PDF format)

PowerPoint Format: The PowerPoint originals of these slides are freely available to anyone who wishes to use them for their own work, or who wishes to teach using them in an academic institution. Please email Andrew Moore at awm@cs.cmu.edu if you would like him to send them to you. The only restriction is that they are not freely available for use as teaching materials in classes or tutorials outside degree-granting academic institutions.

Advertisement: I have recently joined Google, and am starting up the new Google Pittsburgh office on CMU's campus. We are hiring creative computer scientists who love programming, and Machine Learning is one of the focus areas of the office. If you might be interested, feel welcome to send me email: awm@google.com .