John Lafferty Professor of Computer Science, Machine Learning, and Statistics
Carnegie Mellon University School of Computer Science
Gates and Hillman Centers (GHC)

Courses

Most of my teaching in the past few years has been related to the following three courses. The Statistical Machine Learning course is part of a book project.


15-251: Great Theoretical Ideas in Computer Science
  • with Anupam Gupta

    This course introduces some of the fundamental ideas and techniques in computer science, in a self-contained way. What is computation? What is computable, in principle? What is especially easy, or especially hard, to compute? To what extent does the inherent nature of computation shape how we learn and think about the world? Topics include: representations of numbers, induction, ancient and modern arithmetic, basic counting principles, probability, random walks, number theory, the idea of proof, formal proof, logic, problem-solving methods, polynomial representations, automata theory, cryptography, infinity, diagonalization, computability, time complexity, and incompleteness and undecidability.
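
    As one concrete taste of the automata-theory topic, here is a minimal sketch in Python (the example language, binary strings with an even number of 1s, is a standard illustration and not material taken from the course):

        # Minimal DFA: two states track the parity of 1s seen so far.
        def make_dfa():
            start = "even"
            accept = {"even"}
            delta = {
                ("even", "0"): "even",
                ("even", "1"): "odd",
                ("odd", "0"): "odd",
                ("odd", "1"): "even",
            }
            return start, accept, delta

        def accepts(word):
            start, accept, delta = make_dfa()
            state = start
            for symbol in word:
                state = delta[(state, symbol)]
            return state in accept

        assert accepts("0110")       # two 1s: accepted
        assert not accepts("10")     # one 1: rejected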

10-702: Statistical Machine Learning
  • with Larry Wasserman

    Statistical Machine Learning is a second graduate-level course in machine learning, assuming students have taken Machine Learning (10-701) and Intermediate Statistics (36-705). The term "statistical" in the title reflects the emphasis on statistical analysis and methodology, which is the predominant approach in modern machine learning.

    The course combines methodology with theoretical foundations and computational aspects. It treats both the "art" of designing good learning algorithms and the "science" of analyzing an algorithm's statistical properties and performance guarantees. Theorems are presented together with practical methodology and intuition, to help students develop the tools to select appropriate methods and approaches for problems in their own research.

    The course includes topics in statistical theory that are now becoming important for researchers in machine learning, including consistency, minimax estimation, and concentration of measure. It also presents topics in computation including elements of convex optimization, variational methods, randomized projection algorithms, and techniques for handling large data sets.
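
    As a small illustration of the randomized projection idea, the following Python sketch (the dimensions and the Gaussian construction are my own illustrative choices, not course code) projects high-dimensional points through a random matrix and checks that a pairwise distance is approximately preserved:

        import numpy as np

        rng = np.random.default_rng(0)

        # 100 points in 10,000 dimensions, projected down to 400 dimensions.
        n, d, k = 100, 10_000, 400
        X = rng.standard_normal((n, d))

        # Random Gaussian projection, scaled so squared norms are preserved
        # in expectation (the Johnson-Lindenstrauss construction).
        R = rng.standard_normal((d, k)) / np.sqrt(k)
        Y = X @ R

        # Compare a squared distance before and after projection;
        # the ratio should be close to 1.
        orig = np.sum((X[0] - X[1]) ** 2)
        proj = np.sum((Y[0] - Y[1]) ** 2)
        print(proj / orig)  # typically within a few percent of 1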


15-359: Probability and Computing
  • with Mor Harchol-Balter

    Probability theory has become indispensable in computer science. In areas such as artificial intelligence and computer science theory, probabilistic methods and ideas based on randomization are central. In other areas such as networks and systems, probability is becoming an increasingly useful framework for handling uncertainty and modeling the patterns of data that occur in complex systems. This course gives an introduction to probability as it is used in computer science theory and practice, drawing on applications and current research developments as motivation and context. Topics include combinatorial probability and random graphs, heavy-tailed distributions, concentration inequalities, various randomized algorithms, sampling random variables and computer simulation, and Markov chains and their many applications, from Web search engines to models of network protocols. The course assumes only familiarity with basic calculus and linear algebra; no prior probability and statistics background is expected. Prerequisite: 15-251.
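
    To give one flavor of the Markov chain applications mentioned above, here is a minimal PageRank sketch in Python (the four-page link graph and the damping factor 0.85 are illustrative assumptions, not course material):

        import numpy as np

        # Hypothetical link graph: links[i] lists the pages that page i links to.
        links = {0: [1, 2], 1: [2], 2: [0], 3: [0, 2]}
        n = 4
        damping = 0.85

        # Column-stochastic transition matrix of the "random surfer" chain.
        P = np.zeros((n, n))
        for i, outs in links.items():
            for j in outs:
                P[j, i] = 1.0 / len(outs)

        # Power iteration on the damped chain: r <- d * P r + (1 - d) / n.
        r = np.full(n, 1.0 / n)
        for _ in range(100):
            r = damping * (P @ r) + (1 - damping) / n
        print(r)  # stationary distribution = PageRank scores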
