CONALD: Conference on Automated Learning and Discovery, June 11-13
Plenary Speakers

Tom Dietterich

Stuart Geman

David Heckerman

Michael Jordan

Daryl Pregibon

Herb Simon

Robert Tibshirani

Michael Jordan
Graphical Models and Variational Approximation

Graphical models (also known as Bayesian belief networks) provide an elegant formalism for managing uncertainty that unifies much of the literature on stochastic modeling. For sparse networks (e.g., networks in the form of chains or trees, such as Kalman filters, hidden Markov models, and probabilistic decision trees), graphical model algorithms are exact, efficient and practical. For dense networks, however, the exact algorithms are often (hopelessly) inefficient, and this fact has hindered the application of this richer class of models to large-scale problems. I discuss variational methodology, which provides a general framework for approximate graphical model inference. The variational methods I present are efficient; moreover, they tend to be more accurate for dense networks than for sparse networks. They can readily be combined with exact techniques to yield a class of algorithms that perform well for a variety of network architectures. I illustrate these ideas with examples of applications of dense networks to problems in diagnosis, prediction, and control.
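To make the contrast between exact and variational inference concrete, here is a minimal sketch (not from the talk; the network, couplings, and biases are illustrative assumptions) of mean-field variational approximation on a small, densely connected binary network. For a tiny network we can also compute the exact marginals by brute-force enumeration, which is precisely what becomes hopeless as the network grows:

```python
import itertools
import math

# Hypothetical small dense binary network (all values illustrative):
# p(x) proportional to exp( sum_i h[i]*x[i] + sum_{i<j} J[i][j]*x[i]*x[j] )
N = 4
h = [0.2, -0.5, 0.3, 0.1]
J = [[0.0,  0.8, -0.4,  0.6],
     [0.8,  0.0,  0.5, -0.3],
     [-0.4, 0.5,  0.0,  0.7],
     [0.6, -0.3,  0.7,  0.0]]

def log_potential(x):
    """Unnormalized log-probability of a binary configuration x."""
    s = sum(h[i] * x[i] for i in range(N))
    s += sum(J[i][j] * x[i] * x[j] for i in range(N) for j in range(i + 1, N))
    return s

# Exact marginals p(x_i = 1) by enumerating all 2^N states --
# feasible only for tiny N; cost grows exponentially with network size.
states = list(itertools.product([0, 1], repeat=N))
Z = sum(math.exp(log_potential(x)) for x in states)
exact = [sum(math.exp(log_potential(x)) for x in states if x[i] == 1) / Z
         for i in range(N)]

# Mean-field variational approximation: factorized q with parameters q[i],
# updated by coordinate ascent: q_i <- sigmoid(h_i + sum_j J_ij * q_j).
def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

q = [0.5] * N
for _ in range(100):  # iterate updates to (approximate) convergence
    for i in range(N):
        q[i] = sigmoid(h[i] + sum(J[i][j] * q[j] for j in range(N) if j != i))

for i in range(N):
    print(f"node {i}: exact = {exact[i]:.3f}  mean-field = {q[i]:.3f}")
```

The mean-field update costs only O(N) per node per sweep, versus the exponential cost of enumeration; the price is that the factorized q can only approximate the true marginals.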

Michael I. Jordan is Professor in the Department of Computer Science and the Department of Statistics at the University of California, Berkeley. He received his master's degree in mathematics from Arizona State University and earned his PhD in cognitive science from the University of California, San Diego. He has worked on a variety of topics in machine learning, focusing on neural networks and graphical models.
More Information

Contact for more information

The conference is sponsored by CMU's newly created Center for Automated Learning and Discovery.