Probabilistic Graphical Models Session

Graphical model approach to Iris matching under local shifts and occlusions
by Balakrishnan Narayaswamy and Ryan Kerekes

Ryan Kerekes

Template matching of iris images for biometric recognition typically suffers from local deformations and occlusions between the template and query images. Previous work in this area has shown that recognition accuracy can be significantly improved by imposing a probabilistic model on the local deformations. In this work, we propose Bayesian estimation of the deformations, using lattice-type undirected graphical models to represent interdependent regions of the iris. We present the underlying theory as well as experimental results from two iris databases. We show that our method significantly improves recognition accuracy on these datasets.
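The lattice model described here couples per-region match scores with a smoothness prior on the deformation field. As a minimal sketch of that idea (not the authors' model): block-wise SSD matching costs under candidate shifts serve as unary potentials, a Potts smoothness term serves as the pairwise potential on a 4-connected lattice, and iterated conditional modes, a simple substitute for exact MAP inference, estimates the shift field. The function names, the 1-D horizontal-shift restriction, and all parameter values are illustrative assumptions.

```python
import numpy as np

def block_costs(template, query, block, shifts):
    """SSD cost of each template block against the query under each
    candidate horizontal shift (hypothetical 1-D shift model, for brevity)."""
    H, W = template.shape
    ny, nx = H // block, W // block
    costs = np.full((ny, nx, len(shifts)), np.inf)
    for i in range(ny):
        for j in range(nx):
            t = template[i*block:(i+1)*block, j*block:(j+1)*block]
            for s, dx in enumerate(shifts):
                x0 = j * block + dx
                if 0 <= x0 and x0 + block <= query.shape[1]:
                    q = query[i*block:(i+1)*block, x0:x0+block]
                    costs[i, j, s] = float(np.sum((t - q) ** 2))
    return costs

def map_shifts_icm(costs, smooth=0.5, iters=10):
    """Approximate MAP shift field on a 4-connected lattice MRF via
    iterated conditional modes: unary = matching cost, pairwise = Potts."""
    ny, nx, S = costs.shape
    labels = np.argmin(costs, axis=2)          # independent-best init
    for _ in range(iters):
        for i in range(ny):
            for j in range(nx):
                e = costs[i, j].copy()
                for di, dj in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                    ni, nj = i + di, j + dj
                    if 0 <= ni < ny and 0 <= nj < nx:
                        e += smooth * (np.arange(S) != labels[ni, nj])
                labels[i, j] = int(np.argmin(e))
    return labels
```

On a synthetic pair where the query is the template shifted by a known amount, the recovered shift field is constant at the true shift; the smoothness term is what keeps occluded or noisy blocks consistent with their neighbors.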

Exponential Family Harmoniums for Classifying Video Data
by Jun Yang, Yan Liu, Jiazhi Ou

Jun Yang

In this project, we investigate the use of exponential-family harmoniums (EFH) as undirected graphical models for modeling and classifying video data. Three EFH-based models are explored: a dual-wing harmonium (DWH) with a discriminative classifier, a family of class-dependent harmoniums (FoH), and a triple-wing harmonium (TWH) with label variables. Moreover, various inference methods, including mean field, contrastive divergence, Langevin dynamics, loopy belief propagation, and Gibbs sampling, are implemented for learning and inference in these models. The proposed models and inference methods are evaluated on news video data.
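A harmonium is a two-layer undirected model whose exact gradient is intractable; contrastive divergence, one of the learning methods listed above, approximates it with a single reconstruction step. As a minimal illustration (a plain binary harmonium/RBM, not the dual- or triple-wing variants of this project; all names are illustrative), one CD-1 parameter update looks like:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cd1_step(v0, W, b, c, lr=0.1):
    """One contrastive-divergence (CD-1) update for a binary harmonium.
    v0: batch of visible vectors (n, V); W: (V, H) weights; b, c: biases."""
    ph0 = sigmoid(v0 @ W + c)                        # P(h=1 | v0), positive phase
    h0 = (rng.random(ph0.shape) < ph0).astype(float) # sample hidden layer
    pv1 = sigmoid(h0 @ W.T + b)                      # one-step reconstruction
    ph1 = sigmoid(pv1 @ W + c)                       # negative phase
    W = W + lr * (v0.T @ ph0 - pv1.T @ ph1) / len(v0)
    b = b + lr * (v0 - pv1).mean(axis=0)
    c = c + lr * (ph0 - ph1).mean(axis=0)
    return W, b, c

def recon_err(v, W, b, c):
    """Mean-field reconstruction error, a common training diagnostic."""
    pv = sigmoid(sigmoid(v @ W + c) @ W.T + b)
    return float(np.mean((v - pv) ** 2))
```

Running this update repeatedly on a small set of binary patterns drives the reconstruction error down, which is the usual sanity check that the approximate gradient is pointing in a useful direction.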

Euclidean Embedding of Author-Word Co-occurrence Data Through Time
by Purnamrita Sarkar and Sajid Siddiqi

Purnamrita Sarkar

We address the problem of embedding entities into Euclidean space over time, based on co-occurrence count data. We model the probability of the data given the (hidden) Euclidean coordinates of the entities. This results in a factored state-space model with real-valued hidden parent nodes and discrete children. An approximation to the log partition function of the observation model makes it conjugate to the Normal distribution, allowing us to formulate the dynamic model as a Kalman filter. Qualitative results on co-occurrences of authors and words in the NIPS corpus show that our model yields efficient and meaningful embeddings of large datasets.
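Once the observation model is made conjugate to the Normal, tracking the hidden coordinates reduces to the standard linear-Gaussian Kalman recursions. A generic predict/update step in that form (a sketch of the filtering machinery only, not the authors' specific observation approximation; names are illustrative):

```python
import numpy as np

def kalman_step(mu, P, y, A, Q, C, R):
    """One predict/update cycle of a linear-Gaussian Kalman filter:
    x_t = A x_{t-1} + N(0, Q),   y_t = C x_t + N(0, R)."""
    # Predict: propagate the state estimate through the dynamics
    mu_p = A @ mu
    P_p = A @ P @ A.T + Q
    # Update: fold in the new observation
    S = C @ P_p @ C.T + R                  # innovation covariance
    K = P_p @ C.T @ np.linalg.inv(S)       # Kalman gain
    mu_new = mu_p + K @ (y - C @ mu_p)
    P_new = (np.eye(len(mu)) - K @ C) @ P_p
    return mu_new, P_new
```

Fed a stream of noisy observations of a fixed state, the posterior mean converges to the true value and the posterior covariance shrinks, which is the behavior the dynamic embedding relies on when tracking coordinates over time.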

Pradeep Ravikumar
Last modified: Wed Feb 8 14:02:09 EST 2006