I am interested in statistical machine learning, parallel algorithms, scalable machine learning, computer vision, computational biology, nonparametric methods, natural language processing, and analysis of networks.

For a publication list, please see my CV.

Below are a few videos showing results for a project on unsupervised object tracking and modeling.

Unsupervised Detection and Tracking of Multiple Objects with Dependent Dirichlet Process Mixtures.

This project develops a new technique for the unsupervised detection and tracking of arbitrary objects in videos. It is intended to reduce the need for detection or localization methods tailored to specific object types and to serve as a general framework applicable to videos with varied objects, backgrounds, and film qualities. The technique uses a dependent Dirichlet process mixture (DDPM) based on the Generalized Pólya Urn construction (GPUDDPM) to model image pixel data, which can be extracted in a general manner from the regions of a video that represent objects.

The video to the left shows a specific implementation of the model using spatial and color pixel data extracted via frame differencing from a PETS2000 Workshop video dataset; inference is performed via an MCMC sampling procedure. The method has, without modification, detected and tracked objects with diverse physical characteristics moving over non-uniform backgrounds and through occlusion.
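The "spatial and color pixel data" above can be obtained quite simply. The sketch below is a minimal, illustrative version of frame-differencing feature extraction, not the project's actual code; the function name and the difference threshold are assumptions.

```python
import numpy as np

def extract_pixel_features(frame_prev, frame_curr, thresh=30):
    """Extract (x, y, r, g, b) features from pixels that change between
    consecutive frames, via simple frame differencing.

    frame_prev, frame_curr: HxWx3 uint8 RGB arrays.
    Returns an (N, 5) float array holding the spatial position and color
    of each changed pixel. The threshold value is an assumption.
    """
    # Sum of absolute per-channel differences at each pixel
    diff = np.abs(frame_curr.astype(int) - frame_prev.astype(int)).sum(axis=2)
    ys, xs = np.nonzero(diff > thresh)          # pixels that changed
    colors = frame_curr[ys, xs].astype(float)   # (N, 3) RGB values
    return np.column_stack([xs, ys, colors])    # (N, 5) feature rows
```

Each row of the returned array is one mixture-model observation: two spatial coordinates plus three color values.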

Sequential Monte Carlo (Particle Filter) for the Dependent Dirichlet Process Object Tracker

This is a video recording of a sequential Monte Carlo (particle filter) sampling algorithm for the dependent Dirichlet process mixture model; inference in this model provides a method for unsupervised detection and tracking of arbitrary objects. Above, hard-to-detect ants are localized and tracked without any explicit detection criteria.
Upper right figure: Video data.
Lower right figure:

Video with the current sample shown. Blue lines denote the objects' previous paths. Gibbs sampling is used to iteratively sample at each time step; the final Gibbs sample for each time is shown in red.

Left figure:

Extraction data, video, and sample at current and all previous time steps are shown.
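For readers unfamiliar with sequential Monte Carlo, the sketch below shows the generic propagate/weight/resample cycle of a bootstrap particle filter on a toy 1-D random-walk state. It is only an illustration of the sampling scheme; the actual tracker runs this cycle over the dependent Dirichlet process mixture, not this toy model, and all names and noise parameters here are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def particle_filter(observations, n_particles=500,
                    process_std=1.0, obs_std=2.0):
    """Minimal bootstrap particle filter for a 1-D random-walk state.

    Each step: propagate particles through the dynamics, weight them by
    the observation likelihood, record the weighted-mean estimate, and
    resample particles in proportion to their weights.
    """
    particles = rng.normal(observations[0], obs_std, n_particles)
    estimates = []
    for z in observations:
        # Propagate: random-walk dynamics
        particles = particles + rng.normal(0.0, process_std, n_particles)
        # Weight: Gaussian observation likelihood
        w = np.exp(-0.5 * ((z - particles) / obs_std) ** 2)
        w /= w.sum()
        estimates.append(float(np.sum(w * particles)))
        # Resample: multinomial resampling by weight
        particles = rng.choice(particles, n_particles, p=w)
    return estimates
```

The advantage of this sequential scheme over batch MCMC is that each new video frame only requires updating the particle set, rather than re-running inference over the whole sequence.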

TC-MAT: the T Cell Motility Analysis Tool

The T Cell Motility Analysis Tool (TC-MAT) is an open-source software package for detecting, tracking, extracting features from, and analyzing T cell behavior and morphology in time-lapse video microscopy images. TC-MAT aims to improve the accuracy of T cell detection and tracking: detection uses a method based on the circular Hough transform, and tracking proceeds in two steps, first identifying subtracks (using a modified nearest-neighbor algorithm) and then joining these subtracks into full tracks (using the Hungarian algorithm). The goal is robust detection of T cells in noisy, low-contrast Differential Interference Contrast (DIC) microscopy images, and tracking of quickly moving cells in the low-frame-rate videos common in T cell motility studies.
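The Hungarian-algorithm step can be illustrated with a small sketch: given cell positions on existing tracks and new detections, solve the assignment problem on pairwise distances. This is a minimal illustration only, not TC-MAT's actual interface; the function name and the distance gate are assumptions, and SciPy's linear_sum_assignment stands in for the Hungarian algorithm.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def link_detections(tracks_xy, detections_xy, max_dist=50.0):
    """Assign new detections to existing tracks by solving the
    assignment problem on pairwise Euclidean distances.

    tracks_xy: (T, 2) array of last-known track positions.
    detections_xy: (D, 2) array of new detection positions.
    Returns a list of (track_idx, detection_idx) pairs; assignments
    farther apart than max_dist are rejected.
    """
    # (T, D) matrix of pairwise distances as assignment costs
    cost = np.linalg.norm(
        tracks_xy[:, None, :] - detections_xy[None, :, :], axis=2)
    rows, cols = linear_sum_assignment(cost)
    return [(r, c) for r, c in zip(rows, cols) if cost[r, c] <= max_dist]
```

Solving the assignment globally, rather than greedily matching each track to its nearest detection, avoids the swapped-identity errors that nearest-neighbor matching produces when cells pass close to one another.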

The video to the left shows tracking results for a typical time lapse microscopy video of T cells.

Pre-processed Tracking Data
Click here to download pre-processed tracking data (position and color features) used in the dependent Dirichlet process mixtures tracking project (view README for instructions, zip size: 934K).