Tom Stepleton 31 Jan

This Wednesday I’ll be presenting a tutorial introduction to Dirichlet Process Mixture Models (DPMs), the flexible, nonparametric Bayesian method for clustering with a variable number of clusters (among other things). I’ll introduce a number of terms and metaphors people use to discuss DPMs, and I’ll derive the so-called Chinese Restaurant Process representation and the Markov chain Monte Carlo methods built on it for DPM inference. Finally, I’ll describe one or two applications to computer vision, including recent work that integrates DPMs and MRFs for smooth image segmentation.
This talk comes with a guarantee: once it’s done, you’ll be able to go back to your office or cube and implement a Dirichlet Process Mixture Model on your own—or your money back!
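As a taste of the material, here is a minimal sketch of the Chinese Restaurant Process prior itself: customer i sits at an existing table with probability proportional to that table's occupancy, or starts a new table with probability proportional to a concentration parameter. (The function and parameter names below are illustrative, not from the talk.)

```python
import random

def crp_partition(n, alpha, seed=0):
    """Sample a random partition of n customers from a CRP with
    concentration parameter alpha."""
    rng = random.Random(seed)
    tables = []       # tables[k] = number of customers at table k
    assignments = []  # assignments[i] = table index of customer i
    for i in range(n):
        # Customer i joins table k w.p. tables[k] / (i + alpha),
        # or a new table w.p. alpha / (i + alpha).
        weights = tables + [alpha]
        r = rng.random() * (i + alpha)
        k, acc = 0, weights[0]
        while r > acc:
            k += 1
            acc += weights[k]
        if k == len(tables):
            tables.append(1)   # open a new table (a new cluster)
        else:
            tables[k] += 1
        assignments.append(k)
    return assignments, tables

assignments, tables = crp_partition(100, alpha=1.0)
print(len(tables))  # the expected number of tables grows like alpha * log(n)
```

Note the "rich get richer" dynamic: large tables attract more customers, which is what lets the model infer a cluster count from data rather than fixing one in advance.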

I will cover topics from some of the following papers—the first is a terrific reference, and the rest can serve as a “seed bibliography” on the subject:

R. Neal, “Markov Chain Sampling Methods for Dirichlet Process Mixture Models.” J. Computational and Graphical Statistics, 2000.
P. Orbanz and J.M. Buhmann, “Smooth Image Segmentation by Nonparametric Bayesian Inference.” ECCV 2006.
X. Zhu, Z. Ghahramani, and J. Lafferty, “Time-Sensitive Dirichlet Process Mixture Models.” Technical Report CMU-CALD-05-104, May 2005.
M.J. Beal, Z. Ghahramani, and C.E. Rasmussen, “The Infinite Hidden Markov Model.” NIPS 2001.
E.B. Sudderth, A. Torralba, W.T. Freeman, and A.S. Willsky, “Describing Visual Scenes using Transformed Dirichlet Processes.” NIPS 2005.
D.M. Blei and M.I. Jordan, “Variational Methods for the Dirichlet Process.” ICML 2004.
(and for background) D.M. Blei, A.Y. Ng, and M.I. Jordan, “Latent Dirichlet Allocation.” J. Machine Learning Research 3:993–1022, 2003.