Date & Time:
Friday, July 27, 2012; 12:00 PM
Bayesian nonparametrics, which uses stochastic processes as prior distributions, is a relatively young and rapidly growing research area in statistics and machine learning. In this talk, we first briefly review completely random measures, a family of pure-jump non-negative stochastic processes that are simple to construct and amenable to posterior computation. We then present nonparametric Bayesian latent variable models based on the beta process, Bernoulli process, gamma process, Poisson process, and, in particular, the negative binomial process. Specifically, for continuous data we discuss dictionary learning with the beta-Bernoulli process and the dependent hierarchical beta process, and for count data we present the beta-negative binomial process and Poisson factor analysis. Furthermore, we discuss how the seemingly disjoint problems of count modeling and mixture modeling can be united under the negative binomial process framework, providing new opportunities to build mixture and hierarchical mixture models with better data fitting, more efficient inference, and more flexible model constructions. We show successful applications of our nonparametric Bayesian latent variable models to image processing, topic modeling, and count data analysis.
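To give a concrete flavor of the beta-Bernoulli process mentioned above, here is a minimal sketch of its standard finite-dimensional approximation: each of K candidate features receives a usage probability drawn from a beta distribution, and each data point switches each feature on independently. The function name and the parameter values (K, alpha, n) are illustrative choices, not anything from the talk itself.

```python
import numpy as np

rng = np.random.default_rng(0)

def beta_bernoulli_sample(n, K=50, alpha=5.0, rng=rng):
    """Finite-K approximation to the beta-Bernoulli process.

    Each of K candidate features gets a usage probability
    pi_k ~ Beta(alpha/K, 1); each of the n data points then turns
    each feature on independently: z[i, k] ~ Bernoulli(pi_k).
    As K grows, the distribution over the binary feature matrix Z
    approaches the Indian buffet process with mass parameter alpha.
    """
    pi = rng.beta(alpha / K, 1.0, size=K)   # per-feature probabilities
    Z = rng.random((n, K)) < pi             # sparse binary feature matrix
    return pi, Z.astype(int)

pi, Z = beta_bernoulli_sample(n=100)
# Most features are used by few data points; a handful are popular.
print(Z.shape, Z.sum(axis=1).mean())
```

In a dictionary-learning model of the kind described in the abstract, row i of Z would indicate which dictionary atoms are active for data point i, with the beta-process prior letting the number of used atoms grow with the data.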
Mingyuan Zhou is currently a Ph.D. candidate in the Department of Electrical and Computer Engineering at Duke University. His research interests are in statistical machine learning. He focuses on developing nonparametric Bayesian methods for both continuous and discrete latent variable models, which have been successfully applied to image processing, topic modeling, and count data analysis. Much of his recent research centers on modeling and inference with the negative binomial distribution. He has given oral presentations at international conferences including NIPS, AISTATS, and ICML. He received his B.Sc. in Acoustics from Nanjing University in 2005 and his M.Eng. in Signal & Information Processing from the Chinese Academy of Sciences in 2008.