A Brief Overview of Nonparametric Bayesian Models


1 A Brief Overview of Nonparametric Bayesian Models
Eurandom
Zoubin Ghahramani
Department of Engineering, University of Cambridge, UK
Also at Machine Learning Department, CMU

2 Parametric vs Nonparametric Models

Parametric models assume some finite set of parameters θ. Given the parameters, future predictions x are independent of the observed data D:
P(x | θ, D) = P(x | θ)
so θ captures everything there is to know about the data. The complexity of the model is therefore bounded even if the amount of data is unbounded, which makes parametric models relatively inflexible.

Nonparametric models assume that the data distribution cannot be defined in terms of such a finite set of parameters, but they can often be defined by assuming an infinite-dimensional θ. Usually we think of θ as a function. The amount of information that θ can capture about the data D can grow as the amount of data grows, which makes nonparametric models more flexible.

3 Why?
- flexibility
- better predictive performance
- more realistic models

4 Outline

Bayesian nonparametrics has many uses. Some modelling goals and examples of associated nonparametric Bayesian models:

Modelling goal: Example process
Distributions on functions: Gaussian process
Distributions on distributions: Dirichlet process, Polya tree
Clustering: Chinese restaurant process, Pitman-Yor process
Hierarchical clustering: Dirichlet diffusion tree, Kingman's coalescent
Sparse latent feature models: Indian buffet process
Survival analysis: Beta process
Distributions on measures: Completely random measures
...

5 Gaussian Processes

A Gaussian process defines a distribution P(f) on functions, where f is a function mapping some input space X to R:
f : X → R.
Let f = (f(x₁), f(x₂), ..., f(xₙ)) be an n-dimensional vector of function values evaluated at n points xᵢ ∈ X. Note that f is a random variable.

Definition: P(f) is a Gaussian process if for any finite subset {x₁, ..., xₙ} ⊂ X, the marginal distribution on that finite subset, P(f), is multivariate Gaussian, and all these marginals are coherent.

We can think of GPs as the extension of multivariate Gaussians to distributions on functions.

6 Samples from Gaussian processes with different covariance functions c(x, x′)
[Figure: panels of sample functions f(x) plotted against x, one panel per choice of c(x, x′).]
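By the definition above, drawing a sample function at a finite set of inputs just means drawing from a multivariate Gaussian whose covariance matrix is built from c. A minimal numpy sketch, assuming a squared-exponential covariance (one of many possible choices of c, not necessarily the one used in the figure):

```python
import numpy as np

def se_cov(x1, x2, lengthscale=1.0, variance=1.0):
    """Squared-exponential covariance c(x, x') = v * exp(-(x - x')^2 / (2 l^2))."""
    d = x1[:, None] - x2[None, :]
    return variance * np.exp(-0.5 * (d / lengthscale) ** 2)

rng = np.random.default_rng(0)
x = np.linspace(-5, 5, 200)                  # finite set of input points
K = se_cov(x, x) + 1e-8 * np.eye(len(x))     # jitter for numerical stability
f = rng.multivariate_normal(np.zeros(len(x)), K, size=3)  # three sample functions
```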

7 Dirichlet Processes

Gaussian processes define a distribution on functions,
f ∼ GP(µ, c)
where µ is the mean function and c is the covariance function. We can think of GPs as infinite-dimensional Gaussians.

Dirichlet processes define a distribution on distributions (a measure on measures),
G ∼ DP(G₀, α)
where α > 0 is a scaling parameter and G₀ is the base measure. We can think of DPs as infinite-dimensional Dirichlet distributions.

Note that both f and G are infinite-dimensional objects.

8 Dirichlet Process

Let Θ be a measurable space, G₀ be a probability measure on Θ, and α a positive real number. For all finite partitions (A₁, ..., A_K) of Θ, G ∼ DP(G₀, α) means that
(G(A₁), ..., G(A_K)) ∼ Dir(αG₀(A₁), ..., αG₀(A_K)).
(Ferguson, 1973)

9 Dirichlet Distribution

The Dirichlet distribution is a distribution on the K-dimensional probability simplex. Let p be a K-dimensional vector such that pⱼ ≥ 0 for all j and Σⱼ pⱼ = 1. Then
P(p | α) = Dir(α₁, ..., α_K) := [Γ(Σⱼ αⱼ) / Πⱼ Γ(αⱼ)] Πⱼ pⱼ^(αⱼ − 1)
where the first term is a normalization constant and E(pⱼ) = αⱼ / Σₖ αₖ.

The Dirichlet is conjugate to the multinomial distribution. Let
c | p ∼ Multinomial(· | p),
that is, P(c = j | p) = pⱼ. Then the posterior is also Dirichlet:
P(p | c = j, α) = P(c = j | p) P(p | α) / P(c = j | α) = Dir(α′)
where α′ⱼ = αⱼ + 1 and α′ₗ = αₗ for all l ≠ j.

(The Gamma function satisfies Γ(x) = (x − 1)Γ(x − 1) = ∫₀^∞ t^(x−1) e^(−t) dt; for integer n, Γ(n) = (n − 1)!.)
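The conjugate update generalizes to many observations in the obvious way: each observation of outcome j adds 1 to αⱼ. A small numpy sketch of this count-adding update (the specific prior and data here are purely illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
alpha = np.array([2.0, 1.0, 1.0])      # prior Dir(2, 1, 1) over K = 3 outcomes

p_true = np.array([0.5, 0.3, 0.2])
counts = np.bincount(rng.choice(3, size=100, p=p_true), minlength=3)

alpha_post = alpha + counts            # conjugate update: add observed counts
post_mean = alpha_post / alpha_post.sum()   # E(p_j) = alpha'_j / sum_k alpha'_k
```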

10 Dirichlet Process

G ∼ DP(G₀, α). OK, but what does it look like? Samples from a DP are discrete with probability one:
G(θ) = Σ_{k=1}^∞ π_k δ_{θ_k}(θ)
where δ_{θ_k}(·) is a Dirac delta at θ_k, and θ_k ∼ G₀(·).

Note: E(G) = G₀. As α → ∞, G looks more like G₀.
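One way to generate the weights π_k is the stick-breaking construction mentioned later in these slides: repeatedly break off Beta(1, α)-distributed fractions of a unit-length stick. A truncated sketch, taking G₀ to be a standard normal (an arbitrary choice for illustration):

```python
import numpy as np

def sample_dp(alpha, K=1000, rng=np.random.default_rng(0)):
    """Truncated stick-breaking draw from DP(G0, alpha) with G0 = N(0, 1)."""
    v = rng.beta(1.0, alpha, size=K)                 # stick-breaking fractions
    pi = v * np.cumprod(np.concatenate(([1.0], 1.0 - v[:-1])))  # weights pi_k
    theta = rng.normal(size=K)                       # atom locations from G0
    return pi, theta

pi, theta = sample_dp(alpha=10.0)
# G is (approximately) the discrete measure sum_k pi[k] * delta_{theta[k]}
```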

11 Dirichlet Process: Conjugacy

If the prior on G is a DP, G ∼ DP(G₀, α), i.e. P(G) = DP(G | G₀, α), and you observe θ with P(θ | G) = G(θ), then the posterior is also a DP:
P(G | θ) = P(θ | G) P(G) / P(θ) = DP( (α/(α + 1)) G₀ + (1/(α + 1)) δ_θ , α + 1 ).
Generalization for n observations:
P(G | θ₁, ..., θₙ) = DP( (α/(α + n)) G₀ + (1/(α + n)) Σ_{i=1}^n δ_{θᵢ} , α + n ).
This is analogous to the Dirichlet being conjugate to multinomial observations.

12 Dirichlet Process

Blackwell and MacQueen's (1973) urn representation: let G ∼ DP(G₀, α) and θ | G ∼ G(·). Integrating out G,
θₙ | θ₁, ..., θₙ₋₁, G₀, α ∼ (α/(n − 1 + α)) G₀(·) + (1/(n − 1 + α)) Σ_{j=1}^{n−1} δ_{θⱼ}(·),
i.e. the predictive P(θₙ | θ₁, ..., θₙ₋₁, G₀, α) is obtained from ∫ dG Π_{j=1}^n P(θⱼ | G) P(G | G₀, α).

The model exhibits a clustering effect: values drawn often before are more likely to be drawn again.
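A minimal sketch of the urn scheme, again taking G₀ to be a standard normal for illustration: with probability α/(n − 1 + α) draw a fresh value from G₀; otherwise copy one of the previous draws uniformly at random.

```python
import numpy as np

def polya_urn(n, alpha, rng=np.random.default_rng(0)):
    """Draw theta_1..theta_n from the Blackwell-MacQueen urn with G0 = N(0, 1)."""
    thetas = []
    for i in range(n):
        if rng.random() < alpha / (i + alpha):   # i = number of previous draws
            thetas.append(rng.normal())          # new value from the base measure
        else:
            thetas.append(rng.choice(thetas))    # repeat an existing value
    return np.array(thetas)

print(np.unique(polya_urn(50, alpha=2.0)).size)  # typically far fewer than 50
```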

13 Relationship between DPs and CRPs

- The DP is a distribution on distributions.
- The DP results in discrete distributions, so if you draw n points you are likely to get repeated values.
- A DP therefore induces a partitioning of the n points, e.g. (1 3 4)(2 5) ⟺ θ₁ = θ₃ = θ₄ ≠ θ₂ = θ₅.
- The Chinese restaurant process (CRP) defines the corresponding distribution on partitions.
- Although the CRP is a sequential process, the distribution on θ₁, ..., θₙ is exchangeable (i.e. invariant to permuting the indices of the θs): e.g. P(θ₁, θ₂, θ₃, θ₄) = P(θ₂, θ₄, θ₃, θ₁).

14 Chinese Restaurant Process

The CRP generates samples from the distribution on partitions induced by a DPM. [Figure: customers seated at tables T₁, T₂, T₃, ...]

Generating from a CRP:
customer 1 enters the restaurant and sits at table 1; K = 1, n = 1, n₁ = 1
for n = 2, ...:
    customer n sits at table k with probability n_k / (n − 1 + α), for k = 1, ..., K
    or at table K + 1 with probability α / (n − 1 + α) (new table)
    if a new table was chosen then K ← K + 1
where n_k is the number of customers at table k. The process has a "rich get richer" property. (Aldous, 1985; Pitman, 2002)
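A direct sketch of this generative process in numpy, returning the table assignment of each customer and the table sizes:

```python
import numpy as np

def crp(n_customers, alpha, rng=np.random.default_rng(0)):
    """Sample a partition of n_customers points from a CRP(alpha)."""
    assignments = [0]            # customer 1 sits at table 1 (index 0)
    counts = [1]                 # n_k: number of customers per table
    for n in range(2, n_customers + 1):
        # existing tables proportional to n_k, new table proportional to alpha
        probs = np.array(counts + [alpha]) / (n - 1 + alpha)
        table = rng.choice(len(probs), p=probs)
        if table == len(counts):
            counts.append(1)     # open a new table
        else:
            counts[table] += 1
        assignments.append(table)
    return assignments, counts

assignments, counts = crp(100, alpha=1.0)
print(counts)   # a few large tables, several small ones ("rich get richer")
```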

15 Dirichlet Processes: Big Picture

There are many ways to derive the Dirichlet process:
- Dirichlet distribution
- Urn model
- Chinese restaurant process
- Stick breaking
- Gamma process

DP: a distribution on distributions.
Dirichlet process mixture (DPM): a mixture model with infinitely many components, where the parameters of each component are drawn from a DP. Useful for clustering; the assignments of points to clusters follow a CRP.

16 Samples from a Dirichlet Process Mixture of Gaussians

[Figure: four panels of samples for increasing numbers of points N; panel labels partially lost in extraction.]

Notice that more structure (clusters) appears as you draw more points. (Figure inspired by Neal.)
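The generative process behind such figures can be sketched by combining the CRP above with a Gaussian base measure over cluster means: each point joins a cluster via the CRP, each new cluster draws a mean from G₀, and the point is then drawn from a Gaussian around its cluster mean. All specific parameter values here are illustrative assumptions:

```python
import numpy as np

def sample_dpm_gaussians(n, alpha=1.0, base_scale=5.0, noise=0.5,
                         rng=np.random.default_rng(0)):
    """Generate n 2-D points from a DP mixture of Gaussians."""
    means, counts, points = [], [], []
    for i in range(n):
        probs = np.array(counts + [alpha]) / (i + alpha)    # CRP assignment
        k = rng.choice(len(probs), p=probs)
        if k == len(means):
            means.append(rng.normal(scale=base_scale, size=2))  # new mean ~ G0
            counts.append(0)
        counts[k] += 1
        points.append(rng.normal(means[k], noise))   # emit point around its mean
    return np.array(points)

X = sample_dpm_gaussians(1000)   # more clusters appear as n grows
```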

17 Approximate Inference in DPMs

- Gibbs sampling (e.g. Escobar and West, 1995; Neal, 2000; Rasmussen, 2000)
- Expectation propagation (Minka and Ghahramani, 2003)
- Variational approximation (Blei and Jordan, 2005)
- Hierarchical clustering (Heller and Ghahramani, 2005)

18 Hierarchical Clustering

19 Dirichlet Diffusion Trees (DFT) (Neal, 2003)

In a DPM, the parameters of one mixture component are independent of those of another component; this lack of structure is potentially undesirable. A DFT is a generalization of DPMs with hierarchical structure between components.

To generate from a DFT, we consider θ taking a random walk according to a Brownian motion (Gaussian diffusion) process:
- θ₁(t) is a Gaussian diffusion process starting at the origin (θ₁(0) = 0), run for unit time.
- θ₂(t) also starts at the origin and follows θ₁, but diverges at some time τ_d, at which point the path followed by θ₂ becomes independent of θ₁'s path.
- a(t) is a divergence (hazard) function, e.g. a(t) = 1/(1 − t). For small dt,
P(θ diverges in (t, t + dt)) = a(t) dt / m
where m is the number of previous points that have followed this path.
- If θᵢ reaches a branch point between two paths, it picks a branch in proportion to the number of points that have followed that path.
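With a(t) = 1/(1 − t) the cumulative hazard is A(t) = −log(1 − t), so the divergence time can be sampled in closed form by inversion: draw E ∼ Exp(1) and solve (A(t) − A(t₀))/m = E. A small sketch of just this step (not a full tree sampler):

```python
import numpy as np

def divergence_time(t0, m, rng=np.random.default_rng(0)):
    """Sample when a new path diverges, starting at time t0 on a segment
    previously followed by m points, with hazard a(t) = 1 / (1 - t)."""
    e = rng.exponential()   # E ~ Exp(1)
    # Solve -log(1 - t) = -log(1 - t0) + m * e for t:
    return 1.0 - (1.0 - t0) * np.exp(-m * e)
```

Note that the divergence rate a(t)/m shrinks as m grows, so paths followed by many previous points are abandoned later, producing the hierarchical structure.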

20 Dirichlet Diffusion Trees (DFT)
Generating from a DFT: [Figure from Neal, 2003.]

21 Dirichlet Diffusion Trees (DFT)
Some samples from DFT priors: [Figure from Neal, 2003.]

22 Kingman's Coalescent (Kingman, 1982)

A model for genealogies. Working backwards in time from t = 0, generate a tree with n individuals at the leaves by merging lineages backwards in time:
- Every pair of lineages merges independently with rate 1.
- The time of the first merge among n individuals is t₁ ∼ Exp(n(n − 1)/2), and so on for subsequent merges.
- With probability one this generates a binary tree in which all lineages have merged by t = −∞.

The coalescent results in the limit n → ∞; it is infinitely exchangeable over individuals and has uniform marginals over tree topologies. The coalescent has been used as the basis of a hierarchical clustering algorithm by Teh, Daumé III, and Roy (2007).

[Figure: a coalescent tree over leaves x₁, ..., x₄ with internal nodes y_{1,2}, y_{3,4}, y_{1,2,3,4} at merge times t₁, t₂, t₃, and the induced sequence of partitions π(t) = {{1,2,3,4}}, {{1,2},{3,4}}, {{1},{2},{3,4}}, {{1},{2},{3},{4}}; remaining panels show samples.]
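Simulating the merge times is straightforward: while k lineages remain, the next merge (further back in time) happens after an Exp(k(k − 1)/2)-distributed interval, and the merging pair is chosen uniformly at random. A minimal sketch:

```python
import numpy as np

def kingman_coalescent(n, rng=np.random.default_rng(0)):
    """Simulate merge events for n individuals; returns (time, pair, parent)."""
    lineages = list(range(n))    # start with n singleton lineages at t = 0
    t, events = 0.0, []
    next_id = n
    while len(lineages) > 1:
        k = len(lineages)
        t -= rng.exponential(scale=2.0 / (k * (k - 1)))  # go backwards in time
        i, j = rng.choice(k, size=2, replace=False)      # uniformly random pair
        a, b = lineages[i], lineages[j]
        for idx in sorted((i, j), reverse=True):
            lineages.pop(idx)
        lineages.append(next_id)                         # merged ancestor node
        events.append((t, (a, b), next_id))
        next_id += 1
    return events
```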

23 Hierarchical Dirichlet Processes (HDP) (Teh et al, 2005)

Assume your data are divided into J groups. You assume there are clusters within each group, but you also believe these clusters are shared between groups (i.e. data points in different groups can belong to the same cluster).

In an HDP there is a common DP,
G₀ | H, γ ∼ DP(H, γ),
which forms the base measure for a draw from a DP within each group:
G_j | G₀, α ∼ DP(G₀, α).

[Graphical model: γ and H → G₀; G₀ and α → G_j → φ_{ji} → x_{ji}, with plates over i = 1, ..., n_j and j = 1, ..., J.]

24 Nonparametric Time Series Models

25 Infinite Hidden Markov Models (iHMMs)

Hidden Markov models (HMMs) are widely used sequence models for speech recognition, bioinformatics, text modelling, video monitoring, etc. [Graphical model: a Markov chain of states S₁ → S₂ → S₃ → ... → S_T with emissions Y₁, Y₂, Y₃, ..., Y_T.]

HMMs can be thought of as time-dependent mixture models: given s_{t−1} = k, the next state s_t is drawn from π_k(·). In an HMM with K states, the transition matrix has K × K elements. We want to let K → ∞.

Infinite HMMs can be derived from the HDP:
β | γ ∼ Stick(γ)   (base distribution over states)
π_k | α, β ∼ DP(α, β)   (transition parameters for state k = 1, 2, ...)
θ_k | H ∼ H(·)   (emission parameters for state k = 1, 2, ...)
s_t | s_{t−1}, (π_k)_{k=1}^∞ ∼ π_{s_{t−1}}(·)   (transition)
y_t | s_t, (θ_k)_{k=1}^∞ ∼ p(· | θ_{s_t})   (emission)

(Beal, Ghahramani, and Rasmussen, 2002; Teh et al., 2005)
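A truncated sketch of this generative process: approximate β ∼ Stick(γ) with K sticks, draw each transition row π_k ∼ DP(α, β) as a finite Dirichlet(α·β), and simulate the chain. The Gaussian emission model and all parameter values here are illustrative assumptions:

```python
import numpy as np

def sample_ihmm(T, gamma=4.0, alpha=2.0, K=50, rng=np.random.default_rng(0)):
    """Truncated draw from the HDP-HMM prior, plus a simulated sequence."""
    v = rng.beta(1.0, gamma, size=K)   # stick-breaking: beta ~ Stick(gamma)
    beta = v * np.cumprod(np.concatenate(([1.0], 1.0 - v[:-1])))
    beta /= beta.sum()                 # renormalize after truncation
    pi = rng.dirichlet(alpha * beta, size=K)   # each row pi_k ~ DP(alpha, beta)
    theta = rng.normal(scale=3.0, size=K)      # emission means theta_k ~ H
    s = np.empty(T, dtype=int)
    s[0] = rng.choice(K, p=beta)
    for t in range(1, T):
        s[t] = rng.choice(K, p=pi[s[t - 1]])   # transition
    y = rng.normal(theta[s], 1.0)              # Gaussian emissions
    return s, y

s, y = sample_ihmm(500)
print(np.unique(s).size)   # number of states actually visited
```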

26 Infinite Hidden Markov Models

We let the number of states K → ∞. [Graphical model: S₁ → S₂ → S₃ → ... → S_T with emissions Y₁, ..., Y_T. Experiment: changepoint detection.]

- Introduced in Beal, Ghahramani and Rasmussen (2002).
- Teh, Jordan, Beal and Blei (2005) showed that iHMMs can be derived from hierarchical Dirichlet processes, and provided a more efficient Gibbs sampler.
- We have recently derived a much more efficient sampler based on dynamic programming (Van Gael, Saatci, Teh, and Ghahramani, 2008).

27 Completely Random Measures (Kingman, 1967)

- Measurable space: Θ with σ-algebra Ω.
- Measure: a function µ : Ω → [0, ∞] assigning to each measurable set a non-negative real number.
- Random measure: measures drawn from some distribution on measures, µ ∼ P.
- Completely random measure (CRM): the values that µ takes on disjoint subsets are independent: µ(A) ⊥ µ(B) if A ∩ B = ∅.

CRMs can be represented as the sum of a nonrandom measure, an atomic measure with fixed atoms but random masses, and an atomic measure with random atoms and masses:
µ = µ₀ + Σ_{i=1}^{N or ∞} u_i δ_{φ_i} + Σ_{i=1}^{M or ∞} u′_i δ_{φ′_i}.
We can write µ ∼ CRM(µ₀, Λ, {φ_i, F_i}), where the fixed-atom masses satisfy u_i ∼ F_i and the random atoms {u′_i, φ′_i} are drawn from a Poisson process on (0, ∞] × Θ with rate measure Λ, called the Lévy measure.

Examples: gamma process, beta process (Hjort, 1990), stable-beta process (Teh and Görür, 2009).

28 Beta Processes (Hjort, 1990)

Recall that CRMs decompose as µ = µ₀ + Σ u_i δ_{φ_i} + Σ u′_i δ_{φ′_i}. For a beta process the nonrandom and fixed-atom parts vanish: µ ∼ CRM(0, Λ, {}), where the {u_i, φ_i} are drawn from a Poisson process on (0, 1] × Θ with rate (Lévy) measure
Λ(du dθ) = α c u^{−1} (1 − u)^{c−1} du H(dθ).

[Figure: a draw from a beta process, atom masses u plotted against locations θ.]
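For the special case c = 1, the atom masses can be generated by the stick-breaking construction of Teh, Görür and Ghahramani (2007), cited below: the decreasing masses are u_(k) = Π_{i=1}^k v_i with v_i ∼ Beta(α, 1). A truncated sketch, assuming for illustration that H is uniform on [0, 1]:

```python
import numpy as np

def sample_beta_process(alpha, K=200, rng=np.random.default_rng(0)):
    """Truncated draw from a beta process with c = 1 and base measure
    H = Uniform(0, 1), via the stick-breaking scheme for the IBP."""
    v = rng.beta(alpha, 1.0, size=K)
    u = np.cumprod(v)            # decreasing atom masses u_(1) > u_(2) > ...
    theta = rng.uniform(size=K)  # atom locations drawn from H
    return u, theta

u, theta = sample_beta_process(alpha=5.0)
# mu is (approximately) sum_k u[k] * delta_{theta[k]}
```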

29 Selected References

Gaussian Processes:
- O'Hagan, A. (1978) Curve fitting and optimal design for prediction (with discussion). Journal of the Royal Statistical Society B, 40(1):1-42.
- Rasmussen, C.E. and Williams, C.K.I. (2006) Gaussian Processes for Machine Learning. MIT Press.

Dirichlet Processes, Chinese Restaurant Processes, and related work:
- Ferguson, T. (1973) A Bayesian analysis of some nonparametric problems. Annals of Statistics, 1(2):209-230.
- Blackwell, D. and MacQueen, J. (1973) Ferguson distributions via Polya urn schemes. Annals of Statistics, 1:353-355.
- Aldous, D. (1985) Exchangeability and related topics. In École d'Été de Probabilités de Saint-Flour XIII 1983, Springer, Berlin, pp. 1-198.

Hierarchical Dirichlet Processes and Infinite Hidden Markov Models:
- Beal, M.J., Ghahramani, Z., and Rasmussen, C.E. (2002) The infinite hidden Markov model. In T.G. Dietterich, S. Becker, and Z. Ghahramani (eds.) Advances in Neural Information Processing Systems, vol. 14, pp. 577-584. Cambridge, MA: MIT Press.
- Teh, Y.W., Jordan, M.I., Beal, M.J., and Blei, D.M. (2005) Hierarchical Dirichlet processes. Journal of the American Statistical Association.

30 - van Gael, J., Saatci, Y., Teh, Y.W., and Ghahramani, Z. (2008) Beam sampling for the infinite hidden Markov model. In International Conference on Machine Learning (ICML 2008).

Dirichlet Process Mixtures:
- Antoniak, C.E. (1974) Mixtures of Dirichlet processes with applications to Bayesian nonparametric problems. Annals of Statistics, 2:1152-1174.
- Escobar, M.D. and West, M. (1995) Bayesian density estimation and inference using mixtures. Journal of the American Statistical Association, 90:577-588.
- Neal, R.M. (2000) Markov chain sampling methods for Dirichlet process mixture models. Journal of Computational and Graphical Statistics, 9:249-265.
- Rasmussen, C.E. (2000) The infinite Gaussian mixture model. In Advances in Neural Information Processing Systems 12. Cambridge, MA: MIT Press.
- Blei, D.M. and Jordan, M.I. (2005) Variational methods for Dirichlet process mixtures. Bayesian Analysis.
- Minka, T.P. and Ghahramani, Z. (2003) Expectation propagation for infinite mixtures. NIPS'03 Workshop on Nonparametric Bayesian Methods and Infinite Models.
- Heller, K.A. and Ghahramani, Z. (2005) Bayesian hierarchical clustering. In Twenty-Second International Conference on Machine Learning (ICML 2005).

Dirichlet Diffusion Trees and Kingman's Coalescent:
- Neal, R.M. (2003) Density modeling and clustering using Dirichlet diffusion trees. In J.M. Bernardo et al. (eds.) Bayesian Statistics 7.
- Kingman, J.F.C. (1982) The coalescent. Stochastic Processes and their Applications, 13(3):235-248.

31 - Teh, Y.W., Daumé III, H., and Roy, D. (2007) Bayesian agglomerative clustering with coalescents. In Advances in Neural Information Processing Systems (NIPS), Vancouver, Canada.

Indian Buffet Processes:
- Griffiths, T.L. and Ghahramani, Z. (2006) Infinite latent feature models and the Indian buffet process. In Advances in Neural Information Processing Systems 18. Cambridge, MA: MIT Press.
- Teh, Y.W., Görür, D., and Ghahramani, Z. (2007) Stick-breaking construction for the Indian buffet process. In Eleventh International Conference on Artificial Intelligence and Statistics (AISTATS 2007), San Juan, Puerto Rico.
- Thibaux, R. and Jordan, M.I. (2007) Hierarchical beta processes and the Indian buffet process. In Eleventh International Conference on Artificial Intelligence and Statistics (AISTATS 2007), San Juan, Puerto Rico.

Other:
- Hjort, N.L. (1990) Nonparametric Bayes estimators based on beta processes in models for life history data. Annals of Statistics, 18:1259-1294.
- Kingman, J.F.C. (1967) Completely random measures. Pacific Journal of Mathematics, 21(1):59-78.
- Teh, Y.W. and Görür, D. (2009) Indian buffet processes with power-law behavior. In Advances in Neural Information Processing Systems.
