Bayesian nonparametric latent feature models


Indian buffet process, beta process, and related models
François Caron, Department of Statistics, Oxford
Applied Bayesian Statistics Summer School, Como, Italy, June 2014

Outline
- Introduction
- Indian buffet process
- A parametric beta-Bernoulli model
- Beta-Bernoulli process
- Inference
- Stable Indian buffet process
- Beyond the Indian buffet process

Introduction

Clustering: cluster/partition a set of items i = 1, ..., n into clusters.

Formally, a random partition is

\pi_n = \{A_{n,1}, \dots, A_{n,K_n}\},

where the A_{n,j}, j = 1, ..., K_n, are non-empty, non-overlapping subsets of [n] := {1, ..., n} with \cup_{j=1}^{K_n} A_{n,j} = [n]. The A_{n,j} are the clusters and K_n ≤ n is the number of clusters.
Example: π_6 = {{1, 4, 5}, {2, 3}, {6}}.

Nonparametric approach: K_n can increase unboundedly with the number of items n.
Exchangeable random partition: the distribution is invariant with respect to any permutation of [n], e.g. P({{1, 2}, {3}}) = P({{2, 3}, {1}}) = P({{1, 3}, {2}}); the labelling/ordering of the items is of no importance.
The Chinese restaurant process is an example of a generative process for an exchangeable partition.

Latent feature models

Set of objects i = 1, ..., n; each object possesses a set of features/attributes, shared amongst objects.
Example with five images: Image 1: no feature; Image 2: Tree, Human; Image 3: Human; Image 4: Tree, Human; Image 5: Road, Animal.

Application: dynamic state-space models, i.e. a collection of time series with shared dynamical behaviours [Fox et al., 2009].

Application: collaborative filtering, i.e. predicting the missing entries of a user/item matrix from a subset of its entries. Low-rank assumption: the matrix can be decomposed with a small number of latent features (user/feature association matrix) [Meeds et al., 2007].

Random feature allocation: represent the feature assignments as a multiset of [n] = {1, ..., n},

f_n = \{A_{n,1}, \dots, A_{n,K_n}\},

where the A_{n,j}, j = 1, ..., K_n, are non-empty (and possibly overlapping) subsets of [n]; A_{n,j} is the set of objects sharing feature j.
Example, for the five images above: f_5 = {{2, 3, 4}, {2, 4}, {5}, {5}} [Broderick et al., 2013a].

Multisets are often represented graphically by a binary matrix (objects as rows, features as columns). Beware that the feature labelling does not matter: any column permutation of a binary matrix representing

f_3 = {{1, 2, 3}, {1, 3}, {1, 2}, {2}, {2, 3}, {3}, {3}}

encodes the same multiset.

Nonparametric approach: the number of features K_n can increase unboundedly with n.
Exchangeable latent feature model: the distribution of f_n is invariant with respect to any permutation of [n], e.g.

Pr({{2, 3, 4}, {2, 4}, {5}, {5}}) = Pr({{3, 4, 5}, {3, 5}, {1}, {1}}) = Pr({{σ(2), σ(3), σ(4)}, {σ(2), σ(4)}, {σ(5)}, {σ(5)}})

for any permutation σ of {1, 2, 3, 4, 5}.

Indian buffet process

Generative model for multisets, with a single parameter α > 0:
- the first customer picks K^+_1 ~ Poisson(α) dishes;
- each subsequent customer i = 2, 3, ...
  - chooses each dish j previously chosen m_{i-1,j} times with probability m_{i-1,j}/i,
  - and picks an additional set of K^+_i ~ Poisson(α/i) new dishes.
Example: f_3 = {{1, 2, 3}, {1, 3}, {1, 2}, {2}, {2, 3}, {3}, {3}}.
[Griffiths and Ghahramani, 2005, Griffiths and Ghahramani, 2011]

[Figure: binary matrices (objects by features) drawn from the IBP for increasing values of α.]
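A minimal simulation sketch of this generative process (the function name and parameter values are ours, not from the slides):

```python
import numpy as np

def sample_ibp(n, alpha, seed=None):
    """Draw a binary object-by-feature matrix from the one-parameter IBP."""
    rng = np.random.default_rng(seed)
    counts = []                                   # m_j: times dish j has been taken
    rows = []
    for i in range(1, n + 1):
        # customer i takes existing dish j with probability m_j / i
        chosen = [j for j, m in enumerate(counts) if rng.random() < m / i]
        # and a Poisson(alpha / i) number of brand-new dishes
        k_new = rng.poisson(alpha / i)
        chosen += list(range(len(counts), len(counts) + k_new))
        counts += [0] * k_new
        for j in chosen:
            counts[j] += 1
        rows.append(chosen)
    Z = np.zeros((n, len(counts)), dtype=int)
    for i, chosen in enumerate(rows):
        Z[i, chosen] = 1
    return Z

Z = sample_ibp(n=20, alpha=3.0, seed=0)
print(Z.shape[1], Z.sum(axis=1))   # total nb of dishes; each customer's Poisson(alpha) degree
```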

Rich-get-richer process: the more popular a dish, the more likely it is to be chosen by new customers. New dishes can always be picked as new customers arrive, but at a decreasing rate α/i.
- The number of features/dishes after n customers follows a Poisson distribution with rate α \sum_{i=1}^n 1/i ≈ α log(n).
- The number of dishes picked by each customer (the degree of a customer) follows Poisson(α).
- The degree distribution of the features is heavy-tailed.

[Figure: empirical degree distributions of the objects and of the features under the IBP.]

Probability of a multiset. Consider f_n = {A_{n,1}, ..., A_{n,K_n}} with m_{n,j} = |A_{n,j}|. Let {Ã_{n,1}, ..., Ã_{n,K̃_n}} be the set of unique values in f_n, and κ_1, ..., κ_{K̃_n} their multiplicities. Then

Pr(f_n) = \frac{\alpha^{K_n}}{\prod_{h=1}^{\tilde{K}_n} \kappa_h!} \, e^{-\alpha \sum_{i=1}^{n} 1/i} \, \prod_{j=1}^{K_n} \frac{(m_{n,j}-1)! \, (n-m_{n,j})!}{n!}.

This probability does not depend on the ordering of the customers: the IBP is an exchangeable latent feature model.

How can the IBP be derived?
- As the limit of a parametric beta-Bernoulli model.
- From completely random measures.
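The multiset probability above is easy to evaluate numerically; a small sketch (our code) that also illustrates the exchangeability, since permuting the objects leaves Pr(f_n) unchanged:

```python
import numpy as np
from scipy.special import gammaln

def ibp_log_prob(Z, alpha):
    """log Pr(f_n) of the multiset encoded by the binary matrix Z under the IBP."""
    n = Z.shape[0]
    Z = Z[:, Z.sum(axis=0) > 0]                           # keep non-empty features only
    m = Z.sum(axis=0)                                     # popularities m_{n,j}
    K = Z.shape[1]
    _, kappa = np.unique(Z, axis=1, return_counts=True)   # multiplicities kappa_h
    H_n = np.sum(1.0 / np.arange(1, n + 1))               # harmonic number sum_i 1/i
    return (K * np.log(alpha) - gammaln(kappa + 1).sum() - alpha * H_n
            + np.sum(gammaln(m) + gammaln(n - m + 1) - gammaln(n + 1)))

Z = np.array([[1, 1, 0], [1, 0, 1], [0, 1, 1]])
print(ibp_log_prob(Z, 2.0), ibp_log_prob(Z[::-1], 2.0))   # identical: row order is irrelevant
```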

A parametric beta-Bernoulli model

Binary matrix z = (z_{i,j}) of size n × p. For j = 1, ..., p,

\pi_j \sim \mathrm{Beta}(\alpha/p, 1),

and for i = 1, ..., n and j = 1, ..., p,

z_{i,j} \mid \pi_j \sim \mathrm{Ber}(\pi_j).

[Figure: draws of z for a small and a large number of columns p.]
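A sketch of this finite model (our code); note how the number of non-empty columns stabilizes as p grows, foreshadowing the limit derived below:

```python
import numpy as np

def sample_beta_bernoulli(n, p, alpha, seed=None):
    """pi_j ~ Beta(alpha/p, 1); z_ij | pi_j ~ Ber(pi_j)."""
    rng = np.random.default_rng(seed)
    pi = rng.beta(alpha / p, 1.0, size=p)
    return (rng.random((n, p)) < pi).astype(int)

# the expected number of non-empty columns converges to alpha * sum_{i<=n} 1/i
for p in (50, 500, 5000):
    Z = sample_beta_bernoulli(n=100, p=p, alpha=2.0, seed=1)
    print(p, (Z.sum(axis=0) > 0).sum())
```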

Integrating out the weights π_j,

Pr(z) = \prod_{j=1}^{p} \int_0^1 \prod_{i=1}^{n} \pi_j^{z_{ij}} (1-\pi_j)^{1-z_{ij}} \, \mathrm{Beta}(\pi_j; \alpha/p, 1) \, d\pi_j
      = \prod_{j=1}^{p} \int_0^1 \pi_j^{\sum_i z_{ij}} (1-\pi_j)^{n-\sum_i z_{ij}} \, \mathrm{Beta}(\pi_j; \alpha/p, 1) \, d\pi_j
      = \prod_{j=1}^{p} \frac{B(\sum_i z_{ij} + \alpha/p, \; n - \sum_i z_{ij} + 1)}{B(\alpha/p, 1)}
      = \prod_{j=1}^{p} \frac{(\alpha/p) \, \Gamma(\sum_i z_{ij} + \alpha/p) \, \Gamma(n - \sum_i z_{ij} + 1)}{\Gamma(n + 1 + \alpha/p)},

where B(a, b) = Γ(a)Γ(b)/Γ(a+b) is the beta function, and we used Γ(a+1) = aΓ(a).

Let f_n = multiset(z) denote the multiset corresponding to z,

\mathrm{multiset}(z) = \{\{i : z_{ij} = 1\}, \; j = 1, \dots, p \ \text{s.t.} \ \textstyle\sum_i z_{ij} > 0\}.

Many matrices z correspond to the same multiset. Let E(f_n) = {z : f_n = multiset(z)} be the set of matrices corresponding to a given multiset f_n. Its cardinality is

|E(f_n)| = \frac{p!}{\kappa_0! \, \prod_{h=1}^{\tilde{K}_n} \kappa_h!},

where κ_0 is the number of all-zero columns.
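A quick Monte Carlo check of the closed-form marginal Pr(z) derived above (our code; the small sizes are arbitrary):

```python
import numpy as np
from scipy.special import betaln

rng = np.random.default_rng(2)
n, p, alpha = 4, 3, 1.5
a = alpha / p
z = rng.integers(0, 2, size=(n, p))

# closed form: log Pr(z) = sum_j [ log B(sum_i z_ij + a, n - sum_i z_ij + 1) - log B(a, 1) ]
mj = z.sum(axis=0)
log_closed = np.sum(betaln(mj + a, n - mj + 1) - betaln(a, 1.0))

# Monte Carlo: average the Bernoulli likelihood over pi_j ~ Beta(a, 1)
S = 200_000
pi = np.clip(rng.beta(a, 1.0, size=(S, 1, p)), 1e-12, 1 - 1e-12)
loglik = (z * np.log(pi) + (1 - z) * np.log1p(-pi)).sum(axis=(1, 2))
log_mc = np.log(np.mean(np.exp(loglik)))
print(log_closed, log_mc)   # the two should agree to about two decimals
```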

Due to column exchangeability, all matrices z ∈ E(f_n) have the same probability, so

Pr(f_n) = \sum_{z \in E(f_n)} Pr(z)
        = \frac{p!}{\kappa_0! \prod_{h=1}^{\tilde{K}_n} \kappa_h!} \prod_{j=1}^{K_n} \frac{(\alpha/p) \, \Gamma(m_{n,j} + \alpha/p) \, \Gamma(n - m_{n,j} + 1)}{\Gamma(n + 1 + \alpha/p)} \left( \frac{(\alpha/p) \, \Gamma(\alpha/p) \, \Gamma(n+1)}{\Gamma(n + 1 + \alpha/p)} \right)^{\kappa_0}
        = \frac{\alpha^{K_n}}{\prod_{h=1}^{\tilde{K}_n} \kappa_h!} \cdot \frac{p!}{\kappa_0! \, p^{K_n}} \left( \frac{n! \, \Gamma(1 + \alpha/p)}{\Gamma(n + 1 + \alpha/p)} \right)^{p} \prod_{j=1}^{K_n} \frac{\Gamma(m_{n,j} + \alpha/p) \, (n - m_{n,j})!}{\Gamma(1 + \alpha/p) \, n!}.

Taking the limit as p → ∞,

\frac{p!}{\kappa_0! \, p^{K_n}} \to 1, \qquad \left( \frac{n! \, \Gamma(1+\alpha/p)}{\Gamma(n+1+\alpha/p)} \right)^{p} \to e^{-\alpha \sum_{i=1}^n 1/i}, \qquad \frac{\Gamma(m_{n,j} + \alpha/p)}{\Gamma(1+\alpha/p)} \to (m_{n,j}-1)!,

so that

Pr(f_n) \to \frac{\alpha^{K_n}}{\prod_{h=1}^{\tilde{K}_n} \kappa_h!} \, e^{-\alpha \sum_{i=1}^n 1/i} \prod_{j=1}^{K_n} \frac{(m_{n,j}-1)! \, (n-m_{n,j})!}{n!},

which is exactly the IBP probability.

Beta-Bernoulli process

Now assume that each feature j = 1, ..., K_n has a location θ*_{n,j} in a feature space Θ, and that the feature locations are i.i.d. from some distribution G_0 (with density g_0). Represent the feature model as a collection of point processes

Z_i = \sum_j z_{ij} \, \delta_{\theta_j},

where δ_a is the Dirac mass at a and z_{ij} = 1 if object i possesses the feature at location θ_j; the active locations are {θ*_{n,j}} = {θ_k : z_{ik} > 0 for some i ∈ [n]}.

Let f_n(Z_1, ..., Z_n) be the multiset induced by the point processes,

f_n(Z_1, \dots, Z_n) = \{\{i : Z_i(\theta^*_{n,j}) = 1\}, \; j = 1, \dots, K_n\}.

A distribution over (Z_i)_{i=1,...,n} is obtained by placing independent priors on the feature allocations and on their locations:

p(Z_1, \dots, Z_n) = \Pr(f_n(Z_1, \dots, Z_n)) \prod_{h=1}^{\tilde{K}_n} \kappa_h! \prod_{j=1}^{K_n} g_0(\theta^*_{n,j}).

Using the IBP prior for the feature allocations (and noting that the locations, hence the features, are almost surely distinct),

p(Z_1, \dots, Z_n) = \alpha^{K_n} e^{-\alpha \sum_{i=1}^n 1/i} \prod_{j=1}^{K_n} g_0(\theta^*_{n,j}) \, \frac{(m_{n,j}-1)! \, (n-m_{n,j})!}{n!}.

The exchangeability of the feature allocations f_n carries over to (Z_i)_{i=1,...,n}. Infinite exchangeability: for any n ≥ 1 and any permutation σ of [n],

p(Z_1, \dots, Z_n) = p(Z_{\sigma(1)}, \dots, Z_{\sigma(n)}).

De Finetti's representation theorem then implies

p(Z_1, \dots, Z_n) = \int \prod_{i=1}^n p(Z_i \mid B) \, P(dB),

where B is some latent process with distribution P. The de Finetti measure P(dB) is the beta process [Hjort, 1990, Thibaux and Jordan, 2007].

Let B = \sum_j \pi_j \delta_{\theta_j} be a completely random measure characterized by the Lévy measure

\nu(d\pi, d\theta) = \alpha \, \pi^{-1} (1-\pi)^{\alpha-1} \, d\pi \, G_0(d\theta)

defined on [0, 1] × Θ. B is called a beta process, and we write B ~ BetaP(α, G_0). A draw from a beta process is discrete almost surely, with an infinite number of atoms [Hjort, 1990].

[Figure: beta process Lévy intensity, and a draw of the stick weights over the feature space Θ.]

Conditional Bernoulli process: Z_i | B ~ BeP(B), i.e.

Z_i = \sum_j z_{ij} \, \delta_{\theta_j}, \qquad z_{ij} \sim \mathrm{Ber}(\pi_j).

[Figure: a beta process draw B (stick weights over Θ) and the corresponding Bernoulli process draws Z_i for a set of objects.]
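A minimal sketch of a (truncated) beta process draw and its conditional Bernoulli processes, assuming the standard stick-breaking representation of the one-parameter beta process, π_(j) = ∏_{l≤j} ν_l with ν_l i.i.d. Beta(α, 1) (our code; the truncation level and G_0 are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(3)
alpha, K, n = 5.0, 200, 10               # K truncates the infinitely many atoms

# stick-breaking: decreasing atom weights pi_(1) > pi_(2) > ...
nu = rng.beta(alpha, 1.0, size=K)
pi = np.cumprod(nu)
theta = rng.uniform(0.0, 1.0, size=K)    # atom locations; here G_0 = Uniform(0, 1)

# conditional Bernoulli processes: z_ij ~ Ber(pi_j), independently across i and j
Z = (rng.random((n, K)) < pi).astype(int)
print("features of object 0:", np.sort(theta[Z[0] == 1]))
```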

Conjugacy. Let θ*_{n,1}, ..., θ*_{n,K_n} be the support points of Z_1, ..., Z_n and m_{n,j} their numbers of occurrences. Posterior:

B \mid Z_1, \dots, Z_n \sim \mathrm{BetaP}\left( \alpha + n, \; \frac{\alpha}{\alpha+n} G_0 + \sum_{j=1}^{K_n} \frac{m_{n,j}}{\alpha+n} \delta_{\theta^*_{n,j}} \right).

Predictive distribution:

Z_{n+1} \mid Z_1, \dots, Z_n \sim \mathrm{BeP}\left( \frac{\alpha}{\alpha+n} G_0 + \sum_{j=1}^{K_n} \frac{m_{n,j}}{\alpha+n} \delta_{\theta^*_{n,j}} \right).

[Hjort, 1990, Kim, 1999, Thibaux and Jordan, 2007]

Chinese restaurant vs. Indian buffet:

Application            Clustering                   Latent feature modelling
Combinatorial object   Partition                    Multiset
Generative model       Chinese restaurant process   Indian buffet process
de Finetti measure     Dirichlet process            Beta process
Stick-breaking         Yes                          Yes
Conjugacy              Yes                          Yes
Power-law extensions   Pitman-Yor process           Stable beta process

Inference

Latent variable model with data X of size n × d.
(Marginal) likelihood:

\Pr(X \mid f_n) = \int_\Theta \Pr(X \mid f_n, \theta) \, p(\theta) \, d\theta.

Prior: Pr(f_n). Posterior:

\Pr(f_n \mid X) \propto \Pr(X \mid f_n) \, \Pr(f_n).

Inference can be carried out using
- MCMC, with Metropolis-Hastings within Gibbs updates;
- sequential Monte Carlo.
[Meeds et al., 2007, Wood and Griffiths, 2007]
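A schematic sketch of one such MCMC sweep (our code; the marginal log-likelihood loglik(Z) is left as a user-supplied black box, and with loglik ≡ 0 the sweep targets the IBP prior itself):

```python
import numpy as np

def mcmc_sweep(Z, alpha, loglik, rng):
    """One Metropolis-Hastings-within-Gibbs sweep over the feature matrix Z."""
    n = Z.shape[0]
    for i in range(n):
        # Gibbs updates for features shared with other objects:
        # Pr(z_ik = 1 | rest) is proportional to (m_{-i,k} / n) * likelihood
        for k in range(Z.shape[1]):
            m = Z[:, k].sum() - Z[i, k]
            if m == 0:
                continue                      # singletons handled by the MH move below
            Z[i, k] = 1
            l1 = np.log(m / n) + loglik(Z)
            Z[i, k] = 0
            l0 = np.log(1 - m / n) + loglik(Z)
            Z[i, k] = int(rng.random() < 1.0 / (1.0 + np.exp(l0 - l1)))
        # MH move: replace object i's singleton features by Poisson(alpha/n) fresh ones;
        # the Poisson proposal matches the prior, so the ratio is the likelihood ratio
        shared = (Z.sum(axis=0) - Z[i]) > 0
        k_new = rng.poisson(alpha / n)
        Zprop = np.concatenate([Z[:, shared], np.zeros((n, k_new), dtype=int)], axis=1)
        if k_new > 0:
            Zprop[i, -k_new:] = 1
        if np.log(rng.random()) < loglik(Zprop) - loglik(Z):
            Z = Zprop
    return Z

rng = np.random.default_rng(0)
Z = np.ones((5, 2), dtype=int)
for _ in range(10):
    Z = mcmc_sweep(Z, alpha=1.0, loglik=lambda Z: 0.0, rng=rng)
print(Z.shape)
```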

Stable Indian buffet process

Three parameters: α > 0, σ ∈ [0, 1) and c > -σ.
- The first customer picks K^+_1 ~ Poisson(α) dishes.
- Each subsequent customer i = 2, 3, ...
  - chooses each dish j previously chosen m_{i-1,j} times with probability

    \frac{m_{i-1,j} - \sigma}{c + i - 1},

  - and picks an additional set of new dishes

    K^+_i \sim \mathrm{Poisson}\left( \alpha \, \frac{\Gamma(1+c) \, \Gamma(i-1+c+\sigma)}{\Gamma(i+c) \, \Gamma(c+\sigma)} \right).

The model reduces to the one-parameter IBP when c = 1 and σ = 0.
[Teh and Görür, 2009]
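A simulation sketch mirroring the IBP code above, with the gamma-ratio rate computed on the log scale (our code):

```python
import numpy as np
from scipy.special import gammaln

def sample_stable_ibp(n, alpha, sigma, c, seed=None):
    """Draw from the three-parameter (stable) Indian buffet process."""
    rng = np.random.default_rng(seed)
    counts, rows = [], []
    for i in range(1, n + 1):
        chosen = [j for j, m in enumerate(counts)
                  if rng.random() < (m - sigma) / (c + i - 1)]
        # Poisson rate for new dishes: alpha G(1+c) G(i-1+c+sigma) / (G(i+c) G(c+sigma))
        log_rate = (np.log(alpha) + gammaln(1 + c) + gammaln(i - 1 + c + sigma)
                    - gammaln(i + c) - gammaln(c + sigma))
        k_new = rng.poisson(np.exp(log_rate))
        chosen += list(range(len(counts), len(counts) + k_new))
        counts += [0] * k_new
        for j in chosen:
            counts[j] += 1
        rows.append(chosen)
    Z = np.zeros((n, len(counts)), dtype=int)
    for i, chosen in enumerate(rows):
        Z[i, chosen] = 1
    return Z

# with sigma > 0, the total number of features grows roughly like n^sigma
for sigma in (0.0, 0.5, 0.9):
    print(sigma, sample_stable_ibp(1000, alpha=1.0, sigma=sigma, c=1.0, seed=4).shape[1])
```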

[Figure: draws from the stable IBP for σ = 0, 0.5 and 0.9 (objects by features binary matrices).]

Power-law behaviour for σ > 0:
- the number of features grows in O(n^σ);
- the proportion of features associated with m objects is, for n and m large, in O(1/m^{1+σ}).
This is similar to the Pitman-Yor process for mixture models.

[Figure: degree distribution of the features, and growth of the number of features with the number of objects, for σ = 0, 0.5 and 0.9.]

Beyond the Indian buffet process

Properties of the Indian buffet process and its extensions:

              Nb of features per object      Overall nb of features   Prop. of features assoc. to m objects
IBP           Poisson                        O(log n)
Stable IBP    Poisson                        O(n^σ)                   power law (σ > 0)
Latent IBP    Poisson (object-specific       O(log n) or O(n^σ)       power law
              rates) or mixture of Poisson

In the IBP, all objects have marginally Poisson(α) features. One may want to:
- relax the exchangeability assumption: some objects are a priori likely to have more features than others;
- relax the Poisson assumption: the distribution of the number of features per object may have heavier tails than Poisson.

Example: shared patterns in a collection of time series [Fox et al., 2009].

Example: the Book-Crossing community network, a bipartite graph in which readers are connected to the books they have read.

[Figure: degree distributions of (c) readers and (d) books on a log-log scale.]

Hierarchical model. Collection of atomic measures Z_i, i = 1, 2, ...,

Z_i = \sum_j z_{ij} \, \delta_{\theta_j},

where z_{ij} = 1 if reader i has read book j and 0 otherwise, and {θ_j} is the set of books. Each book j is assigned a positive popularity parameter w_j; each reader i is assigned a positive interest-in-reading parameter γ_i. The probability that reader i reads book j is

P(z_{ij} = 1 \mid \gamma_i, w_j) = 1 - \exp(-w_j \gamma_i).

[Caron, 2012]

Data augmentation: latent variable formulation. Latent scores

s_{ij} \sim \mathrm{Gumbel}(\log(w_j), 1);

all books with a score above -log(γ_i) are retained, the others are discarded.

[Figure: book popularity scores against the -log(γ_i) threshold.]
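A quick Monte Carlo check that the Gumbel score representation reproduces P(z_ij = 1) = 1 - exp(-w_j γ_i) (our code; the values of w and γ are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(5)
w, gamma = 0.7, 2.0
S = 200_000

# latent score: s = log(w) + standard Gumbel noise, i.e. s ~ Gumbel(log w, 1)
s = np.log(w) + rng.gumbel(0.0, 1.0, size=S)
frac = (s > -np.log(gamma)).mean()        # book kept iff its score exceeds -log(gamma)
print(frac, 1.0 - np.exp(-w * gamma))     # both approximately 0.75
```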

Model for the book popularity parameters. Random atomic measure G = \sum_j w_j \delta_{\theta_j}, constructed from a two-dimensional Poisson process N = {(w_j, θ_j)}_{j=1,2,...}. We take a generalized gamma process, G ~ GGP(α, σ, τ, h), characterized by a Lévy measure λ(w) h(θ) dw dθ with

\lambda(w) = \frac{\alpha}{\Gamma(1-\sigma)} \, w^{-1-\sigma} e^{-\tau w}.

The condition \int_0^\infty (1 - e^{-w}) \lambda(w) \, dw < \infty ensures that each Z_i has an almost surely finite number of atoms (finite \sum_j z_{ij}).
[Kingman, 1967, Brix, 1999, Regazzini et al., 2003, Lijoi and Prünster, 2010]

Posterior characterization. Having observed Z_1, ..., Z_n, there are K_n books at locations θ*_{n,j}, read m_{n,j} times. One cannot directly derive the conditional of G given Z_1, ..., Z_n, nor the predictive of Z_{n+1} given Z_1, ..., Z_n. Instead, let

X_i = \sum_j x_{ij} \, \delta_{\theta_j}, \qquad x_{ij} = \max(0, s_{ij} + \log(\gamma_i)) \ge 0,

be latent positive (censored) scores.

[Figure: latent scores and their censored versions.]
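For the generalized gamma process the integral \int_0^\infty (1 - e^{-t w}) \lambda(w) dw has the known closed form (α/σ)((τ+t)^σ - τ^σ) for σ ∈ (0, 1); a numerical sanity check (our code, with arbitrary parameter values):

```python
import numpy as np
from scipy.integrate import quad
from scipy.special import gamma as gamma_fn

alpha, sigma, tau, t = 2.0, 0.5, 1.0, 3.0
levy = lambda w: alpha / gamma_fn(1 - sigma) * w ** (-1 - sigma) * np.exp(-tau * w)

numeric, _ = quad(lambda w: (1 - np.exp(-t * w)) * levy(w), 0.0, np.inf)
closed = alpha / sigma * ((tau + t) ** sigma - tau ** sigma)
print(numeric, closed)    # both approximately 4.0
```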

The conditional distribution of G given X_1, ..., X_n can be expressed as

G = G^* + \sum_{j=1}^{K_n} w_j \, \delta_{\theta^*_{n,j}},

where G^* and the (w_j) are mutually independent, with

G^* \sim \mathrm{GGP}\left( \alpha, \sigma, \tau + \sum_{i=1}^{n} \gamma_i, h \right)

and masses

w_j \sim \mathrm{Gamma}\left( m_{n,j} - \sigma, \; \tau + \sum_{i=1}^{n} \gamma_i e^{-x_{ij}} \right).

This characterization is related to that for normalized random measures [Prünster, 2002, James, 2002, James et al., 2009].

Indian buffet process with latent scores: the predictive distribution of Z_{n+1} given the latent process X_1, ..., X_n yields a sequential, buffet-style construction over the books [Caron, 2012].

Prior draws. [Figure: draws (readers by books matrices) from the generalized gamma process model with τ = 1, γ_i = 2: panels with σ = 0 and increasing α, and panels with α = 2 and σ = 0.1, 0.5, 0.9.] [Brix, 1999, Lijoi et al., 2007]

Properties of the model: power-law behaviour of the generalized gamma process for σ > 0.
- The total number of books read by n readers grows in O(n^σ).
- Asymptotically, the proportion of books read by m readers is in O(m^{-1-σ}).

(Stable) beta-Bernoulli / Indian buffet process:

G \sim \mathrm{stableBetaP}, \qquad Z_i \mid G \sim \mathrm{BeP}(G).

This is a special case of the latent IBP model when γ_i = γ and

\lambda(w) = \frac{\alpha \, \Gamma(1+c)}{\Gamma(1-\sigma) \, \Gamma(\sigma+c)} \, \gamma \, (1 - e^{-\gamma w})^{-\sigma-1} \, e^{-\gamma w (c+\sigma)}.

In this case one can marginalize out the latent variables in the predictive distribution and recover the (stable) Indian buffet process.

Model for the interest-in-reading parameters:
- Fixed γ_i: Poisson degree distribution for the readers, with reader-specific rates,

  \mathrm{Poisson}\left( \frac{\alpha}{\sigma} \left( (\tau + \gamma_i)^{\sigma} - \tau^{\sigma} \right) \right).

- Random γ_i, with a conjugate gamma prior γ_i ~ Gamma(a_γ, b_γ): the degree of a reader is then a mixture of Poisson distributions, with heavier tails than Poisson.
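A small illustration of the last point: mixing the Poisson rate over a gamma distribution with the same mean makes the degree distribution far heavier-tailed (our code; the numbers are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(6)
S, mean = 500_000, 5.0

fixed = rng.poisson(mean, size=S)                    # Poisson degrees, fixed rate
rates = rng.gamma(shape=1.0, scale=mean, size=S)     # gamma rates with the same mean
mixed = rng.poisson(rates)                           # gamma mixture of Poissons

for name, deg in (("Poisson", fixed), ("mixture", mixed)):
    print(name, deg.mean(), (deg > 20).mean())       # tail mass beyond degree 20
```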

Bibliography

Brix, A. (1999). Generalized gamma measures and shot-noise Cox processes. Advances in Applied Probability, 31(4):929-953.
Broderick, T., Jordan, M. I., and Pitman, J. (2013a). Cluster and feature modeling from combinatorial stochastic processes. Statistical Science, 28(3):289-312.
Broderick, T., Pitman, J., and Jordan, M. I. (2013b). Feature allocations, probability functions, and paintboxes. Bayesian Analysis, 8(4):801-836.
Caron, F. (2012). Bayesian nonparametric models for bipartite graphs. In NIPS.
Fox, E., Sudderth, E., Jordan, M., and Willsky, A. (2009). Sharing features among dynamical systems with beta processes. In NIPS, volume 22.
Griffiths, T. and Ghahramani, Z. (2005). Infinite latent feature models and the Indian buffet process. In NIPS.
Griffiths, T. and Ghahramani, Z. (2011). The Indian buffet process: an introduction and review. Journal of Machine Learning Research, 12:1185-1224.
Hjort, N. (1990). Nonparametric Bayes estimators based on beta processes in models for life history data. The Annals of Statistics, 18(3):1259-1294.
James, L., Lijoi, A., and Prünster, I. (2009). Posterior analysis for normalized random measures with independent increments. Scandinavian Journal of Statistics, 36(1):76-97.
James, L. F. (2002). Poisson process partition calculus with applications to exchangeable models and Bayesian nonparametrics. arXiv preprint math/0205093.
Kim, Y. (1999). Nonparametric Bayesian estimators for counting processes. The Annals of Statistics, 27(2):562-588.
Kingman, J. (1967). Completely random measures. Pacific Journal of Mathematics, 21(1):59-78.
Lijoi, A., Mena, R. H., and Prünster, I. (2007). Controlling the reinforcement in Bayesian non-parametric mixture models. Journal of the Royal Statistical Society: Series B, 69(4):715-740.
Lijoi, A. and Prünster, I. (2010). Models beyond the Dirichlet process. In Hjort, N. L., Holmes, C., Müller, P., and Walker, S. G., editors, Bayesian Nonparametrics. Cambridge University Press.
Meeds, E., Ghahramani, Z., Neal, R., and Roweis, S. (2007). Modeling dyadic data with binary latent factors. In NIPS, volume 19.
Prünster, I. (2002). Random probability measures derived from increasing additive processes and their application to Bayesian statistics. PhD thesis, University of Pavia.
Regazzini, E., Lijoi, A., and Prünster, I. (2003). Distributional results for means of normalized random measures with independent increments. The Annals of Statistics, 31(2):560-585.
Teh, Y. and Görür, D. (2009). Indian buffet processes with power-law behavior. In NIPS.
Thibaux, R. and Jordan, M. (2007). Hierarchical beta processes and the Indian buffet process. In AISTATS, volume 11.
Wood, F. and Griffiths, T. L. (2007). Particle filtering for nonparametric Bayesian matrix factorization. In NIPS, volume 19.
