Markov Chain Monte Carlo (MCMC)


1 School of Computer Science, Probabilistic Graphical Models. Markov Chain Monte Carlo (MCMC). Readings: MacKay Ch. 29, Jordan Ch. 21. Matt Gormley, Lecture 16, March 14, 2016.

2 Housekeeping. Homework 2 due March 16, 12:00 noon (extended). Midway Project Report due March 23, 12:00 noon.

3 (Recap of the graphical-models setup.) 1. Data: D = {x^(n)}_{n=1}^N, e.g. samples of tag sequences such as "n v p d n" for the sentence "time flies like an arrow". 2. Model: p(x | θ) = (1/Z(θ)) ∏_{C ∈ C} ψ_C(x_C). 3. Objective: ℓ(θ; D) = Σ_{n=1}^N log p(x^(n) | θ). 4. Learning: θ* = argmax_θ ℓ(θ; D). 5. Inference: (1) marginal inference, p(x_C) = Σ_{x' : x'_C = x_C} p(x' | θ); (2) partition function, Z(θ) = Σ_x ∏_{C ∈ C} ψ_C(x_C); (3) MAP inference, x̂ = argmax_x p(x | θ).

4 It's Pi Day 2016, so let's compute π.

5 Slide from Ian Murray. Properties of Monte Carlo. Estimator: ∫ f(x) P(x) dx ≈ f̂ ≡ (1/S) Σ_{s=1}^S f(x^(s)), where x^(s) ~ P(x). The estimator is unbiased: E_{P({x^(s)})}[f̂] = (1/S) Σ_{s=1}^S E_{P(x)}[f(x)] = E_{P(x)}[f(x)]. Variance shrinks as 1/S: var_{P({x^(s)})}[f̂] = (1/S²) Σ_{s=1}^S var_{P(x)}[f(x)] = var_{P(x)}[f(x)] / S. Error bars shrink like 1/√S.

6 Slide from Ian Murray. A dumb approximation of π. P(x, y) = 1 if 0 < x < 1 and 0 < y < 1, and 0 otherwise. π = 4 ∫ I((x² + y²) < 1) P(x, y) dx dy. octave:1> S=12; a=rand(S,2); 4*mean(sum(a.*a,2)<1)  ans = ...  octave:2> S=1e7; a=rand(S,2); 4*mean(sum(a.*a,2)<1)  ans = ...
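To mirror the Octave one-liners above, a minimal Python/NumPy sketch of the same estimator; the sample sizes and the random seed are arbitrary choices, not from the slides.

```python
import numpy as np

rng = np.random.default_rng(0)

def estimate_pi(S):
    # Draw S points uniformly in the unit square and count the
    # fraction that falls inside the quarter circle x^2 + y^2 < 1.
    xy = rng.uniform(size=(S, 2))
    inside = np.sum(xy ** 2, axis=1) < 1.0
    return 4.0 * inside.mean()

for S in [12, 10_000, 10_000_000]:
    print(S, estimate_pi(S))
```

With S = 12 the estimate is crude; with 10^7 samples it is typically within about 0.001 of π, consistent with the 1/√S error behavior on the previous slide.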

7 Slide from Ian Murray. Aside: don't always sample! "Monte Carlo is an extremely bad method; it should be used only when all alternative methods are worse." (Alan Sokal, 1996.) Example: numerical solutions to (nice) 1D integrals are fast: octave:1> 4 * quadl(@(x) sqrt(1-x.^2), 0, 1, tolerance). Gives π to 6 decimal places in 108 evaluations, and machine precision with a tighter tolerance (NB Matlab's quadl fails at zero tolerance).

8 Slide from Ian Murray. Sampling from distributions. Draw points uniformly under the curve P(x); the probability mass to the left of each point is distributed Uniform[0,1]. [Figure: a density P(x) with sample points x^(1), ..., x^(4) drawn under the curve.]

9 Slide from Ian Murray. Sampling from distributions. How to convert samples from a Uniform[0,1] generator: let h(y) = ∫_{-∞}^{y} p(y') dy'. Draw the mass to the left of the point, u ~ Uniform[0,1], and return the sample y(u) = h^{-1}(u). (Figure from PRML, Bishop 2006.) Although we can't always compute and invert h(y).
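As a concrete instance of inverting h(y), a small sketch for a case where the inverse is available in closed form; the exponential distribution is my own choice of example, not from the slides.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_exponential(lam, size):
    # h(y) = 1 - exp(-lam * y) is the CDF of Exponential(lam);
    # inverting it maps Uniform[0,1] draws to exponential draws.
    u = rng.uniform(size=size)
    return -np.log(1.0 - u) / lam

samples = sample_exponential(lam=2.0, size=100_000)
print(samples.mean())  # should be close to 1/lam = 0.5
```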

10 Slide from Ian Murray. Rejection sampling. Sampling underneath a P̃(x) ∝ P(x) curve is also valid. Draw underneath a simple curve k Q̃(x) ≥ P̃(x): draw x ~ Q(x) and a height u ~ Uniform[0, k Q̃(x)]; discard the point if it lies above P̃, i.e. if u > P̃(x). [Figure: P̃(x) beneath k Q̃(x) and k_opt Q̃(x), with an accepted point (x_i, u_i) and a rejected point (x_j, u_j).]
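A minimal rejection-sampling sketch; the unnormalized target (a standard normal density restricted to [-3, 3]) and the flat proposal are my own choices, not from the slides.

```python
import numpy as np

rng = np.random.default_rng(0)

def p_tilde(x):
    # Unnormalized target: standard normal density restricted to [-3, 3].
    return np.exp(-0.5 * x ** 2)

def rejection_sample(n):
    samples = []
    k = 1.0  # k * q_tilde(x) = 1 >= p_tilde(x) on [-3, 3], since q_tilde(x) = 1 here
    while len(samples) < n:
        x = rng.uniform(-3.0, 3.0)      # x ~ Q(x)
        u = rng.uniform(0.0, k * 1.0)   # height under k * q_tilde(x)
        if u <= p_tilde(x):             # keep the point only if it falls under p_tilde
            samples.append(x)
    return np.array(samples)

print(rejection_sample(10_000).std())   # close to 1 (std of a near-standard normal)
```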

11 Slide from Ian Murray. Importance sampling. Computing P̃(x) and Q̃(x), then throwing x away, seems wasteful. Instead rewrite the integral as an expectation under Q: ∫ f(x) P(x) dx = ∫ f(x) (P(x)/Q(x)) Q(x) dx (with Q(x) > 0 wherever P(x) > 0) ≈ (1/S) Σ_{s=1}^S f(x^(s)) P(x^(s))/Q(x^(s)), where x^(s) ~ Q(x). This is just simple Monte Carlo again, so it is unbiased. Importance sampling applies when the integral is not an expectation: divide and multiply any integrand by a convenient distribution.

12 Slide from Ian Murray. Importance sampling (2). The previous slide assumed we could evaluate P(x) = P̃(x)/Z_P. More generally, ∫ f(x) P(x) dx ≈ (Z_Q/Z_P) (1/S) Σ_{s=1}^S f(x^(s)) P̃(x^(s))/Q̃(x^(s)) with x^(s) ~ Q(x); writing r̃^(s) = P̃(x^(s))/Q̃(x^(s)) and w^(s) = r̃^(s) / Σ_{s'} r̃^(s'), this is approximately Σ_{s=1}^S f(x^(s)) w^(s). This estimator is consistent but biased. Exercise: prove that Z_P/Z_Q ≈ (1/S) Σ_s r̃^(s).
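A sketch of the self-normalized estimator on this slide, with an unnormalized target P̃, a proposal Q, and f(x) = x² chosen purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def p_tilde(x):
    # Unnormalized target: exp(-x^2/2), i.e. N(0, 1) without its 1/sqrt(2*pi).
    return np.exp(-0.5 * x ** 2)

def q_pdf(x, sigma=2.0):
    # Proposal density Q(x) = N(0, sigma^2), which we can evaluate exactly.
    return np.exp(-0.5 * (x / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

S = 100_000
x = rng.normal(0.0, 2.0, size=S)   # x^(s) ~ Q
r = p_tilde(x) / q_pdf(x)          # unnormalized weights r~^(s)
w = r / r.sum()                    # normalized weights w^(s)
print(np.sum(w * x ** 2))          # E_P[x^2], should be close to 1
print(r.mean())                    # estimates Z_P/Z_Q = sqrt(2*pi) ≈ 2.507
```

The last line is exactly the quantity in the slide's exercise: the mean of the unnormalized weights estimates the ratio of normalizers Z_P/Z_Q.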

13 Slide from Ian Murray. Summary so far: sums and integrals, often expectations, occur frequently in statistics. Monte Carlo approximates expectations with a sample average. Rejection sampling draws samples from complex distributions. Importance sampling applies Monte Carlo to any sum/integral.

14 Slide from Ian Murray. Application to large problems: pitfalls of Monte Carlo. Rejection and importance sampling scale badly with dimensionality. Example: P(x) = N(0, I), Q(x) = N(0, σ²I). Rejection sampling requires σ ≥ 1, and the fraction of proposals accepted is σ^{-D}. Importance sampling: the variance of the importance weights is (σ²/(2 − 1/σ²))^{D/2} − 1, which is infinite/undefined if σ ≤ 1/√2.

15 Outline. Review: Monte Carlo. MCMC (Basic Methods): Metropolis algorithm, Metropolis-Hastings (M-H) algorithm, Gibbs Sampling. Markov Chains: transition probabilities, invariant distribution, equilibrium distribution, Markov chain as a WFSM, constructing Markov chains, why does M-H work? MCMC (Auxiliary Variable Methods): Slice Sampling, Hamiltonian Monte Carlo.

16 MCMC (BASIC METHODS): Metropolis, Metropolis-Hastings, Gibbs Sampling.

17 MCMC. Goal: draw approximate, correlated samples from a target distribution p(x). MCMC performs a biased random walk to explore the distribution.

18 Simulations of MCMC. Visualization of Metropolis-Hastings, Gibbs Sampling, and Hamiltonian MCMC: mcmc/

19 Whiteboard: Metropolis Algorithm, Metropolis-Hastings Algorithm, Gibbs Sampling.
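The whiteboard algorithms are not transcribed here; as a concrete reference, below is a minimal Metropolis sketch in Python/NumPy (symmetric Gaussian proposal, so the Hastings correction cancels). The bimodal unnormalized target is my own illustrative choice, not from the lecture.

```python
import numpy as np

rng = np.random.default_rng(0)

def log_p_tilde(x):
    # Unnormalized log-density: a two-component Gaussian mixture (illustrative).
    return np.logaddexp(-0.5 * (x - 2.0) ** 2, -0.5 * (x + 2.0) ** 2)

def metropolis(n_steps, step_size=1.0, x0=0.0):
    x = x0
    samples = np.empty(n_steps)
    for t in range(n_steps):
        x_prop = x + step_size * rng.normal()       # symmetric proposal q(x'|x)
        log_accept = log_p_tilde(x_prop) - log_p_tilde(x)
        if np.log(rng.uniform()) < log_accept:      # accept with prob min(1, p~(x')/p~(x))
            x = x_prop
        samples[t] = x                              # on rejection, repeat the old state
    return samples

chain = metropolis(50_000)
print(chain.mean(), chain.std())   # roughly 0 and a bit over 2 for this bimodal target
```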

20-22 Gibbs Sampling. [Figures: successive Gibbs updates in a 2D target p(x); x^(t+1) is drawn from p(x_1 | x_2^(t)), then x^(t+2) from p(x_2 | x_1^(t+1)), and so on, producing an axis-aligned path x^(t), x^(t+1), ..., x^(t+4).]

23 Gibbs Sampling (figure from Jordan Ch. 21). Full conditionals only need to condition on the Markov blanket (illustrated for an MRF and a Bayes net). It must be easy to sample from the conditionals; many conditionals are log-concave and are amenable to adaptive rejection sampling.
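The Gibbs updates sketched in the preceding figures can be made concrete for a case where the full conditionals are easy to sample; the bivariate Gaussian target and correlation ρ = 0.8 below are my own illustrative choices, not from the lecture.

```python
import numpy as np

rng = np.random.default_rng(0)

def gibbs_bivariate_normal(n_steps, rho=0.8):
    # Target: (x1, x2) ~ N(0, [[1, rho], [rho, 1]]).
    # Full conditionals: x1 | x2 ~ N(rho * x2, 1 - rho^2), and symmetrically for x2.
    x1, x2 = 0.0, 0.0
    samples = np.empty((n_steps, 2))
    sd = np.sqrt(1.0 - rho ** 2)
    for t in range(n_steps):
        x1 = rng.normal(rho * x2, sd)   # resample x1 from p(x1 | x2)
        x2 = rng.normal(rho * x1, sd)   # resample x2 from p(x2 | x1)
        samples[t] = (x1, x2)
    return samples

s = gibbs_bivariate_normal(50_000)
print(np.corrcoef(s.T)[0, 1])   # should be close to rho = 0.8
```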

24 Whiteboard: Gibbs Sampling as M-H.

25 Random Walk Behavior of M-H (figure from Bishop, 2006). For Metropolis-Hastings, a generic proposal distribution is q(x' | x^(t)) = N(x^(t), ε²). If ε is large, many rejections; if ε is small, slow mixing. [Figure: the target p(x) with length scales σ_max and σ_min, and the proposal q(x' | x^(t)).]

26 Random Walk Behavior of M-H (figure from Bishop, 2006). For rejection sampling, the accepted samples are independent; but for Metropolis-Hastings, the samples are correlated. Question: how long must we wait to get effectively independent samples? A: independent states in the M-H random walk are separated by roughly (σ_max/σ_min)² steps.

27 MARKOV CHAINS: Definitions and Theoretical Justification for MCMC.

28 Whiteboard: Markov chains, transition probabilities, invariant distribution, equilibrium distribution, sufficient conditions for MCMC, Markov chain as a WFSM.

29 Detailed Balance: S(x' | x) p(x) = S(x | x') p(x'). Detailed balance means that, for each pair of states x and x', arriving at x then x' and arriving at x' then x are equiprobable.
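To make the definition concrete, here is a small numerical check of detailed balance for a Metropolis transition matrix on a three-state chain; the target distribution and the uniform proposal are arbitrary illustrations, not from the lecture.

```python
import numpy as np

# Target distribution over 3 states and a symmetric "propose a uniform state" kernel.
p = np.array([0.2, 0.3, 0.5])
Q = np.full((3, 3), 1.0 / 3.0)

# Metropolis transition: S(x'|x) = Q(x'|x) * min(1, p(x')/p(x)) for x' != x,
# with the leftover probability mass placed on staying at x.
S = np.zeros((3, 3))
for x in range(3):
    for xp in range(3):
        if xp != x:
            S[xp, x] = Q[xp, x] * min(1.0, p[xp] / p[x])
    S[x, x] = 1.0 - S[:, x].sum()   # self-transition absorbs rejected moves

# Detailed balance: S(x'|x) p(x) == S(x|x') p(x') for every pair of states.
flow = S * p[np.newaxis, :]         # flow[xp, x] = S(xp|x) p(x)
print(np.allclose(flow, flow.T))    # True: the flow matrix is symmetric
print(S @ p)                        # p is invariant: S p = p
```

Because the flow matrix is symmetric, p is invariant under S, which is the sufficient condition used to justify MCMC.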

30 Whiteboard: simple Markov chain example, constructing Markov chains, transition probabilities for MCMC.

31 Practical Issues. Question: is it better to move along one dimension or many? Answer: for Metropolis-Hastings, it is sometimes better to sample one dimension at a time. (Q: given a sequence of 1D proposals, compare the rate of movement for one-at-a-time vs. concatenation.) Answer: for Gibbs Sampling, it is sometimes better to sample a block of variables at a time. (Q: when is it tractable to sample a block of variables?)

32 Practical Issues. Question: how do we assess convergence of the Markov chain? Answer: it's not easy! Compare statistics of multiple independent chains, e.g. compare log-likelihoods. [Figure: log-likelihood vs. # of MCMC steps for Chain 1 and Chain 2.]


34 Practical Issues. Question: is one long Markov chain better than many short ones? Note: it is typical to discard the initial samples (aka "burn-in") since the chain might not yet have mixed. Answer: often a balance is best. Compared to one long chain: more independent samples. Compared to many small chains: fewer samples discarded for burn-in, we can still parallelize, and it allows us to assess mixing by comparing chains.
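A small sketch of the practice described above, assuming a toy random-walk Metropolis sampler on a standard normal target (the chain count, chain length, and burn-in length are arbitrary choices): run a few chains from overdispersed starting points, discard a burn-in prefix from each, and compare a simple statistic across chains as a rough mixing check.

```python
import numpy as np

rng = np.random.default_rng(0)

def metropolis_chain(n_steps, x0, step=1.0):
    # Simple random-walk Metropolis on an unnormalized N(0, 1) target.
    def log_p(x):
        return -0.5 * x ** 2
    x, out = x0, np.empty(n_steps)
    for t in range(n_steps):
        xp = x + step * rng.normal()
        if np.log(rng.uniform()) < log_p(xp) - log_p(x):
            x = xp
        out[t] = x
    return out

n_chains, n_steps, burn_in = 4, 5_000, 1_000
# Overdispersed starting points make poor mixing easier to spot.
chains = [metropolis_chain(n_steps, x0) for x0 in rng.uniform(-10, 10, n_chains)]
kept = [c[burn_in:] for c in chains]          # discard burn-in from every chain
print([round(c.mean(), 3) for c in kept])     # per-chain means should roughly agree
print(np.concatenate(kept).mean())            # pooled estimate of E[x] (about 0)
```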

35 Whiteboard: Blocked Gibbs Sampling.

36 MCMC (AUXILIARY VARIABLE METHODS): Slice Sampling, Hamiltonian Monte Carlo.

37 Slide from Ian Murray. Auxiliary variables. The point of MCMC is to marginalize out variables, but one can introduce more variables: ∫ f(x) P(x) dx = ∫ f(x) P(x, v) dx dv ≈ (1/S) Σ_{s=1}^S f(x^(s)), where (x, v)^(s) ~ P(x, v). We might want to do this if P(x | v) and P(v | x) are simple, or if P(x, v) is otherwise easier to navigate.

38 Slice Sampling. Motivation: we want samples from p(x) without knowing the normalizer Z, and choosing a proposal at the correct scale is difficult. Properties: similar to Gibbs Sampling (one-dimensional transitions in the state space); similar to Rejection Sampling ((asymptotically) draws samples from the region under the curve p(x)); an MCMC method with an adaptive proposal.

39 Slide from Ian Murray. Slice sampling idea. Sample a point uniformly under the curve P̃(x) ∝ P(x). This is just an auxiliary-variable Gibbs sampler: p(u | x) = Uniform[0, P̃(x)], and p(x | u) ∝ 1 if P̃(x) ≥ u and 0 otherwise, i.e. uniform on the "slice". Problem: sampling from the conditional p(x | u) might be infeasible.

40-42 Slice Sampling. [Figures adapted from MacKay Ch. 29, illustrating the procedure step by step.]

43 Slice Sampling Algorithm. Goal: sample (x, u) given (x^(t), u^(t)). Draw u ~ Uniform(0, p(x^(t))). Part 1: Stepping Out. Sample an interval (x_l, x_r) enclosing x^(t): r ~ Uniform(0, w), (x_l, x_r) = (x^(t) − r, x^(t) + w − r). Expand until both endpoints are outside the region under the curve: while p(x_l) > u: x_l = x_l − w; while p(x_r) > u: x_r = x_r + w. Part 2: Sample x (Shrinking). Repeat: draw x ~ Uniform(x_l, x_r); if p(x) > u, accept and stop; else if x > x^(t), set x_r = x; else set x_l = x. Finally set x^(t+1) = x, u^(t+1) = u.
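A minimal sketch of the algorithm above in Python/NumPy; the unnormalized target density and the width w are my own illustrative choices, not from the lecture.

```python
import numpy as np

rng = np.random.default_rng(0)

def p_tilde(x):
    # Unnormalized target: a skewed two-component mixture, just for illustration.
    return np.exp(-0.5 * x ** 2) + 0.5 * np.exp(-0.5 * (x - 3.0) ** 2)

def slice_sample_step(x, w=1.0):
    u = rng.uniform(0.0, p_tilde(x))      # height under the curve at x^(t)
    # Part 1: stepping out; randomly position an interval of width w, then expand.
    r = rng.uniform(0.0, w)
    xl, xr = x - r, x + w - r
    while p_tilde(xl) > u:
        xl -= w
    while p_tilde(xr) > u:
        xr += w
    # Part 2: shrinking; sample uniformly in (xl, xr), accept or shrink toward x.
    while True:
        x_new = rng.uniform(xl, xr)
        if p_tilde(x_new) > u:
            return x_new
        elif x_new > x:
            xr = x_new
        else:
            xl = x_new

x = 0.0
samples = np.empty(20_000)
for t in range(samples.size):
    x = slice_sample_step(x)
    samples[t] = x
print(samples.mean())   # mean of this mixture is roughly 1.0
```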


46 Slice Sampling: Multivariate Distributions. Resample each variable x_i one-at-a-time (just like Gibbs Sampling). This does not require sampling from p(x_i | {x_j}_{j≠i}); we only need to evaluate a quantity proportional to the conditional, p̃(x_i | {x_j}_{j≠i}) ∝ p(x_i | {x_j}_{j≠i}).

47 Hamiltonian Monte Carlo. Suppose we have a distribution of the form p(x) = exp{−E(x)}/Z, where x ∈ R^N. We could use random-walk M-H to draw samples, but it seems a shame to discard the gradient information ∇_x E(x). If we can evaluate it, the gradient tells us where to look for high-probability regions!

48 Background: Hamiltonian Dynamics. Applications: following the motion of atoms in a fluid through time (molecular dynamics); integrating the motion of a solar system over time, or considering the evolution of a galaxy, i.e. the motion of its stars (N-body simulations). Properties: the total energy of the system H(x, p) stays constant, and the dynamics are reversible (important for detailed balance).

49 Background: Hamiltonian Dynamics. Let x ∈ R^N be a position and p ∈ R^N be a momentum. Potential energy: E(x). Kinetic energy: K(p) = pᵀp/2. Total energy (the Hamiltonian function): H(x, p) = E(x) + K(p). Given a starting position x^(1) and a starting momentum p^(1), we can simulate the Hamiltonian dynamics of the system via: 1. Euler's method, 2. the leapfrog method, 3. etc.

50 Background: Hamiltonian Dynamics. Parameters to tune: 1. step size ε; 2. number of iterations L. Leapfrog Algorithm: for l in 1...L: p = p − (ε/2) ∇_x E(x); x = x + ε p; p = p − (ε/2) ∇_x E(x).

51 Background: Hamiltonian Dynamics (figure from Neal, 2011). [Figure: phase-space trajectories, momentum p vs. position q, for (a) Euler's method, stepsize 0.3; (b) the modified Euler's method; (c) the leapfrog method, stepsize 0.3; (d) the leapfrog method.]

52 Hamiltonian Monte Carlo: Preliminaries (figure from Neal, 2011). Goal: sample from p(x) = exp{−E(x)}/Z, where x ∈ R^N. Define K(p) = pᵀp/2, H(x, p) = E(x) + K(p), and the joint p(x, p) = exp{−H(x, p)}/Z_H = exp{−E(x)} exp{−K(p)}/Z_H. Note: since p(x, p) is separable, Σ_p p(x, p) = exp{−E(x)}/Z (the target distribution) and Σ_x p(x, p) = exp{−K(p)}/Z_K (a Gaussian).

53 Whiteboard Hamiltonian Monte Carlo algorithm (aka. Hybrid Monte Carlo) 53
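The whiteboard derivation is not transcribed; below is a minimal HMC sketch combining the leapfrog integrator from slide 50 with a Metropolis accept/reject step on H(x, p). The 2D correlated Gaussian energy, step size, and trajectory length are my own choices, not from the lecture.

```python
import numpy as np

rng = np.random.default_rng(0)

# Target p(x) = exp(-E(x))/Z with a correlated 2D Gaussian energy (illustrative choice).
A = np.array([[1.0, 0.9], [0.9, 1.0]])
A_inv = np.linalg.inv(A)

def E(x):
    return 0.5 * x @ A_inv @ x

def grad_E(x):
    return A_inv @ x

def hmc_step(x, eps=0.1, L=25):
    p = rng.normal(size=x.shape)            # resample momentum p ~ N(0, I)
    x_new, p_new = x.copy(), p.copy()
    # Leapfrog integration of the Hamiltonian dynamics (L full steps).
    p_new -= 0.5 * eps * grad_E(x_new)
    for _ in range(L - 1):
        x_new += eps * p_new
        p_new -= eps * grad_E(x_new)
    x_new += eps * p_new
    p_new -= 0.5 * eps * grad_E(x_new)
    # Metropolis correction on the total energy H(x, p) = E(x) + K(p).
    H_old = E(x) + 0.5 * p @ p
    H_new = E(x_new) + 0.5 * p_new @ p_new
    return x_new if np.log(rng.uniform()) < H_old - H_new else x

x = np.zeros(2)
samples = np.empty((20_000, 2))
for t in range(samples.shape[0]):
    x = hmc_step(x)
    samples[t] = x
print(np.cov(samples.T))   # should approach the target covariance A
```

Because leapfrog nearly conserves H, the acceptance rate stays high even though each proposal moves far across the state space, which is the advantage over random-walk M-H shown on the next slides.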

54 Hamiltonian Monte Carlo (figure from Neal, 2011). [Figure: traces of the position coordinates, the momentum coordinates, and the value of the Hamiltonian along an HMC run.]

55 M-H vs. HMC (figure from Neal, 2011). [Figure: samples from random-walk Metropolis vs. Hamiltonian Monte Carlo.]

56 Simulations of MCMC. Visualization of Metropolis-Hastings, Gibbs Sampling, and Hamiltonian MCMC: mcmc/

57 MCMC Summary (slide adapted from Daphne Koller). Pros: very general purpose; often easy to implement; good theoretical guarantees as t → ∞. Cons: lots of tunable parameters / design choices; can be quite slow to converge; difficult to tell whether it's working.
