A stochastic formulation of a dynamical singly constrained spatial interaction model

1 A stochastic formulation of a dynamical singly constrained spatial interaction model
Mark Girolami
Department of Mathematics, Imperial College London
The Alan Turing Institute, British Library
Lloyd's Register Foundation Programme on Data-Centric Engineering
Brunel University London: Modern Statistical Methods in Health and Environment

2 Louis Ellam (ICL & ATI), Mark Girolami (ICL & ATI), Greg Pavliotis (ICL), Alan Wilson (UCL & ATI)

3 Motivation
Question: What will cities and regions look like in the future?

4 Urban Analytics
How will cities and regions develop as time progresses?
Observational data from Twitter, Android location tracking, camera monitoring, travel ticketing, demographics, etc.
Simple statistical summaries and models yielding insights are the hallmark of Urban Analytics.
Mathematical modelling has a long history in urban and regional analysis and planning.
Build upon these mathematical formalisms with this new-found data, rather than throwing the baby out with the bathwater.

5 Complex Urban Systems
We're uncertain! Urban and regional systems are complex.
Actions of individuals give rise to emergent behaviour: phase transitions; path dependence; and multiple equilibria.
Mechanistic models of complex systems entail model error. Uncertainty should be addressed.
This talk: improving insights into urban and regional development by addressing the uncertainties arising in the modelling process.

6 The London Retail System
Figure: London retail structure for 2008 (left) and 2012 (right). The locations of retail zones and residential zones are shown in red and blue, respectively. Sizes are in proportion to floorspace and spending power, respectively. N = 625 and M = 201.

7 The London Retail System
Stratford and Shepherd's Bush have risen up the rankings as a result of the opening of Westfield Stratford City and Westfield London.
Knightsbridge appears to have experienced a significant reduction in town centre floorspace. Ilford and Harrow also experienced declines in town centre floorspace.
The West End remains the largest centre in London. Croydon, Kingston, Romford, Canary Wharf, Camden Town, Kings Road East and Angel showed strong growth in total town centre floorspace over the period.
Source: 2013 London Town Centre Health Check Analysis Report.

8 Airports in England
Figure: Airports by number of passengers for 2007 (left) and 2016 (right). The locations of airports and residential zones are shown in red and blue, respectively. N = 33 and M = 25.

9 Urban and Regional Modelling

10 Spatial Interaction Model
Assume $N$ origin zones and $M$ destination zones.
Origin quantities $\{O_i\}_{i=1}^N$. Destination quantities $\{D_j\}_{j=1}^M$.
$\{T_{ij}\}_{i,j=1}^{N,M}$ denote the flows from origin zone $i$ to destination zone $j$.

11 Assumptions
Flows from origin zones:
$$O_i = \sum_{j=1}^M T_{ij}, \quad i = 1, \dots, N.$$
Flows to destination zones:
$$D_j = \sum_{i=1}^N T_{ij}, \quad j = 1, \dots, M.$$
Known as a singly-constrained or production-constrained system, since $\{O_i\}_{i=1}^N$ is fixed and $\{D_j\}_{j=1}^M$ is undetermined.

12 Assumptions
Fixed transport benefit:
$$\sum_{i=1}^N \sum_{j=1}^M T_{ij} \ln W_j = X,$$
where $W_j$ is the size, and $X_j := \ln W_j$ is the attractiveness, of zone $j$.
Fixed transport cost:
$$\sum_{i=1}^N \sum_{j=1}^M T_{ij} c_{ij} = C,$$
where $c_{ij}$ is the cost of transporting a unit from zone $i$ to zone $j$.

13 A Singly-Constrained Spatial Interaction Model
Maximization of an entropy function. The destination flows are given by
$$D_j = \sum_{i=1}^N \frac{O_i \exp(\alpha \ln W_j - \beta c_{ij})}{\sum_{k=1}^M \exp(\alpha \ln W_k - \beta c_{ik})}.$$
$X_j := \ln W_j$ is the attractiveness of zone $j$.
$c_{ij}$ is the inconvenience of transporting from zone $i$ to zone $j$.
$\alpha$ is the attractiveness scaling parameter. $\beta$ is the cost scaling parameter.
$\alpha X_j - \beta c_{ij}$ is the net utility from transporting from zone $i$ to zone $j$.
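The closed form above can be evaluated directly. Below is a minimal NumPy sketch; the function name `destination_flows` and the toy data are illustrative, not taken from the talk.

```python
import numpy as np

def destination_flows(O, X, cost, alpha, beta):
    """Destination flows D_j of the singly constrained model.

    O: (N,) origin quantities; X: (M,) attractiveness X_j = ln W_j;
    cost: (N, M) travel costs c_ij; alpha, beta: scaling parameters.
    """
    util = alpha * X[None, :] - beta * cost          # net utility alpha*X_j - beta*c_ij
    util -= util.max(axis=1, keepdims=True)          # for numerical stability
    share = np.exp(util)
    share /= share.sum(axis=1, keepdims=True)        # row-wise softmax: share of O_i sent to j
    return O @ share                                 # D_j = sum_i O_i * share_ij

# toy usage with hypothetical data
rng = np.random.default_rng(0)
N, M = 20, 2
O = rng.uniform(1.0, 2.0, size=N)
cost = rng.uniform(0.0, 1.0, size=(N, M))
D = destination_flows(O, np.zeros(M), cost, alpha=1.8, beta=1.0)
```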

14 A Dynamical Urban and Regional Model
Harris and Wilson Model
The sizes of the destination zones satisfy the (modified) competitive Lotka-Volterra equations
$$\frac{dW_j}{dt} = \epsilon \left( D_j - \kappa W_j \right) W_j, \quad j = 1, \dots, M,$$
for some initial condition $W(0) = w$.
Definitions: $\kappa > 0$ is the cost rate per unit size; and $\epsilon > 0$ is the responsiveness parameter.
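A simple way to explore the deterministic dynamics is an explicit Euler discretisation; the sketch below recomputes the flows at every step and is illustrative only (the step size and helper name are assumptions).

```python
import numpy as np

def harris_wilson_step(W, O, cost, alpha, beta, kappa, eps, dt):
    """One explicit-Euler step of dW_j/dt = eps * (D_j - kappa * W_j) * W_j."""
    util = alpha * np.log(W)[None, :] - beta * cost
    util -= util.max(axis=1, keepdims=True)
    share = np.exp(util)
    share /= share.sum(axis=1, keepdims=True)
    D = O @ share                                    # destination flows at the current sizes
    return W + dt * eps * (D - kappa * W) * W

# iterating harris_wilson_step from a uniform start drives W towards an equilibrium
```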

15 Stochastic Urban and Regional Modelling

16 Potential Energy Function
In what follows, it is more natural to work in terms of attractiveness $X_j := \ln W_j$.
The energy of a configuration $X = (X_1, \dots, X_M)^T$ is given by an interaction potential $U(X)$. The force acting on each site is given by the conservative vector field $-\nabla U(X)$.
Interaction Potential
$$\gamma^{-1} U(X) = \underbrace{\kappa \sum_{j=1}^M \exp(X_j)}_{\text{Running cost}} - \underbrace{\alpha^{-1} \sum_{i=1}^N O_i \ln \sum_{j=1}^M \exp(\alpha X_j - \beta c_{ij})}_{\text{Net utility}} - \underbrace{\delta \sum_{j=1}^M X_j}_{\text{Additional}}.$$
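Under the reconstruction of the potential given above (the sign conventions are therefore an assumption), U(X) can be evaluated as follows; `scipy.special.logsumexp` handles the net-utility term stably.

```python
import numpy as np
from scipy.special import logsumexp

def potential(X, O, cost, alpha, beta, kappa, delta, gamma):
    """Interaction potential U(X) = gamma * (running cost - net utility - additional)."""
    running_cost = kappa * np.sum(np.exp(X))
    net_utility = np.sum(O * logsumexp(alpha * X[None, :] - beta * cost, axis=1)) / alpha
    additional = delta * np.sum(X)
    return gamma * (running_cost - net_utility - additional)
```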

17 More General Modelling Assumptions
We assume that:
1. There exists a potential function $U \in C^2(\mathbb{R}^M, \mathbb{R})$ such that $-\gamma^{-1}\, \partial U(X)/\partial X_j = R_j(X) - \kappa \exp(X_j) + \delta_j(X)$.
2. The drift coefficients $-\nabla U(X)$ are locally Lipschitz and satisfy the one-sided growth condition: there exist $K_1, K_2 > 0$ such that $\langle -\nabla U(X), X \rangle \leq K_1 + K_2 \|X\|^2$ for all $X \in \mathbb{R}^M$.
3. $U$ satisfies the super-linear growth condition: there exist $K_1, K_2 > 0$ such that $U(X) \geq K_1 + K_2 \|X\|$ for all $X \in \mathbb{R}^M$.
4. The initial condition $X(0) = x$ has finite variance, $\mathbb{E}\|x\|^2 < \infty$.

18 Key Results under the Modelling Assumptions
Urban dynamics, in terms of attractiveness, satisfies the overdamped Langevin dynamics
$$\frac{dX}{dt} = -\nabla U(X) + \sqrt{2\gamma^{-1}}\, \frac{dB}{dt}, \quad X(0) = x.$$
The SDE has a unique solution $X \in C([0, T]; \mathbb{R}^M)$ that does not explode on $[0, T]$, almost surely.
Since $\delta_j = \delta > 0$ and $U \in C^2$ with a growth condition, $X(t)$ is ergodic with respect to a well-defined Boltzmann distribution, whose density is given by
$$\pi(X) = \frac{1}{Z} \exp\left(-\gamma U(X)\right), \quad Z := \int \exp\left(-\gamma U(X)\right) dX < \infty.$$
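For intuition, the overdamped Langevin dynamics can be simulated with a plain Euler-Maruyama scheme. The sketch below is generic: the quadratic toy potential in the usage line is a stand-in for the urban potential, not the model itself.

```python
import numpy as np

def euler_maruyama(grad_U, x0, gamma, dt, n_steps, rng):
    """Simulate dX = -grad U(X) dt + sqrt(2/gamma) dB by Euler-Maruyama."""
    X = np.array(x0, dtype=float)
    path = [X.copy()]
    for _ in range(n_steps):
        X = X - dt * grad_U(X) + np.sqrt(2.0 * dt / gamma) * rng.standard_normal(X.shape)
        path.append(X.copy())
    return np.array(path)

# usage with a toy quadratic potential U(x) = |x|^2 / 2, so grad U(x) = x
rng = np.random.default_rng(1)
path = euler_maruyama(lambda x: x, x0=np.zeros(3), gamma=1.0, dt=1e-2, n_steps=1000, rng=rng)
```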

19 A Stochastic Formulation of the Spatial Interaction Model
With our specification of $U(X)$, the overdamped Langevin dynamics give the Harris and Wilson model plus multiplicative noise.
Stochastic Urban Retail Model
Floorspace dynamics is a stochastic process that satisfies the following Stratonovich SDE:
$$\frac{dW_j}{dt} = \epsilon W_j \left( D_j - \kappa W_j + \delta \right) + \sigma W_j \circ \frac{dB_j}{dt},$$
where $(B_1, \dots, B_M)^T$ is a standard $M$-dimensional Brownian motion and $\sigma = \sqrt{2/(\epsilon\gamma)} > 0$ is the volatility parameter.
Fluctuations (missing dynamics) are modelled as noise that we interpret in the Stratonovich sense.
The extra parameter (or function) $\delta$ represents a local economic stimulus that prevents zones from collapsing (needed for ergodicity).

20 Metropolised Time-Stepping Methods
Metropolis-Adjusted Langevin Truncated Algorithm (MALTA)
A Metropolis-Hastings algorithm with an Euler-Maruyama proposal with truncated drift:
$$\hat{X}' := \hat{X}_n - \frac{\tau \nabla U(\hat{X}_n)}{\max\left(1, \tau \|\nabla U(\hat{X}_n)\|\right)} + \sqrt{2\tau\gamma^{-1}}\, \xi_n, \quad \xi_n \sim \mathcal{N}(0, I).$$
Truncated drift: when $\tau \|\nabla U(\hat{X}_n)\| \leq 1$ this is the standard Euler-Maruyama proposal; otherwise it is an Euler-Maruyama proposal with the drift truncated to unit length.
The resulting Markov chain is geometrically ergodic with respect to $\pi$.
It converges strongly to the Markov process $\{X(t) : t \in [0, T]\}$ on finite time intervals.
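A hedged sketch of one MALTA update as described above: an Euler-Maruyama proposal whose drift is truncated to unit length when it is large, followed by a Metropolis-Hastings accept/reject step. The truncation rule `max(1, tau * ||grad U||)` follows the reconstructed proposal and should be treated as an assumption; `log_target` is the unnormalised log density, here -gamma * U.

```python
import numpy as np

def malta_step(X, log_target, grad_U, tau, gamma, rng):
    """One MALTA update targeting pi(x) proportional to exp(log_target(x))."""
    def drift(x):
        g = grad_U(x)
        return -tau * g / max(1.0, tau * np.linalg.norm(g))      # truncated drift

    def log_q(x_to, x_from):
        # Gaussian proposal N(x_from + drift(x_from), (2*tau/gamma) * I), up to constants
        diff = x_to - (x_from + drift(x_from))
        return -np.sum(diff ** 2) * gamma / (4.0 * tau)

    prop = X + drift(X) + np.sqrt(2.0 * tau / gamma) * rng.standard_normal(X.shape)
    log_alpha = (log_target(prop) - log_target(X)) + (log_q(X, prop) - log_q(prop, X))
    if np.log(rng.uniform()) < log_alpha:
        return prop, True      # accepted
    return X, False            # rejected
```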

21 Illustrative Stochastic Urban Retail Model
$M = 2$ and $N = 20$.
Figure: Sample path of the prototype example with parameter values $\alpha = 1.8$, $\beta = 1$, $\gamma = 1$, $\delta = 0.5$ and $\kappa = 1$. Under this parameter regime the two sites do not coexist and there are frequent phase transitions.

22 The Boltzmann Distribution
Insights
A probability distribution that encodes our knowledge of spatial interactions.
Configurations are confined to a metastable state (path dependence).
As $t \to \infty$, the system moves towards the lowest-energy configurations.
Sampling Challenges
$\pi$ is high-dimensional, anisotropic, multi-modal and involves rare events.
Figure: illustrative distributions for $\alpha = 1.8$, $\beta = 1$, $\delta = 0.5$; $\alpha = 1.8$, $\beta = 0$, $\delta = 0.1$; and $\alpha = 1.1$, $\beta = 10$, $\delta = 0.2$.

23 Sampling from the Boltzmann Distribution
A Simplified Initial Distribution
Define the density $\pi_0(X) \propto \exp(-\gamma_0 U_0(X))$ with
$$U_0(X) = \sum_{j=1}^M \Big[ \underbrace{\kappa \exp(X_j)}_{\text{Running cost}} - \underbrace{\Big(\delta + \tfrac{1}{M} \sum_{i=1}^N O_i\Big) X_j}_{\text{Equal net utility}} \Big],$$
for some $\gamma_0 < \gamma$. We can sample from $\pi_0$ exactly and easily.
Bridging Distributions
Define a temperature schedule $0 = t_0 < t_1 < \dots < t_T = 1$ with bridging distributions
$$\pi_j = \frac{1}{Z_j}\, \pi_0^{1 - t_j}\, \pi^{t_j}, \quad \text{where } Z_j := \int \pi_0^{1 - t_j}\, \pi^{t_j}\, dX, \quad j = 1, \dots, T.$$
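A sketch of the geometric bridge and of a single annealed-importance-sampling log-weight, assuming an externally supplied MCMC kernel (`transition`, e.g. a MALTA step) that leaves each bridging density invariant; all names are illustrative.

```python
import numpy as np

def log_bridge(X, t, log_pi0_un, log_pi_un):
    """Unnormalised log density of pi_t proportional to pi_0^(1-t) * pi^t."""
    return (1.0 - t) * log_pi0_un(X) + t * log_pi_un(X)

def ais_log_weight(x0, schedule, log_pi0_un, log_pi_un, transition, rng):
    """Log importance weight of one AIS run started from x0 ~ pi_0.

    E[exp(log weight)] equals the ratio of normalising constants Z / Z_0."""
    x, log_w = np.array(x0, dtype=float), 0.0
    for t_prev, t_next in zip(schedule[:-1], schedule[1:]):
        log_w += log_bridge(x, t_next, log_pi0_un, log_pi_un) \
               - log_bridge(x, t_prev, log_pi0_un, log_pi_un)
        x = transition(x, t_next, rng)   # MCMC move invariant for the bridge at t_next
    return log_w
```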

24 Bridging distributions
Figure: $\{\pi_j\}_{j=1}^{T=9}$, shown from left to right, then top to bottom.

25 The London Retail System: Invariant Distribution
Figure: A sample from the invariant distribution showing behaviour in the limit $t \to \infty$ (most stable configurations). The largest zone is Kensington with 12% of total floorspace, and the West End continues to be significant with 5% of total floorspace.

26 A Statistical Model of Urban Retail Structure

27 Hierarchical Modelling
Given observation data $Y = (Y_1, \dots, Y_M)$ of log-sizes, infer the parameter values $\theta = (\alpha, \beta)^T \in \mathbb{R}_+^2$ and the corresponding latent variables $X \in \mathbb{R}^M$.
Assumption (Data generating process)
Assume that the observations $Y_1, \dots, Y_M$ are generated according to the following hierarchical model:
$$Y \mid X, \sigma \sim \mathcal{N}(X, \sigma^2 I), \quad X \mid \theta \sim \pi(X \mid \theta) \propto \exp(-U(X; \theta)), \quad \theta \sim \pi(\theta).$$

28 Joint Posterior Distribution
The joint posterior is given by
$$\pi(X, \theta \mid Y) = \frac{\pi(Y \mid X, \theta)\, \pi(X, \theta)}{\pi(Y)}, \quad \pi(Y) = \int \pi(Y \mid X, \theta)\, \pi(X, \theta)\, dX\, d\theta.$$
We have a hierarchical prior given by
$$\pi(X, \theta) = \pi(\theta)\, \frac{\exp(-U(X; \theta))}{Z(\theta)}, \quad Z(\theta) = \int \exp(-U(X; \theta))\, dX.$$
The joint posterior is doubly intractable:
$$\pi(X, \theta \mid Y) = \frac{\pi(Y \mid X, \theta) \exp(-U(X; \theta))\, \pi(\theta)}{\pi(Y)\, Z(\theta)}.$$

29 Metropolis-within-Gibbs with Block Updates
We are interested in low-order summary statistics of the form
$$\mathbb{E}_{X, \theta \mid Y}[h(X, \theta)] = \int h(X, \theta)\, \pi(X, \theta \mid Y)\, dX\, d\theta.$$
$X$ and $\theta$ are highly coupled, so we use Metropolis-within-Gibbs with block updates.
$X$-update. Propose $X' \sim Q_X$ and accept with probability
$$\min\left\{1, \frac{\pi(Y \mid X', \theta) \exp(-U(X'; \theta))\, q(X \mid X')}{\pi(Y \mid X, \theta) \exp(-U(X; \theta))\, q(X' \mid X)}\right\}.$$
$\theta$-update. Propose $\theta' \sim Q_\theta$ and accept with probability
$$\min\left\{1, \frac{\pi(Y \mid X, \theta')\, Z(\theta) \exp(-U(X; \theta'))\, q(\theta \mid \theta')}{\pi(Y \mid X, \theta)\, Z(\theta') \exp(-U(X; \theta))\, q(\theta' \mid \theta)}\right\}.$$
Unfortunately, the ratio $Z(\theta)/Z(\theta')$ is intractable!
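A minimal sketch of the X-block update, assuming a symmetric random-walk proposal so that the q-ratio cancels; note that Z(θ) never enters, which is why this block is tractable. `neg_U(x, theta)` is assumed to return -U(x; θ) and `sigma2` is the observation noise variance.

```python
import numpy as np

def x_update(X, theta, Y, sigma2, neg_U, rng, step=0.1):
    """Random-walk Metropolis update of the latent attractiveness block X."""
    def log_unnorm(x):
        # log pi(Y | x, theta) + log exp(-U(x; theta)); Z(theta) cancels in the ratio
        return -np.sum((Y - x) ** 2) / (2.0 * sigma2) + neg_U(x, theta)

    prop = X + step * rng.standard_normal(X.shape)
    if np.log(rng.uniform()) < log_unnorm(prop) - log_unnorm(X):
        return prop
    return X
```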

30 Unbiased Estimates of the Partition Function
We can use the Pseudo-Marginal MCMC framework if we have an unbiased, positive estimate of $\pi(X \mid \theta)$, denoted $\hat{\pi}(X \mid \theta, u)$, satisfying
$$\pi(X \mid \theta) = \int \hat{\pi}(X \mid \theta, u)\, \pi(u \mid \theta)\, du.$$
The Forward Coupling Estimator (FCE) gives an unbiased estimate of $1/Z$: $\mathbb{E}[S] = 1/Z$.
The idea is to find two sequences of consistent estimators $\{Y^{(i)}\}$, $\{\tilde{Y}^{(i)}\}$, each with the same distribution, such that $Y^{(i)}$ and $\tilde{Y}^{(i-1)}$ are coupled.

31 Unbiased Estimates of the Partition Function
Requires $N$ estimates of $1/Z$ using annealed importance sampling, for a random stopping time $N$.
Coupling between $Y^{(i)}$ and $\tilde{Y}^{(i-1)}$ is introduced with a Markov chain that shares random numbers. This is a variance reduction technique.
The unbiased estimate is then given by
$$S := Y^{(0)} + \sum_{i=1}^N \frac{Y^{(i)} - \tilde{Y}^{(i-1)}}{\Pr(N \geq i)}.$$
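The debiasing sum itself is straightforward once the coupled AIS estimates and the survival probabilities of the stopping time are available; a sketch under those assumptions (it does not include the machinery that produces the inputs).

```python
def coupled_debiasing_estimate(Y_seq, Y_tilde_seq, p_geq):
    """S = Y^(0) + sum_{i=1}^{N} (Y^(i) - Ytilde^(i-1)) / Pr(N >= i).

    Y_seq: [Y^(0), ..., Y^(N)];  Y_tilde_seq: [Ytilde^(0), ..., Ytilde^(N-1)];
    p_geq(i): survival probability Pr(N >= i) of the random stopping time."""
    S = Y_seq[0]
    for i in range(1, len(Y_seq)):
        S += (Y_seq[i] - Y_tilde_seq[i - 1]) / p_geq(i)
    return S
```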

32 The Sign Problem
$S$ can be negative when $Y^{(i)} < \tilde{Y}^{(i-1)}$ for many $i$. This is known as the sign problem.
Rejecting when $S$ is negative would introduce a bias. Instead, we note that
$$\mathbb{E}[h(X, \theta)] = \frac{1}{\pi(Y)} \int h(X, \theta)\, \pi(Y \mid X, \theta)\, \hat{\pi}(X \mid \theta, u)\, \pi(\theta)\, \pi(u)\, du\, d\theta\, dX = \frac{\int h(X, \theta)\, \sigma(X \mid \theta, u)\, \check{\pi}(X, \theta, u \mid Y)\, du\, d\theta\, dX}{\int \sigma(X \mid \theta, u)\, \check{\pi}(X, \theta, u \mid Y)\, du\, d\theta\, dX},$$
where $\sigma$ is the sign function and we have defined
$$\check{\pi}(X, \theta, u \mid Y) = \frac{\pi(Y \mid X, \theta)\, |\hat{\pi}(X \mid \theta, u)|\, \pi(\theta)\, \pi(u)}{\int \pi(Y \mid X, \theta)\, |\hat{\pi}(X \mid \theta, u)|\, \pi(\theta)\, \pi(u)\, du\, d\theta\, dX}.$$

33 Pseudo-Marginal Markov Chain
We can sample from $\check{\pi}(X, \theta, u \mid Y)$ using Metropolis-within-Gibbs with block updates.
$X$-update. Propose $X' \sim Q_X$ and accept with probability
$$\min\left\{1, \frac{\pi(Y \mid X', \theta) \exp(-U(X'; \theta))\, q(X \mid X')}{\pi(Y \mid X, \theta) \exp(-U(X; \theta))\, q(X' \mid X)}\right\}.$$
$(\theta, u)$-update. Propose $(\theta', u') \sim Q_{\theta, u}$ and accept with probability
$$\min\left\{1, \frac{\pi(Y \mid X, \theta')\, |S(\theta', u')| \exp(-U(X; \theta'))\, \pi(\theta')\, \pi(u')\, q(\theta \mid \theta')}{\pi(Y \mid X, \theta)\, |S(\theta, u)| \exp(-U(X; \theta))\, \pi(\theta)\, \pi(u)\, q(\theta' \mid \theta)}\right\}.$$
Posterior expectations are estimated using
$$\mathbb{E}_{X, \theta \mid Y}[h(X, \theta)] = \lim_{N \to \infty} \frac{\sum_{i=1}^N h(X_i, \theta_i)\, \sigma(X_i \mid \theta_i, u_i)}{\sum_{i=1}^N \sigma(X_i \mid \theta_i, u_i)}.$$
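Once the chain targeting the absolute-value distribution has been run, the sign-corrected estimate is just a ratio of sign-weighted sums; a minimal sketch, with `h_vals` and `signs` assumed to be collected along the chain.

```python
import numpy as np

def signed_posterior_mean(h_vals, signs):
    """Estimate E[h(X, theta) | Y] as a ratio of sign-weighted averages."""
    h_vals, signs = np.asarray(h_vals, dtype=float), np.asarray(signs, dtype=float)
    return np.sum(h_vals * signs) / np.sum(signs)
```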

34 Illustration: Airports in England
Figure: Left: trace plots of $\alpha$ and $\beta$; 10.1% of the $\sigma$ values are negative. The posterior means of $\alpha$ and $\beta$ are 0.97 and 0.94, respectively, indicating a moderate attraction towards larger but fairly local airports. Right: 5th and 95th percentiles of the latent states $X$. There tends to be more variability in larger airports such as Heathrow and Manchester.

35 Conclusions and Outlook

36 Further Work and Extensions
Investigation of new data assimilation methodologies to calibrate models to data available at different scales, for example: population data; the cost matrix; or time-dependent parameters.
Deployment of the new methodology to a global problem to provide new insights into urban retail structure.
Improved mechanistic models of urban and regional phenomena.
Melding of data and models takes us beyond data analytics.

37 Summary
We have revisited the Harris and Wilson model to formally account for uncertainties in the modelling process.
A stochastic extension of the model, under certain conditions, may be expressed as a Langevin diffusion.
The invariant distribution provides new insights into urban and regional dynamics, and suggests the introduction of a new parameter into the model.
We presented a Bayesian hierarchical model for urban and regional systems.
We have demonstrated our approach using an example of airports in England.

38 Further Reading / References
1. Equilibrium values and dynamics of attractiveness terms, B. Harris and A. Wilson (1978).
2. Explorations in Urban and Regional Dynamics, J. Dearden and A. Wilson (2015).
3. Exponential convergence of Langevin distributions and their discrete approximations, G. Roberts and R. Tweedie (1996).
4. Pathwise accuracy and ergodicity of metropolized integrators for SDEs, N. Bou-Rabee and E. Vanden-Eijnden (2009).
5. On Russian roulette estimates for Bayesian inference with doubly-intractable likelihoods, A. Lyne et al. (2015).
6. Markov Chain Truncation for Doubly-Intractable Inference, C. Wei and I. Murray (2016).
