Inference in state-space models with multiple paths from conditional SMC


1 Inference in state-space models with multiple paths from conditional SMC. Sinan Yıldırım (Sabancı), joint work with Christophe Andrieu (Bristol), Arnaud Doucet (Oxford) and Nicolas Chopin (ENSAE). September 20.

2 State-space models. [Graphical model: a chain $X_1 \to X_2 \to \cdots \to X_{t-1} \to X_t$ with transitions $f_\theta$, each $X_t$ emitting $Y_t$ through $g_\theta$.] $\{X_t, t \geq 1\}$ is a Markov chain: $X_1 \sim \mu_\theta(\cdot)$ and $X_t \mid (X_{t-1} = x_{t-1}) \sim f_\theta(\cdot \mid x_{t-1})$. We only have access to the conditionally independent observation process $\{Y_t, t \geq 1\}$, with $Y_t \mid (X_t = x_t) \sim g_\theta(\cdot \mid x_t)$.
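
For concreteness, here is a minimal Python sketch of simulating such a model; the sampler arguments stand in for $\mu_\theta$, $f_\theta$ and $g_\theta$, and the linear-Gaussian choices in the example call are illustrative assumptions only, not the models used later in the talk.

```python
import numpy as np

def simulate_ssm(n, mu_sample, f_sample, g_sample, rng):
    """Simulate (x_{1:n}, y_{1:n}) from a state-space model given
    samplers for mu_theta, f_theta and g_theta."""
    x = np.empty(n)
    y = np.empty(n)
    x[0] = mu_sample(rng)                 # X_1 ~ mu_theta
    y[0] = g_sample(x[0], rng)            # Y_1 | X_1 ~ g_theta
    for t in range(1, n):
        x[t] = f_sample(x[t - 1], rng)    # X_t | X_{t-1} ~ f_theta
        y[t] = g_sample(x[t], rng)        # Y_t | X_t ~ g_theta
    return x, y

# Illustrative linear-Gaussian choices: X_t = 0.9 X_{t-1} + V_t, Y_t = X_t + W_t.
rng = np.random.default_rng(0)
x, y = simulate_ssm(
    100,
    mu_sample=lambda r: r.normal(0.0, 1.0),
    f_sample=lambda xp, r: 0.9 * xp + r.normal(0.0, 1.0),
    g_sample=lambda xt, r: xt + r.normal(0.0, 0.5),
    rng=rng,
)
```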

3 The Metropolis-Hastings algorithm. We are interested in sampling from the posterior distribution of $\theta$ given $y_{1:n}$,
$p(\theta \mid y_{1:n}) \propto \eta(\theta)\, p_\theta(y_{1:n})$,
where $\eta(\theta)$ is the prior density for $\theta$.
Metropolis-Hastings (MH). Given $\theta$:
1. Propose $\theta' \sim q(\theta, \cdot)$.
2. Accept $\theta'$ with probability
$\min\left\{1,\ \frac{q(\theta', \theta)}{q(\theta, \theta')} \frac{\eta(\theta')}{\eta(\theta)} \frac{p_{\theta'}(y_{1:n})}{p_\theta(y_{1:n})}\right\}$;
otherwise reject and keep $\theta$.
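
A minimal sketch of this sampler in Python, assuming a symmetric random-walk proposal $q$ (so the $q$ ratio cancels) and a hypothetical log_lik function returning $\log p_\theta(y_{1:n})$, which, as the next slide explains, is exactly what is unavailable for state-space models:

```python
import numpy as np

def mh(theta0, log_lik, log_prior, n_iters, step, y, rng):
    """Random-walk Metropolis-Hastings targeting p(theta | y)."""
    theta = theta0
    lp = log_prior(theta) + log_lik(theta, y)
    chain = []
    for _ in range(n_iters):
        prop = theta + step * rng.normal(size=np.shape(theta))  # symmetric q
        lp_prop = log_prior(prop) + log_lik(prop, y)
        # Log acceptance ratio; the q terms cancel for a symmetric proposal.
        if np.log(rng.uniform()) < lp_prop - lp:
            theta, lp = prop, lp_prop
        chain.append(theta)
    return np.array(chain)
```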

4 Intractability. The MH acceptance ratio is
$r(\theta, \theta') := \frac{q(\theta', \theta)\, \eta(\theta')\, p_{\theta'}(y_{1:n})}{q(\theta, \theta')\, \eta(\theta)\, p_\theta(y_{1:n})}$.
The likelihood is intractable:
$p_\theta(y_{1:n}) = \int_{\mathsf{X}^n} \mu_\theta(x_1)\, g_\theta(y_1 \mid x_1) \prod_{i=2}^{n} g_\theta(y_i \mid x_i)\, f_\theta(x_i \mid x_{i-1})\, dx_{1:n}$.
There are effective methods to estimate $p_\theta(y_{1:n})$ unbiasedly, based on sequential Monte Carlo (SMC), a.k.a. the particle filter.
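
One such method is the bootstrap particle filter. Below is a sketch under the assumption that the user supplies samplers for $\mu_\theta$ and $f_\theta$ and the log observation density $\log g_\theta$ (all hypothetical names); the product of the per-step average weights is the unbiased estimate of $p_\theta(y_{1:n})$, returned here on the log scale:

```python
import numpy as np

def bootstrap_pf_loglik(y, M, mu_sample, f_sample, log_g, rng):
    """Bootstrap particle filter with M particles; returns the log of
    an unbiased estimate of p_theta(y_{1:n})."""
    n = len(y)
    x = mu_sample(M, rng)                        # particles at t = 1
    log_est = 0.0
    for t in range(n):
        logw = log_g(y[t], x)                    # weight by g_theta(y_t | x_t)
        c = logw.max()
        w = np.exp(logw - c)
        log_est += c + np.log(w.mean())          # log p_hat(y_t | y_{1:t-1})
        idx = rng.choice(M, size=M, p=w / w.sum())  # multinomial resampling
        x = x[idx]
        if t < n - 1:
            x = f_sample(x, rng)                 # propagate via f_theta
    return log_est
```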

5 Pseudo-marginal approach for state-space models. Replace $p_\theta(y_{1:n})$ with a non-negative unbiased estimator obtained from SMC:
$\mathbb{E}[\hat{p}_\theta(y_{1:n})] = p_\theta(y_{1:n})$ for all $\theta \in \Theta$.
Pseudo-marginal MH (Andrieu and Roberts, 2009; Andrieu et al., 2010). Given $\theta$ and $\hat{p}_\theta(y_{1:n})$:
1. Propose $\theta' \sim q(\theta, \cdot)$.
2. Run an SMC algorithm to obtain $\hat{p}_{\theta'}(y_{1:n})$.
3. Accept $(\theta', \hat{p}_{\theta'}(y_{1:n}))$ with probability
$\min\left\{1,\ \frac{q(\theta', \theta)}{q(\theta, \theta')} \frac{\eta(\theta')}{\eta(\theta)} \frac{\hat{p}_{\theta'}(y_{1:n})}{\hat{p}_\theta(y_{1:n})}\right\}$;
otherwise reject and keep $(\theta, \hat{p}_\theta(y_{1:n}))$.
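
Combining the previous two sketches gives a pseudo-marginal chain: the exact log-likelihood in the MH sketch is replaced by a fresh SMC estimate at every proposal, and the current estimate is carried along with $\theta$. Again a sketch, with log_lik_hat a hypothetical wrapper around a particle filter such as the one above:

```python
import numpy as np

def pmmh(theta0, log_lik_hat, log_prior, n_iters, step, y, rng):
    """Pseudo-marginal MH: log_lik_hat(theta, y, rng) returns the log of an
    unbiased SMC estimate of p_theta(y_{1:n}), re-drawn at every proposal."""
    theta = theta0
    lp = log_prior(theta) + log_lik_hat(theta, y, rng)
    chain = []
    for _ in range(n_iters):
        prop = theta + step * rng.normal(size=np.shape(theta))
        lp_prop = log_prior(prop) + log_lik_hat(prop, y, rng)  # fresh SMC run
        if np.log(rng.uniform()) < lp_prop - lp:
            theta, lp = prop, lp_prop  # keep the accepted estimate with theta
        chain.append(theta)
    return np.array(chain)
```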

6 Scalability issues with pseudo-marginal algorithms. In pseudo-marginal algorithms, one estimates $p_\theta(y_{1:n})$ and $p_{\theta'}(y_{1:n})$ independently. For a fixed number of particles, the variability of $\hat{p}_\theta(y_{1:n})$ increases with $n$, and as $\theta' \to \theta$ the variability of $\hat{p}_{\theta'}(y_{1:n})/\hat{p}_\theta(y_{1:n})$ does not vanish. The variability of the acceptance ratio
$\frac{q(\theta', \theta)}{q(\theta, \theta')} \frac{\eta(\theta')}{\eta(\theta)} \frac{\hat{p}_{\theta'}(y_{1:n})}{\hat{p}_\theta(y_{1:n})}$
has a big impact on performance (Andrieu and Vihola, 2014, 2015). Alternatives: the correlated pseudo-marginal algorithm of Deligiannidis et al. (2018), or estimating $p_{\theta'}(y_{1:n})/p_\theta(y_{1:n})$ directly.

7 Estimating the likelihood ratio. Shorthand notation: let $x = x_{1:n}$ and $y = y_{1:n}$. Given $\theta$ and $\theta'$, choose $\tilde{\theta} \in \Theta$, e.g. $\tilde{\theta} = (\theta + \theta')/2$. If $R_{\tilde{\theta}}$ is a Markov transition kernel leaving $p_{\tilde{\theta}}(\cdot \mid y)$ invariant, then
$\frac{p_{\theta'}(y)}{p_\theta(y)} = \frac{p_{\tilde{\theta}}(y)}{p_\theta(y)} \cdot \frac{p_{\theta'}(y)}{p_{\tilde{\theta}}(y)} = \int\!\!\int \frac{p_{\tilde{\theta}}(x, y)}{p_\theta(x, y)} \frac{p_{\theta'}(x', y)}{p_{\tilde{\theta}}(x', y)}\, p_\theta(x \mid y)\, R_{\tilde{\theta}}(x, x')\, dx\, dx'$.
This yields variance reduction (Crooks, 1998; Jarzynski, 1997; Neal, 1996, 2001). If we further have reversibility, $p_{\tilde{\theta}}(x \mid y)\, R_{\tilde{\theta}}(x, x') = p_{\tilde{\theta}}(x' \mid y)\, R_{\tilde{\theta}}(x', x)$, then we have a valid algorithm for $p(\theta, x \mid y) \propto \eta(\theta)\, p_\theta(x, y)$ (Neal, 2004).

8 AIS based MCMC with one-step annealing. AIS MCMC for SSM:
1. Given $(\theta, x)$, sample $\theta' \sim q(\theta, \cdot)$.
2. Let $\tilde{\theta} = (\theta + \theta')/2$ and sample $x' \sim R_{\tilde{\theta}}(x, \cdot)$.
3. Accept $(\theta', x')$ with probability
$\min\left\{1,\ \frac{q(\theta', \theta)}{q(\theta, \theta')} \frac{\eta(\theta')}{\eta(\theta)} \frac{p_{\tilde{\theta}}(x, y)}{p_\theta(x, y)} \frac{p_{\theta'}(x', y)}{p_{\tilde{\theta}}(x', y)}\right\}$;
otherwise reject and keep $(\theta, x)$.
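
A sketch of one iteration, with the $p_{\tilde{\theta}}(\cdot \mid y)$-invariant kernel passed in as a black box: kernel_step and log_p are hypothetical stand-ins for a draw from $R_{\tilde{\theta}}(x, \cdot)$ and for $\log p_\theta(x, y)$, and a symmetric random-walk proposal on a scalar $\theta$ is assumed.

```python
import numpy as np

def ais_mcmc_step(theta, x, y, log_p, log_prior, kernel_step, step, rng):
    """One iteration of AIS MCMC with one-step annealing and a
    symmetric random-walk proposal on theta."""
    prop = theta + step * rng.normal()
    mid = 0.5 * (theta + prop)                   # theta_tilde
    x_prop = kernel_step(mid, x, rng)            # x' ~ R_{theta_tilde}(x, .)
    log_ratio = (log_prior(prop) - log_prior(theta)
                 + log_p(mid, x, y) - log_p(theta, x, y)          # first factor
                 + log_p(prop, x_prop, y) - log_p(mid, x_prop, y))  # second factor
    if np.log(rng.uniform()) < log_ratio:
        return prop, x_prop
    return theta, x
```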

9 Reversible kernel: conditional SMC (Andrieu et al., 2010).

10 csmc and path degeneracy. [Figure: two panels of particle trajectories, state versus time index, illustrating path degeneracy in csmc. Picture created by Olivier Cappé.]

11 csmc with backward sampling (Whiteley, 2010). csmc$(M, \theta, x)$: csmc at $\theta$ with $M$ particles, conditional on the path $x$. Let $R^M_\theta(x, \cdot)$ be the Markov kernel of csmc$(M, \theta, x)$ followed by backward sampling. $R^M_\theta(x, \cdot)$ is a $p_\theta(x \mid y)$-invariant Markov kernel (Andrieu et al., 2010). More importantly, $R^M_\theta$ is reversible (Chopin and Singh, 2015).

12 Numerical experiments. Non-linear benchmark model. The latent variables and observations of the state-space model are defined as
$X_t = X_{t-1}/2 + 25 X_{t-1}/(1 + X_{t-1}^2) + 8 \cos(1.2 t) + V_t$, $t \geq 2$,
$Y_t = X_t^2/20 + W_t$, $t \geq 1$,
where $X_1 \sim \mathcal{N}(0, 10)$, $V_t \overset{iid}{\sim} \mathcal{N}(0, \sigma_v^2)$ and $W_t \overset{iid}{\sim} \mathcal{N}(0, \sigma_w^2)$. The static parameter of the model is then $\theta = (\sigma_v^2, \sigma_w^2)$.
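
A direct transcription of these equations into Python (the parameter values and seed in the example call are illustrative only):

```python
import numpy as np

def simulate_benchmark(n, sigma_v2, sigma_w2, rng):
    """Simulate the nonlinear benchmark state-space model above."""
    x = np.empty(n)
    x[0] = rng.normal(0.0, np.sqrt(10.0))        # X_1 ~ N(0, 10)
    for t in range(1, n):
        x[t] = (x[t - 1] / 2.0
                + 25.0 * x[t - 1] / (1.0 + x[t - 1] ** 2)
                + 8.0 * np.cos(1.2 * (t + 1))    # slides index time from 1
                + rng.normal(0.0, np.sqrt(sigma_v2)))
    y = x ** 2 / 20.0 + rng.normal(0.0, np.sqrt(sigma_w2), size=n)
    return x, y

x, y = simulate_benchmark(500, sigma_v2=10.0, sigma_w2=1.0,
                          rng=np.random.default_rng(1))
```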

13 Comparisons with fixed M and varying n. $M = 200$ for MwPG (Lindsten et al., 2015) and AIS MCMC, and $M = 2000$ for PMMH. The slide reports integrated autocorrelation (IAC) times, averaged over 200 runs, for $\sigma_v^2$ and $\sigma_w^2$ under AIS MCMC (csmc+BS), MwPG and PMMH at several values of $n$. [Table and figure of IAC times versus $n$; numerical values lost in transcription.]

14 Comparison on a sticky model (Pitt and Shephard, 1999).
$X_t = 0.98 X_{t-1} + V_t$, $V_t \sim \mathcal{N}(0, 0.02)$,
$Y_t = X_t + \mu + W_t$, $W_t \sim \mathcal{N}(0, 0.1)$.
For this comparison, $R_{\theta, \theta', k}(x, \cdot) = \pi_{\theta, \theta', k}$ is used. IAC times are averaged over 100 runs for each value of the standard deviation $\sigma_\mu$ of the symmetric RW proposal. [Figure: IAC time versus $\sigma_\mu$ for AIS MCMC (csmc+BS with $K$ intermediate steps) and MwG, at several values of $n$.]

15 Preliminary theoretical results. We considered the situation
$p_\theta(y_{1:n}) := \prod_{t=1}^{n} p_\theta(y_t) = \prod_{t=1}^{n} \int_{\mathsf{X}} p_\theta(x_t, y_t)\, dx_t$.
We make some assumptions, including:
1. $\Theta \subset \mathbb{R}$ is compact.
2. For any $(x, y) \in \mathsf{X} \times \mathsf{Y}$, $\theta \mapsto \log p_\theta(x, y)$ is three times differentiable, with derivatives uniformly bounded in $\theta, x, y$.
3. $\theta' = \theta + \epsilon/\sqrt{n}$ with $\epsilon \sim q_0(\cdot)$, a symmetric distribution with density bounded away from 0.
4. $\tilde{\theta} = (\theta + \theta')/2$.
5. AIS MCMC uses a product $\big(R^M_{\tilde{\theta}, y}\big)^n$ of $p_{\tilde{\theta}}(\cdot \mid y)$-invariant reversible Markov transition kernels, with
$\lim_{M \to \infty} \sup_{(\theta, x, y) \in \Theta \times \mathsf{X} \times \mathsf{Y}} \big\| R^M_{\tilde{\theta}, y}(x, \cdot) - p_{\tilde{\theta}}(\cdot \mid y) \big\|_{tv} = 0$.

17 Some asymptotics. $\xi = \{X_t, X_t'\}_{t \geq 1}$, $\omega = \{Y_t\}_{t \geq 1}$; $\mathbb{E}^\omega_n$ denotes conditional expectation given the observations $\omega$. $r_n(\theta, \theta'; \xi)$: the estimated acceptance ratio used in AIS MCMC; $r_n(\theta, \theta')$: the exact acceptance ratio of MH.
Theorem. Under some assumptions, $\mathbb{P}$-almost surely, for any $\varepsilon_0 > 0$ there exist $n_0, M_0 \in \mathbb{N}$ such that for any sequence $\{M_n\} \subset \mathbb{N}$ satisfying $M_n \geq M_0$ for $n \geq n_0$,
$\sup_{n \geq n_0} \left| \mathbb{E}^\omega_n\!\left[\min\{1, r_n(\theta, \theta'; \xi)\}\right] - \mathbb{E}^\omega_n\!\left[\min\{1, r_n(\theta, \theta') \exp(Z)\}\right] \right| \leq \varepsilon_0$,
where $Z \mid (\theta, \theta', \omega) \sim \mathcal{N}\!\left(-\tfrac{\sigma^2(\theta, \theta')}{2},\ \sigma^2(\theta, \theta')\right)$,
$\sigma^2(\theta, \theta') := (\theta' - \theta)^2\, \mathbb{E}\!\left[-\tfrac{\partial^2}{\partial \theta^2} \log p_{\tilde{\theta}}(X \mid Y)\right]$, and $\tilde{\theta} = (\theta + \theta')/2$.

21 More general AIS MCMC with $K \geq 1$ intermediate steps. Consider $\mathcal{P}_{\theta, \theta', K} := \{\pi_{\theta, \theta', k},\ k = 0, \ldots, K+1\}$, where
$\pi_{\theta, \theta', k}(x) := p_{\theta_k}(x \mid y) \propto p_{\theta_k}(x, y) =: \gamma_{\theta, \theta', k}(x)$,
with annealing schedule $\theta_k = \theta'\, k/(K+1) + \theta\, (1 - k/(K+1))$.
Also consider the associated Markov kernels $\mathcal{R}_{\theta, \theta', K} := \{R_{\theta, \theta', k} : \mathsf{X}^n \times \mathsf{X}^n \to [0, 1],\ k = 1, \ldots, K\}$.
In this case the estimator is of the form
$\prod_{k=0}^{K} \frac{\gamma_{\theta, \theta', k+1}(x_k)}{\gamma_{\theta, \theta', k}(x_k)}$,
where $x_1 \sim R_{\theta, \theta', 1}(x_0, \cdot)$ and $x_k \sim R_{\theta, \theta', k}(x_{k-1}, \cdot)$ for $k = 2, \ldots, K$.
With $(\theta', x_K)$ being the proposal, we can design a correct algorithm.
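
In outline, the $K$-step estimator can be computed as below, on the log scale; log_gamma and the kernels list are hypothetical stand-ins for $\log \gamma_{\theta, \theta', k}$ and for draws from $R_{\theta, \theta', k}$.

```python
import numpy as np

def ais_log_estimator(x0, log_gamma, kernels, K, rng):
    """Annealed importance sampling estimate, on the log scale, of the
    product prod_{k=0}^{K} gamma_{k+1}(x_k) / gamma_k(x_k).

    log_gamma(k, x): log of the unnormalised density gamma_{theta,theta',k}(x).
    kernels[k-1](x, rng): a draw from R_{theta,theta',k}(x, .), invariant
    for pi_{theta,theta',k}.
    """
    x = x0
    log_w = log_gamma(1, x) - log_gamma(0, x)            # k = 0 term
    for k in range(1, K + 1):
        x = kernels[k - 1](x, rng)                       # x_k ~ R_k(x_{k-1}, .)
        log_w += log_gamma(k + 1, x) - log_gamma(k, x)   # k-th term of the product
    return log_w, x                                      # x is x_K, the proposed path
```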

22 Response to increasing budget: fixed n and varying M, K. We fixed $n = 500$ and ran each combination 200 times. On each row: AIS MCMC (csmc+BS) with $M_0 = 500$ particles and $K$ intermediate steps, against the MwPG and PMMH algorithms with $M = K M_0$ particles. [Table: IAC times for $\sigma_v^2$ and $\sigma_w^2$ at several values of $K$; numerical values lost in transcription.]

23 Using all paths of csmc. The law of $k := (k_1, \ldots, k_n)$ in backward sampling, conditional on the particle system $\zeta = x^{(1:M)}_{1:n}$, is $\phi_{\tilde{\theta}}(k \mid \zeta)$; the resulting path is $\zeta(k)$.
For any $\theta, \theta', \tilde{\theta} \in \Theta$ and paths $x, x' \in \mathsf{X}^n$, define
$r_{x, x'}(\theta, \theta'; \tilde{\theta}) = \frac{q(\theta', \theta)\, \eta(\theta')}{q(\theta, \theta')\, \eta(\theta)} \frac{p_{\tilde{\theta}}(x, y)\, p_{\theta'}(x', y)}{p_\theta(x, y)\, p_{\tilde{\theta}}(x', y)}$.
Theorem: unbiased estimator of the acceptance ratio. Let $x \sim p_\theta(\cdot \mid y)$ and $\zeta \mid x \sim$ csmc$(M, \tilde{\theta}, x)$. Then
$\sum_{k \in [M]^n} \phi_{\tilde{\theta}}(k \mid \zeta)\, r_{x, \zeta(k)}(\theta, \theta'; \tilde{\theta})$
is an unbiased estimator of $r(\theta, \theta')$.

25 [Algorithm: AIS MCMC using all paths of csmc.]
1. Sample $\theta' \sim q(\theta, \cdot)$ and $v \sim \mathcal{U}(0, 1)$.
2. Set $\tilde{\theta} = (\theta + \theta')/2$.
3. If $v \leq 1/2$ then:
4. Run a csmc$(M, \tilde{\theta}, x)$ to obtain $\zeta$.
5. Set $x' = \zeta(k)$ with probability proportional to $\phi_{\tilde{\theta}}(k \mid \zeta)\, r_{x, \zeta(k)}(\theta, \theta'; \tilde{\theta})$.
6. Accept $(\theta', x')$ with probability $\min\left\{1, \sum_{k \in [M]^n} \phi_{\tilde{\theta}}(k \mid \zeta)\, r_{x, \zeta(k)}(\theta, \theta'; \tilde{\theta})\right\}$; otherwise reject.
7. Else:
8. Run a csmc$(M, \tilde{\theta}, x)$ to obtain $\zeta$.
9. Set $x' = \zeta(k)$ with probability $\phi_{\tilde{\theta}}(k \mid \zeta)$.
10. Accept $(\theta', x')$ with probability $\min\left\{1, \left[\sum_{k \in [M]^n} \phi_{\tilde{\theta}}(k \mid \zeta)\, r_{x', \zeta(k)}(\theta', \theta; \tilde{\theta})\right]^{-1}\right\}$; otherwise reject.

26 Theoretical results.
Theorem: exactness of the algorithm. The presented algorithm targets $p(\theta, x \mid y)$.
Corollary (to the exactness of the algorithm). Let $x \sim p_\theta(\cdot \mid y)$, $\zeta \mid x \sim$ csmc$(M, \tilde{\theta}, x)$, and let $x'$ be drawn by backward sampling conditional upon $\zeta$. Then
$\sum_{k \in [M]^n} \phi_{\tilde{\theta}}(k \mid \zeta)\, r_{x', \zeta(k)}(\theta', \theta; \tilde{\theta})$
is an unbiased estimator of $r(\theta, \theta')^{-1}$.

27 Computation load. The computations needed can be performed with a complexity of $O(M^2 n)$: the acceptance ratios can be computed with a sum-product algorithm, and sampling a path $\zeta(k)$ can be performed with a forward-filtering backward-sampling algorithm. However, $O(M^2 n)$ can still be overwhelming, especially when $M$ is large.

28 Reduced computation via subsampling. Let $u^{(1)}, \ldots, u^{(N)}$ be paths independently sampled via backward sampling conditional on $\zeta$. We can still target $p(\theta, x \mid y)$ using
$\frac{1}{N} \sum_{i=1}^{N} r_{x, u^{(i)}}(\theta, \theta'; \tilde{\theta})$,
which is an unbiased estimator of $r(\theta, \theta')$. Computational complexity: $O(NMn)$ per iteration instead of $O(M^2 n)$; moreover, sampling the $N$ paths can be parallelised.
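
A sketch of this estimator, assuming hypothetical routines backward_sample (one backward-sampled path from the csmc particle system $\zeta$) and log_r (returning $\log r_{x,u}(\theta, \theta'; \tilde{\theta})$); the loop over the $N$ draws is embarrassingly parallel:

```python
import numpy as np

def subsampled_ratio_estimate(x, zeta, log_r, backward_sample, N, rng):
    """Average of r_{x,u^(i)}(theta, theta'; theta_tilde) over N
    backward-sampled paths; unbiased for r(theta, theta')."""
    # The N draws are conditionally independent given zeta, so this
    # loop can be parallelised across workers.
    log_rs = np.array([log_r(x, backward_sample(zeta, rng)) for _ in range(N)])
    c = log_rs.max()
    return np.exp(c) * np.mean(np.exp(log_rs - c))  # numerically stabilised mean
```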

30 [Algorithm: AIS MCMC with N subsampled csmc paths.]
1. Sample $\theta' \sim q(\theta, \cdot)$ and $v \sim \mathcal{U}(0, 1)$.
2. Set $\tilde{\theta} = (\theta + \theta')/2$.
3. If $v \leq 1/2$ then:
4. Run a csmc$(M, \tilde{\theta}, x)$ to obtain particles $\zeta$.
5. Draw $N$ paths $u^{(1)}, \ldots, u^{(N)}$ with backward sampling.
6. Set $x' = u^{(k)}$ with probability proportional to $r_{x, u^{(k)}}(\theta, \theta'; \tilde{\theta})$.
7. Accept $(\theta', x')$ with probability $\min\left\{1, \frac{1}{N} \sum_{i=1}^{N} r_{x, u^{(i)}}(\theta, \theta'; \tilde{\theta})\right\}$; otherwise reject and keep $(\theta, x)$.
8. Else:
9. Run a csmc$(M, \tilde{\theta}, x)$ to obtain particles $\zeta$.
10. Set $u^{(1)} = x$.
11. Draw $N$ paths $x', u^{(2)}, u^{(3)}, \ldots, u^{(N)}$ with backward sampling.
12. Accept $(\theta', x')$ with probability $\min\left\{1, \left[\frac{1}{N} \sum_{i=1}^{N} r_{x', u^{(i)}}(\theta', \theta; \tilde{\theta})\right]^{-1}\right\}$; otherwise reject and keep $(\theta, x)$.

31 Theoretical results.
Theorem: exactness. The presented algorithm targets $p(\theta, x \mid y)$.
Corollary (to the exactness of the algorithm). Let $x \sim p_\theta(\cdot \mid y)$ and $\zeta \mid x \sim$ csmc$(M, \tilde{\theta}, x)$. Let $u^{(1)} = x$, and let $x', u^{(2)}, u^{(3)}, \ldots, u^{(N)}$ be $N$ independent paths sampled with backward sampling conditioned on $\zeta$. Then
$\left[\frac{1}{N} \sum_{i=1}^{N} r_{x', u^{(i)}}(\theta', \theta; \tilde{\theta})\right]^{-1}$
is an unbiased estimator of $r(\theta, \theta')$.

32 Numerical example: IAC time. [Figure: IAC times of the algorithms for $\sigma_v$ and $\sigma_w$ with $n = 500$ and $M = 100$, comparing MwPG against the multiple-path sampler with $N = 1, 10, 50$.]

33 Numerical example: convergence vs IAC time. [Figure: ensemble averages and medians of $\sigma_w^{2(k)}$ and $\sigma_w^{4(k)}$ against iterations (convergence to the posterior mean, mean square and median), together with IAC times for $\sigma_w^2$ and $\sigma_w^4$ under single and multiple csmcs, for $N = 1, 10, 100$; numerical annotations lost in transcription.]

34 Discussion. We discussed AIS-based MCMC for state-space models ("Scalable Monte Carlo inference for state-space models", arXiv). We also discussed the use of multiple (or all possible) paths from csmc for AIS-based MCMC in state-space models; if done in parallel, using multiple paths from csmc can be beneficial in terms of convergence time and IAC time (?). The methodology presented for state-space models is a special case of a more general framework: designing MH algorithms with asymmetric acceptance ratios can be useful in other settings, such as general latent variable models, trans-dimensional models, and doubly intractable models ("On the utility of Metropolis-Hastings with asymmetric acceptance ratio", arXiv).

35 Thank you!

36 References:
Andrieu, C., Doucet, A., and Holenstein, R. (2010). Particle Markov chain Monte Carlo methods. Journal of the Royal Statistical Society: Series B (Statistical Methodology), 72.
Andrieu, C. and Roberts, G. O. (2009). The pseudo-marginal approach for efficient Monte Carlo computations. Annals of Statistics, 37(2).
Andrieu, C. and Vihola, M. (2014). Establishing some order amongst exact approximations of MCMCs. arXiv preprint.
Andrieu, C. and Vihola, M. (2015). Convergence properties of pseudo-marginal Markov chain Monte Carlo algorithms. The Annals of Applied Probability, 25(2).
Chopin, N. and Singh, S. (2015). On particle Gibbs sampling. Bernoulli, 21(3).
Crooks, G. (1998). Nonequilibrium measurements of free energy differences for microscopically reversible Markovian systems. Journal of Statistical Physics, 90(5-6).
Deligiannidis, G., Doucet, A., and Pitt, M. K. (2018). The correlated pseudomarginal method. Journal of the Royal Statistical Society: Series B (Statistical Methodology) (forthcoming).
Jarzynski, C. (1997). Equilibrium free-energy differences from nonequilibrium measurements: A master-equation approach. Phys. Rev. E, 56:5018-5035.
Lindsten, F., Douc, R., and Moulines, E. (2015). Uniform ergodicity of the particle Gibbs sampler. Scandinavian Journal of Statistics, 42(3):775-797.
Neal, R. (1996). Sampling from multimodal distributions using tempered transitions. Statistics and Computing, 6(4).
Neal, R. (2001). Annealed importance sampling. Statistics and Computing, 11.
Neal, R. M. (2004). Taking bigger Metropolis steps by dragging fast variables. Technical report, University of Toronto.
Pitt, M. K. and Shephard, N. (1999). Analytic convergence rates and parametrization issues for the Gibbs sampler applied to state space models. Journal of Time Series Analysis, 20(1):63-85.
Whiteley, N. (2010). Discussion of "Particle Markov chain Monte Carlo methods" by Andrieu et al. Journal of the Royal Statistical Society: Series B (Statistical Methodology), 72(3).
