Monte Carlo Approximation of Monte Carlo Filters


1 Monte Carlo Approximation of Monte Carlo Filters
Adam M. Johansen et al.
Collaborators include: Arnaud Doucet, Axel Finke, Anthony Lee, Nick Whiteley
7th January 2014

2 Context & Outline
Filtering in State-Space Models:
- SIR Particle Filters [GSS93]
- Rao-Blackwellized Particle Filters [AD02, CL00]
- Block-Sampling Particle Filters [DBS06]
Exact Approximation of Monte Carlo Algorithms:
- Particle MCMC [ADH10]
- SMC² [CJP13]
Approximated Rao-Blackwellized Particle Filters [CSOL11]
Exactly-approximated RBPFs [JWD12]
Approximating the BSPF: Local SMC [JD14]

3 Particle MCMC
MCMC algorithms which employ SMC proposals [ADH10].
View an SMC algorithm as a collection of random variables: values, weights and ancestral lines.
Construct MCMC algorithms:
- with many auxiliary variables,
- exactly invariant for a distribution on an extended space.
Standard MCMC arguments justify the strategy.
SMC² employs the same approach within an SMC setting.
What else does this allow us to do with SMC?
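The particle marginal Metropolis-Hastings variant of particle MCMC can be sketched compactly: an SMC run supplies an unbiased marginal-likelihood estimate, which is plugged into an otherwise standard MH acceptance ratio on the extended space. The sketch below is a minimal illustration only; run_smc (returning a log marginal-likelihood estimate), log_prior and the random-walk proposal scale are assumed here and are not part of the slide, and this is not the full construction of [ADH10].

    import numpy as np

    def pmmh(y, log_prior, run_smc, theta0, n_iters=5000, rw_scale=0.1, rng=None):
        """Particle marginal Metropolis-Hastings sketch.

        run_smc(theta, y, rng) is assumed to return an unbiased estimate of
        log p(y | theta) obtained from an SMC run; it is not supplied here.
        """
        rng = np.random.default_rng() if rng is None else rng
        theta = np.atleast_1d(np.asarray(theta0, dtype=float))
        log_z = run_smc(theta, y, rng)          # likelihood estimate at the current point
        chain = np.empty((n_iters, theta.size))
        for i in range(n_iters):
            prop = theta + rw_scale * rng.standard_normal(theta.size)
            log_z_prop = run_smc(prop, y, rng)  # fresh SMC estimate at the proposal
            log_alpha = (log_z_prop + log_prior(prop)) - (log_z + log_prior(theta))
            if np.log(rng.uniform()) < log_alpha:
                theta, log_z = prop, log_z_prop  # accept; keep the estimate with the state
            chain[i] = theta
        return chain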

4 Ancestral Trees
[Figure: particle ancestral trees for t = 1, 2, 3, annotated with ancestor indices (e.g. a^1_3 = 1, a^4_3 = 3, a^1_2 = 1, a^4_2 = 3) and ancestral lines (e.g. b^2_{3,1:3} = (1, 1, 2), b^4_{3,1:3} = (3, 3, 4), b^6_{3,1:3} = (4, 5, 6)).]

5 SMC Distributions
We'll need the SMC distribution:
$$\psi^M_{n,L}(a_{n-L+2:n}, x_{n-L+1:n}, k; x_{n-L}) = \left[\prod_{i=1}^M q\big(x^i_{n-L+1}\mid x_{n-L}\big)\right] \prod_{p=n-L+2}^{n}\left[ r(a_p\mid w_{p-1}) \prod_{i=1}^M q\big(x^i_p\mid x^{a^i_p}_{p-1}\big)\right] r(k\mid w_n)$$
and the conditional SMC distribution:
$$\tilde\psi^M_{n,L}\big(a^{-k}_{n-L+2:n}, x^{-k}_{n-L+1:n}\,\big|\, b^k_{n,n-L+1:n-1}, k, x^k_{n-L+1:n}; x_{n-L}\big) = \frac{\psi^M_{n,L}(a_{n-L+2:n}, x_{n-L+1:n}, k; x_{n-L})}{q\big(x^{b^k_{n,n-L+1}}_{n-L+1}\mid x_{n-L}\big)\left[\prod_{p=n-L+2}^{n} r\big(b^k_{n,p}\mid w_{p-1}\big)\, q\big(x^{b^k_{n,p}}_{p}\mid x^{b^k_{n,p-1}}_{p-1}\big)\right] r(k\mid w_n)}$$

6 A (Rather Broad) Class of Hidden Markov Models
[Figure: graphical model with unobserved chain (x_1, z_1) → (x_2, z_2) → (x_3, z_3) and observations y_1, y_2, y_3.]
Unobserved Markov chain {(X_n, Z_n)} with transition density f.
Observed process {Y_n} with conditional density g.
Density:
$$p(x_{1:n}, z_{1:n}, y_{1:n}) = f_1(x_1, z_1)\, g(y_1\mid x_1, z_1) \prod_{i=2}^{n} f(x_i, z_i\mid x_{i-1}, z_{i-1})\, g(y_i\mid x_i, z_i).$$
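As a small concreteness check, the factorisation above can be evaluated directly once the model densities are available. The helper below is a generic sketch in which log_f1, log_f and log_g are assumed model-specific callables, not anything prescribed by the slide.

    def log_joint(x, z, y, log_f1, log_f, log_g):
        """log p(x_{1:n}, z_{1:n}, y_{1:n}) via the factorisation on the slide.

        log_f1(x1, z1), log_f(x, z, x_prev, z_prev) and log_g(y, x, z) are
        assumed to be supplied for the model at hand.
        """
        lp = log_f1(x[0], z[0]) + log_g(y[0], x[0], z[0])
        for i in range(1, len(y)):
            lp += log_f(x[i], z[i], x[i - 1], z[i - 1]) + log_g(y[i], x[i], z[i])
        return lp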

7 Formal Solutions
Filtering and prediction recursions:
$$p(x_n, z_n\mid y_{1:n}) = \frac{p(x_n, z_n\mid y_{1:n-1})\, g(y_n\mid x_n, z_n)}{\int p(x'_n, z'_n\mid y_{1:n-1})\, g(y_n\mid x'_n, z'_n)\, d(x'_n, z'_n)}$$
$$p(x_{n+1}, z_{n+1}\mid y_{1:n}) = \int p(x_n, z_n\mid y_{1:n})\, f(x_{n+1}, z_{n+1}\mid x_n, z_n)\, d(x_n, z_n)$$
Smoothing:
$$p((x, z)_{1:n}\mid y_{1:n}) \propto p((x, z)_{1:n-1}\mid y_{1:n-1})\, f((x, z)_n\mid (x, z)_{n-1})\, g(y_n\mid (x, z)_n)$$

8 A Simple SIR Filter
Algorithmically, at iteration n, given {W^i_{n-1}, (X, Z)^i_{1:n-1}}:
- Resample, obtaining {1/N, (X̄, Z̄)^i_{1:n-1}}.
- For i = 1, ..., N:
  - Sample (X, Z)^i_n ~ q_n(· | (X̄, Z̄)^i_{n-1}).
  - Weight
    $$W^i_n \propto \frac{f\big((X,Z)^i_n \mid (\bar X, \bar Z)^i_{n-1}\big)\, g\big(y_n \mid (X,Z)^i_n\big)}{q_n\big((X,Z)^i_n \mid (\bar X, \bar Z)^i_{n-1}\big)}$$
Actually:
- Resample efficiently.
- Only resample when necessary.
- ...
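A minimal sketch of such an SIR filter, with adaptive (ESS-triggered) multinomial resampling, is given below. The callables sample_init, sample_q, log_f, log_g and log_q are assumed to be supplied as vectorised functions, and sample_init is taken to draw a time-0 state so that every step uses the same transition density; none of these choices are prescribed by the slide.

    import numpy as np

    def sir_filter(y, sample_init, sample_q, log_f, log_g, log_q, N=1000, rng=None):
        """SIR particle filter sketch for the joint state (X, Z).

        All model-specific callables are assumed vectorised over the particle
        dimension; resampling is multinomial and triggered only when the
        effective sample size drops below N/2 (one common choice).
        """
        rng = np.random.default_rng() if rng is None else rng
        xz = sample_init(N, rng)                       # particles, shape (N, d)
        logw = np.full(N, -np.log(N))
        means = []
        for y_t in y:
            w = np.exp(logw - logw.max()); w /= w.sum()
            if 1.0 / np.sum(w ** 2) < N / 2:           # resample only when necessary
                xz = xz[rng.choice(N, size=N, p=w)]
                logw[:] = -np.log(N)
            xz_new = sample_q(xz, y_t, rng)            # (X, Z)_n^i ~ q_n(. | (X, Z)_{n-1}^i)
            logw += log_f(xz_new, xz) + log_g(y_t, xz_new) - log_q(xz_new, xz, y_t)
            xz = xz_new
            w = np.exp(logw - logw.max()); w /= w.sum()
            means.append(w @ xz)                       # filtering mean estimate
        return np.array(means)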

9 A Rao-Blackwellized SIR Filter
Algorithmically, at iteration n, given {W^{X,i}_{n-1}, (X^i_{1:n-1}, p(z_{1:n-1} | X^i_{1:n-1}, y_{1:n-1}))}:
- Resample, obtaining {1/N, (X̄^i_{1:n-1}, p(z_{1:n-1} | X̄^i_{1:n-1}, y_{1:n-1}))}.
- For i = 1, ..., N:
  - Sample X^i_n ~ q_n(· | X̄^i_{n-1}).
  - Set X^i_{1:n} = (X̄^i_{1:n-1}, X^i_n).
  - Weight
    $$W^{X,i}_n \propto \frac{p\big(X^i_n, y_n \mid \bar X^i_{n-1}\big)}{q_n\big(X^i_n \mid \bar X^i_{n-1}\big)}$$
  - Compute p(z_{1:n} | y_{1:n}, X^i_{1:n}).
Requires analytically tractable substructure.
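A generic sketch of one such step appears below. The tractable substructure is abstracted into an assumed callable marginal_step that, given a particle's sufficient statistics for p(z_{1:n-1} | X^i_{1:n-1}, y_{1:n-1}), the previous and new values of X and the observation, returns the updated statistics together with the marginal predictive log-density used in the weight; in the conditionally linear-Gaussian case this would be one Kalman prediction/update. This interface is an assumption made for illustration only.

    import numpy as np

    def rbpf_step(particles, stats, logw, y_t, sample_q, log_q, marginal_step, rng):
        """One Rao-Blackwellised SIR step, mirroring the slide.

        particles: X_{n-1}^i values (1-d array); stats: per-particle sufficient
        statistics of p(z_{1:n-1} | X_{1:n-1}^i, y_{1:n-1}).  marginal_step is an
        assumed interface for the tractable substructure.
        """
        N = len(particles)
        w = np.exp(logw - logw.max()); w /= w.sum()
        idx = rng.choice(N, size=N, p=w)               # multinomial resampling
        particles, stats = particles[idx], [stats[i] for i in idx]
        new_particles = np.empty_like(particles)
        new_stats, new_logw = [], np.empty(N)
        for i in range(N):
            x_new = sample_q(particles[i], y_t, rng)   # X_n^i ~ q_n(. | X_{n-1}^i)
            s_new, log_pred = marginal_step(stats[i], particles[i], x_new, y_t)
            new_particles[i] = x_new
            new_stats.append(s_new)
            new_logw[i] = log_pred - log_q(x_new, particles[i], y_t)
        return new_particles, new_stats, new_logw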

10 An Approximate Rao-Blackwellized SIR Filter
Algorithmically, at iteration n, given {W^{X,i}_{n-1}, (X^i_{1:n-1}, p(z_{1:n-1} | X^i_{1:n-1}, y_{1:n-1}))}:
- Resample, obtaining {1/N, (X̄^i_{1:n-1}, p(z_{1:n-1} | X̄^i_{1:n-1}, y_{1:n-1}))}.
- For i = 1, ..., N:
  - Sample X^i_n ~ q_n(· | X̄^i_{n-1}).
  - Set X^i_{1:n} = (X̄^i_{1:n-1}, X^i_n).
  - Weight
    $$W^{X,i}_n \propto \frac{p\big(X^i_n, y_n \mid \bar X^i_{n-1}\big)}{q_n\big(X^i_n \mid \bar X^i_{n-1}\big)}$$
  - Compute p(z_{1:n} | y_{1:n}, X^i_{1:n}).
Is approximate; how does error accumulate?

11 Exactly Approximated Rao-Blackwellized SIR Filter
At time n = 1:
- Sample X^i_1 ~ q^x(· | y_1).
- Sample Z^{i,j}_1 ~ q^z(· | X^i_1, y_1).
- Compute and normalise the local weights
  $$w^z_1\big(X^i_1, Z^{i,j}_1\big) := \frac{p\big(X^i_1, y_1, Z^{i,j}_1\big)}{q^z\big(Z^{i,j}_1 \mid X^i_1, y_1\big)}, \qquad W^{z,i,j}_1 := \frac{w^z_1\big(X^i_1, Z^{i,j}_1\big)}{\sum_{k=1}^{M} w^z_1\big(X^i_1, Z^{i,k}_1\big)},$$
  and define
  $$\hat p\big(X^i_1, y_1\big) := \frac{1}{M}\sum_{j=1}^{M} w^z_1\big(X^i_1, Z^{i,j}_1\big).$$
- Compute and normalise the top-level weights
  $$w^x_1\big(X^i_1\big) := \frac{\hat p\big(X^i_1, y_1\big)}{q^x\big(X^i_1 \mid y_1\big)}, \qquad W^{x,i}_1 := \frac{w^x_1\big(X^i_1\big)}{\sum_{k=1}^{N} w^x_1\big(X^k_1\big)}.$$
At times n ≥ 2, resample and do essentially the same again...
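The time-1 step translates almost line-by-line into code. The sketch below assumes vectorised user-supplied proposals and densities (sample_qx, sample_qz, log_qx, log_qz and log p(x_1, y_1, z_1)) and scalar x and z for brevity; it illustrates the weighting scheme rather than a complete filter.

    import numpy as np
    from scipy.special import logsumexp

    def ea_rbpf_init(y1, N, M, sample_qx, sample_qz, log_p_joint, log_qx, log_qz, rng):
        """Time-1 step of the exactly-approximated RBPF, following the slide."""
        X = sample_qx(N, y1, rng)                            # top-level particles X_1^i
        Z = sample_qz(N, M, X, y1, rng)                      # lower level Z_1^{i,j}, shape (N, M)
        # Local log-weights log w^z_1(X_1^i, Z_1^{i,j}).
        logwz = log_p_joint(X[:, None], y1, Z) - log_qz(Z, X[:, None], y1)
        Wz = np.exp(logwz - logwz.max(axis=1, keepdims=True))
        Wz /= Wz.sum(axis=1, keepdims=True)                  # normalised local weights
        # log p_hat(X_1^i, y_1) = log of the average of the local weights.
        log_p_hat = logsumexp(logwz, axis=1) - np.log(M)
        logwx = log_p_hat - log_qx(X, y1)                    # top-level log-weights
        Wx = np.exp(logwx - logwx.max()); Wx /= Wx.sum()
        return X, Z, Wz, Wx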

12 Toy Example: Model
We use a simulated sequence of 100 observations from the linear-Gaussian model defined by the densities
$$\mu(x_1, z_1) = \mathcal{N}\!\left(\begin{pmatrix} x_1 \\ z_1 \end{pmatrix}; m_1, \Sigma_1\right)$$
$$f(x_n, z_n \mid x_{n-1}, z_{n-1}) = \mathcal{N}\!\left(\begin{pmatrix} x_n \\ z_n \end{pmatrix}; \begin{pmatrix} x_{n-1} \\ z_{n-1} \end{pmatrix}, \Sigma\right)$$
$$g(y_n \mid x_n, z_n) = \mathcal{N}\!\left(y_n; \begin{pmatrix} x_n \\ z_n \end{pmatrix}, \begin{bmatrix} \sigma^2_x & 0 \\ 0 & \sigma^2_z \end{bmatrix}\right)$$
Consider the IMSE (relative to the optimal filter) of the filtering estimate of the first-coordinate marginals.
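For concreteness, data from this model can be simulated as below. The initial mean and the two covariance matrices are passed in as parameters because their numerical values are not reproduced here; any specific values used in a call (for instance zero mean and identity matrices) are assumptions of the caller, not of the slides.

    import numpy as np

    def simulate_toy(T, m1, Sigma1, Sigma, sigma2_x, sigma2_z, rng=None):
        """Simulate T observations from the toy linear-Gaussian model above."""
        rng = np.random.default_rng() if rng is None else rng
        state = rng.multivariate_normal(m1, Sigma1)              # (x_1, z_1) ~ mu
        obs_cov = np.diag([sigma2_x, sigma2_z])
        states, obs = [], []
        for _ in range(T):
            states.append(state)
            obs.append(rng.multivariate_normal(state, obs_cov))  # y_n | (x_n, z_n) ~ g
            state = rng.multivariate_normal(state, Sigma)        # random-walk transition f
        return np.array(states), np.array(obs)

For example, simulate_toy(100, np.zeros(2), np.eye(2), np.eye(2), 1.0, 1.0) would correspond to the σ²_x = σ²_z = 1 setting of the following slides if the unstated initial and transition covariances were taken to be identity matrices (an assumption).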

13 Approximation of the RBPF
[Figure: mean squared filtering error against the number of lower-level particles, M, for several values of N (N = 10, 20, 40, ...); σ²_x = σ²_z = 1.]

14 Computational Performance
[Figure: mean squared filtering error against computational cost, N(M+1), for N = 10, 20, 40, 80, ...; σ²_x = σ²_z = 1.]

15 Computational Performance
[Figure: mean squared filtering error against computational cost, N(M+1), for N = 10, 20, 40, 80, ...; σ²_x = 10² and σ²_z = …]

16 What About Other HMMs / Algorithms?
Returning to:
[Figure: graphical model with unobserved chain x_1 → x_2 → ... → x_6 and observations y_1, ..., y_6.]
Unobserved Markov chain {X_n} with transition density f.
Observed process {Y_n} with conditional density g.
Density:
$$p(x_{1:n}, y_{1:n}) = f_1(x_1)\, g(y_1\mid x_1) \prod_{i=2}^{n} f(x_i\mid x_{i-1})\, g(y_i\mid x_i).$$

17 Block Sampling: An Idealised Approach
At time n, given x_{1:n-1}; discard x_{n-L+1:n-1}:
- Sample from q(x_{n-L+1:n} | x_{n-L}, y_{n-L+1:n}).
- Weight with
  $$W(x_{1:n}) = \frac{p(x_{1:n}\mid y_{1:n})}{p(x_{1:n-L}\mid y_{1:n-1})\, q(x_{n-L+1:n}\mid x_{n-L}, y_{n-L+1:n})}$$
Optimally,
$$q(x_{n-L+1:n}\mid x_{n-L}, y_{n-L+1:n}) = p(x_{n-L+1:n}\mid x_{n-L}, y_{n-L+1:n})$$
$$W(x_{1:n}) \propto \frac{p(x_{1:n-L}\mid y_{1:n})}{p(x_{1:n-L}\mid y_{1:n-1})} \propto p(y_n\mid x_{1:n-L}, y_{n-L+1:n-1})$$
Typically intractable; auxiliary variable approach in [DBS06].
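A generic sketch of a single idealised block-sampling update is given below; the block proposal and the (typically unavailable) partial posterior densities are represented by assumed callables, which is exactly where the intractability noted on the slide enters.

    import numpy as np

    def block_sample_step(traj, y, n, L, sample_block, log_q_block, log_p_partial):
        """One idealised block-sampling update for a single trajectory.

        traj holds x_1, ..., x_{n-1} at indices 0, ..., n-2 and y holds
        y_1, ..., y_n.  sample_block, log_q_block and log_p_partial (the latter
        returning log p(x_{1:m} | y_{1:k}) up to constants common across
        particles) are assumed callables, not given by the slide.
        """
        anchor = traj[n - L - 1]                       # x_{n-L}
        y_block = y[n - L:n]                           # y_{n-L+1:n}
        new_block = sample_block(anchor, y_block)      # x_{n-L+1:n} ~ q(. | x_{n-L}, y_{n-L+1:n})
        new_traj = np.concatenate([traj[:n - L], new_block])
        logw = (log_p_partial(new_traj, y[:n])         # log p(x_{1:n} | y_{1:n})
                - log_p_partial(traj[:n - L], y[:n - 1])
                - log_q_block(new_block, anchor, y_block))
        return new_traj, logw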

18 Local Particle Filtering: Current Trajectories

19 Local Particle Filtering: First Particle

20 Local Particle Filtering: SMC Proposal

21 Local Particle Filtering: CSMC Auxiliary Proposal

22 Local SMC
Propose from:
$$U^{1:M}_{n-1}(b_{1:n-2}, k)\; p(x_{1:n-1}\mid y_{1:n-1})\; \psi^M_{n,L}\big(a_{n-L+2:n}, x_{n-L+1:n}, k; x_{n-L}\big)\; \tilde\psi^M_{n-1,L-1}\big(\tilde a^{-k}_{n-L+2:n-1}, x^{-k}_{n-L+1:n-1} \,\big|\, b_{n-L+2:n-1}, x_{n-L+1:n-1}; x_{n-L}\big)$$
Target:
$$U^{1:M}_{n}\big(b_{1:n-L}, b^k_{n,n-L+1:n-1}, k\big)\; p\big(x_{1:n-L}, x^{b^k_{n}}_{n-L+1:n}\mid y_{1:n}\big)\; \tilde\psi^M_{n,L}\big(a^{-k}_{n-L+2:n}, x^{-k}_{n-L+1:n} \,\big|\, b^k_{n,n-L+1:n}, x^{b^k_{n}}_{n-L+1:n}; x_{n-L}\big)\; \psi^M_{n-1,L-1}\big(\tilde a_{n-L+2:n-1}, x_{n-L+1:n-1}, k; x_{n-L}\big).$$
Weight: $\hat Z_{n-L+1:n} / \hat Z_{n-L+1:n-1}$.
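Sampling from a conditional SMC distribution such as ψ̃ amounts to running an SMC sweep in which one prespecified path and its ancestral line are retained. The sketch below shows a minimal bootstrap-flavoured conditional sweep over a block with scalar states, under assumed (vectorised) model callables sample_f and log_g; it illustrates this single ingredient, not the full local SMC update of [JD14].

    import numpy as np

    def conditional_smc_block(x_anchor, y_block, kept_path, M, sample_f, log_g, rng):
        """Bootstrap-style conditional SMC sweep over a block, retaining kept_path.

        sample_f(x_prev_array, rng) draws from the transition and log_g(y, x)
        evaluates the observation log-density; both are assumed and vectorised.
        Particle 0 is forced to follow kept_path; the rest are sampled as usual.
        """
        L = len(y_block)
        X = np.empty((L, M))
        A = np.zeros((L, M), dtype=int)                 # ancestor indices
        X[0, 0] = kept_path[0]
        X[0, 1:] = sample_f(np.full(M - 1, x_anchor), rng)
        logw = log_g(y_block[0], X[0])
        for t in range(1, L):
            w = np.exp(logw - logw.max()); w /= w.sum()
            A[t, 0] = 0                                 # the retained path keeps its ancestor
            A[t, 1:] = rng.choice(M, size=M - 1, p=w)   # multinomial resampling for the rest
            X[t, 0] = kept_path[t]
            X[t, 1:] = sample_f(X[t - 1, A[t, 1:]], rng)
            logw = log_g(y_block[t], X[t])              # bootstrap incremental weights
        return X, A, logw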

23 SV Bootstrap Local SMC
Model (stochastic volatility):
$$f(x_i\mid x_{i-1}) = \mathcal{N}\big(\varphi x_{i-1}, \sigma^2\big), \qquad g(y_i\mid x_i) = \mathcal{N}\big(0, \beta^2\exp(x_i)\big)$$
Top level: Local SMC proposal; stratified resampling when ESS < N/2.
Local SMC proposal: bootstrap, q(x_t | x_{t-1}, y_t) = f(x_t | x_{t-1}), so the weighting is
$$W(x_{t-1}, x_t) \propto \frac{f(x_t\mid x_{t-1})\, g(y_t\mid x_t)}{f(x_t\mid x_{t-1})} = g(y_t\mid x_t);$$
resample residually every iteration.
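For reference, the model can be simulated and the bootstrap incremental weight evaluated as follows; starting the latent chain from its stationary distribution is an assumption made here, not something stated on the slide.

    import numpy as np

    def simulate_sv(T, phi, sigma, beta, rng=None):
        """Simulate the stochastic volatility model on the slide."""
        rng = np.random.default_rng() if rng is None else rng
        x = np.empty(T); y = np.empty(T)
        x_prev = rng.normal(0.0, sigma / np.sqrt(1.0 - phi ** 2))    # stationary start (assumed)
        for t in range(T):
            x[t] = phi * x_prev + sigma * rng.standard_normal()      # f(x_t | x_{t-1})
            y[t] = beta * np.exp(x[t] / 2.0) * rng.standard_normal() # g(y_t | x_t)
            x_prev = x[t]
        return x, y

    def log_g(y_t, x, beta):
        """Bootstrap incremental log-weight: log N(y_t; 0, beta^2 exp(x))."""
        var = beta ** 2 * np.exp(x)
        return -0.5 * (np.log(2.0 * np.pi * var) + y_t ** 2 / var)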

24 SV Exchange Rate Data
[Figure: observed exchange rate data, y_n against n.]

25 SV Bootstrap Local SMC: M = 100
[Figure: average number of unique values against n, for N = 100, M = 100 and L = 2, 4, 6, ...]

26 SV Bootstrap Local SMC: M = 1000
[Figure: average number of unique values against n, for N = 100, M = 1000 and L = 2, 4, 6, 10, 14, ...]

27 SV Bootstrap Local SMC: M = 10,000
[Figure: average number of unique values against n, for N = 100, M = 10,000 and L = 2, 4, 6, 10, 14, 18, 22, ...]

28 Some Heuristics
Recent calculations suggest that, under appropriate assumptions, at fixed cost (2L − 1)MN:
- the optimal L is determined solely by the mixing of the HMM;
- the optimal M is a linear function of L;
- N can then be obtained from M, L and the available budget.
In practice: L can be chosen using pilot runs, and M fine-tuned once L is chosen; a sketch of the final budget step follows.
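The last step is pure arithmetic; under the (2L − 1)MN cost model quoted above it is just:

    def choose_n(budget, M, L):
        """Largest number of top-level particles N affordable at cost (2L - 1) * M * N."""
        return max(1, budget // ((2 * L - 1) * M))

    # e.g. choose_n(10**6, M=100, L=4) -> 1428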

29 In Conclusion
SMC can be used hierarchically; software implementation is not difficult [Joh09, Zho13].
The Rao-Blackwellized particle filter can be approximated exactly:
- can reduce estimator variance at fixed cost;
- attractive for distributed/parallel implementation;
- allows combination of different sorts of particle filter;
- can be combined with other techniques for parameter estimation etc.
The optimal block-sampling particle filter can be approximated exactly:
- requiring only simulation from the transition and evaluation of the likelihood;
- easy to parallelise;
- low storage cost.

30 References I
[AD02] C. Andrieu and A. Doucet. Particle filtering for partially observed Gaussian state space models. Journal of the Royal Statistical Society B, 64(4), 2002.
[ADH10] C. Andrieu, A. Doucet, and R. Holenstein. Particle Markov chain Monte Carlo. Journal of the Royal Statistical Society B, 72(3), 2010.
[CJP13] N. Chopin, P. Jacob, and O. Papaspiliopoulos. SMC². Journal of the Royal Statistical Society B, 75(3), 2013.
[CL00] R. Chen and J. S. Liu. Mixture Kalman filters. Journal of the Royal Statistical Society B, 62(3), 2000.
[CSOL11] T. Chen, T. Schön, H. Ohlsson, and L. Ljung. Decentralized particle filter with arbitrary state decomposition. IEEE Transactions on Signal Processing, 59(2), February 2011.
[DBS06] A. Doucet, M. Briers, and S. Sénécal. Efficient block sampling strategies for sequential Monte Carlo methods. Journal of Computational and Graphical Statistics, 15(3), 2006.
[GSS93] N. J. Gordon, D. J. Salmond, and A. F. M. Smith. Novel approach to nonlinear/non-Gaussian Bayesian state estimation. IEE Proceedings-F, 140(2), April 1993.
[JD14] A. M. Johansen and A. Doucet. Hierarchical particle sampling for intractable state-space models. CRiSM working paper, University of Warwick, in preparation.
[Joh09] A. M. Johansen. SMCTC: Sequential Monte Carlo in C++. Journal of Statistical Software, 30(6):1-41, April 2009.
[JWD12] A. M. Johansen, N. Whiteley, and A. Doucet. Exact approximation of Rao-Blackwellised particle filters. In Proceedings of the 16th IFAC Symposium on System Identification, Brussels, Belgium, July 2012. IFAC.
[Zho13] Y. Zhou. vSMC: Parallel sequential Monte Carlo in C++. Mathematics e-print, ArXiv, 2013.

31 Key Identity
$$\frac{p\big(x^{b^k_n}_{n-L+1:n}\mid x_{n-L}, y_{n-L+1:n}\big)\,\tilde\psi^M_{n,L}\big(a^{-k}_{n-L+2:n}, x^{-k}_{n-L+1:n}\mid b^k_{n,n-L+1:n-1}, k, x^k_{n-L+1:n}; x_{n-L}\big)}{\psi^M_{n,L}\big(a_{n-L+2:n}, x_{n-L+1:n}, k; x_{n-L}\big)}$$
$$= \frac{p\big(x^{b^k_n}_{n-L+1:n}\mid x_{n-L}, y_{n-L+1:n}\big)}{q\big(x^{b^k_{n,n-L+1}}_{n-L+1}\mid x_{n-L}\big)\left[\prod_{p=n-L+2}^{n} r\big(b^k_{n,p}\mid w_{p-1}\big)\, q\big(x^{b^k_{n,p}}_{p}\mid x^{b^k_{n,p-1}}_{p-1}\big)\right] r(k\mid w_n)} = \frac{\hat Z_{n-L+1:n}}{p(y_{n-L+1:n}\mid x_{n-L})}$$
