Auxiliary Particle Methods

Auxiliary Particle Methods: Perspectives & Applications
Adam M. Johansen (adam.johansen@bristol.ac.uk)
Oxford-Man Institute, 29th May
Collaborators include: Arnaud Doucet, Nick Whiteley

Introduction

Outline

- Background: Particle Filters
  - Hidden Markov Models & Filtering
  - Particle Filters & Sequential Importance Resampling
  - Auxiliary Particle Filters
- Interpretation
  - Auxiliary Particle Filters are SIR Algorithms
  - Theoretical Considerations
  - Practical Implications
- Applications
  - Auxiliary SMC Samplers
  - The Probability Hypothesis Density Filter
  - On Stratification
- Conclusions

HMMs & Particle Filters: Hidden Markov Models

[Graphical model: the hidden chain x_1 → x_2 → ... → x_6 with observations y_1, ..., y_6.]

$X_n$ is an $\mathcal{X}$-valued Markov chain with transition density $f$:
$$X_n \mid \{X_{n-1} = x_{n-1}\} \sim f(\cdot \mid x_{n-1}).$$
$Y_n$ is a $\mathcal{Y}$-valued stochastic process:
$$Y_n \mid \{X_n = x_n\} \sim g(\cdot \mid x_n).$$

HMMs & Particle Filters: Optimal Filtering

The filtering distribution may be expressed recursively as:
$$p(x_n \mid y_{1:n-1}) = \int p(x_{n-1} \mid y_{1:n-1})\, f(x_n \mid x_{n-1})\, dx_{n-1} \qquad \text{(Prediction)}$$
$$p(x_n \mid y_{1:n}) = \frac{p(x_n \mid y_{1:n-1})\, g(y_n \mid x_n)}{\int p(x_n \mid y_{1:n-1})\, g(y_n \mid x_n)\, dx_n} \qquad \text{(Update)}$$
The smoothing distribution may be expressed recursively as:
$$p(x_{1:n} \mid y_{1:n-1}) = p(x_{1:n-1} \mid y_{1:n-1})\, f(x_n \mid x_{n-1}) \qquad \text{(Prediction)}$$
$$p(x_{1:n} \mid y_{1:n}) = \frac{p(x_{1:n} \mid y_{1:n-1})\, g(y_n \mid x_n)}{\int p(x_{1:n} \mid y_{1:n-1})\, g(y_n \mid x_n)\, dx_{1:n}} \qquad \text{(Update)}$$
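The slides contain no code; as a purely illustrative aside, when the state space is finite the integrals above become sums and the recursion can be run exactly. A minimal Python sketch with an assumed two-state model (all numerical values are arbitrary):

```python
import numpy as np

# Two-state HMM: f[x_prev, x] = f(x | x_prev), g[x, y] = g(y | x).
f = np.array([[0.9, 0.1],
              [0.1, 0.9]])
g = np.array([[0.8, 0.2],
              [0.2, 0.8]])

def filter_step(p_prev, y):
    """One prediction/update step of the filtering recursion."""
    pred = p_prev @ f          # prediction: sum over x_{n-1}
    upd = pred * g[:, y]       # update: multiply by likelihood g(y_n | x_n)
    return upd / upd.sum()     # normalise to obtain p(x_n | y_{1:n})

ys = [0, 0, 1, 1]
p = np.array([0.5, 0.5]) * g[:, ys[0]]   # time 1: prior times likelihood
p /= p.sum()
for y in ys[1:]:
    p = filter_step(p, y)
print(p)                       # p(x_n | y_{1:n}) after all observations
```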

HMMs & Particle Filters: Particle Filters via Sequential Importance Resampling

At time $n = 1$:
- Sample $X^i_{1,1} \sim q_1(\cdot)$.
- Weight $W^i_1 \propto f(X^i_{1,1})\, g(y_1 \mid X^i_{1,1}) / q_1(X^i_{1,1})$.
- Resample.

At times $n > 1$, iterate:
- Sample $X^i_{n,n} \sim q_n(\cdot \mid X^i_{n-1,n-1})$. Set $X^i_{n,1:n-1} = X^i_{n-1}$.
- Weight
$$W^i_n \propto \frac{f(X^i_{n,n} \mid X^i_{n,n-1})\, g(y_n \mid X^i_{n,n})}{q_n(X^i_{n,n} \mid X^i_{n,1:n-1})}.$$
- Resample.
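A hedged Python sketch of this recursion, using the bootstrap choice $q_n = f$ (for which the incremental weight reduces to $g(y_n \mid x_n)$) on an assumed linear-Gaussian toy model; the model and all parameter values are illustrative, not from the slides:

```python
import numpy as np

rng = np.random.default_rng(0)
N, T, a, sigma_x, sigma_y = 1000, 50, 0.9, 1.0, 1.0

# Illustrative model: X_n = a X_{n-1} + N(0, sigma_x^2), Y_n = X_n + N(0, sigma_y^2).
x_true, ys = np.zeros(T), np.zeros(T)
for n in range(1, T):
    x_true[n] = a * x_true[n - 1] + sigma_x * rng.standard_normal()
    ys[n] = x_true[n] + sigma_y * rng.standard_normal()

x = rng.standard_normal(N)                            # time 1: sample from q_1
for n in range(1, T):
    x = a * x + sigma_x * rng.standard_normal(N)      # sample X_n ~ q_n = f
    logw = -0.5 * ((ys[n] - x) / sigma_y) ** 2        # weight: g(y_n | X_n)
    w = np.exp(logw - logw.max()); w /= w.sum()
    est = np.sum(w * x)                               # estimate of E[X_n | y_{1:n}]
    x = x[rng.choice(N, size=N, p=w)]                 # multinomial resampling
print(est, x_true[-1])
```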

Auxiliary Particle Filters: Auxiliary [Variable] Particle Filters (Pitt & Shephard, 1999)

If we have access to the next observation before resampling, we could use this structure:
- Pre-weight every particle with $\lambda^{(i)}_n \propto \hat p(y_n \mid X^{(i)}_{n-1})$.
- Propose new states from the mixture distribution $\sum_{i=1}^N \lambda^{(i)}_n q(\cdot \mid X^{(i)}_{n-1}) / \sum_{i=1}^N \lambda^{(i)}_n$.
- Weight the samples, correcting for the pre-weighting:
$$W^i_n \propto \frac{f(X^i_{n,n} \mid X^i_{n,n-1})\, g(y_n \mid X^i_{n,n})}{\lambda^i_n\, q_n(X^i_{n,n} \mid X^i_{n,1:n-1})}.$$
- Resample the particle set.

Auxiliary Particle Filters: Some Well-Known Refinements

We can tidy things up a bit:
1. The auxiliary-variable step is equivalent to multinomial resampling.
2. So, there's no need to resample before the pre-weighting.

Now we have:
- Pre-weight every particle with $\lambda^{(i)}_n \propto \hat p(y_n \mid X^{(i)}_{n-1})$.
- Resample.
- Propose new states.
- Weight the samples, correcting for the pre-weighting (see the sketch below):
$$W^i_n \propto \frac{f(X^i_{n,n} \mid X^i_{n,n-1})\, g(y_n \mid X^i_{n,n})}{\lambda^i_n\, q_n(X^i_{n,n} \mid X^i_{n,1:n-1})}.$$
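A sketch of this refined ordering on the same assumed linear-Gaussian toy model, in which the exact predictive $p(y_n \mid x_{n-1})$ is available to play the role of $\hat p(y_n \mid x_{n-1})$; in general one substitutes an approximation. Illustrative only:

```python
import numpy as np

rng = np.random.default_rng(1)
N, T, a, sigma_x, sigma_y = 1000, 50, 0.9, 1.0, 1.0

x_true, ys = np.zeros(T), np.zeros(T)
for n in range(1, T):
    x_true[n] = a * x_true[n - 1] + sigma_x * rng.standard_normal()
    ys[n] = x_true[n] + sigma_y * rng.standard_normal()

def log_g(y, x):         # log g(y | x)
    return -0.5 * ((y - x) / sigma_y) ** 2

def log_lam(y, x_prev):  # log p_hat(y_n | x_{n-1}); exact in this toy model
    return -0.5 * (y - a * x_prev) ** 2 / (sigma_x**2 + sigma_y**2)

x = rng.standard_normal(N)
logw = np.zeros(N)                      # weights carried between iterations
for n in range(1, T):
    loglam = log_lam(ys[n], x)
    v = np.exp(logw + loglam - (logw + loglam).max())   # 1. pre-weight
    v /= v.sum()
    idx = rng.choice(N, size=N, p=v)                    # 2. resample
    x, loglam = x[idx], loglam[idx]
    x = a * x + sigma_x * rng.standard_normal(N)        # 3. propose (q_n = f)
    logw = log_g(ys[n], x) - loglam                     # 4. weight: g / lambda
w = np.exp(logw - logw.max()); w /= w.sum()
print(np.sum(w * x), x_true[-1])        # weighted estimate of E[X_T | y_{1:T}]
```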

Interpretation

APFs without Auxiliary Variables: General SIR Algorithms

The SIR algorithm can be used somewhat more generally. Given a sequence $\{\pi_n\}$ defined on $E_n = \mathcal{X}^n$:
- Sample $X^i_{n,n} \sim q_n(\cdot \mid X^i_{n-1})$. Set $X^i_{n,1:n-1} = X^i_{n-1}$.
- Weight
$$W^i_n \propto \frac{\pi_n(X^i_n)}{\pi_{n-1}(X^i_{n,1:n-1})\, q_n(X^i_{n,n} \mid X^i_{n,1:n-1})}.$$
- Resample.

APFs without Auxiliary Variables: An Interpretation of the APF

If we move the first step at time $n+1$ so that it becomes the last step at time $n$, we get:
- Resample.
- Propose new states.
- Weight the samples, correcting the earlier pre-weighting.
- Pre-weight every particle with $\lambda^{(i)}_{n+1} \propto \hat p(y_{n+1} \mid X^{(i)}_n)$.

This is an SIR algorithm targeting the sequence of distributions
$$\eta_n(x_{1:n}) \propto p(x_{1:n} \mid y_{1:n})\, \hat p(y_{n+1} \mid x_n),$$
which allows estimation under the distribution of actual interest via importance sampling (see the fragment below).
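Concretely, estimates under the filtering distribution are recovered by dividing the pre-weight back out. A small illustrative fragment (the function and argument names are mine, not from the slides):

```python
import numpy as np

def estimate_under_filter(phi, x, w_eta, log_p_hat_next):
    """IS estimate of E[phi(X_n) | y_{1:n}] from particles targeting eta_n.

    w_eta: normalised weights targeting
           eta_n propto p(x_{1:n} | y_{1:n}) p_hat(y_{n+1} | x_n);
    log_p_hat_next: log p_hat(y_{n+1} | x_n) evaluated at each particle.
    """
    logu = np.log(w_eta) - log_p_hat_next   # divide out the pre-weight
    u = np.exp(logu - logu.max()); u /= u.sum()
    return np.sum(u * phi(x))
```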

Theoretical Considerations

Direct analysis of the APF is largely unnecessary. Results can be obtained by considering the associated SIR algorithm. SIR has a (discrete-time) Feynman-Kac interpretation.

Theoretical Considerations: For Example...

Proposition. Under standard regularity conditions,
$$\sqrt{N}\left(\hat\varphi^N_{n,\mathrm{APF}} - \bar\varphi_n\right) \Rightarrow \mathcal{N}\!\left(0, \sigma^2_n(\varphi_n)\right),$$
where
$$\sigma^2_1(\varphi_1) = \int \frac{p(x_1 \mid y_1)^2}{q_1(x_1)} \left(\varphi_1(x_1) - \bar\varphi_1\right)^2 dx_1$$
and, for $n > 1$,
$$\begin{aligned}
\sigma^2_n(\varphi_n) ={}& \int \frac{p(x_1 \mid y_{1:n})^2}{q_1(x_1)} \left(\int \varphi_n(x_{1:n})\, p(x_{2:n} \mid y_{2:n}, x_1)\, dx_{2:n} - \bar\varphi_n\right)^2 dx_1 \\
&+ \sum_{k=2}^{n-1} \int \frac{p(x_{1:k} \mid y_{1:n})^2}{\hat p(x_{1:k-1} \mid y_{1:k})\, q_k(x_k \mid x_{k-1})} \left(\int \varphi_n(x_{1:n})\, p(x_{k+1:n} \mid y_{k+1:n}, x_k)\, dx_{k+1:n} - \bar\varphi_n\right)^2 dx_{1:k} \\
&+ \int \frac{p(x_{1:n} \mid y_{1:n})^2}{\hat p(x_{1:n-1} \mid y_{1:n})\, q_n(x_n \mid x_{n-1})} \left(\varphi_n(x_{1:n}) - \bar\varphi_n\right)^2 dx_{1:n}.
\end{aligned}$$

Practical Implications

- It means we're doing importance sampling.
- Choosing $\hat p(y_n \mid x_{n-1}) = p(y_n \mid x_n = \mathbb{E}[X_n \mid x_{n-1}])$ is dangerous.
- A safer choice would be to ensure that
$$\sup_{x_{n-1}, x_n} \frac{g(y_n \mid x_n)\, f(x_n \mid x_{n-1})}{\hat p(y_n \mid x_{n-1})\, q(x_n \mid x_{n-1})} < \infty.$$
- Using the APF doesn't ensure superior performance.
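One crude numerical probe of this condition (my construction, not from the slides): evaluate the ratio on a grid for competing choices of $\hat p$ and compare how quickly each grows in the tails.

```python
import numpy as np
from scipy.stats import norm

a, sigma_x, sigma_y, y = 0.9, 1.0, 1.0, 2.0
grid = np.linspace(-5, 5, 201)
xp, x = np.meshgrid(grid, grid)            # (x_{n-1}, x_n) pairs

# With q = f the ratio g f / (p_hat q) reduces to g / p_hat.
g = norm.pdf(y, x, sigma_y)
phat_point = norm.pdf(y, a * xp, sigma_y)                           # "dangerous" choice
phat_exact = norm.pdf(y, a * xp, np.sqrt(sigma_x**2 + sigma_y**2))  # heavier-tailed choice

print((g / phat_point).max())   # grows much faster in the tails...
print((g / phat_exact).max())   # ...than with the heavier-tailed p_hat
```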

Practical Implications: A Contrived Illustration

Consider the following binary state-space model with common state and observation spaces:
$$\mathcal{X} = \{0, 1\}, \quad p(X_1 = 0) = 0.5, \quad p(X_n = x_{n-1} \mid X_{n-1} = x_{n-1}) = 1 - \delta,$$
$$\mathcal{Y} = \mathcal{X}, \quad p(Y_n = x_n \mid X_n = x_n) = 1 - \varepsilon.$$
- $\delta$ controls the ergodicity of the state process.
- $\varepsilon$ controls the information contained in the observations.

Consider estimating $\mathbb{E}(X_2 \mid Y_{1:2} = (0, 1))$.
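The model is small enough for this expectation to be computed exactly by enumeration. A sketch with arbitrary illustrative values $\delta = 0.1$, $\varepsilon = 0.25$:

```python
from itertools import product

delta, eps = 0.1, 0.25          # arbitrary illustrative values
y = (0, 1)

def p_joint(x1, x2):
    """Joint probability p(X_1 = x1, X_2 = x2, Y_{1:2} = y)."""
    p_x1 = 0.5
    p_trans = (1 - delta) if x2 == x1 else delta
    p_obs1 = (1 - eps) if y[0] == x1 else eps
    p_obs2 = (1 - eps) if y[1] == x2 else eps
    return p_x1 * p_trans * p_obs1 * p_obs2

states = list(product((0, 1), repeat=2))
num = sum(x2 * p_joint(x1, x2) for x1, x2 in states)
den = sum(p_joint(x1, x2) for x1, x2 in states)
print(num / den)                # E(X_2 | Y_{1:2} = (0, 1))
```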

Practical Implications

[Figure: variance of SIR minus variance of APF, plotted as a function of ε and δ.]

Practical Implications: Variance Comparison at ε = 0.25

[Figure: empirical and asymptotic variances of SISR and the APF as functions of δ.]

Applications & Extensions

Auxiliary SMC Samplers: SMC Samplers (Del Moral, Doucet & Jasra, 2006)

SIR techniques can be adapted to essentially any sequence of distributions $\{\pi_n\}$. Define
$$\tilde\pi_n(x_{1:n}) = \pi_n(x_n) \prod_{k=1}^{n-1} L_k(x_{k+1}, x_k).$$
At time $n$:
- Sample $X^i_n \sim K_n(X^i_{n-1}, \cdot)$.
- Weight
$$W^i_n \propto \frac{\pi_n(X^i_n)\, L_{n-1}(X^i_n, X^i_{n-1})}{\pi_{n-1}(X^i_{n-1})\, K_n(X^i_{n-1}, X^i_n)}.$$
- Resample.

Auxiliary SMC Samplers

An auxiliary SMC sampler for a sequence of distributions $\{\pi_n\}$ comprises:
- an SMC sampler targeting some auxiliary sequence of distributions $\{\mu_n\}$, and
- a sequence of importance weight functions $w_n(x_n) \propto \frac{d\pi_n}{d\mu_n}(x_n)$.

Generic approaches:
- Choose $\mu_n(x_n) \propto \pi_n(x_n)\, V_n(x_n)$.
- Incorporate as much information as possible prior to resampling.

Auxiliary SMC Samplers: Resample-Move Approaches

A common approach in SMC samplers:
- Choose $K_n(x_{n-1}, x_n)$ to be $\pi_n$-invariant.
- Set
$$L_{n-1}(x_n, x_{n-1}) = \frac{\pi_n(x_{n-1})\, K_n(x_{n-1}, x_n)}{\pi_n(x_n)},$$
leading to
$$W_n(x_{n-1}, x_n) = \frac{\pi_n(x_{n-1})}{\pi_{n-1}(x_{n-1})}.$$

It would make more sense to resample and then move... (see the sketch below).
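A sketch of the resample-move scheme just described, on an assumed tempered sequence $\pi_n(x) \propto \pi_0(x)^{1-\beta_n} \pi(x)^{\beta_n}$ (a toy construction, not from the slides): weight by $\pi_n/\pi_{n-1}$, resample, then move with a $\pi_n$-invariant random-walk Metropolis kernel.

```python
import numpy as np

rng = np.random.default_rng(2)
N = 1000
betas = np.linspace(0.0, 1.0, 21)

def log_target(x, beta):
    """Tempered path from N(0, 4^2) to a (toy) bimodal target."""
    log_pi0 = -0.5 * (x / 4.0) ** 2
    log_pi1 = np.logaddexp(-0.5 * (x - 3.0) ** 2, -0.5 * (x + 3.0) ** 2)
    return (1.0 - beta) * log_pi0 + beta * log_pi1

x = 4.0 * rng.standard_normal(N)                      # exact draws from pi_0
for b_prev, b in zip(betas[:-1], betas[1:]):
    logw = log_target(x, b) - log_target(x, b_prev)   # W = pi_n / pi_{n-1}
    w = np.exp(logw - logw.max()); w /= w.sum()
    x = x[rng.choice(N, size=N, p=w)]                 # resample, then move:
    prop = x + rng.standard_normal(N)                 # pi_n-invariant RW Metropolis
    accept = np.log(rng.random(N)) < log_target(prop, b) - log_target(x, b)
    x = np.where(accept, prop, x)
print(x.mean(), x.std())    # target is symmetric: mean near 0
```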

Auxiliary SMC Samplers: An Interpretation of Pure Resample-Move

It's SIR with auxiliary distributions:
$$\mu_n(x_n) = \pi_{n+1}(x_n),$$
$$L_{n-1}(x_n, x_{n-1}) = \frac{\mu_{n-1}(x_{n-1})\, K_n(x_{n-1}, x_n)}{\mu_{n-1}(x_n)} = \frac{\pi_n(x_{n-1})\, K_n(x_{n-1}, x_n)}{\pi_n(x_n)},$$
$$\tilde w_n(x_{n-1:n}) = \frac{\mu_n(x_n)}{\mu_{n-1}(x_n)} = \frac{\pi_{n+1}(x_n)}{\pi_n(x_n)},$$
$$w_n(x_n) = \frac{\mu_{n-1}(x_n)}{\mu_n(x_n)} = \frac{\pi_n(x_n)}{\pi_{n+1}(x_n)}.$$

Auxiliary SMC Samplers: Piecewise-Deterministic Processes

[Figure: an example piecewise-deterministic trajectory in the (x, y) plane.]

Auxiliary SMC Samplers: Filtering of Piecewise-Deterministic Processes (Whiteley, Johansen & Godsill, 2007)

- $X_n = (k_n, \tau_{n,1:k_n}, \theta_{n,0:k_n})$ specifies a continuous-time trajectory $(\zeta_t)_{t \in [0, t_n]}$.
- $\zeta_t$ is observed in the presence of noise, e.g. $Y_n = \zeta_{t_n} + V_n$.
- Target a sequence of distributions $\{\pi_n\}$ on nested spaces:
$$\pi_n(k, \tau_{1:k}, \theta_{0:k}) \propto q(\theta_0) \prod_{j=1}^{k} f(\tau_j \mid \tau_{j-1})\, q(\theta_j \mid \tau_j, \theta_{j-1}, \tau_{j-1})\; S(t_n, \tau_k) \prod_{p=1}^{n} g(y_p \mid \zeta_{t_p}).$$

Auxiliary SMC Samplers: Filtering of Piecewise-Deterministic Processes (continued)

- $\pi_n$ yields posterior marginal distributions for the recent history of $\zeta_t$.
- Employ auxiliary SMC samplers: choose $\mu_n(k, \tau_{1:k}, \theta_{0:k}) \propto \pi_n(k, \tau_{1:k}, \theta_{0:k})\, V_n(\tau_k, \theta_k)$.
- The kernel proposes new pairs $(\tau_j, \theta_j)$ and adjusts recent pairs.
- Applications in object tracking and finance.

The Auxiliary SMC-PHD Filter: Probability Hypothesis Density Filtering (Mahler, 2003; Vo et al., 2005)

- Approximates the optimal filter for a class of spatial point process-valued HMMs.
- A recursion for intensity functions:
$$\tilde\alpha_n(x_n) = \int_E f(x_n \mid x_{n-1})\, p_S(x_{n-1})\, \alpha_{n-1}(x_{n-1})\, dx_{n-1} + \gamma(x_n),$$
$$\alpha_n(x_n) = \left[1 - p_D(x_n) + \sum_{r=1}^{m_n} \frac{\psi_{n,r}(x_n)}{Z_{n,r}}\right] \tilde\alpha_n(x_n).$$
- Approximate the intensity functions $\{\alpha_n\}_{n \geq 0}$ using SMC (a rough sketch follows).
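A rough one-step sketch of an SMC approximation to this recursion in one dimension; the dynamics, survival and detection probabilities, birth intensity, and the clutter term folded into $Z_{n,r}$ are all illustrative assumptions, not the construction of the cited papers:

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(3)
p_S, p_D, kappa = 0.95, 0.9, 0.1       # survival, detection, clutter intensity
sigma_x, sigma_y = 1.0, 0.5
birth_mass, n_birth = 0.2, 100

# alpha_{n-1} as weighted particles; the total weight approximates the
# expected number of targets (here: roughly two).
x = rng.normal(0.0, 5.0, size=500)
w = np.full(500, 2.0 / 500)
zs = np.array([-3.1, 2.8, 7.0])        # measurements at time n

# Prediction: surviving particles move under f; add birth particles from gamma.
x_pred = np.concatenate([x + sigma_x * rng.standard_normal(x.size),
                         rng.uniform(-10.0, 10.0, size=n_birth)])
w_pred = np.concatenate([p_S * w, np.full(n_birth, birth_mass / n_birth)])

# Update: alpha_n = [1 - p_D + sum_r psi_{n,r} / Z_{n,r}] alpha_tilde_n,
# with psi_{n,r}(x) = p_D g(z_r | x) and Z_{n,r} = kappa + sum_i psi_{n,r}(x_i) w_i.
psi = p_D * norm.pdf(zs[:, None], x_pred[None, :], sigma_y)   # shape (R, M)
Z = kappa + psi @ w_pred
w_new = (1.0 - p_D + (psi / Z[:, None]).sum(axis=0)) * w_pred
print("estimated number of targets:", w_new.sum())
```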

The Auxiliary SMC-PHD Filter: An ASMC Implementation (Whiteley et al., 2007)

$$\alpha_{n-1}(dx_{n-1}) \approx \alpha^N_{n-1}(dx_{n-1}) = \frac{1}{N} \sum_{i=1}^N W^i_{n-1}\, \delta_{X^i_{n-1}}(dx_{n-1})$$

In a simple case, the target integral at the $n$th iteration is
$$\sum_{r=1}^{m_n} \int_E \int_E \varphi(x_n)\, \frac{\psi_{n,r}(x_n)}{Z_{n,r}}\, f(x_n \mid x_{n-1})\, p_S(x_{n-1})\, dx_n\, \alpha^N_{n-1}(dx_{n-1}).$$

The proposal distribution $q_n(x_n, x_{n-1}, r)$ is built from $\alpha^N_{n-1}$ and potential functions $\{V_{n,r}\}$, and factorises as
$$q_n(x_n, dx_{n-1}, r) = q_n(x_n \mid x_{n-1}, r)\, \frac{\sum_{i=1}^N V_{n,r}(X^{(i)}_{n-1})\, W^{(i)}_{n-1}\, \delta_{X^{(i)}_{n-1}}(dx_{n-1})}{\sum_{i=1}^N V_{n,r}(X^{(i)}_{n-1})\, W^{(i)}_{n-1}}\, q_n(r),$$
with the normalising constants $\{Z_{n,r}\}$ also estimated by importance sampling.

On Stratification

- Back to the HMM setting: let $(A_p)_{p=1}^M$ denote a partition of $\mathcal{X}$.
- Introduce the stratum indicator variable $R_n = \sum_{p=1}^M p\, \mathbb{I}_{A_p}(X_n)$.
- Define the extended model:
$$p(x_{1:n}, r_{1:n} \mid y_{1:n}) \propto g(y_n \mid x_n)\, f(x_n \mid r_n, x_{n-1})\, f(r_n \mid x_{n-1})\, p(x_{1:n-1}, r_{1:n-1} \mid y_{1:n-1}).$$
- An auxiliary SMC filter resamples from a distribution with $N \times M$ support points, over particles and strata.
- Applications in switching state-space models.

Conclusions

Conclusions

The APF is a standard SIR algorithm for nonstandard distributions. This interpretation:
- allows standard results to be applied directly,
- provides guidance on implementation, and
- allows the same technique to be applied in more general settings.

Thanks for listening... any questions?

References

[1] P. Del Moral, A. Doucet, and A. Jasra. Sequential Monte Carlo samplers. Journal of the Royal Statistical Society B, 68(3):411-436, 2006.
[2] A. M. Johansen and A. Doucet. Auxiliary variable sequential Monte Carlo methods. Technical Report 07:09, University of Bristol, Department of Mathematics, Statistics Group, July 2007.
[3] A. M. Johansen and A. Doucet. A note on the auxiliary particle filter. Statistics and Probability Letters. To appear.
[4] A. M. Johansen and N. Whiteley. A modern perspective on auxiliary particle filters. In Proceedings of the Workshop on Inference and Estimation in Probabilistic Time Series Models, Isaac Newton Institute, June. To appear.
[5] R. P. S. Mahler. Multitarget Bayes filtering via first-order multitarget moments. IEEE Transactions on Aerospace and Electronic Systems, 39(4):1152-1178, October 2003.
[6] M. K. Pitt and N. Shephard. Filtering via simulation: Auxiliary particle filters. Journal of the American Statistical Association, 94(446):590-599, 1999.
[7] B. Vo, S. S. Singh, and A. Doucet. Sequential Monte Carlo methods for multi-target filtering with random finite sets. IEEE Transactions on Aerospace and Electronic Systems, 41(4):1224-1245, October 2005.
[8] N. Whiteley, A. M. Johansen, and S. Godsill. Monte Carlo filtering of piecewise-deterministic processes. Technical Report CUED/F-INFENG/TR-592, University of Cambridge, Department of Engineering, December 2007.
[9] N. Whiteley, S. Singh, and S. Godsill. Auxiliary particle implementation of the probability hypothesis density filter. Technical Report CUED/F-INFENG/590, University of Cambridge, Department of Engineering, 2007.
