A Brief Overview of Particle Filtering


1 A Brief Overview of Particle Filtering. Adam M. Johansen, www2.warwick.ac.uk/fac/sci/statistics/staff/academic/johansen/talks/ May 11th, 2010, Warwick University Centre for Systems Biology

2 Outline
  - Background: Monte Carlo Methods; Hidden Markov Models / State Space Models
  - Sequential Monte Carlo Filtering: Sequential Importance Sampling; Sequential Importance Resampling; Advanced methods
  - Smoothing
  - Parameter Estimation

3 Background

4 Monte Carlo: Estimating π. Rain falls uniformly on a square of side 2r with a circle of radius r inscribed in it, so A_square = 4r², A_circle = πr², and the probability that a drop lands in the circle is p = A_circle / A_square = π/4. Counting successes among 500 drops gives an estimate of p, and multiplying by 4 gives an estimate of π. We also obtain confidence intervals.
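A minimal numerical sketch of this estimator (not from the slides; the 500-drop count follows the slide, while the random seed and the normal-approximation interval are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)
N = 500                                     # number of uniform "raindrops"
x = rng.uniform(-1.0, 1.0, N)               # square of side 2r, here r = 1
y = rng.uniform(-1.0, 1.0, N)
inside = x**2 + y**2 <= 1.0                 # success: drop lands inside the inscribed circle
p_hat = inside.mean()                       # estimates p = A_circle / A_square = pi / 4
pi_hat = 4.0 * p_hat
se = 4.0 * np.sqrt(p_hat * (1.0 - p_hat) / N)   # binomial standard error, scaled by 4
print(pi_hat, (pi_hat - 1.96 * se, pi_hat + 1.96 * se))
```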

5 Monte Carlo: The Monte Carlo Method. Given a probability density p, we want I = ∫ φ(x) p(x) dx. Simple Monte Carlo solution: sample X_1, ..., X_N ~ p i.i.d. and estimate Î = (1/N) Σ_{i=1}^N φ(X_i). Justified by the law of large numbers... and the central limit theorem. Can also be viewed as approximating π(dx) = p(x) dx with π̂^N(dx) = (1/N) Σ_{i=1}^N δ_{X_i}(dx).

6 Monte Carlo: Importance Sampling. Given q such that p(x) > 0 ⇒ q(x) > 0 and p(x)/q(x) < ∞, define w(x) = p(x)/q(x) and I = ∫ φ(x) p(x) dx = ∫ φ(x) w(x) q(x) dx. This suggests the importance sampling estimator: sample X_1, ..., X_N ~ q i.i.d. and estimate Î = (1/N) Σ_{i=1}^N w(X_i) φ(X_i). Can also be viewed as approximating π(dx) = p(x) dx with π̂^N(dx) = (1/N) Σ_{i=1}^N w(X_i) δ_{X_i}(dx).
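A small sketch of this estimator (not from the slides): the standard normal target, the test function φ(x) = x², and the Student-t proposal are illustrative assumptions.

```python
import numpy as np
from scipy import stats

N = 10_000
x = stats.t.rvs(df=3, size=N, random_state=1)      # X_i ~ q, a heavier-tailed proposal
w = stats.norm.pdf(x) / stats.t.pdf(x, df=3)       # w(x) = p(x) / q(x)
I_hat = np.mean(w * x**2)                          # (1/N) sum_i w(X_i) phi(X_i); true value is 1
print(I_hat)
```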

7 Monte Carlo: Variance and Importance Sampling. The variance depends upon the proposal:
Var(Î) = E[((1/N) Σ_{i=1}^N w(X_i) φ(X_i) - I)²]
       = (1/N²) Σ_{i=1}^N (E[w(X_i)² φ(X_i)²] - I²)
       = (1/N) (E_q[w(X)² φ(X)²] - I²)
       = (1/N) (E_p[w(X) φ(X)²] - I²).
For positive φ, this is minimized by q(x) ∝ p(x) φ(x), i.e. w(x) φ(x) ∝ 1.

8 Monte Carlo: Self-Normalised Importance Sampling. Often, p is known only up to a normalising constant. As E_q(C w φ) = C E_p(φ): if v(x) = C w(x), then
E_q(v φ) / E_q(v) = E_q(C w φ) / E_q(C w) = C E_p(φ) / (C E_p(1)) = E_p(φ).
Estimate the numerator and denominator with the same sample:
Î = [Σ_{i=1}^N v(X_i) φ(X_i)] / [Σ_{i=1}^N v(X_i)].
Biased for finite samples, but consistent. Typically reduces variance.
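A sketch of the self-normalised estimator under illustrative assumptions (unnormalised target v(x) = exp(-x²/2), i.e. N(0, 1) without its constant, proposal N(0, 2²), and φ(x) = x²):

```python
import numpy as np
from scipy import stats

N = 10_000
x = stats.norm.rvs(scale=2.0, size=N, random_state=2)    # X_i ~ q = N(0, 2^2)
v = np.exp(-0.5 * x**2) / stats.norm.pdf(x, scale=2.0)   # v(x) = C p(x) / q(x), C unknown
I_hat = np.sum(v * x**2) / np.sum(v)                     # self-normalised estimate of E_p[x^2] = 1
print(I_hat)
```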

9 Hidden Markov Models / State Space Models. (Graphical model: a chain x_1 → x_2 → ... → x_6, with each x_i emitting an observation y_i.) Unobserved Markov chain {X_n} with transition density f. Observed process {Y_n} with conditional density g. Joint density:
p(x_{1:n}, y_{1:n}) = f_1(x_1) g(y_1 | x_1) Π_{i=2}^n f(x_i | x_{i-1}) g(y_i | x_i).
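As a running example for the sketches that follow, here is a simulator for a scalar state-space model with f(x_i | x_{i-1}) = N(0.9 x_{i-1}, 1) and g(y_i | x_i) = N(x_i, 0.5²). This particular model is an assumption made for illustration, not one taken from the slides.

```python
import numpy as np

def simulate_ssm(n, rng):
    """Draw (x_{1:n}, y_{1:n}) from f_1(x_1) g(y_1|x_1) prod_i f(x_i|x_{i-1}) g(y_i|x_i)."""
    x, y = np.empty(n), np.empty(n)
    x[0] = rng.normal(0.0, 1.0)                 # f_1: initial density
    y[0] = rng.normal(x[0], 0.5)                # g: observation density
    for i in range(1, n):
        x[i] = rng.normal(0.9 * x[i - 1], 1.0)  # f: transition density
        y[i] = rng.normal(x[i], 0.5)
    return x, y

x_true, y_obs = simulate_ssm(100, np.random.default_rng(3))
```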

10 Hidden Markov Models: Inference in HMMs. Given y_{1:n}: what is x_{1:n}; in particular, what is x_n, and what about x_{n+1}? f and g may have unknown parameters θ. We wish to obtain estimates for n = 1, 2, ....

11 Hidden Markov Models: Some Terminology. Three main problems. Given known parameters θ, what are:
the filtering and prediction distributions,
p(x_n | y_{1:n}) = ∫ p(x_{1:n} | y_{1:n}) dx_{1:n-1}
p(x_{n+1} | y_{1:n}) = ∫ p(x_{n+1} | x_n) p(x_n | y_{1:n}) dx_n,
and the smoothing distributions,
p(x_{1:n} | y_{1:n}) = p(x_{1:n}, y_{1:n}) / p(y_{1:n})?
And how can we estimate static parameters, p(θ | y_{1:n}) and p(x_{1:n}, θ | y_{1:n})?

12 Hidden Markov Models: Formal Solutions. Filtering and prediction recursions:
p(x_n | y_{1:n}) = p(x_n | y_{1:n-1}) g(y_n | x_n) / ∫ p(x_n' | y_{1:n-1}) g(y_n | x_n') dx_n'
p(x_{n+1} | y_{1:n}) = ∫ p(x_n | y_{1:n}) f(x_{n+1} | x_n) dx_n
Smoothing:
p(x_{1:n} | y_{1:n}) = p(x_{1:n-1} | y_{1:n-1}) f(x_n | x_{n-1}) g(y_n | x_n) / ∫ g(y_n | x_n) f(x_n | x_{n-1}) p(x_{n-1} | y_{1:n-1}) dx_{n-1:n}
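When the state space is finite, these recursions can be evaluated exactly because the integrals become sums. A small sketch, assuming a two-state, two-symbol model chosen purely for illustration:

```python
import numpy as np

def discrete_filter(F, G, y, p1):
    """F[j, k] = f(x_n = k | x_{n-1} = j), G[k, o] = g(y_n = o | x_n = k), p1 = law of x_1."""
    out = []
    pred = p1                              # p(x_1), then p(x_n | y_{1:n-1})
    for yn in y:
        filt = pred * G[:, yn]             # numerator: p(x_n | y_{1:n-1}) g(y_n | x_n)
        filt /= filt.sum()                 # normalise to obtain p(x_n | y_{1:n})
        out.append(filt)
        pred = filt @ F                    # p(x_{n+1} | y_{1:n}) = sum_{x_n} p(x_n | y_{1:n}) f(x_{n+1} | x_n)
    return np.array(out)

F = np.array([[0.9, 0.1], [0.2, 0.8]])
G = np.array([[0.8, 0.2], [0.3, 0.7]])
print(discrete_filter(F, G, [0, 1, 1, 0], np.array([0.5, 0.5])))
```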

13 Particle Filtering: Importance Sampling in This Setting. We are given p(x_{1:n} | y_{1:n}) for n = 1, 2, .... We could sample from a separate sequence q_n(x_{1:n}) for each n, or we could let q_n(x_{1:n}) = q_n(x_n | x_{1:n-1}) q_{n-1}(x_{1:n-1}) and re-use our samples. The importance weight then decomposes:
w_n(x_{1:n}) ∝ p(x_{1:n} | y_{1:n}) / [q_n(x_n | x_{1:n-1}) q_{n-1}(x_{1:n-1})]
            ∝ [p(x_{1:n} | y_{1:n}) / (q_n(x_n | x_{1:n-1}) p(x_{1:n-1} | y_{1:n-1}))] w_{n-1}(x_{1:n-1})
            = [f(x_n | x_{n-1}) g(y_n | x_n) / (q_n(x_n | x_{n-1}) p(y_n | y_{1:n-1}))] w_{n-1}(x_{1:n-1})

14 Particle Filtering: Sequential Importance Sampling (Prediction & Update). A first "particle filter". Simple default: q_n(x_n | x_{n-1}) = f(x_n | x_{n-1}), so the importance weighting becomes w_n(x_{1:n}) = w_{n-1}(x_{1:n-1}) g(y_n | x_n). Algorithmically, at iteration n, given {W^i_{n-1}, X^i_{1:n-1}} for i = 1, ..., N:
Sample X^i_n ~ f(· | X^i_{n-1}) (prediction).
Weight W^i_n ∝ W^i_{n-1} g(y_n | X^i_n) (update).
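A sketch of this SIS scheme for the scalar model introduced above; the model parameters and the log-weight bookkeeping are illustrative assumptions, not the slides' algorithm verbatim.

```python
import numpy as np

def norm_logpdf(y, mean, sd):
    return -0.5 * ((y - mean) / sd) ** 2 - np.log(sd * np.sqrt(2.0 * np.pi))

def sis_filter(y, N, rng):
    """SIS with the 'prior' proposal q_n = f; no resampling, so weights degenerate as n grows."""
    x = rng.normal(0.0, 1.0, N)                    # X_1^i ~ f_1
    logw = np.zeros(N)
    means = []
    for n, yn in enumerate(y):
        if n > 0:
            x = rng.normal(0.9 * x, 1.0)           # prediction: X_n^i ~ f(. | X_{n-1}^i)
        logw += norm_logpdf(yn, x, 0.5)            # update: W_n^i propto W_{n-1}^i g(y_n | X_n^i)
        w = np.exp(logw - logw.max())
        w /= w.sum()
        means.append(np.sum(w * x))                # self-normalised estimate of E[X_n | y_{1:n}]
    return np.array(means)
```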

15 Particle Filtering: Example: Almost Constant Velocity Model. States: x_n = [s^x_n, u^x_n, s^y_n, u^y_n]^T (positions and velocities in two dimensions). Dynamics: x_n = A x_{n-1} + ε_n with
A = [[1, Δt, 0, 0], [0, 1, 0, 0], [0, 0, 1, Δt], [0, 0, 0, 1]].
Observation: y_n = [r^x_n, r^y_n]^T = B x_n + ν_n, where B = [[1, 0, 0, 0], [0, 0, 1, 0]] picks out the position components.
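A simulation sketch of this model; Δt = 1 and simple independent Gaussian noise on each component are assumptions made for illustration (the slides do not specify the noise covariances).

```python
import numpy as np

dt = 1.0
A = np.array([[1.0,  dt, 0.0, 0.0],     # positions integrate velocities
              [0.0, 1.0, 0.0, 0.0],
              [0.0, 0.0, 1.0,  dt],
              [0.0, 0.0, 0.0, 1.0]])
B = np.array([[1.0, 0.0, 0.0, 0.0],     # only the two position components are observed
              [0.0, 0.0, 1.0, 0.0]])

def simulate_acv(n, rng, sd_state=0.1, sd_obs=1.0):
    x = np.zeros(4)                      # x_n = [s_x, u_x, s_y, u_y]
    xs, ys = [], []
    for _ in range(n):
        x = A @ x + sd_state * rng.normal(size=4)        # x_n = A x_{n-1} + eps_n
        ys.append(B @ x + sd_obs * rng.normal(size=2))   # y_n = B x_n + nu_n
        xs.append(x)
    return np.array(xs), np.array(ys)
```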

16 Particle Filtering: Better Proposal Distributions. The importance weighting is
w_n(x_{1:n}) ∝ [f(x_n | x_{n-1}) g(y_n | x_n) / q_n(x_n | x_{n-1})] w_{n-1}(x_{1:n-1}).
Optimally, we'd choose q_n(x_n | x_{n-1}) ∝ f(x_n | x_{n-1}) g(y_n | x_n). This is typically impossible, but a good approximation is often possible and can give much better performance. Even so, the variance of the approximation eventually becomes too large.
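For the scalar Gaussian example assumed above, this "optimal" proposal is available in closed form. A sketch (a = 0.9, transition variance 1, observation variance 0.25 are the same illustrative choices as before): X_n is sampled from the normalised product f(x_n | x_{n-1}) g(y_n | x_n), and the incremental weight becomes the predictive density p(y_n | x_{n-1}).

```python
import numpy as np

def optimal_proposal_step(x_prev, yn, rng, a=0.9, var_f=1.0, var_g=0.25):
    """Sample X_n ~ q_opt(. | x_{n-1}, y_n) propto f(x_n | x_{n-1}) g(y_n | x_n);
    the incremental weight is then p(y_n | x_{n-1}) = N(y_n; a x_{n-1}, var_f + var_g)."""
    var_q = 1.0 / (1.0 / var_f + 1.0 / var_g)
    mean_q = var_q * (a * x_prev / var_f + yn / var_g)
    x_new = rng.normal(mean_q, np.sqrt(var_q))
    log_inc_w = (-0.5 * (yn - a * x_prev) ** 2 / (var_f + var_g)
                 - 0.5 * np.log(2.0 * np.pi * (var_f + var_g)))
    return x_new, log_inc_w
```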

17 Particle Filtering: Resampling. Given {W^i_n, X^i_{1:n}} targeting p(x_{1:n} | y_{1:n}), sample X̃^i_{1:n} ~ Σ_{j=1}^N W^j_n δ_{X^j_{1:n}}. Then {1/N, X̃^i_{1:n}} also targets p(x_{1:n} | y_{1:n}). Such steps can be incorporated into the SIS algorithm.
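A minimal multinomial resampling sketch (the slides note that more efficient schemes, e.g. those compared in [4], also exist):

```python
import numpy as np

def multinomial_resample(x, w, rng):
    """Draw N indices with P(index = j) = W_n^j and return the equally weighted set {1/N, X~^i}."""
    N = len(w)
    idx = rng.choice(N, size=N, p=w)
    return x[idx], np.full(N, 1.0 / N)
```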

18 Particle Filtering: Sequential Importance Resampling. Algorithmically, at iteration n, given {W^i_{n-1}, X^i_{n-1}} for i = 1, ..., N:
Resample, obtaining {1/N, X̃^i_{n-1}}.
Sample X^i_n ~ q(· | X̃^i_{n-1}).
Weight W^i_n ∝ f(X^i_n | X̃^i_{n-1}) g(y_n | X^i_n) / q(X^i_n | X̃^i_{n-1}).
Actually: you can resample more efficiently, and it's only necessary to resample some of the time.
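Putting the pieces together, a bootstrap-style SIR sketch for the same assumed scalar model, resampling only when the effective sample size falls below a threshold; with q = f the weight reduces to g(y_n | X_n^i).

```python
import numpy as np

def sir_filter(y, N, rng, ess_fraction=0.5):
    """Sequential importance resampling with q = f and adaptive (ESS-triggered) resampling."""
    x = rng.normal(0.0, 1.0, N)
    w = np.full(N, 1.0 / N)
    means = []
    for n, yn in enumerate(y):
        if 1.0 / np.sum(w**2) < ess_fraction * N:        # resample only some of the time
            x = x[rng.choice(N, size=N, p=w)]
            w = np.full(N, 1.0 / N)
        if n > 0:
            x = rng.normal(0.9 * x, 1.0)                 # sample X_n^i ~ q(. | X_{n-1}^i) = f
        w = w * np.exp(-0.5 * ((yn - x) / 0.5) ** 2)     # weight: f g / q = g(y_n | X_n^i)
        w /= w.sum()
        means.append(np.sum(w * x))                      # filtering estimate of E[X_n | y_{1:n}]
    return np.array(means)
```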

19 Particle Filtering: What else can we do? Some common improvements: incorporate MCMC moves [7]; use the next observation before resampling (auxiliary particle filters) [11, 9]; sample new values for recent states [6]; explicitly approximate the marginal filter [10].

20 Particle Smoothing: Smoothing. Resampling helps filtering, but it eliminates unique particle values, so approximating p(x_{1:n} | y_{1:n}) or p(x_m | y_{1:n}) fails for large n.

21 Particle Smoothing: Smoothing Strategies. Some approaches:
Fixed-lag smoothing: p(x_{n-L} | y_{1:n}) ≈ p(x_{n-L} | y_{1:n-1}) for large enough L (a sketch follows below).
Particle implementation of the forward-backward decomposition p(x_{1:n} | y_{1:n}) = p(x_n | y_{1:n}) Π_{m=1}^{n-1} p(x_m | x_{m+1}, y_{1:n}); Doucet et al. have developed a forward-only variant.
Particle implementation of the two-filter formula p(x_m | y_{1:n}) ∝ p(x_m | y_{1:m}) p(y_{m+1:n} | x_m).
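A sketch of the first strategy only: a bootstrap filter for the assumed scalar model that stores whole trajectories and, at time n, reads off the fixed-lag approximation to the smoothed marginal of x_{n-L} from the current particle set (resampling every step, for simplicity).

```python
import numpy as np

def fixed_lag_estimates(y, N, L, rng):
    """At time n the weighted paths approximate p(x_{1:n} | y_{1:n}); their x_{n-L}
    marginal is the fixed-lag estimate, which is never revisited afterwards."""
    paths = rng.normal(0.0, 1.0, (N, 1))
    w = np.full(N, 1.0 / N)
    lagged = []
    for n, yn in enumerate(y):
        if n > 0:
            paths = paths[rng.choice(N, size=N, p=w)]            # resample whole paths
            paths = np.column_stack([paths, rng.normal(0.9 * paths[:, -1], 1.0)])
            w = np.full(N, 1.0 / N)
        w = w * np.exp(-0.5 * ((yn - paths[:, -1]) / 0.5) ** 2)
        w /= w.sum()
        if n >= L:
            lagged.append(np.sum(w * paths[:, n - L]))           # approx. E[X_{n-L} | y_{1:n}]
    return np.array(lagged)
```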

22 Parameter Estimation: Static Parameters. If f(x_n | x_{n-1}) and g(y_n | x_n) depend upon θ, we could set x'_n = (x_n, θ_n) and f'(x'_n | x'_{n-1}) = f(x_n | x_{n-1}) δ_{θ_{n-1}}(θ_n). Degeneracy causes difficulties: resampling eliminates values, we never propose new values, and MCMC transitions don't mix well.

23 Parameter Estimation: Approaches to Parameter Estimation. Artificial dynamics, θ_n = θ_{n-1} + ε_n (sketched below). Iterated filtering. Sufficient statistics. Various online likelihood approaches. Practical filtering.
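A sketch of the artificial-dynamics idea for the assumed scalar model, treating the autoregressive coefficient as the unknown θ: the state is augmented to (x_n, θ_n) and θ is perturbed by a little noise at each step. The jitter size and uniform prior are illustrative choices, not recommendations from the slides.

```python
import numpy as np

def artificial_dynamics_filter(y, N, rng, jitter=0.01):
    """Bootstrap-style filter on the augmented state (x_n, theta_n), theta_n = theta_{n-1} + eps_n."""
    x = rng.normal(0.0, 1.0, N)
    theta = rng.uniform(0.0, 1.0, N)                 # prior draws for the unknown parameter
    w = np.full(N, 1.0 / N)
    theta_means = []
    for n, yn in enumerate(y):
        idx = rng.choice(N, size=N, p=w)             # resample (x, theta) pairs jointly
        x, theta = x[idx], theta[idx]
        theta = theta + jitter * rng.normal(size=N)  # artificial dynamics keep theta values fresh
        if n > 0:
            x = rng.normal(theta * x, 1.0)           # transition now depends on theta
        w = np.exp(-0.5 * ((yn - x) / 0.5) ** 2)
        w /= w.sum()
        theta_means.append(np.sum(w * theta))        # running posterior-mean estimate of theta
    return np.array(theta_means)
```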

24 Summary

25 SMCTC: C++ Template Class for SMC Algorithms. Implementing SMC algorithms in C/C++ isn't hard. SMCTC is software for implementing general SMC algorithms, with the C++ element largely confined to the library. Available (under a GPL-3 license) from www2.warwick.ac.uk/fac/sci/statistics/staff/academic/johansen/smctc/ or by typing smctc into Google. The example code includes a simple particle filter.

26 In Conclusion (with references). Sequential Monte Carlo methods ("particle filters") can approximate filtering and smoothing distributions [5]. Proposal distributions matter. Resampling is necessary [2], but not every iteration, and need not be multinomial [4]. Smoothing and parameter estimation are harder. Software for easy implementation exists: PFLib [1], SMCTC [8]. Actually, similar techniques apply elsewhere [3].

27 References
[1] L. Chen, C. Lee, A. Budhiraja, and R. K. Mehra. PFlib: An object oriented Matlab toolbox for particle filtering. In Proceedings of SPIE Signal Processing, Sensor Fusion and Target Recognition XVI, volume 6567, 2007.
[2] P. Del Moral. Feynman-Kac Formulae: Genealogical and Interacting Particle Systems with Applications. Probability and Its Applications. Springer Verlag, New York, 2004.
[3] P. Del Moral, A. Doucet, and A. Jasra. Sequential Monte Carlo samplers. Journal of the Royal Statistical Society B, 68(3):411-436, 2006.
[4] R. Douc and O. Cappé. Comparison of resampling schemes for particle filters. In Proceedings of the 4th International Symposium on Image and Signal Processing and Analysis, volume I, pages 64-69. IEEE, 2005.
[5] A. Doucet and A. M. Johansen. A tutorial on particle filtering and smoothing: Fifteen years later. In D. Crisan and B. Rozovsky, editors, The Oxford Handbook of Nonlinear Filtering. Oxford University Press. To appear.
[6] A. Doucet, M. Briers, and S. Sénécal. Efficient block sampling strategies for sequential Monte Carlo methods. Journal of Computational and Graphical Statistics, 15(3):693-711, 2006.
[7] W. R. Gilks and C. Berzuini. Following a moving target: Monte Carlo inference for dynamic Bayesian models. Journal of the Royal Statistical Society B, 63:127-146, 2001.
[8] A. M. Johansen. SMCTC: Sequential Monte Carlo in C++. Journal of Statistical Software, 30(6):1-41, April 2009.
[9] A. M. Johansen and A. Doucet. A note on the auxiliary particle filter. Statistics and Probability Letters, 78(12):1498-1504, September 2008.
[10] M. Klaas, N. de Freitas, and A. Doucet. Toward practical N² Monte Carlo: The marginal particle filter. In Proceedings of Uncertainty in Artificial Intelligence, 2005.
[11] M. K. Pitt and N. Shephard. Filtering via simulation: Auxiliary particle filters. Journal of the American Statistical Association, 94(446):590-599, 1999.

28 SMC Sampler Outline. Given a sample {X^(i)_{1:n-1}}_{i=1}^N targeting η_{n-1}: sample X^(i)_n ~ K_n(X^(i)_{n-1}, ·) and calculate
W_n(X^(i)_{1:n}) = η_n(X^(i)_n) L_{n-1}(X^(i)_n, X^(i)_{n-1}) / [η_{n-1}(X^(i)_{n-1}) K_n(X^(i)_{n-1}, X^(i)_n)].
Resample, yielding {X^(i)_{1:n}}_{i=1}^N targeting η_n. This hints that we'd like to use
L_{n-1}(x_n, x_{n-1}) = η_{n-1}(x_{n-1}) K_n(x_{n-1}, x_n) / ∫ η_{n-1}(x'_{n-1}) K_n(x'_{n-1}, x_n) dx'_{n-1}.
