Advanced Computational Methods in Statistics: Lecture 5 Sequential Monte Carlo/Particle Filtering


1 Advanced Computational Methods in Statistics: Lecture 5 Sequential Monte Carlo/Particle Filtering Axel Gandy Department of Mathematics Imperial College London London Taught Course Centre for PhD Students in the Mathematical Sciences Autumn 2014

2 Outline
Introduction: Setup; Examples; Finitely Many States; Kalman Filter
Particle Filtering
Improving the Algorithm
Further Topics
Summary

3 Sequential Monte Carlo/Particle Filtering
Particle filtering was introduced in Gordon et al. (1993). Most of the material on particle filtering is based on Doucet et al. (2001) and on the tutorial Doucet & Johansen (2008).
Examples: tracking of objects, robot localisation, financial applications.

4 Setup - Hidden Markov Model
x_0, x_1, x_2, ...: unobserved Markov chain (hidden states)
y_1, y_2, ...: observations
[Figure: graphical model with arrows x_0 → x_1 → x_2 → ... along the chain and x_t → y_t for each observation]
y_1, y_2, ... are conditionally independent given x_0, x_1, ...
Model given by:
- π(x_0): the initial distribution
- f(x_t | x_{t-1}) for t ≥ 1: the transition kernel of the Markov chain
- g(y_t | x_t) for t ≥ 1: the distribution of the observations
Notation: x_{0:t} = (x_0, ..., x_t): hidden states up to time t; y_{1:t} = (y_1, ..., y_t): observations up to time t.
Interested in the posterior distribution p(x_{0:t} | y_{1:t}) or in p(x_t | y_{1:t}).
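To make the setup concrete, here is a minimal simulation sketch (an addition, not from the slides): it draws a path from a hidden Markov model given the three ingredients π, f, g as sampling functions. The random-walk example model used to drive it reappears later in the lecture.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_hmm(T, sample_prior, sample_transition, sample_observation):
    """Draw x_0, ..., x_T and y_1, ..., y_T from a hidden Markov model."""
    x = [sample_prior()]                      # x_0 ~ pi(x_0)
    y = []
    for _ in range(T):
        x.append(sample_transition(x[-1]))    # x_t ~ f(. | x_{t-1})
        y.append(sample_observation(x[-1]))   # y_t ~ g(. | x_t)
    return np.array(x), np.array(y)

# Example model (used again later): x_0 ~ N(0,1), x_t ~ N(x_{t-1},1), y_t ~ N(x_t,1)
x, y = simulate_hmm(
    50,
    sample_prior=lambda: rng.normal(),
    sample_transition=lambda xp: rng.normal(xp, 1.0),
    sample_observation=lambda xt: rng.normal(xt, 1.0),
)
```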

5 Some Remarks
No explicit solution in the general case; only in special cases.
We will focus on the time-homogeneous case, i.e. the transition density f and the density of the observations g are the same for each step. Extensions to the inhomogeneous case are straightforward.
It is important that one is able to update quickly as new data becomes available: if y_{t+1} is observed, one wants to be able to compute p(x_{t+1} | y_{1:t+1}) quickly from p(x_t | y_{1:t}) and y_{t+1}.

6 Example - Bearings Only Tracking
Gordon et al. (1993); a ship moving in the two-dimensional plane; a stationary observer sees only the angle to the ship.
Hidden states: position (x_{t,1}, x_{t,2}); speed (x_{t,3}, x_{t,4}).
Speed changes randomly: x_{t,3} ~ N(x_{t-1,3}, σ²), x_{t,4} ~ N(x_{t-1,4}, σ²).
Position changes accordingly: x_{t,1} = x_{t-1,1} + x_{t-1,3}, x_{t,2} = x_{t-1,2} + x_{t-1,4}.
Observations: y_t ~ N(tan⁻¹(x_{t,1}/x_{t,2}), η²).
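A short simulation sketch of these dynamics (an addition; the starting state and the noise levels σ and η are illustrative guesses, not values from the slides). arctan2 is used in place of tan⁻¹(x_{t,1}/x_{t,2}) so the bearing stays well defined in all quadrants.

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate_ship(T, sigma=0.001, eta=0.005):
    """Simulate the bearings-only tracking model and the observed angles."""
    x = np.array([0.5, 0.5, 0.01, 0.01])      # (x_1, x_2, x_3, x_4): position, speed
    states, bearings = [], []
    for _ in range(T):
        pos = x[:2] + x[2:]                   # x_{t,1} = x_{t-1,1} + x_{t-1,3}, etc.
        speed = rng.normal(x[2:], sigma)      # x_{t,3} ~ N(x_{t-1,3}, sigma^2), etc.
        x = np.concatenate([pos, speed])
        states.append(x)
        bearings.append(rng.normal(np.arctan2(x[0], x[1]), eta))  # noisy angle y_t
    return np.array(states), np.array(bearings)

states, y = simulate_ship(100)
```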

7 Example - Stochastic Volatility
Returns: y_t ~ N(0, β² exp(x_t)) (observable from price data)
Volatility: x_t ~ N(α x_{t-1}, σ²/(1-α²)), x_1 ~ N(0, σ²/(1-α²)); σ = 1, β = 0.5, α =
[Figure: simulated volatility x_t and returns y_t against t]

8 Hidden Markov Chain with Finitely Many States
If the hidden process x_t takes only finitely many values, then p(x_{t+1} | y_{1:t+1}) can be computed recursively via
p(x_{t+1} | y_{1:t+1}) ∝ g(y_{t+1} | x_{t+1}) Σ_{x_t} f(x_{t+1} | x_t) p(x_t | y_{1:t})
May not be practicable if there are too many states!
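For a chain with finitely many states the recursion is a few lines of linear algebra. A minimal sketch (an addition; the convention F[i, j] = f(j | i) and the two-state numbers are assumptions for illustration):

```python
import numpy as np

def finite_state_filter(pi0, F, obs_lik):
    """Exact filtering for a finite-state hidden Markov chain.
    pi0: (K,) initial distribution pi(x_0); F: (K, K) transition matrix with
    F[i, j] = f(x_{t+1}=j | x_t=i); obs_lik: (T, K) with obs_lik[t, k] = g(y_{t+1} | x=k)."""
    p = pi0.copy()
    filtered = []
    for lik in obs_lik:
        pred = p @ F        # sum over x_t of f(x_{t+1} | x_t) p(x_t | y_{1:t})
        p = lik * pred      # multiply by g(y_{t+1} | x_{t+1})
        p = p / p.sum()     # normalise (the recursion holds up to a constant)
        filtered.append(p)
    return np.array(filtered)

# Tiny two-state example with made-up numbers
post = finite_state_filter(
    pi0=np.array([0.5, 0.5]),
    F=np.array([[0.9, 0.1], [0.2, 0.8]]),
    obs_lik=np.array([[0.7, 0.1], [0.4, 0.5], [0.1, 0.8]]),
)
```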

9 Kalman Filter
Kalman (1960). Linear Gaussian state space model:
x_0 ~ N(x̂_0, P_0)
x_t = A x_{t-1} + w_t
y_t = H x_t + v_t
w_t ~ N(0, Q) iid, v_t ~ N(0, R) iid
A, H deterministic matrices; x̂_0, P_0, A, H, Q, R known.
Explicit computation and updating of the posterior is possible. Posterior: x_t | y_1, ..., y_t ~ N(x̂_t, P_t).
Time update: x̂_t⁻ = A x̂_{t-1}; P_t⁻ = A P_{t-1} Aᵀ + Q
Measurement update: K_t = P_t⁻ Hᵀ (H P_t⁻ Hᵀ + R)⁻¹; x̂_t = x̂_t⁻ + K_t (y_t - H x̂_t⁻); P_t = (I - K_t H) P_t⁻
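A compact sketch of the two updates (an addition; written for the general multivariate model above and then run on the simple example from the next slide):

```python
import numpy as np

def kalman_step(xhat, P, y, A, H, Q, R):
    """One Kalman filter step: time update followed by measurement update."""
    xpred = A @ xhat                    # xhat_t^- = A xhat_{t-1}
    Ppred = A @ P @ A.T + Q             # P_t^-    = A P_{t-1} A' + Q
    K = Ppred @ H.T @ np.linalg.inv(H @ Ppred @ H.T + R)  # Kalman gain K_t
    xhat = xpred + K @ (y - H @ xpred)                     # xhat_t
    P = (np.eye(len(xhat)) - K @ H) @ Ppred                # P_t
    return xhat, P

# Simple example from the next slide: x_t = 0.9 x_{t-1} + w_t, y_t = 2 x_t + v_t
rng = np.random.default_rng(2)
A, H, Q, R = (np.array([[v]]) for v in (0.9, 2.0, 1.0, 1.0))
xhat, P = np.zeros(1), np.array([[1.0]])
x = rng.normal()
for _ in range(20):
    x = 0.9 * x + rng.normal()
    xhat, P = kalman_step(xhat, P, np.array([2.0 * x + rng.normal()]), A, H, Q, R)
```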

10 Kalman Filter - Simple Example
x_0 ~ N(0, 1); x_t = 0.9 x_{t-1} + w_t; y_t = 2 x_t + v_t; w_t ~ N(0, 1) iid, v_t ~ N(0, 1) iid
[Figure: observations y_t, true x_t, posterior mean of x_t | y_1, ..., y_t, and 95% credible interval against t]

11 Kalman Filter - Remarks I
Updating is very easy; it only involves linear algebra. Very widely used.
A, H, Q and R can change with time.
A linear control input can be incorporated, i.e. the hidden state can evolve according to x_t = A x_{t-1} + B u_{t-1} + w_{t-1}, where u_t can be controlled.
The normal prior/updating/observations can be replaced by appropriate conjugate distributions.
Continuous-time version: the Kalman-Bucy filter; see Øksendal (2003) for a nice introduction.

12 Kalman Filter - Remarks II
Extended Kalman filter: extension to nonlinear dynamics
x_t = f(x_{t-1}, u_{t-1}, w_{t-1}), y_t = g(x_t, v_t),
where f and g are nonlinear functions. The extended Kalman filter linearises the nonlinear dynamics around the current mean and covariance. To do so it uses the Jacobian matrices, i.e. the matrices of partial derivatives of f and g with respect to their arguments. The extended Kalman filter no longer computes the exact posterior distribution.

13 Outline
Introduction
Particle Filtering: Introduction; Bootstrap Filter; Example - Tracking; Example - Stochastic Volatility; Example - Football; Theoretical Results
Improving the Algorithm
Further Topics
Summary

14 What to do if there were no observations...
[Figure: graphical model as on the Setup slide]
Without observations y_i the following simple approach would work:
- Sample N particles x_0^(1), ..., x_0^(N) ~ π(x_0) from the initial distribution.
- For every step, propagate each particle according to the transition kernel of the Markov chain: x_{i+1}^(j) ~ f(· | x_i^(j)), j = 1, ..., N.
After each step there are N particles that approximate the distribution of x_i.
Note: very easy to update to the next step.

15 Importance Sampling
Cannot sample from x_{0:t} | y_{1:t} directly. Main idea: change the density we are sampling from.
Interested in E(φ(x_{0:t}) | y_{1:t}) = ∫ φ(x_{0:t}) p(x_{0:t} | y_{1:t}) dx_{0:t}.
For any density h,
E(φ(x_{0:t}) | y_{1:t}) = ∫ φ(x_{0:t}) [p(x_{0:t} | y_{1:t}) / h(x_{0:t})] h(x_{0:t}) dx_{0:t}.
Thus an unbiased estimator of E(φ(x_{0:t}) | y_{1:t}) is
Î = (1/N) Σ_{i=1}^N φ(x_{0:t}^i) w_t^i,
where w_t^i = p(x_{0:t}^i | y_{1:t}) / h(x_{0:t}^i) and x_{0:t}^1, ..., x_{0:t}^N ~ h iid.
How to evaluate p(x_{0:t}^i | y_{1:t})? How to choose h? Can importance sampling be done recursively?

16 Sequential Importance Sampling I
Recursive definition and sampling of the importance sampling distribution:
h(x_{0:t}) = h(x_t | x_{0:t-1}) h(x_{0:t-1})
Can the weights be computed recursively? By Bayes' theorem:
w_t = p(x_{0:t} | y_{1:t}) / h(x_{0:t}) = p(y_{1:t} | x_{0:t}) p(x_{0:t}) / [h(x_{0:t}) p(y_{1:t})]
Hence,
w_t = g(y_t | x_t) p(y_{1:t-1} | x_{0:t-1}) f(x_t | x_{t-1}) p(x_{0:t-1}) / [h(x_t | x_{0:t-1}) h(x_{0:t-1}) p(y_{1:t})]
Thus,
w_t = w_{t-1} · [g(y_t | x_t) f(x_t | x_{t-1}) / h(x_t | x_{0:t-1})] · [p(y_{1:t-1}) / p(y_{1:t})]

19 Sequential Importance Sampling II
Can work with normalised weights w̄_t^i = w_t^i / Σ_j w_t^j; then one gets the recursion
w̄_t^i ∝ w̄_{t-1}^i · g(y_t | x_t^i) f(x_t^i | x_{t-1}^i) / h(x_t^i | x_{0:t-1}^i)
If one uses the prior distribution, h(x_0) = π(x_0) and h(x_t | x_{0:t-1}) = f(x_t | x_{t-1}), as the importance sampling distribution, then the recursion is simply
w̄_t^i ∝ w̄_{t-1}^i g(y_t | x_t^i)
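A minimal sketch of this prior-proposal recursion (an addition), run on the model of the next slide; working with log-weights delays, but cannot prevent, the degeneracy illustrated there.

```python
import numpy as np

rng = np.random.default_rng(3)

def sis_prior(y, N=100):
    """Sequential importance sampling with h = prior for the model
    x_0 ~ N(0,1), x_t ~ N(x_{t-1},1), y_t ~ N(x_t,1)."""
    x = rng.normal(size=N)             # x_0^i ~ pi
    logw = np.zeros(N)                 # log-weights, to avoid numerical underflow
    for yt in y:
        x = x + rng.normal(size=N)     # propagate with f, the proposal
        logw += -0.5 * (yt - x) ** 2   # w_t^i propto w_{t-1}^i g(y_t | x_t^i)
        w = np.exp(logw - logw.max())
        print("largest normalised weight:", (w / w.sum()).max())
    return x, logw

y = rng.normal() + np.cumsum(rng.normal(size=40))  # rough synthetic observations
sis_prior(y)
```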

20 Failure of Sequential Importance Sampling
Weights degenerate as t increases.
Example: x_0 ~ N(0, 1), x_{t+1} ~ N(x_t, 1), y_t ~ N(x_t, 1); N = 100 particles.
[Figure: empirical cdfs of the normalised weights w̄_t^1, ..., w̄_t^N at t = 1, 10, 20, 40; x-axis on log-scale from 1e-52 to 1e-08]
Note the log-scale on the x-axis. Most weights get very small.

21 Resampling
Goal: eliminate particles with very low weights.
Suppose Q = Σ_{i=1}^N w̄_t^i δ_{x_t^i} is the current approximation to the distribution of x_t. Then one can obtain a new approximation as follows: sample N iid particles x̃_t^i from Q; the new approximation is Q̃ = Σ_{i=1}^N (1/N) δ_{x̃_t^i}.
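In code this multinomial resampling step is essentially one call (an addition; a sketch):

```python
import numpy as np

def multinomial_resample(particles, weights, rng):
    """Sample N particles with replacement from the weighted approximation Q;
    the resampled set carries equal weights 1/N."""
    N = len(particles)
    idx = rng.choice(N, size=N, p=weights)
    return particles[idx], np.full(N, 1.0 / N)
```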

22 The Bootstrap Filter
1. Sample x_0^(i) ~ π(x_0) for i = 1, ..., N and set t = 1.
2. Importance sampling step: for i = 1, ..., N, sample x̃_t^(i) ~ f(x_t | x_{t-1}^(i)) and set x̃_{0:t}^(i) = (x_{0:t-1}^(i), x̃_t^(i)); evaluate the importance weights w_t^(i) = g(y_t | x̃_t^(i)).
3. Selection step: resample with replacement N particles (x_{0:t}^(i); i = 1, ..., N) from the set {x̃_{0:t}^(1), ..., x̃_{0:t}^(N)} according to the normalised importance weights w_t^(i) / Σ_{j=1}^N w_t^(j). Set t := t + 1 and go to step 2.
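Putting the pieces together, a minimal bootstrap filter for a generic model (an addition; this sketch tracks only the current states x_t^(i) and the filtering mean, not whole paths x_{0:t}^(i)):

```python
import numpy as np

def bootstrap_filter(y, N, sample_prior, sample_transition, loglik, rng):
    """Bootstrap particle filter; returns estimates of E(x_t | y_{1:t})."""
    x = sample_prior(N)
    means = []
    for yt in y:
        x = sample_transition(x)              # importance sampling step
        logw = loglik(yt, x)                  # w_t^(i) = g(y_t | x_t^(i))
        w = np.exp(logw - logw.max())
        w /= w.sum()
        means.append(np.sum(w * x))           # weighted estimate before resampling
        x = x[rng.choice(N, size=N, p=w)]     # selection step
    return np.array(means)

# Run on the random-walk example: x_0 ~ N(0,1), x_t ~ N(x_{t-1},1), y_t ~ N(x_t,1)
rng = np.random.default_rng(4)
x_true = rng.normal() + np.cumsum(rng.normal(size=50))
y = x_true + rng.normal(size=50)
means = bootstrap_filter(
    y, N=1000,
    sample_prior=lambda N: rng.normal(size=N),
    sample_transition=lambda x: x + rng.normal(size=x.size),
    loglik=lambda yt, x: -0.5 * (yt - x) ** 2,
    rng=rng,
)
```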

23 Illustration of the Bootstrap Filter
[Figure: N = 10 particles propagated, weighted by g(y_1 | x), g(y_2 | x), g(y_3 | x), and resampled over three steps]

24 Example - Bearings Only Tracking
N = particles.
[Figure: true position, estimated position, and observer for the bearings-only tracking example]

25 Example - Stochastic Volatility
Bootstrap particle filter with N = 1000.
[Figure: true volatility, posterior mean volatility, and 95% credible interval against t]

26 Example: Football
Data: Premier League 2007/08.
x_{t,j}: strength of the jth team at time t, j = 1, ..., 20; y_t: results of the games on date t.
Note: not time-homogeneous (different teams playing one another; different time intervals between games).
Model:
- Initial distribution of the strength: x_{0,j} ~ N(0, 1)
- Evolution of strength over a gap of length Δ: x_{t+Δ,j} ~ N((1 - Δ/β)^{1/2} x_{t,j}, Δ/β); will use β = 50
- Result of games conditional on strength: in a match between team H of strength x_H (at home) and team A of strength x_A, the goals scored by the home team are Poisson(λ_H exp(x_H - x_A)) and the goals scored by the away team are Poisson(λ_A exp(x_A - x_H)).
λ_H and λ_A are constants chosen based on the average number of goals scored at home/away.
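A sketch of the observation model for one match (an addition; λ_H = 1.5 and λ_A = 1.1 are illustrative stand-ins for the constants fitted from average goal counts):

```python
import numpy as np

rng = np.random.default_rng(5)

def simulate_match(x_home, x_away, lam_h=1.5, lam_a=1.1):
    """Goals in a match between a home team of strength x_home and an
    away team of strength x_away."""
    goals_home = rng.poisson(lam_h * np.exp(x_home - x_away))
    goals_away = rng.poisson(lam_a * np.exp(x_away - x_home))
    return goals_home, goals_away

print(simulate_match(0.3, -0.2))   # a stronger home side against a weaker one
```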

27 Mean Team Strength
[Figure: posterior mean strength over time for all 20 teams: Arsenal, Aston Villa, Birmingham, Blackburn, Bolton, Chelsea, Derby, Everton, Fulham, Liverpool, Man City, Man United, Middlesbrough, Newcastle, Portsmouth, Reading, Sunderland, Tottenham, West Ham, Wigan; based on N = particles]

28 League Table at the end of 2007/08
1 Man Utd 87; 2 Chelsea 85; 3 Arsenal 83; 4 Liverpool 76; 5 Everton 65; 6 Aston Villa 60; 7 Blackburn 58; 8 Portsmouth 57; 9 Manchester City; 10 West Ham Utd; 11 Tottenham; 12 Newcastle; 13 Middlesbrough; 14 Wigan Athletic; 15 Sunderland; 16 Bolton; 17 Fulham; 18 Reading; 19 Birmingham; 20 Derby County 11

29 Influence of the Number N of Particles
[Figure: posterior mean strengths of Arsenal, Aston Villa, Birmingham, Blackburn, Bolton, Chelsea, Derby, Everton, Fulham and Liverpool, computed with N = 100, N = 1000, N = 10000, and a larger N]

30 Theoretical Results
Convergence results are as N → ∞: laws of large numbers, central limit theorems; see e.g. Chopin (2004).
The central limit theorems yield an asymptotic variance, which can be used for theoretical comparisons of algorithms.

31 Outline
Introduction
Particle Filtering
Improving the Algorithm: General Proposal Distribution; Improving Resampling
Further Topics
Summary

32 General Proposal Distribution
Algorithm with a general proposal distribution h:
1. Sample x_0^(i) ~ π(x_0) for i = 1, ..., N and set t = 1.
2. Importance sampling step: for i = 1, ..., N, sample x̃_t^(i) ~ h_t(x_t | x_{t-1}^(i)) and set x̃_{0:t}^(i) = (x_{0:t-1}^(i), x̃_t^(i)); evaluate the importance weights
w_t^(i) = g(y_t | x̃_t^(i)) f(x̃_t^(i) | x_{t-1}^(i)) / h_t(x̃_t^(i) | x_{t-1}^(i)).
3. Selection step: resample with replacement N particles (x_{0:t}^(i); i = 1, ..., N) from the set {x̃_{0:t}^(1), ..., x̃_{0:t}^(N)} according to the normalised importance weights w_t^(i) / Σ_{j=1}^N w_t^(j). Set t := t + 1 and go to step 2.
The optimal proposal distribution depends on the quantity to be estimated. A generic choice: choose the proposal to minimise the variance of the normalised weights.

33 Improving Resampling
Resampling was introduced to remove particles with low weights.
Downside: it adds variance.

34 Other Types of Resampling
Goal: reduce the additional variance in the resampling step.
Standardised weights W^1, ..., W^N; N^i = number of offspring of the ith element. Need E(N^i) = N W^i for all i. Want to minimise the resulting variance of the weights.
- Multinomial resampling: resampling with replacement.
- Systematic resampling: sample U_1 ~ U(0, 1/N); let U_i = U_1 + (i-1)/N, i = 2, ..., N; set N^i = #{j : Σ_{k=1}^{i-1} W^k ≤ U_j < Σ_{k=1}^i W^k}.
- Residual resampling: guarantee at least Ñ^i = ⌊N W^i⌋ offspring of the ith element; let N̄^1, ..., N̄^N be a multinomial sample of N - Σ_i Ñ^i items with weights W̄^i ∝ W^i - Ñ^i/N; set N^i = Ñ^i + N̄^i.
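Sketches of the latter two schemes (an addition); both return the indices of the offspring, so the particle array can simply be re-indexed:

```python
import numpy as np

def systematic_resample(W, rng):
    """Systematic resampling: one uniform U_1 ~ U(0, 1/N), then a fixed grid."""
    N = len(W)
    u = rng.uniform(0.0, 1.0 / N) + np.arange(N) / N   # U_1, ..., U_N
    return np.searchsorted(np.cumsum(W), u)            # offspring indices

def residual_resample(W, rng):
    """Residual resampling: floor(N W^i) guaranteed offspring, rest multinomial."""
    N = len(W)
    counts = np.floor(N * W).astype(int)               # guaranteed offspring
    M = N - counts.sum()                               # how many are still needed
    if M > 0:
        resid = N * W - counts                         # leftover weights
        extra = rng.choice(N, size=M, p=resid / resid.sum())
        counts += np.bincount(extra, minlength=N)
    return np.repeat(np.arange(N), counts)
```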

35 Adaptive Resampling
Resampling was introduced to remove particles with low weights. However, resampling introduces additional randomness to the algorithm.
Idea: only resample when the weights are too uneven. This can be assessed by computing the variance of the weights and comparing it to a threshold. Equivalently, one can compute the effective sample size (ESS):
ESS = (Σ_{i=1}^N (w̄_t^i)²)⁻¹,
where w̄_t^1, ..., w̄_t^N are the normalised weights.
Intuitively, the effective sample size describes how many iid samples from the target distribution would be roughly equivalent to importance sampling with the weights w̄_t^i. Thus one could decide to resample only if ESS < k, where k can be chosen e.g. as k = N/2.
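The ESS is one line, and the adaptive rule a few more (an addition; the threshold N/2 below is the rule of thumb from the slide):

```python
import numpy as np

def ess(normalised_weights):
    """Effective sample size: N for uniform weights, 1 if one weight has all mass."""
    return 1.0 / np.sum(normalised_weights ** 2)

def maybe_resample(particles, w, rng):
    """Adaptive resampling: only resample when the ESS drops below N/2."""
    N = len(w)
    if ess(w) < N / 2:
        particles = particles[rng.choice(N, size=N, p=w)]
        w = np.full(N, 1.0 / N)
    return particles, w
```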

36 Outline
Introduction
Particle Filtering
Improving the Algorithm
Further Topics: Path Degeneracy; Smoothing; Parameter Estimates
Summary

37 Path Degeneracy
Let s ∈ ℕ. Then #{x_{0:s}^i : i = 1, ..., N} → 1 as the number of steps t → ∞.
Example: x_0 ~ N(0, 1), x_{t+1} ~ N(x_t, 1), y_t ~ N(x_t, 1); N = 100 particles.
[Figure: surviving particle paths at t = 20, t = 40, and a later t; the early parts of the paths collapse onto a single ancestor]

38 Particle Smoothing
Estimate the distribution of the state x_t given all the observations y_1, ..., y_τ up to some later time τ > t.
Intuitively, better estimation should be possible than with filtering (where only information up to τ = t is available). Trajectory estimates tend to be smoother than those obtained by filtering.
More sophisticated algorithms are needed.

39 Filtering and Parameter Estimates
Recall that the model is given by
- π(x_0): the initial distribution
- f(x_t | x_{t-1}) for t ≥ 1: the transition kernel of the Markov chain
- g(y_t | x_t) for t ≥ 1: the distribution of the observations
In practical applications these distributions will not be known explicitly; they will themselves depend on unknown parameters.
Two different starting points:
- Bayesian point of view: parameters have some prior distribution.
- Frequentist point of view: parameters are unknown constants.
Examples:
- Stochastic volatility: x_t ~ N(α x_{t-1}, σ²/(1-α²)), x_1 ~ N(0, σ²/(1-α²)), y_t ~ N(0, β² exp(x_t)); unknown parameters: σ, β, α.
- Football: unknown parameters β, λ_H, λ_A.
How to estimate these unknown parameters?

40 Maximum Likelihood Approach
Let θ ∈ Θ contain all unknown parameters in the model. One would need the marginal density p_θ(y_{1:t}). It can be estimated by running the particle filter for each θ of interest and multiplying the averages of the unnormalised weights; see the slides Sequential Importance Sampling I/II earlier in the lecture for some intuition.
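A sketch of this estimate (an addition): each step's average unnormalised weight estimates p_θ(y_t | y_{1:t-1}), and the product over steps estimates p_θ(y_{1:t}). For this to be a likelihood, loglik must return the full log density g(y_t | x_t), including constants.

```python
import numpy as np

def log_marginal_lik(y, N, sample_prior, sample_transition, loglik, rng):
    """Bootstrap-filter estimate of log p_theta(y_{1:T}): sum over t of the
    log of the average unnormalised weight (a product on the natural scale)."""
    x = sample_prior(N)
    total = 0.0
    for yt in y:
        x = sample_transition(x)
        logw = loglik(yt, x)
        m = logw.max()
        total += m + np.log(np.mean(np.exp(logw - m)))  # log of average weight
        w = np.exp(logw - m)
        x = x[rng.choice(N, size=N, p=w / w.sum())]     # resample before next step
    return total

# Evaluate on a grid of candidate theta values and compare the estimates, e.g.
# log_marginal_lik(y, 1000, prior_theta, transition_theta, loglik_theta, rng)
```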

41 Artificial(?) Random Walk Dynamics
Allow the parameters to change with time; give them some dynamics. More precisely, suppose we have a parameter vector θ:
- Allow it to depend on time (θ_t); assign dynamics to it, i.e. a prior distribution and some transition probability from θ_{t-1} to θ_t.
- Incorporate θ_t in the state vector x_t.
This may be reasonable in the stochastic volatility and football examples.

42 Bayesian Parameters
Prior on θ. Want: the posterior p(θ | y_1, ..., y_t).
Naive approach: incorporate θ into x_t; the transition for these components is just the identity. Resampling will lead to θ degenerating: after a moderate number of steps only a few (or even one) values of θ will be left.
Newer approaches:
- Particle filter within an MCMC algorithm, Andrieu et al. (2010); computationally very expensive.
- SMC² by Chopin, Jacob, Papaspiliopoulos, arXiv:

43 Outline
Introduction
Particle Filtering
Improving the Algorithm
Further Topics
Summary: Concluding Remarks

44 Concluding Remarks
Active research area.
A collection of references/resources regarding SMC: http://
Collection of application-oriented articles: Doucet et al. (2001).
Brief introduction to SMC: Robert & Casella (2004, Chapter 14).
R package implementing several methods: pomp.

45 Part I: Appendix - References

46 References I
Andrieu, C., Doucet, A. & Holenstein, R. (2010). Particle Markov chain Monte Carlo methods. Journal of the Royal Statistical Society: Series B (Statistical Methodology) 72.
Chopin, N. (2004). Central limit theorem for sequential Monte Carlo methods and its application to Bayesian inference. Annals of Statistics 32.
Doucet, A., de Freitas, N. & Gordon, N. (eds.) (2001). Sequential Monte Carlo Methods in Practice. Springer.
Doucet, A. & Johansen, A. M. (2008). A tutorial on particle filtering and smoothing: fifteen years later. Available at
Gordon, N., Salmond, D. & Smith, A. (1993). Novel approach to nonlinear/non-Gaussian Bayesian state estimation. IEE Proceedings F: Radar and Signal Processing 140.
Kalman, R. (1960). A new approach to linear filtering and prediction problems. Journal of Basic Engineering 82.

47 References II
Øksendal, B. (2003). Stochastic Differential Equations: An Introduction with Applications. Springer.
Robert, C. & Casella, G. (2004). Monte Carlo Statistical Methods. Second ed., Springer.
