An introduction to particle filters

An introduction to particle filters
Andreas Svensson
Department of Information Technology, Uppsala University
June 10, 2014

Outline

Motivation and ideas. Algorithm and high-level Matlab code. Practical aspects: resampling, computational complexity, software, terminology. Advanced topics: convergence, extensions. References.

State Space Models

Linear Gaussian state space model:
x_{t+1} = A x_t + B u_t + w_t
y_t = C x_t + D u_t + e_t
with w_t and e_t Gaussian, E[w_t w_t^T] = Q and E[e_t e_t^T] = R.

A more general state space model, in different notation:
x_{t+1} ~ f(x_{t+1} | x_t, u_t)
y_t ~ g(y_t | x_t, u_t)
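
For concreteness, a minimal Matlab sketch of simulating the linear Gaussian model (not from the slides; the scalar parameter values below are illustrative assumptions):

% Simulate the linear Gaussian state space model above (illustrative values).
A = 0.9; B = 1; C = 1; D = 0; Q = 0.5; R = 1;    % assumed scalar parameters
T = 50; u = zeros(T,1);                          % no input, for simplicity
x = zeros(T+1,1); y = zeros(T,1);                % x(1) = 0 is the initial state
for t = 1:T
    y(t)   = C*x(t) + D*u(t) + sqrt(R)*randn;    % y_t = C x_t + D u_t + e_t
    x(t+1) = A*x(t) + B*u(t) + sqrt(Q)*randn;    % x_{t+1} = A x_t + B u_t + w_t
end

The simulated y is what a filter later gets to see; the simulated x stays hidden.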

Filtering: find p(x_t | y_{1:t})

Linear Gaussian problems: the Kalman filter! (''Exact solution to the right problem'')
Nonlinear problems: EKF, UKF (''exact solution to the wrong problem''), or the particle filter (''approximate solution to the right problem'').
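
As a point of reference, a minimal Kalman filter sketch for the scalar model simulated above (reusing A, B, C, D, Q, R, u, y and T from the previous sketch; the initial mean and variance are assumed):

% Kalman filter: exact p(x_t | y_{1:t}) for the linear Gaussian model.
m = 0; P = 1;                         % assumed prior mean and variance of x_1
for t = 1:T
    S = C*P*C' + R;                   % innovation variance
    K = P*C'/S;                       % Kalman gain
    m = m + K*(y(t) - C*m - D*u(t));  % measurement update: mean of x_t | y_{1:t}
    P = P - K*C*P;                    % measurement update: variance of x_t | y_{1:t}
    m = A*m + B*u(t);                 % time update: predict x_{t+1}
    P = A*P*A' + Q;
end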

Idea

(Figures, slide 5/16: the true state trajectory of a linear system plotted against time; the task is to find p(x_t | y_{1:t}).)

The Kalman filter mean and covariance define a Gaussian distribution at each t. A numerical approximation can instead be used to describe the distribution - the particle filter. The particle filter can easily handle, e.g., non-Gaussian and multimodal hypotheses.

Idea: Generate a lot of hypotheses about x_t, keep the most likely ones and propagate them further to x_{t+1}. Keep the likely hypotheses about x_{t+1}, propagate them again to x_{t+2}, and so on.

Algorithm

0. Initialize x_1^i ~ p(x_1) and w^i = 1/N.
for t = 1 to T
   1. Evaluate w_t^i = g(y_t | x_t^i, u_t).
   2. Resample {x_t^i}_{i=1}^N from {x_t^i, w_t^i}_{i=1}^N.
   3. Propagate: sample x_{t+1}^i from f(x_{t+1} | x_t^i, u_t).
end
(N: number of particles, x_t^i: particles, w_t^i: particle weights.)

High-level Matlab code:
x(:,1) = random(i_dist,N,1); w(:,1) = ones(N,1)/N;   % 0: initialize
for t = 1:T
    w(:,t) = pdf(m_dist, y(t) - g(x(:,t)));          % 1: evaluate weights
    w(:,t) = w(:,t)/sum(w(:,t));
    % 2: resample x(:,t)
    x(:,t+1) = f(x(:,t),u(t)) + random(t_dist,N,1);  % 3: propagate
end
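
To make the listing above concrete, here is a self-contained bootstrap particle filter sketch for a simple nonlinear scalar model; the model, the parameter values and the use of multinomial resampling at every step are illustrative choices, not taken from the slides:

% Illustrative nonlinear model (assumed for this sketch):
%   x_{t+1} = 0.5 x_t + 8 cos(t) + w_t,  w_t ~ N(0,1)
%   y_t     = 0.1 x_t^2 + e_t,           e_t ~ N(0,1)
N = 500; T = 50;
f = @(x,t) 0.5*x + 8*cos(t);                 % state transition mean
g = @(x)   0.1*x.^2;                         % measurement function
xTrue = zeros(T,1); y = zeros(T,1);          % simulate data from the model
for t = 1:T
    y(t) = g(xTrue(t)) + randn;
    if t < T, xTrue(t+1) = f(xTrue(t),t) + randn; end
end
x = randn(N,1);                              % 0: x_1^i ~ p(x_1) = N(0,1)
xhat = zeros(T,1);
for t = 1:T
    w = exp(-0.5*(y(t) - g(x)).^2);          % 1: w_t^i proportional to g(y_t | x_t^i)
    w = w/sum(w);                            %    normalize
    xhat(t) = w'*x;                          %    filtered mean estimate of x_t
    wc = cumsum(w); wc(N) = 1;               % 2: multinomial resampling
    ind = zeros(N,1);                        %    (guard wc(N) against round-off)
    for i = 1:N, ind(i) = find(wc >= rand, 1, 'first'); end
    x = x(ind);
    x = f(x,t) + randn(N,1);                 % 3: propagate by sampling from f
end

Plotting xhat against xTrue gives a quick visual check that the filter tracks the state.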

(Slides 8/16: the algorithm is then illustrated step by step over a sequence of animation frames; the figures plot the particles as state x against time while they are weighted, resampled and propagated.)

Resampling

Represent the information contained in the N black dots (of different sizes) with N red dots (of equal sizes). This can be seen as sampling from a categorical distribution, and is done to avoid particle depletion (''a lot of zero-weight particles'').

Some Matlab code:
v = rand(N,1); wc = cumsum(w(:,t));
[~,ind1] = sort([v;wc]);
ind = find(ind1<=N) - (0:N-1)';
x(:,t) = x(ind,t); w(:,t) = ones(N,1)/N;

Resampling cont'd

Resampling was a crucial step in the developments of the 1990s that made particle filters useful in practice. Many different techniques exist, with different properties and efficiency (the Matlab code shown above is only one way of doing it).

Resampling is computationally heavy, and it is not necessary in every iteration: a ''depletion measure'' can be introduced and the resampling performed only when it reaches a certain threshold, as in the sketch below. It is sometimes preferable to work with the logarithms of the weights for numerical reasons.
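
A common choice of depletion measure is the effective sample size N_eff = 1/Σ_i (w_t^i)^2. A minimal sketch, assuming normalized weights w and particles x (both N-by-1); the N/2 threshold and the systematic resampling scheme are illustrative choices, not prescribed by the slides:

Neff = 1/sum(w.^2);                 % effective sample size ('depletion measure')
if Neff < N/2                       % assumed threshold
    u  = (rand + (0:N-1)')/N;       % systematic resampling: N evenly spaced points
    wc = cumsum(w); wc(N) = 1;      % guard against round-off in the last bin
    ind = zeros(N,1); j = 1;
    for i = 1:N
        while wc(j) < u(i), j = j + 1; end
        ind(i) = j;
    end
    x = x(ind);
    w = ones(N,1)/N;                % equal weights after resampling
end

If resampling is skipped, the weights must be kept and multiplied into the next weight update instead of being reset to 1/N.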

Computational complexity

In theory O(N T n_x^2), where n_x is the number of states. (The Kalman filter is approximately of order O(T n_x^3).)

Possible bottlenecks: the resampling step, the likelihood evaluation (for the weight update), and sampling from f for the propagation (trick: use proposal distributions!).

Software

Lots of software packages exist; however, to my knowledge, there is no package that ''everybody'' uses. (In our research, we write our own code.)

A few relevant existing packages: Python: pyparticleest. Matlab: PFToolbox, PFLib. C++: Particle++. Disclaimer: I have not used any of these packages myself.

Terminology

Bootstrap particle filter: the ''standard'' particle filter. Sequential Monte Carlo (SMC): another name for particle filtering.

Convergence

The particle filter is ''exact for N = ∞''.

Consider a function g(x_t) of interest. How well can it be approximated by ĝ(x_t) when x_t is estimated with a particle filter?

E[(ĝ(x_t) - E[g(x_t)])^2] ≤ p_t ||g(x_t)||_sup / N

If the system ''forgets exponentially fast'' (e.g. linear systems), and under some additional weak assumptions, p_t = p < ∞, i.e.,

E[(ĝ(x_t) - E[g(x_t)])^2] ≤ C/N
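
A quick empirical sanity check of the 1/N rate, using a scalar linear Gaussian model so that the Kalman filter provides the exact filtered mean as a reference (the model, the values of N and the number of Monte Carlo runs are all illustrative assumptions):

% Check that N * MSE stays roughly constant as N grows (O(1/N) convergence).
A = 0.9; Q = 1; R = 1; T = 20;
x = zeros(T,1); y = zeros(T,1);
for t = 1:T                                   % simulate one data set
    y(t) = x(t) + sqrt(R)*randn;
    if t < T, x(t+1) = A*x(t) + sqrt(Q)*randn; end
end
m = 0; P = 1;                                 % exact reference: Kalman filter
for t = 1:T
    K = P/(P + R); m = m + K*(y(t) - m); P = P - K*P;   % measurement update
    if t < T, m = A*m; P = A*P*A' + Q; end              % time update
end
for N = [50 200 800 3200]                     % particle filter MSE vs N
    err = zeros(100,1);
    for run = 1:100
        xp = randn(N,1);                      % p(x_1) = N(0,1), as in the KF
        for t = 1:T
            w = exp(-0.5*(y(t) - xp).^2/R); w = w/sum(w);
            xhat = w'*xp;                     % estimate of E[x_t | y_{1:t}]
            wc = cumsum(w); wc(N) = 1;
            ind = zeros(N,1);
            for i = 1:N, ind(i) = find(wc >= rand, 1, 'first'); end
            xp = xp(ind);
            if t < T, xp = A*xp + sqrt(Q)*randn(N,1); end
        end
        err(run) = (xhat - m)^2;              % compare with the exact KF mean at t = T
    end
    fprintf('N = %4d   N * MSE = %.3f\n', N, N*mean(err));
end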

Extensions

Smoothing: finding p(x_t | y_{1:T}) (marginal smoothing) or p(x_{1:t} | y_{1:T}) (joint smoothing) instead of p(x_t | y_{1:t}) (filtering). Offline (y_{1:T} has to be available), with increased computational load.

If f(x_{t+1} | x_t, u_t) is not suitable to sample from, proposal distributions can be used. In fact, an optimal proposal (w.r.t. reduced variance) exists.

Rao-Blackwellization for mixed linear/nonlinear models.

System identification: PMCMC, SMC^2.

References

Gustafsson, F. (2010). Particle filter theory and practice with positioning applications. IEEE Aerospace and Electronic Systems Magazine, 25(7), 53-82.
Schön, T. B., & Lindsten, F. Learning of dynamical systems - Particle filters and Markov chain methods. Draft available.
Doucet, A., & Johansen, A. M. (2009). A tutorial on particle filtering and smoothing: Fifteen years later. Handbook of Nonlinear Filtering, 12, 656-704.

Homework: Implement your own particle filter for any (simple) problem of your choice!