Introduction to Particle Filters for Data Assimilation


Mike Dowd, Dept of Mathematics & Statistics (and Dept of Oceanography), Dalhousie University, Halifax, Canada

STATMOS Summer School in Data Assimilation, Boulder, Colorado, May 18-25, 2015

Problem Statement

Given noisy time series observations on the system state, y_{1:T} = (y_1, ..., y_T), and a discrete-time stochastic dynamic model describing the time evolution of the system state,

x_t = θ x_{t-1} + n_t

(e.g. a bivariate AR(1) process), we want to estimate the system state (and sometimes parameters).
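As a concrete illustration, here is a minimal R sketch of this kind of model: a bivariate AR(1) process observed with additive noise. The coefficient matrix and noise scales are illustrative choices, not values from the lecture.

set.seed(1)
T <- 200
theta <- matrix(c(0.95, 0.1, -0.1, 0.95), 2, 2)       # stable AR(1) coefficient matrix
x <- matrix(0, T, 2)                                  # latent states
y <- matrix(0, T, 2)                                  # noisy observations
for (t in 2:T) {
  x[t, ] <- theta %*% x[t - 1, ] + rnorm(2, sd = 0.5) # process noise n_t
  y[t, ] <- x[t, ] + rnorm(2, sd = 1.0)               # observation noise
}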

Noisy Observations of System State

[Figure: observations of the state plotted against time.]

Stochastic Dynamics governing System State

[Figure: 10 realizations of the system state plotted against time.]

Estimation of System State by Particle Filters

[Figure: particle filter estimate of the state plotted against time.]

Target: Evolution of the Probability Distribution of the State through Time

[Figure: the state distribution evolving through time.]

A General Framework for DA

Hierarchical Bayesian Model:

p(x_{1:T}, θ | y_{1:T}) ∝ p(y_{1:T} | x_{1:T}, θ) p(x_{1:T} | θ) p(θ)

i.e. target (posterior) ∝ measurement × dynamics × prior.

Special Cases in DA:
- Parameter estimation (no model error)
- Filtering solution for state *
- Smoothing solution for state
- Joint state & parameter estimation
- Inference on dynamic model structure

Data Assimilation: A Statistical Framework

State Space Model (for t = 1, ..., T):

x_t = d(x_{t-1}, θ, e_t)
y_t = h(x_t, φ, ν_t)

or, equivalently,

x_t ~ p(x_t | x_{t-1}, θ)
y_t | x_t ~ p(y_t | x_t, φ)

The general goal is to make inferences about the state x_t and sometimes the static parameters* θ and φ**. We use explicit priors on the process noise and observation errors (i.e. parametric distributions for e_t and ν_t). Assume the y_t are conditionally independent given x_t, and that e_t and ν_t are independent.

For DA, we want to do space-time problems: space is incorporated in x_t. Solutions involve sequential Monte Carlo methods for nonlinear and non-Gaussian state space models.

* Dynamic parameters are part of the state.
** We drop φ hereafter, and defer θ.

Problem Classes

We want to estimate the system state x_t at time t, given the observations up to time s, y_{1:s} = (y_1, y_2, ..., y_s):

A. Filtering (Nowcast): s = t
B. Forecasting (Prediction): s > t
C. Smoothing (Hindcast): s < t

[Figure: timeline schematic showing the estimation time relative to where data are and are not available.]

Sequential Estimation

A single-stage transition of the system from time t-1 to time t:

nowcast at time t-1:  p(x_{t-1}, θ | y_{1:t-1})
  -- prediction, x_t = d(x_{t-1}, θ, e_t) -->
forecast at time t:   p(x_t, θ | y_{1:t-1})
  -- measurement update with y_t -->
nowcast at time t:    p(x_t, θ | y_{1:t})

We'll drop the parameters θ for now.

Probabilistic Solution for Filtering

A single-stage transition of the system from time t-1 to t involves:

Prediction:

p(x_t | y_{1:t-1}) = ∫ p(x_t | x_{t-1}) p(x_{t-1} | y_{1:t-1}) dx_{t-1}

Measurement update:

p(x_t | y_{1:t}) = p(y_t | x_t) p(x_t | y_{1:t-1}) / p(y_t | y_{1:t-1})

General Filtering Solution:

p(x_t | y_{1:t}) ∝ p(y_t | x_t) ∫ p(x_t | x_{t-1}) p(x_{t-1} | y_{1:t-1}) dx_{t-1}

Do this for t = 1, ..., T, given p(x_0) and y_{1:T} = (y_1, y_2, ..., y_T).

Sampling Based Estimation

Let {x_{t|t}^{(i)}, w_t^{(i)}}, i = 1, ..., n, be a random sample* drawn from the target distribution p(x_t | y_{1:t}) (the filter density), where the x_{t|t}^{(i)} are the support points and the w_t^{(i)} are the normalized weights. The filter density can then be approximated as

p(x_t | y_{1:t}) ≈ Σ_{i=1}^{n} w_t^{(i)} δ(x_t - x_{t|t}^{(i)})

* Referred to as an ensemble, made up of a collection of particles (the sample members).
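A minimal R sketch of this weighted-ensemble approximation (all numbers illustrative): given support points and normalized weights, posterior summaries such as the mean follow directly.

set.seed(2)
n <- 250
xp <- rnorm(n, mean = 0, sd = 2)    # support points x_{t|t}^{(i)}
w <- dnorm(1.5, mean = xp, sd = 1)  # unnormalized weights from a likelihood
w <- w / sum(w)                     # normalized weights w_t^{(i)}
post_mean <- sum(w * xp)            # E[x_t | y_{1:t}] under the approximation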

[Figure: histograms of (a) unweighted ensemble, n = 25; (b) weighted ensemble, n = 25; (c) unweighted ensemble, n = 250; (d) weighted ensemble, n = 250.]

Sequential Importance Sampling (SIS)

At time t-1 we have:
- {x_{t-1|t-1}^{(i)}, w_{t-1}^{(i)}}, draws from p(x_{t-1} | y_{1:t-1})
- y_t, an observation

If a candidate x_{t|t}^{(i)} is drawn from a proposal q(x_{t|t}), then

w_t^{(i)} ∝ p(x_{t|t}^{(i)} | y_{1:t}) / q(x_{t|t}^{(i)})

or

w_t^{(i)} ∝ w_{t-1}^{(i)} p(y_t | x_{t|t}^{(i)}) p(x_{t|t}^{(i)} | x_{t-1|t-1}^{(i)}) / q(x_{t|t}^{(i)})

A draw from p(x_t | y_{1:t}) is then available as {x_{t|t}^{(i)}, w_t^{(i)}} (see Arulampalam et al. 2002).
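As an illustration, here is a minimal R sketch of one SIS step for an assumed one-dimensional linear-Gaussian model, with a Gaussian proposal centered on the observation; all names (theta, sig_e, sig_v, sig_q) are illustrative, not from the slides.

sis_step <- function(xp, w, yt, theta = 0.9, sig_e = 0.5, sig_v = 1.0, sig_q = 1.0) {
  xc <- rnorm(length(xp), mean = yt, sd = sig_q)   # candidates from proposal q
  lik <- dnorm(yt, mean = xc, sd = sig_v)          # p(y_t | x_t^{(i)})
  pri <- dnorm(xc, mean = theta * xp, sd = sig_e)  # p(x_t^{(i)} | x_{t-1}^{(i)})
  q <- dnorm(xc, mean = yt, sd = sig_q)            # q(x_t^{(i)})
  w_new <- w * lik * pri / q                       # SIS weight update
  list(x = xc, w = w_new / sum(w_new))             # renormalized weighted sample
}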

SIS Remarks

SIS solves the filtering problem for the nonlinear, non-Gaussian state space model via an algorithm for sequential online estimation of the system state, relying on a proposal and a weight update. A common simplification is to let the proposal be the prior (the predictive density),

q(x_{t|t}) = p(x_t | x_{t-1}),

leading to

w_t^{(i)} ∝ w_{t-1}^{(i)} p(y_t | x_{t|t-1}^{(i)})

The main practical problem with SIS is weight collapse, wherein after a few time steps all the weight becomes concentrated on a few particles. How to fix this?

Sequential Importance Resampling (SIR)

The major breakthrough in sequential Monte Carlo methods (Gordon et al. 1993) was to incorporate a resampling step to obviate weight collapse. Specifically, for each time t:

- {x_{t|t-1}^{(i)}, w_t^{(i)}} is a sample from p(x_t | y_{1:t-1})
- We resample (with replacement) the x_{t|t-1}^{(i)} with probability proportional to w_t^{(i)}, i.e. a weighted bootstrap (a sketch of this step follows below)
- This yields a modified sample {x_{t|t}^{(i)}}, wherein particles have been both killed and replicated, and the weights are now equal, i.e. w_t^{(i)} = 1/n for all i

Remaining issue: sample impoverishment, where some particles are replicated often in the unweighted sample {x_{t|t}^{(i)}}. Less severe than weight collapse (but of course related).
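A sketch of this weighted-bootstrap resampling step in R, for a vector of particles xp with weights w (illustrative names):

resample <- function(xp, w) {
  idx <- sample(seq_along(xp), size = length(xp), replace = TRUE, prob = w)
  xp[idx]  # equally weighted sample: some particles killed, others replicated
}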

Basic Particle Filter Algorithm

Given: (1) dynamical model, (2) observations, (3) initial conditions

for t = 1 to T:

  (a) Prediction: {x_{t-1|t-1}^{(i)}} → {x_{t|t-1}^{(i)}} as

      x_{t|t-1}^{(i)} = d(x_{t-1|t-1}^{(i)}, e_t^{(i)}) for i = 1, ..., n

  (b) Measurement: {x_{t|t-1}^{(i)}} → {x_{t|t}^{(i)}} as

      w_t^{(i)} ∝ p(y_t | x_{t|t-1}^{(i)}) for i = 1, ..., n

      resampling with replacement {x_{t|t-1}^{(i)}} using the weights w_t^{(i)} then yields {x_{t|t}^{(i)}}

end (for t)

Particle Filter Schematic

[Figure: schematic of the particle filter prediction and resampling steps.]

Interpretation: Particle History

[Figure: particle trajectories (state vs. time), showing killing and replication across resampling steps.]

R Code for a Basic Particle Filter

library(mvtnorm)  # for dmvnorm

for (t in 2:T) {
  # prediction step from t-1 to t
  xf <- t(apply(xa, 1, dynamicmodel))
  # extract measurement at time t
  yobs <- matrix(y[t, ], 1, ncol(y))
  # assign weights: observation likelihood of each predicted particle
  # (only the first state component is observed here)
  w <- apply(matrix(xf[, 1], np, 1), 1, dmvnorm, x = yobs, sigma = r)
  # weighted resampling with replacement
  m <- sample(1:np, size = np, prob = w, replace = TRUE)
  xa <- xf[m, ]
}
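For completeness, a minimal sketch of the set-up this loop assumes; every value here is illustrative (a hypothetical one-step dynamic model, observations of the first state component only, and an initial ensemble xa of np particles):

np <- 500                           # ensemble size
T <- 200                            # number of time steps
dynamicmodel <- function(x) 0.95 * x + rnorm(length(x), sd = 0.5)  # hypothetical d(.)
y <- matrix(rnorm(T), T, 1)         # placeholder observations of component 1
r <- matrix(1, 1, 1)                # observation error covariance
xa <- matrix(rnorm(np * 2), np, 2)  # initial ensemble {x_0^{(i)}}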

Element 1: Prediction Step

{x_{t-1|t-1}^{(i)}} — draws from p(x_{t-1} | y_{1:t-1}), the filter density at time t-1
{x_{t|t-1}^{(i)}} — draws from p(x_t | y_{1:t-1}), the predictive density at time t

Role: acts as the proposal distribution for the particle filter:

x_{t|t-1}^{(i)} = d(x_{t-1|t-1}^{(i)}, e_t^{(i)}) for i = 1, ..., n

Remarks:
- Computationally expensive to generate large samples for big GFD models
- Hard to effectively populate a large-dimensional state space with particles

Element 2: Measurement Update

{x_{t|t-1}^{(i)}} — draws from p(x_t | y_{1:t-1}), the predictive density at time t
{x_{t|t}^{(i)}} — draws from p(x_t | y_{1:t}), the filter density at time t

This uses Bayes' formula:

p(x_t | y_{1:t}) ∝ p(y_t | x_t) p(x_t | y_{1:t-1})

Approaches:
- PF/SIR: weighted resample of {x_{t|t-1}^{(i)}, w_t^{(i)}} given y_t, yielding {x_{t|t}^{(i)}}
- Metropolis-Hastings MCMC
- Ensemble Kalman filter (approximate)
- plus many others (auxiliary, unscented, ...) that modify the proposal distribution

Sequential Metropolis-Hastings

Construct an (independence) chain to yield a sample from the target filter distribution.

for k = 1 to K:

  1. Generate a candidate x* from the predictive density p(x_t | y_{1:t-1})
  2. Evaluate the acceptance probability:

     α = min(1, p(y_t | x*) / p(y_t | x_{t|t}^{(k)}))

  3. With probability α, let x* enter the posterior sample (x_{t|t}^{(k+1)} = x*); otherwise retain x_{t|t}^{(k)}

end (for k)

This yields {x_{t|t}^{(i)}}, a sample from p(x_t | y_{1:t}). It is flexible, configurable and efficient (and parallels SIR), and can alleviate sample impoverishment (resample-move).
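A sketch of this independence-chain update in R for a single time t, assuming a one-dimensional Gaussian likelihood; candidates are drawn by resampling the predictive ensemble xf (all names illustrative):

mh_update <- function(xf, yt, K = 1000, sig_v = 1.0) {
  xk <- sample(xf, 1)                 # initialize the chain from the predictive draws
  chain <- numeric(K)
  for (k in 1:K) {
    xstar <- sample(xf, 1)            # candidate from the predictive density
    alpha <- min(1, dnorm(yt, xstar, sig_v) / dnorm(yt, xk, sig_v))
    if (runif(1) < alpha) xk <- xstar # accept with probability alpha
    chain[k] <- xk
  }
  chain                               # approximate sample from p(x_t | y_{1:t})
}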

Limitations

Theory: convergence to the target marginal distribution p(x_t | y_{1:t}) holds under weak assumptions (e.g. Kunsch 2005).

Practical problem: standard particle filtering (SIR), when used in DA, suffers from sample impoverishment (recall path degeneracy). Typically the prediction ensemble lies far from the observed state (due to small ensemble size and a high-dimensional state space), so the likelihood is poorly represented and estimation is compromised.

What modifications are made to improve particle filters?

Some Fixes

- Add jitter: overdisperse the sample from the predictive density (inflate process error); see the sketch after this list.
- Make a Gaussian approximation, or a transformation/anamorphosis: can use the Kalman filter updating equation.
- Approximate the likelihood function: change its functional form, or inflate or alter the measurement error.
- Error subspace: confine stochasticity to the parameters only; dimension reduction.
- Use a fixed-lag smoother or batch processing: incorporate observations from multiple times into the observation update.
- Clever proposal distributions and look-ahead filters: move beyond using the prior (predictive density) as the proposal.
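As a sketch of the first fix, jittering might overdisperse a one-dimensional predictive ensemble in R like this (the bandwidth h is an illustrative tuning choice):

jitter_ensemble <- function(xf, h = 0.1) {
  xf + rnorm(length(xf), sd = h * sd(xf))  # inflate process error
}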

What about Parameter Estimation?

1. State augmentation: append the parameters to the state,

   x̃_t = (x_t, θ_t)

   One can then use the particle filter machinery to solve (Kitagawa 1998); see also iterated filtering (Ionides et al. 2006).

2. Likelihood methods: use sample-based likelihoods (see the sketch below),

   [y_{1:T} | θ] = L(θ; y_{1:T}) = ∏_{t=1}^{T} ∫ [y_t | x_t, θ] [x_t | y_{1:t-1}, θ] dx_t ≈ ∏_{t=1}^{T} n^{-1} Σ_{i=1}^{n} p(y_t | x_{t|t-1}^{(i)})
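A sketch of this sample-based likelihood for an assumed one-dimensional bootstrap filter (theta, sig_e, sig_v illustrative): at each time step, average the observation density over the predictive ensemble and accumulate the log terms.

particle_loglik <- function(y, theta, n = 1000, sig_e = 0.5, sig_v = 1.0) {
  xp <- rnorm(n)                                   # draws from p(x_0)
  ll <- 0
  for (t in seq_along(y)) {
    xp <- theta * xp + rnorm(n, sd = sig_e)        # predictive draws x_{t|t-1}^{(i)}
    lik <- dnorm(y[t], mean = xp, sd = sig_v)      # p(y_t | x_{t|t-1}^{(i)})
    ll <- ll + log(mean(lik))                      # log( n^{-1} sum_i p(y_t | .) )
    xp <- sample(xp, n, replace = TRUE, prob = lik)  # SIR resampling step
  }
  ll
}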

Summary

1. The state space model is the statistical framework for considering dynamic models and measurements.
2. Importance sampling provides the theoretical basis for particle filters.
3. Sequential importance resampling (SIR) yields a straightforward and workable algorithm for the filtering problem.
4. For realistic problems in data assimilation, particle filters must be modified, either via approximations or through clever use of proposal distributions.

Some Key References

Jazwinski AH. 1970. Stochastic Processes and Filtering Theory. Academic Press: New York, 376pp.
Gordon NJ, Salmond DJ, Smith AFM. 1993. Novel approach to nonlinear/non-Gaussian Bayesian state estimation. IEE Proceedings-F 140(2): 107-113.
Kitagawa G. 1996. Monte Carlo filter and smoother for non-Gaussian nonlinear state space models. Journal of Computational and Graphical Statistics 5: 1-25.
Kitagawa G. 1998. A self-organizing state space model. Journal of the American Statistical Association 93(443): 1203-1215.
Arulampalam MS, Maskell S, Gordon N, Clapp T. 2002. A tutorial on particle filters for online nonlinear non-Gaussian Bayesian tracking. IEEE Transactions on Signal Processing 50(2): 174-188.
Kunsch HR. 2005. Recursive Monte Carlo filters: algorithms and theoretical analysis. The Annals of Statistics 33(5): 1983-2021.
Godsill SJ, Doucet A, West M. 2004. Monte Carlo smoothing for nonlinear time series. Journal of the American Statistical Association 99(465): 156-168.
Ionides EL, Breto C, King AA. 2006. Inference for nonlinear dynamical systems. Proceedings of the National Academy of Sciences 103: 18438-18443.