The Unscented Particle Filter

The Unscented Particle Filter
Rudolph van der Merwe (OGI), Nando de Freitas (UC Berkeley), Arnaud Doucet (Cambridge University), Eric Wan (OGI)

Outline
- Optimal Estimation & Filtering
- Optimal Recursive Bayesian Solution
- Practical Solutions
  - Gaussian approximations (EKF, UKF)
  - Sequential Monte Carlo methods (Particle Filters)
- The Unscented Particle Filter
  - The Unscented Transformation and UKF
  - Applications of UT/UKF to Particle Filters
- Experimental Results
- Conclusions

Filtering: General problem statement
[Figure: hidden Markov model diagram. Observed measurements y_1, y_2, ... are generated from the unobserved states x_1, x_2, ... via p(y_k | x_k); the states evolve via the transition density p(x_k | x_{k-1}).]
Filtering is the problem of sequentially estimating the states (parameters or hidden variables) of a system as observations become available on-line.

Filtering
The solution of the sequential estimation problem is given by the posterior density p(X_k | Y_k), where X_k = {x_1, x_2, ..., x_k} and Y_k = {y_1, y_2, ..., y_k}. By recursively computing a marginal of the posterior, the filtering density p(x_k | Y_k), one need not keep track of the complete history of the states.

Filtering
Given the filtering density, a number of estimates of the system state can be calculated:
- Mean (optimal MMSE estimate of the state): x̂_k = E[x_k | Y_k] = ∫ x_k p(x_k | Y_k) dx_k
- Mode
- Median
- Confidence intervals
- Kurtosis, etc.

State Space Formulation of System
General discrete-time nonlinear, non-Gaussian dynamic system:
x_k = f(x_{k-1}, u_{k-1}, v_{k-1})   (state; v = process noise, u = known input)
y_k = h(x_k, u_k, n_k)   (noisy observation / measurement; n = measurement noise)
Assumptions:
1) The states follow a first-order Markov process: p(x_k | x_0, x_1, ..., x_{k-1}) = p(x_k | x_{k-1})
2) The observations are conditionally independent given the states: p(y_k | x_k, A) = p(y_k | x_k)

Recursive Bayesian Estimation
Given this state space model, how do we recursively estimate the filtering density?
p(x_k | Y_k) = p(Y_k | x_k) p(x_k) / p(Y_k)
            = p(y_k, Y_{k-1} | x_k) p(x_k) / p(y_k, Y_{k-1})
            = p(y_k | Y_{k-1}, x_k) p(Y_{k-1} | x_k) p(x_k) / [p(y_k | Y_{k-1}) p(Y_{k-1})]
            = p(y_k | Y_{k-1}, x_k) p(x_k | Y_{k-1}) p(Y_{k-1}) p(x_k) / [p(y_k | Y_{k-1}) p(Y_{k-1}) p(x_k)]
            = p(y_k | x_k) p(x_k | Y_{k-1}) / p(y_k | Y_{k-1})
(The last step uses the observation-independence assumption p(y_k | Y_{k-1}, x_k) = p(y_k | x_k).)

Recursive Bayesian Estimation
posterior = likelihood x prior / evidence:
p(x_k | Y_k) = p(y_k | x_k) p(x_k | Y_{k-1}) / p(y_k | Y_{k-1})
Prior: p(x_k | Y_{k-1}) = ∫ p(x_k | x_{k-1}) p(x_{k-1} | Y_{k-1}) dx_{k-1}, where the transition density is given by the process model. (Propagation of the past state into the future before the new observation is made.)
Likelihood: defined in terms of the observation model.
Evidence: p(y_k | Y_{k-1}) = ∫ p(y_k | x_k) p(x_k | Y_{k-1}) dx_k

Practical Solutions
- Gaussian Approximations
- Perfect Monte Carlo Simulation
- Sequential Monte Carlo Methods: Particle Filters
  - Bayesian Importance Sampling
  - Sampling-importance resampling (SIR)

Gaussian Approximations
Most common approach. Assume all RV statistics are Gaussian. The optimal recursive MMSE estimate is then given by
x̂_k = (prediction of x_k) + K_k [(observation of y_k) - (prediction of y_k)]
Different implementations:
- Extended Kalman Filter (EKF): optimal quantities approximated via a first-order Taylor series expansion (linearization) of the process and measurement models.
- Unscented Kalman Filter (UKF): optimal quantities calculated using the Unscented Transformation (accurate to second order for any nonlinearity). Drastic improvement over the EKF [Wan, van der Merwe, Nelson 2000].
Problem: the Gaussian approximation breaks down for most nonlinear real-world applications (multi-modal distributions, non-Gaussian noise sources, etc.)

Perfect Monte Carlo Simulation
Allows for a complete representation of the posterior distribution. Maps the intractable integrals of the optimal Bayesian solution to tractable discrete sums of weighted samples drawn from the posterior distribution:
p̂(x_k | Y_k) = (1/N) Σ_{i=1}^N δ(x_k - x_k^(i)),   x_k^(i) ~ p(x_k | Y_k), i.i.d.
So any estimate of the form
E[f(x_k)] = ∫ f(x_k) p(x_k | Y_k) dx_k
may be approximated by
E[f(x_k)] ≈ (1/N) Σ_{i=1}^N f(x_k^(i))
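To make this concrete, here is a minimal sketch (not from the slides) of approximating an expectation by a sample average; the standard-normal target and the choice f(x) = x^2 are illustrative assumptions so that the true answer is known:

```python
import numpy as np

# Minimal sketch: approximate E[f(x)] by the average of f over i.i.d.
# draws from the target density p. Target and f are assumptions.
rng = np.random.default_rng(0)

def f(x):
    return x ** 2

N = 10_000
samples = rng.standard_normal(N)   # x^(i) ~ p(x), i.i.d.
estimate = np.mean(f(samples))     # (1/N) * sum_i f(x^(i))
print(estimate)                    # close to E[x^2] = 1
```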

Particle Filters: Bayesian Importance Sampling
It is often impossible to sample directly from the true posterior density. However, we can instead sample from a known, easy-to-sample proposal distribution q(x_k | Y_k) and make use of the following substitution:
E[f(x_k)] = ∫ f(x_k) [p(x_k | Y_k) / q(x_k | Y_k)] q(x_k | Y_k) dx_k
         = ∫ f(x_k) [p(Y_k | x_k) p(x_k) / (p(Y_k) q(x_k | Y_k))] q(x_k | Y_k) dx_k
         = ∫ f(x_k) [w_k(x_k) / p(Y_k)] q(x_k | Y_k) dx_k
where the unnormalized importance weights are w_k(x_k) = p(Y_k | x_k) p(x_k) / q(x_k | Y_k).

Particle Filters
E[f(x_k)] = (1/p(Y_k)) ∫ f(x_k) w_k(x_k) q(x_k | Y_k) dx_k
Since p(Y_k) = ∫ p(Y_k | x_k) p(x_k) dx_k = ∫ w_k(x_k) q(x_k | Y_k) dx_k, this becomes
E[f(x_k)] = ∫ f(x_k) w_k(x_k) q(x_k | Y_k) dx_k / ∫ w_k(x_k) q(x_k | Y_k) dx_k
         = E_q[w_k(x_k) f(x_k)] / E_q[w_k(x_k)]

Particle Filters
So, by drawing samples from q(x_k | Y_k), we can approximate expectations of interest by
E[f(x_k)] ≈ [(1/N) Σ_{i=1}^N w_k(x_k^(i)) f(x_k^(i))] / [(1/N) Σ_{j=1}^N w_k(x_k^(j))]
         = Σ_{i=1}^N w̃_k^(i) f(x_k^(i))
where the normalized importance weights are given by
w̃_k^(i) = w_k(x_k^(i)) / Σ_{j=1}^N w_k(x_k^(j))
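A small self-normalized importance sampling sketch under illustrative assumptions (standard-normal target, heavy-tailed Student-t proposal so the proposal's support covers the target's):

```python
import numpy as np
from scipy import stats

# Sketch of self-normalized importance sampling. Target p: standard
# normal; proposal q: a wider Student-t. Both are assumptions.
rng = np.random.default_rng(1)
N = 10_000

x = stats.t.rvs(df=3, scale=2.0, size=N, random_state=rng)  # x^(i) ~ q
w = stats.norm.pdf(x) / stats.t.pdf(x, df=3, scale=2.0)     # w(x^(i)) = p/q
w_tilde = w / w.sum()                                       # normalized weights

estimate = np.sum(w_tilde * x ** 2)  # approximates E_p[x^2] = 1
print(estimate)
```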

Particle Filters
Using the state space assumptions (first-order Markov / observational independence given the state), the importance weights can be estimated recursively [proof in De Freitas (2000)]:
w_k = w_{k-1} · p(y_k | x_k) p(x_k | x_{k-1}) / q(x_k | X_{k-1}, Y_k)
Problem: with sequential importance sampling (SIS), the variance of the importance weights increases stochastically over time [Kong et al. (1994), Doucet et al. (1999)]. To solve this, we need to resample the particles:
- keep / multiply particles with high importance weights
- discard particles with low importance weights
This is sampling-importance resampling (SIR).

Particle Filters: Sampling-Importance Resampling
Maps the N unequally weighted particles into a new set of N equally weighted samples:
{x_k^(i), w̃_k^(i)} → {x_k^(j), 1/N}
Method proposed by Gordon, Salmond & Smith (1993) and proven mathematically by Gordon (1994).
[Figure: resampling via the cdf of the normalized importance weights; a uniform draw is mapped through the cdf to a resampled index j(i).]
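The cdf-inversion picture above takes only a few lines to implement. The sketch below uses systematic resampling, a common variant of that scheme (the original SIR method used multinomial draws); it is an illustration, not the slides' exact algorithm:

```python
import numpy as np

def resample_systematic(particles, weights, rng):
    """Map N unequally weighted particles to N equally weighted ones.

    Inverts the cdf of the (already normalized) weights at N evenly
    spaced positions offset by a single uniform draw.
    """
    N = len(particles)
    positions = (rng.uniform() + np.arange(N)) / N
    cdf = np.cumsum(weights)
    cdf[-1] = 1.0                              # guard against round-off
    idx = np.searchsorted(cdf, positions)      # resampled indices j(i)
    return particles[idx], np.full(N, 1.0 / N)
```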

Particle Filters

Particle Filters: Choice of Proposal Distribution
w_k = w_{k-1} · p(y_k | x_k) p(x_k | x_{k-1}) / q(x_k | X_{k-1}, Y_k)
The proposal is a critical design issue for a successful particle filter: samples/particles are drawn from this distribution, and it is used to evaluate the importance weights.
Requirements:
1) The support of the proposal distribution must include the support of the true posterior distribution, i.e. heavy-tailed distributions are preferable.
2) It must incorporate the most recent observations.

Particle Filters
The most popular choice of proposal distribution does not satisfy these requirements though: the transition prior
q(x_k | X_{k-1}, Y_k) = p(x_k | x_{k-1})
[Isard and Blake 96, Kitagawa 96, Gordon et al. 93, Beadle and Djuric 97, Avitzour 95]
Easy to implement:
w_k = w_{k-1} · p(y_k | x_k) p(x_k | x_{k-1}) / p(x_k | x_{k-1}) = w_{k-1} · p(y_k | x_k)
It does not incorporate the most recent observation, though! (A sketch of one step of this "bootstrap" filter follows below.)
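A minimal sketch of one bootstrap filter step. The model functions f and h and the noise scales are illustrative assumptions, not the slides' example:

```python
import numpy as np

rng = np.random.default_rng(2)

def f(x):                  # assumed process model
    return 0.5 * x + 1.0

def likelihood(y, x):      # assumed Gaussian measurement model, h(x) = x^2/20
    return np.exp(-0.5 * (y - x ** 2 / 20.0) ** 2 / 0.1)

def bootstrap_step(particles, y, rng):
    # 1. Propose from the transition prior: x_k^(i) ~ p(x_k | x_{k-1}^(i))
    particles = f(particles) + rng.standard_normal(len(particles))
    # 2. Weight by the likelihood only: w_k^(i) prop. to p(y_k | x_k^(i))
    w = likelihood(y, particles)
    w /= w.sum()
    # 3. Resample (multinomial, for brevity)
    idx = rng.choice(len(particles), size=len(particles), p=w)
    return particles[idx]
```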

Improving Particle Filters: Incorporate New Observations into the Proposal
Use a Gaussian approximation (i.e. a Kalman filter) to generate the proposal by combining the new observation with the prior:
q(x_k | X_{k-1}, Y_k) = p_G(x_k | X_{k-1}, Y_k) = N(x̂_k, cov[x_k])

Improving Particle Filters: Extended Kalman Filter Proposal Generation
De Freitas (1998), Doucet (1998), Pitt & Shephard (1999).
Greatly improved performance compared to the standard particle filter in problems with very accurate measurements, i.e. a likelihood that is very peaked in comparison to the prior.
In highly nonlinear problems, however, the EKF tends to be very inaccurate and underestimates the true covariance of the state. This violates the distribution support requirement for the proposal distribution and can lead to poor performance and filter divergence.
We propose the use of the Unscented Kalman Filter for proposal generation to address these problems!

Improving Particle Filters: Unscented Kalman Filter Proposal Generation
The UKF is a recursive MMSE estimator based on the Unscented Transformation (UT).
UT: a method for calculating the statistics of an RV that undergoes a nonlinear transformation (Julier and Uhlmann 1997).
UT/UKF:
- accurate to 3rd order for Gaussians; higher-order errors scaled by the choice of transform parameters
- more accurate estimates than the EKF (Wan, van der Merwe, Nelson 2000)
- some control over higher-order moments, i.e. kurtosis, etc. → heavy-tailed distributions!

Unscented Transformation
[Figure: sigma points propagated through the nonlinearity y = f(x).]
- Choose sigma points χ_0 = x̄ and χ_i = x̄ ± (√((L+λ) P_x))_i
- Transform each point: y_i = f(χ_i)
- ȳ = weighted sample mean of the y_i; P_y = weighted sample covariance of the y_i

The Unscented Transformation
[Figure: comparison of mean and covariance propagation.]
- Actual (sampling): y = f(x); true mean and true covariance of the transformed samples
- Linearized (EKF): ȳ = f(x̄), P_y = A P_x A^T (A the linearization of f at x̄)
- UT: transformed sigma points ψ_i = f(χ_i) give the UT mean and UT covariance
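A hedged sketch of the (scaled) unscented transformation described above; the scaling parameters alpha, beta, kappa follow common conventions and are assumptions, not values from the slides:

```python
import numpy as np

def unscented_transform(mean, cov, f, alpha=1e-3, beta=2.0, kappa=0.0):
    # Sigma points: the mean plus/minus the columns of a scaled matrix
    # square root of the covariance (2L+1 points for an L-dim state).
    L = len(mean)
    lam = alpha ** 2 * (L + kappa) - L
    S = np.linalg.cholesky((L + lam) * cov)
    sigma = np.vstack([mean, mean + S.T, mean - S.T])

    # Weights for the weighted sample mean and covariance.
    wm = np.full(2 * L + 1, 0.5 / (L + lam))
    wc = wm.copy()
    wm[0] = lam / (L + lam)
    wc[0] = wm[0] + (1.0 - alpha ** 2 + beta)

    # Propagate the sigma points through the nonlinearity.
    y = np.array([f(s) for s in sigma])
    y_mean = wm @ y
    d = y - y_mean
    y_cov = (wc * d.T) @ d
    return y_mean, y_cov

# e.g. unscented_transform(np.zeros(2), np.eye(2), lambda s: s ** 2)
```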

Unscented Particle Filter
Particle Filter + UKF Proposal = Unscented Particle Filter
(A minimal sketch of the resulting filter step follows below.)
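A minimal runnable sketch of the idea for a scalar state: each particle carries a UKF-style Gaussian (m_i, P_i) whose update folds in the newest observation before sampling. The model, noise levels, and the simple scalar UKF are all illustrative assumptions, not the authors' exact algorithm:

```python
import numpy as np

rng = np.random.default_rng(3)
f = lambda x: 0.5 * x + 1.0        # assumed process model
h = lambda x: x ** 2 / 20.0        # assumed observation model
Q, R, N = 0.1, 0.01, 200           # assumed noise variances, particle count

def ukf_proposal(m, P, y, lam=1.0):
    """One scalar UKF predict + update; returns proposal mean and variance."""
    w = np.array([lam / (1 + lam), 0.5 / (1 + lam), 0.5 / (1 + lam)])
    s = np.array([m, m + np.sqrt((1 + lam) * P), m - np.sqrt((1 + lam) * P)])
    xs = f(s)
    m_p = w @ xs
    P_p = w @ (xs - m_p) ** 2 + Q
    s = np.array([m_p, m_p + np.sqrt((1 + lam) * P_p),
                  m_p - np.sqrt((1 + lam) * P_p)])
    ys = h(s)
    y_p = w @ ys
    P_yy = w @ (ys - y_p) ** 2 + R
    P_xy = w @ ((s - m_p) * (ys - y_p))
    K = P_xy / P_yy
    return m_p + K * (y - y_p), P_p - K ** 2 * P_yy

def upf_step(x, m, P, w, y):
    for i in range(N):                      # UKF proposal per particle
        m[i], P[i] = ukf_proposal(m[i], P[i], y)
    x_new = m + np.sqrt(P) * rng.standard_normal(N)   # sample N(m_i, P_i)
    lik = np.exp(-0.5 * (y - h(x_new)) ** 2 / R)      # p(y_k | x_k)
    trans = np.exp(-0.5 * (x_new - f(x)) ** 2 / Q)    # p(x_k | x_{k-1})
    prop = np.exp(-0.5 * (x_new - m) ** 2 / P) / np.sqrt(P)
    w = w * lik * trans / prop              # recursive weight update
    w /= w.sum()
    idx = rng.choice(N, size=N, p=w)        # resample particles and Gaussians
    return x_new[idx], m[idx], P[idx], np.full(N, 1.0 / N)
```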

Experimental Results: Synthetic Experiment
Time-series process model (process noise v_k is Gamma distributed):
x_{k+1} = 1 + sin(ωπk) + φ_1 x_k + v_k
Nonstationary observation model (measurement noise n_k is Gaussian):
y_k = φ_2 x_k^2 + n_k,   k ≤ 30
y_k = φ_3 x_k - 2 + n_k,   k > 30
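A short simulation sketch of this benchmark model; the parameter values and noise scales below are illustrative assumptions (the slide gives only the model structure):

```python
import numpy as np

rng = np.random.default_rng(4)
T = 60
omega, phi1, phi2, phi3 = 0.04, 0.5, 0.2, 0.5   # assumed parameter values

x = np.zeros(T)
y = np.zeros(T)
for k in range(1, T):
    v = rng.gamma(shape=3.0, scale=2.0)          # Gamma process noise
    x[k] = 1.0 + np.sin(omega * np.pi * (k - 1)) + phi1 * x[k - 1] + v
    n = 0.1 * rng.standard_normal()              # Gaussian measurement noise
    y[k] = phi2 * x[k] ** 2 + n if k <= 30 else phi3 * x[k] - 2.0 + n
```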

Experimental Results: Synthetic Experiment (100 independent runs)

Filter                           MSE (mean)   MSE (variance)
Extended Kalman Filter (EKF)     0.374        0.015
Unscented Kalman Filter (UKF)    0.280        0.012
Particle Filter: generic         0.424        0.053
Particle Filter: EKF proposal    0.310        0.016
Unscented Particle Filter        0.070        0.006

Experimental Results: Synthetic Experiment

Experimental Results: Pricing Financial Options
Options: financial derivatives that give the holder the right (but not the obligation) to do something in the future.
- Call option: allows the holder to buy an underlying cash product, at a specified future date ("maturity time"), for a predetermined price ("strike price").
- Put option: allows the holder to sell an underlying cash product.
Black-Scholes partial differential equation, the main industry standard for pricing options:
∂f/∂t + r S ∂f/∂S + (1/2) σ^2 S^2 ∂^2f/∂S^2 = r f
where f is the option value, S the value of the underlying cash product, r the risk-free interest rate, and σ the volatility of the cash product.

Experimental Results: Pricing Financial Options
Black & Scholes (1973) derived the following pricing solution:
C = S N_c(d_1) - X e^{-r t_m} N_c(d_2)
P = -S N_c(-d_1) + X e^{-r t_m} N_c(-d_2)
d_1 = [ln(S/X) + (r + σ^2/2) t_m] / (σ √t_m)
d_2 = d_1 - σ √t_m
N_c(.) = cumulative normal distribution
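These formulas translate directly into code; a small sketch, with SciPy's normal cdf standing in for N_c:

```python
import numpy as np
from scipy.stats import norm

def black_scholes(S, X, r, sigma, t_m):
    """Black-Scholes European call and put prices.

    S: underlying price, X: strike, r: risk-free rate,
    sigma: volatility, t_m: time to maturity (in years).
    """
    d1 = (np.log(S / X) + (r + 0.5 * sigma ** 2) * t_m) / (sigma * np.sqrt(t_m))
    d2 = d1 - sigma * np.sqrt(t_m)
    call = S * norm.cdf(d1) - X * np.exp(-r * t_m) * norm.cdf(d2)
    put = -S * norm.cdf(-d1) + X * np.exp(-r * t_m) * norm.cdf(-d2)
    return call, put

# e.g. black_scholes(S=100.0, X=95.0, r=0.05, sigma=0.20, t_m=0.5)
```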

Experimental Results: Pricing Financial Options
State-space representation to model the system for particle filters:
- Hidden states: r, σ (interest rate and volatility)
- Output observations: C, P (call and put prices)
- Known control signals: S, t_m
Estimate call and put prices over a 204-day period on the FTSE-100 index.
Performance: normalized square error (NSE) of the one-step-ahead predictions, computed from the prediction errors y_k - ŷ_k.

Experimental Results: Options Pricing Experiment (100 independent runs)

Option  Algorithm                        NSE (mean)   NSE (var)
Call    Trivial                          0.078        0.000
Call    Extended Kalman Filter (EKF)     0.037        0.000
Call    Unscented Kalman Filter (UKF)    0.037        0.000
Call    Particle Filter: generic         0.037        0.000
Call    Particle Filter: EKF proposal    0.092        0.508
Call    Unscented Particle Filter        0.009        0.000
Put     Trivial                          0.035        0.000
Put     Extended Kalman Filter (EKF)     0.023        0.000
Put     Unscented Kalman Filter (UKF)    0.023        0.000
Put     Particle Filter: generic         0.023        0.000
Put     Particle Filter: EKF proposal    0.024        0.007
Put     Unscented Particle Filter        0.008        0.000

Experimental Results: Options Pricing Experiment (UPF one-step-ahead predictions)

Experimental Results: Options Pricing Experiment (estimated interest rate and volatility)

Experimental Results: Options Pricing Experiment (probability distributions of implied interest rate and volatility)

Particle Filter Demos
Visual Dynamics Group, Oxford (Andrew Blake):
- Tracking agile motion
- Tracking motion against camouflage
- Mixed-state tracking

Conclusions
- Particle filters allow for a practical but complete representation of the posterior probability distribution.
- They are applicable to general nonlinear, non-Gaussian estimation problems where standard Gaussian approximations fail.
- Particle filters rely on importance sampling, so the proper choice of proposal distribution is very important:
  - The standard approach (i.e. the transition prior proposal) fails when the likelihood of the new data is very peaked (accurate sensors) or for heavy-tailed noise sources.
  - EKF proposal: incorporates new observations, but can diverge due to inaccurate and inconsistent state estimates.
  - Unscented Particle Filter (UKF proposal): more consistent and accurate state estimates, and larger support overlap with the option of heavier-tailed distributions. Theory predicts, and experiments confirm, significantly better performance.

The End