Sequential Importance Resampling (SIR) Particle Filter


Particle Filters++
Pieter Abbeel, UC Berkeley EECS
Many slides adapted from Thrun, Burgard and Fox, Probabilistic Robotics

Sequential Importance Resampling (SIR) Particle Filter

1.  Algorithm particle_filter(S_{t-1}, u_t, z_t):
2.     S_t = ∅, η = 0
3.     For i = 1 … n                                   (Generate new samples)
4.        Sample index j(i) from the discrete distribution given by w_{t-1}
5.        Sample x_t^i from p(x_t | x_{t-1}^{j(i)}, u_t)
6.        w_t^i = p(z_t | x_t^i)                       (Compute importance weight)
7.        η = η + w_t^i                                (Update normalization factor)
8.        S_t = S_t ∪ {⟨x_t^i, w_t^i⟩}                 (Insert)
9.     For i = 1 … n
10.       w_t^i = w_t^i / η                            (Normalize weights)
11.    Return S_t
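The following is a minimal Python sketch of one SIR update, assuming user-supplied callables motion_model (which samples p(x_t | x_{t-1}, u_t)) and measurement_likelihood (which evaluates p(z_t | x_t)); both names are hypothetical stand-ins, not notation from the slides.

    import numpy as np

    def sir_update(particles, weights, u, z, motion_model, measurement_likelihood):
        """One SIR step: resample, propagate, reweight, normalize."""
        n = len(particles)
        # Step 4: draw ancestor indices j(i) from w_{t-1} (weights assumed normalized)
        ancestors = np.random.choice(n, size=n, p=weights)
        # Step 5: propagate each ancestor through the motion model
        new_particles = np.array([motion_model(particles[j], u) for j in ancestors])
        # Step 6: importance weight w_t^i = p(z_t | x_t^i)
        new_weights = np.array([measurement_likelihood(z, x) for x in new_particles])
        # Steps 7-10: normalize the weights
        new_weights /= new_weights.sum()
        return new_particles, new_weights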

Outline

- Improved Sampling
  - Issue with the vanilla particle filter when noise is dominated by the motion model
  - Importance Sampling
  - Optimal Proposal
  - Examples
- Resampling
- Particle Deprivation
- Noise-free Sensors
- Adapting the Number of Particles: KLD Sampling

Noise Dominated by Motion Model
[Grisetti, Stachniss, Burgard, T-RO 2006]
When the motion model is much noisier than the sensor, the sampled particles spread far more widely than the narrow observation likelihood → most particles get (near) zero weights and are lost.

Importance Sampling

Theoretical justification: for any function f we have
E_p[f(x)] = ∫ f(x) p(x) dx = ∫ f(x) (p(x) / π(x)) π(x) dx = E_π[f(x) p(x) / π(x)]
f could be: whether a grid cell is occupied or not, whether the position of a robot is within 5 cm of some (x, y), etc.

Task: sample from the density p(·)
Solution: sample from a proposal density π(·), and weight each sample x^(i) by p(x^(i)) / π(x^(i))
Requirement: if π(x) = 0 then p(x) = 0.
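As a quick illustration of the weighting rule, here is a self-contained Python sketch (the target p and proposal π below are arbitrary choices for the example, not anything from the slides): it estimates E_p[f(x)] for f(x) = x² under a standard normal target using samples from a wider normal proposal.

    import numpy as np

    rng = np.random.default_rng(0)

    def p_pdf(x):   # target density p(.): N(0, 1)
        return np.exp(-0.5 * x**2) / np.sqrt(2 * np.pi)

    def pi_pdf(x):  # proposal density pi(.): N(0, 2^2), nonzero wherever p is
        return np.exp(-0.5 * (x / 2.0)**2) / (2.0 * np.sqrt(2 * np.pi))

    xs = rng.normal(0.0, 2.0, size=10_000)     # sample from the proposal
    w = p_pdf(xs) / pi_pdf(xs)                 # weight each sample by p(x)/pi(x)
    f = lambda x: x**2
    print(np.sum(w * f(xs)) / np.sum(w))       # ≈ 1.0 = E_p[x^2]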

Particle Filters Revisited (general proposal π)

1.  Algorithm particle_filter(S_{t-1}, u_t, z_t):
2.     S_t = ∅, η = 0
3.     For i = 1 … n                                   (Generate new samples)
4.        Sample index j(i) from the discrete distribution given by w_{t-1}
5.        Sample x_t^i from π(x_t | x_{t-1}^{j(i)}, u_t, z_t)
6.        w_t^i = p(z_t | x_t^i) p(x_t^i | x_{t-1}^{j(i)}, u_t) / π(x_t^i | x_{t-1}^{j(i)}, u_t, z_t)   (Compute importance weight)
7.        η = η + w_t^i                                (Update normalization factor)
8.        S_t = S_t ∪ {⟨x_t^i, w_t^i⟩}                 (Insert)
9.     For i = 1 … n
10.       w_t^i = w_t^i / η                            (Normalize weights)
11.    Return S_t

Optimal Sequential Proposal π(·)

Optimal: π(x_t | x_{t-1}^i, u_t, z_t) = p(x_t | x_{t-1}^i, u_t, z_t)
Applying Bayes' rule gives
p(x_t | x_{t-1}^i, u_t, z_t) = p(z_t | x_t) p(x_t | x_{t-1}^i, u_t) / p(z_t | x_{t-1}^i, u_t)
Substituting this proposal into the importance weight and simplifying gives
w_t^i = p(z_t | x_{t-1}^i, u_t) = ∫ p(z_t | x_t) p(x_t | x_{t-1}^i, u_t) dx_t

Optimal Proposal π(·): Challenges

Optimal: π(x_t | x_{t-1}^i, u_t, z_t) = p(x_t | x_{t-1}^i, u_t, z_t)
Challenges:
- Typically difficult to sample from p(x_t | x_{t-1}^i, u_t, z_t)
- The importance weight w_t^i = ∫ p(z_t | x_t) p(x_t | x_{t-1}^i, u_t) dx_t is typically expensive to compute (an integral over the state space)

Example 1: π(·) = Optimal Proposal; Nonlinear Gaussian State Space Model

Nonlinear Gaussian state space model (nonlinear dynamics, linear-Gaussian measurement):
x_t = f(x_{t-1}, u_t) + v_t,   v_t ~ N(0, Q)
z_t = C x_t + w_t,             w_t ~ N(0, R)
Then the optimal proposal is Gaussian:
p(x_t | x_{t-1}^i, u_t, z_t) = N(m, Σ)
with
Σ = (Q⁻¹ + Cᵀ R⁻¹ C)⁻¹,   m = Σ (Q⁻¹ f(x_{t-1}^i, u_t) + Cᵀ R⁻¹ z_t)
And the importance weight is available in closed form:
w_t^i = p(z_t | x_{t-1}^i, u_t) = N(z_t; C f(x_{t-1}^i, u_t), R + C Q Cᵀ)
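A hedged Python sketch of Example 1: for nonlinear dynamics with additive Gaussian noise and a linear-Gaussian measurement, the optimal proposal and its weight take the closed forms above. f, C, Q, R are the assumed model quantities and the function name is illustrative.

    import numpy as np

    def optimal_proposal_step(x_prev, u, z, f, C, Q, R, rng):
        x_pred = f(x_prev, u)                       # mean of p(x_t | x_{t-1}, u_t)
        Q_inv = np.linalg.inv(Q)
        R_inv = np.linalg.inv(R)
        Sigma = np.linalg.inv(Q_inv + C.T @ R_inv @ C)
        m = Sigma @ (Q_inv @ x_pred + C.T @ R_inv @ z)
        x_new = rng.multivariate_normal(m, Sigma)   # sample the optimal proposal
        # Weight w = N(z; C x_pred, R + C Q C^T) = p(z_t | x_{t-1}, u_t)
        S = R + C @ Q @ C.T
        innov = z - C @ x_pred
        w = np.exp(-0.5 * innov @ np.linalg.solve(S, innov)) \
            / np.sqrt((2 * np.pi) ** len(z) * np.linalg.det(S))
        return x_new, w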

Example 2: π(·) = Motion Model

Choosing π(x_t | x_{t-1}^i, u_t, z_t) = p(x_t | x_{t-1}^i, u_t) makes the importance weight collapse to w_t^i = p(z_t | x_t^i) → the standard particle filter.

Example 3: Approximating the Optimal π for Localization
[Grisetti, Stachniss, Burgard, T-RO 2006]

One (not so desirable) solution: use a smoothed likelihood such that more particles retain a meaningful weight --- BUT information is lost.
Better: integrate the latest observation z_t into the proposal π.

Example 3: Approximating the Optimal π for Localization: Generating One Weighted Sample

1. Initial guess: x̄_t from the motion model.
2. Execute scan matching starting from the initial guess x̄_t, resulting in pose estimate x̂_t.
3. Sample K points in a region around x̂_t.
4. Proposal distribution is a Gaussian whose mean and covariance are computed from those K points, each evaluated under the motion model and the observation likelihood.
5. Sample from the (approximately optimal) proposal distribution.
6. Weight = the normalizer of that local approximation (≈ p(z_t | x_{t-1}^i, u_t)).

Scan Matching

Compute x̂_t, e.g., using gradient descent on the observation likelihood. The likelihood of a scan is a product over the K beams, each beam a four-component mixture:

P(z_t | x_t, m) = ∏_{k=1}^{K} P(z_t^k | x_t, m)

P(z_t^k | x_t, m) = ( α_hit  α_unexp  α_max  α_rand )ᵀ · ( P_hit(z_t^k | x_t, m),  P_unexp(z_t^k | x_t, m),  P_max(z_t^k | x_t, m),  P_rand(z_t^k | x_t, m) )
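A self-contained Python sketch of the four-component beam model; sigma_hit, lam, z_max, and the mixture weights are assumed parameters, and the expected ranges z_exp are presumed to come from ray casting the pose into the map m (not shown here).

    import numpy as np

    def beam_likelihood(z, z_exp, alphas, sigma_hit=0.2, lam=1.0, z_max=10.0):
        """P(z | x, m) as a product over beams of a 4-component mixture."""
        a_hit, a_unexp, a_max, a_rand = alphas      # mixture weights, sum to 1
        likelihood = 1.0
        for z_k, z_e in zip(z, z_exp):
            p_hit = np.exp(-0.5 * ((z_k - z_e) / sigma_hit) ** 2) \
                    / (sigma_hit * np.sqrt(2 * np.pi))            # sensor noise
            p_unexp = lam * np.exp(-lam * z_k) if z_k < z_e else 0.0  # dynamic objects
            p_max = 1.0 if z_k >= z_max else 0.0                  # max-range readings
            p_rand = 1.0 / z_max if z_k < z_max else 0.0          # uniform clutter
            likelihood *= a_hit * p_hit + a_unexp * p_unexp \
                          + a_max * p_max + a_rand * p_rand
        return likelihood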

Example 3: Example Particle Distributions
[Grisetti, Stachniss, Burgard, T-RO 2006]
[Figure: particle sets (a)-(c) generated from the approximately optimal proposal distribution in three different environments.] If using the standard motion model, in all three cases the particle set would have been similar to (c).

Resampling

Consider running a particle filter for a system with deterministic dynamics and no sensors.
Problem: while no information is obtained that favors one particle over another, due to resampling some particles will disappear, and after running sufficiently long, with very high probability all particles will have become identical. On the surface it might look like the particle filter has uniquely determined the state.
Resampling induces loss of diversity. The variance of the particles decreases, while the variance of the particle set as an estimator of the true belief increases.

Resampling Solution I

Effective sample size (with normalized weights w^i):
n_eff = 1 / Σ_i (w^i)²
Example:
- All weights = 1/N → effective sample size = N
- All weights = 0 except for one weight = 1 → effective sample size = 1
Idea: resample only when the effective sample size is low.
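In code, the n_eff test is a one-liner; the N/2 trigger below is a common but assumed choice, and resample_fn is a placeholder for any resampler.

    import numpy as np

    def effective_sample_size(weights):            # weights assumed normalized
        return 1.0 / np.sum(weights ** 2)

    def maybe_resample(particles, weights, resample_fn):
        n = len(weights)
        if effective_sample_size(weights) < n / 2:     # assumed threshold N/2
            particles = resample_fn(particles, weights)
            weights = np.full(n, 1.0 / n)              # uniform after resampling
        return particles, weights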

Resampling Solution II: Low Variance Sampling

With M = number of particles, draw a single offset r ∈ [0, 1/M] uniformly at random, then walk the cumulative weight distribution and select the particle bracketing each of the M equally spaced positions r, r + 1/M, …, r + (M-1)/M (a code sketch follows below).
Advantages:
- More systematic coverage of the space of samples
- If all samples have the same importance weight, no samples are lost
- Lower computational complexity

Resampling Solution III

Loss of diversity is caused by resampling from a discrete distribution.
Solution: regularization.
- Consider the particles to represent a continuous density
- Sample from that continuous density
- E.g., given 1-D particles x^(1), …, x^(N), sample from a kernel density estimate built on them (e.g., a mixture of Gaussians centered at the particles).
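The low-variance resampler sketched below (plain Python/NumPy, following the single-offset scheme described above for Solution II) can serve as the resample_fn in the previous snippet.

    import numpy as np

    def low_variance_resample(particles, weights, rng):
        M = len(particles)
        r = rng.uniform(0.0, 1.0 / M)             # one random offset in [0, 1/M]
        positions = r + np.arange(M) / M          # r, r + 1/M, ..., r + (M-1)/M
        cumulative = np.cumsum(weights)           # weights assumed normalized
        indices = np.searchsorted(cumulative, positions)
        return particles[indices]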

Particle Deprivation

= when there are no particles in the vicinity of the correct state.
Occurs as the result of the variance in random sampling: an unlucky series of random numbers can wipe out all particles near the true state. This has non-zero probability at each time step → it will happen eventually.
Popular solution: add a small number of randomly generated particles when resampling.
- Advantages: reduces particle deprivation; simple.
- Con: incorrect posterior estimate, even in the limit of infinitely many particles.
- Other benefit: the initialization at time 0 might not have gotten anything near the true state, and not even near a state that over time could have evolved to be close to the true state now; adding random samples discards particles that were not very consistent with past evidence anyway, and instead gives a new chance at getting close to the true state.

Particle Deprivation: How Many Particles to Add?

- Simplest: a fixed number.
- Better: monitor the probability of sensor measurements, which can be approximated by the average unnormalized importance weight:
  p(z_t | z_{1:t-1}, u_{1:t}) ≈ (1/n) Σ_i w_t^i
  Average this estimate over multiple time steps and compare it to typical values obtained when the state estimates are reasonable. If low, inject random particles.
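A sketch of this monitoring idea in Python; the reference level, smoothing window, and injection fraction are assumed tuning parameters, and sample_random_state is a hypothetical global state sampler.

    import numpy as np

    def maybe_inject_random(particles, raw_weights, history, reference,
                            sample_random_state, rng, window=10, fraction=0.05):
        history.append(np.mean(raw_weights))      # ≈ p(z_t | z_{1:t-1}, u_{1:t})
        recent = np.mean(history[-window:])       # smooth over several steps
        if recent < 0.1 * reference:              # assumed "too low" test
            n_new = int(fraction * len(particles))
            for i in rng.choice(len(particles), size=n_new, replace=False):
                particles[i] = sample_random_state()   # hypothetical sampler
        return particles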

Noise-free Sensors

Consider a measurement obtained with a noise-free sensor, e.g., a noise-free laser range finder. Issue?
All particles would end up with weight zero, as it is very unlikely to have had a particle matching the measurement exactly.
Solutions:
- Artificially inflate the amount of noise in the sensor model
- Use a better proposal distribution (see the first section of this set of slides).

Adapting the Number of Particles: KLD-Sampling

E.g., typically more particles are needed at the beginning of a localization run.
Idea:
- Partition the state space into bins.
- When sampling, keep track of the number of occupied bins k.
- Stop sampling when the number of samples reaches a threshold that depends on the number of occupied bins:
  n = ((k-1) / (2ε)) · (1 - 2/(9(k-1)) + √(2/(9(k-1))) · z_{1-δ})³
- If all samples fall in a small number of bins → lower threshold.
z_{1-δ}: the upper 1-δ quantile of the standard normal distribution.
δ = 0.01 and ε = 0.05 work well in practice.
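A sketch of the bound in Python using only the standard library; k is the number of occupied bins, and the sampling loop would keep drawing particles until the count reaches kld_bound(k).

    from statistics import NormalDist

    def kld_bound(k, epsilon=0.05, delta=0.01):
        # Particles needed so that, with probability 1 - delta, the KL
        # divergence between the sampled and true distributions is < epsilon.
        if k <= 1:
            return 1
        z = NormalDist().inv_cdf(1.0 - delta)     # z_{1-delta}
        a = 2.0 / (9.0 * (k - 1))
        return int(((k - 1) / (2.0 * epsilon)) * (1.0 - a + (a ** 0.5) * z) ** 3) + 1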
