Two Popular Bayesian Estimators: Particle and Kalman Filters. McGill COMP 765, Sept 14th, 2017


Recall: Bayes Filters

Notation: z = observation, u = action, x = state.

$$
\begin{aligned}
Bel(x_t) &= P(x_t \mid u_1, z_1, \ldots, u_t, z_t) \\
&\overset{\text{Bayes}}{=} \eta\, P(z_t \mid x_t, u_1, z_1, \ldots, u_t)\, P(x_t \mid u_1, z_1, \ldots, u_t) \\
&\overset{\text{Markov}}{=} \eta\, P(z_t \mid x_t)\, P(x_t \mid u_1, z_1, \ldots, u_t) \\
&\overset{\text{Total prob.}}{=} \eta\, P(z_t \mid x_t) \int P(x_t \mid u_1, z_1, \ldots, u_t, x_{t-1})\, P(x_{t-1} \mid u_1, z_1, \ldots, u_t)\, dx_{t-1} \\
&\overset{\text{Markov}}{=} \eta\, P(z_t \mid x_t) \int P(x_t \mid u_t, x_{t-1})\, P(x_{t-1} \mid u_1, z_1, \ldots, z_{t-1})\, dx_{t-1} \\
&= \eta\, P(z_t \mid x_t) \int P(x_t \mid u_t, x_{t-1})\, Bel(x_{t-1})\, dx_{t-1}
\end{aligned}
$$

Discrete Bayes Filter Algorithm

1. Algorithm Discrete_Bayes_filter(Bel(x), d):
2.   $\eta = 0$
3.   If d is a perceptual data item z then
4.     For all x do
5.       $Bel'(x) = P(z \mid x)\, Bel(x)$
6.       $\eta = \eta + Bel'(x)$
7.     For all x do
8.       $Bel'(x) = \eta^{-1}\, Bel'(x)$
9.   Else if d is an action data item u then
10.     For all x do
11.       $Bel'(x) = \sum_{x'} P(x \mid u, x')\, Bel(x')$
12.   Return $Bel'(x)$
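A minimal Python sketch of this algorithm over a finite state grid. The measurement model `p_z_given_x` and transition model `p_x_given_ux` are assumed to be supplied as functions; their names are illustrative, not from the slides.

```python
import numpy as np

def discrete_bayes_filter(bel, d, kind, states, p_z_given_x, p_x_given_ux):
    """One step of the discrete Bayes filter over a finite state grid.

    bel[i] = Bel(x_i); d is an observation z (kind='percept') or an action u.
    """
    if kind == 'percept':
        # Measurement update: Bel'(x) = eta * P(z|x) * Bel(x)
        new_bel = np.array([p_z_given_x(d, x) for x in states]) * bel
        new_bel /= new_bel.sum()  # normalization plays the role of eta
    else:
        # Prediction: Bel'(x) = sum_{x'} P(x|u,x') * Bel(x')
        new_bel = np.array([
            sum(p_x_given_ux(x, d, xp) * bel[j] for j, xp in enumerate(states))
            for x in states])
    return new_bel
```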

Piecewise Constant Bel(x)

Problem Statement

Which representations for Bel(x), with matching update rules, work well in practice?

$$Bel(x_t) = \eta\, P(z_t \mid x_t) \int P(x_t \mid u_t, x_{t-1})\, Bel(x_{t-1})\, dx_{t-1}$$

Desirable:
- Accuracy and correctness
- Time and space usage that scales well with the size of the state and the number of dimensions
- Ability to represent a realistic range of motion and measurement models

Part 1: Particle Filters

Intuition: track Bel(x) with adaptively located discrete samples.

Potentials:
- Better accuracy/computation trade-off
- Particles can take the shape of arbitrary distributions

Uses:
- Indoor robotics
- Self-driving cars
- Computer vision
- General tool in learning

Intuitive Example: Localizing During Robocup

Distributions

Consider the distributions $p(x \mid z_i)$, one per sensor reading, on their own. Are these related to our answer?

Distributions

Wanted: samples distributed according to $p(x \mid z_1, z_2, z_3)$.

This is Easy!

We can draw samples from $p(x \mid z_l)$ by adding noise to the detection parameters.

Importance Sampling

As seen, it is often easy to draw samples from one portion of our Bayes filter.

Main trick: importance sampling, i.e. how to estimate properties/statistics of one distribution f given samples from another distribution g. For example, suppose we want to estimate the expected value of f given only samples from g.

Weights describe the mismatch between the two distributions, i.e. how to reweight samples to obtain statistics of f from samples of g.
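A concrete sketch of this estimate; the particular Gaussians chosen for f and g below are illustrative assumptions, not from the slides. Draw from g, weight each sample by f(x)/g(x), and form the self-normalized average:

```python
import numpy as np

rng = np.random.default_rng(0)

def gauss_pdf(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

# Target f: N(1, 0.5^2); proposal g: N(0, 2^2). We can only sample from g.
xs = rng.normal(0.0, 2.0, size=100_000)
w = gauss_pdf(xs, 1.0, 0.5) / gauss_pdf(xs, 0.0, 2.0)  # weights f/g

# Self-normalized importance sampling estimate of E_f[X]; true value is 1.0
print(np.sum(w * xs) / np.sum(w))
```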

Importance Sampling for Robocup

Target distribution f:
$$p(x \mid z_1, z_2, \ldots, z_n) = \frac{\left(\prod_k p(z_k \mid x)\right) p(x)}{p(z_1, z_2, \ldots, z_n)}$$

Sampling distribution g:
$$p(x \mid z_l) = \frac{p(z_l \mid x)\, p(x)}{p(z_l)}$$

Importance weights:
$$w = \frac{f}{g} = \frac{p(x \mid z_1, z_2, \ldots, z_n)}{p(x \mid z_l)} = \frac{p(z_l) \prod_{k \neq l} p(z_k \mid x)}{p(z_1, z_2, \ldots, z_n)}$$

Importance Sampling

Here are all of our $p(x \mid z_i)$ samples, now with weights w attached (not shown). If we re-draw from these samples, weighted by w, we get samples from the target posterior. (Figures: weighted samples; the sample set after resampling.)

Importance Sampling for Bayes Filter

What are the proposal distribution and weighting computations?
- Sample from the propagation step, before the update.
- We want the posterior belief after the update.
- Recall: weighting removes the sampling bias.

This algorithm is known as a particle filter.

Particle Filter Algorithm

1. Actual observation and control received.
2. Particle propagation/prediction: noise needs to be added in order to make particles differentiate from each other. If propagation is deterministic, the particles will collapse to a single particle after a few resampling steps.
3. Weight computation as measurement likelihood: for each particle we compute the probability of the actual observation given that the state is at that particle.
4. Resampling step: the particle locations now have a chance to adapt according to the weights. More likely particles persist, while unlikely choices are removed. (Note: particle deprivation heuristics are not shown here.) A minimal code sketch combining these steps follows.
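Putting the steps above together, one predict-weigh-resample cycle for a 1-D state might look as follows. The additive motion model and the Gaussian measurement likelihood are illustrative assumptions, not the models from the slides.

```python
import numpy as np

rng = np.random.default_rng(1)

def particle_filter_step(particles, u, z, motion_noise=0.1, meas_noise=0.5):
    """One predict-weigh-resample cycle for a 1-D state (illustrative models)."""
    n = len(particles)

    # 1. Propagation: apply the control, adding noise so particles stay diverse
    particles = particles + u + rng.normal(0.0, motion_noise, size=n)

    # 2. Weighting: likelihood of the actual observation z at each particle,
    #    here under the illustrative model z = x + v, v ~ N(0, meas_noise^2)
    w = np.exp(-0.5 * ((z - particles) / meas_noise) ** 2)
    w /= w.sum()

    # 3. Resampling: draw n particles with replacement, proportional to w
    idx = rng.choice(n, size=n, p=w)
    return particles[idx]
```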

Examples: 1D Localization


Resampling

- Given: set S of weighted samples.
- Wanted: a random sample, where the probability of drawing $x_i$ is given by $w_i$.
- Typically done n times with replacement to generate the new sample set S'.

Resampling Carefully

- Roulette wheel selection: binary search over the weight CDF, $O(n \log n)$.
- Stochastic universal sampling (systematic resampling): linear time complexity, easy to implement, low variance.

Resampling Algorithm

1. Algorithm systematic_resampling(S, n):
2.   $S' = \emptyset$, $c_1 = w^1$
3.   For $i = 2 \ldots n$    (Generate CDF)
4.     $c_i = c_{i-1} + w^i$
5.   $u_1 \sim U(0, n^{-1}]$, $i = 1$    (Initialize threshold)
6.   For $j = 1 \ldots n$    (Draw samples)
7.     While $u_j > c_i$    (Skip until next threshold reached)
8.       $i = i + 1$
9.     $S' = S' \cup \{\langle x^i, n^{-1} \rangle\}$    (Insert)
10.    $u_{j+1} = u_j + n^{-1}$    (Increment threshold)
11. Return $S'$

Also called stochastic universal sampling.
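A direct Python translation of the pseudocode above, returning the resampled particle set in O(n). It assumes the weights are already normalized to sum to one.

```python
import numpy as np

def systematic_resampling(particles, weights, rng):
    """Low-variance (systematic / stochastic universal) resampling, O(n)."""
    n = len(particles)
    c = np.cumsum(weights)        # CDF of the (normalized) weights
    c[-1] = 1.0                   # guard against floating-point round-off
    u = rng.uniform(0.0, 1.0 / n) # single random offset
    idx, i = np.empty(n, dtype=int), 0
    for j in range(n):            # n evenly spaced thresholds
        while u > c[i]:           # skip until next threshold reached
            i += 1
        idx[j] = i                # insert particle i
        u += 1.0 / n              # increment threshold
    return particles[idx]         # all resampled weights become 1/n
```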

Particle Motion Model

(Figure: particle set propagated forward from the start pose.)

Proximity Sensor Model Reminder

(Figures: measurement likelihood for a laser sensor and for a sonar sensor.)


Particle Filter Summary

- Very flexible tool, as we get to make our own choice of proposal distribution, as long as we can properly compute the importance weights.
- Performance is guaranteed given infinite samples!
- The particle cloud and its weights represent our distribution, but making decisions can still be complex (see the sketch below):
  - Act based on the most likely particle
  - Act using a weighted summation over particles
  - Act conservatively, accounting for the worst particle
- In practice, the number of particles required to perform well scales with the problem complexity, and this can be hard to measure.
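For instance, the first two decision rules might be sketched as follows (function names are illustrative):

```python
import numpy as np

def most_likely_estimate(particles, weights):
    """Act based on the most likely (highest-weight) particle."""
    return particles[np.argmax(weights)]

def weighted_mean_estimate(particles, weights):
    """Act using a weighted summation over particles (the posterior mean)."""
    return np.average(particles, weights=weights, axis=0)
```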

Part 2: Kalman Filters

Intuition: track Bel(x) with a Gaussian distribution, making simplifying assumptions to ensure updates are all possible.

Payoffs:
- Continuous representation
- Efficient computation

Uses:
- Rocketry
- Mobile devices
- Drones
- GPS
- ...the list is very long


Example: Landing on Mars

Kalman Filter: Approach

The Kalman Filter is an instance of the Bayes Filter with:
- Linear dynamics with Gaussian noise
- Linear observations with Gaussian noise
- Gaussian initial belief

Kalman Filter: Assumptions

Two assumptions inherited from the Bayes Filter, and in addition:
- Linear dynamics and observation models
- Initial belief is Gaussian
- Noise variables and initial state are jointly Gaussian and independent
- Noise variables are independent and identically distributed

Kalman Filter: Why So Many Assumptions?

Linear dynamics and observation models: without linearity there is no closed-form solution for the posterior belief in the Bayes Filter. Recall that if X is Gaussian then Y = AX + b is also Gaussian; this is not true in general if Y = h(X). Also, we will see later that applying Bayes rule to a Gaussian prior and a Gaussian measurement likelihood results in a Gaussian posterior.

Gaussian initial belief, with noise variables and initial state jointly Gaussian and independent: this results in the belief remaining Gaussian after each propagation and update step. This means that we only have to worry about how the mean and the covariance of the belief evolve recursively with each prediction step and update step. COOL!

Noise variables independent and identically distributed: this makes the recursive updates of the mean and covariance much simpler.

Kalman Filter: An Instance of Bayes Filter

The assumptions guarantee that if the belief before the prediction step is Gaussian, then the belief after the prediction step will be Gaussian, and the posterior belief after the update step will be Gaussian.

To simplify notation, write the belief after the prediction step as $\overline{bel}(x_t)$; the bar denotes the estimate at time t given the history of observations and controls up to time t-1. So, under the Kalman Filter assumptions, we get

$$\overline{bel}(x_t) = \mathcal{N}(x_t;\, \bar{\mu}_t, \bar{\Sigma}_t), \qquad bel(x_t) = \mathcal{N}(x_t;\, \mu_t, \Sigma_t).$$

Two main questions remain:
1. How to get the prediction mean and covariance from the prior mean and covariance?
2. How to get the posterior mean and covariance from the prediction mean and covariance?

These questions were answered in the 1960s. The resulting algorithm was used in the Apollo missions to the moon, and in almost every system in which a noisy sensor is involved. COOL!

Kalman Filter with 1D State

Let's start with the update step recursion. Here's an example:
- Suppose your measurement model is $z_t = x_t + v_t$ with $v_t \sim \mathcal{N}(0, Q)$.
- Suppose your belief after the prediction step is $\overline{bel}(x_t) = \mathcal{N}(x_t;\, \bar{\mu}_t = 0, \bar{\sigma}_t^2)$.
- Suppose your first noisy measurement is $z_t = 5$.

Q: What are the mean and covariance of $bel(x_t)$?

Kalman Filter with 1D State: The Update Step

From the Bayes Filter we get

$$bel(x_t) = \eta\, p(z_t \mid x_t)\, \overline{bel}(x_t),$$

so, carrying out the product of the two Gaussians,

$$\mu_t = \bar{\mu}_t + K_t (z_t - \bar{\mu}_t), \qquad \sigma_t^2 = (1 - K_t)\, \bar{\sigma}_t^2, \qquad \text{where } K_t = \frac{\bar{\sigma}_t^2}{\bar{\sigma}_t^2 + Q}.$$

The term $(z_t - \bar{\mu}_t)$ is the prediction residual: the error between the actual observation and the expected observation. You expected the measured mean to be 0, according to your prediction prior, but you actually observed 5. The smaller this prediction error is, the better your estimate will be, i.e. the better it will agree with the measurements.

The Kalman gain $K_t$ specifies how much effect the measurement will have on the posterior, compared to the prediction prior. Which one do you trust more, your prior or your measurement?

In this example the measurement is more confident (lower variance) than the prior, so the posterior mean is going to be closer to 5 than to 0.

No matter what happens, the variance of the posterior is going to be reduced, i.e. a new measurement increases confidence no matter how noisy it is.

In fact, you can write this as

$$\frac{1}{\sigma_t^2} = \frac{1}{\bar{\sigma}_t^2} + \frac{1}{Q},$$

so $\sigma_t^2 \leq \bar{\sigma}_t^2$ and $\sigma_t^2 \leq Q$, i.e. the posterior is more confident than both the prior and the measurement.


Take-home message: new observations, no matter how noisy, always reduce uncertainty in the posterior. The mean of the posterior, on the other hand, only changes when there is a nonzero prediction residual.
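To make this concrete, here is the 1D update computed numerically. The prior mean 0 and observation 5 come from the running example; the variances (4 for the prior, 1 for the measurement) are assumed here for illustration.

```python
# 1-D Kalman update for the measurement model z = x + v, v ~ N(0, Q).
prior_mean, prior_var = 0.0, 4.0   # prediction prior (variance assumed)
z, Q = 5.0, 1.0                    # observation from the slides; Q assumed

K = prior_var / (prior_var + Q)              # Kalman gain = 0.8
post_mean = prior_mean + K * (z - prior_mean)  # 0 + 0.8 * 5 = 4.0, closer to 5
post_var = (1 - K) * prior_var                 # 0.2 * 4 = 0.8 < min(4, 1)
print(post_mean, post_var)
```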

Kalman Filter with 1D State: The Propagation/Prediction Step

Suppose that the dynamics model is $x_t = a\, x_{t-1} + b\, u_t + w_t$ with $w_t \sim \mathcal{N}(0, R)$, and you applied the command $u_t$. Then

$$\bar{\mu}_t = \mathbb{E}[x_t \mid z_{1:t-1}, u_{1:t}] = a\, \mathbb{E}[x_{t-1} \mid z_{1:t-1}, u_{1:t-1}] + b\, u_t + \mathbb{E}[w_t] = a\, \mu_{t-1} + b\, u_t.$$

Recall: this notation means the expected value with respect to the conditional distribution. The control is a constant with respect to the distribution, and the dynamics noise is zero mean and independent of observations and controls.

For the variance,

$$\bar{\sigma}_t^2 = \mathrm{Cov}(x_t \mid z_{1:t-1}, u_{1:t}) = \mathrm{Cov}(a\, x_{t-1} + b\, u_t + w_t).$$

Recall:
- This notation means the covariance with respect to the conditional distribution.
- Covariance neglects the addition of constant terms, i.e. $\mathrm{Cov}(X + b) = \mathrm{Cov}(X)$.
- $\mathrm{Cov}(X + Y) = \mathrm{Cov}(X) + \mathrm{Cov}(Y) + 2\,\mathrm{Cov}(X, Y)$, where we denote $\mathrm{Cov}(X, X) = \mathrm{Cov}(X)$ as a shorthand.
- We assumed the dynamics noise is independent of past measurements, controls, and the state, so the cross-covariance $\mathrm{Cov}(a\, x_{t-1}, w_t)$ is zero.

Putting these together:

$$\bar{\sigma}_t^2 = \mathrm{Cov}(a\, x_{t-1}) + \mathrm{Cov}(w_t) = a^2\, \sigma_{t-1}^2 + R.$$

Take-home message: uncertainty increases after the prediction step, because we are speculating about the future.

Kalman Filter Algorithm

1. Algorithm Kalman_filter($\mu_{t-1}$, $\Sigma_{t-1}$, $u_t$, $z_t$):
2. Prediction:
3.   $\bar{\mu}_t = A_t\, \mu_{t-1} + B_t\, u_t$
4.   $\bar{\Sigma}_t = A_t\, \Sigma_{t-1}\, A_t^T + R_t$
5. Correction:
6.   $K_t = \bar{\Sigma}_t\, C_t^T \left(C_t\, \bar{\Sigma}_t\, C_t^T + Q_t\right)^{-1}$
7.   $\mu_t = \bar{\mu}_t + K_t \left(z_t - C_t\, \bar{\mu}_t\right)$
8.   $\Sigma_t = (I - K_t C_t)\, \bar{\Sigma}_t$
9. Return $\mu_t$, $\Sigma_t$
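The algorithm translates line-for-line into Python with numpy; in this sketch, supplying consistently shaped matrices A, B, C, R, Q is the caller's responsibility.

```python
import numpy as np

def kalman_filter(mu, Sigma, u, z, A, B, C, R, Q):
    """One prediction-correction cycle of the Kalman filter (matrix form)."""
    # Prediction
    mu_bar = A @ mu + B @ u
    Sigma_bar = A @ Sigma @ A.T + R

    # Correction
    S = C @ Sigma_bar @ C.T + Q              # innovation covariance
    K = Sigma_bar @ C.T @ np.linalg.inv(S)   # Kalman gain
    mu_new = mu_bar + K @ (z - C @ mu_bar)
    Sigma_new = (np.eye(len(mu)) - K @ C) @ Sigma_bar
    return mu_new, Sigma_new
```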

The Prediction-Correction Cycle

Prediction:
$$\overline{bel}(x_t):\quad \bar{\mu}_t = A_t\, \mu_{t-1} + B_t\, u_t, \qquad \bar{\Sigma}_t = A_t\, \Sigma_{t-1}\, A_t^T + R_t$$
In 1D: $\bar{\mu}_t = a\, \mu_{t-1} + b\, u_t$, $\quad \bar{\sigma}_t^2 = a^2\, \sigma_{t-1}^2 + \sigma_{\text{act}}^2$.

Correction:
$$bel(x_t):\quad \mu_t = \bar{\mu}_t + K_t (z_t - C_t\, \bar{\mu}_t), \qquad \Sigma_t = (I - K_t C_t)\, \bar{\Sigma}_t, \qquad K_t = \bar{\Sigma}_t C_t^T \left(C_t \bar{\Sigma}_t C_t^T + Q_t\right)^{-1}$$
In 1D: $\mu_t = \bar{\mu}_t + K_t (z_t - \bar{\mu}_t)$, $\quad \sigma_t^2 = (1 - K_t)\, \bar{\sigma}_t^2$, $\quad K_t = \dfrac{\bar{\sigma}_t^2}{\bar{\sigma}_t^2 + \sigma_{\text{obs}}^2}$.

Kalman Filter Summary

- Highly efficient: polynomial in the measurement dimensionality k and state dimensionality n: $O(k^{2.376} + n^2)$.
- Optimal for linear Gaussian systems!
- Most robotics systems are nonlinear!