Two Popular Bayesian Estimators: Particle and Kalman Filters
McGill COMP 765, Sep 14th, 2017
Recall: Bayes Filters
(z = observation, u = action, x = state)

Bel(x_t) = P(x_t | u_1, z_1, ..., u_t, z_t)
         = η P(z_t | x_t, u_1, z_1, ..., u_t) P(x_t | u_1, z_1, ..., u_t)          (Bayes)
         = η P(z_t | x_t) P(x_t | u_1, z_1, ..., u_t)                              (Markov)
         = η P(z_t | x_t) ∫ P(x_t | u_1, z_1, ..., u_t, x_{t-1}) P(x_{t-1} | u_1, z_1, ..., u_t) dx_{t-1}   (Total prob.)
         = η P(z_t | x_t) ∫ P(x_t | u_t, x_{t-1}) P(x_{t-1} | u_1, z_1, ..., z_{t-1}) dx_{t-1}              (Markov)
         = η P(z_t | x_t) ∫ P(x_t | u_t, x_{t-1}) Bel(x_{t-1}) dx_{t-1}
Discrete Bayes Filter Algorithm
1. Algorithm Discrete_Bayes_filter(Bel(x), d):
2.   η = 0
3.   If d is a perceptual data item z then
4.     For all x do
5.       Bel'(x) = P(z | x) Bel(x)
6.       η = η + Bel'(x)
7.     For all x do
8.       Bel'(x) = η⁻¹ Bel'(x)
9.   Else if d is an action data item u then
10.    For all x do
11.      Bel'(x) = Σ_{x'} P(x | u, x') Bel(x')
12.  Return Bel'(x)
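A minimal Python sketch of the algorithm above, assuming a finite state space stored as a dict and user-supplied model functions (the names `p_z_given_x` and `p_x_given_ux` are illustrative, not from the slides):

```python
# Hypothetical sketch of the discrete Bayes filter, assuming a finite state
# space and user-supplied models P(z|x) and P(x|u,x') given as functions.

def discrete_bayes_filter(bel, d, kind, p_z_given_x=None, p_x_given_ux=None):
    """bel: dict mapping state -> probability.
    kind: 'percept' (d is an observation z) or 'action' (d is a control u)."""
    if kind == 'percept':
        # Measurement update: reweigh by likelihood, then normalize by eta.
        new_bel = {x: p_z_given_x(d, x) * p for x, p in bel.items()}
        eta = sum(new_bel.values())
        return {x: p / eta for x, p in new_bel.items()}
    else:
        # Prediction: push the belief through the motion model.
        return {x: sum(p_x_given_ux(x, d, xp) * bel[xp] for xp in bel)
                for x in bel}

# Toy two-state example (made-up numbers, for illustration only).
bel = {'a': 0.5, 'b': 0.5}
likelihood = lambda z, x: 0.8 if x == 'a' else 0.2   # P(z|x)
posterior = discrete_bayes_filter(bel, 'z0', 'percept', p_z_given_x=likelihood)
```

Note how the perceptual branch implements lines 4-8 (weight, then normalize) and the action branch implements line 11's sum over predecessor states.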
Piecewise Constant Bel(x)
Problem Statement
What representations for Bel(x), with matching update rules, work well in practice?

Bel(x_t) = η P(z_t | x_t) ∫ P(x_t | u_t, x_{t-1}) Bel(x_{t-1}) dx_{t-1}

Desirable:
- Accuracy and correctness
- Time and space usage that scales well with the size of the state and the number of dimensions
- Represents a realistic range of motion and measurement models
Part 1: Particle Filters
Intuition: track Bel(x) with adaptively located discrete samples
Potential advantages:
- Better accuracy/computation trade-off
- Particles can take the shape of arbitrary distributions
Uses: indoor robotics, self-driving cars, computer vision, general tool in learning
Intuitive Example: Localizing During Robocup
Distributions
Consider the distributions from each p(x | z_i) alone. Are these related to our answer?
Distributions
Wanted: samples distributed according to p(x | z_1, z_2, z_3)
This is Easy!
We can draw samples from p(x | z_l) by adding noise to the detection parameters.
Importance Sampling
As seen, it is often easy to draw samples from one portion of our Bayes filter.
Main trick: importance sampling, i.e. how to estimate properties/statistics of one distribution f given samples from another distribution g.
For example, suppose we want to estimate the expected value of f given only samples from g.
Importance Sampling
Weights describe the mismatch between the two distributions, i.e. how to reweigh samples to obtain statistics of f from samples of g.
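The trick can be sketched in a few lines. The two densities below (f uniform on [0, 2], g uniform on [0, 4]) are hypothetical, chosen only so the true answer is known:

```python
# Importance sampling sketch: estimate E_f[h(x)] using only samples from g.
import random

def importance_estimate(h, f_pdf, g_pdf, g_sampler, n=100_000):
    total = 0.0
    for _ in range(n):
        x = g_sampler()
        w = f_pdf(x) / g_pdf(x)   # importance weight corrects the f/g mismatch
        total += w * h(x)
    return total / n

random.seed(0)
f_pdf = lambda x: 0.5 if 0.0 <= x <= 2.0 else 0.0   # f: uniform on [0, 2]
g_pdf = lambda x: 0.25                               # g: uniform on [0, 4]
g_sampler = lambda: random.uniform(0.0, 4.0)
est = importance_estimate(lambda x: x, f_pdf, g_pdf, g_sampler)
# True value E_f[x] = 1.0; est should land close to it.
```

Samples that fall where f has no mass get weight zero; the rest are up-weighted by f/g, exactly the reweighing described above.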
Importance Sampling for Robocup
Target distribution f:
  p(x | z_1, z_2, ..., z_n) = ( Π_k p(z_k | x) ) p(x) / p(z_1, z_2, ..., z_n)
Sampling distribution g:
  p(x | z_l) = p(z_l | x) p(x) / p(z_l)
Importance weights:
  w = f / g = p(x | z_1, ..., z_n) / p(x | z_l) = p(z_l) Π_k p(z_k | x) / ( p(z_l | x) p(z_1, ..., z_n) )
Importance Sampling
Here are all of our p(x | z_i) samples, now with weights w attached (not shown). If we re-draw from these samples, weighted by w, we get:
[Figure: weighted samples → after resampling]
Importance Sampling for the Bayes Filter
What are the proposal distribution and the weighting computations?
- Sample from the propagation step, before the update
- Want the posterior belief, after the update
- Recall: weighting removes the sample bias
Importance Sampling for the Bayes Filter
Sample from the propagation step (before the update); want the posterior belief (after the update). This algorithm is known as a particle filter.
Particle Filter Algorithm
Actual observation and control received.
Particle Filter Algorithm
Particle propagation/prediction: noise needs to be added in order to differentiate the particles from each other. If propagation is deterministic, the particles will collapse to a single particle after a few resampling steps.
Particle Filter Algorithm
Weight computation as measurement likelihood: for each particle we compute the probability of the actual observation given that the state is at that particle.
Particle Filter Algorithm
Resampling step. Note: particle deprivation heuristics are not shown here.
Particle Filter Algorithm
Resampling: the particle locations now have a chance to adapt according to the weights. More likely particles persist, while unlikely choices are removed.
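The three steps above (propagate, weight, resample) can be sketched for a hypothetical 1D robot. The Gaussian noise levels and the direct position measurement are illustrative assumptions, not the slides' model:

```python
# One full particle-filter step for a hypothetical 1D robot.
import math
import random

def pf_step(particles, u, z, motion_noise=0.1, meas_noise=0.5):
    # 1. Propagation: move each particle, adding noise so particles differentiate.
    particles = [x + u + random.gauss(0.0, motion_noise) for x in particles]
    # 2. Weighting: likelihood of the actual observation z at each particle.
    weights = [math.exp(-0.5 * ((z - x) / meas_noise) ** 2) for x in particles]
    total = sum(weights)
    weights = [w / total for w in weights]
    # 3. Resampling with replacement: likely particles persist, unlikely ones die.
    return random.choices(particles, weights=weights, k=len(particles))

random.seed(1)
particles = [random.uniform(-10.0, 10.0) for _ in range(500)]
for _ in range(10):
    particles = pf_step(particles, u=0.0, z=3.0)
mean = sum(particles) / len(particles)   # cloud should concentrate near z = 3.0
```

Note that step 1 adds noise even for u = 0; with deterministic propagation the cloud would collapse to a single particle, as warned above.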
Examples: 1D Localization
Resampling
Given: a set S of weighted samples.
Wanted: a random sample, where the probability of drawing x_i is given by w_i.
Typically done n times with replacement to generate the new sample set S'.
Resampling Carefully
- Roulette wheel selection: binary search, O(n log n)
- Stochastic universal sampling (systematic resampling): linear time complexity; easy to implement, low variance
Resampling Algorithm
1. Algorithm systematic_resampling(S, n):
2.   S' = ∅, c_1 = w^1
3.   For i = 2 ... n                      // Generate cdf
4.     c_i = c_{i-1} + w^i
5.   u_1 ~ U(0, 1/n], i = 1               // Initialize threshold
6.   For j = 1 ... n                      // Draw samples
7.     While u_j > c_i                    // Skip until next threshold reached
8.       i = i + 1
9.     S' = S' ∪ {(x^i, 1/n)}             // Insert
10.    u_{j+1} = u_j + 1/n                // Increment threshold
11.  Return S'
Also called stochastic universal sampling.
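A direct transcription of that pseudocode: one random offset, n evenly spaced thresholds, and a single pass over the cumulative weights. The toy weights are made up for illustration:

```python
# Systematic (stochastic universal) resampling: linear time, low variance.
import random

def systematic_resampling(samples, weights):
    n = len(samples)
    # Build the cumulative distribution over the weights (lines 2-4).
    cdf, c = [], 0.0
    for w in weights:
        c += w
        cdf.append(c)
    u = random.uniform(0.0, 1.0 / n)   # initial threshold (line 5)
    out, i = [], 0
    for _ in range(n):                 # draw n samples (line 6)
        while u > cdf[i]:              # skip until next threshold reached (7-8)
            i += 1
        out.append(samples[i])         # insert; each output gets weight 1/n (9)
        u += 1.0 / n                   # increment threshold (line 10)
    return out

random.seed(0)
out = systematic_resampling(['a', 'b', 'c'], [0.1, 0.8, 0.1])
# 'b' carries 80% of the mass, so it dominates the resampled set.
```

Because the thresholds are evenly spaced, a sample with weight w appears either ⌊nw⌋ or ⌈nw⌉ times, which is the low-variance property claimed above.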
Particle Motion Model
[Figure: particle cloud propagated from the start pose]
Proximity Sensor Model Reminder
Laser sensor; sonar sensor.
Particle Filter Summary
- Very flexible tool, as we get to make our own choice of proposal distribution, as long as we can properly compute the importance weights
- Performance is guaranteed given infinite samples!
- The particle cloud and its weights represent our distribution, but making decisions can still be complex:
  - Act based on the most likely particle
  - Act using a weighted summation over particles
  - Act conservatively, accounting for the worst particle
- In practice, the number of particles required to perform well scales with the problem complexity, and this can be hard to measure
Part 2: Kalman Filters
Intuition: track Bel(x) with a Gaussian distribution, with simplifying assumptions to ensure that the updates are all possible
Payoffs:
- Continuous representation
- Efficient computation
Uses: rocketry, mobile devices, drones, GPS ... the list is very long
Example: Landing on Mars
Kalman Filter: Approach
The Kalman Filter is an instance of the Bayes Filter with:
- Linear dynamics with Gaussian noise
- Linear observations with Gaussian noise
- Gaussian initial belief
Kalman Filter: Assumptions
(In addition to the two assumptions inherited from the Bayes Filter:)
- Linear dynamics and observation models
- Initial belief is Gaussian
- Noise variables and the initial state are jointly Gaussian and independent
- Noise variables are independent and identically distributed
Kalman Filter: Why So Many Assumptions?
Without linearity there is no closed-form solution for the posterior belief in the Bayes Filter. Recall that if X is Gaussian then Y = AX + b is also Gaussian; this is not true in general if Y = h(X). Also, we will see later that applying Bayes' rule to a Gaussian prior and a Gaussian measurement likelihood results in a Gaussian posterior.
Kalman Filter: Why So Many Assumptions?
The assumptions result in the belief remaining Gaussian after each propagation and update step. This means that we only have to track how the mean and the covariance of the belief evolve recursively with each prediction step and update step. COOL!
Kalman Filter: Why So Many Assumptions?
The i.i.d. noise assumption makes the recursive updates of the mean and covariance much simpler.
Kalman Filter: an Instance of the Bayes Filter
The assumptions guarantee that if the belief before the prediction step is Gaussian, then the belief after the prediction step will be Gaussian, and the posterior belief after the update step will be Gaussian.
Kalman Filter: an Instance of the Bayes Filter
To simplify notation, write bel(x_t) for the belief after the prediction step, i.e. the estimate at time t given the history of observations and controls up to time t-1. Under the Kalman Filter assumptions, this belief is Gaussian.
Kalman Filter: an Instance of the Bayes Filter
So, under the Kalman Filter assumptions, two main questions remain:
1. How to get the prediction mean and covariance from the prior mean and covariance?
2. How to get the posterior mean and covariance from the prediction mean and covariance?
These questions were answered in the 1960s. The resulting algorithm was used in the Apollo missions to the moon, and in almost every system in which a noisy sensor is involved. COOL!
Kalman Filter with 1D State
Let's start with the update step recursion. Here's an example: suppose your measurement model is linear with Gaussian noise, your belief after the prediction step is Gaussian with mean 0, and your first noisy measurement is z = 5.
Q: What are the mean and covariance of the posterior belief?
Kalman Filter with 1D State: the Update Step
From the Bayes Filter, the posterior is proportional to the product of the Gaussian measurement likelihood and the Gaussian predicted belief, which is again Gaussian:

μ = μ̄ + K (z − μ̄),   σ² = (1 − K) σ̄²,   where K = σ̄² / (σ̄² + σ²_obs)
Kalman Filter with 1D State: the Update Step
The term (z − μ̄) is the prediction residual: the error between the actual observation and the expected observation. You expected the measured mean to be 0, according to your prediction prior, but you actually observed 5. The smaller this prediction error is, the better your estimate will be, i.e. the better it will agree with the measurements.
Kalman Filter with 1D State: the Update Step
K is the Kalman gain: it specifies how much effect the measurement has on the posterior, compared to the prediction prior. Which one do you trust more, your prior or your measurement?
Kalman Filter with 1D State: the Update Step
The measurement is more confident (lower variance) than the prior, so the posterior mean is going to be closer to 5 than to 0.
Kalman Filter with 1D State: the Update Step
No matter what happens, the variance of the posterior is going to be reduced. I.e. a new measurement increases confidence no matter how noisy it is.
Kalman Filter with 1D State: the Update Step
In fact you can write the posterior variance as 1/σ² = 1/σ̄² + 1/σ²_obs, so σ² < σ̄² and σ² < σ²_obs. I.e. the posterior is more confident than both the prior and the measurement.
Kalman Filter with 1D State: the Update Step
In this example: [worked computation shown on the slide]
Kalman Filter with 1D State: the Update Step
Another example: [worked computation shown on the slide]
Kalman Filter with 1D State: the Update Step
Take-home message: new observations, no matter how noisy, always reduce the uncertainty in the posterior. The mean of the posterior, on the other hand, only changes when there is a nonzero prediction residual.
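The 1D update step can be sketched directly. The numbers follow the running example (prior mean 0, observation z = 5); the prior variance 1.0 and measurement variance 0.25 are assumed values for illustration:

```python
# 1D Kalman update: fuse a prior N(mu_bar, var_bar) with a measurement z
# of variance var_obs.

def kalman_update_1d(mu_bar, var_bar, z, var_obs):
    K = var_bar / (var_bar + var_obs)    # Kalman gain: trust in the measurement
    mu = mu_bar + K * (z - mu_bar)       # shift mean by the weighted residual
    var = (1.0 - K) * var_bar            # variance shrinks no matter what
    return mu, var

mu, var = kalman_update_1d(mu_bar=0.0, var_bar=1.0, z=5.0, var_obs=0.25)
# The measurement is more confident (0.25 < 1.0), so mu lands closer to 5
# than to 0, and var is smaller than both 1.0 and 0.25.
```

With these numbers K = 0.8, so the posterior mean moves 80% of the way from 0 toward 5, matching the "which do you trust more" intuition above.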
Kalman Filter with 1D State: the Propagation/Prediction Step
Suppose that the dynamics model is x_t = a x_{t-1} + b u_t + ε_t and you applied the command u_t. Then μ̄_t = a μ_{t-1} + b u_t.
Recall: this notation means the expected value with respect to the conditional distribution. The control is a constant with respect to the distribution; the dynamics noise is zero mean, and independent of the observations and controls.
Kalman Filter with 1D State: the Propagation/Prediction Step
Recall: the predicted variance σ̄²_t is the covariance with respect to the same conditional distribution.
Kalman Filter with 1D State: the Propagation/Prediction Step
Recall: covariance neglects the addition of constant terms, i.e. Cov(X + b) = Cov(X).
Kalman Filter with 1D State: the Propagation/Prediction Step
Recall: Cov(X + Y) = Cov(X) + Cov(Y) + 2 Cov(X, Y). Also, we denote Cov(X, X) = Cov(X) as a shorthand.
Kalman Filter with 1D State: the Propagation/Prediction Step
We assumed the dynamics noise is independent of past measurements and controls, and independent of the state, so this cross-covariance term is zero.
Kalman Filter with 1D State: the Propagation/Prediction Step
Putting these together: σ̄²_t = a² σ²_{t-1} + σ²_act, i.e. the predicted variance is the prior variance scaled by a², plus the dynamics noise variance.
Kalman Filter with 1D State: the Propagation/Prediction Step
Take-home message: uncertainty increases after the prediction step, because we are speculating about the future.
Kalman Filter Algorithm
1. Algorithm Kalman_filter(μ_{t-1}, Σ_{t-1}, u_t, z_t):
2. Prediction:
3.   μ̄_t = A_t μ_{t-1} + B_t u_t
4.   Σ̄_t = A_t Σ_{t-1} A_tᵀ + R_t
5. Correction:
6.   K_t = Σ̄_t C_tᵀ (C_t Σ̄_t C_tᵀ + Q_t)⁻¹
7.   μ_t = μ̄_t + K_t (z_t − C_t μ̄_t)
8.   Σ_t = (I − K_t C_t) Σ̄_t
9. Return μ_t, Σ_t
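A hypothetical end-to-end run of this algorithm in 1D (scalar A = a, B = b, C = c); the noise levels R, Q, the constant command, and the simulated world are all made-up values for illustration:

```python
# 1D Kalman filter: prediction followed by correction, per the algorithm.
import random

def kalman_filter(mu, var, u, z, a=1.0, b=1.0, c=1.0, R=0.04, Q=0.25):
    # Prediction
    mu_bar = a * mu + b * u
    var_bar = a * a * var + R                 # uncertainty grows
    # Correction
    K = var_bar * c / (c * var_bar * c + Q)   # Kalman gain
    mu = mu_bar + K * (z - c * mu_bar)        # shift by the residual
    var = (1.0 - K * c) * var_bar             # uncertainty shrinks
    return mu, var

random.seed(2)
true_x, mu, var = 0.0, 0.0, 1.0
for _ in range(50):
    u = 0.1                                   # constant commanded step
    true_x += u + random.gauss(0.0, 0.2)      # simulated true state
    z = true_x + random.gauss(0.0, 0.5)       # noisy measurement of it
    mu, var = kalman_filter(mu, var, u, z)
# mu should track true_x; var settles to a small steady-state value.
```

Note how each iteration inflates the variance in the prediction (speculating about the future) and then deflates it in the correction, the cycle described on the following slides.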
The Prediction-Correction Cycle: Prediction
bel(x_t):  μ̄_t = A_t μ_{t-1} + B_t u_t,   Σ̄_t = A_t Σ_{t-1} A_tᵀ + R_t
In 1D:  μ̄ = a μ + b u,   σ̄² = a² σ² + σ²_act
The Prediction-Correction Cycle: Correction
bel(x_t):  μ_t = μ̄_t + K_t (z_t − C_t μ̄_t),   Σ_t = (I − K_t C_t) Σ̄_t,   K_t = Σ̄_t C_tᵀ (C_t Σ̄_t C_tᵀ + Q_t)⁻¹
In 1D:  μ = μ̄ + K (z − μ̄),   σ² = (1 − K) σ̄²,   K = σ̄² / (σ̄² + σ²_obs)
The Prediction-Correction Cycle
The full cycle alternates between the prediction step and the correction step at every time t.
Kalman Filter Summary
- Highly efficient: polynomial in measurement dimensionality k and state dimensionality n: O(k^2.376 + n^2)
- Optimal for linear Gaussian systems!
- But most robotics systems are nonlinear!