Particle Filters++
Pieter Abbeel
UC Berkeley EECS
Many slides adapted from Thrun, Burgard and Fox, Probabilistic Robotics

Sequential Importance Resampling (SIR) Particle Filter

1.  Algorithm particle_filter(S_{t-1}, u_t, z_t):
2.    S_t = {}, eta = 0
3.    For i = 1 ... n                                   // generate new samples
4.      Sample index j(i) from the discrete distribution given by w_{t-1}
5.      Sample x_t^i from p(x_t | x_{t-1}^{j(i)}, u_t)
6.      w_t^i = p(z_t | x_t^i)                          // compute importance weight
7.      eta = eta + w_t^i                               // update normalization factor
8.      S_t = S_t U { <x_t^i, w_t^i> }                  // insert
9.    For i = 1 ... n
10.     w_t^i = w_t^i / eta                             // normalize weights
11.   Return S_t
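A minimal Python sketch of the SIR step above, for a 1-D state with user-supplied motion and measurement models. The tracking scenario at the bottom (noise levels, priors) is an invented illustration, not from the slides:

```python
import numpy as np

def sir_step(particles, weights, u, z, motion_sample, measurement_likelihood, rng):
    """One SIR particle-filter update (algorithm lines 1-11 above).

    particles: array of states x_{t-1}^i; weights: normalized w_{t-1}^i.
    motion_sample(x, u, rng) draws from p(x_t | x_{t-1}, u_t);
    measurement_likelihood(z, x) evaluates p(z_t | x_t).
    """
    n = len(particles)
    # Line 4: sample indices j(i) from the discrete distribution given by w_{t-1}
    j = rng.choice(n, size=n, p=weights)
    # Line 5: sample x_t^i from p(x_t | x_{t-1}^{j(i)}, u_t)
    new_particles = np.array([motion_sample(particles[ji], u, rng) for ji in j])
    # Lines 6-7: importance weights w_t^i = p(z_t | x_t^i) and normalizer eta
    w = np.array([measurement_likelihood(z, x) for x in new_particles])
    # Line 10: normalize the weights
    w = w / w.sum()
    return new_particles, w

# Illustrative 1-D tracking problem (all model parameters are assumptions):
rng = np.random.default_rng(0)
true_x = 5.0
particles = rng.uniform(0.0, 10.0, size=500)
weights = np.full(500, 1.0 / 500)
motion = lambda x, u, rng: x + u + rng.normal(0.0, 0.1)   # p(x_t | x_{t-1}, u_t)
lik = lambda z, x: np.exp(-0.5 * ((z - x) / 0.5) ** 2)    # p(z_t | x_t), sigma = 0.5
for _ in range(20):
    z = true_x + rng.normal(0.0, 0.5)
    particles, weights = sir_step(particles, weights, 0.0, z, motion, lik, rng)
estimate = float(np.sum(weights * particles))
```

After a handful of observations the weighted particle mean concentrates near the true state.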
Outline

- Improved Sampling
  - Issue with the vanilla particle filter when noise is dominated by the motion model
  - Importance Sampling
  - Optimal Proposal
  - Examples
- Resampling
- Particle Deprivation
- Noise-free Sensors
- Adapting the Number of Particles: KLD Sampling

Noise Dominated by Motion Model

[Grisetti, Stachniss, Burgard, T-RO 2006]
When the motion model is much noisier than the measurement model, most particles
get (near) zero weights and are lost.
Importance Sampling

Theoretical justification: for any function f we have

  E_p[f(x)] = integral f(x) p(x) dx
            = integral f(x) (p(x) / pi(x)) pi(x) dx
            = E_pi[ f(x) p(x) / pi(x) ]

f could be: whether a grid cell is occupied or not, whether the position of a
robot is within 5 cm of some (x, y), etc.

Importance Sampling

Task: sample from density p(.)
Solution: sample from a proposal density pi(.)
Weight each sample x^(i) by p(x^(i)) / pi(x^(i))
Requirement: if pi(x) = 0 then p(x) = 0.
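A small self-normalized importance-sampling sketch. The target p = N(0,1), proposal pi = N(0, 2^2), and test function f(x) = x^2 are illustrative choices, not from the slides; note the proposal satisfies the requirement pi(x) > 0 wherever p(x) > 0:

```python
import numpy as np

rng = np.random.default_rng(1)

# Target density p = N(0, 1); proposal density pi = N(0, 2^2)
def p(x):
    return np.exp(-0.5 * x ** 2) / np.sqrt(2 * np.pi)

def pi(x):
    return np.exp(-0.5 * (x / 2) ** 2) / (2 * np.sqrt(2 * np.pi))

xs = rng.normal(0.0, 2.0, size=200_000)   # sample from the proposal pi
w = p(xs) / pi(xs)                        # importance weights p(x) / pi(x)
# Self-normalized estimate of E_p[x^2], which equals 1 for p = N(0, 1)
est = float(np.sum(w * xs ** 2) / np.sum(w))
```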
Particle Filters Revisited

1.  Algorithm particle_filter(S_{t-1}, u_t, z_t):
2.    S_t = {}, eta = 0
3.    For i = 1 ... n                                   // generate new samples
4.      Sample index j(i) from the discrete distribution given by w_{t-1}
5.      Sample x_t^i from pi(x_t | x_{t-1}^{j(i)}, u_t, z_t)
6.      w_t^i = p(z_t | x_t^i) p(x_t^i | x_{t-1}^i, u_t) / pi(x_t^i | x_{t-1}^i, u_t, z_t)
                                                        // compute importance weight
7.      eta = eta + w_t^i                               // update normalization factor
8.      S_t = S_t U { <x_t^i, w_t^i> }                  // insert
9.    For i = 1 ... n
10.     w_t^i = w_t^i / eta                             // normalize weights
11.   Return S_t

Optimal Sequential Proposal pi(.)

Optimal:  pi(x_t | x_{t-1}^i, u_t, z_t) = p(x_t | x_{t-1}^i, u_t, z_t)

-> Applying Bayes' rule to the denominator gives:

  p(x_t | x_{t-1}^i, u_t, z_t) = p(z_t | x_t) p(x_t | x_{t-1}^i, u_t) / p(z_t | x_{t-1}^i, u_t)

Substitution into line 6 and simplification gives

  w_t^i = p(z_t | x_{t-1}^i, u_t)
Optimal Proposal pi(.)

Optimal:  pi(x_t | x_{t-1}^i, u_t, z_t) = p(x_t | x_{t-1}^i, u_t, z_t)

-> Challenges:
- Typically difficult to sample from p(x_t | x_{t-1}^i, u_t, z_t)
- Importance weight: typically expensive to compute the integral

    w_t^i = p(z_t | x_{t-1}^i, u_t) = integral p(z_t | x_t) p(x_t | x_{t-1}^i, u_t) dx_t

Example 1: pi(.) = Optimal Proposal
Nonlinear Gaussian State Space Model

Nonlinear Gaussian state space model:

  x_t = f(x_{t-1}, u_t) + w_t,   w_t ~ N(0, Q)
  z_t = C x_t + v_t,             v_t ~ N(0, R)

Then the optimal proposal is Gaussian:

  p(x_t | x_{t-1}^i, u_t, z_t) = N(x_t ; mu_t^i, Sigma)

with

  Sigma  = (Q^{-1} + C' R^{-1} C)^{-1}
  mu_t^i = Sigma (Q^{-1} f(x_{t-1}^i, u_t) + C' R^{-1} z_t)

And the importance weight is Gaussian as well:

  w_t^i = p(z_t | x_{t-1}^i, u_t) = N(z_t ; C f(x_{t-1}^i, u_t), C Q C' + R)
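For intuition, here is the scalar special case (a = 1 dynamics, identity measurement; all parameter values are assumptions for illustration), where both the optimal proposal and its weight are closed-form Gaussians:

```python
import numpy as np

# 1-D linear-Gaussian model, an illustrative special case:
#   x_t = a * x_{t-1} + u_t + w,  w ~ N(0, q)      (motion)
#   z_t = x_t + v,                v ~ N(0, r)      (measurement)
a, q, r = 1.0, 0.5, 0.2

def optimal_proposal(x_prev, u, z):
    """p(x_t | x_{t-1}, u_t, z_t): a precision-weighted combination of the
    motion prediction and the measurement. Returns (mean, variance)."""
    pred = a * x_prev + u
    var = 1.0 / (1.0 / q + 1.0 / r)
    mean = var * (pred / q + z / r)
    return mean, var

def optimal_weight(x_prev, u, z):
    """Importance weight p(z_t | x_{t-1}, u_t) = N(z; a x_{t-1} + u, q + r).
    Note it does not depend on the sampled x_t."""
    pred = a * x_prev + u
    s = q + r
    return float(np.exp(-0.5 * (z - pred) ** 2 / s) / np.sqrt(2 * np.pi * s))

mean, var = optimal_proposal(0.0, 0.0, 1.0)
```

The proposal mean lies between the motion prediction and the measurement, and its variance is smaller than either noise variance, which is why the optimal proposal places particles where the observation is likely.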
Example 2: pi(.) = Motion Model

  pi(x_t | x_{t-1}^i, u_t, z_t) = p(x_t | x_{t-1}^i, u_t)
  w_t^i = p(z_t | x_t^i)

-> the standard particle filter

Example 3: Approximating the Optimal pi for Localization

[Grisetti, Stachniss, Burgard, T-RO 2006]

One (not so desirable) solution: use a smoothed likelihood such that more
particles retain a meaningful weight --- BUT information is lost.

Better: integrate the latest observation z into the proposal pi.
Example 3: Approximating the Optimal pi for Localization:
Generating One Weighted Sample

1. Initial guess: the motion-model prediction from x_{t-1}^i and u_t.
2. Execute scan matching starting from the initial guess, resulting in a pose
   estimate x*.
3. Sample K points {x^1, ..., x^K} in a region around x*.
4. The proposal distribution is a Gaussian with mean and covariance:

     mu    = (1/eta) sum_{j=1}^K x^j p(z_t | x^j, m) p(x^j | x_{t-1}^i, u_t)
     Sigma = (1/eta) sum_{j=1}^K (x^j - mu)(x^j - mu)' p(z_t | x^j, m) p(x^j | x_{t-1}^i, u_t)

   with eta = sum_{j=1}^K p(z_t | x^j, m) p(x^j | x_{t-1}^i, u_t).
5. Sample from the (approximately optimal) proposal distribution.
6. Weight = eta.

Scan Matching

Compute the pose that best aligns the scan with the map, e.g., using gradient
descent on the measurement likelihood:

  P(z | x, m) = prod_{k=1}^K P(z_k | x, m)

  P(z_k | x, m) = [ alpha_hit, alpha_unexp, alpha_max, alpha_rand ] .
                  [ P_hit(z_k | x, m), P_unexp(z_k | x, m),
                    P_max(z_k | x, m), P_rand(z_k | x, m) ]'
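The beam-model likelihood above is an inner product per beam, multiplied across beams. A minimal sketch (the alpha values are illustrative assumptions; they must be nonnegative and sum to one):

```python
import numpy as np

# Mixture weights [alpha_hit, alpha_unexp, alpha_max, alpha_rand]
# (assumed values for illustration; must be nonnegative and sum to 1)
alpha = np.array([0.7, 0.1, 0.1, 0.1])

def beam_likelihood(components):
    """P(z_k | x, m): inner product of alpha with the four per-beam component
    densities [P_hit, P_unexp, P_max, P_rand] evaluated at z_k."""
    return float(alpha @ np.asarray(components, dtype=float))

def scan_likelihood(per_beam_components):
    """P(z | x, m) = product over the K beams of P(z_k | x, m)."""
    return float(np.prod([beam_likelihood(c) for c in per_beam_components]))

single = beam_likelihood([1.0, 0.0, 0.0, 0.0])   # only the "hit" component fires
full = scan_likelihood([[1.0, 1.0, 1.0, 1.0]] * 2)
```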
Example 3: Example Particle Distributions

[Grisetti, Stachniss, Burgard, T-RO 2006]
Particles generated from the approximately optimal proposal distribution.
If using the standard motion model, in all three cases the particle set would
have been similar to (c).

Resampling

Consider running a particle filter for a system with deterministic dynamics
and no sensors.

Problem: While no information is obtained that favors one particle over
another, due to resampling some particles will disappear, and after running
sufficiently long, with very high probability all particles will have become
identical. On the surface it might look like the particle filter has uniquely
determined the state.

Resampling induces loss of diversity. The variance of the particles decreases,
while the variance of the particle set as an estimator of the true belief
increases.
Resampling Solution I

Effective sample size (for normalized weights w^i):

  n_eff = 1 / sum_i (w^i)^2

Example:
- All weights = 1/N  ->  effective sample size = N
- All weights = 0, except for one weight = 1  ->  effective sample size = 1

Idea: resample only when the effective sample size is low.
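The effective sample size is a one-liner; the N/2 resampling trigger shown in the usage comment is a common heuristic, an assumption here rather than something stated on the slide:

```python
import numpy as np

def effective_sample_size(weights):
    """n_eff = 1 / sum_i (w^i)^2, after normalizing the weights."""
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()
    return float(1.0 / np.sum(w ** 2))

# Usage: resample only when n_eff < N / 2 (a common heuristic threshold)
ess_uniform = effective_sample_size([0.25, 0.25, 0.25, 0.25])  # -> 4.0
ess_degenerate = effective_sample_size([0.0, 0.0, 1.0, 0.0])   # -> 1.0
```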
Resampling Solution II: Low Variance Sampling

M = number of particles
r in [0, 1/M]
Draw a single random number r, then select the M particles whose cumulative
weight intervals contain the equally spaced pointers r + (m - 1)/M, m = 1 ... M.

Advantages:
- More systematic coverage of the space of samples
- If all samples have the same importance weight, no samples are lost
- Lower computational complexity

Resampling Solution III

Loss of diversity is caused by resampling from a discrete distribution.

Solution: regularization
- Consider the particles to represent a continuous density
- Sample from that continuous density
- E.g., given (1-D) particles {x^1, ..., x^M}, sample from a kernel density
  estimate placed on the particles.
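Low-variance (systematic) resampling can be sketched as follows; note that with equal weights every particle survives exactly once, and a degenerate weight vector selects only the surviving particle:

```python
import numpy as np

def low_variance_resample(particles, weights, rng):
    """Systematic resampling: one random offset r in [0, 1/M), then M equally
    spaced pointers r + m/M walked through the cumulative weights."""
    M = len(particles)
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()
    r = rng.uniform(0.0, 1.0 / M)
    pointers = r + np.arange(M) / M          # M equally spaced pointers
    cumsum = np.cumsum(w)
    idx = np.searchsorted(cumsum, pointers)  # interval containing each pointer
    return [particles[i] for i in idx]

rng = np.random.default_rng(0)
resampled_equal = low_variance_resample([0, 1, 2, 3], [1, 1, 1, 1], rng)
resampled_degen = low_variance_resample([0, 1, 2, 3], [0, 0, 1, 0], rng)
```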
Particle Deprivation

Particle deprivation = when there are no particles in the vicinity of the
correct state.

Occurs as the result of the variance in random sampling. An unlucky series of
random numbers can wipe out all particles near the true state. This has
non-zero probability at each time step -> it will happen eventually.

Popular solution: add a small number of randomly generated particles when
resampling.
- Advantages: reduces particle deprivation; simplicity.
- Con: incorrect posterior estimate, even in the limit of infinitely many
  particles.
- Other benefit: the initialization at time 0 might not have gotten anything
  near the true state, and not even near a state that over time could have
  evolved to be close to the true state now; adding random samples will cut
  out particles that were not very consistent with past evidence anyway, and
  instead gives a new chance at getting close to the true state.

Particle Deprivation: How Many Particles to Add?

Simplest: a fixed number.

Better way: monitor the probability of the sensor measurements, which can be
approximated by the average unnormalized importance weight (1/N) sum_i w_t^i.
Average this estimate over multiple time steps and compare to typical values
obtained when having reasonable state estimates. If low, inject random
particles.
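A sketch of the monitoring-plus-injection idea; the 0.5 trigger threshold and the 5% injection fraction are illustrative assumptions, not values from the slides:

```python
import numpy as np

def maybe_inject_random(particles, avg_likelihood, typical_likelihood,
                        sample_random_state, rng, frac=0.05):
    """If the (time-averaged) observation likelihood drops well below its
    typical value, replace a small fraction of particles with random states.

    The 0.5 factor and frac=0.05 are illustrative tuning choices.
    sample_random_state(rng) draws a state uniformly over the state space.
    """
    if avg_likelihood < 0.5 * typical_likelihood:
        n_new = max(1, int(frac * len(particles)))
        idx = rng.choice(len(particles), size=n_new, replace=False)
        for i in idx:
            particles[i] = sample_random_state(rng)
    return particles

rng = np.random.default_rng(0)
# Low observed likelihood -> 5% of the 100 particles are replaced
injected = maybe_inject_random([0.0] * 100, 0.1, 1.0, lambda rng: 99.0, rng)
# Healthy likelihood -> particle set is left untouched
untouched = maybe_inject_random([0.0] * 100, 1.0, 1.0, lambda rng: 99.0, rng)
```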
Noise-free Sensors

Consider a measurement obtained with a noise-free sensor, e.g., a noise-free
laser range finder. What is the issue?

All particles would end up with weight zero, as it is very unlikely to have
had a particle matching the measurement exactly.

Solutions:
- Artificially inflate the amount of noise in the sensor model
- Use a better proposal distribution (see the first section of this set of
  slides)
Adapting the Number of Particles: KLD-Sampling

E.g., typically more particles are needed at the beginning of a localization
run.

Idea:
- Partition the state space into bins.
- When sampling, keep track of the number k of bins occupied.
- Stop sampling when the number of particles reaches a threshold that depends
  on the number of occupied bins:

    n = ((k - 1) / (2 epsilon)) * (1 - 2/(9(k-1)) + sqrt(2/(9(k-1))) z_{1-delta})^3

- If all samples fall in a small number of bins -> lower threshold.

z_{1-delta}: the upper 1-delta quantile of the standard normal distribution.
delta = 0.01 and epsilon = 0.05 work well in practice.
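The stopping threshold uses the Wilson-Hilferty approximation to the chi-square quantile; a sketch using only the standard library (the k <= 1 fallback is an assumption for the edge case):

```python
from statistics import NormalDist

def kld_sample_bound(k, epsilon=0.05, delta=0.01):
    """Number of particles so that the KL divergence between the sample-based
    estimate and the true (discretized) posterior is at most epsilon with
    probability 1 - delta, given k occupied bins."""
    if k <= 1:
        return 1  # fallback: the bound is only defined for k > 1
    z = NormalDist().inv_cdf(1.0 - delta)   # z_{1-delta}
    a = 2.0 / (9.0 * (k - 1))
    return int((k - 1) / (2.0 * epsilon) * (1.0 - a + (a ** 0.5) * z) ** 3)
```

The bound grows roughly linearly in the number of occupied bins: spread-out particle sets (many bins) demand many particles, while a converged estimate (few bins) lets sampling stop early.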