EKF SLAM vs. FastSLAM: A Comparison

Michael Calonder, Computer Vision Lab, Swiss Federal Institute of Technology, Lausanne (EPFL)
michael.calonder@epfl.ch

The two algorithms are described with a planar robot application in mind. Generalization to other spatial SLAM scenarios is straightforward. For simplicity, we assume there is no control input. The pose consists of the robot's position (x, y) and its heading direction δ: s := (x, y, δ). The landmarks are denoted θ_i, each simply consisting of a pair of planar coordinates.

1 Probabilistic EKF Formulation

The Extended Kalman Filter (EKF) can be viewed as a variant of a Bayesian filter; EKFs provide a recursive estimate of the state of a dynamic system or, more precisely, solve an unobservable, nonlinear estimation problem. Roughly speaking, the state x_t of a system at time t can be considered a random variable whose uncertainty is represented by a probability distribution. One is interested in the posterior density p(x_t | z^t), where z^t := {z_1, ..., z_t} is the set of measurements up to time t and x_t := (s_t, θ_0, θ_1, ..., θ_K) is the state vector. (Note that the superscript t refers to the set of variables up to time t.) In general, the complexity of computing such a density grows exponentially with time; to make the computation tractable, the true state is assumed to be an unobserved Markov process, implying that

- the true state is conditionally independent of all earlier states except the previous one: p(x_t | x^{t-1}) = p(x_t | x_{t-1}), where x^t := {x_1, ..., x_t} denotes the set of states;
- the measurement at the t-th timestep depends only upon the current state and is conditionally independent of all other states: p(z_t | x^t) = p(z_t | x_t).

The EKF uses a prediction and an update step. The prediction step calculates the probability of the current state x_t while the measurement z_t for the current time step is not yet available:

    p(x_t | z^{t-1}) = ∫ p(x_t | x_{t-1}) p(x_{t-1} | z^{t-1}) dx_{t-1}.

The left PDF term of the integrand is the PDF of the motion model (see below), while the right part is the PDF of the state estimated in the last time step, rendering the estimation recursive.
Once we know p(x_t | z^{t-1}), we can find the posterior or corrected estimate using a new measurement z_t:

    p(x_t | z^t) = p(z_t | x_t) p(x_t | z^{t-1}) / p(z_t | z^{t-1})

with p(z_t | z^{t-1}) = ∫ p(z_t | x_t) p(x_t | z^{t-1}) dx_t being a (constant) normalization term. The PDF p(z_t | x_t) is where the measurement model comes in. EKF filtering makes the following assumptions about the respective PDFs:

    p(x_t | x_{t-1}) ~ N(f(x_{t-1}), Q_t)
    p(z_t | x_t)     ~ N(h(x_t), R_t)
    p(x_{t-1} | z^{t-1}) ~ N(x_{t-1}, P_{t-1})

where N(µ, Σ) denotes the Gaussian distribution with mean µ and covariance matrix Σ, f(x) the motion model, and h(x) the measurement model. In the prediction step at time t, the error covariance P is projected ahead by

    P_t^- = A_t P_{t-1} A_t^T + W_t Q_{t-1} W_t^T

and finally updated in the correction step according to

    P_t = (I - K_t H_t) P_t^-

where K_t is the Kalman gain matrix [8]. The Jacobians involved are A_t = ∂f/∂x and W_t = ∂f/∂w, both evaluated at x_{t-1}, and H_t = ∂h/∂x evaluated at x_t, with w being the process noise vector [8].

2 FastSLAM Algorithm

The key idea of FastSLAM exploits the fact that knowledge of the robot's path s_1, s_2, ..., s_t renders the individual landmark measurements independent [3], as originally observed by Murphy [6]. FastSLAM decomposes the SLAM problem into one robot localization problem and a collection of K landmark estimation problems. In FastSLAM, as in EKF SLAM, poses are assumed to behave according to a probabilistic law named the motion model, with an underlying density p(s_t | s_{t-1}). Likewise, the measurements are governed by the (probabilistic) measurement model p(z_t | s_t, θ, n_t), with z_t the measurement, θ = {θ_1, ..., θ_K} the set of landmarks, and n_t ∈ {1, ..., K} the index of the landmark observed at time t (only one at a time). The ultimate goal is to estimate the posterior p(s^t, θ | z^t). The exact factored representation employed by FastSLAM reads

    p(s^t, θ | z^t, n^t) = p(s^t | z^t, n^t) p(θ | s^t, z^t, n^t)
                         = p(s^t | z^t, n^t) ∏_{1≤k≤K} p(θ_k | s^t, z^t, n^t).     (1)

Note the conditional dependence on n^t. The K+1 factors in (1) are computed as follows.

Estimation of the path p(s^t | z^t, n^t). This is achieved by a particle filter: FastSLAM maintains a set of M particles, {s^{t,[m]}} = {s_1^{[m]}, ..., s_t^{[m]}}, where [m] refers to the m-th particle in the set. To create the particle set at time t, one first samples M particles from the motion model into a temporary set: s_t^{[m]} ~ p(s_t | s_{t-1}^{[m]}). Then the importance weights w_t^{[m]} are calculated; they determine the probability that a certain particle from the temporary set enters the final set.

Estimation of the landmark locations p(θ_k | s^t, z^t, n^t). The posterior update depends on whether the k-th landmark has been observed or not:

    p(θ_k | s^t, z^t, n^t) ∝ p(z_t | θ_k, s_t, n_t) p(θ_k | s^{t-1}, z^{t-1}, n^{t-1})   if k = n_t
    p(θ_k | s^t, z^t, n^t) ∝ p(θ_k | s^{t-1}, z^{t-1}, n^{t-1})                          otherwise

FastSLAM implements the above update equation by means of an EKF.

Data association problem. FastSLAM (as most existing EKF-based SLAM solutions) solves the data association problem via maximum likelihood estimation; see Section 6.1.
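The per-particle machinery of Section 2 can be sketched in a few lines of numpy. This is a minimal, illustrative sketch, not the paper's implementation: the relative-position measurement model h(θ, s) = θ − s[:2] (whose landmark Jacobian is the identity), the trivial motion model, and all noise covariances below are assumptions chosen to keep the Jacobians simple.

```python
import numpy as np

rng = np.random.default_rng(0)

def fastslam_step(particles, z, n, f, Q, R):
    """One FastSLAM 1.0 step with known data association n.

    particles: list of dicts {'s': pose (3,), 'lm': {k: (mu, Sigma)}}
    z        : observed landmark position relative to the robot (2,)
    n        : index of the observed landmark
    Illustrative measurement model: h(theta, s) = theta - s[:2],
    so the landmark Jacobian is H = I (an assumption, not the
    paper's model). Returns the normalized importance weights.
    """
    weights = []
    for p in particles:
        # 1) sample the new pose from the motion model p(s_t | s_{t-1})
        p['s'] = f(p['s']) + rng.multivariate_normal(np.zeros(3), Q)
        mu, Sigma = p['lm'][n]
        z_hat = mu - p['s'][:2]          # predicted measurement
        S = Sigma + R                    # innovation covariance (H = I)
        K = Sigma @ np.linalg.inv(S)     # per-landmark Kalman gain
        # 2) tiny 2x2 EKF update of the observed landmark only
        p['lm'][n] = (mu + K @ (z - z_hat), (np.eye(2) - K) @ Sigma)
        # 3) importance weight ~ N(z; z_hat, S)
        v = z - z_hat
        w = np.exp(-0.5 * v @ np.linalg.solve(S, v)) \
            / np.sqrt(np.linalg.det(2 * np.pi * S))
        weights.append(w)
    w = np.array(weights)
    return w / w.sum()                   # resampling probabilities

# Toy run: M particles, one known landmark, trivial motion model
M = 10
particles = [{'s': np.zeros(3),
              'lm': {0: (np.array([2.0, 1.0]), 0.5 * np.eye(2))}}
             for _ in range(M)]
f = lambda s: s                          # illustrative: no motion
w = fastslam_step(particles, z=np.array([2.1, 0.9]), n=0,
                  f=f, Q=0.01 * np.eye(3), R=0.1 * np.eye(2))
```

Resampling according to `w` then produces the final particle set of the time step.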
3 Similarities

Basically, EKF SLAM and FastSLAM solve the same problem while making use of identical probabilistic motion and measurement models. Furthermore, both use Kalman filtering: EKF SLAM applies the filter once to a high-dimensional filtering problem, whereas FastSLAM employs M·K tiny EKFs (K of them in each particle).
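Both algorithms thus rely on the same predict/correct cycle of Section 1; only the state it is applied to differs. A minimal numpy sketch of that shared cycle (the scalar toy models and numbers are illustrative assumptions, not from the text):

```python
import numpy as np

def ekf_step(x, P, z, f, h, jacobians, Q, R):
    """One EKF predict/correct cycle as in Section 1.

    x, P      : previous mean and covariance
    z         : current measurement
    f, h      : motion and measurement models
    jacobians : callable returning (A, W, H) at the current estimate
    Q, R      : process and measurement noise covariances
    """
    A, W, H = jacobians(x)
    # Prediction: project the state and the error covariance ahead
    x_pred = f(x)
    P_pred = A @ P @ A.T + W @ Q @ W.T
    # Correction: Kalman gain, then update with the innovation
    K = P_pred @ H.T @ np.linalg.inv(H @ P_pred @ H.T + R)
    x_new = x_pred + K @ (z - h(x_pred))
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new

# Toy 1D example: static state, direct measurement (illustrative only)
f = lambda x: x
h = lambda x: x
jac = lambda x: (np.eye(1), np.eye(1), np.eye(1))
x, P = np.array([0.0]), np.array([[1.0]])
x, P = ekf_step(x, P, z=np.array([1.0]), f=f, h=h, jacobians=jac,
                Q=np.array([[0.5]]), R=np.array([[0.5]]))
```

In EKF SLAM, `x` is the full (2K+3)-dimensional vector of pose plus landmarks; in FastSLAM, the same update runs as a 2-dimensional filter per landmark per particle.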

4 Differences

4.1 State Vector

A fundamental advantage of FastSLAM over EKF-based approaches to the SLAM problem is that the EKF suffers from an O(K^2) complexity, where K is the number of landmarks. In contrast, FastSLAM has an O(M log K) complexity, with M = const denoting the number of particles.

EKF SLAM: The state vector comprises the current pose estimate as well as the landmark estimates: x_t = (s_t, θ_0, θ_1, ..., θ_K). The map management has to change the size of the state vector at runtime and to update a dim(x_t) × dim(x_t) matrix in each step, with dim(x_t) = 2K + 3.

FastSLAM: There is no state vector as in EKF SLAM, but a loose set of small parts instead; the K covariance matrices in each particle storing the landmark uncertainties have a small, constant size here (2 × 2). New observations just slightly alter some tree structure storing the parameters of the Gaussians of each particle.

4.2 Posterior Density

EKF SLAM: All densities involved in the calculation of the posterior

    p(x_t | z^t) = p(z_t | x_t) p(x_t | z^{t-1}) / p(z_t | z^{t-1})

are modeled as Gaussians.

FastSLAM: The problem is decomposed into K+1 subproblems: one problem of estimating a posterior over robot paths s^t, and K problems of estimating the locations of the K landmarks,

    p(s^t, θ | z^t, n^t) = p(s^t | z^t, n^t) ∏_k p(θ_k | s^t, z^t, n^t).

This factorization is exact.

4.3 Data Association

EKF SLAM: Different approaches exist, e.g. initialization of a 3D line into the map and successive estimation of depth before introducing the landmark definitely. But all attempts use just one data association; if it is wrong for some reason, e.g. because of an ambiguous environment, the filter will probably diverge.

FastSLAM: Each particle has its own data association hypothesis, n_t^{[m]}. The concept of resampling according to importance weights lets wrong associations disappear (in expectation) and makes the filter more robust.

4.4 Observation Model

EKF SLAM: The observation model is simply assumed to have an underlying normal distribution, p(z_t | x_t) ~ N(h(x_t), R_t). The covariance matrix R_t is updated according to a Jacobian calculation in every time step.

FastSLAM: In addition to EKF SLAM's observation model, FastSLAM makes explicit use of the data association information in n_t: p(z_t | θ_k, s_t, n_t).
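The storage gap behind Section 4.1 is easy to make concrete: EKF SLAM maintains one dense (2K+3) × (2K+3) covariance matrix, while FastSLAM keeps M · K independent 2 × 2 blocks. The sketch below only counts covariance entries (the per-step update cost comparison, O(K^2) vs. O(M log K), additionally relies on the shared tree structure); the particle count M = 100 is an illustrative assumption.

```python
# Covariance storage for a map with K landmarks.
def ekf_cov_entries(K):
    return (2 * K + 3) ** 2          # one dense matrix, dim(x_t) = 2K + 3

def fastslam_cov_entries(K, M):
    return M * K * 4                 # K blocks of 2x2 in each of M particles

for K in (10, 100, 1000):
    print(f"K={K:5d}  EKF SLAM: {ekf_cov_entries(K):9d}  "
          f"FastSLAM (M=100): {fastslam_cov_entries(K, 100):9d}")
```

For small maps the particle set can even cost more, but the EKF's quadratic growth dominates as K increases.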

5 FastSLAM 1.0 vs. FastSLAM 2.0

Basically, the only modification proposed by Version 2 is that the proposal distribution (2) should not only rely on the previous estimate of the pose s_{t-1}, but also on the actual measurement z_t:

    s_t^{[m]} ~ p(s_t | s^{t-1,[m]}, z^t, n^t).

5.1 Proposal Distribution

This section roughly introduces the consequences of the new choice for the proposal distribution. It can be expressed as

    p(s_t | s^{t-1,[m]}, z^t, n^t) = η^{[m]} ∫ p(z_t | θ_{n_t}, s_t, n_t) p(θ_{n_t} | s^{t-1,[m]}, z^{t-1}, n^{t-1}) dθ_{n_t} · p(s_t | s_{t-1}^{[m]})

with

    p(z_t | θ_{n_t}, s_t, n_t) ~ N(h(θ_{n_t}, s_t), R_t)
    p(θ_{n_t} | s^{t-1,[m]}, z^{t-1}, n^{t-1}) ~ N(µ_{n_t,t-1}^{[m]}, Σ_{n_t,t-1}^{[m]})
    p(s_t | s_{t-1}^{[m]}) ~ N(f(s_{t-1}^{[m]}), P_t)

where h(·) and f(·) arise in the general SLAM problem as the measurement model, z_t = h(θ_{n_t}, s_t) + ε_t, and the motion model, s_t = f(s_{t-1}) + δ_t, respectively. Furthermore, ε_t ~ N(0, R_t) and δ_t ~ N(0, Q_t). However, the new proposal distribution does not have a closed form unless h(·) is linearized around the landmark position µ_{n_t,t-1}^{[m]} and the predicted pose estimate ŝ_t^{[m]}:

    h(θ_{n_t}, s_t) ≈ ẑ_t^{[m]} + H_θ (θ_{n_t} - µ_{n_t,t-1}^{[m]}) + H_s (s_t - ŝ_t^{[m]})

where we used the predicted measurement ẑ_t^{[m]} = h(θ̂_{n_t}^{[m]}, ŝ_t^{[m]}). H_θ and H_s stand for the respective Jacobians evaluated at (ŝ_t^{[m]}, θ̂_{n_t}^{[m]}).

5.2 Importance Weights

The ways the importance weights are calculated for Versions 1 and 2 differ insofar as in FastSLAM 2.0 one has to account for the normalizer η^{[m]} that was not there before. However, the importance weights in Version 2 are still normally distributed, with mean ẑ_t^{[m]} and covariance

    H_s P_t H_s^T + H_θ Σ_{n_t,t-1}^{[m]} H_θ^T + R_t.

5.3 Unknown Data Association

Similar to Version 1, FastSLAM 2.0 selects the association n̂_t that maximizes the probability of the measurement z_t for the m-th particle (cf. Section 6.1). However, this probability has to be modified in order to consider the sampled pose. Linearizing h(·) and calculating the new probability leads to a Gaussian over z_t with mean h(µ_{n_t,t-1}^{[m]}, s_t^{[m]}) and covariance R_t.

6 FastSLAM with Unknown Data Association

Solving the unknown data association problem is a crucial step in FastSLAM since otherwise it would not be possible to decompose the posterior (1). Montemerlo et al. suggested four ways to handle this problem [4].
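Under the linearization of Section 5.1, the proposal becomes a Gaussian in the pose whose moments have a closed form; the expressions below follow the FastSLAM 2.0 derivation in [5]. The relative-position measurement model and all numbers are illustrative assumptions, not values from the text.

```python
import numpy as np

def fastslam2_proposal(s_hat, P, mu_n, Sigma_n, z, h, H_s, H_theta, R):
    """Mean and covariance of the linearized FastSLAM 2.0 pose proposal
    p(s_t | s^{t-1,[m]}, z^t, n^t), following the closed form in [5]:

        Z    = R + H_theta Sigma_n H_theta^T
        Cov  = (H_s^T Z^{-1} H_s + P^{-1})^{-1}
        Mean = Cov H_s^T Z^{-1} (z - h(mu_n, s_hat)) + s_hat
    """
    Z = R + H_theta @ Sigma_n @ H_theta.T
    Zinv = np.linalg.inv(Z)
    cov = np.linalg.inv(H_s.T @ Zinv @ H_s + np.linalg.inv(P))
    mean = cov @ (H_s.T @ Zinv @ (z - h(mu_n, s_hat))) + s_hat
    return mean, cov

# Illustrative relative-position measurement h(theta, s) = theta - s[:2]
h = lambda theta, s: theta - s[:2]
H_s = np.array([[-1.0, 0.0, 0.0],       # dh/ds at the predicted pose
                [0.0, -1.0, 0.0]])
H_theta = np.eye(2)                     # dh/dtheta
mean, cov = fastslam2_proposal(
    s_hat=np.zeros(3), P=0.1 * np.eye(3),
    mu_n=np.array([2.0, 1.0]), Sigma_n=0.5 * np.eye(2),
    z=np.array([2.2, 0.8]), h=h, H_s=H_s, H_theta=H_theta,
    R=0.1 * np.eye(2))
```

Note how the heading variance stays at its predicted value: a position-only measurement carries no heading information, while the x and y variances shrink below their predicted values.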

6.1 Per-Particle Maximum Likelihood Data Association

With the Bayes and Markov rules one can show that

    p(n_t | z^t) ∝ p(z_t | s^{t,[m]}, n_t),    n̂_t^{[m]} = argmax_{n_t} p(n_t | z^t),

where n̂_t^{[m]} is the ML estimate. This procedure is also adopted by EKFs, even though here it is used on a per-particle basis. The consequences are:

(a) Noise in the pose estimation will be filtered out, given a reasonable number of particles.
(b) Delayed decision-making: pose ambiguities that appear reasonable (at the time the decision is taken) are resolved since they will later receive low importance weights.

6.2 Monte-Carlo Data Association

The method above can be taken a step further: each particle can draw a random association, weighted by the probabilities of each landmark having generated the observation. However, uniformly high measurement errors lead to an exponential increase in the number of particles required, in comparison to the same scenario with known data association.

6.3 Mutual Exclusion

Mutual exclusion requires considering more than one observation per time step and allows eliminating data association hypotheses that assign multiple measurements to the same landmark. They propose to apply the mutual exclusion constraint in a greedy fashion: each observation is associated with the most likely landmark in each particle that has not received an observation yet (computationally inexpensive since FastSLAM already maintains a set of data association hypotheses). Thus, mutual exclusion forces the creation of a new landmark whenever the observation cannot be explained by the actual associations.

6.4 Negative Information

Areas without (known) landmarks cannot be assumed to be empty, but still one can draw an inference from such information: if the system expects to see a particular landmark but actually does not, it should become less confident about the existence of that landmark. This can be achieved by borrowing a technique for making evidence grids (a detailed description is given in [7]).

References

[1] A. Davison. Real-time simultaneous localisation and mapping with a single camera. 2003.
[2] S. Kramer. A Bayesian perspective on why the extended Kalman filter fails in passive tracking. Technical report, Air Force Institute of Technology, 1998.
[3] M. Montemerlo, S. Thrun, D. Koller, and B. Wegbreit. FastSLAM: A factored solution to the simultaneous localization and mapping problem. 2002.
[4] M. Montemerlo and S. Thrun. Simultaneous localization and mapping with unknown data association using FastSLAM. In ICRA, pages 1985-1991, 2003.
[5] M. Montemerlo, S. Thrun, D. Koller, and B. Wegbreit. FastSLAM 2.0: An improved particle filtering algorithm for simultaneous localization and mapping that provably converges. In IJCAI, pages 1151-1156, 2003.
[6] K. Murphy. Bayesian map learning in dynamic environments. 1999.
[7] S. Thrun. Learning metric-topological maps for indoor mobile robot navigation. Artificial Intelligence, 99(1):21-71, 1998.
[8] G. Welch and G. Bishop. An introduction to the Kalman filter. Technical Report TR 95-041, 2004.