Introduction to Mobile Robotics: Bayes Filter, Particle Filter and Monte Carlo Localization


Introduction to Mobile Robotics: Bayes Filter, Particle Filter and Monte Carlo Localization. Wolfram Burgard, Cyrill Stachniss, Maren Bennewitz, Kai Arras 1

Motivation. Recall the discrete filter: it discretizes the continuous state space, which leads to high memory complexity and a fixed resolution that does not adapt to the belief. Particle filters are a way to efficiently represent non-Gaussian distributions. Basic principle: maintain a set of state hypotheses ("particles") and apply survival of the fittest. 2

Sample-based Localization (sonar)

Mathematical Description. A particle filter maintains a set of weighted samples, where each sample consists of a state hypothesis and an importance weight. The samples represent the posterior. 4
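
The slide's formulas did not survive transcription; in the standard weighted-sample notation they can be reconstructed as follows (N denotes the number of particles):

```latex
% Set of N weighted samples: state hypotheses x_t^{[i]} with importance weights w_t^{[i]}
\mathcal{X}_t = \bigl\{ \langle x_t^{[i]}, w_t^{[i]} \rangle \bigr\}_{i=1,\dots,N},
\qquad \sum_{i=1}^{N} w_t^{[i]} = 1
% The weighted samples approximate the posterior belief
Bel(x_t) \approx \sum_{i=1}^{N} w_t^{[i]} \, \delta\bigl(x_t - x_t^{[i]}\bigr)
```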

Function Approximation. Particle sets can be used to approximate functions: the more particles fall into an interval, the higher the probability of that interval. How do we draw samples from a function/distribution? 5

Rejection Sampling. Let us assume that f(x) < 1 for all x. Sample x from a uniform distribution and sample c from [0,1]. If f(x) > c, keep the sample; otherwise reject it. 6
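
A minimal sketch of this procedure in Python; the density f and the sampling interval in the example are illustrative assumptions, not taken from the slides:

```python
import random

def rejection_sampling(f, lo, hi, n):
    """Draw n samples from a density f with f(x) < 1 on the interval [lo, hi]."""
    samples = []
    while len(samples) < n:
        x = random.uniform(lo, hi)    # sample x uniformly
        c = random.uniform(0.0, 1.0)  # sample c from [0, 1]
        if f(x) > c:                  # keep x with probability f(x)
            samples.append(x)
    return samples

# Illustrative example: triangular density on [0, 1], scaled so that f(x) < 1
triangle = lambda x: 0.9 * (1.0 - abs(2.0 * x - 1.0))
xs = rejection_sampling(triangle, 0.0, 1.0, 1000)
```

The fraction of accepted samples equals the area under f, so rejection sampling becomes inefficient when f is small over most of the interval.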

Importance Sampling Principle. We can even use a different distribution g to generate samples from f. By introducing an importance weight w, we can account for the differences between g and f: w = f / g. f is often called the target and g the proposal. Pre-condition: f(x) > 0 → g(x) > 0. 7
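
A minimal sketch of self-normalized importance sampling in Python, assuming we can evaluate the (possibly unnormalized) target f and both evaluate and sample from the proposal g; the Gaussian target and proposal in the example are illustrative choices, not from the slides:

```python
import math
import random

def importance_sampling(f, g_pdf, g_sample, n):
    """Draw n samples from the proposal g and weight them by w = f / g."""
    xs = [g_sample() for _ in range(n)]      # sample from the proposal g
    ws = [f(x) / g_pdf(x) for x in xs]       # importance weights w = f / g
    total = sum(ws)
    ws = [w / total for w in ws]             # self-normalize the weights
    return xs, ws

# Illustrative example: target centered at 2, wider proposal centered at 0
f = lambda x: math.exp(-0.5 * (x - 2.0) ** 2)                      # unnormalized target
g_pdf = lambda x: math.exp(-0.5 * (x / 3.0) ** 2) / (3.0 * math.sqrt(2.0 * math.pi))
g_sample = lambda: random.gauss(0.0, 3.0)
xs, ws = importance_sampling(f, g_pdf, g_sample, 5000)
mean_estimate = sum(w * x for w, x in zip(ws, xs))                 # should be close to 2
```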

Importance Sampling with Resampling: Landmark Detection Example

Distributions

Distributions. Wanted: samples distributed according to p(x | z1, z2, z3). 10

This is Easy! We can draw samples from p(x | zl) by adding noise to the detection parameters.

Importance Sampling
Target distribution f: p(x | z1, ..., zn) = p(x) ∏_k p(zk | x) / p(z1, ..., zn)
Sampling distribution g: p(x | zl) = p(zl | x) p(x) / p(zl)
Importance weights: w = f / g = p(x | z1, ..., zn) / p(x | zl) = p(zl) ∏_k p(zk | x) / ( p(zl | x) p(z1, ..., zn) )

Importance Sampling with Resampling [figures: weighted samples, and the sample set after resampling]

Particle Filters

Sensor Information: Importance Sampling
Bel(x) ← α p(z | x) Bel⁻(x)
w ← α p(z | x) Bel⁻(x) / Bel⁻(x) = α p(z | x)
(Bel⁻ denotes the predicted belief before the measurement update.)

Robot Motion
Bel⁻(x) ← ∫ p(x | u, x') Bel(x') dx'

Sensor Information: Importance Sampling
Bel(x) ← α p(z | x) Bel⁻(x)
w ← α p(z | x) Bel⁻(x) / Bel⁻(x) = α p(z | x)

Robot Motion
Bel⁻(x) ← ∫ p(x | u, x') Bel(x') dx'

Particle Filter Algorithm. Sample the next generation of particles using the proposal distribution. Compute the importance weights: weight = target distribution / proposal distribution. Resampling: replace unlikely samples by more likely ones. [Derivation of the MCL equations on the blackboard] 19

Particle Filter Algorithm
1. Algorithm particle_filter(S_{t-1}, u_{t-1}, z_t):
2.   S_t = ∅, η = 0
3.   For i = 1 ... n                                   Generate new samples
4.     Sample index j(i) from the discrete distribution given by the weights w_{t-1}
5.     Sample x_t^i from p(x_t | x_{t-1}, u_{t-1}) using x_{t-1}^{j(i)} and u_{t-1}
6.     w_t^i = p(z_t | x_t^i)                          Compute importance weight
7.     η = η + w_t^i                                   Update normalization factor
8.     S_t = S_t ∪ { <x_t^i, w_t^i> }                  Insert
9.   For i = 1 ... n
10.    w_t^i = w_t^i / η                               Normalize weights
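
A compact Python sketch of this loop, assuming user-supplied sample_motion_model(x, u) and measurement_likelihood(z, x) functions; these names and the (state, weight) pair representation are illustrative, not from the slides:

```python
import random

def particle_filter(S_prev, u, z, sample_motion_model, measurement_likelihood):
    """One filter step. S_prev: list of (state, weight) pairs; u: control; z: measurement."""
    states, weights = zip(*S_prev)
    S_new, eta = [], 0.0
    for _ in range(len(S_prev)):
        # Sample an index with probability proportional to the previous weights
        x_prev = random.choices(states, weights=weights)[0]
        # Sample the new state from the proposal (motion model)
        x = sample_motion_model(x_prev, u)
        # Importance weight: target / proposal reduces to the observation likelihood
        w = measurement_likelihood(z, x)
        eta += w
        S_new.append((x, w))
    # Normalize the weights
    return [(x, w / eta) for (x, w) in S_new]
```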

Particle Filter Algorithm
Draw x_{t-1}^i from Bel(x_{t-1}); draw x_t^i from p(x_t | x_{t-1}^i, u_{t-1}).
Importance factor for x_t^i:
w_t^i = target distribution / proposal distribution
      = η p(z_t | x_t) p(x_t | x_{t-1}, u_{t-1}) Bel(x_{t-1}) / ( p(x_t | x_{t-1}, u_{t-1}) Bel(x_{t-1}) )
      ∝ p(z_t | x_t)
where the target is Bel(x_t) = η p(z_t | x_t) ∫ p(x_t | x_{t-1}, u_{t-1}) Bel(x_{t-1}) dx_{t-1}

Resampling. Given: a set S of weighted samples. Wanted: a random sample, where the probability of drawing x_i is given by w_i. Typically done n times with replacement to generate the new sample set S'.

Resampling
Roulette wheel: binary search, O(n log n).
Stochastic universal sampling (systematic resampling): linear time complexity, easy to implement, low variance.
[Figures: two roulette wheels partitioned according to the weights w_1, ..., w_n]

Resampling Algorithm
1. Algorithm systematic_resampling(S, n):
2.   S' = ∅, c_1 = w^1
3.   For i = 2 ... n                       Generate cdf
4.     c_i = c_{i-1} + w^i
5.   u_1 ~ U(0, 1/n], i = 1                Initialize threshold
6.   For j = 1 ... n                       Draw samples
7.     While (u_j > c_i)                   Skip until next threshold reached
8.       i = i + 1
9.     S' = S' ∪ { <x^i, 1/n> }            Insert
10.    u_{j+1} = u_j + 1/n                 Increment threshold
11. Return S'
Also called stochastic universal sampling.
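
A short Python sketch of systematic (stochastic universal) resampling; the representation of particles as (state, weight) pairs with normalized weights is an assumption for illustration:

```python
import random

def systematic_resampling(particles):
    """Resample n particles using a single random offset; returns equally weighted particles."""
    n = len(particles)
    u = random.uniform(0.0, 1.0 / n)   # initial threshold in (0, 1/n]
    c = particles[0][1]                # running cumulative sum of the weights (cdf)
    i = 0
    resampled = []
    for _ in range(n):
        while u > c:                   # skip until the next threshold is reached
            i += 1
            c += particles[i][1]
        resampled.append((particles[i][0], 1.0 / n))
        u += 1.0 / n                   # increment the threshold
    return resampled
```

Because all thresholds are derived from one random number, the variance introduced by resampling is lower than for independent roulette-wheel draws, which matches the "low variance" property noted above.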

Mobile Robot Localization. Each particle is a potential pose of the robot. The proposal distribution is the motion model of the robot (prediction step). The observation model is used to compute the importance weight (correction step). [For details, see the PDF file on the lecture web page] 25
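
To make the two models concrete, here is a hedged sketch for a planar pose (x, y, θ) that can be plugged into the particle_filter sketch above; the odometry noise values, the known landmark position, and the single range measurement are illustrative assumptions, not from the slides:

```python
import math
import random

def sample_motion_model(pose, u):
    """Prediction step: propagate a pose by a (translation, rotation) command with Gaussian noise."""
    x, y, theta = pose
    d_trans, d_rot = u
    theta += d_rot + random.gauss(0.0, 0.05)
    x += (d_trans + random.gauss(0.0, 0.1)) * math.cos(theta)
    y += (d_trans + random.gauss(0.0, 0.1)) * math.sin(theta)
    return (x, y, theta)

def measurement_likelihood(z, pose, landmark=(5.0, 5.0), sigma=0.2):
    """Correction step: likelihood of a measured range z to a known landmark."""
    expected = math.hypot(landmark[0] - pose[0], landmark[1] - pose[1])
    return math.exp(-0.5 * ((z - expected) / sigma) ** 2)
```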

Motion Model Reminder [figure: sampled poses from the start position]

Proximity Sensor Model Reminder [figures: laser sensor model, sonar sensor model]

[Slides 28 to 45 contain figures only]

Sample-based Localization (sonar) 46

Initial Distribution 47

After Incorporating Ten Ultrasound Scans 48

After Incorporating 65 Ultrasound Scans 49

Estimated Path 50

Localization for AIBO robots 51

Using Ceiling Maps for Localization [Dellaert et al. 99]

Vision-based Localization [figure labels: z, P(z | x), h(x)]

Under a Light. Measurement z and P(z | x): [figures]

Next to a Light. Measurement z and P(z | x): [figures]

Elsewhere. Measurement z and P(z | x): [figures]

Global Localization Using Vision

Limitations The approach described so far is able to track the pose of a mobile robot and to globally localize the robot. How can we deal with localization errors (i.e., the kidnapped robot problem)? 58

Approaches Randomly insert samples (the robot can be teleported at any point in time). Insert random samples proportional to the average likelihood of the particles (the robot has been teleported with higher probability when the likelihood of its observations drops). 59
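
A hedged sketch of the second idea, in the spirit of augmented MCL: random poses are injected with a probability derived from short- and long-term averages of the observation likelihood. The smoothing factors, the random_pose() helper, and the (state, weight) pair representation are illustrative assumptions:

```python
import random

def inject_random_particles(particles, avg_likelihood, w_slow, w_fast, random_pose,
                            alpha_slow=0.05, alpha_fast=0.5):
    """Replace particles by random poses when the short-term likelihood average
    drops below the long-term average; w_slow and w_fast start at small positive values."""
    w_slow += alpha_slow * (avg_likelihood - w_slow)
    w_fast += alpha_fast * (avg_likelihood - w_fast)
    p_inject = max(0.0, 1.0 - w_fast / w_slow)     # injection probability
    n = len(particles)
    out = [(random_pose(), 1.0 / n) if random.random() < p_inject else p
           for p in particles]
    return out, w_slow, w_fast
```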

Summary: Particle Filters. Particle filters are an implementation of recursive Bayesian filtering. They represent the posterior by a set of weighted samples and can model non-Gaussian distributions. New samples are drawn from the proposal; the weight accounts for the differences between the proposal and the target. Also known as Monte Carlo filter, survival of the fittest, condensation, bootstrap filter. 60

Summary PF Localization In the context of localization, the particles are propagated according to the motion model. They are then weighted according to the likelihood of the observations. In a re-sampling step, new particles are drawn with a probability proportional to the likelihood of the observation. 61