E190Q Lecture 11 Autonomous Robot Navigation


E190Q Lecture 11 Autonomous Robot Navigation. Instructor: Chris Clark. Semester: Spring 2013. Figures courtesy of Siegwart & Nourbakhsh.

Control Structures: planning-based control (block diagram: Prior Knowledge and Operator Commands feed Localization, Cognition, Perception, and Motion Control).

Kalman Filter Localization
Introduction to Kalman Filters
1. KF Representations
2. Two Measurement Sensor Fusion
3. Single Variable Kalman Filtering
4. Multi-Variable KF Representations
Kalman Filter Localization

KF Representations
What do Kalman Filters use to represent the states being estimated? Gaussian Distributions!

KF Representations
Single-variable Gaussian Distribution
Symmetrical, uni-modal
Characterized by: Mean µ, Variance σ²
Properties: propagation of errors, product of Gaussians

KF Representations
Single-Var. Gaussian Characterization: Mean
Expected value of a random variable with a continuous probability density function p(x):
µ = E[X] = ∫ x p(x) dx
For a discrete set of K samples:
µ = (1/K) Σ_{k=1}^{K} x_k

KF Representations
Single-Var. Gaussian Characterization: Variance
Expected value of the squared difference from the mean:
σ² = E[(X − µ)²] = ∫ (x − µ)² p(x) dx
For a discrete set of K samples:
σ² = (1/K) Σ_{k=1}^{K} (x_k − µ)²
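The discrete-sample formulas for µ and σ² can be checked with a short Python sketch (note the slides use the 1/K normalization, not the unbiased 1/(K−1); the sample data below is illustrative):

```python
# Sample mean and (1/K-normalized) variance, matching the slide's definitions.
def sample_mean(xs):
    return sum(xs) / len(xs)

def sample_variance(xs):
    mu = sample_mean(xs)
    return sum((x - mu) ** 2 for x in xs) / len(xs)

samples = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]
print(sample_mean(samples))      # 5.0
print(sample_variance(samples))  # 4.0
```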

KF Representations
Single-variable Gaussian Properties: Propagation of Errors
A linear transformation of a Gaussian random variable is again Gaussian: if X ~ N(µ, σ²), then Y = aX + b ~ N(aµ + b, a²σ²).

KF Representations
Single-variable Gaussian Properties: Product of Gaussians
The product of two Gaussian densities is proportional to a Gaussian:
N(µ_1, σ_1²) · N(µ_2, σ_2²) ∝ N(µ, σ²), with
µ = (µ_1 σ_2² + µ_2 σ_1²) / (σ_1² + σ_2²),  σ² = σ_1² σ_2² / (σ_1² + σ_2²)

KF Representations
Single-variable Gaussian Properties
We stay in the Gaussian world as long as we start with Gaussians and perform only linear transformations.

Kalman Filter Localization
Introduction to Kalman Filters
1. KF Representations
2. Two Measurement Sensor Fusion
3. Single Variable Kalman Filtering
4. Multi-Variable KF Representations
Kalman Filter Localization

Fusing Two Measurements
Example: Given two measurements q_1 and q_2, how do we fuse them to obtain an estimate q̂? Assume the measurements are modeled as random variables that follow Gaussian distributions with variances σ_1² and σ_2² respectively.

Fusing Two Measurements
Example (cont'd): (figure: the two measurement distributions)

Fusing Two Measurements
Example (cont'd): Let's frame the problem as minimizing a weighted least-squares cost function:
S = Σ_{i=1}^{n} w_i (q̂ − q_i)²

Fusing Two Measurements
Example (cont'd): Setting dS/dq̂ = 0 and solving, with n = 2 and w_i = 1/σ_i²:
q̂ = q_1 + σ_1² (q_2 − q_1) / (σ_1² + σ_2²)
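As a sketch, the fused estimate q̂ (and its variance, which follows from the product-of-Gaussians property rather than this slide) can be computed directly; all numbers are illustrative:

```python
# Fuse two noisy measurements of the same quantity by inverse-variance
# weighting, following the weighted least-squares result above.
def fuse(q1, var1, q2, var2):
    # q_hat = q1 + var1 * (q2 - q1) / (var1 + var2)
    return q1 + var1 * (q2 - q1) / (var1 + var2)

# Fused variance, from the product-of-Gaussians property.
def fused_variance(var1, var2):
    return var1 * var2 / (var1 + var2)

# A precise sensor (variance 1.0) and a noisy one (variance 4.0):
print(fuse(10.0, 1.0, 14.0, 4.0))   # 10.8 -- pulled mostly toward q1
print(fused_variance(1.0, 4.0))     # 0.8 -- fusing always reduces uncertainty
```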

Kalman Filter Localization
Introduction to Kalman Filters
1. KF Representations
2. Two Measurement Sensor Fusion
3. Single Variable Kalman Filtering
4. Multi-Variable KF Representations
Kalman Filter Localization

Single Variable KF
Example: fusing two measurements
q̂ = q_1 + σ_1² (q_2 − q_1) / (σ_1² + σ_2²)
We can reformulate this in KF notation:
x_t = x_{t-1} + K_t (z_t − x_{t-1})
K_t = σ_{t-1}² / (σ_{t-1}² + σ_z²)

Single Variable KF
KF for a Discrete Time System
x_t = x_{t-1} + K_t (z_t − x_{t-1})
K_t = σ_{t-1}² / (σ_{t-1}² + σ_z²)
σ_t² = σ_{t-1}² − K_t σ_{t-1}²
where x_t is the current state estimate, σ_t² is its associated variance, z_t is the most recent measurement (with variance σ_z²), and K_t is the Kalman gain.
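A minimal Python sketch of this recursion; the measurement variance σ_z² = 1 and the deliberately vague prior are illustrative values, not from the lecture:

```python
# One-dimensional Kalman filter update, following the recursion on this slide.
def kf_update(x_prev, var_prev, z, var_z):
    K = var_prev / (var_prev + var_z)   # Kalman gain
    x = x_prev + K * (z - x_prev)       # state update toward the measurement
    var = var_prev - K * var_prev       # variance shrinks with every update
    return x, var

# Repeatedly measuring a constant: the estimate converges near the true
# value and the variance keeps shrinking.
x, var = 0.0, 1000.0                    # vague initial belief
for z in [10.2, 9.8, 10.1, 9.9, 10.0]:
    x, var = kf_update(x, var, z, var_z=1.0)
print(x, var)
```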

Kalman Filter Localization
Introduction to Kalman Filters
1. KF Representations
2. Two Measurement Sensor Fusion
3. Single Variable Kalman Filtering
4. Multi-Variable KF Representations
Kalman Filter Localization

Representations in KF
Multi-variable Gaussian Distribution
Symmetrical, uni-modal
Characterized by: Mean vector µ, Covariance matrix Σ
Properties: propagation of errors, product of Gaussians

Representations in KF
Multi-Var. Gaussian Characterization: Mean Vector
Vector of expected values of n random variables:
µ = E[X] = [µ_0, µ_1, µ_2, …, µ_n]^T,  µ_i = ∫ x_i p(x_i) dx_i

Representations in KF
Multi-Var. Gaussian Characterization: Covariance
Expected value of the product of the differences from the means:
σ_ij = Cov[X_i, X_j] = E[(X_i − µ_i)(X_j − µ_j)]
Covariance is a measure of how much two random variables change together.
Positive σ_ij: when variable i is above its expected value, variable j tends to also be above its mean µ_j.
Negative σ_ij: when variable i is above its expected value, variable j tends to be below its mean µ_j.

Representations in KF
Multi-Var. Gaussian Characterization: Covariance
For continuous random variables:
σ_ij = ∫∫ (x_i − µ_i)(x_j − µ_j) p(x_i, x_j) dx_i dx_j
For a discrete set of K samples:
σ_ij = (1/K) Σ_{k=1}^{K} (x_{i,k} − µ_i)(x_{j,k} − µ_j)

Representations in KF
Multi-Var. Gaussian Characterization: Covariance Matrix
Covariance between each pair of random variables:
Σ = [ σ_00  σ_01  …  σ_0n
      σ_10  σ_11  …  σ_1n
      ⋮
      σ_n0  σ_n1  …  σ_nn ]
Note: σ_ii = σ_i²
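The discrete 1/K covariance formula assembles into the matrix Σ as below; `covariance_matrix` and the sample data are illustrative, not from the lecture:

```python
# Build the covariance matrix of n jointly sampled variables from K samples,
# using the 1/K discrete formula from the previous slide.
def covariance_matrix(samples):
    # samples: list of K rows, each a list of n values
    K = len(samples)
    n = len(samples[0])
    mu = [sum(row[i] for row in samples) / K for i in range(n)]
    return [[sum((row[i] - mu[i]) * (row[j] - mu[j]) for row in samples) / K
             for j in range(n)] for i in range(n)]

# Two perfectly correlated variables: the off-diagonal covariance equals
# the diagonal variances (correlation = 1).
data = [[1.0, 2.0], [2.0, 3.0], [3.0, 4.0]]
print(covariance_matrix(data))
```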

Representations in KF
Multi-variable Gaussian Properties: Propagation of Errors
A linear transformation of a Gaussian random vector is again Gaussian: if X ~ N(µ, Σ), then Y = AX + b ~ N(Aµ + b, AΣA^T).

Representations in KF
Multi-variable Gaussian Properties: Product of Gaussians
As in the single-variable case, the product of two multivariate Gaussian densities is proportional to a Gaussian.

Next
Apply the Kalman Filter to multiple variables, in the form of an Extended Kalman Filter (EKF).

Kalman Filter Localization
Introduction to Kalman Filters
Kalman Filter Localization
1. EKF Localization Overview
2. EKF Prediction
3. EKF Correction
4. Algorithm Summary

Extended Kalman Filter Localization
Robot State Representation
State vector to be estimated, x, e.g.
x = [x, y, θ]^T
Associated covariance, P:
P = [ σ_xx  σ_xy  σ_xθ
      σ_yx  σ_yy  σ_yθ
      σ_θx  σ_θy  σ_θθ ]

Extended Kalman Filter Localization
1. Robot State Representation (figure)

Extended Kalman Filter Localization
Iterative algorithm:
1. Prediction: use a motion model and odometry to predict the state of the robot and its covariance, x̂_t, P̂_t
2. Correction: use a sensor model and a measurement to correct the predicted state and its covariance, x_t, P_t

Kalman Filter Localization
Introduction to Kalman Filters
Kalman Filter Localization
1. EKF Localization Overview
2. EKF Prediction
3. EKF Correction
4. Algorithm Summary

EKFL Prediction Step
Motion Model
Let's use a general form of a motion model: a discrete-time equation that predicts the current state of the robot given the previous state x_{t-1} and the odometry u_t:
x_t = f(x_{t-1}, u_t)

EKFL Prediction Step
Motion model
For our differential drive robot:
x_{t-1} = [x_{t-1}, y_{t-1}, θ_{t-1}]^T,  u_t = [Δs_{r,t}, Δs_{l,t}]^T

EKFL Prediction Step
Motion model
And the model we derived:
x_t = f(x_{t-1}, u_t) = [ x_{t-1} + Δs_t cos(θ_{t-1} + Δθ_t/2)
                          y_{t-1} + Δs_t sin(θ_{t-1} + Δθ_t/2)
                          θ_{t-1} + Δθ_t ]
where
Δs_t = (Δs_{r,t} + Δs_{l,t}) / 2
Δθ_t = (Δs_{r,t} − Δs_{l,t}) / b
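A direct transcription of f() into Python; the wheelbase b = 0.5 m is an assumed example value:

```python
import math

# Differential-drive motion model f(x_{t-1}, u_t): pose x = (px, py, theta),
# odometry u = (ds_r, ds_l), wheelbase b (assumed example value).
def motion_model(x, u, b=0.5):
    px, py, theta = x
    ds_r, ds_l = u
    ds = (ds_r + ds_l) / 2.0          # mean wheel displacement
    dtheta = (ds_r - ds_l) / b        # heading change
    return (px + ds * math.cos(theta + dtheta / 2.0),
            py + ds * math.sin(theta + dtheta / 2.0),
            theta + dtheta)

# Driving straight (equal wheel displacements) along the x axis:
print(motion_model((0.0, 0.0, 0.0), (1.0, 1.0)))   # (1.0, 0.0, 0.0)
```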

EKFL Prediction Step
Covariance
Recall the propagation-of-error equation: for y = f(x) with x having covariance Σ_x, the first-order approximation gives Σ_y ≈ F_x Σ_x F_x^T, where F_x is the Jacobian of f.

EKFL Prediction Step
Covariance
Our equation f() is not linear, so to use the property we linearize with a first-order approximation:
x_t = f(x_{t-1}, u_t) ≈ F_{x,t} x_{t-1} + F_{u,t} u_t
where
F_{x,t} = derivative (Jacobian) of f with respect to the state x_{t-1}
F_{u,t} = derivative (Jacobian) of f with respect to the control u_t

EKFL Prediction Step
Covariance
Here, we linearize the motion model f to obtain
P̂_t = F_{x,t} P_{t-1} F_{x,t}^T + F_{u,t} Q_t F_{u,t}^T
where
Q_t = motion error covariance matrix
F_{x,t} = derivative of f with respect to state x_{t-1}
F_{u,t} = derivative of f with respect to control u_t

EKFL Prediction Step
Covariance
Q_t = [ k|Δs_{r,t}|  0
        0            k|Δs_{l,t}| ]
F_{x,t} = [ ∂f/∂x_{t-1}  ∂f/∂y_{t-1}  ∂f/∂θ_{t-1} ]
F_{u,t} = [ ∂f/∂Δs_{r,t}  ∂f/∂Δs_{l,t} ]
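One way to sketch the covariance prediction P̂_t = F_{x,t} P_{t-1} F_{x,t}^T + F_{u,t} Q_t F_{u,t}^T with these Jacobians, assuming NumPy; the wheelbase b = 0.5 and error-growth constant k = 0.01 are example values, not from the lecture:

```python
import numpy as np

# Jacobians of the differential-drive motion model and the covariance
# prediction step.  b and k are assumed example values.
def predict_covariance(P, x, u, b=0.5, k=0.01):
    theta = x[2]
    ds_r, ds_l = u
    ds = (ds_r + ds_l) / 2.0
    dth = (ds_r - ds_l) / b
    a = theta + dth / 2.0
    c, s = np.cos(a), np.sin(a)
    # Jacobian of f w.r.t. the state (x, y, theta):
    Fx = np.array([[1.0, 0.0, -ds * s],
                   [0.0, 1.0,  ds * c],
                   [0.0, 0.0,  1.0]])
    # Jacobian of f w.r.t. the control (ds_r, ds_l):
    Fu = np.array([[c / 2 - ds * s / (2 * b), c / 2 + ds * s / (2 * b)],
                   [s / 2 + ds * c / (2 * b), s / 2 - ds * c / (2 * b)],
                   [1.0 / b,                  -1.0 / b]])
    # Motion error grows with the distance each wheel travelled:
    Q = np.diag([k * abs(ds_r), k * abs(ds_l)])
    return Fx @ P @ Fx.T + Fu @ Q @ Fu.T

P0 = np.zeros((3, 3))                                # perfectly known start
P1 = predict_covariance(P0, (0.0, 0.0, 0.0), (1.0, 1.0))
print(np.round(P1, 4))                               # uncertainty after moving
```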

EKFL Prediction Step
1. Motion Model (figure: predicted pose x_t propagated from the previous pose x_{t-1})

Kalman Filter Localization
Introduction to Kalman Filters
Kalman Filter Localization
1. EKF Localization Overview
2. EKF Prediction
3. EKF Correction
4. Algorithm Summary

EKFL Correction Step
Innovation
We correct by comparing the current measurement z_t with what we expect to observe, z_{exp,t}, given our predicted location in the map M. The amount we correct our state is proportional to the innovation v_t:
v_t = z_t − z_{exp,t}

EKFL Correction Step
The Measurement
Assume our robot measures the relative location of a wall i, extracted as a line:
z_t^i = [α_t^i, r_t^i]^T
R_t^i = [ σ_{αα,t}^i  σ_{αr,t}^i
          σ_{rα,t}^i  σ_{rr,t}^i ]

EKFL Correction Step
The Measurement
The line is extracted from n raw range-bearing points (ρ_j, β_j):
z_t^i = [α_t^i, r_t^i]^T = g(ρ_1, ρ_2, …, ρ_n, β_1, β_2, …, β_n)

EKFL Correction Step
The Measurement
R_t^i = [ σ_{αα,t}^i  σ_{αr,t}^i
          σ_{rα,t}^i  σ_{rr,t}^i ] = G_{ρβ,t} Σ_{z,t} G_{ρβ,t}^T
where
Σ_{z,t} = sensor error covariance matrix
G_{ρβ,t} = derivative of g() with respect to the measurements ρ_t, β_t

EKFL Correction Step
The expected measurement of map line i = (α_m^i, r_m^i), given the predicted pose x̂_t = (x̂_t, ŷ_t, θ̂_t):
z_{exp,t}^i = h^i(x̂_t, M) = [ α_m^i − θ̂_t
                              r_m^i − (x̂_t cos(α_m^i) + ŷ_t sin(α_m^i)) ]
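A sketch of the measurement model h() for a single map line; the function name and test values are illustrative:

```python
import math

# Expected line measurement h(x, M): a map line (alpha_m, r_m) given in world
# coordinates, re-expressed in the frame of a robot at pose x = (px, py, theta).
def expected_line(x, alpha_m, r_m):
    px, py, theta = x
    return (alpha_m - theta,
            r_m - (px * math.cos(alpha_m) + py * math.sin(alpha_m)))

# A wall at alpha = 0, r = 5 seen from the origin is unchanged:
print(expected_line((0.0, 0.0, 0.0), 0.0, 5.0))   # (0.0, 5.0)
```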

EKFL Correction Step (figure: geometry of a measured line relative to the robot position, with the projection terms x cos(α) and y sin(α))

EKFL Correction Step
The covariance associated with the innovation is
Σ_{IN,t} = H_{x,t}^i P̂_t (H_{x,t}^i)^T + R_t^i
where
R_t^i = line measurement error covariance matrix
H_{x,t}^i = derivative of h with respect to the state x̂_t

EKFL Correction Step
Final updates
Update the state estimate:
x_t = x̂_t + K_t v_t
Update the associated covariance matrix:
P_t = P̂_t − K_t Σ_{IN,t} K_t^T
Both use the Kalman gain matrix
K_t = P̂_t H_{x,t}^T (Σ_{IN,t})^{-1}
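These three equations can be sketched as one generic correction routine, assuming NumPy; the toy example directly observes the x coordinate rather than a map line, so Hx, R, and v are illustrative:

```python
import numpy as np

# Generic EKF correction: innovation covariance, Kalman gain, then the
# state and covariance updates from the slide.
def ekf_correct(x, P, v, Hx, R):
    S = Hx @ P @ Hx.T + R              # innovation covariance Sigma_IN
    K = P @ Hx.T @ np.linalg.inv(S)    # Kalman gain
    x_new = x + K @ v                  # state update
    P_new = P - K @ S @ K.T            # covariance update
    return x_new, P_new

# Toy example: 3-state pose, one scalar measurement of the x coordinate.
x0 = np.zeros(3)
P0 = np.eye(3)
Hx = np.array([[1.0, 0.0, 0.0]])       # we observe only x
R = np.array([[1.0]])
v = np.array([1.0])                    # measured 1.0, predicted 0.0
x1, Pc = ekf_correct(x0, P0, v, Hx, R)
print(x1)   # gain on x is 0.5, so the estimate moves half-way
```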

EKFL Correction Step
Compare with the single-variable KF:
Update the state estimate:
x_t = x_{t-1} + K_t (z_t − x_{t-1})
Update the associated variance:
σ_t² = σ_{t-1}² − K_t σ_{t-1}²
Both use the Kalman gain
K_t = σ_{t-1}² / (σ_{t-1}² + σ_z²)

EKFL Correction Step
Final updates
By fusing the prediction of the robot position (magenta) with the innovation gained from the measurements (green), we get the updated estimate of the robot position (red).

Kalman Filter Localization
Introduction to Kalman Filters
Kalman Filter Localization
1. EKF Localization Overview
2. EKF Prediction
3. EKF Correction
4. Algorithm Summary

EKFL Summary
Prediction:
1. x̂_t = f(x_{t-1}, u_t)
2. P̂_t = F_{x,t} P_{t-1} F_{x,t}^T + F_{u,t} Q_t F_{u,t}^T
Correction:
3. z_{exp,t}^i = h^i(x̂_t, M)
4. v_t = z_t − z_{exp,t}
5. Σ_{IN,t} = H_{x,t}^i P̂_t (H_{x,t}^i)^T + R_t^i
6. K_t = P̂_t H_{x,t}^T (Σ_{IN,t})^{-1}
7. x_t = x̂_t + K_t v_t
8. P_t = P̂_t − K_t Σ_{IN,t} K_t^T
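The whole cycle can be tied together in one sketch. To keep it self-contained, a direct pose measurement (H = I) stands in for the line-feature model h(); the wheelbase b, error constant k, and all numbers are assumed example values, not from the lecture:

```python
import math
import numpy as np

# One full EKF localization cycle (prediction + correction), with a
# simplified direct pose measurement in place of the line-feature model.
b, k = 0.5, 0.01    # assumed wheelbase and odometry error constant

def f(x, u):
    ds = (u[0] + u[1]) / 2.0
    dth = (u[0] - u[1]) / b
    a = x[2] + dth / 2.0
    return np.array([x[0] + ds * math.cos(a),
                     x[1] + ds * math.sin(a),
                     x[2] + dth])

def jacobians(x, u):
    ds = (u[0] + u[1]) / 2.0
    dth = (u[0] - u[1]) / b
    a = x[2] + dth / 2.0
    c, s = math.cos(a), math.sin(a)
    Fx = np.array([[1.0, 0.0, -ds * s],
                   [0.0, 1.0,  ds * c],
                   [0.0, 0.0,  1.0]])
    Fu = np.array([[c / 2 - ds * s / (2 * b), c / 2 + ds * s / (2 * b)],
                   [s / 2 + ds * c / (2 * b), s / 2 - ds * c / (2 * b)],
                   [1.0 / b,                  -1.0 / b]])
    return Fx, Fu

def ekf_step(x, P, u, z, R):
    # Steps 1-2: prediction
    x_pred = f(x, u)
    Fx, Fu = jacobians(x, u)
    Q = np.diag([k * abs(u[0]), k * abs(u[1])])
    P_pred = Fx @ P @ Fx.T + Fu @ Q @ Fu.T
    # Steps 3-5: innovation (H = I for a direct pose measurement)
    H = np.eye(3)
    v = z - x_pred
    S = H @ P_pred @ H.T + R
    # Steps 6-8: gain, state update, covariance update
    K = P_pred @ H.T @ np.linalg.inv(S)
    return x_pred + K @ v, P_pred - K @ S @ K.T

x, P = np.zeros(3), np.eye(3) * 0.1
x, P = ekf_step(x, P, u=(1.0, 1.0),
                z=np.array([1.05, 0.0, 0.0]), R=np.eye(3) * 0.1)
print(np.round(x, 3))   # between the odometry prediction and the measurement
```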

EKFL Example
http://www.youtube.com/watch?v=8mywutacal4