L06. LINEAR KALMAN FILTERS. NA568 Mobile Robotics: Methods & Algorithms



PS2 is out! Landmark-based localization: EKF, UKF, PF

Today's Lecture
- Minimum Mean Square Error (MMSE) estimation
- Linear Kalman Filter
- Gaussian systems: optimal unbiased estimator
- Non-Gaussian systems: Best Linear Unbiased Estimator (BLUE)
- KF falling-body demo

Estimation is... data and prior beliefs entering an estimation engine, which produces an estimate. (From P. Newman, Oxford)

Minimum Mean Squared Error Estimation: choose the estimate $\hat{x}$ so that the expected squared error $E[(x-\hat{x})^T(x-\hat{x}) \mid Z]$ is minimised, where $E[\cdot]$ is the expectation operator ("average"). (From P. Newman, Oxford)

Evaluating this minimisation using probability theory yields a Very Important Thing: the MMSE estimate is the conditional mean, $\hat{x} = E[x \mid Z]$. (From P. Newman, Oxford)

Recursive Bayesian Estimation. Key idea: one man's posterior is another's prior ;-) Given a sequence of data (measurements) $Z^k = \{z_1, \dots, z_k\}$, we want the conditional mean (MMSE estimate) of $x$ given $Z^k$. Can we calculate this iteratively, i.e. update our estimate every time a new measurement comes in? (From P. Newman, Oxford)

Yes. Applying Bayes rule (implicitly dropping the dependence on $Z^{k-1}$): $p(x \mid Z^k) \propto p(z_k \mid x)\, p(x \mid Z^{k-1})$. The likelihood at time $k$ explains the data at time $k$; the prior is the posterior from time $k-1$, viewed as a function of $x$ at time $k$. And if these distributions are Gaussian, turning the handle leads to the Kalman filter. (From P. Newman, Oxford)

Bayes Filter Reminder
Prior: $bel(x_0)$
Prediction: $\overline{bel}(x_t) = \int p(x_t \mid u_t, x_{t-1})\, bel(x_{t-1})\, dx_{t-1}$
Correction: $bel(x_t) = \eta\, p(z_t \mid x_t)\, \overline{bel}(x_t)$

Kalman Filter Distribution: everything is Gaussian, in 1D and in 3D. (Courtesy: K. Arras)

Properties of Gaussians
Univariate: $p(x) = \frac{1}{\sqrt{2\pi}\,\sigma} \exp\!\left(-\frac{(x-\mu)^2}{2\sigma^2}\right)$
Multivariate: $p(x) = \frac{1}{\sqrt{(2\pi)^n |\Sigma|}} \exp\!\left(-\frac{1}{2}(x-\mu)^T \Sigma^{-1} (x-\mu)\right)$
We stay in the Gaussian world as long as we start with Gaussians and perform only linear transformations: if $x \sim \mathcal{N}(\mu, \Sigma)$, then $Ax + b \sim \mathcal{N}(A\mu + b,\; A \Sigma A^T)$.

Gaussian Covariance & Information Parameterizations: A Dual Relationship

                    Covariance Form      Information Form
Marginalization     sub-block            Schur complement
Conditioning        Schur complement     sub-block
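This duality can be checked numerically. A minimal NumPy sketch (the block values below are illustrative, not from the lecture): marginalizing is a sub-block of the covariance but a Schur complement of the information matrix, and conditioning is the reverse.

```python
import numpy as np

# Joint covariance of (a, b), partitioned into 2x2 blocks; values are illustrative.
S_aa = np.array([[2.0, 0.3], [0.3, 1.5]])
S_ab = np.array([[0.4, 0.1], [0.2, 0.5]])
S_bb = np.array([[1.2, 0.2], [0.2, 0.9]])
Sigma = np.block([[S_aa, S_ab], [S_ab.T, S_bb]])

# Information (inverse covariance) form of the same joint.
Lam = np.linalg.inv(Sigma)
L_aa, L_ab, L_bb = Lam[:2, :2], Lam[:2, 2:], Lam[2:, 2:]

# Marginal of a: a sub-block in covariance form...
marg_cov = S_aa
# ...but a Schur complement in information form.
marg_info = L_aa - L_ab @ np.linalg.inv(L_bb) @ L_ab.T
assert np.allclose(np.linalg.inv(marg_info), marg_cov)

# Conditional of a given b: a Schur complement in covariance form...
cond_cov = S_aa - S_ab @ np.linalg.inv(S_bb) @ S_ab.T
# ...but just the sub-block in information form.
assert np.allclose(np.linalg.inv(L_aa), cond_cov)
```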

Discrete Kalman Filter
Estimates the (n×1) state $x_t$ of a discrete-time controlled process governed by the linear stochastic difference equation
$x_t = A_t x_{t-1} + B_t u_t + \delta_t$
observed through (k×1) measurements
$z_t = C_t x_t + \epsilon_t$

Components of a Kalman Filter
$A_t$ — (n×n) matrix that describes how the state evolves from t−1 to t without controls or noise.
$B_t$ — (n×m) matrix that describes how the control $u_t$ changes the state from t−1 to t.
$C_t$ — (k×n) matrix that describes a projection of the state $x_t$ to an observation $z_t$.
$\delta_t$, $\epsilon_t$ — random variables representing the process and measurement noise, assumed independent and normally distributed with covariances $R_t$ and $Q_t$, respectively.

Linear Gaussian Systems: Initialization
The initial belief is normally distributed: $bel(x_0) = \mathcal{N}(x_0;\, \mu_0, \Sigma_0)$

Linear Gaussian Systems: Dynamics
The dynamics are a linear function of state and control plus additive noise:
$x_t = A_t x_{t-1} + B_t u_t + \delta_t$, so $p(x_t \mid u_t, x_{t-1}) = \mathcal{N}(x_t;\, A_t x_{t-1} + B_t u_t, R_t)$

Linear Gaussian Systems: Dynamics (cont.)
The prediction step stays Gaussian:
$\bar{\mu}_t = A_t \mu_{t-1} + B_t u_t$
$\bar{\Sigma}_t = A_t \Sigma_{t-1} A_t^T + R_t$

Linear Gaussian Systems: Observations
Observations are a linear function of state plus additive noise:
$z_t = C_t x_t + \epsilon_t$, so $p(z_t \mid x_t) = \mathcal{N}(z_t;\, C_t x_t, Q_t)$

Linear Gaussian Systems: Observations (cont.)
The correction step stays Gaussian:
$K_t = \bar{\Sigma}_t C_t^T (C_t \bar{\Sigma}_t C_t^T + Q_t)^{-1}$
$\mu_t = \bar{\mu}_t + K_t (z_t - C_t \bar{\mu}_t)$
$\Sigma_t = (I - K_t C_t) \bar{\Sigma}_t$


Kalman Filter Algorithm
Algorithm Kalman_filter($\mu_{t-1}, \Sigma_{t-1}, u_t, z_t$):
Prediction:
$\bar{\mu}_t = A_t \mu_{t-1} + B_t u_t$
$\bar{\Sigma}_t = A_t \Sigma_{t-1} A_t^T + R_t$
Correction:
$K_t = \bar{\Sigma}_t C_t^T (C_t \bar{\Sigma}_t C_t^T + Q_t)^{-1}$
$\mu_t = \bar{\mu}_t + K_t (z_t - C_t \bar{\mu}_t)$
$\Sigma_t = (I - K_t C_t) \bar{\Sigma}_t$
Return $\mu_t, \Sigma_t$
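The algorithm above fits in a few lines of NumPy. This is an illustrative sketch, not the course's reference code; matrix names follow the Probabilistic Robotics notation used in these slides ($A$, $B$, $C$, with $R$ the process noise covariance and $Q$ the measurement noise covariance):

```python
import numpy as np

def kalman_step(mu, Sigma, u, z, A, B, C, R, Q):
    """One predict/correct cycle of the linear Kalman filter
    (Probabilistic Robotics notation: R = process noise cov, Q = measurement noise cov)."""
    # Prediction: push the Gaussian belief through the linear dynamics.
    mu_bar = A @ mu + B @ u
    Sigma_bar = A @ Sigma @ A.T + R
    # Correction: fuse the predicted belief with the measurement z.
    S = C @ Sigma_bar @ C.T + Q             # innovation covariance
    K = Sigma_bar @ C.T @ np.linalg.inv(S)  # Kalman gain
    mu_new = mu_bar + K @ (z - C @ mu_bar)
    Sigma_new = (np.eye(len(mu)) - K @ C) @ Sigma_bar
    return mu_new, Sigma_new
```

Calling `kalman_step` once per time step, feeding each output back in as the next prior, implements the full recursion.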

1D Kalman Filter Example (1): prediction, measurement, correction. It's a weighted mean!

1D Kalman Filter Example (2): prediction, measurement, correction.
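The weighted-mean interpretation is easiest to see in code. A minimal 1D sketch (the numeric values are made up for illustration):

```python
def fuse_1d(mu_pred, var_pred, z, var_meas):
    """1D Kalman correction: the posterior mean is a variance-weighted
    average of the predicted mean and the measurement."""
    k = var_pred / (var_pred + var_meas)   # scalar Kalman gain in [0, 1]
    mu = mu_pred + k * (z - mu_pred)       # weighted mean of mu_pred and z
    var = (1.0 - k) * var_pred             # posterior variance shrinks
    return mu, var

mu, var = fuse_1d(mu_pred=10.0, var_pred=4.0, z=12.0, var_meas=1.0)
# gain k = 4/5 = 0.8, so mu = 10 + 0.8*(12-10) = 11.6 and var = 0.2*4 = 0.8
```

A confident measurement (small `var_meas`) pulls the estimate strongly toward `z`; a confident prediction leaves it near `mu_pred`.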

The Prediction-Correction Cycle: prediction.

The Prediction-Correction Cycle: correction, using the measurement.

The Prediction-Correction Cycle: the full loop alternates prediction and correction.

Alternative (Equivalent) Covariance Update Expressions
Defining the innovation/observation covariance as $S_t = C_t \bar{\Sigma}_t C_t^T + Q_t$, the covariance update can be written equivalently as (see Bar-Shalom, Chap. 5):
1. $\Sigma_t = (I - K_t C_t)\, \bar{\Sigma}_t$
2. $\Sigma_t = \bar{\Sigma}_t - K_t S_t K_t^T$
3. $\Sigma_t = (I - K_t C_t)\, \bar{\Sigma}_t\, (I - K_t C_t)^T + K_t Q_t K_t^T$ (Joseph form)
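A small sketch, assuming NumPy and illustrative matrix values, showing that the Joseph form (expression 3) matches the simple form (expression 1) when the optimal gain is used; the Joseph form is also valid for a suboptimal gain and is symmetric positive semi-definite by construction, which makes it more robust in floating point:

```python
import numpy as np

def joseph_update(Sigma_bar, K, C, Q):
    """Joseph-form covariance update: valid for any gain K and
    guaranteed symmetric positive semi-definite in floating point."""
    I_KC = np.eye(Sigma_bar.shape[0]) - K @ C
    return I_KC @ Sigma_bar @ I_KC.T + K @ Q @ K.T

# With the optimal gain, the Joseph form agrees with (I - K C) Sigma_bar.
Sigma_bar = np.array([[2.0, 0.5], [0.5, 1.0]])
C = np.array([[1.0, 0.0]])
Q = np.array([[0.3]])
S = C @ Sigma_bar @ C.T + Q
K = Sigma_bar @ C.T @ np.linalg.inv(S)
simple = (np.eye(2) - K @ C) @ Sigma_bar
assert np.allclose(joseph_update(Sigma_bar, K, C, Q), simple)
```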

What if the statistics are not Gaussian? The structure of the KF corresponds to the Best Linear Unbiased Estimator (BLUE): if we restrict our estimator to the class of linear (affine in $z$) estimators, then the KF is the best linear MMSE estimator.* The best estimator (in the MMSE sense) for Gaussian random variables is identical to the best linear estimator for arbitrarily distributed random variables with the same first- and second-order moments. This result will be proved (and used) in L07.
*Note: a nonlinear estimator could do better!

Falling Body Example
Governing equation: $\ddot{y} = -g$, with initial conditions $y_0$, $v_0$.
CT state-space description, with state $x = [y \;\; v]^T$: $\dot{y} = v$, $\dot{v} = -g$.
DT state-space description, with sample time $\Delta t$:
$y_{k+1} = y_k + \Delta t\, v_k - \tfrac{1}{2} g\, \Delta t^2$
$v_{k+1} = v_k - g\, \Delta t$
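The DT model can be written down directly. A minimal sketch, assuming a sample time of dt = 0.1 s (not specified here) and the state ordering x = [y, v]:

```python
import numpy as np

def falling_body_model(dt, g=9.81):
    """Discrete-time free-fall model for x = [y, v]:
    y_{k+1} = y_k + dt*v_k - 0.5*g*dt^2,  v_{k+1} = v_k - g*dt."""
    A = np.array([[1.0, dt],
                  [0.0, 1.0]])
    b = np.array([-0.5 * g * dt**2, -g * dt])  # deterministic gravity term
    return A, b

A, b = falling_body_model(dt=0.1)
x = np.array([100.0, 0.0])  # drop from 100 m, starting at rest (illustrative)
x = A @ x + b               # one step: y = 99.95095 m, v = -0.981 m/s
```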

ProbRob Notation: initial state $bel(x_0) = \mathcal{N}(x_0;\, \mu_0, \Sigma_0)$; process model $x_t = A_t x_{t-1} + B_t u_t + \delta_t$; observation model $z_t = C_t x_t + \epsilon_t$.

Falling Body: Prediction. The process model builds correlation between position and velocity. The falling body's position uncertainty increases during open-loop prediction due to uncertainty in velocity.

Falling Body: Correction. Zero state uncertainty case: the Kalman gain goes to zero and the measurement is ignored.

Infinite state uncertainty case: the gain fully weights the measurement and the corrected estimate follows the observation.

Noisy Range Measurements (σ = 12 m)
[Figure: noisy measured falling-body position, y [m] vs. t [s], over t = 0 to 4.5 s.]

KF Results
[Figures: estimated falling-body position and velocity vs. time, each with 3σ error bounds, plotted against the true and measured values. Initial conditions: x_0 = [89.1859, -8.32792]^T, diag(P_0)^{0.5} = [25, 5]^T, R^{0.5} = 12.]
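A minimal simulation in the spirit of this demo, using the slide's initial estimate, initial covariance, and measurement noise. The true initial state (100 m, at rest), the sample time, the step count, and the absence of process noise are assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
dt, g, n_steps = 0.1, 9.81, 45
A = np.array([[1.0, dt], [0.0, 1.0]])
b = np.array([-0.5 * g * dt**2, -g * dt])  # deterministic gravity term
C = np.array([[1.0, 0.0]])                 # range sensor sees position only
sigma_r = 12.0                             # measurement std dev (from the slides)

x_true = np.array([100.0, 0.0])            # assumed true initial state
mu = np.array([89.1859, -8.32792])         # initial estimate (from the slides)
P = np.diag([25.0, 5.0]) ** 2              # initial covariance (from the slides)

for _ in range(n_steps):
    # Simulate the truth and a noisy range measurement.
    x_true = A @ x_true + b
    z = C @ x_true + rng.normal(0.0, sigma_r, size=1)
    # Predict (the free-fall model is deterministic, so no process noise).
    mu = A @ mu + b
    P = A @ P @ A.T
    # Correct.
    S = C @ P @ C.T + sigma_r**2
    K = P @ C.T @ np.linalg.inv(S)
    mu = mu + K @ (z - C @ mu)
    P = (np.eye(2) - K @ C) @ P
```

Plotting `mu` and ±3√diag(P) over time against `x_true` reproduces the kind of error-bound plots described above.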

Commonly used: the i|j notation. $\hat{x}(i \mid j)$ denotes the estimate of the true state $x$ at time $i$ given data up to time $j$. The true state is useful for derivations, but we can never use it in a calculation, as $x$ is unknown truth!

Kalman filter loop (one cycle):

New control u(k) available? (input to the plant enters here, e.g. steering wheel angle)
  yes: x(k|k-1) = F x(k-1|k-1) + B u(k);  P(k|k-1) = F P(k-1|k-1) F^T + G Q G^T
  no:  prediction is the last estimate: x(k|k-1) = x(k-1|k-1);  P(k|k-1) = P(k-1|k-1)

New observation z(k) available? (data from sensors enter the algorithm here, e.g. a compass measurement)
  yes: prepare for update:
         S = H P(k|k-1) H^T + R
         W = P(k|k-1) H^T S^{-1}
         v(k) = z(k) - H x(k|k-1)
       update:
         x(k|k) = x(k|k-1) + W v(k)
         P(k|k) = P(k|k-1) - W S W^T
  no:  estimate is the last prediction: x(k|k) = x(k|k-1);  P(k|k) = P(k|k-1)

Delay; k = k+1; repeat.

Kalman Filter Summary
Highly efficient: polynomial in measurement dimensionality k and state dimensionality n: $O(k^{2.376} + n^2)$.
Optimal for linear Gaussian systems!
But most robotics systems are nonlinear!