ROBOTICS 01PEEQW Basilio Bona DAUIN Politecnico di Torino
Probabilistic Fundamentals in Robotics Gaussian Filters
Course Outline
- Basic mathematical framework
- Probabilistic models of mobile robots
- Mobile robot localization problem
- Robotic mapping
- Probabilistic planning and control
Reference textbook: Thrun, Burgard, Fox, Probabilistic Robotics, MIT Press, 2006, http://www.probabilistic-robotics.org/
Basic mathematical framework
- Recursive state estimation
- Basic concepts in probability
- Robot environment
- Bayes filters
Gaussian filters (parametric filters)
- Kalman filter
- Extended Kalman filter
- Unscented Kalman filter
- Information filter
Nonparametric filters
- Histogram filter
- Particle filter
Introduction
Gaussian filters are different implementations of Bayes filters for continuous spaces, with specific assumptions on the probability distributions. Beliefs are represented by multivariate normal distributions.
Multivariate Gaussian distribution
p(x) = det(2π Σ)^(−1/2) exp(−½ (x − μ)ᵀ Σ⁻¹ (x − μ))
with mean vector μ and covariance matrix Σ (symmetric, positive semidefinite)
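A minimal numerical sketch of evaluating this density (the helper name gaussian_pdf is illustrative, not from the slides):

```python
import numpy as np

def gaussian_pdf(x, mu, Sigma):
    """Evaluate the multivariate normal density N(x; mu, Sigma)."""
    d = len(mu)
    diff = x - mu
    norm = 1.0 / np.sqrt((2 * np.pi) ** d * np.linalg.det(Sigma))
    return norm * np.exp(-0.5 * diff @ np.linalg.solve(Sigma, diff))

# 2-D example with a diagonal covariance (values invented for illustration)
mu = np.array([0.0, 0.0])
Sigma = np.diag([1.0, 4.0])
p = gaussian_pdf(mu, mu, Sigma)   # peak value: 1 / (2*pi*sqrt(det(Sigma)))
```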
Examples (figures): bi-dimensional Gaussian with conditional probabilities; mixture of Gaussians.
Covariance matrix
Kalman filter (1)
The Kalman filter (KF) [Swerling 1958, Kalman 1960] applies to linear Gaussian systems. The KF computes the belief for continuous states governed by linear dynamic state equations. Beliefs are expressed by normal distributions. The KF is not applicable to discrete or hybrid state-space systems.
Kalman filter (2)-(4): the linear Gaussian model
State equation: x_t = A_t x_{t−1} + B_t u_t + ε_t, with Gaussian motion noise ε_t ~ N(0, R_t)
Measurement equation: z_t = C_t x_t + δ_t, with Gaussian measurement noise δ_t ~ N(0, Q_t)
Under these assumptions the belief remains Gaussian at all times.
Kalman filter algorithm (1)
Prediction: μ̄_t = A_t μ_{t−1} + B_t u_t, Σ̄_t = A_t Σ_{t−1} A_tᵀ + R_t
Kalman gain: K_t = Σ̄_t C_tᵀ (C_t Σ̄_t C_tᵀ + Q_t)⁻¹, where C_t Σ̄_t C_tᵀ + Q_t is the innovation (residuals) covariance
Update: μ_t = μ̄_t + K_t (z_t − C_t μ̄_t), Σ_t = (I − K_t C_t) Σ̄_t
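The prediction/update cycle can be sketched in a few lines of Python (NumPy; the function names kf_predict and kf_update are illustrative, and the scalar example values are invented):

```python
import numpy as np

def kf_predict(mu, Sigma, A, B, u, R):
    """Prediction: propagate mean and covariance through the linear motion model."""
    mu_bar = A @ mu + B @ u
    Sigma_bar = A @ Sigma @ A.T + R          # R: motion noise covariance
    return mu_bar, Sigma_bar

def kf_update(mu_bar, Sigma_bar, C, z, Q):
    """Update: correct the prediction with the measurement residual."""
    S = C @ Sigma_bar @ C.T + Q              # innovation (residuals) covariance
    K = Sigma_bar @ C.T @ np.linalg.inv(S)   # Kalman gain
    mu = mu_bar + K @ (z - C @ mu_bar)       # residual weighted by the gain
    Sigma = (np.eye(len(mu_bar)) - K @ C) @ Sigma_bar
    return mu, Sigma

# scalar example: constant-position model with one noisy measurement
A = B = C = np.eye(1)
mu0, Sigma0 = np.array([0.0]), np.eye(1)
mu_bar, Sigma_bar = kf_predict(mu0, Sigma0, A, B, np.array([1.0]), R=0.5 * np.eye(1))
mu1, Sigma1 = kf_update(mu_bar, Sigma_bar, C, z=np.array([2.0]), Q=0.5 * np.eye(1))
```

Note how the updated covariance (0.375) is smaller than the predicted one (1.5): the measurement adds information.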
Block diagram of the Kalman filter: the input u_t enters through B_t and the previous mean μ_{t−1} propagates through A_t to form the predicted mean μ̄_t; the predicted measurement ẑ_t = C_t μ̄_t is compared with z_t, and the residual, weighted by the Kalman gain K_t, corrects the estimate to μ_t.
Kalman filter algorithm (2) (figure): dynamic Bayes network with hidden states and visible measurements.
Kalman filter example (figures): initial state, measurement update, prediction, measurement update.
From Kalman filter to extended Kalman filter
The Kalman filter is based on linearity assumptions:
- Gaussian random variables are expressed by the means and covariance matrices of normal distributions
- Gaussian distributions are transformed into Gaussian distributions
- the Kalman filter is optimal
- the Kalman filter is efficient
Linear transformation of Gaussians
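A quick empirical check of this property: if x ~ N(μ, Σ), then y = A x + b ~ N(A μ + b, A Σ Aᵀ). The example matrices below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# x ~ N(mu, Sigma); y = A x + b is again Gaussian
mu = np.array([1.0, 2.0])
Sigma = np.array([[1.0, 0.3], [0.3, 2.0]])
A = np.array([[2.0, 0.0], [1.0, 1.0]])
b = np.array([0.5, -1.0])

# closed-form mean and covariance of the transformed distribution
mu_y = A @ mu + b
Sigma_y = A @ Sigma @ A.T

# empirical check with samples
xs = rng.multivariate_normal(mu, Sigma, size=200_000)
ys = xs @ A.T + b
```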
Extended Kalman Filter (EKF)
When the linearity assumptions do not hold (as in robot motion models or orientation models), a closed-form solution for the predicted belief does not exist (nonlinear state & measurement equations). The Extended Kalman Filter (EKF) approximates the nonlinear transformations with linear ones. Linearization is performed around the most likely value, i.e., the mean value.
EKF Example (figure): Monte Carlo generated distribution, transformed mean value, approximating mean value, approximating Gaussian. The approximating Gaussian uses the mean and covariance of the Monte Carlo generated distribution.
EKF Example (figure)
EKF Gaussian: the normal distribution obtained by propagating the mean and covariance through the EKF linearization
Approximating Gaussian: the normal distribution built using the mean and covariance of the true nonlinear distribution
EKF linearization: first-order Taylor expansion
g(u_t, x_{t−1}) ≈ g(u_t, μ_{t−1}) + G_t (x_{t−1} − μ_{t−1}), with G_t = ∂g/∂x evaluated at μ_{t−1}
h(x_t) ≈ h(μ̄_t) + H_t (x_t − μ̄_t), with H_t = ∂h/∂x evaluated at μ̄_t
The linearization depends only on the mean.
EKF algorithm
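A sketch of one EKF cycle, assuming nonlinear models g, h with Jacobians G, H supplied by the caller. The range-only toy example and all its numeric values are invented for illustration:

```python
import numpy as np

def ekf_step(mu, Sigma, u, z, g, G, h, H, R, Q):
    """One EKF cycle: g, h nonlinear; G, H their Jacobians evaluated at the mean."""
    # prediction through the nonlinear motion model
    mu_bar = g(mu, u)
    Gt = G(mu, u)                              # Jacobian at the prior mean
    Sigma_bar = Gt @ Sigma @ Gt.T + R
    # update through the linearized measurement model
    Ht = H(mu_bar)                             # Jacobian at the predicted mean
    S = Ht @ Sigma_bar @ Ht.T + Q
    K = Sigma_bar @ Ht.T @ np.linalg.inv(S)
    mu_new = mu_bar + K @ (z - h(mu_bar))
    Sigma_new = (np.eye(len(mu)) - K @ Ht) @ Sigma_bar
    return mu_new, Sigma_new

# toy example: planar position with displacement motion and a range-only sensor
g = lambda x, u: x + u                                # motion: simple displacement
G = lambda x, u: np.eye(2)                            # Jacobian of g
h = lambda x: np.array([np.linalg.norm(x)])           # range to the origin
H = lambda x: (x / np.linalg.norm(x)).reshape(1, 2)   # Jacobian of h

mu, Sigma = ekf_step(np.array([1.0, 0.0]), 0.1 * np.eye(2),
                     u=np.array([1.0, 0.0]), z=np.array([2.2]),
                     g=g, G=G, h=h, H=H,
                     R=0.01 * np.eye(2), Q=np.array([[0.04]]))
```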
KF vs EKF
Features
The EKF is a very popular tool for state estimation in robotics. It has the same time complexity as the KF, and it is robust and simple.
Limitations: state and measurement functions are rarely linear. The goodness of the linear approximation depends on:
- the degree of uncertainty
- the degree of nonlinearity
When using the EKF, the uncertainty must be kept as small as possible.
Uncertainty (figures): the linearization error grows when the prior distribution is more uncertain.
Nonlinearity (figures): the linearization error grows when the function is more nonlinear.
Example: EKF localization within a sensor infrastructure
Fixed sensors are deployed in known positions inside the environment. The mobile robot can acquire odometric measurements and distance information from the sensors in known positions. The figures show the true position of the mobile robot and the KF estimate, starting at t = 0 (figure sequence: Luca Carlone, Politecnico di Torino).
STEP 1: acquire odometry - filter prediction - acquire measurements - filter update
STEP 2: acquire odometry - filter prediction - acquire measurements - filter update
...
Unscented Kalman Filter (UKF)
The UKF performs a stochastic linearization based on a weighted statistical linear regression. A deterministic sampling technique (the unscented transform) is used to pick a minimal set of sample points (sigma points) around the mean value of the normal pdf. The sigma points are propagated through the nonlinear functions and then used to compute the mean and covariance of the transformed distribution. This approach removes the need to explicitly compute Jacobians (which can be difficult to calculate for complex functions) and produces a more accurate estimate of the posterior distribution.
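A sketch of the unscented transform using the common scaled sigma-point scheme (the parameter defaults alpha, beta, kappa are conventional choices, not taken from the slides). For a linear map the transform recovers the exact mean and covariance, which the example below checks:

```python
import numpy as np

def sigma_points(mu, Sigma, alpha=1e-3, beta=2.0, kappa=0.0):
    """2n+1 deterministic sample points around the mean, plus their weights."""
    n = len(mu)
    lam = alpha**2 * (n + kappa) - n
    L = np.linalg.cholesky((n + lam) * Sigma)          # matrix square root
    pts = [mu] + [mu + L[:, i] for i in range(n)] + [mu - L[:, i] for i in range(n)]
    wm = np.full(2 * n + 1, 1.0 / (2.0 * (n + lam)))   # mean weights
    wc = wm.copy()                                     # covariance weights
    wm[0] = lam / (n + lam)
    wc[0] = wm[0] + (1.0 - alpha**2 + beta)
    return np.array(pts), wm, wc

def unscented_transform(f, mu, Sigma):
    """Propagate sigma points through f, then recover mean and covariance."""
    pts, wm, wc = sigma_points(mu, Sigma)
    ys = np.array([f(p) for p in pts])
    mu_y = wm @ ys
    diff = ys - mu_y
    Sigma_y = (wc[:, None] * diff).T @ diff
    return mu_y, Sigma_y

# sanity check with an invented linear map: result must match A mu, A Sigma A^T
A = np.array([[1.0, 2.0], [0.0, 1.0]])
mu0 = np.array([1.0, -1.0])
Sigma0 = np.array([[2.0, 0.5], [0.5, 1.0]])
mu_y, Sigma_y = unscented_transform(lambda x: A @ x, mu0, Sigma0)
```

No Jacobian of the map appears anywhere: only function evaluations at the sigma points are needed.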
UKF
UKF Algorithm part a)
UKF Algorithm part b) Cross covariance
EKF vs UKF
KF vs EKF vs UKF (comparison figures)
Information filters
The belief is represented by Gaussians, in one of two parameterizations:
- Moments parameterization (KF, EKF, UKF): mean μ and covariance Σ
- Canonical parameterization (IF, EIF): information vector ξ and information matrix Ω
The two representations are dual.
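Converting between the two parameterizations is a pair of matrix operations: Ω = Σ⁻¹ and ξ = Σ⁻¹ μ. A minimal sketch with invented example values:

```python
import numpy as np

# moments form: (mu, Sigma); canonical form: (xi, Omega)
mu = np.array([1.0, 2.0])
Sigma = np.array([[2.0, 0.5], [0.5, 1.0]])

Omega = np.linalg.inv(Sigma)   # information matrix
xi = Omega @ mu                # information vector

# going back recovers the moments
mu_back = np.linalg.solve(Omega, xi)
Sigma_back = np.linalg.inv(Omega)
```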
Multivariate normal distribution in canonical form: p(x) ∝ exp(−½ xᵀ Ω x + ξᵀ x), with Ω = Σ⁻¹ and ξ = Σ⁻¹ μ
Mahalanobis distance
d(x, μ) = sqrt((x − μ)ᵀ Σ⁻¹ (x − μ))
(figures: points at the same Euclidean distance vs. points at the same Mahalanobis distance)
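A minimal sketch of the distance and of the effect shown in the figures (the example covariance is invented): two points at the same Euclidean distance from the mean can have different Mahalanobis distances.

```python
import numpy as np

def mahalanobis(x, mu, Sigma):
    """sqrt((x - mu)^T Sigma^{-1} (x - mu))."""
    diff = x - mu
    return np.sqrt(diff @ np.linalg.solve(Sigma, diff))

mu = np.zeros(2)
Sigma = np.diag([4.0, 1.0])       # more spread along the first axis
a = np.array([2.0, 0.0])          # same Euclidean distance from mu...
b = np.array([0.0, 2.0])          # ...but different Mahalanobis distance
d_a = mahalanobis(a, mu, Sigma)   # one "sigma" along the wide axis
d_b = mahalanobis(b, mu, Sigma)   # two "sigmas" along the narrow axis
```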
IF algorithm
IF vs KF (duality)
- IF: the prediction step requires two matrix inversions; the measurement update is additive
- KF: the prediction step is additive; the measurement update requires a matrix inversion
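The additive measurement update in canonical form, Ω ← Ω̄ + Cᵀ Q⁻¹ C and ξ ← ξ̄ + Cᵀ Q⁻¹ z, can be checked against the KF update on a scalar example (all values invented for illustration):

```python
import numpy as np

# prior in moments form, converted to canonical form
Sigma_bar = np.array([[1.5]])
mu_bar = np.array([1.0])
Omega_bar = np.linalg.inv(Sigma_bar)
xi_bar = Omega_bar @ mu_bar

C = np.array([[1.0]])   # measurement matrix
Q = np.array([[0.5]])   # measurement noise covariance
z = np.array([2.0])

# additive measurement update (no inversion of the state covariance needed here)
Qinv = np.linalg.inv(Q)
Omega = Omega_bar + C.T @ Qinv @ C
xi = xi_bar + C.T @ Qinv @ z

# back to moments form for comparison with the KF update
mu = np.linalg.solve(Omega, xi)
Sigma = np.linalg.inv(Omega)
```

For these numbers the KF update gives mean 1.75 and variance 0.375, and the canonical update recovers exactly the same values.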
Extended information filter (EIF)
It is similar to the EKF and applies when the state and measurement equations are nonlinear. The Jacobians G and H replace the A, B and C matrices. The state estimate μ = Ω⁻¹ ξ must be recovered explicitly.
Practical considerations
IF advantages over the KF:
- simpler global uncertainty representation: set Ω = 0
- numerically more stable (in many, but not all, robotics applications)
- integrates information in a simpler way
- a natural fit for multi-robot problems (decentralized data integration => Bayes rule => logarithmic form => addition of terms => arbitrary order)
IF limitations:
- recovering the state estimate requires a matrix inversion
- other matrix inversions are necessary (not required by the EKF)
- computationally inferior to the EKF for high-dimensional state spaces
Final comments
In many problems the interaction between state variables is local => structure on Ω => sparseness of Ω, but not of Σ
Information filters can be viewed as graphs: a sparse information matrix corresponds to a sparse graph
Such graphs are known as Gaussian Markov random fields
Thank you. Any questions?