Kalman Filter. Man-Wai MAK
1 Kalman Filter

Man-Wai MAK
Dept. of Electronic and Information Engineering, The Hong Kong Polytechnic University

References:
- S. Gannot and A. Yeredor, "The Kalman Filter," in Springer Handbook of Speech Processing, J. Benesty, M. M. Sondhi, and Y. Huang (Eds.), Springer.
- R. Faragher, "Understanding the Basis of the Kalman Filter Via a Simple and Intuitive Derivation," IEEE Signal Processing Magazine, Sept. 2012.

Man-Wai MAK (EIE), Kalman Filter, April 20, 2017
2 Overview

1. Fundamentals of the Kalman Filter
   - What is the Kalman Filter
   - MMSE Linear Optimal Estimator
2. The Kalman Filter and Its Applications
   - Kalman Filter Formulations
   - 1-Dimensional Example
   - Kalman Filter's Output and Optimal Linear Estimate
3 What is the Kalman Filter

Named after Rudolf E. Kalman, the Kalman filter is one of the most important and widely used data-fusion algorithms today. Its most famous early use was in the Apollo navigation computer that took Neil Armstrong to the moon.

The Kalman filter has found numerous applications in the control of dynamical systems, e.g., predicting the trajectory of celestial bodies, missiles, and microscopic particles. Kalman filters are at work in every satellite navigation device, every smartphone, UAVs, and many computer games.

The Kalman filter is a Bayesian model similar to a hidden Markov model, but one in which the state space of the latent variables is continuous and all latent and observed variables have Gaussian distributions.
4 Areas of Application

The Kalman filter is suitable for applications where
- the variables of interest can only be measured indirectly;
- measurements are available from various sensors but may be subject to noise;
- you have uncertain information about a dynamic system but can make an educated guess about what the system will do next.
5 MMSE Linear Optimal Estimator

Let x and y be two random vectors with an arbitrary distribution having finite moments up to second order. Denote z = [x^T y^T]^T. The mean and covariance of z are given by

  \eta_z = E\{z\} = \begin{bmatrix} \eta_x \\ \eta_y \end{bmatrix},
  \quad
  C_{zz} = E\{(z - \eta_z)(z - \eta_z)^T\} = \begin{bmatrix} C_{xx} & C_{xy} \\ C_{xy}^T & C_{yy} \end{bmatrix}.

We estimate x from the observation y using

  \hat{x} = Ay + b,    (1)

where \epsilon = \hat{x} - x is the estimation error. It is desired to find the linear estimator that attains the smallest mean-square-error (MSE) matrix:

  P = E\{\epsilon \epsilon^T\}.
6 MMSE Linear Optimal Estimator

To find the optimal A and b, we rewrite \epsilon as

  \epsilon = Ay + b - x = A(y - \eta_y) - (x - \eta_x) + c,

where c = b + A\eta_y - \eta_x. Therefore,

  P = E\{[A(y - \eta_y) - (x - \eta_x) + c][A(y - \eta_y) - (x - \eta_x) + c]^T\}
    = A E\{(y - \eta_y)(y - \eta_y)^T\} A^T - A E\{(y - \eta_y)(x - \eta_x)^T\} - E\{(x - \eta_x)(y - \eta_y)^T\} A^T
      + E\{(x - \eta_x)(x - \eta_x)^T\} + cc^T - E\{x - \eta_x\} c^T - c E\{(x - \eta_x)^T\}
      + A E\{y - \eta_y\} c^T + c E\{(y - \eta_y)^T\} A^T
    = A C_{yy} A^T - A C_{yx} - C_{xy} A^T + C_{xx} + cc^T,    (2)

where the last step uses E\{x - \eta_x\} = 0 and E\{y - \eta_y\} = 0.
7 MMSE Linear Optimal Estimator

Note that only c depends on b in Eq. 2 and that a^T cc^T a \ge 0 for all non-zero vectors a. Therefore, P is minimal when c = 0, which suggests

  b = \eta_x - A\eta_y.    (3)

Substituting Eq. 3 into Eq. 2, we have

  P = A C_{yy} A^T - A C_{yx} - C_{xy} A^T + C_{xx}
    = (A - C_{xy} C_{yy}^{-1}) C_{yy} (A - C_{xy} C_{yy}^{-1})^T + C_{xx} - C_{xy} C_{yy}^{-1} C_{xy}^T.

Therefore, P is minimal when A = C_{xy} C_{yy}^{-1}, which gives

  P_{min} = C_{xx} - C_{xy} C_{yy}^{-1} C_{xy}^T.

The optimal linear estimator is

  \hat{x}_{opt} = Ay + b = Ay + \eta_x - A\eta_y = \eta_x + C_{xy} C_{yy}^{-1} (y - \eta_y).
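As a numerical sketch (using synthetic Gaussian data and sample moments, not anything from the slides), the optimal linear estimator and its minimal MSE matrix can be checked directly:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic joint Gaussian data: x is 1-D, y is 2-D, so z = [x^T y^T]^T is 3-D
n = 100_000
mu = np.array([1.0, -2.0, 0.5])
M = rng.standard_normal((3, 3))               # mixing matrix, so C_zz = M M^T
z = mu + rng.standard_normal((n, 3)) @ M.T

x, y = z[:, :1], z[:, 1:]
eta_x, eta_y = x.mean(axis=0), y.mean(axis=0)
C = np.cov(z.T)                               # sample covariance of z
C_xx, C_xy, C_yy = C[:1, :1], C[:1, 1:], C[1:, 1:]

# Optimal linear estimator: x_hat = eta_x + C_xy C_yy^{-1} (y - eta_y)
A = C_xy @ np.linalg.inv(C_yy)
x_hat = eta_x + (y - eta_y) @ A.T

# The error covariance equals P_min = C_xx - C_xy C_yy^{-1} C_xy^T
P_emp = np.cov((x_hat - x).T)
P_min = C_xx - A @ C_xy.T
print(np.allclose(P_emp, P_min))  # True
```

Because A and the covariances are computed from the same sample moments, the empirical error covariance matches P_min to machine precision.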
8 MMSE Linear Optimal Estimator

In summary, when the linear estimator \hat{x} is optimal, we have

  E\{\epsilon\} = c = 0

and

  E\{\epsilon y^T\} = E\{[A(y - \eta_y) - (x - \eta_x) + c] y^T\}
                    = E\{[A(y - \eta_y) - (x - \eta_x) + c](y - \eta_y)^T\}
                      + E\{[A(y - \eta_y) - (x - \eta_x) + c]\} \eta_y^T
                    = A C_{yy} - C_{xy} + c\eta_y^T = 0,

because A = C_{xy} C_{yy}^{-1} and c = 0.

The condition that the error \epsilon and the observation y are uncorrelated agrees with the orthogonality principle. The orthogonality principle will be used in deriving the formulas of Kalman filters.
9 Kalman Filter

For every new y, we need to recompute the optimal A and b, which involves computing covariance matrices and their inverses. The computation becomes very demanding when the number of observations is large, and the required storage grows as more measurements become available.

It is computationally more appealing to recursively update the optimal estimate whenever a new observation is received, without storing past observations. Such a recursive (linear, optimal) estimation scheme is known as Kalman filtering.
10 Kalman Filter Formulations

The Kalman filter assumes that the state of a system at time t evolves from the prior state at time t-1 according to the equation

  x_t = F_t x_{t-1} + B_t u_t + w_t,    (4)

where
- x_t is the state vector containing the terms of interest for the system at time t, e.g., position and velocity;
- u_t is the vector containing any control inputs, e.g., steering angle, throttle setting, and braking force;
- F_t is the state-transition matrix, which applies the effect of each system state parameter at time t-1 on the system state at time t, e.g., the position and velocity at time t-1 both affect the position at time t;
- B_t is the control-input matrix, which applies the effect of each control input in u_t on the state vector, e.g., the effect of the throttle setting on the system velocity and position;
- w_t is the vector containing the process noise, with covariance matrix Q_t.
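For a concrete (hypothetical) instance: with state x_t = [position, velocity]^T sampled every dt seconds and u_t a commanded acceleration, F_t and B_t could take the familiar constant-velocity forms (all numbers below are illustrative assumptions):

```python
import numpy as np

dt = 0.1  # sampling interval in seconds (assumed)

# State x_t = [position, velocity]^T
F = np.array([[1.0, dt],     # new position depends on old position and old velocity
              [0.0, 1.0]])   # velocity carries over unchanged
B = np.array([[0.5 * dt**2],  # an acceleration input u_t adds 0.5*u*dt^2 to position
              [dt]])          # and u*dt to velocity

x_prev = np.array([[100.0], [5.0]])  # 100 m along the track, moving at 5 m/s
u = np.array([[2.0]])                # 2 m/s^2 throttle command
x_pred = F @ x_prev + B @ u          # Eq. 4 without the noise term w_t
print(x_pred.ravel())                # position 100.51 m, velocity 5.2 m/s
```

This is the deterministic part of Eq. 4; adding a draw of w_t with covariance Q_t would complete the model.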
11 Kalman Filter Formulations

In Eq. 4, x_{t-1} is analogous to y in Eq. 1, x_t is analogous to \hat{x}, and w_t is analogous to \epsilon. The system is measured according to the model

  z_t = H_t x_t + v_t,    (5)

where
- z_t is the vector of measurements;
- H_t is the transformation matrix that maps the state-vector parameters into the measurement domain;
- v_t is a vector containing the measurement noise, with covariance matrix R_t.

The true state x_t cannot be directly observed; the Kalman filter provides an algorithm for determining an estimate \hat{x}_t by combining models of the system with noisy measurements of certain parameters or linear functions of parameters. The covariance of the estimation error at time t is denoted P_t.
12 Kalman Filter Formulations

The Kalman filter algorithm involves two stages: prediction and measurement update.

Prediction stage (estimate \hat{x}_{t|t-1} from previous measurements up to z_{t-1}):

  \hat{x}_{t|t-1} = F_t \hat{x}_{t-1|t-1} + B_t u_t    (6)
  P_{t|t-1} = F_t P_{t-1|t-1} F_t^T + Q_t    (7)

Eq. 6 means that the new best estimate is a prediction made from the previous best estimate, plus a correction for known external influences. Eq. 7 means that the new uncertainty is predicted from the old uncertainty, with some additional uncertainty from the environment.
13 Kalman Filter Formulations

Measurement update (obtain \hat{x}_{t|t} from the current measurement z_t):

  \hat{x}_{t|t} = \hat{x}_{t|t-1} + K_t (z_t - H_t \hat{x}_{t|t-1})    (8)
  P_{t|t} = P_{t|t-1} - K_t H_t P_{t|t-1}    (9)
  K_t = P_{t|t-1} H_t^T (H_t P_{t|t-1} H_t^T + R_t)^{-1}    (10)

In Eq. 8, if the measurement z_t totally agrees with the prediction, then the current prediction is good enough and no update is necessary.
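The predict-update cycle of Eq. 6 to Eq. 10 can be sketched for a two-state (position, velocity) model; all numerical values here (F, H, Q, R, the initial state, and the noise-free control input) are illustrative assumptions, not taken from the slides:

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative constant-velocity model (assumed values)
dt = 1.0
F = np.array([[1.0, dt], [0.0, 1.0]])   # state: [position, velocity]
H = np.array([[1.0, 0.0]])              # we measure position only
Q = 0.01 * np.eye(2)                    # process-noise covariance
R = np.array([[4.0]])                   # measurement-noise covariance

# Simulate the true system (Eq. 4 and Eq. 5, with u_t = 0)
x_true = np.array([[0.0], [1.0]])
xs, zs = [], []
for _ in range(50):
    x_true = F @ x_true + rng.multivariate_normal([0, 0], Q).reshape(2, 1)
    zs.append(H @ x_true + rng.normal(0, np.sqrt(R[0, 0]), (1, 1)))
    xs.append(x_true)

# Kalman filter: prediction (Eq. 6-7) then measurement update (Eq. 8-10)
x_est = np.array([[0.0], [0.0]])
P = 10.0 * np.eye(2)
for z in zs:
    x_pred = F @ x_est                                        # Eq. 6 (no control)
    P_pred = F @ P @ F.T + Q                                  # Eq. 7
    K = P_pred @ H.T @ np.linalg.inv(H @ P_pred @ H.T + R)    # Eq. 10
    x_est = x_pred + K @ (z - H @ x_pred)                     # Eq. 8
    P = P_pred - K @ H @ P_pred                               # Eq. 9

err_filt = abs(x_est[0, 0] - xs[-1][0, 0])
err_meas = abs(zs[-1][0, 0] - xs[-1][0, 0])
print(err_filt, err_meas)  # the filtered estimate is typically closer than the raw measurement
```

Note how P shrinks from its deliberately large initial value as measurements arrive: the filter's confidence improves even though each individual measurement remains noisy.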
14 Kalman Filter Formulations

To show Eq. 7, we subtract Eq. 6 from Eq. 4 and compute the covariance of the prediction error:

  P_{t|t-1} = E\{(x_t - \hat{x}_{t|t-1})(x_t - \hat{x}_{t|t-1})^T\}
            = E\{(F_t x_{t-1} - F_t \hat{x}_{t-1|t-1} + w_t)(F_t x_{t-1} - F_t \hat{x}_{t-1|t-1} + w_t)^T\}
            = F_t E\{(x_{t-1} - \hat{x}_{t-1|t-1})(x_{t-1} - \hat{x}_{t-1|t-1})^T\} F_t^T
              + F_t E\{(x_{t-1} - \hat{x}_{t-1|t-1}) w_t^T\} + E\{w_t (x_{t-1} - \hat{x}_{t-1|t-1})^T\} F_t^T + E\{w_t w_t^T\}
            = F_t P_{t-1|t-1} F_t^T + Q_t,

where we have used the property that the estimation error and the noise are uncorrelated.
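A quick Monte-Carlo check of Eq. 7 (with arbitrary illustrative values for F, Q, and P_{t-1|t-1}): propagating sampled estimation errors and process noise through the state equation reproduces F P F^T + Q:

```python
import numpy as np

rng = np.random.default_rng(2)

# Illustrative values (assumptions)
F = np.array([[1.0, 1.0], [0.0, 1.0]])
Q = np.array([[0.3, 0.0], [0.0, 0.2]])
P_prev = np.array([[2.0, 0.5], [0.5, 1.0]])

# Draw estimation errors e = x_{t-1} - xhat_{t-1|t-1} and process noise w_t,
# propagate them through (Eq. 4 minus Eq. 6), and compare covariances
n = 200_000
e = rng.multivariate_normal([0, 0], P_prev, n)
w = rng.multivariate_normal([0, 0], Q, n)
pred_err = e @ F.T + w                       # x_t - xhat_{t|t-1}

P_pred_mc = np.cov(pred_err.T)               # empirical covariance
P_pred = F @ P_prev @ F.T + Q                # Eq. 7
print(np.allclose(P_pred_mc, P_pred, atol=0.1))  # True
```

The cross terms vanish in expectation because e and w are drawn independently, mirroring the uncorrelatedness assumption in the derivation above.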
15 Kalman Filter Formulations

If w_t in Eq. 4 and v_t in Eq. 5 follow Gaussian distributions, we may write

  p(x_t | x_{t-1}) = N(x_t | F_t x_{t-1} + B_t u_t, Q_t)
  p(z_t | x_t) = N(z_t | H_t x_t, R_t)

The initial state also follows a Gaussian distribution:

  p(x_1) = N(x_1 | B_1 u_1, Q_1)
16 1-Dimensional Example

We use the following one-dimensional tracking problem to derive Eq. 6 - Eq. 10: a train moving along a railway line (see Figure 1). The state vector x_t contains the position and velocity of the train, and the train driver may apply a braking or accelerating input to the system.

[Figure 1: the one-dimensional system under consideration: a train on a railway line, with a noisy measurement and a prediction (estimate) of its position.]
We have the following variables and parameters:
- x_t: the unknown (hidden) distance from the pole to the train's position at time t;
- z_t: noisy measurement of the distance between the pole and the train's RF antenna at time t, using time-of-flight techniques;
- H: a proportional scale that makes the time-of-flight measurement compatible with the distance x_t;
- u_t: the amount of throttle or brake applied by the train's operator at time t;
- B: a proportional scale that converts u_t into distance;
- w_t: process noise on x_t;
- v_t: measurement noise.
17 1-Dimensional Example

At t = 0, we have an initial estimate (with error) of the train's position, represented by a Gaussian PDF.

[Figure 2: the initial knowledge of the system at time t = 0. The red Gaussian distribution represents the PDF providing the initial confidence in the estimate of the position of the train; the arrow pointing to the right represents the known initial velocity of the train.]

At each time step t, we combine two information sources:
- Predictions (\hat{x}_{t-1|t-1}) based on the last known position and velocity of the train;
- Measurements (z_t) from a radio ranging system deployed at the track side.

The predictions and measurements are combined to provide the best possible estimate of the location of the train.
18 1-Dimensional Example

At t = 1, the new predicted position (by Eq. 6 and Eq. 7) is represented by a new Gaussian PDF with a larger variance: our confidence in the position estimate has decreased, because we are not certain whether the train has undergone any accelerations or decelerations in the intervening period from t = 0 to t = 1 (process noise).

[Figure 3: the prediction of the location of the train at time t = 1 and the level of uncertainty in that prediction.]

At t = 1, we also have a measurement z_t of the train position. The measurement error is represented by another Gaussian PDF.

[Figure 4: the measurement of the location of the train at time t = 1 and the level of uncertainty in that noisy measurement, represented by the blue Gaussian PDF.]
19 1-Dimensional Example

The best estimate can be obtained by combining our knowledge from the prediction and the measurement. This is achieved by multiplying the two Gaussian PDFs together, resulting in a new (green) Gaussian PDF.

[Figure 5: the new PDF (green) generated by multiplying the PDFs associated with the prediction and measurement of the train's location at time t = 1. This new PDF provides the best estimate of the location of the train, by fusing the data from the prediction and the measurement.]

Note that the variance of the combined Gaussian is smaller than that of either the prediction or the measurement, meaning that the fused estimate has smaller variation.
20 1-Dimensional Example

To simplify notation, we omit the subscript t and write the PDF of the measurement (blue Gaussian) as

  N(z; \mu_z, \sigma_z) = \frac{1}{\sqrt{2\pi}\sigma_z} \exp\left\{ -\frac{(z - \mu_z)^2}{2\sigma_z^2} \right\}.

Because the predicted position is in metres and the measurement (time of flight of the radio signal) is in seconds, we need to make the two units compatible. This can be done by converting the prediction x to the measurement domain by setting H = 1/c, where c is the speed of light, i.e.,

  r = x/c.

The PDF of the prediction becomes

  N(r; \mu_x/c, \sigma_x/c) = \frac{1}{\sqrt{2\pi}(\sigma_x/c)} \exp\left\{ -\frac{(r - \mu_x/c)^2}{2(\sigma_x/c)^2} \right\}.
21 1-Dimensional Example

Note that both random variables r and z now have the same unit (seconds). Their PDFs can be combined and written in terms of one random variable s:

  N(s; \mu_f, \sigma_f) \propto \frac{1}{\sqrt{2\pi}(\sigma_x/c)} \exp\left\{ -\frac{(s - \mu_x/c)^2}{2(\sigma_x/c)^2} \right\}
                               \cdot \frac{1}{\sqrt{2\pi}\sigma_z} \exp\left\{ -\frac{(s - \mu_z)^2}{2\sigma_z^2} \right\},

where, after converting the fused mean and variance back to distance units,

  \mu_f = \mu_x + \frac{\sigma_x^2/c}{(\sigma_x/c)^2 + \sigma_z^2} \left( \mu_z - \frac{\mu_x}{c} \right),
  \quad
  \sigma_f^2 = \sigma_x^2 - \frac{\sigma_x^2/c}{(\sigma_x/c)^2 + \sigma_z^2} \cdot \frac{\sigma_x^2}{c}.    (11)
22 1-Dimensional Example

Substituting H = 1/c and

  K = \frac{H\sigma_x^2}{H^2\sigma_x^2 + \sigma_z^2}

into Eq. 11, we have

  \mu_f = \mu_x + K(\mu_z - H\mu_x)
  \sigma_f^2 = \sigma_x^2 - KH\sigma_x^2    (12)

Now we set
- \hat{x}_{t|t} = \mu_f (posterior mean)
- \hat{x}_{t|t-1} = \mu_x
- P_{t|t} = \sigma_f^2 (posterior covariance)
- P_{t|t-1} = \sigma_x^2
- z_t = \mu_z
- R_t = \sigma_z^2
- H_t = H
- K_t = K
23 1-Dimensional Example

Substituting these variables into Eq. 12, we have

  \mu_f = \mu_x + K(\mu_z - H\mu_x)  \Rightarrow  \hat{x}_{t|t} = \hat{x}_{t|t-1} + K_t(z_t - H_t\hat{x}_{t|t-1})
  \sigma_f^2 = \sigma_x^2 - KH\sigma_x^2  \Rightarrow  P_{t|t} = P_{t|t-1} - K_t H_t P_{t|t-1}
  K = \frac{H\sigma_x^2}{H^2\sigma_x^2 + \sigma_z^2}  \Rightarrow  K_t = P_{t|t-1} H_t^T (H_t P_{t|t-1} H_t^T + R_t)^{-1}

Note that these are the Kalman filter equations (Eq. 8 - Eq. 10).
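As a numerical sanity check (with made-up values for the prediction, the measurement, and their standard deviations), the measurement-update form of Eq. 12 can be compared against a direct product-of-Gaussians fusion in the time-of-flight domain:

```python
import numpy as np

# Illustrative numbers (assumptions): prediction in metres, measurement in seconds
c = 3.0e8                            # speed of light, so H = 1/c
mu_x, sig_x = 3000.0, 60.0           # predicted position and its std-dev (m)
mu_z, sig_z = 3100.0 / c, 40.0 / c   # time-of-flight measurement and noise (s)

H = 1.0 / c
K = H * sig_x**2 / (H**2 * sig_x**2 + sig_z**2)   # Eq. 10, scalar case

# Measurement-update form (Eq. 12)
mu_f = mu_x + K * (mu_z - H * mu_x)
var_f = sig_x**2 - K * H * sig_x**2

# Direct product of the two Gaussians in seconds, then mapped back to metres
mu_fused_s = mu_x / c + (sig_x / c)**2 / ((sig_x / c)**2 + sig_z**2) * (mu_z - mu_x / c)
var_fused_s = (sig_x / c)**2 * sig_z**2 / ((sig_x / c)**2 + sig_z**2)
print(np.isclose(mu_f, c * mu_fused_s), np.isclose(var_f, c**2 * var_fused_s))  # True True
```

Both routes give the same fused mean and variance, which is exactly the equivalence the slide establishes algebraically.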
24 Kalman Filter's Output and Optimal Linear Estimate

In a Gaussian framework, the Kalman filter's output is the optimal linear estimate:

  \hat{x}_{t|t} = E\{x_t | z_0, z_1, \ldots, z_t\}  (= \mu_{f,t} in the 1-D example)

The covariance of the estimation error, given the measurements up to time t, is

  P_{t|t} = E\{(\hat{x}_{t|t} - x_t)(\hat{x}_{t|t} - x_t)^T | z_0, z_1, \ldots, z_t\}  (= \sigma_{f,t}^2 in the 1-D example)
Least Squares Estimation Namrata Vaswani, namrata@iastate.edu Least Squares Estimation 1 Recall: Geometric Intuition for Least Squares Minimize J(x) = y Hx 2 Solution satisfies: H T H ˆx = H T y, i.e.
More informationIntroduction to Probabilistic Graphical Models: Exercises
Introduction to Probabilistic Graphical Models: Exercises Cédric Archambeau Xerox Research Centre Europe cedric.archambeau@xrce.xerox.com Pascal Bootcamp Marseille, France, July 2010 Exercise 1: basics
More information6.4 Kalman Filter Equations
6.4 Kalman Filter Equations 6.4.1 Recap: Auxiliary variables Recall the definition of the auxiliary random variables x p k) and x m k): Init: x m 0) := x0) S1: x p k) := Ak 1)x m k 1) +uk 1) +vk 1) S2:
More informationRobots Autónomos. Depto. CCIA. 2. Bayesian Estimation and sensor models. Domingo Gallardo
Robots Autónomos 2. Bayesian Estimation and sensor models Domingo Gallardo Depto. CCIA http://www.rvg.ua.es/master/robots References Recursive State Estimation: Thrun, chapter 2 Sensor models and robot
More informationCS 532: 3D Computer Vision 6 th Set of Notes
1 CS 532: 3D Computer Vision 6 th Set of Notes Instructor: Philippos Mordohai Webpage: www.cs.stevens.edu/~mordohai E-mail: Philippos.Mordohai@stevens.edu Office: Lieb 215 Lecture Outline Intro to Covariance
More informationInformatics 2B: Learning and Data Lecture 10 Discriminant functions 2. Minimal misclassifications. Decision Boundaries
Overview Gaussians estimated from training data Guido Sanguinetti Informatics B Learning and Data Lecture 1 9 March 1 Today s lecture Posterior probabilities, decision regions and minimising the probability
More informationCheng Soon Ong & Christian Walder. Canberra February June 2018
Cheng Soon Ong & Christian Walder Research Group and College of Engineering and Computer Science Canberra February June 2018 (Many figures from C. M. Bishop, "Pattern Recognition and ") 1of 89 Part II
More informationLecture 4: Extended Kalman filter and Statistically Linearized Filter
Lecture 4: Extended Kalman filter and Statistically Linearized Filter Department of Biomedical Engineering and Computational Science Aalto University February 17, 2011 Contents 1 Overview of EKF 2 Linear
More informationLecture 6: Bayesian Inference in SDE Models
Lecture 6: Bayesian Inference in SDE Models Bayesian Filtering and Smoothing Point of View Simo Särkkä Aalto University Simo Särkkä (Aalto) Lecture 6: Bayesian Inference in SDEs 1 / 45 Contents 1 SDEs
More information2D Image Processing. Bayes filter implementation: Kalman filter
2D Image Processing Bayes filter implementation: Kalman filter Prof. Didier Stricker Kaiserlautern University http://ags.cs.uni-kl.de/ DFKI Deutsches Forschungszentrum für Künstliche Intelligenz http://av.dfki.de
More informationI forgot to mention last time: in the Ito formula for two standard processes, putting
I forgot to mention last time: in the Ito formula for two standard processes, putting dx t = a t dt + b t db t dy t = α t dt + β t db t, and taking f(x, y = xy, one has f x = y, f y = x, and f xx = f yy
More informationLecture Notes 5 Convergence and Limit Theorems. Convergence with Probability 1. Convergence in Mean Square. Convergence in Probability, WLLN
Lecture Notes 5 Convergence and Limit Theorems Motivation Convergence with Probability Convergence in Mean Square Convergence in Probability, WLLN Convergence in Distribution, CLT EE 278: Convergence and
More informationOpen Economy Macroeconomics: Theory, methods and applications
Open Economy Macroeconomics: Theory, methods and applications Lecture 4: The state space representation and the Kalman Filter Hernán D. Seoane UC3M January, 2016 Today s lecture State space representation
More informationState-space Model. Eduardo Rossi University of Pavia. November Rossi State-space Model Fin. Econometrics / 53
State-space Model Eduardo Rossi University of Pavia November 2014 Rossi State-space Model Fin. Econometrics - 2014 1 / 53 Outline 1 Motivation 2 Introduction 3 The Kalman filter 4 Forecast errors 5 State
More informationModeling and state estimation Examples State estimation Probabilities Bayes filter Particle filter. Modeling. CSC752 Autonomous Robotic Systems
Modeling CSC752 Autonomous Robotic Systems Ubbo Visser Department of Computer Science University of Miami February 21, 2017 Outline 1 Modeling and state estimation 2 Examples 3 State estimation 4 Probabilities
More informationDiscrete Mathematics and Probability Theory Spring 2016 Rao and Walrand Note 26. Estimation: Regression and Least Squares
CS 70 Discrete Mathematics and Probability Theory Spring 2016 Rao and Walrand Note 26 Estimation: Regression and Least Squares This note explains how to use observations to estimate unobserved random variables.
More information2D Image Processing. Bayes filter implementation: Kalman filter
2D Image Processing Bayes filter implementation: Kalman filter Prof. Didier Stricker Dr. Gabriele Bleser Kaiserlautern University http://ags.cs.uni-kl.de/ DFKI Deutsches Forschungszentrum für Künstliche
More informationGaussian Processes. Le Song. Machine Learning II: Advanced Topics CSE 8803ML, Spring 2012
Gaussian Processes Le Song Machine Learning II: Advanced Topics CSE 8803ML, Spring 01 Pictorial view of embedding distribution Transform the entire distribution to expected features Feature space Feature
More informationMultivariate Gaussians. Sargur Srihari
Multivariate Gaussians Sargur srihari@cedar.buffalo.edu 1 Topics 1. Multivariate Gaussian: Basic Parameterization 2. Covariance and Information Form 3. Operations on Gaussians 4. Independencies in Gaussians
More informationCOM336: Neural Computing
COM336: Neural Computing http://www.dcs.shef.ac.uk/ sjr/com336/ Lecture 2: Density Estimation Steve Renals Department of Computer Science University of Sheffield Sheffield S1 4DP UK email: s.renals@dcs.shef.ac.uk
More informationHuman-Oriented Robotics. Temporal Reasoning. Kai Arras Social Robotics Lab, University of Freiburg
Temporal Reasoning Kai Arras, University of Freiburg 1 Temporal Reasoning Contents Introduction Temporal Reasoning Hidden Markov Models Linear Dynamical Systems (LDS) Kalman Filter 2 Temporal Reasoning
More informationStochastic Processes
Elements of Lecture II Hamid R. Rabiee with thanks to Ali Jalali Overview Reading Assignment Chapter 9 of textbook Further Resources MIT Open Course Ware S. Karlin and H. M. Taylor, A First Course in Stochastic
More informationSensor Fusion: Particle Filter
Sensor Fusion: Particle Filter By: Gordana Stojceska stojcesk@in.tum.de Outline Motivation Applications Fundamentals Tracking People Advantages and disadvantages Summary June 05 JASS '05, St.Petersburg,
More informationControlling Uncertainty in Personal Positioning at Minimal Measurement Cost
Controlling Uncertainty in Personal Positioning at Minimal Measurement Cost Hui Fang 1, Wen-Jing Hsu 1, and Larry Rudolph 2 1 Nanyang Technological University Nanyang Avenue, Singapore 639798 fang0025@ntu.edu.sg,
More informationIntroduction to Mobile Robotics Probabilistic Robotics
Introduction to Mobile Robotics Probabilistic Robotics Wolfram Burgard 1 Probabilistic Robotics Key idea: Explicit representation of uncertainty (using the calculus of probability theory) Perception Action
More informationDimension Reduction. David M. Blei. April 23, 2012
Dimension Reduction David M. Blei April 23, 2012 1 Basic idea Goal: Compute a reduced representation of data from p -dimensional to q-dimensional, where q < p. x 1,...,x p z 1,...,z q (1) We want to do
More informationThese slides follow closely the (English) course textbook Pattern Recognition and Machine Learning by Christopher Bishop
Music and Machine Learning (IFT68 Winter 8) Prof. Douglas Eck, Université de Montréal These slides follow closely the (English) course textbook Pattern Recognition and Machine Learning by Christopher Bishop
More informationCOS Lecture 16 Autonomous Robot Navigation
COS 495 - Lecture 16 Autonomous Robot Navigation Instructor: Chris Clark Semester: Fall 011 1 Figures courtesy of Siegwart & Nourbakhsh Control Structure Prior Knowledge Operator Commands Localization
More informationContent.
Content Fundamentals of Bayesian Techniques (E. Sucar) Bayesian Filters (O. Aycard) Definition & interests Implementations Hidden Markov models Discrete Bayesian Filters or Markov localization Kalman filters
More informationKalman Filter. Predict: Update: x k k 1 = F k x k 1 k 1 + B k u k P k k 1 = F k P k 1 k 1 F T k + Q
Kalman Filter Kalman Filter Predict: x k k 1 = F k x k 1 k 1 + B k u k P k k 1 = F k P k 1 k 1 F T k + Q Update: K = P k k 1 Hk T (H k P k k 1 Hk T + R) 1 x k k = x k k 1 + K(z k H k x k k 1 ) P k k =(I
More informationLecture Outline. Target Tracking: Lecture 3 Maneuvering Target Tracking Issues. Maneuver Illustration. Maneuver Illustration. Maneuver Detection
REGLERTEKNIK Lecture Outline AUTOMATIC CONTROL Target Tracking: Lecture 3 Maneuvering Target Tracking Issues Maneuver Detection Emre Özkan emre@isy.liu.se Division of Automatic Control Department of Electrical
More informationFactor Analysis and Kalman Filtering (11/2/04)
CS281A/Stat241A: Statistical Learning Theory Factor Analysis and Kalman Filtering (11/2/04) Lecturer: Michael I. Jordan Scribes: Byung-Gon Chun and Sunghoon Kim 1 Factor Analysis Factor analysis is used
More informationLeast Squares and Kalman Filtering Questions: me,
Least Squares and Kalman Filtering Questions: Email me, namrata@ece.gatech.edu Least Squares and Kalman Filtering 1 Recall: Weighted Least Squares y = Hx + e Minimize Solution: J(x) = (y Hx) T W (y Hx)
More informationBayesian Machine Learning - Lecture 7
Bayesian Machine Learning - Lecture 7 Guido Sanguinetti Institute for Adaptive and Neural Computation School of Informatics University of Edinburgh gsanguin@inf.ed.ac.uk March 4, 2015 Today s lecture 1
More informationGaussian Processes for Sequential Prediction
Gaussian Processes for Sequential Prediction Michael A. Osborne Machine Learning Research Group Department of Engineering Science University of Oxford Gaussian processes are useful for sequential data,
More informationMachine Learning 4771
Machine Learning 4771 Instructor: ony Jebara Kalman Filtering Linear Dynamical Systems and Kalman Filtering Structure from Motion Linear Dynamical Systems Audio: x=pitch y=acoustic waveform Vision: x=object
More information9 Multi-Model State Estimation
Technion Israel Institute of Technology, Department of Electrical Engineering Estimation and Identification in Dynamical Systems (048825) Lecture Notes, Fall 2009, Prof. N. Shimkin 9 Multi-Model State
More informationERROR COVARIANCE ESTIMATION IN OBJECT TRACKING SCENARIOS USING KALMAN FILTER
ERROR COVARIANCE ESTIMATION IN OBJECT TRACKING SCENARIOS USING KALMAN FILTER Mr K V Sriharsha K.V 1, Dr N V Rao 2 1 Assistant Professor, Department of Information Technology, CVR College of Engg (India)
More information11 - The linear model
11-1 The linear model S. Lall, Stanford 2011.02.15.01 11 - The linear model The linear model The joint pdf and covariance Example: uniform pdfs The importance of the prior Linear measurements with Gaussian
More informationROBOTICS 01PEEQW. Basilio Bona DAUIN Politecnico di Torino
ROBOTICS 01PEEQW Basilio Bona DAUIN Politecnico di Torino Probabilistic Fundamentals in Robotics Gaussian Filters Course Outline Basic mathematical framework Probabilistic models of mobile robots Mobile
More informationIntroduction to Probability and Stocastic Processes - Part I
Introduction to Probability and Stocastic Processes - Part I Lecture 2 Henrik Vie Christensen vie@control.auc.dk Department of Control Engineering Institute of Electronic Systems Aalborg University Denmark
More informationThe Kalman filter. Chapter 6
Chapter 6 The Kalman filter In the last chapter, we saw that in Data Assimilation, we ultimately desire knowledge of the full a posteriori p.d.f., that is the conditional p.d.f. of the state given the
More informationMarkov localization uses an explicit, discrete representation for the probability of all position in the state space.
Markov Kalman Filter Localization Markov localization localization starting from any unknown position recovers from ambiguous situation. However, to update the probability of all positions within the whole
More informationBayes Filter Reminder. Kalman Filter Localization. Properties of Gaussians. Gaussians. Prediction. Correction. σ 2. Univariate. 1 2πσ e.
Kalman Filter Localization Bayes Filter Reminder Prediction Correction Gaussians p(x) ~ N(µ,σ 2 ) : Properties of Gaussians Univariate p(x) = 1 1 2πσ e 2 (x µ) 2 σ 2 µ Univariate -σ σ Multivariate µ Multivariate
More informationP 1.5 X 4.5 / X 2 and (iii) The smallest value of n for
DHANALAKSHMI COLLEGE OF ENEINEERING, CHENNAI DEPARTMENT OF ELECTRONICS AND COMMUNICATION ENGINEERING MA645 PROBABILITY AND RANDOM PROCESS UNIT I : RANDOM VARIABLES PART B (6 MARKS). A random variable X
More information