TSRT14: Sensor Fusion
Lecture 6: Kalman filter (KF), KF approximations (EKF, UKF)
Gustaf Hendeby, Spring 2017
Gustaf Hendeby, hendeby@isy.liu.se

Whiteboard:
- Derivation framework for KF, EKF, UKF

Slides:
- Kalman filter summary: main equations, robustness, sensitivity, divergence monitoring, user aspects
- Nonlinear transforms revisited
- Application to the derivation of EKF and UKF
- User guidelines and interpretations

Lecture 5: Summary

- Standard models in global coordinates:
  - Translation: ṗ_t^m = w_t^p
  - 2D orientation (heading): ḣ_t^m = w_t^h
  - Coordinated turn model: Ẋ = v^X, Ẏ = v^Y, v̇^X = -ω v^Y, v̇^Y = ω v^X, ω̇ = 0
- Standard models in local coordinates (x, y, ψ):
  - Odometry and dead reckoning for (x, y, ψ):

      X_t = X_0 + ∫_0^t v_τ^x cos(ψ_τ) dτ
      Y_t = Y_0 + ∫_0^t v_τ^x sin(ψ_τ) dτ
      ψ_t = ψ_0 + ∫_0^t ψ̇_τ dτ

  - Force models for (ψ, a_y, a_x)
- 3D orientation: q̇ = 1/2 S(ω) q = 1/2 S̄(q) ω
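The dead-reckoning integrals above are easy to discretize. The following Python/NumPy sketch (our illustration, not from the lecture; the function name and the Euler discretization are our choices) integrates the odometry model from sampled speed and yaw-rate sequences:

```python
import numpy as np

def dead_reckon(X0, Y0, psi0, v, psidot, T):
    """Euler-integrate the dead-reckoning model
    X_t = X_0 + int v cos(psi) dt, Y_t = Y_0 + int v sin(psi) dt,
    psi_t = psi_0 + int psidot dt, with sample period T."""
    X, Y, psi = X0, Y0, psi0
    for vk, wk in zip(v, psidot):
        X += T * vk * np.cos(psi)
        Y += T * vk * np.sin(psi)
        psi += T * wk
    return X, Y, psi

# Straight-line check: constant speed 1 m/s, no turning, 100 steps of 0.1 s
X, Y, psi = dead_reckon(0.0, 0.0, 0.0, [1.0] * 100, [0.0] * 100, 0.1)
```

A finer sample period (or a higher-order integration rule) reduces the discretization error when the heading changes quickly.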
Chapter 7 Overview

Kalman filter:
- Algorithms and derivation
- Practical issues
- Computational aspects
- Filter monitoring

The discussion and conclusions usually apply to all nonlinear filters, though they are more concrete in the linear Gaussian case.

Kalman Filter (KF)

Time-varying state-space model:

  x_{k+1} = F_k x_k + G_k v_k,   cov(v_k) = Q_k
  y_k = H_k x_k + e_k,           cov(e_k) = R_k

Time update:

  x̂_{k+1|k} = F_k x̂_{k|k}
  P_{k+1|k} = F_k P_{k|k} F_k^T + G_k Q_k G_k^T

Measurement update:

  x̂_{k|k} = x̂_{k|k-1} + P_{k|k-1} H_k^T (H_k P_{k|k-1} H_k^T + R_k)^{-1} (y_k - H_k x̂_{k|k-1})
  P_{k|k} = P_{k|k-1} - P_{k|k-1} H_k^T (H_k P_{k|k-1} H_k^T + R_k)^{-1} H_k P_{k|k-1}

KF Modifications

Auxiliary quantities: the innovation ε_k, the innovation covariance S_k, and the Kalman gain K_k:

  ŷ_k = H_k x̂_{k|k-1}
  ε_k = y_k - H_k x̂_{k|k-1} = y_k - ŷ_k
  S_k = H_k P_{k|k-1} H_k^T + R_k
  K_k = P_{k|k-1} H_k^T (H_k P_{k|k-1} H_k^T + R_k)^{-1} = P_{k|k-1} H_k^T S_k^{-1}

Filter form:

  x̂_{k|k} = F_{k-1} x̂_{k-1|k-1} + K_k (y_k - H_k F_{k-1} x̂_{k-1|k-1})
           = (F_{k-1} - K_k H_k F_{k-1}) x̂_{k-1|k-1} + K_k y_k

Predictor form:

  x̂_{k+1|k} = F_k x̂_{k|k-1} + F_k K_k (y_k - H_k x̂_{k|k-1})
             = (F_k - F_k K_k H_k) x̂_{k|k-1} + F_k K_k y_k

Simulation Example (1/2)

Create a constant velocity model, simulate it, and Kalman filter:

  T = 0.5;
  F = [1 0 T 0; 0 1 0 T; 0 0 1 0; 0 0 0 1];
  G = [T^2/2 0; 0 T^2/2; T 0; 0 T];
  H = [1 0 0 0; 0 1 0 0];
  R = 0.3*eye(2);
  m = lss(F, [], H, [], G*G', R, 1/T);
  m.xlabel = {'X', 'Y', 'vX', 'vY'};
  m.ylabel = {'X', 'Y'};
  m.name = 'Constant velocity motion model';
  z = simulate(m, 20);
  xhat1 = kalman(m, z, 'alg', 2, 'k', 1);  % Time-varying
  xplot2(z, xhat1, 'conf', 90, [1 2]);

(Figure: simulated trajectory and estimate with confidence ellipses in the X-Y plane.)
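The KF recursions above map directly to code. Here is a minimal NumPy sketch (the function names and the illustrative numbers are ours; the course's own tool is the MATLAB Signal and Systems Lab used in the example), with the innovation, innovation covariance, and gain computed as in the modification slide:

```python
import numpy as np

def kf_time_update(x, P, F, G, Q):
    """KF time update: returns xhat_{k+1|k}, P_{k+1|k}."""
    return F @ x, F @ P @ F.T + G @ Q @ G.T

def kf_meas_update(x, P, y, H, R):
    """KF measurement update via innovation, innovation covariance, and gain."""
    eps = y - H @ x                     # innovation eps_k
    S = H @ P @ H.T + R                 # innovation covariance S_k
    K = P @ H.T @ np.linalg.inv(S)      # Kalman gain K_k = P H^T S^-1
    x = x + K @ eps
    P = P - K @ S @ K.T                 # equals P - P H^T S^-1 H P
    return x, P, eps, S

# Constant-velocity example in the spirit of the slide (values illustrative)
T = 0.5
F = np.block([[np.eye(2), T * np.eye(2)],
              [np.zeros((2, 2)), np.eye(2)]])
G = np.vstack([T**2 / 2 * np.eye(2), T * np.eye(2)])
H = np.hstack([np.eye(2), np.zeros((2, 2))])
Q, R = np.eye(2), 0.3 * np.eye(2)

x, P = np.zeros(4), 10.0 * np.eye(4)
x, P = kf_time_update(x, P, F, G, Q)
x, P, eps, S = kf_meas_update(x, P, np.array([1.0, 2.0]), H, R)
```

Writing the covariance update as P - K S K^T keeps it symmetric whenever P is; the equivalent textbook form P - P H^T S^{-1} H P is used in the slide.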
Simulation Example (2/2)

Covariance is illustrated as confidence ellipsoids in 2D plots or as confidence bands in 1D plots:

  xplot(z, xhat1, 'conf', 99);

(Figure: 99% confidence bands for each component over time.)

Tuning the KF

- The SNR ratio ||Q||/||R|| is the most crucial: it sets the filter speed. Note the difference between the real system and the model used in the KF.
- Recommendation: fix R according to the sensor specification/performance, and tune Q (motion models are anyway subjective approximations of reality).
- High SNR in the model gives a fast filter that is quick to adapt to changes/maneuvers, but with larger uncertainty (small bias, large variance).
- Conversely, low SNR in the model gives a slow filter that is slow to adapt to changes/maneuvers, but with small uncertainty (large bias, small variance).
- P_0 reflects the belief in the prior x_1 ~ N(x̂_1, P_0). It is possible to choose P_0 very large and x̂_1 arbitrary if no prior information exists.
- Tune the covariances in large steps (orders of magnitude!).

Optimality Properties

- For a linear model, the KF provides the WLS solution.
- The KF is the best linear unbiased estimator (BLUE).
- It is the Bayes optimal filter for a linear model when x_1, v_k, e_k are Gaussian variables:

    x_{k+1} | y_{1:k} ~ N(x̂_{k+1|k}, P_{k+1|k})
    x_k | y_{1:k} ~ N(x̂_{k|k}, P_{k|k})
    ε_k ~ N(0, S_k)

Robustness and Sensitivity

The following concepts are relevant for all filtering applications, but they are most explicit for the KF:
- Observability is revealed indirectly by P_{k|k}; monitor its rank or, better, its condition number.
- Divergence tests: monitor performance measures and restart the filter after divergence.
- Outlier rejection: monitor the sensor observations.
- Bias error: an incorrect model gives biased estimates.
- Sensitivity analysis: an uncertain model contributes to the total covariance.
- Numerical issues may give complex estimates.
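The SNR trade-off in the tuning advice can be illustrated on a scalar random-walk model (a toy model of our choosing, not from the slides): iterating the Riccati recursion to steady state shows that the gain, and hence the filter speed, is set by Q/R:

```python
def steady_state_gain(Q, R, iters=1000):
    """Scalar random-walk KF (x_{k+1} = x_k + v_k, y_k = x_k + e_k):
    iterate the Riccati recursion to its steady state and return the gain."""
    P = 1.0
    for _ in range(iters):
        Pp = P + Q              # time update of the variance
        K = Pp / (Pp + R)       # measurement update gain
        P = (1.0 - K) * Pp
    return K

K_fast = steady_state_gain(10.0, 1.0)   # high Q/R: fast filter, larger gain
K_slow = steady_state_gain(0.01, 1.0)   # low Q/R: slow filter, smaller gain
```

A gain near 1 means the filter trusts each new measurement (fast, noisy), while a small gain means it averages over many samples (slow, smooth), matching the bias/variance trade-off on the slide.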
Observability

1. Snapshot observability: if H_k has full rank, WLS can be applied to estimate x.
2. Classical observability for the time-invariant and time-varying cases:

     O = [H; HF; HF^2; ...; HF^{n-1}]
     O_k = [H_{k-n+1}; H_{k-n+2} F_{k-n+1}; H_{k-n+3} F_{k-n+2} F_{k-n+1}; ...; H_k F_{k-1} ... F_{k-n+1}]

3. The covariance matrix P_{k|k} extends the observability condition by weighting with the measurement noise and by forgetting old information according to the process noise. Thus, the condition number of P_{k|k} is the natural indicator of observability!

Divergence Tests

When is ε_k ε_k^T significantly larger than its computed expected value S_k = E(ε_k ε_k^T)? (Note that ε_k ~ N(0, S_k).) Principal reasons:
- Model errors.
- Sensor model errors: offsets, drifts, incorrect covariances, a scaling factor in all covariances.
- Sensor errors: outliers, missing data.
- Numerical issues.
Solutions:
- In the first two cases, the filter has to be redesigned.
- In the last two cases, the filter has to be restarted.

Outlier Rejection

Let H_0: ε_k ~ N(0, S_k). Then

  T(y_k) = ε_k^T S_k^{-1} ε_k ~ χ²(n_y)

if everything works fine and there is no outlier. If T(y_k) > h_{P_FA}, this is an indication of an outlier, and the measurement update can be omitted. In the case of several sensors, each sensor i should be monitored for outliers:

  T(y_k^i) = (ε_k^i)^T (S_k^i)^{-1} ε_k^i ~ χ²(n_{y^i})

Sensitivity Analysis: Parameter Uncertainty

Sensitivity analysis can be done with respect to uncertain parameters with a known covariance matrix, using for instance Gauss' approximation formula:
- Assume F(θ), G(θ), H(θ), Q(θ), R(θ) have uncertain parameters θ, with E(θ) = θ̂ and cov(θ) = P_θ.
- The state estimate x̂_k is a stochastic variable with four stochastic sources: v_k, e_k, x_1 on the one hand, and θ on the other.
- The law of total variance, var(X) = E var(X|Y) + var E(X|Y), and Gauss' approximation formula, var(h(Y)) ≈ h'(Ȳ) var(Y) (h'(Ȳ))^T, give

    cov(x̂_{k|k}) ≈ P_{k|k} + (dx̂_{k|k}/dθ) P_θ (dx̂_{k|k}/dθ)^T

- The gradient dx̂_{k|k}/dθ can be computed numerically by simulations.
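The outlier test is a one-liner in code. A sketch for two-dimensional measurements, where the χ²(2) quantile has the closed form h = -2 ln(P_FA) (the function name is ours; for other dimensions a χ² quantile table or library is needed):

```python
import numpy as np

def outlier_test(eps, S, p_fa=0.01):
    """Chi-square gate: T(y) = eps^T S^-1 eps is chi2(n_y) under H0 (no outlier).
    Returns the test statistic and whether the measurement should be rejected."""
    T = float(eps @ np.linalg.solve(S, eps))
    # For n_y = 2 the chi2 quantile has the closed form h = -2 ln(p_fa)
    assert eps.size == 2, "closed-form threshold is exact only for n_y = 2"
    h = -2.0 * np.log(p_fa)
    return T, T > h
```

With P_FA = 0.01 the threshold is about 9.21, so a normalized innovation of that size or larger triggers the gate and the measurement update can be skipped.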
Numerical Issues

Some simple fixes if problems occur:
- Ensure that the covariance matrix is symmetric: P = 0.5*P + 0.5*P'
- Use the more numerically stable Joseph form for the measurement update of the covariance matrix:

    P_{k|k} = (I - K_k H_k) P_{k|k-1} (I - K_k H_k)^T + K_k R_k K_k^T

- Ensure that the covariance matrix is positive definite by setting negative eigenvalues in P to zero or to small positive values.
- Avoid a singular R = 0, even for constraints.
- Dithering: increase Q and R if needed; this can account for all kinds of model errors.

Kalman Filter Approximations (EKF, UKF)

Chapter 8 Overview

- Nonlinear transformations
- Details of the EKF algorithms
- Numerical methods to compute the Jacobian and Hessian in the Taylor expansion
- An alternative EKF version without the Riccati equation
- The unscented Kalman filter (UKF)

EKF1 and EKF2 Principle

Apply TT1 and TT2, respectively, to the dynamic and observation models. For instance,

  x_{k+1} = f(x_k) + v_k = f(x̂) + f'(x̂)(x_k - x̂) + 1/2 (x_k - x̂)^T f''(ξ)(x_k - x̂) + v_k

- EKF1 neglects the rest term.
- EKF2 compensates with the mean and covariance of the rest term, using ξ = x̂.
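The symmetrization, Joseph-form, and eigenvalue-clipping fixes can be sketched directly (NumPy; function names ours). The Joseph form is symmetric and positive semidefinite by construction, for any gain K:

```python
import numpy as np

def joseph_update(P, K, H, R):
    """Joseph-form measurement update of the covariance:
    P = (I - K H) P (I - K H)^T + K R K^T (symmetric/PSD by construction)."""
    I = np.eye(P.shape[0])
    A = I - K @ H
    return A @ P @ A.T + K @ R @ K.T

def enforce_valid_cov(P, floor=1e-9):
    """Symmetrize P and clip negative eigenvalues to a small positive floor,
    as suggested on the slide."""
    P = 0.5 * (P + P.T)
    w, V = np.linalg.eigh(P)
    return (V * np.maximum(w, floor)) @ V.T
```

For the optimal gain K = P H^T S^{-1}, the Joseph form agrees with the short form P - K H P, but it degrades gracefully when K is suboptimal or rounded.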
EKF1 and EKF2 Algorithm

Measurement update:

  ε_k = y_k - h(x̂_{k|k-1}) - 1/2 [tr(h''_{i,x}(x̂_{k|k-1}) P_{k|k-1})]_i
  S_k = h'_x(x̂_{k|k-1}) P_{k|k-1} (h'_x(x̂_{k|k-1}))^T + h'_e(x̂_{k|k-1}) R_k (h'_e(x̂_{k|k-1}))^T
        + 1/2 [tr(h''_{i,x}(x̂_{k|k-1}) P_{k|k-1} h''_{j,x}(x̂_{k|k-1}) P_{k|k-1})]_{ij}
  K_k = P_{k|k-1} (h'_x(x̂_{k|k-1}))^T S_k^{-1}
  x̂_{k|k} = x̂_{k|k-1} + K_k ε_k
  P_{k|k} = P_{k|k-1} - P_{k|k-1} (h'_x(x̂_{k|k-1}))^T S_k^{-1} h'_x(x̂_{k|k-1}) P_{k|k-1}

Time update:

  x̂_{k+1|k} = f(x̂_{k|k}) + 1/2 [tr(f''_{i,x}(x̂_{k|k}) P_{k|k})]_i
  P_{k+1|k} = f'_x(x̂_{k|k}) P_{k|k} (f'_x(x̂_{k|k}))^T + f'_v(x̂_{k|k}) Q_k (f'_v(x̂_{k|k}))^T
        + 1/2 [tr(f''_{i,x}(x̂_{k|k}) P_{k|k} f''_{j,x}(x̂_{k|k}) P_{k|k})]_{ij}

Comments

NB! This form of the EKF2, as given in the book, disregards second-order terms of the process noise. See e.g. my thesis for the full expressions.

- The EKF1, using the TT1 transformation, is obtained by letting both Hessians f''_x and h''_x be zero.
- Analytic Jacobians and Hessians are needed. If they are not available, use numerical approximations (done by default in the Signal and Systems Lab!).
- The complexity of the EKF1 is, as for the KF, n_x^3, due to the F P F^T operation.
- The complexity of the EKF2 is n_x^5, due to the f''_i P (f''_j)^T operations for i, j = 1, ..., n_x.
- Dithering is good! That is, increase Q and R from the simulated values to account for the approximation errors.

EKF Variants

- The standard EKF linearizes around the current state estimate.
- The linearized Kalman filter linearizes around some reference trajectory.
- The error state Kalman filter, also known as the complementary Kalman filter, estimates the state error x̃_k = x_k - x̂_k with respect to some approximate or reference trajectory. Feedforward and feedback configurations:
    linearized Kalman filter = feedforward error state Kalman filter
    EKF = feedback error state Kalman filter

Derivative-Free Algorithms

Numeric derivatives are preferred in the following cases:
- The nonlinear function is too complex.
- The derivatives are too complex functions.
- A user-friendly algorithm with as few user inputs as possible is desired.
This can be achieved either with numerical approximation or using sigma points!
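When analytic derivatives are unavailable, a central-difference Jacobian is the standard numerical approximation. A sketch (ours; the step size h trades truncation against rounding error):

```python
import numpy as np

def num_jacobian(f, x, h=1e-6):
    """Central-difference Jacobian of f at x, for use when analytic
    derivatives are impractical (as in the derivative-free EKF variants)."""
    x = np.asarray(x, dtype=float)
    fx = np.atleast_1d(f(x))
    J = np.zeros((fx.size, x.size))
    for i in range(x.size):
        e = np.zeros_like(x)
        e[i] = h
        J[:, i] = (np.atleast_1d(f(x + e)) - np.atleast_1d(f(x - e))) / (2 * h)
    return J
```

For the polar-to-Cartesian map g(x) = (x_1 cos x_2, x_1 sin x_2) this reproduces the analytic Jacobian to numerical precision.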
Nonlinear Transformations (NLT)

Consider a second-order Taylor expansion of a function z = g(x):

  z = g(x) = g(x̂) + g'(x̂)(x - x̂) + 1/2 (x - x̂)^T g''(ξ)(x - x̂)

where the last term is the rest term r(x; x̂, g''(ξ)). The rest term is negligible, and the EKF works fine, if:
- the model is almost linear, or
- the SNR is high, so that x - x̂ can be considered small.
The size of the rest term can be approximated a priori. Note: the size may depend on the choice of state coordinates! If the rest term is large, use either of:
- the second-order compensated EKF, which compensates for the mean and covariance of r(x; x̂, g''(ξ)) ≈ r(x; x̂, g''(x̂)), or
- the unscented KF (UKF).

TT1: First-Order Taylor Approximation

The first-order Taylor term gives a contribution to the covariance:

  x ~ N(x̂, P)  =>  z ~ N(g(x̂), [g'_i(x̂) P (g'_j(x̂))^T]_{ij}) = N(g(x̂), g'(x̂) P (g'(x̂))^T)

- This is sometimes called Gauss' approximation formula.
- Here, [A]_{ij} denotes element (i, j) of the matrix A.
- This is used in the EKF1 (EKF with first-order Taylor expansion). It leads to a KF where the nonlinear functions are approximated by their Jacobians.
- Compare with the linear transformation rule: z = Gx, x ~ N(x̂, P)  =>  z ~ N(Gx̂, G P G^T).
- Note that G P G^T can be written [G_i P G_j^T]_{ij}, where G_i denotes row i of G.

TT2: Second-Order Taylor Approximation

The second-order Taylor term contributes to both the mean and the covariance:

  x ~ N(x̂, P)  =>  z ~ N(g(x̂) + 1/2 [tr(g''_i(x̂) P)]_i, [g'_i(x̂) P (g'_j(x̂))^T + 1/2 tr(P g''_i(x̂) P g''_j(x̂))]_{ij})

- This is used in the EKF2 (EKF with second-order Taylor expansion). It leads to a KF where the nonlinear functions are approximated by their Jacobians and Hessians.
- The UKF tries to do this approximation numerically, without forming the Hessian g'' explicitly. This reduces the n_x^5 complexity of [tr(P g''_i(x̂) P g''_j(x̂))]_{ij} to n_x^3.

MC: Monte Carlo Sampling

Generate N samples, transform them, and fit a Gaussian distribution:

  x^i ~ N(x̂, P)
  z^i = g(x^i)
  μ_z = 1/N Σ_{i=1}^N z^i
  P_z = 1/(N-1) Σ_{i=1}^N (z^i - μ_z)(z^i - μ_z)^T

Not commonly used in nonlinear filtering, but a valid and solid approach!
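The MC transform is a few lines of NumPy. This sketch (ours) checks it on the squared-norm example used later in the lecture: z = x^T x with x ~ N(0, I_n) is χ²(n), so the fitted Gaussian should have mean close to n and variance close to 2n:

```python
import numpy as np

def mc_transform(g, xhat, P, N=100_000, seed=0):
    """Monte Carlo NLT: sample x ~ N(xhat, P), transform through g,
    and fit a Gaussian to the transformed samples."""
    rng = np.random.default_rng(seed)
    X = rng.multivariate_normal(xhat, P, size=N)
    Z = np.apply_along_axis(g, 1, X).reshape(N, -1)
    mu = Z.mean(axis=0)
    Pz = np.cov(Z, rowvar=False).reshape(mu.size, mu.size)
    return mu, Pz

# Squared norm of a Gaussian vector: z = x^T x with x ~ N(0, I_3) is chi2(3),
# so the fitted moments should be close to mean 3 and variance 6
mu, Pz = mc_transform(lambda x: np.array([x @ x]), np.zeros(3), np.eye(3))
```

The estimate converges at the usual 1/sqrt(N) Monte Carlo rate, which is why the approach is solid but rarely the cheapest option inside a filter loop.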
UT: The Unscented Transform

At first sight similar to MC: generate 2n_x + 1 sigma points, transform these, and fit a Gaussian distribution:

  x^0 = x̂
  x^{±i} = x̂ ± sqrt(n_x + λ) P^{1/2}_{:,i},   i = 1, ..., n_x
  z^i = g(x^i)
  E(z) ≈ λ/(n_x + λ) z^0 + Σ_{i=±1}^{±n_x} 1/(2(n_x + λ)) z^i
  cov(z) ≈ (λ/(n_x + λ) + 1 - α² + β)(z^0 - E(z))(z^0 - E(z))^T + Σ_{i=±1}^{±n_x} 1/(2(n_x + λ))(z^i - E(z))(z^i - E(z))^T

UT: Design Parameters

- λ is defined by λ = α²(n_x + κ) - n_x.
- α controls the spread of the sigma points; a value around 10^{-3} is suggested.
- β compensates for the distribution; β = 2 should be chosen for Gaussian distributions.
- κ is usually chosen as zero.

Note:
- n_x + λ = α² n_x when κ = 0.
- The weights sum to one for the mean, but sum to 2 - α² + β ≈ 4 for the covariance. Note also that the weights are not confined to [0, 1].
- The mean has a large negative weight!
- If n_x + λ → 0, then the UT and TT2 (and hence the UKF and EKF2) are identical for n_x = 1, and otherwise closely related!

Example 1: Squared Norm

The squared norm of a Gaussian vector has a known distribution:

  z = g(x) = x^T x,  x ~ N(0, I_n)  =>  z ~ χ²(n)

The theoretical distribution is χ²(n), with mean n and variance 2n. The mean and variance are summarized below as a Gaussian distribution; the MCT column is estimated by Monte Carlo simulation.

  n       TT1      TT2       UT1          UT2       MCT
  1       N(0,0)   N(1,2)    N(1,2)       N(1,2)    ≈N(1,2)
  2       N(0,0)   N(2,4)    N(2,2)       N(2,8)    ≈N(2,4)
  3       N(0,0)   N(3,6)    N(3,0)       N(3,18)   ≈N(3,6)
  4       N(0,0)   N(4,8)    N(4,-4)      N(4,32)   ≈N(4,8)
  5       N(0,0)   N(5,10)   N(5,-10)     N(5,50)   ≈N(5,10)
  Theory  N(0,0)   N(n,2n)   N(n,(3-n)n)  N(n,2n²)  N(n,2n)

Conclusion: TT2 works; the unscented transforms do not.

Example 2: Radar

Conversion of polar measurements to Cartesian position:

  z = g(x) = (x_1 cos(x_2); x_1 sin(x_2))

(Figure: transformed distributions for TT1, TT2, UT1, UT2, and MCT.)

Conclusion: UT works better than TT1 and TT2.
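The sigma-point recipe can be verified numerically. This NumPy sketch (ours) implements the UT with the stated weights and, for the squared norm, reproduces the UT2 column of the table, i.e. mean n and variance 2n²:

```python
import numpy as np

def unscented_transform(g, xhat, P, alpha=1e-3, beta=2.0, kappa=0.0):
    """UT: generate 2n+1 sigma points, transform them through g, and fit
    a Gaussian using the mean/covariance weights given on the slide."""
    n = xhat.size
    lam = alpha**2 * (n + kappa) - n
    L = np.linalg.cholesky((n + lam) * P)   # columns are the sigma offsets
    sigmas = [xhat] + [xhat + L[:, i] for i in range(n)] \
                    + [xhat - L[:, i] for i in range(n)]
    Z = [np.atleast_1d(g(s)) for s in sigmas]
    wm = np.full(2 * n + 1, 1.0 / (2 * (n + lam)))
    wc = wm.copy()
    wm[0] = lam / (n + lam)                          # large negative mean weight
    wc[0] = lam / (n + lam) + 1.0 - alpha**2 + beta  # covariance compensation
    mu = sum(w * z for w, z in zip(wm, Z))
    Pz = sum(w * np.outer(z - mu, z - mu) for w, z in zip(wc, Z))
    return mu, Pz

# Squared norm with x ~ N(0, I_3): mean n = 3 and variance 2 n^2 = 18
mu, Pz = unscented_transform(lambda x: np.array([x @ x]), np.zeros(3), np.eye(3))
```

Note the intermediate terms in the covariance sum are huge and cancel (the weights are far outside [0, 1]), which is exactly the numerical behavior the design-parameter slide warns about.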
Example 3: Standard Sensor Network Measurements

Standard measurements:

  TOA (2D): g_toa(x) = ||x|| = sqrt(Σ_i x_i²)
  DOA: g_doa(x) = arctan2(x_2, x_1)
  RSS: g_rss(x) = c_0 + c_2 · 10 log_10(||x||²)

For a prior X ~ N([3; 0], I), TT1 gives N(3, 1) for the TOA measurement and N(0, 0.111) for the DOA measurement; TT2, UT2, and MCT add second-order corrections.

Conclusion: UT works slightly better than TT1 and TT2. Studying RSS measurements gives similar results.

KF, EKF, and UKF in One Framework

Lemma 7.1: If

  (X; Y) ~ N((μ_x; μ_y), (P_xx, P_xy; P_yx, P_yy))

then the conditional distribution for X, given the observed Y = y, is Gaussian:

  X | Y = y ~ N(μ_x + P_xy P_yy^{-1} (y - μ_y), P_xx - P_xy P_yy^{-1} P_yx)

Connection to the Kalman filter: the Kalman gain is in this notation given by K_k = P_xy P_yy^{-1}.

Kalman Filter Algorithm (1/2)

Time update: let

  x̄ = (x_k; v_k) ~ N((x̂_{k|k}; 0), diag(P_{k|k}, Q_k))
  z = x_{k+1} = f(x_k, u_k, v_k) = g(x̄)

The transformation approximation (UT, MC, TT1, TT2) gives z ~ N(x̂_{k+1|k}, P_{k+1|k}).

Kalman Filter Algorithm (2/2)

Measurement update: let

  x̄ = (x_k; e_k) ~ N((x̂_{k|k-1}; 0), diag(P_{k|k-1}, R_k))
  z = (x_k; y_k) = (x_k; h(x_k, u_k, e_k)) = g(x̄)

The transformation approximation (UT, MC, TT1, TT2) gives

  z ~ N((x̂_{k|k-1}; ŷ_{k|k-1}), (P^xx_{k|k-1}, P^xy_{k|k-1}; P^yx_{k|k-1}, P^yy_{k|k-1}))

The measurement update is now

  K_k = P^xy_{k|k-1} (P^yy_{k|k-1})^{-1}
  x̂_{k|k} = x̂_{k|k-1} + K_k (y_k - ŷ_{k|k-1})
  P_{k|k} = P^xx_{k|k-1} - K_k P^yy_{k|k-1} K_k^T
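Lemma 7.1 is directly executable. A NumPy sketch (ours; the numeric values are illustrative) that conditions a joint Gaussian on an observation and, for a linear measurement y = H x + e, reproduces the standard KF measurement update:

```python
import numpy as np

def conditional_gaussian(mu_x, mu_y, Pxx, Pxy, Pyy, y):
    """Condition a jointly Gaussian (X, Y) on Y = y (Lemma 7.1).
    The gain K = Pxy Pyy^-1 is the Kalman gain in this notation."""
    K = Pxy @ np.linalg.inv(Pyy)
    return mu_x + K @ (y - mu_y), Pxx - K @ Pyy @ K.T

# For a linear measurement y = H x + e this reproduces the KF update
H = np.array([[1.0, 0.0]])
P = np.array([[2.0, 0.3],
              [0.3, 1.0]])
R = np.array([[0.5]])
x = np.array([1.0, -1.0])
y = np.array([1.4])
mu, Pc = conditional_gaussian(x, H @ x, P, P @ H.T, H @ P @ H.T + R, y)
```

The same function backs the KF, EKF, and UKF: only the way the joint moments (means, P^xx, P^xy, P^yy) are computed differs between the transforms.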
Comments

- The filter obtained using TT1 is equivalent to the standard EKF (EKF1).
- The filter obtained using TT2 is equivalent to the EKF2.
- The filter obtained using the UT is equivalent to the UKF.
- The Monte Carlo approach should be the most accurate, since it asymptotically computes the correct first- and second-order moments.
- There is freedom to mix transform approximations in the time and measurement updates.

Choice of Nonlinear Filter

The choice depends mainly on: (i) the SNR, (ii) the degree of nonlinearity, and (iii) the degree of non-Gaussian noise, in particular whether any distribution is multi-modal (has several local maxima). The SNR and the degree of nonlinearity are connected through the rest term, whose expected value is

  E((x - x̂)^T g''(ξ)(x - x̂)) = E tr(g''(ξ)(x - x̂)(x - x̂)^T) = tr(g''(ξ) P)

Guidelines:
- A small rest term requires either high SNR (small P) or almost linear functions (small f'' and h'').
- If the rest term is small, use the EKF1.
- If the rest term is large and the nonlinearities are essentially quadratic (example: x^T x), use the EKF2.
- If the rest term is large and the nonlinearities are not essentially quadratic, try the UKF.
- If the functions are severely nonlinear, or any distribution is multi-modal, consider filter banks or the particle filter.

Virtual Yaw Rate Sensor

- The yaw rate is subject to bias; the orientation error increases linearly in time.
- The wheel speeds also give a gyro, where the orientation error instead grows linearly in the driven distance.
- Model with state vector x_k = (ψ_k, ψ̇_k, b_k)^T and the measurements

    y_k^1 = ψ̇_k + b_k + e_k^1

  from the gyro, and y_k^2, a yaw rate computed from the rear wheel angular speeds ω_3, ω_4 (with nominal wheel radius r_nom, relative wheel radii r_k^3, r_k^4, and track width B), with noise e_k^2.

Sounding Rocket

- A navigation-grade IMU gives accurate dead reckoning, but drift may cause a return at bad places.
- GPS is restricted for high speeds and high accelerations.
- Fusion of IMU and GPS when pseudo-ranges are available, with IMU support to the tracking loops inside the GPS:
    Loose integration (direct fusion approach): y_k = p_k + e_k
    Tight integration (TDOA fusion approach): y_k^i = ||p_k - p_k^i||/c + t_k + e_k

(Figure: position error in x, y, z relative to the reference trajectory over time.)

MC Leaning Angle

- Headlight steering, ABS, and anti-spin systems require the leaning angle.
- A gyro is very expensive for this application.
- A combination of accelerometers was investigated; lateral and downward accelerometers worked fine in an EKF.

Model, where z_y, z_z, a_1, a_2, and J are constants relating to the geometry and the inertias of the motorcycle, u = v_x, and the state vector is x = (φ, φ̇, φ̈, ψ̇, ψ̈, δ_ay, δ_az, δ_φ̇)^T:

  y = h(x) = (a_y; a_z; φ̇-channel), with
    a_y = u x_4 + z_y x_3 + z_y x_4² tan(x_1) + g sin(x_1) + x_6
    a_z = u x_4 tan(x_1) - z_z x_4² tan²(x_1) + g cos(x_1) + x_7
    φ̇-channel = (a_1 x_3 + a_2 x_4² tan(x_1) - u x_4)/J + x_8

Summary Lecture 6

The key tool for a unified derivation of the KF, EKF, and UKF: if

  (X; Y) ~ N((μ_x; μ_y), (P_xx, P_xy; P_yx, P_yy))

then

  X | Y = y ~ N(μ_x + P_xy P_yy^{-1}(y - μ_y), P_xx - P_xy P_yy^{-1} P_yx)

The Kalman gain is in this notation given by K_k = P_xy P_yy^{-1}.

- In the KF, P_xy and P_yy follow from a linear Gaussian model.
- In the EKF, P_xy and P_yy can be computed from a linearized model (requires analytic gradients).
- In the EKF and the UKF, P_xy and P_yy can be computed by an NLT applied to (x^T, v^T)^T and (x^T, e^T)^T, respectively; with the UT, no gradients are required, just function evaluations.
More informationFUNDAMENTAL FILTERING LIMITATIONS IN LINEAR NON-GAUSSIAN SYSTEMS
FUNDAMENTAL FILTERING LIMITATIONS IN LINEAR NON-GAUSSIAN SYSTEMS Gustaf Hendeby Fredrik Gustafsson Division of Automatic Control Department of Electrical Engineering, Linköpings universitet, SE-58 83 Linköping,
More informationDraft 01PC-73 Sensor fusion for accurate computation of yaw rate and absolute velocity
Draft PC-73 Sensor fusion for accurate computation of yaw rate and absolute velocity Fredrik Gustafsson Department of Electrical Engineering, Linköping University, Sweden Stefan Ahlqvist, Urban Forssell,
More informationAutomatic Control II Computer exercise 3. LQG Design
Uppsala University Information Technology Systems and Control HN,FS,KN 2000-10 Last revised by HR August 16, 2017 Automatic Control II Computer exercise 3 LQG Design Preparations: Read Chapters 5 and 9
More informationSTATISTICAL ORBIT DETERMINATION
STATISTICAL ORBIT DETERMINATION Satellite Tracking Example of SNC and DMC ASEN 5070 LECTURE 6 4.08.011 1 We will develop a simple state noise compensation (SNC) algorithm. This algorithm adds process noise
More informationA Study of Covariances within Basic and Extended Kalman Filters
A Study of Covariances within Basic and Extended Kalman Filters David Wheeler Kyle Ingersoll December 2, 2013 Abstract This paper explores the role of covariance in the context of Kalman filters. The underlying
More informationNonlinear State Estimation! Extended Kalman Filters!
Nonlinear State Estimation! Extended Kalman Filters! Robert Stengel! Optimal Control and Estimation, MAE 546! Princeton University, 2017!! Deformation of the probability distribution!! Neighboring-optimal
More informationEvery real system has uncertainties, which include system parametric uncertainties, unmodeled dynamics
Sensitivity Analysis of Disturbance Accommodating Control with Kalman Filter Estimation Jemin George and John L. Crassidis University at Buffalo, State University of New York, Amherst, NY, 14-44 The design
More informationOptimization-Based Control
Optimization-Based Control Richard M. Murray Control and Dynamical Systems California Institute of Technology DRAFT v1.7a, 19 February 2008 c California Institute of Technology All rights reserved. This
More informationHeterogeneous Track-to-Track Fusion
Heterogeneous Track-to-Track Fusion Ting Yuan, Yaakov Bar-Shalom and Xin Tian University of Connecticut, ECE Dept. Storrs, CT 06269 E-mail: {tiy, ybs, xin.tian}@ee.uconn.edu T. Yuan, Y. Bar-Shalom and
More informationLecture 4: Extended Kalman filter and Statistically Linearized Filter
Lecture 4: Extended Kalman filter and Statistically Linearized Filter Department of Biomedical Engineering and Computational Science Aalto University February 17, 2011 Contents 1 Overview of EKF 2 Linear
More informationMini-Course 07 Kalman Particle Filters. Henrique Massard da Fonseca Cesar Cunha Pacheco Wellington Bettencurte Julio Dutra
Mini-Course 07 Kalman Particle Filters Henrique Massard da Fonseca Cesar Cunha Pacheco Wellington Bettencurte Julio Dutra Agenda State Estimation Problems & Kalman Filter Henrique Massard Steady State
More informationLecture 4: Numerical Solution of SDEs, Itô Taylor Series, Gaussian Process Approximations
Lecture 4: Numerical Solution of SDEs, Itô Taylor Series, Gaussian Process Approximations Simo Särkkä Aalto University Tampere University of Technology Lappeenranta University of Technology Finland November
More informationLecture 4: Least Squares (LS) Estimation
ME 233, UC Berkeley, Spring 2014 Xu Chen Lecture 4: Least Squares (LS) Estimation Background and general solution Solution in the Gaussian case Properties Example Big picture general least squares estimation:
More informationCoordinate systems and vectors in three spatial dimensions
PHYS2796 Introduction to Modern Physics (Spring 2015) Notes on Mathematics Prerequisites Jim Napolitano, Department of Physics, Temple University January 7, 2015 This is a brief summary of material on
More informationwith Application to Autonomous Vehicles
Nonlinear with Application to Autonomous Vehicles (Ph.D. Candidate) C. Silvestre (Supervisor) P. Oliveira (Co-supervisor) Institute for s and Robotics Instituto Superior Técnico Portugal January 2010 Presentation
More information2D Image Processing. Bayes filter implementation: Kalman filter
2D Image Processing Bayes filter implementation: Kalman filter Prof. Didier Stricker Dr. Gabriele Bleser Kaiserlautern University http://ags.cs.uni-kl.de/ DFKI Deutsches Forschungszentrum für Künstliche
More informationConstrained State Estimation Using the Unscented Kalman Filter
16th Mediterranean Conference on Control and Automation Congress Centre, Ajaccio, France June 25-27, 28 Constrained State Estimation Using the Unscented Kalman Filter Rambabu Kandepu, Lars Imsland and
More informationHomework: Ball Predictor for Robot Soccer
Homework: Ball Predictor for Robot Soccer ECEn 483 November 17, 2004 1 Introduction The objective of this homework is to prepare you to implement a ball predictor in the lab An extended Kalman filter will
More information1 Introduction ISSN
Techset Composition Ltd, Salisbury Doc: {IEE}CTA/Articles/Pagination/CTA58454.3d www.ietdl.org Published in IET Control Theory and Applications Received on 15th January 2009 Revised on 18th May 2009 ISSN
More informationA Comparitive Study Of Kalman Filter, Extended Kalman Filter And Unscented Kalman Filter For Harmonic Analysis Of The Non-Stationary Signals
International Journal of Scientific & Engineering Research, Volume 3, Issue 7, July-2012 1 A Comparitive Study Of Kalman Filter, Extended Kalman Filter And Unscented Kalman Filter For Harmonic Analysis
More informationOpen Access Target Tracking Algorithm Based on Improved Unscented Kalman Filter
Send Orders for Reprints to reprints@benthamscience.ae he Open Automation and Control Systems Journal, 2015, 7, 991-995 991 Open Access arget racking Algorithm Based on Improved Unscented Kalman Filter
More informationThe Kalman filter. Chapter 6
Chapter 6 The Kalman filter In the last chapter, we saw that in Data Assimilation, we ultimately desire knowledge of the full a posteriori p.d.f., that is the conditional p.d.f. of the state given the
More informationOnline monitoring of MPC disturbance models using closed-loop data
Online monitoring of MPC disturbance models using closed-loop data Brian J. Odelson and James B. Rawlings Department of Chemical Engineering University of Wisconsin-Madison Online Optimization Based Identification
More informationRobotics 2 Target Tracking. Kai Arras, Cyrill Stachniss, Maren Bennewitz, Wolfram Burgard
Robotics 2 Target Tracking Kai Arras, Cyrill Stachniss, Maren Bennewitz, Wolfram Burgard Slides by Kai Arras, Gian Diego Tipaldi, v.1.1, Jan 2012 Chapter Contents Target Tracking Overview Applications
More information11 a 12 a 21 a 11 a 22 a 12 a 21. (C.11) A = The determinant of a product of two matrices is given by AB = A B 1 1 = (C.13) and similarly.
C PROPERTIES OF MATRICES 697 to whether the permutation i 1 i 2 i N is even or odd, respectively Note that I =1 Thus, for a 2 2 matrix, the determinant takes the form A = a 11 a 12 = a a 21 a 11 a 22 a
More informationLecture 6: Multiple Model Filtering, Particle Filtering and Other Approximations
Lecture 6: Multiple Model Filtering, Particle Filtering and Other Approximations Department of Biomedical Engineering and Computational Science Aalto University April 28, 2010 Contents 1 Multiple Model
More informationChapter 4 State Estimation
Chapter 4 State Estimation Navigation of an unmanned vehicle, always depends on a good estimation of the vehicle states. Especially if no external sensors or marers are available, more or less complex
More informationNew Fast Kalman filter method
New Fast Kalman filter method Hojat Ghorbanidehno, Hee Sun Lee 1. Introduction Data assimilation methods combine dynamical models of a system with typically noisy observations to obtain estimates of the
More informationOrganization. I MCMC discussion. I project talks. I Lecture.
Organization I MCMC discussion I project talks. I Lecture. Content I Uncertainty Propagation Overview I Forward-Backward with an Ensemble I Model Reduction (Intro) Uncertainty Propagation in Causal Systems
More informationGreg Welch and Gary Bishop. University of North Carolina at Chapel Hill Department of Computer Science.
STC Lecture Series An Introduction to the Kalman Filter Greg Welch and Gary Bishop University of North Carolina at Chapel Hill Department of Computer Science http://www.cs.unc.edu/~welch/kalmanlinks.html
More informationLecture 13 and 14: Bayesian estimation theory
1 Lecture 13 and 14: Bayesian estimation theory Spring 2012 - EE 194 Networked estimation and control (Prof. Khan) March 26 2012 I. BAYESIAN ESTIMATORS Mother Nature conducts a random experiment that generates
More informationState estimation and the Kalman filter
State estimation and the Kalman filter PhD, David Di Ruscio Telemark university college Department of Technology Systems and Control Engineering N-3914 Porsgrunn, Norway Fax: +47 35 57 52 50 Tel: +47 35
More informationAUTOMATIC CONTROL COMMUNICATION SYSTEMS LINKÖPINGS UNIVERSITET. Questions AUTOMATIC CONTROL COMMUNICATION SYSTEMS LINKÖPINGS UNIVERSITET
The Problem Identification of Linear and onlinear Dynamical Systems Theme : Curve Fitting Division of Automatic Control Linköping University Sweden Data from Gripen Questions How do the control surface
More informationVlad Estivill-Castro. Robots for People --- A project for intelligent integrated systems
1 Vlad Estivill-Castro Robots for People --- A project for intelligent integrated systems V. Estivill-Castro 2 Probabilistic Map-based Localization (Kalman Filter) Chapter 5 (textbook) Based on textbook
More informationStatistics and Data Analysis
Statistics and Data Analysis The Crash Course Physics 226, Fall 2013 "There are three kinds of lies: lies, damned lies, and statistics. Mark Twain, allegedly after Benjamin Disraeli Statistics and Data
More informationSYSTEMTEORI - KALMAN FILTER VS LQ CONTROL
SYSTEMTEORI - KALMAN FILTER VS LQ CONTROL 1. Optimal regulator with noisy measurement Consider the following system: ẋ = Ax + Bu + w, x(0) = x 0 where w(t) is white noise with Ew(t) = 0, and x 0 is a stochastic
More information9 Multi-Model State Estimation
Technion Israel Institute of Technology, Department of Electrical Engineering Estimation and Identification in Dynamical Systems (048825) Lecture Notes, Fall 2009, Prof. N. Shimkin 9 Multi-Model State
More informationState Estimation Based on Nested Particle Filters
Cleveland State University EngagedScholarship@CSU ETD Archive 2013 State Estimation Based on Nested Particle Filters Swathi Srinivasan Cleveland State University How does access to this work benefit you?
More informationA NEW NONLINEAR FILTER
COMMUNICATIONS IN INFORMATION AND SYSTEMS c 006 International Press Vol 6, No 3, pp 03-0, 006 004 A NEW NONLINEAR FILTER ROBERT J ELLIOTT AND SIMON HAYKIN Abstract A discrete time filter is constructed
More informationLecture 6: Bayesian Inference in SDE Models
Lecture 6: Bayesian Inference in SDE Models Bayesian Filtering and Smoothing Point of View Simo Särkkä Aalto University Simo Särkkä (Aalto) Lecture 6: Bayesian Inference in SDEs 1 / 45 Contents 1 SDEs
More informationEE 565: Position, Navigation, and Timing
EE 565: Position, Navigation, and Timing Kalman Filtering Example Aly El-Osery Kevin Wedeward Electrical Engineering Department, New Mexico Tech Socorro, New Mexico, USA In Collaboration with Stephen Bruder
More informationDual Estimation and the Unscented Transformation
Dual Estimation and the Unscented Transformation Eric A. Wan ericwan@ece.ogi.edu Rudolph van der Merwe rudmerwe@ece.ogi.edu Alex T. Nelson atnelson@ece.ogi.edu Oregon Graduate Institute of Science & Technology
More informationKalman Filter and Parameter Identification. Florian Herzog
Kalman Filter and Parameter Identification Florian Herzog 2013 Continuous-time Kalman Filter In this chapter, we shall use stochastic processes with independent increments w 1 (.) and w 2 (.) at the input
More informationLecture 2: Controllability of nonlinear systems
DISC Systems and Control Theory of Nonlinear Systems 1 Lecture 2: Controllability of nonlinear systems Nonlinear Dynamical Control Systems, Chapter 3 See www.math.rug.nl/ arjan (under teaching) for info
More informationLecture 2: Linear Models. Bruce Walsh lecture notes Seattle SISG -Mixed Model Course version 23 June 2011
Lecture 2: Linear Models Bruce Walsh lecture notes Seattle SISG -Mixed Model Course version 23 June 2011 1 Quick Review of the Major Points The general linear model can be written as y = X! + e y = vector
More informationNumerical Methods I Solving Nonlinear Equations
Numerical Methods I Solving Nonlinear Equations Aleksandar Donev Courant Institute, NYU 1 donev@courant.nyu.edu 1 MATH-GA 2011.003 / CSCI-GA 2945.003, Fall 2014 October 16th, 2014 A. Donev (Courant Institute)
More informationNonlinear Observer Design for Dynamic Positioning
Author s Name, Company Title of the Paper DYNAMIC POSITIONING CONFERENCE November 15-16, 2005 Control Systems I J.G. Snijders, J.W. van der Woude Delft University of Technology (The Netherlands) J. Westhuis
More information