Lecture 7: Optimal Smoothing


1 Department of Biomedical Engineering and Computational Science, Aalto University. March 17, 2011

2 Contents
1 What is Optimal Smoothing?
2 Bayesian Optimal Smoothing Equations
3 Rauch-Tung-Striebel Smoother
4 Gaussian Approximation Based Smoothing
5 Particle Smoothing
6 Summary and Demonstration

3 Filtering, Prediction and Smoothing

4 Types of Smoothing Problems
Fixed-interval smoothing: estimate the states on an interval [0, T] given measurements on the same interval.
Fixed-point smoothing: estimate the state at a fixed point of time in the past.
Fixed-lag smoothing: estimate the state at a fixed delay in the past.
Here we shall only consider fixed-interval smoothing; the others can be derived from it quite easily.

5 Examples of Smoothing Problems
Given all the radar measurements of a rocket (or missile) trajectory, what was the exact place of launch?
Estimate the whole trajectory of a car based on GPS measurements, to calibrate the inertial navigation system accurately.
What was the history of a chemical/combustion/other process given a batch of measurements from it?
Remove noise from an audio signal by using a smoother to estimate the true audio signal under the noise.
The smoothing solution also arises in the EM algorithm for estimating the parameters of a state space model.

6 Optimal Smoothing Algorithms
Linear Gaussian models: Rauch-Tung-Striebel smoother (RTSS); two-filter smoother.
Non-linear Gaussian models: extended Rauch-Tung-Striebel smoother (ERTSS); unscented Rauch-Tung-Striebel smoother (URTSS); statistically linearized Rauch-Tung-Striebel smoother (SLRTSS); Gaussian Rauch-Tung-Striebel smoothers (GRTSS): cubature, Gauss-Hermite, Bayes-Hermite, Monte Carlo; two-filter versions of the above.
Non-linear non-Gaussian models: sequential importance resampling based smoother; Rao-Blackwellized particle smoothers; grid based smoother.

7 Problem Formulation
Probabilistic state space model:
measurement model: $y_k \sim p(y_k \mid x_k)$
dynamic model: $x_k \sim p(x_k \mid x_{k-1})$
Assume that the filtering distributions $p(x_k \mid y_{1:k})$ have already been computed for all $k = 0, \dots, T$.
We want recursive equations for computing the smoothing distribution $p(x_k \mid y_{1:T})$ for all $k < T$.
The recursion will go backwards in time, because on the last step the filtering and smoothing distributions coincide: $p(x_T \mid y_{1:T})$.

8 Derivation of Formal Smoothing Equations [1/2]
The key: due to the Markov properties of the state we have
$$p(x_k \mid x_{k+1}, y_{1:T}) = p(x_k \mid x_{k+1}, y_{1:k}).$$
Thus we get:
$$
p(x_k \mid x_{k+1}, y_{1:T})
= p(x_k \mid x_{k+1}, y_{1:k})
= \frac{p(x_k, x_{k+1} \mid y_{1:k})}{p(x_{k+1} \mid y_{1:k})}
= \frac{p(x_{k+1} \mid x_k, y_{1:k})\, p(x_k \mid y_{1:k})}{p(x_{k+1} \mid y_{1:k})}
= \frac{p(x_{k+1} \mid x_k)\, p(x_k \mid y_{1:k})}{p(x_{k+1} \mid y_{1:k})}.
$$

9 Derivation of Formal Smoothing Equations [2/2]
Assuming that the smoothing distribution of the next step $p(x_{k+1} \mid y_{1:T})$ is available, we get
$$
p(x_k, x_{k+1} \mid y_{1:T})
= p(x_k \mid x_{k+1}, y_{1:T})\, p(x_{k+1} \mid y_{1:T})
= p(x_k \mid x_{k+1}, y_{1:k})\, p(x_{k+1} \mid y_{1:T})
= \frac{p(x_{k+1} \mid x_k)\, p(x_k \mid y_{1:k})\, p(x_{k+1} \mid y_{1:T})}{p(x_{k+1} \mid y_{1:k})}.
$$
Integrating over $x_{k+1}$ gives
$$
p(x_k \mid y_{1:T}) = p(x_k \mid y_{1:k}) \int \frac{p(x_{k+1} \mid x_k)\, p(x_{k+1} \mid y_{1:T})}{p(x_{k+1} \mid y_{1:k})}\, dx_{k+1}.
$$

10 Bayesian Optimal Smoothing Equations
The Bayesian optimal smoothing equations consist of a prediction step and a backward update step:
$$
p(x_{k+1} \mid y_{1:k}) = \int p(x_{k+1} \mid x_k)\, p(x_k \mid y_{1:k})\, dx_k
$$
$$
p(x_k \mid y_{1:T}) = p(x_k \mid y_{1:k}) \int \frac{p(x_{k+1} \mid x_k)\, p(x_{k+1} \mid y_{1:T})}{p(x_{k+1} \mid y_{1:k})}\, dx_{k+1}
$$
The recursion is started from the filtering (and smoothing) distribution of the last time step, $p(x_T \mid y_{1:T})$.
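These equations rarely have closed-form solutions, but for a low-dimensional state they can be evaluated numerically on a grid. The following Python/NumPy sketch runs a grid-based forward filter and then the backward smoothing recursion above; the scalar Gauss-Markov model and all numeric values are illustrative choices of this edit, not from the lecture.

```python
import numpy as np

# Hypothetical scalar model (for illustration only):
#   x_k = 0.9 x_{k-1} + q_{k-1},  q ~ N(0, 0.1);   y_k = x_k + r_k,  r ~ N(0, 0.5)
rng = np.random.default_rng(0)
T, a, q_var, r_var = 50, 0.9, 0.1, 0.5
x = np.zeros(T)
for k in range(1, T):
    x[k] = a * x[k - 1] + rng.normal(0, np.sqrt(q_var))
y = x + rng.normal(0, np.sqrt(r_var), T)

grid = np.linspace(-5, 5, 401)
dx = grid[1] - grid[0]
gauss = lambda z, m, v: np.exp(-0.5 * (z - m) ** 2 / v) / np.sqrt(2 * np.pi * v)

# trans[i, j] = p(x_k = grid[i] | x_{k-1} = grid[j])
trans = gauss(grid[:, None], a * grid[None, :], q_var)

# Forward pass: filtering distributions p(x_k | y_{1:k}) on the grid
filt = np.zeros((T, grid.size))
pred = np.zeros((T, grid.size))
p = gauss(grid, 0.0, 1.0)                     # prior p(x_0)
for k in range(T):
    pred[k] = trans @ p * dx if k > 0 else p  # prediction step
    p = pred[k] * gauss(y[k], grid, r_var)    # Bayes update with y_k
    p /= p.sum() * dx
    filt[k] = p

# Backward pass: the smoothing recursion of this slide
smooth = np.zeros_like(filt)
smooth[-1] = filt[-1]                         # p(x_T | y_{1:T}) coincides with filtering
for k in range(T - 2, -1, -1):
    ratio = smooth[k + 1] / np.maximum(pred[k + 1], 1e-300)
    smooth[k] = filt[k] * (trans.T @ ratio) * dx

print("smoothed mean at k=10:", (grid * smooth[10]).sum() * dx)
```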

11 Linear-Gaussian Smoothing Problem
Gaussian driven linear model, i.e., Gauss-Markov model:
$$x_k = A_{k-1}\, x_{k-1} + q_{k-1}$$
$$y_k = H_k\, x_k + r_k.$$
In probabilistic terms the model is
$$p(x_k \mid x_{k-1}) = N(x_k \mid A_{k-1} x_{k-1}, Q_{k-1})$$
$$p(y_k \mid x_k) = N(y_k \mid H_k x_k, R_k).$$
The Kalman filter can be used for computing all the Gaussian filtering distributions:
$$p(x_k \mid y_{1:k}) = N(x_k \mid m_k, P_k).$$

12 RTS: Derivation Preliminaries
Gaussian probability density:
$$N(x \mid m, P) = \frac{1}{(2\pi)^{n/2}\, |P|^{1/2}} \exp\left(-\frac{1}{2}(x - m)^T P^{-1} (x - m)\right).$$
Let $x$ and $y$ have the Gaussian densities
$$p(x) = N(x \mid m, P), \qquad p(y \mid x) = N(y \mid H x, R).$$
Then the joint and marginal distributions are
$$
\begin{pmatrix} x \\ y \end{pmatrix} \sim
N\left(\begin{pmatrix} m \\ H m \end{pmatrix},
\begin{pmatrix} P & P H^T \\ H P & H P H^T + R \end{pmatrix}\right),
\qquad
y \sim N(H m,\ H P H^T + R).
$$

13 RTS: Derivation Preliminaries (cont.)
If the random variables $x$ and $y$ have the joint Gaussian probability density
$$
\begin{pmatrix} x \\ y \end{pmatrix} \sim
N\left(\begin{pmatrix} a \\ b \end{pmatrix},
\begin{pmatrix} A & C \\ C^T & B \end{pmatrix}\right),
$$
then the marginal and conditional densities of $x$ and $y$ are given as follows:
$$x \sim N(a, A)$$
$$y \sim N(b, B)$$
$$x \mid y \sim N(a + C B^{-1} (y - b),\ A - C B^{-1} C^T)$$
$$y \mid x \sim N(b + C^T A^{-1} (x - a),\ B - C^T A^{-1} C).$$
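These identities are easy to check numerically. A small sketch (the joint-Gaussian numbers are arbitrary illustrative values) evaluating the conditional mean and covariance of $x$ given $y$:

```python
import numpy as np

# Example joint Gaussian (arbitrary illustrative numbers)
a, b = np.array([1.0, 0.0]), np.array([2.0])
A = np.array([[1.0, 0.3], [0.3, 1.0]])
C = np.array([[0.5], [0.2]])
B = np.array([[2.0]])

y = np.array([2.5])  # observed value of y
# x | y ~ N(a + C B^{-1} (y - b),  A - C B^{-1} C^T)
cond_mean = a + C @ np.linalg.solve(B, y - b)
cond_cov = A - C @ np.linalg.solve(B, C.T)
print(cond_mean, cond_cov)
```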

14 Derivation of Rauch-Tung-Striebel Smoother [1/4]
By the Gaussian distribution computation rules we get
$$
p(x_k, x_{k+1} \mid y_{1:k}) = p(x_{k+1} \mid x_k)\, p(x_k \mid y_{1:k})
= N(x_{k+1} \mid A_k x_k, Q_k)\, N(x_k \mid m_k, P_k)
= N\left(\begin{pmatrix} x_k \\ x_{k+1} \end{pmatrix} \,\middle|\, m_1, P_1\right),
$$
where
$$
m_1 = \begin{pmatrix} m_k \\ A_k m_k \end{pmatrix}, \qquad
P_1 = \begin{pmatrix} P_k & P_k A_k^T \\ A_k P_k & A_k P_k A_k^T + Q_k \end{pmatrix}.
$$

15 Derivation of Rauch-Tung-Striebel Smoother [2/4]
By the conditioning rule of the Gaussian distribution we get
$$p(x_k \mid x_{k+1}, y_{1:T}) = p(x_k \mid x_{k+1}, y_{1:k}) = N(x_k \mid m_2, P_2),$$
where
$$G_k = P_k A_k^T\, (A_k P_k A_k^T + Q_k)^{-1}$$
$$m_2 = m_k + G_k\, (x_{k+1} - A_k m_k)$$
$$P_2 = P_k - G_k\, (A_k P_k A_k^T + Q_k)\, G_k^T.$$

16 Derivation of Rauch-Tung-Striebel Smoother [3/4]
The joint distribution of $x_k$ and $x_{k+1}$ given all the data is
$$
p(x_{k+1}, x_k \mid y_{1:T}) = p(x_k \mid x_{k+1}, y_{1:T})\, p(x_{k+1} \mid y_{1:T})
= N(x_k \mid m_2, P_2)\, N(x_{k+1} \mid m^s_{k+1}, P^s_{k+1})
= N\left(\begin{pmatrix} x_{k+1} \\ x_k \end{pmatrix} \,\middle|\, m_3, P_3\right),
$$
where
$$
m_3 = \begin{pmatrix} m^s_{k+1} \\ m_k + G_k\, (m^s_{k+1} - A_k m_k) \end{pmatrix}, \qquad
P_3 = \begin{pmatrix} P^s_{k+1} & P^s_{k+1} G_k^T \\ G_k P^s_{k+1} & G_k P^s_{k+1} G_k^T + P_2 \end{pmatrix}.
$$

17 Derivation of Rauch-Tung-Striebel Smoother [4/4]
The marginal mean and covariance are thus given as
$$m^s_k = m_k + G_k\, (m^s_{k+1} - A_k m_k)$$
$$P^s_k = P_k + G_k\, (P^s_{k+1} - A_k P_k A_k^T - Q_k)\, G_k^T.$$
The smoothing distribution is then Gaussian with the above mean and covariance:
$$p(x_k \mid y_{1:T}) = N(x_k \mid m^s_k, P^s_k).$$

18 Rauch-Tung-Striebel Smoother
Backward recursion equations for the smoothed means $m^s_k$ and covariances $P^s_k$:
$$m^-_{k+1} = A_k\, m_k$$
$$P^-_{k+1} = A_k\, P_k\, A_k^T + Q_k$$
$$G_k = P_k\, A_k^T\, [P^-_{k+1}]^{-1}$$
$$m^s_k = m_k + G_k\, [m^s_{k+1} - m^-_{k+1}]$$
$$P^s_k = P_k + G_k\, [P^s_{k+1} - P^-_{k+1}]\, G_k^T,$$
where $m_k$ and $P_k$ are the mean and covariance computed by the Kalman filter.
The recursion is started from the last time step $T$, with $m^s_T = m_T$ and $P^s_T = P_T$.
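The backward recursion maps directly to code. A minimal Python/NumPy sketch, assuming for simplicity that $A$ and $Q$ are constant and that the filtered means and covariances $m_k$, $P_k$ have already been computed by a Kalman filter:

```python
import numpy as np

def rts_smoother(ms, Ps, A, Q):
    """Backward RTS pass.

    ms : (T, n)    filtered means m_k from the Kalman filter
    Ps : (T, n, n) filtered covariances P_k
    A, Q : constant dynamic model matrices (simplifying assumption)
    Returns smoothed means m_k^s and covariances P_k^s.
    """
    T, n = ms.shape
    ms_s, Ps_s = ms.copy(), Ps.copy()        # recursion starts from m_T, P_T
    for k in range(T - 2, -1, -1):
        m_pred = A @ ms[k]                   # m^-_{k+1} = A m_k
        P_pred = A @ Ps[k] @ A.T + Q         # P^-_{k+1} = A P_k A^T + Q
        # explicit inverse kept to mirror the equations; a linear solve
        # is numerically preferable in practice
        G = Ps[k] @ A.T @ np.linalg.inv(P_pred)   # smoother gain G_k
        ms_s[k] = ms[k] + G @ (ms_s[k + 1] - m_pred)
        Ps_s[k] = Ps[k] + G @ (Ps_s[k + 1] - P_pred) @ G.T
    return ms_s, Ps_s
```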

19 RTS Smoother: Car Tracking Example
The dynamic model of the car tracking model from the first & third lectures was:
$$
\begin{pmatrix} x_k \\ y_k \\ \dot{x}_k \\ \dot{y}_k \end{pmatrix}
= \underbrace{\begin{pmatrix}
1 & 0 & \Delta t & 0 \\
0 & 1 & 0 & \Delta t \\
0 & 0 & 1 & 0 \\
0 & 0 & 0 & 1
\end{pmatrix}}_{A}
\begin{pmatrix} x_{k-1} \\ y_{k-1} \\ \dot{x}_{k-1} \\ \dot{y}_{k-1} \end{pmatrix}
+ q_{k-1},
$$
where $q_{k-1}$ is zero mean with a covariance matrix
$$
Q = \begin{pmatrix}
q^c_1 \Delta t^3/3 & 0 & q^c_1 \Delta t^2/2 & 0 \\
0 & q^c_2 \Delta t^3/3 & 0 & q^c_2 \Delta t^2/2 \\
q^c_1 \Delta t^2/2 & 0 & q^c_1 \Delta t & 0 \\
0 & q^c_2 \Delta t^2/2 & 0 & q^c_2 \Delta t
\end{pmatrix}.
$$
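As a sketch, the corresponding matrices for a given time step $\Delta t$ (the parameter values in the example call are illustrative, not from the lecture):

```python
import numpy as np

def car_model(dt, q1c, q2c):
    """Discretized constant-velocity model with state (x, y, xdot, ydot)."""
    A = np.array([[1, 0, dt, 0],
                  [0, 1, 0, dt],
                  [0, 0, 1,  0],
                  [0, 0, 0,  1]], dtype=float)
    Q = np.array([[q1c*dt**3/3, 0,           q1c*dt**2/2, 0          ],
                  [0,           q2c*dt**3/3, 0,           q2c*dt**2/2],
                  [q1c*dt**2/2, 0,           q1c*dt,      0          ],
                  [0,           q2c*dt**2/2, 0,           q2c*dt     ]])
    return A, Q

A, Q = car_model(dt=0.1, q1c=1.0, q2c=1.0)  # example values
```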

20 Non-Linear Smoothing Problem
Non-linear Gaussian state space model:
$$x_k = f(x_{k-1}) + q_{k-1}$$
$$y_k = h(x_k) + r_k.$$
We want to compute Gaussian approximations to the smoothing distributions:
$$p(x_k \mid y_{1:T}) \approx N(x_k \mid m^s_k, P^s_k).$$

21 Extended Rauch-Tung-Striebel Smoother Derivation
The approximate joint distribution of $x_k$ and $x_{k+1}$ is
$$
p(x_k, x_{k+1} \mid y_{1:k}) = N\left(\begin{pmatrix} x_k \\ x_{k+1} \end{pmatrix} \,\middle|\, m_1, P_1\right),
$$
where
$$
m_1 = \begin{pmatrix} m_k \\ f(m_k) \end{pmatrix}, \qquad
P_1 = \begin{pmatrix} P_k & P_k\, F_x^T(m_k) \\ F_x(m_k)\, P_k & F_x(m_k)\, P_k\, F_x^T(m_k) + Q_k \end{pmatrix}.
$$
The rest of the derivation is analogous to the linear RTS smoother.

22 Extended Rauch-Tung-Striebel Smoother
The equations for the extended RTS smoother are
$$m^-_{k+1} = f(m_k)$$
$$P^-_{k+1} = F_x(m_k)\, P_k\, F_x^T(m_k) + Q_k$$
$$G_k = P_k\, F_x^T(m_k)\, [P^-_{k+1}]^{-1}$$
$$m^s_k = m_k + G_k\, [m^s_{k+1} - m^-_{k+1}]$$
$$P^s_k = P_k + G_k\, [P^s_{k+1} - P^-_{k+1}]\, G_k^T,$$
where the matrix $F_x(m_k)$ is the Jacobian matrix of $f(x)$ evaluated at $m_k$.
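A sketch of the ERTSS backward pass, obtained from the linear RTS code above by replacing $A\, m_k$ with $f(m_k)$ and $A$ with the Jacobian $F_x(m_k)$; the functions f and F_jac are user-supplied:

```python
import numpy as np

def erts_smoother(ms, Ps, f, F_jac, Q):
    """Extended RTS backward pass.

    f     : dynamic model function f(x)
    F_jac : function returning the Jacobian F_x(m)
    Q     : process noise covariance (assumed constant for simplicity)
    """
    T = len(ms)
    ms_s, Ps_s = ms.copy(), Ps.copy()
    for k in range(T - 2, -1, -1):
        F = F_jac(ms[k])
        m_pred = f(ms[k])                        # m^-_{k+1}
        P_pred = F @ Ps[k] @ F.T + Q             # P^-_{k+1}
        G = Ps[k] @ F.T @ np.linalg.inv(P_pred)  # G_k
        ms_s[k] = ms[k] + G @ (ms_s[k + 1] - m_pred)
        Ps_s[k] = Ps[k] + G @ (Ps_s[k + 1] - P_pred) @ G.T
    return ms_s, Ps_s
```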

23 Statistically Linearized Rauch-Tung-Striebel Smoother Derivation
With statistical linearization we get the approximation
$$
p(x_k, x_{k+1} \mid y_{1:k}) = N\left(\begin{pmatrix} x_k \\ x_{k+1} \end{pmatrix} \,\middle|\, m_1, P_1\right),
$$
where
$$
m_1 = \begin{pmatrix} m_k \\ E[f(x_k)] \end{pmatrix}, \qquad
P_1 = \begin{pmatrix} P_k & E[f(x_k)\,\delta x_k^T]^T \\ E[f(x_k)\,\delta x_k^T] & E[f(x_k)\,\delta x_k^T]\, P_k^{-1}\, E[f(x_k)\,\delta x_k^T]^T + Q_k \end{pmatrix},
$$
with $\delta x_k = x_k - m_k$. The expectations are taken with respect to the filtering distribution of $x_k$.
The derivation proceeds as with the linear RTS smoother.

24 Statistically Linearized Rauch-Tung-Striebel Smoother
The equations for the statistically linearized RTS smoother are
$$m^-_{k+1} = E[f(x_k)]$$
$$P^-_{k+1} = E[f(x_k)\,\delta x_k^T]\, P_k^{-1}\, E[f(x_k)\,\delta x_k^T]^T + Q_k$$
$$G_k = E[f(x_k)\,\delta x_k^T]^T\, [P^-_{k+1}]^{-1}$$
$$m^s_k = m_k + G_k\, [m^s_{k+1} - m^-_{k+1}]$$
$$P^s_k = P_k + G_k\, [P^s_{k+1} - P^-_{k+1}]\, G_k^T,$$
where the expectations are taken with respect to the filtering distribution $x_k \sim N(m_k, P_k)$.

25 Gaussian Rauch-Tung-Striebel Smoother Derivation
With Gaussian moment matching we get the approximation
$$
p(x_k, x_{k+1} \mid y_{1:k}) = N\left(\begin{pmatrix} x_k \\ x_{k+1} \end{pmatrix} \,\middle|\,
\begin{pmatrix} m_k \\ m^-_{k+1} \end{pmatrix},
\begin{pmatrix} P_k & D_{k+1} \\ D_{k+1}^T & P^-_{k+1} \end{pmatrix}\right),
$$
where
$$m^-_{k+1} = \int f(x_k)\, N(x_k \mid m_k, P_k)\, dx_k$$
$$P^-_{k+1} = \int [f(x_k) - m^-_{k+1}]\,[f(x_k) - m^-_{k+1}]^T\, N(x_k \mid m_k, P_k)\, dx_k + Q_k$$
$$D_{k+1} = \int [x_k - m_k]\,[f(x_k) - m^-_{k+1}]^T\, N(x_k \mid m_k, P_k)\, dx_k.$$

26 Gaussian Rauch-Tung-Striebel Smoother
The equations for the Gaussian RTS smoother are
$$m^-_{k+1} = \int f(x_k)\, N(x_k \mid m_k, P_k)\, dx_k$$
$$P^-_{k+1} = \int [f(x_k) - m^-_{k+1}]\,[f(x_k) - m^-_{k+1}]^T\, N(x_k \mid m_k, P_k)\, dx_k + Q_k$$
$$D_{k+1} = \int [x_k - m_k]\,[f(x_k) - m^-_{k+1}]^T\, N(x_k \mid m_k, P_k)\, dx_k$$
$$G_k = D_{k+1}\, [P^-_{k+1}]^{-1}$$
$$m^s_k = m_k + G_k\, (m^s_{k+1} - m^-_{k+1})$$
$$P^s_k = P_k + G_k\, (P^s_{k+1} - P^-_{k+1})\, G_k^T.$$

27 Cubature Smoother Derivation [1/2]
Recall the 3rd order spherical Gaussian integral rule:
$$
\int g(x)\, N(x \mid m, P)\, dx \approx \frac{1}{2n} \sum_{i=1}^{2n} g(m + \sqrt{P}\, \xi^{(i)}),
$$
where
$$
\xi^{(i)} =
\begin{cases}
\sqrt{n}\, e_i, & i = 1, \dots, n \\
-\sqrt{n}\, e_{i-n}, & i = n+1, \dots, 2n,
\end{cases}
$$
where $e_i$ denotes a unit vector in the direction of coordinate axis $i$.
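As a standalone sketch, the rule is only a few lines of NumPy (using the lower Cholesky factor as the matrix square root):

```python
import numpy as np

def cubature_integrate(g, m, P):
    """Approximate E[g(x)] for x ~ N(m, P) with the 3rd order spherical rule."""
    n = m.size
    L = np.linalg.cholesky(P)                             # matrix square root of P
    xi = np.sqrt(n) * np.hstack([np.eye(n), -np.eye(n)])  # unit sigma points
    sigma = m[:, None] + L @ xi                           # columns are sigma points
    return sum(g(sigma[:, i]) for i in range(2 * n)) / (2 * n)

# Example: E[x_1^2] under N(0, I) in 2D; the rule is exact here (answer 1.0)
print(cubature_integrate(lambda x: x[0] ** 2, np.zeros(2), np.eye(2)))
```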

28 Cubature Smoother Derivation [2/2]
We get the approximation
$$
p(x_k, x_{k+1} \mid y_{1:k}) = N\left(\begin{pmatrix} x_k \\ x_{k+1} \end{pmatrix} \,\middle|\,
\begin{pmatrix} m_k \\ m^-_{k+1} \end{pmatrix},
\begin{pmatrix} P_k & D_{k+1} \\ D_{k+1}^T & P^-_{k+1} \end{pmatrix}\right),
$$
where
$$X^{(i)}_k = m_k + \sqrt{P_k}\, \xi^{(i)}$$
$$m^-_{k+1} = \frac{1}{2n} \sum_{i=1}^{2n} f(X^{(i)}_k)$$
$$P^-_{k+1} = \frac{1}{2n} \sum_{i=1}^{2n} [f(X^{(i)}_k) - m^-_{k+1}]\,[f(X^{(i)}_k) - m^-_{k+1}]^T + Q_k$$
$$D_{k+1} = \frac{1}{2n} \sum_{i=1}^{2n} [X^{(i)}_k - m_k]\,[f(X^{(i)}_k) - m^-_{k+1}]^T.$$

29 Cubature Rauch-Tung-Striebel Smoother [1/3]
1 Form the sigma points:
$$X^{(i)}_k = m_k + \sqrt{P_k}\, \xi^{(i)}, \qquad i = 1, \dots, 2n,$$
where the unit sigma points are defined as
$$
\xi^{(i)} =
\begin{cases}
\sqrt{n}\, e_i, & i = 1, \dots, n \\
-\sqrt{n}\, e_{i-n}, & i = n+1, \dots, 2n.
\end{cases}
$$
2 Propagate the sigma points through the dynamic model:
$$\hat{X}^{(i)}_{k+1} = f(X^{(i)}_k), \qquad i = 1, \dots, 2n.$$

30 Cubature Rauch-Tung-Striebel Smoother [2/3]
3 Compute the predicted mean $m^-_{k+1}$, the predicted covariance $P^-_{k+1}$ and the cross-covariance $D_{k+1}$:
$$m^-_{k+1} = \frac{1}{2n} \sum_{i=1}^{2n} \hat{X}^{(i)}_{k+1}$$
$$P^-_{k+1} = \frac{1}{2n} \sum_{i=1}^{2n} (\hat{X}^{(i)}_{k+1} - m^-_{k+1})(\hat{X}^{(i)}_{k+1} - m^-_{k+1})^T + Q_k$$
$$D_{k+1} = \frac{1}{2n} \sum_{i=1}^{2n} (X^{(i)}_k - m_k)(\hat{X}^{(i)}_{k+1} - m^-_{k+1})^T.$$

31 Cubature Rauch-Tung-Striebel Smoother [3/3]
4 Compute the gain $G_k$, mean $m^s_k$ and covariance $P^s_k$ as follows:
$$G_k = D_{k+1}\, [P^-_{k+1}]^{-1}$$
$$m^s_k = m_k + G_k\, (m^s_{k+1} - m^-_{k+1})$$
$$P^s_k = P_k + G_k\, (P^s_{k+1} - P^-_{k+1})\, G_k^T.$$
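Collecting steps 1 through 4, one backward step of the cubature RTS smoother might look as follows (a sketch, assuming f, Q_k and the filtering and next-step smoothing results are given):

```python
import numpy as np

def crts_step(m_k, P_k, ms_next, Ps_next, f, Q):
    """One backward step of the cubature RTS smoother."""
    n = m_k.size
    L = np.linalg.cholesky(P_k)
    xi = np.sqrt(n) * np.hstack([np.eye(n), -np.eye(n)])
    X = m_k[:, None] + L @ xi                                  # sigma points X^(i)_k
    Xh = np.column_stack([f(X[:, i]) for i in range(2 * n)])   # propagated points

    m_pred = Xh.mean(axis=1)                                   # m^-_{k+1}
    dXh = Xh - m_pred[:, None]
    dX = X - m_k[:, None]
    P_pred = dXh @ dXh.T / (2 * n) + Q                         # P^-_{k+1}
    D = dX @ dXh.T / (2 * n)                                   # cross-covariance D_{k+1}

    G = D @ np.linalg.inv(P_pred)                              # gain G_k
    m_s = m_k + G @ (ms_next - m_pred)
    P_s = P_k + G @ (Ps_next - P_pred) @ G.T
    return m_s, P_s
```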

32 Unscented Rauch-Tung-Striebel Smoother [1/3]
1 Form the sigma points:
$$X^{(0)}_k = m_k,$$
$$X^{(i)}_k = m_k + \sqrt{n + \lambda}\, \left[\sqrt{P_k}\right]_i$$
$$X^{(i+n)}_k = m_k - \sqrt{n + \lambda}\, \left[\sqrt{P_k}\right]_i, \qquad i = 1, \dots, n.$$
2 Propagate the sigma points through the dynamic model:
$$\hat{X}^{(i)}_{k+1} = f(X^{(i)}_k), \qquad i = 0, \dots, 2n.$$

33 Unscented Rauch-Tung-Striebel Smoother [2/3]
3 Compute the predicted mean, covariance and cross-covariance:
$$m^-_{k+1} = \sum_{i=0}^{2n} W^{(m)}_i\, \hat{X}^{(i)}_{k+1}$$
$$P^-_{k+1} = \sum_{i=0}^{2n} W^{(c)}_i\, (\hat{X}^{(i)}_{k+1} - m^-_{k+1})(\hat{X}^{(i)}_{k+1} - m^-_{k+1})^T + Q_k$$
$$D_{k+1} = \sum_{i=0}^{2n} W^{(c)}_i\, (X^{(i)}_k - m_k)(\hat{X}^{(i)}_{k+1} - m^-_{k+1})^T.$$

34 Unscented Rauch-Tung-Striebel Smoother [3/3]
4 Compute the gain, smoothed mean and smoothed covariance as follows:
$$G_k = D_{k+1}\, [P^-_{k+1}]^{-1}$$
$$m^s_k = m_k + G_k\, (m^s_{k+1} - m^-_{k+1})$$
$$P^s_k = P_k + G_k\, (P^s_{k+1} - P^-_{k+1})\, G_k^T.$$
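The weights $W^{(m)}_i$, $W^{(c)}_i$ and the parameter $\lambda$ are those of the scaled unscented transform used by the UKF; they are not restated in this lecture, so the sketch below assumes the standard definitions $\lambda = \alpha^2(n + \kappa) - n$, $W^{(m)}_0 = \lambda/(n+\lambda)$, $W^{(c)}_0 = W^{(m)}_0 + (1 - \alpha^2 + \beta)$ and $W^{(m)}_i = W^{(c)}_i = 1/(2(n+\lambda))$:

```python
import numpy as np

def ut_sigma_points(m, P, alpha=1.0, beta=2.0, kappa=0.0):
    """Sigma points and weights of the scaled unscented transform.

    Weight definitions follow the standard UKF literature (given in the
    earlier filtering lectures, not restated here); parameter defaults
    are illustrative choices.
    """
    n = m.size
    lam = alpha ** 2 * (n + kappa) - n
    L = np.linalg.cholesky(P)                 # [sqrt(P)]_i are the columns of L
    X = np.column_stack(
        [m]
        + [m + np.sqrt(n + lam) * L[:, i] for i in range(n)]
        + [m - np.sqrt(n + lam) * L[:, i] for i in range(n)]
    )
    Wm = np.full(2 * n + 1, 1.0 / (2 * (n + lam)))
    Wc = Wm.copy()
    Wm[0] = lam / (n + lam)
    Wc[0] = lam / (n + lam) + (1 - alpha ** 2 + beta)
    return X, Wm, Wc
```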

35 Other Gaussian RTS Smoothers
The Gauss-Hermite RTS smoother is based on multidimensional Gauss-Hermite integration.
The Bayes-Hermite or Gaussian process RTS smoother uses Gaussian process based quadrature (Bayes-Hermite quadrature).
Monte Carlo integration based RTS smoothers.
Central differences etc.

36 Particle Smoothing [1/2]
The smoothing solution can be obtained from SIR by storing the whole state histories in the particles.
Special care is needed in the resampling step.
The smoothed distribution approximation is then of the form
$$
p(x_k \mid y_{1:T}) \approx \sum_{i=1}^{N} w^{(i)}_T\, \delta(x_k - x^{(i)}_k),
$$
where $x^{(i)}_k$ is the $k$th component in $x^{(i)}_{1:T}$.
Unfortunately, the approximation is often quite degenerate.
Specialized algorithms for particle smoothing exist.
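A minimal bootstrap-SIR sketch of this idea; the model-specific functions sample_prior, sample_dyn and loglik are hypothetical placeholders to be supplied by the user. The resampling of the whole stored histories is the "special care" mentioned above, and the result still suffers from the degeneracy noted:

```python
import numpy as np

def sir_smoother(ys, sample_prior, sample_dyn, loglik, N=1000, seed=0):
    """Bootstrap SIR filter that stores whole state histories.

    sample_prior(rng, N) -> (N, n) initial particles
    sample_dyn(rng, X)   -> (N, n) particles propagated through p(x_k | x_{k-1})
    loglik(y, X)         -> (N,)   log p(y | x) for each particle
    Returns histories (T, N, n) and final weights (N,); the smoothed
    distribution at step k is the weighted particle set (histories[k], w).
    """
    rng = np.random.default_rng(seed)
    T = len(ys)
    X = sample_prior(rng, N)
    hist = np.zeros((T,) + X.shape)
    w = np.full(N, 1.0 / N)
    for k, y in enumerate(ys):
        X = sample_dyn(rng, X)
        logw = np.log(w) + loglik(y, X)
        w = np.exp(logw - logw.max())
        w /= w.sum()
        hist[k] = X
        # Resample indices and, crucially, the *whole histories*, so that
        # surviving trajectories remain internally consistent.
        idx = rng.choice(N, size=N, p=w)
        X, hist[:k + 1] = X[idx], hist[:k + 1, idx]
        w = np.full(N, 1.0 / N)
    return hist, w
```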

37 Particle Smoothing [2/2]
Recall the Rao-Blackwellized particle filtering model:
$$s_k \sim p(s_k \mid s_{k-1})$$
$$x_k = A(s_{k-1})\, x_{k-1} + q_k, \qquad q_k \sim N(0, Q)$$
$$y_k = H(s_k)\, x_k + r_k, \qquad r_k \sim N(0, R).$$
The principle of Rao-Blackwellized particle smoothing is the following:
1 During filtering, store the whole sampled state and Kalman filter histories in the particles.
2 At the smoothing step, apply the Rauch-Tung-Striebel smoother to each of the Kalman filter histories in the particles.
The smoothing distribution approximation will then be of the form
$$
p(x_k, s_k \mid y_{1:T}) \approx \sum_{i=1}^{N} w^{(i)}_T\, \delta(s_k - s^{(i)}_k)\, N(x_k \mid m^{s,(i)}_k, P^{s,(i)}_k).
$$

38 Summary
Optimal smoothing is used for computing estimates of state trajectories given the measurements on the whole trajectory.
The Rauch-Tung-Striebel (RTS) smoother is the closed-form smoother for linear Gaussian models.
The extended, statistically linearized and unscented RTS smoothers are the approximate non-linear smoothers corresponding to the EKF, SLF and UKF.
Gaussian RTS smoothers: the cubature RTS smoother, Gauss-Hermite RTS smoothers and various others.
Particle smoothing can be done by storing the whole state histories in the SIR algorithm.
The Rao-Blackwellized particle smoother is a combination of particle smoothing and RTS smoothing.

39 Matlab Demo: Pendulum [1/2]
Pendulum model:
$$
\begin{pmatrix} x^1_k \\ x^2_k \end{pmatrix}
= \underbrace{\begin{pmatrix} x^1_{k-1} + x^2_{k-1}\, \Delta t \\ x^2_{k-1} - g\, \sin(x^1_{k-1})\, \Delta t \end{pmatrix}}_{f(x_{k-1})}
+ \begin{pmatrix} 0 \\ q_{k-1} \end{pmatrix}
$$
$$
y_k = \underbrace{\sin(x^1_k)}_{h(x_k)} + r_k.
$$
The required Jacobian matrix for the ERTSS:
$$
F_x(x) = \begin{pmatrix} 1 & \Delta t \\ -g\, \cos(x^1)\, \Delta t & 1 \end{pmatrix}.
$$
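The original demo is in Matlab; an equivalent sketch of the model functions in Python (the g and Δt values are illustrative), which plug straight into the erts_smoother sketch given earlier:

```python
import numpy as np

g, dt = 9.81, 0.01  # illustrative parameter values

def f(x):
    """Pendulum dynamic model f(x_{k-1}); x = (angle, angular velocity)."""
    return np.array([x[0] + x[1] * dt,
                     x[1] - g * np.sin(x[0]) * dt])

def h(x):
    """Measurement model h(x_k)."""
    return np.sin(x[0])

def F_jac(x):
    """Jacobian F_x(x) required by the ERTSS."""
    return np.array([[1.0, dt],
                     [-g * np.cos(x[0]) * dt, 1.0]])
```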

40 Matlab Demo: Pendulum [2/2]
The required expected value for the SLRTSS is
$$
E[f(x)] = \begin{pmatrix} m_1 + m_2\, \Delta t \\ m_2 - g\, \sin(m_1)\, \exp(-P_{11}/2)\, \Delta t \end{pmatrix}.
$$
And the cross term:
$$
E[f(x)\,(x - m)^T] = \begin{pmatrix} c_{11} & c_{12} \\ c_{21} & c_{22} \end{pmatrix},
$$
where
$$c_{11} = P_{11} + \Delta t\, P_{12}$$
$$c_{12} = P_{12} + \Delta t\, P_{22}$$
$$c_{21} = P_{12} - g\, \Delta t\, \cos(m_1)\, P_{11}\, \exp(-P_{11}/2)$$
$$c_{22} = P_{22} - g\, \Delta t\, \cos(m_1)\, P_{12}\, \exp(-P_{11}/2).$$
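The same expectations written as code, matching the formulas above (a sketch; these moments replace the sigma-point approximations in the statistically linearized smoother):

```python
import numpy as np

g, dt = 9.81, 0.01  # illustrative parameter values

def slrtss_moments(m, P):
    """E[f(x)] and E[f(x)(x - m)^T] for the pendulum, with x ~ N(m, P)."""
    damp = np.exp(-P[0, 0] / 2)              # exp(-P_11 / 2)
    Ef = np.array([m[0] + m[1] * dt,
                   m[1] - g * np.sin(m[0]) * damp * dt])
    c11 = P[0, 0] + dt * P[0, 1]
    c12 = P[0, 1] + dt * P[1, 1]
    c21 = P[0, 1] - g * dt * np.cos(m[0]) * P[0, 0] * damp
    c22 = P[1, 1] - g * dt * np.cos(m[0]) * P[0, 1] * damp
    return Ef, np.array([[c11, c12], [c21, c22]])
```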
