Survey on Recursive Bayesian State Estimation


HELSINKI UNIVERSITY OF TECHNOLOGY
Faculty of Information and Natural Sciences
Department of Mathematics and Systems Analysis
Mat Independent research projects in applied mathematics

Survey on Recursive Bayesian State Estimation

October 22, 2008
Eero Nevalainen, 58348W

Abstract

This literature survey considers state estimation of noisy dynamic systems from imperfect measurements. The classical tool of estimation is the Kalman filter, which is a recursive, optimal estimation algorithm for linear Gaussian systems. The Kalman filter is briefly presented along with key properties of the underlying theory. The emphasis of the survey is on new algorithms suited for state estimation in nonlinear systems. The algorithms proposed in the literature are found to be mainly variations or combinations of three primary approaches, namely Gaussian approximations, exact finite solutions and discrete approximations. These approaches are described along with typical characteristics of applicability, required computational cost and accuracy. The survey is concluded with a mention of some combined approaches and other research topics related to state estimation.

Contents

1 Introduction
2 The filtering problem
  2.1 Noise
  2.2 Continuous dynamics
  2.3 Discrete dynamics
  2.4 Discretized continuous dynamics
  2.5 Example
3 Kalman filter
  3.1 System properties
  3.2 Taking measurements into account
  3.3 The discrete-time Kalman filter
4 Nonlinear filtering
  4.1 Gaussian approximations
    4.1.1 Sigma point filters
  4.2 Exact finite solutions
    4.2.1 Daum
  4.3 Discrete approximation
    4.3.1 Particle filters
  4.4 Summary of nonlinear filtering
5 Other developments
  5.1 Combined approaches to nonlinear filtering
  5.2 Variational Bayesian methods
6 Conclusion
References

1 Introduction

The Kalman filter [9, 10, 14] is a wonderful thing. It first saw use in the Apollo navigation computer (actually an extended Kalman filter), and has been used by the navigation industry ever since. It is used in chemical plant control, weather forecasting, communications systems, multisensor devices and for tracking birds and planes. It is a tool for estimating the state of a dynamic system, and therefore its use is limited only by imagination.

As a tool of applied probability theory, the Kalman filter can be considered an extension of least squares estimation. It has become important for real life applications not only due to its wide applicability, but because it is well suited for implementation on a digital computer. Under the model assumptions, it is an optimal estimator, and the algorithm is recursive, which makes it suitable for use in real-time systems.

Although the Kalman filter was conceived in the late 1950s, recent years have seen a large increase in publications related to Kalman filtering. The increase is due both to widespread adoption in applications and to ongoing research extending the Kalman filter to systems that do not satisfy the model assumptions of the original Kalman filter. In this survey, we first present the underlying theory of the classical Kalman filter and then review common approaches to nonlinear filtering.

2 The filtering problem

The Kalman filter solves the problem of estimating the state of a dynamic system based on measurements, when both the dynamics and measurements are inaccurate or subject to noise. In order to proceed we must define the mathematical models of the system. Kalman filtering theory actually accommodates several variations of the problem with respect to how time is handled. These are:

- a continuous time system and continuous measurements
- a continuous time system and discrete measurements
- a fully discrete system (with known transition times)

Perhaps the most common type of filtering problem solved with the Kalman filter is the continuous-discrete filtering problem, in which the system evolves according to linear continuous time dynamics and measurements of the system are obtained at discrete times. This type of system is also known as a sampled-data system. We present a version of the problem where the dynamics are time-invariant and do not include deterministic

control inputs. This allows us to transform the continuous system dynamic to a discrete one in closed form and without making assumptions about the control inputs [10, 13].

2.1 Noise

In order to present the estimator for a noisy system we must first define what noise is. In the context of estimation, noise is a random (stochastic) input or output value of the system, whose exact value cannot be known beforehand. While there are many types of noise, we only need to define the most frequently used type, which is additive white Gaussian noise. We begin with white noise [10], defined below.

Definition 2.1 (White noise). A random process $w(t)$ whose autocovariance is zero for any two different times is called white noise. In this case
$$V(t_1, t_2) = \mathrm{E}\big[(w(t_1) - \bar{w}(t_1))(w(t_2) - \bar{w}(t_2))\big] = \sigma^2(t_1)\,\delta(t_1 - t_2)$$
where $\bar{w} = \mathrm{E}[w]$ and $\delta(\cdot)$ is the Dirac delta function. A stationary white noise process has the autocovariance
$$V(t_1, t_2) = \sigma^2\,\delta(t_1 - t_2).$$

The term white noise originates from the fact that the stationary process has a flat power spectral density [10] of $\sigma^2$, analogous to white light which contains all frequencies. The $\sigma^2$ term can in some sense be interpreted as the variance of the process, and has been called the infinitesimal variance or instantaneous variance in the literature.

To get to Gaussian white noise, we need to specify what Gaussian noise is. Although it can be done independently [10], we choose to define it through Brownian motion [13], presented below.

Definition 2.2 (Standard Brownian motion). A process $\beta(t)$ is called a standard Brownian motion (also known as a Wiener process) if it has the following properties:
1. $\beta(0) = 0$.
2. The increments $\beta(t_1),\ \beta(t_2) - \beta(t_1),\ \ldots,\ \beta(t_k) - \beta(t_{k-1})$ are independent for all $0 \le t_1 < t_2 < \ldots < t_{k-1} < t_k < \infty$.
3. $\beta(t) - \beta(s) \sim N(0,\, t - s)$ for every $0 \le s < t < \infty$.
4. $\beta(t)$ is continuous.
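As a concrete illustration of Definition 2.2, the following short Python sketch (our own illustration, not part of the survey; variable names are arbitrary) simulates Brownian motion paths by accumulating independent Gaussian increments and checks Property 3 empirically.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate standard Brownian motion on [0, T] by accumulating independent
# Gaussian increments (Properties 2 and 3): beta(t+dt) - beta(t) ~ N(0, dt).
T, n_steps, n_paths = 1.0, 1000, 50_000
dt = T / n_steps
increments = rng.normal(0.0, np.sqrt(dt), size=(n_paths, n_steps))
beta = np.concatenate(
    [np.zeros((n_paths, 1)), np.cumsum(increments, axis=1)], axis=1
)  # beta(0) = 0 (Property 1)

# Check Property 3 empirically: beta(t) - beta(s) ~ N(0, t - s).
s_idx, t_idx = 300, 800
diff = beta[:, t_idx] - beta[:, s_idx]
print("empirical mean:", diff.mean())            # ~ 0
print("empirical var :", diff.var())             # ~ t - s = 0.5
print("theoretical var:", (t_idx - s_idx) * dt)
```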

Zero mean white Gaussian noise can now be defined as the formal time derivative of Brownian motion, $w(t) = d\beta/dt$, with the Gaussian property coming from Property 3. It should be noted that this definition is theoretically somewhat problematic, since Brownian motion is nowhere differentiable. These problems are usually ignored in engineering and physics applications, since zero mean white Gaussian noise is convenient for modeling and the slight lack of mathematical rigor rarely causes problems. To complete our definition of noise we extend it to multiple dimensions:

Definition 2.3 (Stationary n-dimensional zero mean white Gaussian noise). An n-dimensional random process $w(t) = (w_1(t), \ldots, w_n(t))^T$, where the $w_i(t)$ are independent stationary Gaussian white noise processes with spectral densities $q_i$, is called stationary n-dimensional zero mean white Gaussian noise with power spectral density $Q_c = \mathrm{diag}(q_1, \ldots, q_n)$.

The above definition is sufficiently general for our purposes. The matrix $Q_c$ is also known as the diffusion matrix of the associated n-dimensional Brownian motion. We restate the definition of $Q_c$ in the multidimensional case:
$$V(t, s) = \mathrm{E}\big[w(t)\, w(s)^T\big] = Q_c\, \delta(t - s) \qquad (1)$$

2.2 Continuous dynamics

In continuous and continuous-discrete Kalman filtering the system evolves according to a continuous-time linear stochastic dynamic, which can be written as
$$\dot{x} = Fx + Lw \qquad (2)$$
where
- $x(t) \in \mathbb{R}^n$ is the state
- $F \in \mathbb{R}^{n \times n}$ is the system or feedback matrix
- $w \in \mathbb{R}^s$ is zero-mean white noise, with power spectral density $Q_c = \mathrm{diag}(q_1, \ldots, q_s)$
- $L \in \mathbb{R}^{n \times s}$ is the noise gain.

Equation (2) often also includes a control input term $Gu$, but we have chosen to omit it for simplicity. In its presented form, the equation is clearly divided into a deterministic part $Fx$ and additive noise. The solution to (2) from an initial state $x_0$ at $t = 0$ is
$$x(t) = e^{Ft} x_0 + \int_0^t e^{F(t-\tau)} L\, w(\tau)\, d\tau \qquad (3)$$
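To make the dynamic (2) concrete, here is a minimal Euler-Maruyama simulation sketch in Python (our own illustration under assumed parameters, not code from the survey): over a small step $dt$ the white noise contributes a Brownian increment with covariance $L Q_c L^T\, dt$.

```python
import numpy as np

rng = np.random.default_rng(1)

# A position-velocity system driven by white noise acceleration,
# as an instance of dx/dt = F x + L w (equation (2)).
F = np.array([[0.0, 1.0],
              [0.0, 0.0]])
L = np.array([[0.0],
              [1.0]])
q = 0.5                        # spectral density of w(t)
dt, n_steps = 1e-3, 5000

x = np.array([0.0, 1.0])       # initial state x(0)
for _ in range(n_steps):
    # Euler-Maruyama step: drift F x dt plus a Brownian increment
    # L dbeta, where dbeta ~ N(0, q dt).
    dbeta = rng.normal(0.0, np.sqrt(q * dt), size=1)
    x = x + F @ x * dt + L @ dbeta

print("state after", n_steps * dt, "time units:", x)
```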

2.3 Discrete dynamics

A discrete time linear stochastic dynamic model can be written as
$$x_k = A_{k-1} x_{k-1} + q_{k-1} \qquad (4)$$
where
- $x \in \mathbb{R}^n$ is the state
- $A$ is the state transition matrix
- $q_{k-1} \sim N(0, Q_{k-1})$ is the process noise
- the subscript indexes $(\cdot)_k$ represent the value of $(\cdot)$ at the time $t_k$.

The state transition matrix $A$ and the diffusion matrix $Q$ are generally allowed to vary in time. The solution from an initial state $x_0$ can be written for the general case [16], but it is somewhat uninformative. If the system is time-invariant, $A_k = A$ and $Q_k = Q$ for all $k$, the solution is familiar:
$$x_k = A^k x_0 + \sum_{i=0}^{k-1} A^{k-i-1} q_i$$

2.4 Discretized continuous dynamics

The standard Kalman filter can handle both continuous and discrete dynamics by converting the continuous dynamic to its discrete equivalent. For linear time invariant (LTI) dynamics (2) the conversion is exact and depends only on the time difference $\Delta t_k$. The conversion is performed by equating the continuous (3) and discrete (4) dynamics from an initial state $x_0 = x(t_0)$ to $x_1 = x(t)$ with each other ($t > t_0 = 0$). Equation (3) is repeated here for readability; the bracket notation [3] distinguishes it as a reference to an earlier equation.
$$x(t) = e^{Ft} x_0 + \int_0^t e^{F(t-\tau)} L\, w(\tau)\, d\tau \qquad [3]$$
$$x_1 = A_0 x_0 + q_0 \qquad [4]$$
We start by noting that the expected value of the noise terms is zero. This makes the conversion of the deterministic part rather easy:
$$\mathrm{E}[x(t)] = e^{Ft} x_0 = A_0 x_0 = \mathrm{E}[x_1],$$
so
$$A_0 = e^{Ft}, \quad \text{or} \quad A = \exp(F\,\Delta t). \qquad (5)$$

The noise term takes a bit more effort. Remembering that
$$\mathrm{E}\big[w(\tau)\, w(s)^T\big] = Q_c\, \delta(\tau - s) \qquad [1]$$
and that
$$q_{k-1} = \int_0^t \exp(F(t-\tau))\, L\, w(\tau)\, d\tau, \qquad q_{k-1}^T = \int_0^t w(s)^T L^T \exp(F(t-s))^T\, ds,$$
we can derive
$$Q_{k-1} = \mathrm{E}\big[q_{k-1} q_{k-1}^T\big] = \int_0^t \int_0^t \exp(F(t-\tau))\, L\, \underbrace{\mathrm{E}\big[w(\tau) w(s)^T\big]}_{=\,Q_c\, \delta(\tau-s)}\, L^T \exp(F(t-s))^T\, ds\, d\tau.$$
Using the defining property of the delta function, $\int f(s)\, \delta(\tau - s)\, ds = f(\tau)$, we obtain
$$Q_{k-1} = \int_0^t \exp(F(t-\tau))\, L\, Q_c\, L^T \exp(F(t-\tau))^T\, d\tau \qquad (6)$$
The derivation can easily be extended to all timesteps by making the replacements $x_0 \to x_{k-1}$, $x_1 \to x_k$, $t \to \Delta t = t_k - t_{k-1}$, etc. In some special cases a solution to (6) can be obtained in closed form. Fortunately the matrix $Q_{k-1}$ can be efficiently computed by matrix fraction decomposition [13] even when a closed form solution is unavailable. In the LTI case both matrices can be solved as functions of the time increment $\Delta t_k = t_{k+1} - t_k$, that is, $A_k(\Delta t_k)$ and $Q_k(\Delta t_k)$.
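Equations (5) and (6) can also be evaluated numerically in one shot with Van Loan's matrix-exponential method, which is closely related to the matrix fraction decomposition mentioned above. A minimal Python sketch (our own illustration, not code from the survey):

```python
import numpy as np
from scipy.linalg import expm

def discretize_lti(F, L, Qc, dt):
    """Compute A = exp(F dt) (equation (5)) and the process noise
    covariance Q of equation (6) via Van Loan's block-matrix method."""
    n = F.shape[0]
    # exp of [[F, L Qc L^T], [0, -F^T]] dt packs A and the integral (6)
    # into a single matrix exponential.
    M = np.block([[F, L @ Qc @ L.T],
                  [np.zeros((n, n)), -F.T]]) * dt
    Phi = expm(M)
    A = Phi[:n, :n]
    Q = Phi[:n, n:] @ A.T
    return A, Q

# Sanity check on a scalar Ornstein-Uhlenbeck process dx = -x dt + dbeta,
# whose closed form is Q = (1 - exp(-2 dt)) / 2.
A, Q = discretize_lti(np.array([[-1.0]]), np.array([[1.0]]),
                      np.array([[1.0]]), dt=0.5)
print(Q[0, 0], (1 - np.exp(-2 * 0.5)) / 2)   # both ~ 0.3161
```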

2.5 Example

Having shown that continuous time models can be accurately discretized, we present an example, known as the Wiener velocity model [16] for the simple reason that the velocity in the model is a Wiener process, or Brownian motion. In terms of white noise, the acceleration is a white noise process with spectral density $q$:
$$\frac{d^2 x(t)}{dt^2} = w(t). \qquad (7)$$
Written in state space form this becomes
$$\begin{bmatrix} \dot{x}_1 \\ \dot{x}_2 \end{bmatrix} = \underbrace{\begin{bmatrix} 0 & 1 \\ 0 & 0 \end{bmatrix}}_{F} \begin{bmatrix} x_1 \\ x_2 \end{bmatrix} + \underbrace{\begin{bmatrix} 0 \\ 1 \end{bmatrix}}_{L} w(t) \qquad (8)$$
where $x_1(t) \equiv x(t)$ is the actual process and $x_2(t)$ is its derivative. The corresponding discretized model with a time step $\Delta t$ will have the following system and noise matrices. $F \Delta t$ is nilpotent with $(F \Delta t)^2 = 0$, so
$$A(\Delta t) = \exp(F \Delta t) = I + F \Delta t = \begin{bmatrix} 1 & \Delta t \\ 0 & 1 \end{bmatrix} \qquad (9)$$
and
$$\begin{aligned}
Q(\Delta t) &= \int_0^{\Delta t} \exp(F(\Delta t - \tau))\, L\, Q_c\, L^T \exp(F(\Delta t - \tau))^T\, d\tau \\
&= q \int_0^{\Delta t} \begin{bmatrix} (\Delta t - \tau)^2 & \Delta t - \tau \\ \Delta t - \tau & 1 \end{bmatrix} d\tau \\
&= q \begin{bmatrix} \tfrac{1}{3}(\Delta t)^3 & \tfrac{1}{2}(\Delta t)^2 \\ \tfrac{1}{2}(\Delta t)^2 & \Delta t \end{bmatrix}
\end{aligned} \qquad (10)$$
Note that the covariance matrix obtained for a discretized continuous process is not equal to a discrete approximation of the process. Assuming a fixed amount of noise for each timestep $\Delta t$ would yield the covariance matrix
$$Q_{\mathrm{fixed}} = q \begin{bmatrix} \tfrac{1}{4}(\Delta t)^4 & \tfrac{1}{2}(\Delta t)^3 \\ \tfrac{1}{2}(\Delta t)^3 & (\Delta t)^2 \end{bmatrix}$$
Comparing $Q_{\mathrm{fixed}}$ with $Q(\Delta t)$ we note that white noise increases the uncertainty of a state parameter at a rate $\sqrt{t}$ rather than $t$. White noise and Brownian motion are thus different from nice mathematical functions.
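The difference is easy to see numerically. The sketch below (our own illustration) accumulates the velocity variance over a unit time interval, split into steps of length $\Delta t$: with the white-noise covariance (10) the total uncertainty is independent of the step size, while the fixed-noise-per-step covariance shrinks as the grid is refined.

```python
import numpy as np

def Q_white(q, dt):
    """Discretized white-noise covariance of equation (10)."""
    return q * np.array([[dt**3 / 3, dt**2 / 2],
                         [dt**2 / 2, dt]])

def Q_fixed(q, dt):
    """Covariance assuming a fixed amount of noise per step of length dt."""
    return q * np.array([[dt**4 / 4, dt**3 / 2],
                         [dt**3 / 2, dt**2]])

# Velocity standard deviation accumulated over a unit time interval,
# simulated as 1/dt independent steps of length dt.
for dt in [1.0, 0.1, 0.01]:
    n = int(round(1.0 / dt))
    sd_white = np.sqrt(n * Q_white(1.0, dt)[1, 1])  # constant: sqrt(q t)
    sd_fixed = np.sqrt(n * Q_fixed(1.0, dt)[1, 1])  # shrinks like sqrt(dt)
    print(f"dt={dt:5.2f}  white-noise sd={sd_white:.3f}  fixed sd={sd_fixed:.3f}")
```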

3 Kalman filter

We now present the discrete time Kalman filter in a bottom-up fashion. The Kalman filter is the optimal estimator when the initial distribution for the state of the system is a (multivariate) Gaussian distribution, the dynamic model and measurement model are both linear functions, and the system and measurement noise are both additive white Gaussian noise.

[Figure 1: Hidden Markov model implied by the system and measurement equations (a chain of states $x_{k-1} \to x_k$, each emitting a measurement $y_{k-1}$, $y_k$)]

Under these assumptions the Kalman filter provides the best possible estimate for the state of the dynamic system. Recall that the state evolves according to the linear stochastic equation
$$x_k = A_{k-1} x_{k-1} + q_{k-1} \qquad [4]$$
where $q_{k-1} \sim N(0, Q_{k-1})$. The measurement equation is also a linear stochastic equation:
$$y_k = H_k x_k + r_k \qquad (11)$$
where
- $y_k \in \mathbb{R}^m$ is the measurement
- $H_k \in \mathbb{R}^{m \times n}$ is the measurement model matrix
- $r_k \sim N(0, R_k)$ is the measurement noise.

3.1 System properties

The first thing to notice about equations (4) and (11) is that they imply the dependence structure illustrated in Figure 1, called the Markov property.

Property 3.1 (Markov property of the states). The Markov property of the states means two things:
1. Given $x_{k-1}$, the state $x_k$ is independent of everything else in the past:
$$p(x_k \mid x_{1:k-1}, y_{1:k-1}) = p(x_k \mid x_{k-1}) \qquad (12)$$
2. Given $x_k$, the state $x_{k-1}$ is independent of everything else in the future:
$$p(x_{k-1} \mid x_{k:T}, y_{k:T}) = p(x_{k-1} \mid x_k) \qquad (13)$$

A similar property holds for the measurements:

Property 3.2 (Conditional independence of measurements). The measurement $y_k$ is conditionally independent of everything else given $x_k$:
$$p(y_k \mid x_{1:T}, y_{1:T}) = p(y_k \mid x_k)$$

These independence properties give rise to the efficient recursive structure of the Kalman filter. From the history up to time $k$, only $x_{k-1}$ and $y_k$ are needed for determining the distribution of $x_k$:
$$p(x_k \mid x_{0:k-1}, y_{1:k}) = p(x_k \mid x_{k-1}, y_k)$$
Of course, $x_{k-1}$ is not known in practice. We can only estimate its distribution $p(x_{k-1} \mid y_{1:k-1})$. The Markov property and the Chapman-Kolmogorov equation [13] now state:
$$p(x_k \mid y_{1:k-1}) = \int p(x_k \mid x_{k-1})\, p(x_{k-1} \mid y_{1:k-1})\, dx_{k-1}. \qquad (14)$$
This means that we can calculate a best estimate for $x_k$ based on our best estimate of $x_{k-1}$ and the transition probability $p(x_k \mid x_{k-1})$. When the system dynamic is the linear stochastic equation (4), the transition probability in (14) is
$$p(x_k \mid x_{k-1}) = N(A_{k-1} x_{k-1},\, Q_{k-1}).$$
Equation (14) now has a very important property for Gaussian $x_{k-1}$:

Property 3.3 (Gaussian dynamics). Given the linear stochastic state dynamic (4) and a Gaussian marginal distribution of $x_{k-1}$, $p(x_{k-1}) = N(m_{k-1}, P_{k-1})$, the marginal distribution of $x_k$ is also Gaussian, $p(x_k) = N(m_k, P_k)$, with
$$m_k = A_{k-1} m_{k-1}$$
$$P_k = A_{k-1} P_{k-1} A_{k-1}^T + Q_{k-1}$$

This property is extremely useful because, as we shall see, the distribution of $x_{k-1}$ is Gaussian under the assumptions made in the beginning of this section.
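Property 3.3 is easy to verify numerically. The following sketch (our own illustration, with arbitrary parameters) propagates samples of a Gaussian $x_{k-1}$ through the linear dynamic (4) and compares the empirical mean and covariance with the closed form.

```python
import numpy as np

rng = np.random.default_rng(2)

# A Gaussian marginal for x_{k-1} and a linear dynamic x_k = A x_{k-1} + q.
m_prev = np.array([1.0, -0.5])
P_prev = np.array([[0.30, 0.05],
                   [0.05, 0.20]])
A = np.array([[1.0, 0.1],
              [0.0, 1.0]])
Q = 0.01 * np.eye(2)

# Monte Carlo: sample, propagate, add process noise.
n = 200_000
x_prev = rng.multivariate_normal(m_prev, P_prev, size=n)
x_next = x_prev @ A.T + rng.multivariate_normal(np.zeros(2), Q, size=n)

# Closed form of Property 3.3.
m_pred = A @ m_prev
P_pred = A @ P_prev @ A.T + Q

print(np.allclose(x_next.mean(axis=0), m_pred, atol=1e-2))   # True
print(np.allclose(np.cov(x_next.T), P_pred, atol=1e-2))      # True
```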

3.2 Taking measurements into account

The remaining task is to combine the prediction of our dynamic model with the obtained measurement. The mechanism for combining the two pieces of information can be found in Bayes' rule (or Bayes' theorem), which states that
$$p(A \mid B) = \frac{p(B \mid A)\, p(A)}{p(B)}$$
for events $A$ and $B$. Bayes' rule applies equally to probability densities, but the denominator is best interpreted as a normalizing constant:
$$p(x \mid y) = \frac{p(y \mid x)\, p(x)}{p(y)} = \frac{p(y \mid x)\, p(x)}{\int_x p(y \mid x)\, p(x)\, dx} \qquad (15)$$
where
- $p(x)$ is the prior distribution of $x$ before the measurement is taken into account
- $p(y \mid x)$ is the likelihood model of $y$, that is, the probability distribution of $y$ given a certain $x$
- $p(x \mid y)$ is the posterior distribution of $x$, which takes into account both the prior information and the measurement.

Mapping Bayes' rule to our discrete time filtering problem yields the following identities:
- The prior distribution of $x_k$ before considering the measurement $y_k$ is $p^-(x_k) \equiv p(x_k \mid y_{1:k-1})$.
- The likelihood distribution of $y_k$ given a certain $x_k$ is $p(y_k \mid x_k)$.
- The posterior distribution of $x_k$ is simply $p(x_k) \equiv p(x_k \mid y_{1:k})$.

With this notation, Bayes' theorem can be restated as
$$p(x_k) = \frac{p(y_k \mid x_k)\, p^-(x_k)}{\int_{x_k} p(y_k \mid x_k)\, p^-(x_k)\, dx_k}. \qquad (16)$$
Bayes' rule can be used as such for arbitrary distributions. However, under our assumptions on the system, equation (16) has the following property, which completes our construction of the Kalman filter.

Property 3.4 (Gaussian update). Given a Gaussian prior distribution $p^-(x_k) = N(m_k^-, P_k^-)$, a measurement $y_k$ and a Gaussian measurement likelihood $p(y_k \mid x_k) = N(H_k x_k, R_k)$, the posterior distribution of $x_k$ is also Gaussian, $p(x_k) = N(m_k, P_k)$, with
$$K_k = P_k^- H_k^T (H_k P_k^- H_k^T + R_k)^{-1}$$
$$m_k = m_k^- + K_k (y_k - H_k m_k^-)$$
$$P_k = (I - K_k H_k) P_k^-$$

Due to Properties 3.3 and 3.4 the estimated probability distribution of the state can be represented at all times with a fixed size vector $m$ and matrix $P$. Additionally, only the most recent estimate needs to be stored, and so the Kalman filter operates with a fixed amount of memory.

3.3 The discrete-time Kalman filter

We summarize the above results into a single algorithm.

0. Initialize $p(x_0)$ with mean $m_0$ and covariance $P_0$:
$$p(x_0) = N(m_0, P_0).$$
1. Predict $p^-(x_k)$:
$$m_k^- = A_{k-1} m_{k-1} \qquad (17)$$
$$P_k^- = A_{k-1} P_{k-1} A_{k-1}^T + Q_{k-1} \qquad (18)$$
2. Update $p(x_k)$:
$$K_k = P_k^- H_k^T (H_k P_k^- H_k^T + R_k)^{-1} \qquad (19)$$
$$m_k = m_k^- + K_k (y_k - H_k m_k^-) \qquad (20)$$
$$P_k = (I - K_k H_k) P_k^- \qquad (21)$$

Step 0 is performed once, and steps 1 and 2 are repeated as necessary for the times $k = 1, 2, \ldots$ If a measurement is not available at a certain time, the update can simply be skipped, allowing the filter to accommodate missing values. Alternatively, if multiple independent measurements are available at time $k$, the update step can be performed several times with different $y_k^i$, $H_k^i$ and $R_k^i$. Used in this way, the Kalman filter can perform information and sensor fusion (although the information is by no means required to arrive at the same timestep $k$).
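The algorithm translates almost line for line into code. The following Python sketch (our own illustration of equations (17)-(21), not code from the survey) runs one predict-update cycle per measurement for the Wiener velocity model with position measurements.

```python
import numpy as np

def kf_predict(m, P, A, Q):
    """Prediction step, equations (17)-(18)."""
    m_pred = A @ m
    P_pred = A @ P @ A.T + Q
    return m_pred, P_pred

def kf_update(m_pred, P_pred, y, H, R):
    """Update step, equations (19)-(21)."""
    S = H @ P_pred @ H.T + R                 # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)      # Kalman gain (19)
    m = m_pred + K @ (y - H @ m_pred)        # posterior mean (20)
    P = (np.eye(len(m_pred)) - K @ H) @ P_pred   # posterior covariance (21)
    return m, P

# Wiener velocity model with step dt = 1 and noisy position measurements.
dt, q, r = 1.0, 0.1, 0.5
A = np.array([[1.0, dt], [0.0, 1.0]])
Q = q * np.array([[dt**3 / 3, dt**2 / 2], [dt**2 / 2, dt]])
H = np.array([[1.0, 0.0]])
R = np.array([[r]])

m, P = np.zeros(2), np.eye(2)                # step 0: initialization
for y in [np.array([1.1]), np.array([2.0]), np.array([2.8])]:
    m, P = kf_predict(m, P, A, Q)
    m, P = kf_update(m, P, y, H, R)
print("final mean:", m)
```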

The Kalman filter provides estimates of $x_k$ from measurements up to and including time $k$. It is also possible to derive a similar recursive algorithm that estimates $x_k$ based on measurements obtained at times $k+1, \ldots, T$, which is known as a Kalman smoother. Classical Kalman filtering theory also includes the Kalman-Bucy filter and smoother, which are fully continuous time solutions to the estimation problem. As such, they would have to be implemented as analog circuits. Lastly, it should be noted that the equations of the Kalman filter appear in different forms in the literature. While theoretically equivalent, the forms have differences in computational effort and numerical stability (round-off errors) [11, 14, 16].

4 Nonlinear filtering

The Kalman filter is the exact, closed form solution to the problem of linear stochastic (Gaussian) estimation. Most real life problems do not, however, fully satisfy the linearity assumption, yet we would still like to use the Kalman filter or some similar algorithm. The first notion is to perform a local linearization of the system and use the filter equations as before. This is now known as extended Kalman filtering and was developed [10] soon after Kalman's initial publication. Extended Kalman filters (EKFs) are no longer optimal estimators, but they perform reasonably well when the problem is approximately linear. However, if the system has a high degree of nonlinearity, the EKF estimates quickly diverge from the actual state. Some problems that have a high degree of nonlinearity include tracking maneuvering targets and bearings-only tracking.

A fundamental problem in the general nonlinear case is that the exact posterior density function of the state would need to be stored in an infinite dimensional vector, which of course is not possible. Thus significant research has been devoted both to finding more general classes of problems with finite solutions, and to efficient and accurate approximation methods. The primary approaches to nonlinear filtering can be divided into three categories, as shown in Figure 2.

[Figure 2: Categories of nonlinear filtering techniques]

These categories will be examined in the following subsections.

4.1 Gaussian approximations

The filters in the Gaussian approximation family are extremely similar to the Kalman filter. They represent the posterior pdf with a Gaussian approximation and store the same sufficient statistic (mean and covariance). As already mentioned, the EKF was the first and simplest approach to nonlinear filtering. Given the nonlinear filtering model
$$x_k = f(x_{k-1}) + q_{k-1}$$
$$y_k = h(x_k) + r_k \qquad (22)$$


and comparing with
$$x_k = A_{k-1} x_{k-1} + q_{k-1} \qquad [4]$$
$$y_k = H_k x_k + r_k \qquad [11]$$
the EKF operates by assuming that the model is linear enough and replacing $A_{k-1}$ and $H_k$ in the covariance equations with the Jacobians of $f(\cdot)$ and $h(\cdot)$, evaluated at the mean estimates $m_{k-1}$ and $m_k^-$. In doing so, the EKF approximates the nonlinear functions $f(\cdot)$ and $h(\cdot)$ with their first order Taylor series. There are several variations on the standard EKF that aim to improve its performance. The variations include increasing the order of approximation to a second order Taylor series, as well as several engineering tricks such as coordinate transformations [5].

4.1.1 Sigma point filters

The aim of Gaussian filters is to numerically estimate the mean and covariance of the system. Sigma point filters use a clever deterministic sampling technique, which picks a minimal set of sampling points, the sigma points, to represent the initial distribution. The sampling points are then propagated through the nonlinear equations, and the posterior mean and covariance estimates are calculated as weighted sums of the sample points. The use of sampling means that there is no need to calculate Jacobians, as in the EKF. Individual filters in the sigma point family include the unscented Kalman filter (UKF) [8] and the central difference Kalman filter (CDKF) [6]. The accuracy of sigma point filters includes higher order terms of the Taylor series, and the UKF has been reported to be accurate up to the third order [15]. Despite the increased accuracy, the computational cost of these methods is comparable to the EKF, and the UKF can indeed be expressed in matrix form [13]. The use of the sigma point technique can also be considered a sort of adaptive grid method, which is why it is linked to grid based methods in Figure 2.

4.2 Exact finite solutions

Exact finite solutions to the filtering problem can only be obtained in special cases. A filter which is exact and finite has a fixed and finite dimension $M$, yet it exactly represents the posterior distribution of the state conditioned on all measurements. Quoting Daum [5]: "This is a miracle!" This miracle happens for special types of systems. The first class was discovered by Beneš [2] and published in 1981. According to [3], this class of systems admits a linear measurement model and the following dynamic model:
$$\frac{dx_t}{dt} = f(x_t) + w_t$$
where $w_t$ is Gaussian zero-mean white noise as before and the function $f(x)$ obeys
$$\mathrm{tr}[\nabla_x f] + f^T f = x^T A x + b^T x + c.$$
The sufficient statistic of the Beneš filter is also a pair $(m, P)$, but it is no longer necessarily the mean and covariance of the distribution.
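As a quick check of the Beneš condition, the classic scalar example is the drift $f(x) = \tanh(x)$: then $\mathrm{tr}[\nabla_x f] + f^T f = (1 - \tanh^2 x) + \tanh^2 x = 1$, a constant, so the condition holds with $A = 0$, $b = 0$, $c = 1$. The following sympy sketch (our own illustration) verifies this symbolically.

```python
import sympy as sp

x = sp.symbols('x', real=True)
f = sp.tanh(x)   # classic drift admitting an exact finite Benes filter

# Scalar form of the Benes condition: tr[grad f] + f^T f = f'(x) + f(x)^2.
lhs = sp.simplify(sp.diff(f, x) + f**2)
print(lhs)       # -> 1, i.e. a (degenerate) quadratic with A = 0, b = 0, c = 1
```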

4.2.1 Daum

Daum extended the class of exact finite filters with three progressively more general filters in 1986, which reportedly included the Kalman and Beneš filters as special cases, and finally with the "new nonlinear filter" [4], a generalization of all of the above. The new nonlinear filter generalized the conditional probability density of the filter to the exponential family of distributions, which includes among others the normal, gamma, beta and chi-square distributions. The use of the exponential family is based on two properties:
1. Likelihood functions in the exponential family have conjugate priors.
2. PDFs in the exponential family have fixed finite dimensional sufficient statistics.

A family of distributions $p(x)$ is a conjugate prior of a family of likelihood distributions $p(y \mid x)$ if the resulting posterior distributions $p(x \mid y)$ in (15) belong to the same family as the prior. As an example, the Gaussian pdf is its own conjugate prior, which is the reason for Property 3.4. Combined, these properties indicate that the filter can use the same fixed sufficient statistic to represent the distribution of the state, conditioned on all measurements.

The exponential family may be the largest possible family of distributions for which an exact finite dimensional filter can be constructed. This is due to the Darmois-Pitman-Koopman theorem, which states that Property 2 holds only for exponential family distributions. The theorem has been proven only for a parameter estimation problem, but all known filtering problems with exact fixed finite solutions belong to the exponential family. The new nonlinear filter remains the most general exact finite filter. Daum states [5] that the computational complexity of the exact nonlinear recursive filter is "roughly the same order as the EKF for most practical problems, and polynomial in d in general."
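Returning to Property 1 above, conjugacy is easy to demonstrate numerically. A standard example (our own illustration, not from the survey) is the beta family, which is conjugate to the binomial likelihood; both belong to the exponential family. A Beta(a, b) prior combined with k successes in n trials yields a Beta(a + k, b + n - k) posterior:

```python
import numpy as np
from scipy import stats

a, b = 2.0, 3.0          # Beta prior parameters
n, k = 10, 7             # n trials, k successes observed

theta = np.linspace(0.001, 0.999, 999)
prior = stats.beta.pdf(theta, a, b)
likelihood = stats.binom.pmf(k, n, theta)

# Numerical posterior via Bayes' rule (15), normalized on the grid...
posterior = prior * likelihood
posterior /= posterior.sum() * (theta[1] - theta[0])

# ...matches the conjugate closed form up to grid error.
closed_form = stats.beta.pdf(theta, a + k, b + n - k)
print(np.max(np.abs(posterior - closed_form)))   # ~ 0
```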

4.3 Discrete approximation

Discrete approximation filters operate on a very simple principle: discretization. They approximate a continuous probability density with point probabilities, e.g. pairs of weights and coordinates [3]:
$$p(x) \approx \sum_{i=1}^{N} w^i\, \delta(x - x^i)$$
Given a two-dimensional filtering problem, one might construct a simple grid based filter by dividing the relevant area of the state space into a 100 x 100 grid and applying the system equations to each grid point in turn, updating the weights. This simple grid based approach has a number of problems:
- The relevant area of the state space must be defined beforehand.
- The grid must be sufficiently dense to accurately represent the state space.
- A large part of the calculations are usually performed on grid points with very little weight.
- The number of required grid points grows exponentially as the dimension $d$ increases.

Despite these drawbacks, all discretization based methods share the advantage that (with large enough $N$) they can represent probability densities of any form, particularly multimodal densities.

4.3.1 Particle filters

Ideally we would like to discretize only the relevant part of the state space and ignore the rest. Particle filters (PFs) are a class of filtering algorithms that try to achieve this using Monte Carlo methods. They work by generating random samples from the distributions involved. If the samples can be generated efficiently, the sampling procedure will guarantee that the filter only expends calculations on relevant parts of the state space. A simple particle filter, known as Sequential Importance Sampling (SIS) (also known as the bootstrap filter, among other names), works as follows [3]:

0. Initialize $p(x_0)$ by drawing $N$ particles from the prior distribution.
1. Propagate the particles from step $k-1$ through the system dynamic and add noise: $x_k^i = f(x_{k-1}^i) + q_{k-1}^i$.

2. Update the particle weights by multiplying with the likelihood function: $\tilde{w}_k^i = w_{k-1}^i\, p(y_k \mid x_k^i)$.
3. Normalize the weights: $w_k^i = \tilde{w}_k^i \big/ \sum_{j=1}^{N} \tilde{w}_k^j$.

Steps 1-3 are repeated as necessary. Steps 2 and 3 form the Bayes rule update for the particles. Given the particles $(w_k, x_k)^i$, one can approximately estimate the expectation of any function $g(x_k)$ under the posterior distribution:
$$\mathrm{E}[g(x_k)] \approx \sum_{i=1}^{N} g(x_k^i)\, w_k^i$$

The simple particle filter unfortunately suffers from degeneracy. If a measurement $y_k$ is received which is far from most of the particles, only a small number of the particles will receive any significant weight. This problem can easily worsen with time. The standard approach to countering degeneracy is to perform resampling when necessary, so that particles with high weight are split into several particles and particles with low weight are discarded. Splitting a particle into several will improve diversity after the dynamic system noise is applied to the individual particles, while discarding insignificant particles helps to keep the computational cost contained. Sequential Importance Resampling (SIR) is a PF that performs resampling at each step.

However, if the noise in the system dynamics is small, the resampling filters are still subject to sample impoverishment. This means that particles with high weights are selected multiple times and the particles concentrate in a tiny portion of the state space. Much of the research on particle filtering deals with other ways of obtaining new samples in the following steps (e.g. the Regularized Particle Filter (RPF)), thereby avoiding impoverishment, and with improving the overall efficiency of the filter (e.g. the Auxiliary SIR filter (ASIR)). The filter would be optimally efficient if the prior samples $x_k^i$ could be generated so that each would receive equal weight in the measurement update. The pdf from which the samples are generated is referred to as the importance or proposal density. The design of a good proposal density is key to getting PFs to perform well on high dimensional problems [5].
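The SIS/SIR steps above fit in a few lines of numpy. The sketch below (our own illustration, with a one-dimensional nonlinear model chosen arbitrarily) implements the propagate-weight-normalize loop with multinomial resampling at every step, i.e. a basic bootstrap/SIR filter.

```python
import numpy as np

rng = np.random.default_rng(3)

def f(x):
    """Arbitrary nonlinear dynamic used for illustration."""
    return 0.9 * x + np.sin(x)

def bootstrap_filter(ys, n_particles=1000, q=0.1, r=0.5):
    """Basic SIR filter for x_k = f(x_{k-1}) + q_k, y_k = x_k + r_k."""
    x = rng.normal(0.0, 1.0, n_particles)     # step 0: sample the prior
    means = []
    for y in ys:
        # Step 1: propagate through the dynamic and add process noise.
        x = f(x) + rng.normal(0.0, np.sqrt(q), n_particles)
        # Steps 2-3: weight by the Gaussian likelihood and normalize.
        w = np.exp(-0.5 * (y - x) ** 2 / r)
        w /= w.sum()
        means.append(np.sum(w * x))           # posterior mean estimate
        # Resample (multinomial) to counter degeneracy; weights reset to 1/N.
        x = rng.choice(x, size=n_particles, replace=True, p=w)
    return np.array(means)

ys = np.array([0.5, 1.2, 1.8, 2.1])           # synthetic measurements
print(bootstrap_filter(ys))
```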

Table 1: Properties of the nonlinear approaches

                        Gaussian approximation   Exact solution          Discrete approximation
Posterior pdf           Gaussian                 exponential family      point probabilities
  representation
Computational cost      matrix, O(d^3)           polynomial in d,        particles, as many as
                                                 usually d^3             possible
Strength                cheap, well known        exact solution          can represent any pdf,
                                                                         multimodal pdfs
Weakness                strong nonlinearities    limited to the          computational cost,
                                                 exponential family      sample impoverishment

4.4 Summary of nonlinear filtering

Most current approaches to nonlinear filtering can be divided into three categories, as shown in Figure 2. A summary of their properties is given in Table 1. Particle filters can be applied to any nonlinear system and can represent any type of posterior density. They are computationally the most costly, and are not generally regarded as appropriate for real-time embedded systems. Gaussian approximations perform a local linearization (either analytical or statistical, i.e. quadrature based) of the nonlinear dynamic, and propagate the mean and covariance similarly to the Kalman filter. They are widely employed in real-time embedded systems due to their low computational cost, but remain limited to nearly linear systems. Fixed finite nonlinear filters are exact solutions to problems with pdfs in the exponential family. Although they have received comparatively little attention, their performance is typically similar to Gaussian approximation filters, and they can solve some types of nonlinear systems exactly.

5 Other developments

The previous sections introduced the classical Kalman filter and some nonlinear extensions. The Kalman filter is a recursive algorithm for estimating the current or future state of a system. There are similar algorithms for estimating the past state of a system, known as Kalman smoothers (continuous and discrete). The Viterbi algorithm [12] is also related to Kalman filters and smoothers; it is used for estimating the most likely path of states. Besides the pure approaches described in Section 4, there are several combined approaches to nonlinear estimation. There are also avenues of

research on reducing computational cost in sparsely connected systems, as well as on estimating the parameters of an unknown stochastic system.

5.1 Combined approaches to nonlinear filtering

The approaches described in Section 4 are very much complementary. It is not very surprising that several authors have proposed filters that combine two different approaches to get improved performance. A very common trick is to use the result of an EKF/UKF as the proposal density for a PF. Another trick is to replace the point probabilities of the particles with something more expressive. The Local Linearization Particle Filter [3] operates by using a separate EKF/UKF to propagate a Gaussian distribution for each sample.

Occasionally a part of the model is conditionally linear Gaussian, so that the state can be split into two parts, $x$ and $\theta$, where for a given $\theta$ the model for $x$ is linear Gaussian with matrices $A(\theta)$, $Q(\theta)$, and so on. In these cases (typically significant) computational savings can be achieved by solving the linear Gaussian subproblem exactly with a Kalman filter and the rest with, say, a particle filter. These types of filters are known as Rao-Blackwellized filters [13].

5.2 Variational Bayesian methods

What if the model structure is known, but its parameters are not? Variational Bayesian methods are a class of machine learning algorithms that are used for determining model parameters. They work by turning the original estimation problem into an optimization problem and then setting a tractable structure for the posterior pdfs [7]. The Variational Bayesian Expectation Maximization (VBEM) algorithm [1] works by alternating between optimization of the state posteriors (E-step) and the system parameters (M-step).

6 Conclusion

In this survey we have considered the problem of state estimation of noisy dynamic systems from imperfect measurements. The approaches found in the literature have been found to be variations or combinations of Gaussian approximations, discrete approximations or exact finite solutions. Gaussian approximate filters have generally been considered best suited for real-time applications due to their lower and fixed computational cost. Discrete approximations (particle filters) reportedly have, with careful implementation, the ability to cope with very large nonlinearities. Exact finite filters have been developed for the exponential family of distributions, but these filters have received little attention in publications. In general, choosing an appropriate filter for a given problem requires good knowledge of both the various filtering techniques and the problem to be solved.

References

[1] Matthew J. Beal. Variational Algorithms for Approximate Bayesian Inference. PhD thesis, University of London, 2003.

[2] V. E. Beneš. Exact finite-dimensional filters for certain diffusions with nonlinear drift. Stochastics: An International Journal of Probability and Stochastic Processes, 5:65-92, 1981.

[3] Branko Ristic, Sanjeev Arulampalam and Neil Gordon. Beyond the Kalman Filter: Particle Filters for Tracking Applications. Artech House, 2004.

[4] Frederick E. Daum. Beyond Kalman filters: practical design of nonlinear filters. In Society of Photo-Optical Instrumentation Engineers (SPIE) Conference Series, volume 2561, September 1995.

[5] Frederick E. Daum. Nonlinear filters: Beyond the Kalman filter. IEEE Aerospace and Electronic Systems Magazine, pages 57-69, August 2005.

[6] Kazufumi Ito and Kaiqi Xiong. Gaussian filters for nonlinear filtering problems. IEEE Transactions on Automatic Control, May 2000.

[7] Tommi S. Jaakkola. Tutorial on variational approximation methods. In Advanced Mean Field Methods: Theory and Practice. MIT Press, 2001.

[8] Simon J. Julier and Jeffrey K. Uhlmann. A new extension of the Kalman filter to nonlinear systems. In Int. Symp. Aerospace/Defense Sensing, Simul. and Controls, 1997.

[9] Rudolph Emil Kalman. A new approach to linear filtering and prediction problems. Transactions of the ASME, Journal of Basic Engineering, 82(Series D):35-45, 1960.

[10] Mohinder S. Grewal and Angus P. Andrews. Kalman Filtering: Theory and Practice Using MATLAB. Wiley-Interscience, 2001.

[11] Mohinder S. Grewal, Lawrence R. Weill and Angus P. Andrews. Global Positioning Systems, Inertial Navigation, and Integration. John Wiley & Sons, Inc., 2001.

[12] Stuart J. Russell and Peter Norvig. Artificial Intelligence: A Modern Approach, 2nd ed. Prentice Hall, 2003.

[13] Simo Särkkä. Recursive Bayesian Inference on Stochastic Differential Equations. Doctoral dissertation, Helsinki University of Technology, 2006.

[14] Dan Simon. Optimal State Estimation: Kalman, H-infinity, and Nonlinear Approaches. Wiley-Interscience, 2006.

[15] E. A. Wan and R. van der Merwe. The unscented Kalman filter for nonlinear estimation. In Adaptive Systems for Signal Processing, Communications, and Control Symposium (AS-SPCC), IEEE, 2000.

[16] Yaakov Bar-Shalom, X. Rong Li and Thiagalingam Kirubarajan. Estimation with Applications to Tracking and Navigation: Theory, Algorithms and Software. Wiley, 2001.


Mobile Robot Localization Mobile Robot Localization 1 The Problem of Robot Localization Given a map of the environment, how can a robot determine its pose (planar coordinates + orientation)? Two sources of uncertainty: - observations

More information

Bayesian Methods in Positioning Applications

Bayesian Methods in Positioning Applications Bayesian Methods in Positioning Applications Vedran Dizdarević v.dizdarevic@tugraz.at Graz University of Technology, Austria 24. May 2006 Bayesian Methods in Positioning Applications p.1/21 Outline Problem

More information

Using the Kalman Filter to Estimate the State of a Maneuvering Aircraft

Using the Kalman Filter to Estimate the State of a Maneuvering Aircraft 1 Using the Kalman Filter to Estimate the State of a Maneuvering Aircraft K. Meier and A. Desai Abstract Using sensors that only measure the bearing angle and range of an aircraft, a Kalman filter is implemented

More information

Lecture 9. Time series prediction

Lecture 9. Time series prediction Lecture 9 Time series prediction Prediction is about function fitting To predict we need to model There are a bewildering number of models for data we look at some of the major approaches in this lecture

More information

Introduction to Unscented Kalman Filter

Introduction to Unscented Kalman Filter Introduction to Unscented Kalman Filter 1 Introdution In many scientific fields, we use certain models to describe the dynamics of system, such as mobile robot, vision tracking and so on. The word dynamics

More information

Sequential Monte Carlo methods for filtering of unobservable components of multidimensional diffusion Markov processes

Sequential Monte Carlo methods for filtering of unobservable components of multidimensional diffusion Markov processes Sequential Monte Carlo methods for filtering of unobservable components of multidimensional diffusion Markov processes Ellida M. Khazen * 13395 Coppermine Rd. Apartment 410 Herndon VA 20171 USA Abstract

More information

Probabilistic Graphical Models

Probabilistic Graphical Models Probabilistic Graphical Models Lecture 12 Dynamical Models CS/CNS/EE 155 Andreas Krause Homework 3 out tonight Start early!! Announcements Project milestones due today Please email to TAs 2 Parameter learning

More information

Expectation propagation for signal detection in flat-fading channels

Expectation propagation for signal detection in flat-fading channels Expectation propagation for signal detection in flat-fading channels Yuan Qi MIT Media Lab Cambridge, MA, 02139 USA yuanqi@media.mit.edu Thomas Minka CMU Statistics Department Pittsburgh, PA 15213 USA

More information

2D Image Processing (Extended) Kalman and particle filter

2D Image Processing (Extended) Kalman and particle filter 2D Image Processing (Extended) Kalman and particle filter Prof. Didier Stricker Dr. Gabriele Bleser Kaiserlautern University http://ags.cs.uni-kl.de/ DFKI Deutsches Forschungszentrum für Künstliche Intelligenz

More information

Sequential Monte Carlo and Particle Filtering. Frank Wood Gatsby, November 2007

Sequential Monte Carlo and Particle Filtering. Frank Wood Gatsby, November 2007 Sequential Monte Carlo and Particle Filtering Frank Wood Gatsby, November 2007 Importance Sampling Recall: Let s say that we want to compute some expectation (integral) E p [f] = p(x)f(x)dx and we remember

More information

Variational Learning : From exponential families to multilinear systems

Variational Learning : From exponential families to multilinear systems Variational Learning : From exponential families to multilinear systems Ananth Ranganathan th February 005 Abstract This note aims to give a general overview of variational inference on graphical models.

More information

F denotes cumulative density. denotes probability density function; (.)

F denotes cumulative density. denotes probability density function; (.) BAYESIAN ANALYSIS: FOREWORDS Notation. System means the real thing and a model is an assumed mathematical form for the system.. he probability model class M contains the set of the all admissible models

More information

Local Positioning with Parallelepiped Moving Grid

Local Positioning with Parallelepiped Moving Grid Local Positioning with Parallelepiped Moving Grid, WPNC06 16.3.2006, niilo.sirola@tut.fi p. 1/?? TA M P E R E U N I V E R S I T Y O F T E C H N O L O G Y M a t h e m a t i c s Local Positioning with Parallelepiped

More information

Blind Equalization via Particle Filtering

Blind Equalization via Particle Filtering Blind Equalization via Particle Filtering Yuki Yoshida, Kazunori Hayashi, Hideaki Sakai Department of System Science, Graduate School of Informatics, Kyoto University Historical Remarks A sequential Monte

More information

Ground Moving Target Parameter Estimation for Stripmap SAR Using the Unscented Kalman Filter

Ground Moving Target Parameter Estimation for Stripmap SAR Using the Unscented Kalman Filter Ground Moving Target Parameter Estimation for Stripmap SAR Using the Unscented Kalman Filter Bhashyam Balaji, Christoph Gierull and Anthony Damini Radar Sensing and Exploitation Section, Defence Research

More information

Tracking an Accelerated Target with a Nonlinear Constant Heading Model

Tracking an Accelerated Target with a Nonlinear Constant Heading Model Tracking an Accelerated Target with a Nonlinear Constant Heading Model Rong Yang, Gee Wah Ng DSO National Laboratories 20 Science Park Drive Singapore 118230 yrong@dsoorgsg ngeewah@dsoorgsg Abstract This

More information

AN EFFICIENT TWO-STAGE SAMPLING METHOD IN PARTICLE FILTER. Qi Cheng and Pascal Bondon. CNRS UMR 8506, Université Paris XI, France.

AN EFFICIENT TWO-STAGE SAMPLING METHOD IN PARTICLE FILTER. Qi Cheng and Pascal Bondon. CNRS UMR 8506, Université Paris XI, France. AN EFFICIENT TWO-STAGE SAMPLING METHOD IN PARTICLE FILTER Qi Cheng and Pascal Bondon CNRS UMR 8506, Université Paris XI, France. August 27, 2011 Abstract We present a modified bootstrap filter to draw

More information

Autonomous Navigation for Flying Robots

Autonomous Navigation for Flying Robots Computer Vision Group Prof. Daniel Cremers Autonomous Navigation for Flying Robots Lecture 6.2: Kalman Filter Jürgen Sturm Technische Universität München Motivation Bayes filter is a useful tool for state

More information

Expectation Propagation in Dynamical Systems

Expectation Propagation in Dynamical Systems Expectation Propagation in Dynamical Systems Marc Peter Deisenroth Joint Work with Shakir Mohamed (UBC) August 10, 2012 Marc Deisenroth (TU Darmstadt) EP in Dynamical Systems 1 Motivation Figure : Complex

More information

17 : Markov Chain Monte Carlo

17 : Markov Chain Monte Carlo 10-708: Probabilistic Graphical Models, Spring 2015 17 : Markov Chain Monte Carlo Lecturer: Eric P. Xing Scribes: Heran Lin, Bin Deng, Yun Huang 1 Review of Monte Carlo Methods 1.1 Overview Monte Carlo

More information

Density Propagation for Continuous Temporal Chains Generative and Discriminative Models

Density Propagation for Continuous Temporal Chains Generative and Discriminative Models $ Technical Report, University of Toronto, CSRG-501, October 2004 Density Propagation for Continuous Temporal Chains Generative and Discriminative Models Cristian Sminchisescu and Allan Jepson Department

More information

Auxiliary Particle Methods

Auxiliary Particle Methods Auxiliary Particle Methods Perspectives & Applications Adam M. Johansen 1 adam.johansen@bristol.ac.uk Oxford University Man Institute 29th May 2008 1 Collaborators include: Arnaud Doucet, Nick Whiteley

More information

Online tests of Kalman filter consistency

Online tests of Kalman filter consistency Tampere University of Technology Online tests of Kalman filter consistency Citation Piché, R. (216). Online tests of Kalman filter consistency. International Journal of Adaptive Control and Signal Processing,

More information

A Novel Gaussian Sum Filter Method for Accurate Solution to Nonlinear Filtering Problem

A Novel Gaussian Sum Filter Method for Accurate Solution to Nonlinear Filtering Problem A Novel Gaussian Sum Filter Method for Accurate Solution to Nonlinear Filtering Problem Gabriel Terejanu a Puneet Singla b Tarunraj Singh b Peter D. Scott a Graduate Student Assistant Professor Professor

More information

State Estimation using Moving Horizon Estimation and Particle Filtering

State Estimation using Moving Horizon Estimation and Particle Filtering State Estimation using Moving Horizon Estimation and Particle Filtering James B. Rawlings Department of Chemical and Biological Engineering UW Math Probability Seminar Spring 2009 Rawlings MHE & PF 1 /

More information

A KALMAN FILTERING TUTORIAL FOR UNDERGRADUATE STUDENTS

A KALMAN FILTERING TUTORIAL FOR UNDERGRADUATE STUDENTS A KALMAN FILTERING TUTORIAL FOR UNDERGRADUATE STUDENTS Matthew B. Rhudy 1, Roger A. Salguero 1 and Keaton Holappa 2 1 Division of Engineering, Pennsylvania State University, Reading, PA, 19610, USA 2 Bosch

More information

CS491/691: Introduction to Aerial Robotics

CS491/691: Introduction to Aerial Robotics CS491/691: Introduction to Aerial Robotics Topic: State Estimation Dr. Kostas Alexis (CSE) World state (or system state) Belief state: Our belief/estimate of the world state World state: Real state of

More information

ROBOTICS 01PEEQW. Basilio Bona DAUIN Politecnico di Torino

ROBOTICS 01PEEQW. Basilio Bona DAUIN Politecnico di Torino ROBOTICS 01PEEQW Basilio Bona DAUIN Politecnico di Torino Probabilistic Fundamentals in Robotics Gaussian Filters Course Outline Basic mathematical framework Probabilistic models of mobile robots Mobile

More information

The Kalman filter. Chapter 6

The Kalman filter. Chapter 6 Chapter 6 The Kalman filter In the last chapter, we saw that in Data Assimilation, we ultimately desire knowledge of the full a posteriori p.d.f., that is the conditional p.d.f. of the state given the

More information

Lecture : Probabilistic Machine Learning

Lecture : Probabilistic Machine Learning Lecture : Probabilistic Machine Learning Riashat Islam Reasoning and Learning Lab McGill University September 11, 2018 ML : Many Methods with Many Links Modelling Views of Machine Learning Machine Learning

More information

L06. LINEAR KALMAN FILTERS. NA568 Mobile Robotics: Methods & Algorithms

L06. LINEAR KALMAN FILTERS. NA568 Mobile Robotics: Methods & Algorithms L06. LINEAR KALMAN FILTERS NA568 Mobile Robotics: Methods & Algorithms 2 PS2 is out! Landmark-based Localization: EKF, UKF, PF Today s Lecture Minimum Mean Square Error (MMSE) Linear Kalman Filter Gaussian

More information

Exercises Tutorial at ICASSP 2016 Learning Nonlinear Dynamical Models Using Particle Filters

Exercises Tutorial at ICASSP 2016 Learning Nonlinear Dynamical Models Using Particle Filters Exercises Tutorial at ICASSP 216 Learning Nonlinear Dynamical Models Using Particle Filters Andreas Svensson, Johan Dahlin and Thomas B. Schön March 18, 216 Good luck! 1 [Bootstrap particle filter for

More information