Unscented Kalman Filter/Smoother for a CBRN Puff-Based Dispersion Model


Gabriel Terejanu, Department of Computer Science and Engineering, University at Buffalo, Buffalo, NY, terejanu@buffalo.edu
Tarunraj Singh, Department of Mechanical and Aerospace Engineering, University at Buffalo, Buffalo, NY, tsingh@eng.buffalo.edu
Peter D. Scott, Department of Computer Science and Engineering, University at Buffalo, Buffalo, NY, peter@buffalo.edu

Abstract — Fixed-interval smoothing for systems with nonlinear process and measurement models is studied and applied to the assimilation of sensor data in a Chemical, Biological, Radiological or Nuclear (CBRN) incident scenario. A two-filter smoother that uses a Backward Sigma-Point Information Filter, and also a forward-backward Rauch-Tung-Striebel (RTS) smoothing form, are re-derived using the weighted statistical linearization concept. Both methods are derived in the context of the Unscented Kalman Filter. The square-root version of the resulting RTS Unscented Kalman Filter/Smoother is applied to a CBRN dispersion puff-based model with variable state dimension, and the data assimilation performance of the method is compared with a Particle Filter implementation.

Keywords: data assimilation, sigma-point filtering, unscented Kalman smoother, variable state dimension, chemical dispersion, puff-based model.

I. INTRODUCTION

For a linear system, under the assumption of Gaussian probability distributions, the problem of estimating the states of the system has an exact closed-form solution given by the Kalman Filter [10]. If the probability distributions are non-Gaussian or the system is nonlinear, different assumptions and approximations have to be made for both accuracy and tractability. The systems considered here are in general nonlinear, but the assumption is made that the uncertainty can be adequately modeled as Gaussian.

The estimation of the unknown system states given the underlying dynamics of the system and a set of observations may be stated as a filtering, smoothing or prediction problem. Given a fixed discrete time interval $\{t_1, t_2, \ldots, t_N\}$ over which observations are available, the problem of filtering is to find the best state at time $t$ given all the observations prior to and including $t$. The smoothing problem is to find the best state at time $t$ given all the observations up to time $t_N$, with $t \le t_N$. For $t > t_N$ one can define the prediction problem: forecasting the state of the system at time $t$ using all the observations in the given interval.

The nonlinear filtering problem has been extensively studied and various methods are provided in the literature. Among the most useful ones are the Extended Kalman Filter (EKF), the Ensemble Kalman Filter (EnKF), the Unscented Kalman Filter (UKF), and the Particle Filter (PF). The EKF is historically the first, and still the most widely adopted, approach to the nonlinear estimation problem. It is based on the assumption that the nonlinear system dynamics can be accurately modeled by a first-order Taylor series expansion [2]. The EnKF is a reduced-rank filter which propagates through the nonlinearity, and updates, a relatively small ensemble of samples, from which an assumed Gaussian distribution captures the main characteristics of the uncertainty [3]. The PF also uses a sampling approach to estimate the higher-order moments of the posterior probability distribution by propagating and updating a number of particles, but without assuming Gaussian statistics [17].
The UKF, which is a derivative-free alternative to the EKF, overcomes the differentiation problem by using a deterministic sampling approach [8], [21]. The state distribution is represented using a minimal set of carefully chosen sample points, called sigma points. Along with the Central Difference Kalman Filter and the Divided Difference Filter, the Unscented Kalman Filter belongs to a more general class of filters referred to as Sigma-Point Kalman Filters or Linear Regression Kalman Filters, which use the statistical linearization technique [5], [11]. This technique linearizes a nonlinear function of a random variable through a linear regression between $n$ points drawn from the prior distribution of the random variable. Since the spread of the random variable is considered during linearization, the technique tends to be more accurate than the Taylor series linearization used in the EKF, particularly in the presence of strong nonlinearities [18].

The nonlinear smoothing problem has received less attention than the filtering problem in the literature. In many critical applications, however, its advantages compared to filtering mandate its consideration. A significantly better situation assessment can often be made by waiting for several subsequent observations to decrease the uncertainty in the state estimates at a given time. In critical applications, the trade-off between state estimation accuracy and decision latency may favor this delay. In CBRN dispersion scenarios, the problem of source determination is of this character. Additionally, CBRN sensor readings with low confidence at time $t$ may result in a misleading assessment of the plume dynamics until corroborated or contradicted by smoothing from later readings.

The fixed-interval smoothing problem is the most important of the three basic smoothing forms, the other two being fixed-point and fixed-lag, and can be used to derive the other two [1]. For that reason, we focus here on the fixed-interval smoothing problem. Concerning the use of deterministic sampling in the smoothing context, Wan and van der Merwe introduced the EM Unscented Smoother [22], which uses a neural network for both the backward and the forward dynamics. Briers and Doucet have developed a two-filter smoother [1] that uses an artificial prior distribution over the backward information state to ensure integrability. Zoeter and Ypma use dynamic programming to pass messages using sigma points [23]. Following the work of Mao et al. [12], Psiaki and Wada have derived two variants of the sigma-point smoother; they provide an RTS form of the smoother and a second form that uses a pseudo-measurement [14].

The present paper has two principal goals. First, the Backward Sigma-Point Information Filter (BSPIF), the Fraser-Potter two-filter form of the UKS [4] and the RTS single-filter form of the UKS [12], [14] are each re-derived using the approach of Weighted Statistical Linearization (WSL) [5]. WSL is a general modeling framework for optimal estimation which, in this setting, permits a more direct derivation of the general nonlinear estimates and error statistics than previous approaches. Second, a square-root form of the RTS UKS is numerically tested in a simulated chemical plume dispersion scenario using a standard Gaussian puff model and noisy concentration sensors. Results are compared with the variable-dimension Bootstrap Particle Filter of Reddy et al. [16].

The WSL, the UKF and the Sigma-Point Information Filter (SPIF) are introduced in Section II. The BSPIF and the smoother update for the two-filter smoothing formulation are derived in Section III. The derivation of the RTS form of the UKS based on the two-filter formulation is described in Section IV. Numerical results of the RTS SR-UKS on the test scenario are presented in Section V, and conclusions and future work are discussed in Section VI.

II. SIGMA-POINT KALMAN FILTERS

A. Problem Statement

Consider the following nonlinear system, described by the difference equation and the observation model with additive Gaussian noise:

$x_{k+1} = f(x_k) + w_k$  (1)
$z_k = h(x_k) + v_k$  (2)

The initial state $x_0$ is a random vector with known mean $\mu_0 = E[x_0]$ and covariance $P_0 = E[(x_0 - \mu_0)(x_0 - \mu_0)^T]$. In the following we assume that the random vector $w_k$ captures uncertainties in the model and $v_k$ denotes the measurement noise. Both are temporally uncorrelated, zero-mean random sequences with known covariances, and both are uncorrelated with the initial state $x_0$. The two random vectors $w_k$ and $v_k$ are also mutually uncorrelated. The following notations are used for the noise covariances:

$E[w_k w_k^T] = Q_k, \quad E[v_k v_k^T] = R_k$  (3)
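For concreteness, the sketch below instantiates the model class (1)-(2) with a small, purely illustrative nonlinear process and measurement pair; the particular functions and noise levels are assumptions for demonstration only and are not the dispersion model used later in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative nonlinear maps (assumed, not from the paper).
def f(x):
    # mildly nonlinear 2-state dynamics
    return np.array([x[0] + 0.1 * x[1], x[1] + 0.05 * np.sin(x[0])])

def h(x):
    # scalar range-like measurement
    return np.array([np.sqrt(x[0] ** 2 + x[1] ** 2)])

Q = 0.01 * np.eye(2)   # process noise covariance, E[w w^T]
R = 0.04 * np.eye(1)   # measurement noise covariance, E[v v^T]

# One step of the truth model: x_{k+1} = f(x_k) + w_k, z_k = h(x_k) + v_k.
x = np.array([1.0, 0.0])
x_next = f(x) + rng.multivariate_normal(np.zeros(2), Q)
z = h(x_next) + rng.multivariate_normal(np.zeros(1), R)
```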
B. Weighted Statistical Linearization (WSL)

Rather than linearizing a nonlinear function of a random variable about a single point using a truncated Taylor series, as in the EKF, we can select a set of points from the prior distribution of the state and linearize the function through a linear regression over these points [18]. Consider a generally nonlinear function $y = f(x)$, where $x$ is a random vector with given mean $\bar{x}$ and covariance $P_{xx}$, and a 3-tuple of points $\{\mathcal{W}^i, \mathcal{X}^i, \mathcal{Y}^i\}$, where $\mathcal{Y}^i = f(\mathcal{X}^i)$ and the weights $\mathcal{W}^i$ sum to one. Our goal is to linearize the nonlinear function as $f(x) \approx A x + b + e$, where $e$ is a zero-mean random variable.

The following statistics can be computed:

$\bar{x} = \sum_i \mathcal{W}^i \mathcal{X}^i$  (4)
$P_{xx} = \sum_i \mathcal{W}^i (\mathcal{X}^i - \bar{x})(\mathcal{X}^i - \bar{x})^T$  (5)
$\bar{y} = \sum_i \mathcal{W}^i \mathcal{Y}^i$  (6)
$P_{yy} = \sum_i \mathcal{W}^i (\mathcal{Y}^i - \bar{y})(\mathcal{Y}^i - \bar{y})^T$  (7)
$P_{xy} = \sum_i \mathcal{W}^i (\mathcal{X}^i - \bar{x})(\mathcal{Y}^i - \bar{y})^T$  (8)

The point-wise error is given by $E^i = \mathcal{Y}^i - (A\mathcal{X}^i + b)$. By minimizing the weighted sum of squared errors one can compute the parameters $A$ and $b$:

$\bar{e} = \sum_i \mathcal{W}^i E^i = 0$  (9)
$\{A, b\} = \arg\min_{A,b}\ \mathrm{tr}\left[\sum_i \mathcal{W}^i E^i (E^i)^T\right]$  (10)

Given the above statistics, the solution of the minimization problem is:

$A = P_{xy}^T P_{xx}^{-1}$  (11)
$b = \bar{y} - A\bar{x}$  (12)
$P_{ee} = P_{yy} - A P_{xx} A^T$  (13)

The WSL provides an alternative linear model to the first-order Taylor series expansion, which is very convenient if the function is not differentiable or is hard to differentiate. The method also tends to be more accurate than the first-order Taylor expansion in uncertainty propagation, since it explicitly accounts for the probabilistic spread of the random variables during linearization [5], [6], [18].
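To make the regression (4)-(13) concrete, a minimal sketch is given below; the regression points, weights and test function are illustrative assumptions, not part of the smoother derivation.

```python
import numpy as np

def weighted_statistical_linearization(points, weights, func):
    """Return A, b, Pee such that func(x) ~ A x + b + e, per (4)-(13)."""
    X = np.asarray(points)                      # shape (m, nx)
    W = np.asarray(weights)                     # shape (m,), sums to one
    Y = np.array([func(x) for x in X])          # shape (m, ny)

    x_bar = W @ X                               # (4)
    y_bar = W @ Y                               # (6)
    dX, dY = X - x_bar, Y - y_bar
    Pxx = (W[:, None] * dX).T @ dX              # (5)
    Pyy = (W[:, None] * dY).T @ dY              # (7)
    Pxy = (W[:, None] * dX).T @ dY              # (8)

    A = np.linalg.solve(Pxx, Pxy).T             # A = Pxy^T Pxx^{-1}     (11)
    b = y_bar - A @ x_bar                       # (12)
    Pee = Pyy - A @ Pxx @ A.T                   # linearization error cov (13)
    return A, b, Pee

# Illustrative usage with points drawn from an assumed Gaussian prior.
rng = np.random.default_rng(1)
pts = rng.multivariate_normal([1.0, 2.0], 0.1 * np.eye(2), size=7)
wts = np.full(7, 1.0 / 7.0)
A, b, Pee = weighted_statistical_linearization(pts, wts, lambda x: np.array([x[0] * x[1]]))
```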

C. Unscented Kalman Filter (UKF)

The UKF is founded on the intuition that it is easier to approximate a probability distribution than it is to approximate an arbitrary nonlinear function or transformation [8]. The sigma points are a deterministic sample set, or cloud, of states chosen so that their mean and covariance are exactly $\hat{x}$ and $P$. Each sigma point is propagated through the nonlinear state equations, yielding a cloud of transformed points. The estimated mean and covariance of the predicted state are then computed from the statistics of this set of propagated sigma points. This process is referred to as the unscented transformation, a general method for calculating the statistics of a random vector which undergoes a nonlinear transformation [21].

Let $\mathcal{X}_f$ be a set of $2n+1$ sigma points ($n$ is the dimension of the state space) and their associated weights:

$\mathcal{X}_f = \left\{\mathcal{X}^i_f, W^{(m)}_i, W^{(c)}_i\right\}, \quad i = 0 \ldots 2n$  (14)

The first set of weights, $\{W^{(m)}_i\}$, is used to compute the first-order moment (the mean), and the second set of weights, $\{W^{(c)}_i\}$, is used to compute the second-order moment (the covariance). The subscript $f$ indicates the forward pass. The sigma points are selected to capture the first and second order moments of the prior distribution [7], [21]. Table I gives the point selection scheme of the scaled unscented transformation.

Table I. Selection of sigma points

$\mathcal{X}^0_{f,k} = \hat{x}_{f,k}$
$\mathcal{X}^i_{f,k} = \hat{x}_{f,k} + \left(\sqrt{(n+\lambda)P_{f,k}}\right)_i, \quad i = 1 \ldots n$
$\mathcal{X}^i_{f,k} = \hat{x}_{f,k} - \left(\sqrt{(n+\lambda)P_{f,k}}\right)_{i-n}, \quad i = n+1 \ldots 2n$
$W^{(m)}_0 = \dfrac{\lambda}{n+\lambda}, \quad W^{(c)}_0 = \dfrac{\lambda}{n+\lambda} + 1 - \alpha^2 + \beta$
$W^{(m)}_i = W^{(c)}_i = \dfrac{1}{2(n+\lambda)}, \quad i = 1 \ldots 2n$

where $\left(\sqrt{(n+\lambda)P}\right)_i$ is the $i$th row or column of the matrix square root and $\lambda = \alpha^2(n+\kappa) - n$. The scaling parameter $\alpha$ controls the position of the sigma points and usually $10^{-4} \le \alpha \le 1$. While any selection of scaling parameters matches the second-order statistics of the state, larger values of $\alpha$ set the sigma points further from the mean. The scaling parameter $\kappa$ is usually set either to $0$ for state estimation or to $3-n$ for parameter estimation [21]. Under no assumptions about the nonlinear function, and for Gaussian distributions, the optimal value of $\beta$ is $2$ [8].

The likelihoods of the predicted states are determined using the generally nonlinear measurement mapping, and the sample statistics are once again computed. Given the new observation $z_{k+1}$, the data assimilation step proceeds as in the Kalman Filter: the best state estimate is computed as a linear combination of the forecast estimate and the innovation, as shown in Table II. For Gaussian inputs, the unscented transformation gives accuracy to the third-order moment; for non-Gaussian uncertainties the approximations are accurate at least to the second order, with partial higher-order moment matching depending on the choice of the scaling parameters [8], [18].

Table II. Unscented Kalman Filter

Propagation through the process model:
$\mathcal{X}^i_{f,k+1|k} = f(\mathcal{X}^i_{f,k})$ for all $i$
$\hat{x}^-_{f,k+1} = \sum_{i=0}^{2n} W^{(m)}_i \mathcal{X}^i_{f,k+1|k}$
$P^-_{f,k+1} = \sum_{i=0}^{2n} W^{(c)}_i \left(\mathcal{X}^i_{f,k+1|k} - \hat{x}^-_{f,k+1}\right)\left(\mathcal{X}^i_{f,k+1|k} - \hat{x}^-_{f,k+1}\right)^T$
$P^-_{f,k+1} \leftarrow P^-_{f,k+1} + Q_k$
Redraw the sigma points: $\mathcal{X}_{f,k+1} = \left[\hat{x}^-_{f,k+1},\ \hat{x}^-_{f,k+1} \pm \gamma\sqrt{P^-_{f,k+1}}\right], \quad \gamma = \sqrt{n+\lambda}$

Propagation through the measurement model:
$\mathcal{Z}^i_{f,k+1} = h(\mathcal{X}^i_{f,k+1})$ for all $i$
$\hat{z}^-_{f,k+1} = \sum_{i=0}^{2n} W^{(m)}_i \mathcal{Z}^i_{f,k+1}$
$P^{zz}_{f,k+1} = \sum_{i=0}^{2n} W^{(c)}_i \left(\mathcal{Z}^i_{f,k+1} - \hat{z}^-_{f,k+1}\right)\left(\mathcal{Z}^i_{f,k+1} - \hat{z}^-_{f,k+1}\right)^T$
$P^{zz}_{f,k+1} \leftarrow P^{zz}_{f,k+1} + R_{k+1}$
$P^{xz}_{f,k+1} = \sum_{i=0}^{2n} W^{(c)}_i \left(\mathcal{X}^i_{f,k+1} - \hat{x}^-_{f,k+1}\right)\left(\mathcal{Z}^i_{f,k+1} - \hat{z}^-_{f,k+1}\right)^T$

Data assimilation step:
$K_{f,k+1} = P^{xz}_{f,k+1}\left(P^{zz}_{f,k+1}\right)^{-1}$
$\hat{x}_{f,k+1} = \hat{x}^-_{f,k+1} + K_{f,k+1}\left(z_{k+1} - \hat{z}^-_{f,k+1}\right)$
$P_{f,k+1} = P^-_{f,k+1} - K_{f,k+1} P^{zz}_{f,k+1} K^T_{f,k+1}$
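A minimal sketch of the sigma-point selection of Table I and of one forecast/assimilation cycle of Table II follows. It is an illustrative implementation under assumed inputs, not the authors' code; the scaling parameters use the default ranges quoted above, and f, h, Q, R, z are placeholders supplied by the caller (for instance, the toy model sketched in Section II-A).

```python
import numpy as np

def sigma_points(x, P, alpha=1e-3, beta=2.0, kappa=0.0):
    """Scaled sigma points and weights, as in Table I."""
    n = x.size
    lam = alpha ** 2 * (n + kappa) - n
    S = np.linalg.cholesky((n + lam) * P)          # matrix square root (columns used below)
    X = np.vstack([x, x + S.T, x - S.T])           # rows are the 2n+1 sigma points
    Wm = np.full(2 * n + 1, 0.5 / (n + lam))
    Wc = Wm.copy()
    Wm[0] = lam / (n + lam)
    Wc[0] = lam / (n + lam) + 1.0 - alpha ** 2 + beta
    return X, Wm, Wc

def ukf_step(x, P, z, f, h, Q, R):
    """One forecast/assimilation cycle of Table II."""
    # Propagation through the process model.
    X, Wm, Wc = sigma_points(x, P)
    Xp = np.array([f(xi) for xi in X])
    x_pred = Wm @ Xp
    P_pred = (Wc[:, None] * (Xp - x_pred)).T @ (Xp - x_pred) + Q
    # Redraw sigma points and push them through the measurement model.
    X, Wm, Wc = sigma_points(x_pred, P_pred)
    Z = np.array([h(xi) for xi in X])
    z_pred = Wm @ Z
    Pzz = (Wc[:, None] * (Z - z_pred)).T @ (Z - z_pred) + R
    Pxz = (Wc[:, None] * (X - x_pred)).T @ (Z - z_pred)
    # Data assimilation step.
    K = Pxz @ np.linalg.inv(Pzz)
    x_new = x_pred + K @ (z - z_pred)
    P_new = P_pred - K @ Pzz @ K.T
    return x_new, P_new
```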
D. Sigma-Point Information Filter (SPIF)

The Kalman filter recursively propagates the covariance of the state estimate, while the information filter propagates its inverse. This is a useful substitution when little or no initial information is available to begin the filter recursions, as for the backward pass of a smoother. Given the process model (1), weighted statistical linearization can be used to yield $F_k$, and equation (1) becomes

$x_{k+1} = F_k x_k + f_k + w_k + e_{1,k}$  (15)
$F_k = \left(P^{x_f x_{f+1}}_k\right)^T P_{f,k}^{-1}$  (16)
$f_k = \hat{x}^-_{f,k+1} - F_k \hat{x}_{f,k}$  (17)
$P_{1,k} = P^-_{f,k+1} - F_k P_{f,k} F_k^T$  (18)

where the cross covariance $P^{x_f x_{f+1}}_k$ between the filtered state and its propagated image is calculated in the forward step as shown in Table III.

Table III. Cross covariance of the nonlinear mapping

$P^{x_f x_{f+1}}_k = \sum_{i=0}^{2n} W^{(c)}_i \left(\mathcal{X}^i_{f,k} - \hat{x}_{f,k}\right)\left(\mathcal{X}^i_{f,k+1|k} - \hat{x}^-_{f,k+1}\right)^T$
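The statistically linearized process model (15)-(18) can be read off directly from forward-pass quantities; a sketch under assumed inputs is given below (the sigma points, weights and forecast moments are taken as already produced by Table II and Table III, and the function and argument names are hypothetical).

```python
import numpy as np

def linearized_process_model(X, Xp, Wc, x_f, P_f, x_pred, P_pred_no_Q):
    """Recover F_k, f_k and P_1k of (15)-(18) from forward-pass sigma points.

    X           : sigma points drawn from N(x_f, P_f), shape (2n+1, n)
    Xp          : the same points propagated through f(.), shape (2n+1, n)
    Wc          : covariance weights
    x_pred      : weighted mean of Xp
    P_pred_no_Q : weighted covariance of Xp, before adding Q_k
    """
    # Cross covariance of Table III.
    Pxfx1 = (Wc[:, None] * (X - x_f)).T @ (Xp - x_pred)
    F = np.linalg.solve(P_f, Pxfx1).T          # F_k = Pxfx1^T P_f^{-1}      (16)
    f_aff = x_pred - F @ x_f                   # affine term f_k             (17)
    P1 = P_pred_no_Q - F @ P_f @ F.T           # linearization error cov     (18)
    return F, f_aff, P1
```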

In the process-model linearization above, the sigma points are drawn from $N(\hat{x}_{f,k}, P_{f,k})$. The forecast error covariance $P^-_{f,k+1}$ entering (18) is the Table II quantity before the process noise $Q_k$ is added, so $P_{1,k}$ captures only the errors of the state-transition linearization. The same strategy holds for the measurement equation (2), again using sigma points drawn from $N(\hat{x}_{f,k}, P_{f,k})$:

$z_k = H_k x_k + h_k + v_k + e_{2,k}$  (19)
$H_k = \left(P^{xz}_{f,k}\right)^T P_{f,k}^{-1}$  (20)
$h_k = \hat{z}_{f,k} - H_k \hat{x}_{f,k}$  (21)
$P_{2,k} = P^{zz}_{f,k} - H_k P_{f,k} H_k^T$  (22)

Likewise, the error covariance $P^{zz}_{f,k}$ does not incorporate the effect of measurement noise. Both $P^{xz}_{f,k}$ and $P^{zz}_{f,k}$ are given in Table II. With the above quantities one can derive the Forward Sigma-Point Information Filter, as shown in [20]:

Table IV. Forward Sigma-Point Information Filter

Prediction:
$Y^-_{f,k+1} = \left(F_k P_{f,k} F_k^T + Q_k + P_{1,k}\right)^{-1}$
$\hat{y}^-_{f,k+1} = Y^-_{f,k+1}\left(F_k \hat{x}_{f,k} + f_k\right)$

Update:
$\hat{y}_{f,k+1} = \hat{y}^-_{f,k+1} + H_{k+1}^T\left(R_{k+1} + P_{2,k+1}\right)^{-1}\left(z_{k+1} - h_{k+1}\right)$
$Y_{f,k+1} = Y^-_{f,k+1} + H_{k+1}^T\left(R_{k+1} + P_{2,k+1}\right)^{-1} H_{k+1}$

III. UNSCENTED KALMAN SMOOTHER - TWO-FILTER FORMULATION

In a fixed-interval smoothing problem, the entire batch of observations over the interval is used to estimate the entire batch of states. A two-filter formulation, one filter running forward and one backward, may be employed. The smoothed estimate may then be expressed as a linear combination of the forward estimate and the backward estimate [4]. For our problem the forward filter is the UKF, and the backward filter is selected to be a Backward Sigma-Point Information Filter. This choice is governed by the lack of information with which to initialize the filter at the end of the fixed interval.

A. Backward Sigma-Point Information Filter (BSPIF)

To derive the unscented backward information filter we use weighted statistical linearization as described in Section II-B, with the point selection scheme given by Julier [8], following the approach of Crassidis [2]. Given the following backward Markovian model [9], which is sample-path equivalent to (15), one can find the backward forecast, or hindcast, estimate:

$x_k = F^B_{k+1} x_{k+1} - F_k^{-1} f_k + w^B_{k+1}$  (23)

$F^B_{k+1} = F_k^{-1}\left[I - \left(Q_k + P_{1,k}\right)\Pi_{k+1}^{-1}\right]$
$Q^B_{k+1} = F_k^{-1}\left[I - \left(Q_k + P_{1,k}\right)\Pi_{k+1}^{-1}\right]\left(Q_k + P_{1,k}\right)F_k^{-T}$
$\Pi_{k+1} = F_k \Pi_k F_k^T + Q_k + P_{1,k}, \quad \Pi_0 = P_0$

We may assume that $\Pi_{k+1}$ becomes large very quickly, due to the large initial uncertainty specific to the chem-bio dispersion problem and the assumption that $F_k$ has supra-unitary singular values; this avoids having to model the backward dynamics in full. For the simulation presented in this paper $\det \Pi_{k+1}$ grows without bound, so one may choose to neglect the contribution of $\Pi_{k+1}^{-1}$, resulting in $F^B_{k+1} \approx F_k^{-1}$ and the simplified backward model

$x_{b,k} = F_k^{-1}\left(x_{b,k+1} - f_k\right)$  (24)

As will be demonstrated, no inverse dynamics will be used to construct the hindcast. The three quantities $F_k$, $f_k$ and $P_{1,k}$ are assumed to have been determined and stored in the forward pass. The hindcast error and error covariance are given by:

$\tilde{x}^-_{b,k} = F_k^{-1}\left(\tilde{x}_{b,k+1} - w_k - e_{1,k}\right)$  (25)
$P^-_{b,k} = F_k^{-1}\left(P_{b,k+1} + Q_k + P_{1,k}\right)F_k^{-T}$  (26)

Given the forward-pass results, sigma points for the linearization of the measurement update may be drawn from $N(\hat{x}_{f,k}, P_{f,k})$ for better accuracy:

$z_k = H_k x_k + h_k + v_k + e_{2,k}$  (27)
$H_k = \left(P^{xz}_{f,k}\right)^T P_{f,k}^{-1}$  (28)
$h_k = \hat{z}_{f,k} - H_k \hat{x}_{f,k}$  (29)
$P_{2,k} = P^{zz}_{f,k} - H_k P_{f,k} H_k^T$  (30)

The covariance $P^{zz}_{f,k}$ in (30) does not include the effect of measurement noise and is computed in Table V.

Table V. BSPIF auxiliary quantities

$\mathcal{Z}^i_{f,k} = h(\mathcal{X}^i_{f,k})$ for all $i$
$\hat{z}_{f,k} = \sum_{i=0}^{2n} W^{(m)}_i \mathcal{Z}^i_{f,k}$
$P^{zz}_{f,k} = \sum_{i=0}^{2n} W^{(c)}_i \left(\mathcal{Z}^i_{f,k} - \hat{z}_{f,k}\right)\left(\mathcal{Z}^i_{f,k} - \hat{z}_{f,k}\right)^T$
$P^{xz}_{f,k} = \sum_{i=0}^{2n} W^{(c)}_i \left(\mathcal{X}^i_{f,k} - \hat{x}_{f,k}\right)\left(\mathcal{Z}^i_{f,k} - \hat{z}_{f,k}\right)^T$
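Using the Table V statistics, the measurement linearization (27)-(30) and the resulting information-form assimilation (which also appears in the update rows of Tables IV and VI) can be sketched as follows; the inputs are assumed to be available from the forward pass, and the function names are hypothetical.

```python
import numpy as np

def linearized_measurement_model(x_f, P_f, z_pred, Pzz_no_R, Pxz):
    """H_k, h_k and P_2k of (27)-(30), from the Table V statistics."""
    H = np.linalg.solve(P_f, Pxz).T          # H_k = Pxz^T P_f^{-1}        (28)
    h_aff = z_pred - H @ x_f                 # affine term h_k             (29)
    P2 = Pzz_no_R - H @ P_f @ H.T            # linearization error cov     (30)
    return H, h_aff, P2

def information_update(y, Y, z, H, h_aff, P2, R):
    """Information-form assimilation step shared by Tables IV and VI."""
    S_inv = np.linalg.inv(R + P2)            # (R_k + P_2k)^{-1}
    y_new = y + H.T @ S_inv @ (z - h_aff)
    Y_new = Y + H.T @ S_inv @ H
    return y_new, Y_new
```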

As with the process model, the three quantities determined for the measurement model, $H_k$, $h_k$ and $P_{2,k}$, are determined and stored in the forward pass. The filtered backward estimate after assimilating the current measurement is computed as:

$\hat{x}_{b,k} = \hat{x}^-_{b,k} + K_{b,k}\left(z_k - \hat{z}_{b,k}\right)$  (31)
$\hat{z}_{b,k} = H_k \hat{x}^-_{b,k} + h_k$  (32)

The backward gain is determined by minimizing the trace of the filtered backward error covariance. The error and covariance of the filtered backward estimate are given by:

$\tilde{x}_{b,k} = \tilde{x}^-_{b,k} - K_{b,k}\left(H_k \tilde{x}^-_{b,k} + v_k + e_{2,k}\right)$  (33)
$P_{b,k} = P^-_{b,k} + K_{b,k}\left(H_k P^-_{b,k} H_k^T + R_k + P_{2,k}\right)K_{b,k}^T - P^-_{b,k}H_k^T K_{b,k}^T - K_{b,k} H_k P^-_{b,k}$  (34)

Minimizing the trace of the filtered backward error covariance yields the backward gain

$K_{b,k} = P^-_{b,k}H_k^T\left(H_k P^-_{b,k}H_k^T + R_k + P_{2,k}\right)^{-1}$  (35)

Substituting the backward gain back into (34), the backward error covariance becomes:

$P_{b,k} = P^-_{b,k} - P^-_{b,k}H_k^T\left[H_k P^-_{b,k}H_k^T + R_k + P_{2,k}\right]^{-1}H_k P^-_{b,k}$  (36)
$P_{b,k} = \left(I - K_{b,k}H_k\right)P^-_{b,k}$  (37)

In the information form of this filter, the backward estimates and the covariance matrices are replaced by the associated information states and information matrices, given by

$\hat{y}^-_{b,k} = Y^-_{b,k}\hat{x}^-_{b,k}, \quad \hat{y}_{b,k} = Y_{b,k}\hat{x}_{b,k}$  (38)
$Y^-_{b,k} = \left(P^-_{b,k}\right)^{-1}, \quad Y_{b,k} = P_{b,k}^{-1}$  (39)

Substituting these relations into (24), (26), (31), (36), after applying the matrix inversion lemma and some algebra,

$Y^-_{b,k} = F_k^T\left(I - K_{b,k}\right)Y_{b,k+1}F_k$  (40)
$Y_{b,k} = Y^-_{b,k} + H_k^T\left(R_k + P_{2,k}\right)^{-1}H_k$  (41)
$K_{b,k} = Y_{b,k+1}\left[Y_{b,k+1} + \left(Q_k + P_{1,k}\right)^{-1}\right]^{-1}$  (42)

The hindcast estimate equation (24) becomes the backcast information estimate equation

$\hat{y}^-_{b,k} = Y^-_{b,k}F_k^{-1}\left[Y_{b,k+1}^{-1}\hat{y}_{b,k+1} - f_k\right]$  (43)

Substituting (40) into (43),

$\hat{y}^-_{b,k} = F_k^T\left(I - K_{b,k}\right)\left(\hat{y}_{b,k+1} - Y_{b,k+1}f_k\right)$  (44)

The same procedure yields the backward information estimate $\hat{y}_{b,k}$ from (31):

$\hat{y}_{b,k} = \hat{y}^-_{b,k} + Y_{b,k}K_{b,k}\left(z_k - h_k\right)$  (45)

Combining (37) and (45),

$\hat{y}_{b,k} = \hat{y}^-_{b,k} + H_k^T\left(R_k + P_{2,k}\right)^{-1}\left(z_k - h_k\right)$  (46)

The following table summarizes the BSPIF.

Table VI. Backward Sigma-Point Information Filter

Initialization:
$\hat{y}_{b,N} = 0, \quad Y_{b,N} = 0$

Backward update:
$\hat{y}_{b,k} = \hat{y}^-_{b,k} + H_k^T\left(R_k + P_{2,k}\right)^{-1}\left(z_k - h_k\right)$
$Y_{b,k} = Y^-_{b,k} + H_k^T\left(R_k + P_{2,k}\right)^{-1}H_k$

Backward propagation:
$K_{b,k} = Y_{b,k+1}\left[Y_{b,k+1} + \left(Q_k + P_{1,k}\right)^{-1}\right]^{-1}$
$\hat{y}^-_{b,k} = F_k^T\left(I - K_{b,k}\right)\left(\hat{y}_{b,k+1} - Y_{b,k+1}f_k\right)$
$Y^-_{b,k} = F_k^T\left(I - K_{b,k}\right)Y_{b,k+1}F_k$

B. Smoother Update for the Two-Filter Formulation

By seeking an unbiased smoothed estimate that is a linear combination of the forward and backward estimates, one can compute the smoothed error covariance, provided that $\Pi_{k+1}^{-1}$ is negligible:

$P_{s,k} = \left[P_{f,k}^{-1} + \left(P^-_{b,k}\right)^{-1}\right]^{-1}$  (47)

Using the matrix inversion lemma in (47) and following the derivation in [2], the smoother update may be summarized, without any modification, as in Table VII.

Table VII. Smoother update for the two-filter formulation

$K_{s,k} = P_{f,k}Y^-_{b,k}\left(I + P_{f,k}Y^-_{b,k}\right)^{-1}$
$P_{s,k} = \left(I - K_{s,k}\right)P_{f,k}$
$\hat{x}_{s,k} = \left(I - K_{s,k}\right)\hat{x}_{f,k} + P_{s,k}\hat{y}^-_{b,k}$

The entire two-filter UKS algorithm is described by the following tables, in order: I, II, III, V, VI, VII.
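A sketch of the Table VII combination step is given below; it assumes that the forward filtered moments and the backward predicted information quantities have already been produced by the UKF and the BSPIF, and it is an illustration rather than the authors' implementation.

```python
import numpy as np

def two_filter_smoother_update(x_f, P_f, y_b_minus, Y_b_minus):
    """Combine the forward filtered estimate with the backward predicted
    information state, per Table VII."""
    n = x_f.size
    I = np.eye(n)
    K_s = P_f @ Y_b_minus @ np.linalg.inv(I + P_f @ Y_b_minus)   # smoother gain
    P_s = (I - K_s) @ P_f                                        # smoothed covariance
    x_s = (I - K_s) @ x_f + P_s @ y_b_minus                      # smoothed estimate
    return x_s, P_s
```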

IV. UNSCENTED KALMAN SMOOTHER - RTS FORM

A more computationally efficient form for fixed-interval smoothing in the linear case was introduced by Rauch, Tung and Striebel [15]. The RTS smoother combines the backward filter and the smoother update into a single-step correction to the forward filter estimate, based on the information gained in the forward pass.

The following assumptions are made about the two linearized versions of the nonlinear measurement function, in order to link the forward and the backward information filters (the superscripts $-$ and $+$ denote linearization about the forecast and the updated state, respectively):

$H_k^- \approx H_k^+ = H_k$  (48)
$h_k^- \approx h_k^+ = h_k$  (49)
$P_{2,k}^- \approx P_{2,k}^+ = P_{2,k}$  (50)

The errors in these approximations are small as long as the differences between the linearizations obtained at the forecast state and at the updated state after the observation are small. This assumption can be violated in strongly nonlinear regions of certain process models, such as bifurcation manifolds, but for the present derivation sufficient linearization smoothness is assumed to justify these approximations.

Applying the matrix inversion lemma to (47), the smoothed error covariance may be rewritten as:

$P_{s,k} = P_{f,k} - P_{f,k}\left(P^-_{b,k} + P_{f,k}\right)^{-1}P_{f,k}$  (51)

Using (26), (18) and the expression for $P^-_{f,k+1}$ from Table II, the inverse of the sum becomes:

$\left(P^-_{b,k} + P_{f,k}\right)^{-1} = F_k^T\left(P^-_{f,k+1} + P_{b,k+1}\right)^{-1}F_k$  (52)

The only backward-information reference in the above equation is the backward error covariance, which may be expressed in terms of the smoothed and the forecast error covariances:

$P_{b,k+1}^{-1} = P_{s,k+1}^{-1} - Y^-_{f,k+1}$  (53)

Substituting (53) into (52), factoring out $Y^-_{f,k+1}$ and applying the matrix inversion lemma:

$\left(P^-_{b,k} + P_{f,k}\right)^{-1} = F_k^T Y^-_{f,k+1}\left(P^-_{f,k+1} - P_{s,k+1}\right)Y^-_{f,k+1}F_k$  (54)

Substituting (54) back into (51) and further using (16):

$P_{s,k} = P_{f,k} - K_{s,k}\left(P^-_{f,k+1} - P_{s,k+1}\right)K_{s,k}^T$  (55)
$K_{s,k} = P^{x_f x_{f+1}}_k\left(P^-_{f,k+1}\right)^{-1}$  (56)

The same steps may be used to eliminate the backward information from the smoothed estimate:

$\hat{x}_{s,k} = P_{s,k}\hat{y}_{f,k} + P_{s,k}\hat{y}^-_{b,k}$  (57)

Substituting (43) into (57):

$\hat{x}_{s,k} = P_{s,k}\hat{y}_{f,k} + P_{s,k}Y^-_{b,k}F_k^{-1}\left[Y_{b,k+1}^{-1}\hat{y}_{b,k+1} - f_k\right]$  (58)

Incrementing the time step in (57) and writing $\hat{y}^-_{b,k+1}$ in terms of $\hat{y}_{b,k+1}$ by using (46):

$\hat{x}_{s,k+1} = P_{s,k+1}\hat{y}_{f,k+1} + P_{s,k+1}\left[\hat{y}_{b,k+1} - H_{k+1}^T\left(R_{k+1} + P_{2,k+1}\right)^{-1}\left(z_{k+1} - h_{k+1}\right)\right]$  (59)

Expressing $\hat{y}_{b,k+1}$ in terms of $\hat{x}_{s,k+1}$ and using the update equation for $\hat{y}_{f,k+1}$ from Table IV:

$\hat{y}_{b,k+1} = P_{s,k+1}^{-1}\hat{x}_{s,k+1} - \hat{y}^-_{f,k+1}$  (60)

Using (60), (58) and (47),

$\hat{x}_{s,k} = P_{s,k}\hat{y}_{f,k} + P_{s,k}Y^-_{b,k}F_k^{-1}\left[Y_{b,k+1}^{-1}\left(P_{s,k+1}^{-1}\hat{x}_{s,k+1} - \hat{y}^-_{f,k+1}\right) - f_k\right]$  (61)

Substituting (47) for $Y^-_{b,k}$ and (53) for $Y_{b,k+1}$, equation (61) becomes:

$\hat{x}_{s,k} = P_{s,k}\hat{y}_{f,k} + \Phi_k\left\{P^-_{f,k+1}\left(P^-_{f,k+1} - P_{s,k+1}\right)^{-1}P_{s,k+1}\left[P_{s,k+1}^{-1}\hat{x}_{s,k+1} - \hat{y}^-_{f,k+1}\right] - f_k\right\}$  (62)
$\Phi_k = K_{s,k}\left(P^-_{f,k+1} - P_{s,k+1}\right)\left(P^-_{f,k+1}\right)^{-1}$  (63)

Substituting (63) back into (62) and expanding $P_{s,k}$ in the first term of the sum, after some algebra,

$\hat{x}_{s,k} = \hat{x}_{f,k} + K_{s,k}\left(\hat{x}_{s,k+1} - \hat{x}^-_{f,k+1}\right)$  (64)

The following table summarizes the RTS correction to the Unscented Kalman Filter of Table II.

Table VIII. Smoother update - RTS form

$K_{s,k} = P^{x_f x_{f+1}}_k\left(P^-_{f,k+1}\right)^{-1}$
$\hat{x}_{s,k} = \hat{x}_{f,k} + K_{s,k}\left(\hat{x}_{s,k+1} - \hat{x}^-_{f,k+1}\right)$
$P_{s,k} = P_{f,k} - K_{s,k}\left(P^-_{f,k+1} - P_{s,k+1}\right)K_{s,k}^T$

The entire RTS UKS algorithm is described by the following tables, in order: I, II, III, VIII.
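A sketch of the backward RTS recursion of Table VIII follows. It assumes that the forward pass has stored, for each time step, the filtered moments, the one-step forecast moments, and the Table III cross covariance; the container names are hypothetical and the code is illustrative rather than the square-root implementation tested in Section V.

```python
import numpy as np

def rts_smoother(x_filt, P_filt, x_pred, P_pred, P_cross):
    """Unscented RTS backward pass (Table VIII).

    x_filt[k], P_filt[k] : filtered mean/covariance at step k
    x_pred[k], P_pred[k] : forecast mean/covariance of step k+1 made at k
    P_cross[k]           : Table III cross covariance between x_k and the forecast of x_{k+1}
    """
    N = len(x_filt)
    x_s = [None] * N
    P_s = [None] * N
    x_s[-1], P_s[-1] = x_filt[-1], P_filt[-1]         # smoother starts from the filter solution
    for k in range(N - 2, -1, -1):
        K = P_cross[k] @ np.linalg.inv(P_pred[k])     # smoother gain (56)
        x_s[k] = x_filt[k] + K @ (x_s[k + 1] - x_pred[k])            # (64)
        P_s[k] = P_filt[k] - K @ (P_pred[k] - P_s[k + 1]) @ K.T      # (55)
    return x_s, P_s
```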

V. NUMERICAL RESULTS

Note that in order to compute each new set of sigma points we need the square root of the posterior covariance matrix, $P_{f,k} = S_{f,k}S_{f,k}^T$. Since the update is applied to the full posterior covariance, the algorithm can be reframed to propagate the square-root matrix $S_{f,k}$ directly. A square-root version of the UKF, based on Cholesky factorization, Cholesky rank-1 updates and QR decomposition, is provided in [19]. Along the same lines a square-root version of the RTS smoother may be formulated. Since QR decomposition and Cholesky factorization tend to control round-off errors better, and no matrix inversions are required, the SR-UKF has better numerical properties than the UKF and it also guarantees positive semi-definiteness of the error covariance.

The performance of the Square-Root Unscented Kalman Filter/Smoother is tested on a representative atmospheric puff-based dispersion model, based on the Risø Mesoscale PUFF model RIMPUFF [13], and compared with the Particle Filter data assimilation technique reported by Reddy et al. [16]. RIMPUFF is a Lagrangian mesoscale dispersion model that predicts the distribution of airborne material over time and space. The amount of material released is represented as a series of Gaussian-shaped puffs. Due to the simplicity of the model and the lack of knowledge about the initial conditions (the release source), the model forecast tends to become less accurate for longer simulations. Thus a data assimilation scheme is required to correct the model predictions by use of observed sensor data. The dynamics of the $i$th puff are given by (65)-(69):

$X^i_{k+1} = X^i_k + u_{X^i_k}\,\Delta T$  (65)
$Y^i_{k+1} = Y^i_k + u_{Y^i_k}\,\Delta T$  (66)
$\sigma^i_{xy,k+1} = p_y\left(x^i_{k+1}\right)^{q_y}$  (67)
$x^i_{k+1} = x^i_k + \sqrt{u^2_{X^i_k} + u^2_{Y^i_k}}\,\Delta T$  (68)
$Q^i_{k+1} = Q^i_k$  (69)

where $(X^i_k, Y^i_k)$ is the position of the center of the puff, $\sigma^i_{xy,k+1}$ and $Q^i_{k+1}$ are the puff spread and the puff mass, and $x^i_{k+1}$ is the downwind distance from the source; $u_k = (u_{X^i_k}, u_{Y^i_k})$ is the wind vector at the puff center, and $p_y$, $q_y$ are Karlsruhe-Jülich diffusion coefficients which depend on the weather conditions [16].

To verify the results for the SR-UKF and SR-UKS, a truth-model simulation has been carried out using a 2D RIMPUFF model on a dispersion region with 200 m grid resolution and sensors located every 1 km. The total simulation time is 3600 sec and the sampling interval is 20 sec. The release occurs every three time steps for the first 900 sec, and the wind direction changes after half an hour from 15° to 60°. The wind speed is 5 m/sec with 20% standard deviation [16]. Given this parametrization, the concentration dosage yielded by the truth model after one hour is illustrated in Figure 1.

Figure 1. Concentration dosage after one hour.

The measurements are obtained from the generated truth data using a continuous measurement model: the concentration at a grid point is computed by summing the contributions of all the puffs [13], [16]. The concentration at a grid point $(x_g, y_g)$ is given by (70):

$c(x_g, y_g) = \sum_{i=1}^{N} \frac{Q^i_k}{2\pi\left(\sigma^i_{xy,k}\right)^2}\exp\left[-\frac{1}{2}\,\frac{\left(X^i_k - x_g\right)^2 + \left(Y^i_k - y_g\right)^2}{\left(\sigma^i_{xy,k}\right)^2}\right]$  (70)

where $N$ is the number of puffs. The difficulty of the problem is significantly increased by the five sources of uncertainty used: the duration of the chemical release from the source, the initial puff parameters, uncertain parameters (wind fluctuations), modeling deficiencies (puff splitting) and observation uncertainty. The model evolves with variable dimension, caused by the accumulation of source puffs and by puff splitting. The results presented in Figure 2 have been Monte-Carlo averaged over 50 runs.

Figure 2. RMS error comparison (Model, PF, UKF, UKS) versus time [min].

The forecast error covariance was increased to maintain positive-definiteness when puff splitting occurred, a heuristic used in computing the smoothed gain. Comparing the SRUK Filter and the Particle Filter, their performance is broadly similar, with the Filter performing better over a few time intervals. Since no systematic effort was made to tune the parameters of either algorithm, no clear conclusion can be drawn from this observed performance difference, except that neither scheme demonstrated a clear advantage in this test environment. Computationally, the Particle Filter has an increasing advantage over time, as it does not suffer from the growth in the number of sigma points that the sigma-point schemes incur as the state dimension increases. Comparing the SRUK Filter and Smoother performance in Figure 2, it is noted that the sharp error peaks occasionally experienced by the Filter are significantly reduced by the Smoother.
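As a concrete illustration of the puff dynamics (65)-(69) and the measurement model (70) used in this test scenario, a minimal sketch is given below; the diffusion coefficients and wind values are placeholder assumptions, not the Karlsruhe-Jülich values actually used in the simulation.

```python
import numpy as np

def propagate_puff(X, Y, x_dw, Q, u, dT, p_y=0.5, q_y=0.9):
    """One step of the puff dynamics (65)-(69); p_y, q_y are assumed values."""
    uX, uY = u
    X1 = X + uX * dT                          # puff center, x-coordinate (65)
    Y1 = Y + uY * dT                          # puff center, y-coordinate (66)
    x_dw1 = x_dw + np.hypot(uX, uY) * dT      # downwind distance (68)
    sigma_xy1 = p_y * x_dw1 ** q_y            # puff spread (67)
    return X1, Y1, sigma_xy1, x_dw1, Q        # puff mass is conserved (69)

def concentration(xg, yg, puffs):
    """Summed Gaussian-puff concentration at a grid point, per (70).

    puffs: iterable of (X, Y, sigma_xy, Q) tuples.
    """
    c = 0.0
    for X, Y, sigma, Q in puffs:
        r2 = (X - xg) ** 2 + (Y - yg) ** 2
        c += Q / (2.0 * np.pi * sigma ** 2) * np.exp(-0.5 * r2 / sigma ** 2)
    return c
```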
It is characteristic of statistical estimation that sudden excursions into difficult-to-estimate state domains are better tolerated when they can be smoothed by data acquired after a given estimation time as well as before. This suggests that, in the CBRN response setting, smoothing may be worth the computational and decision-latency costs when abrupt transitions in state space are both mission-relevant and possible.

Gridwise (Figure 3), both the number of grid points at which the Smoother performs better and the magnitude of the difference between the absolute error of the Filter and that of the Smoother suggest that the Smoother is better suited for situation assessment or alarm triggering.

Figure 3. Surface plot of the error difference.

VI. CONCLUSION

The fixed-interval Sigma-Point Kalman Smoother has been studied. Two solutions to this nonlinear smoothing problem are re-derived using weighted statistical linearization: the forward-backward two-filter formulation and the RTS formulation. The Backward Sigma-Point Information Filter is derived under the assumption that the propagated initial covariance of the state vector becomes large very quickly, making its inverse negligible. The derivation of the RTS smoother provides a better understanding of the difference between the two-filter formulation and the RTS form in the context of nonlinear estimation. Due to the measurement-mapping linearization assumption, and by rewriting the backward information in the two-filter formulation in terms of the smoothed and the forward information, we obtain the same result as Psiaki [14] for the RTS form of the UKS.

The SR-UKS in RTS form has been tested on a puff-based dispersion model and its performance compared with the PF and the SRUK Filter. The Smoother avoids the sudden large errors experienced by the SRUK Filter and, gridwise, provides better estimates overall. The PF has an increasing computational advantage over time compared to sigma-point estimators, due to its dimensional invariance as the number of puffs increases. Based on these pilot studies, both the PF and the SRUK Smoother merit consideration and further investigation in the important CBRN data assimilation application domain.

ACKNOWLEDGMENT

This work was supported by the Defense Threat Reduction Agency (DTRA) under Contract No. W911NF-06-C. The authors gratefully acknowledge the support and constructive suggestions of Dr. John Hannan of DTRA.

REFERENCES

[1] M. Briers, A. Doucet, and S. Maskell. Smoothing Algorithms for State-Space Models. Submitted to IEEE Transactions on Signal Processing.
[2] J. Crassidis and J. Junkins. Optimal Estimation of Dynamic Systems. CRC Press.
[3] G. Evensen. The Ensemble Kalman Filter: Theoretical Formulation and Practical Implementation. Ocean Dynamics, 53.
[4] D. Fraser and J. Potter. The Optimum Linear Smoother as a Combination of Two Optimum Linear Filters. IEEE Transactions on Automatic Control, 14(4), Aug.
[5] A. Gelb. Applied Optimal Estimation. M.I.T. Press.
[6] K.P. Georgakakos and R.L. Bras. A Statistical Linearization Approach to Real Time Nonlinear Flood Routing. Technical report, Water Resources and Hydrodynamics, Massachusetts Institute of Technology.
[7] S.J. Julier. The Scaled Unscented Transformation. In American Control Conference, volume 6.
[8] S.J. Julier and J.K. Uhlmann. Unscented Filtering and Nonlinear Estimation. Proceedings of the IEEE, 92(3).
[9] T. Kailath, A.H. Sayed, and B. Hassibi. Linear Estimation. Prentice Hall.
[10] R.E. Kalman. A New Approach to Linear Filtering and Prediction Problems. Transactions of the ASME - Journal of Basic Engineering, 82 (Series D):35-45.
[11] T. Lefebvre and H. Bruyninckx. Kalman Filters for Nonlinear Systems: A Comparison of Performance. Technical report, Department of Mechanical Engineering, Katholieke Universiteit Leuven, Belgium.
[12] X. Mao, M. Wada, and H. Hashimoto. Nonlinear Iterative Algorithm for GPS Positioning with Bias Model. In The 7th International IEEE Conference on Intelligent Transportation Systems.
[13] S.T. Nielsen, S. Deme, and T. Mikkelsen. Description of the Atmospheric Dispersion Module RIMPUFF. Technical Report RODOS WG2-TN98-02, Risø National Laboratory, Roskilde, Denmark.
[14] M. Psiaki and M. Wada. Derivation and Simulation Testing of a Sigma-Points Smoother. Journal of Guidance, Control, and Dynamics, 30:78-86.
[15] H.E. Rauch, F. Tung, and C.T. Striebel. Maximum Likelihood Estimates of Linear Dynamic Systems. Journal of the American Institute of Aeronautics and Astronautics, 3(8).
[16] K.V.U. Reddy, Y. Cheng, T. Singh, and P.D. Scott. Data Assimilation in Variable Dimension Dispersion Models using Particle Filters. In review at The 10th International Conference on Information Fusion, Quebec City, Canada.
[17] B. Ristic, S. Arulampalam, and N. Gordon. Beyond the Kalman Filter: Particle Filters for Tracking Applications. Artech House.
[18] R. van der Merwe. Sigma-Point Kalman Filters for Probabilistic Inference in Dynamic State-Space Models. Technical report, Workshop on Advances in Machine Learning, Montreal.
[19] R. van der Merwe and E. Wan. The Square-Root Unscented Kalman Filter for State and Parameter Estimation. In International Conference on Acoustics, Speech, and Signal Processing (ICASSP), Salt Lake City, Utah.
[20] T. Vercauteren and X. Wang. Decentralized Sigma-Point Information Filters for Target Tracking in Collaborative Sensor Networks. IEEE Transactions on Signal Processing, 53.
[21] E. Wan and R. van der Merwe. The Unscented Kalman Filter. Wiley Publishing.
[22] E. Wan, R. van der Merwe, and A. Nelson. Dual Estimation and the Unscented Transformation. In Advances in Neural Information Processing Systems 12. MIT Press.
[23] O. Zoeter, A. Ypma, and T. Heskes. Improved Unscented Kalman Smoothing for Stock Volatility Estimation. In IEEE Workshop on Machine Learning for Signal Processing, 2004.


Kalman filtering and friends: Inference in time series models. Herke van Hoof slides mostly by Michael Rubinstein Kalman filtering and friends: Inference in time series models Herke van Hoof slides mostly by Michael Rubinstein Problem overview Goal Estimate most probable state at time k using measurement up to time

More information

Markov Chain Monte Carlo Methods for Stochastic Optimization

Markov Chain Monte Carlo Methods for Stochastic Optimization Markov Chain Monte Carlo Methods for Stochastic Optimization John R. Birge The University of Chicago Booth School of Business Joint work with Nicholas Polson, Chicago Booth. JRBirge U of Toronto, MIE,

More information

Particle Filtering Approaches for Dynamic Stochastic Optimization

Particle Filtering Approaches for Dynamic Stochastic Optimization Particle Filtering Approaches for Dynamic Stochastic Optimization John R. Birge The University of Chicago Booth School of Business Joint work with Nicholas Polson, Chicago Booth. JRBirge I-Sim Workshop,

More information

Linear Dynamical Systems

Linear Dynamical Systems Linear Dynamical Systems Sargur N. srihari@cedar.buffalo.edu Machine Learning Course: http://www.cedar.buffalo.edu/~srihari/cse574/index.html Two Models Described by Same Graph Latent variables Observations

More information

Wind-field Reconstruction Using Flight Data

Wind-field Reconstruction Using Flight Data 28 American Control Conference Westin Seattle Hotel, Seattle, Washington, USA June 11-13, 28 WeC18.4 Wind-field Reconstruction Using Flight Data Harish J. Palanthandalam-Madapusi, Anouck Girard, and Dennis

More information

Cardinality Balanced Multi-Target Multi-Bernoulli Filtering Using Adaptive Birth Distributions

Cardinality Balanced Multi-Target Multi-Bernoulli Filtering Using Adaptive Birth Distributions Cardinality Balanced Multi-Target Multi-Bernoulli Filtering Using Adaptive Birth Distributions Stephan Reuter, Daniel Meissner, Benjamin Wiling, and Klaus Dietmayer Institute of Measurement, Control, and

More information

DETECTING PROCESS STATE CHANGES BY NONLINEAR BLIND SOURCE SEPARATION. Alexandre Iline, Harri Valpola and Erkki Oja

DETECTING PROCESS STATE CHANGES BY NONLINEAR BLIND SOURCE SEPARATION. Alexandre Iline, Harri Valpola and Erkki Oja DETECTING PROCESS STATE CHANGES BY NONLINEAR BLIND SOURCE SEPARATION Alexandre Iline, Harri Valpola and Erkki Oja Laboratory of Computer and Information Science Helsinki University of Technology P.O.Box

More information

L06. LINEAR KALMAN FILTERS. NA568 Mobile Robotics: Methods & Algorithms

L06. LINEAR KALMAN FILTERS. NA568 Mobile Robotics: Methods & Algorithms L06. LINEAR KALMAN FILTERS NA568 Mobile Robotics: Methods & Algorithms 2 PS2 is out! Landmark-based Localization: EKF, UKF, PF Today s Lecture Minimum Mean Square Error (MMSE) Linear Kalman Filter Gaussian

More information

Perception: objects in the environment

Perception: objects in the environment Zsolt Vizi, Ph.D. 2018 Self-driving cars Sensor fusion: one categorization Type 1: low-level/raw data fusion combining several sources of raw data to produce new data that is expected to be more informative

More information

A Gaussian Mixture PHD Filter for Nonlinear Jump Markov Models

A Gaussian Mixture PHD Filter for Nonlinear Jump Markov Models A Gaussian Mixture PHD Filter for Nonlinear Jump Marov Models Ba-Ngu Vo Ahmed Pasha Hoang Duong Tuan Department of Electrical and Electronic Engineering The University of Melbourne Parville VIC 35 Australia

More information

Quadratic Extended Filtering in Nonlinear Systems with Uncertain Observations

Quadratic Extended Filtering in Nonlinear Systems with Uncertain Observations Applied Mathematical Sciences, Vol. 8, 2014, no. 4, 157-172 HIKARI Ltd, www.m-hiari.com http://dx.doi.org/10.12988/ams.2014.311636 Quadratic Extended Filtering in Nonlinear Systems with Uncertain Observations

More information

Gaussian Processes. Le Song. Machine Learning II: Advanced Topics CSE 8803ML, Spring 2012

Gaussian Processes. Le Song. Machine Learning II: Advanced Topics CSE 8803ML, Spring 2012 Gaussian Processes Le Song Machine Learning II: Advanced Topics CSE 8803ML, Spring 01 Pictorial view of embedding distribution Transform the entire distribution to expected features Feature space Feature

More information

A Comparison of Nonlinear Kalman Filtering Applied to Feed forward Neural Networks as Learning Algorithms

A Comparison of Nonlinear Kalman Filtering Applied to Feed forward Neural Networks as Learning Algorithms A Comparison of Nonlinear Kalman Filtering Applied to Feed forward Neural Networs as Learning Algorithms Wieslaw Pietrusziewicz SDART Ltd One Central Par, Northampton Road Manchester M40 5WW, United Kindgom

More information

Efficient Monitoring for Planetary Rovers

Efficient Monitoring for Planetary Rovers International Symposium on Artificial Intelligence and Robotics in Space (isairas), May, 2003 Efficient Monitoring for Planetary Rovers Vandi Verma vandi@ri.cmu.edu Geoff Gordon ggordon@cs.cmu.edu Carnegie

More information

A STATE ESTIMATOR FOR NONLINEAR STOCHASTIC SYSTEMS BASED ON DIRAC MIXTURE APPROXIMATIONS

A STATE ESTIMATOR FOR NONLINEAR STOCHASTIC SYSTEMS BASED ON DIRAC MIXTURE APPROXIMATIONS A STATE ESTIMATOR FOR NONINEAR STOCHASTIC SYSTEMS BASED ON DIRAC MIXTURE APPROXIMATIONS Oliver C. Schrempf, Uwe D. Hanebec Intelligent Sensor-Actuator-Systems aboratory, Universität Karlsruhe (TH), Germany

More information

A New Nonlinear State Estimator Using the Fusion of Multiple Extended Kalman Filters

A New Nonlinear State Estimator Using the Fusion of Multiple Extended Kalman Filters 18th International Conference on Information Fusion Washington, DC - July 6-9, 2015 A New Nonlinear State Estimator Using the Fusion of Multiple Extended Kalman Filters Zhansheng Duan, Xiaoyun Li Center

More information

REAL-TIME ATTITUDE-INDEPENDENT THREE-AXIS MAGNETOMETER CALIBRATION

REAL-TIME ATTITUDE-INDEPENDENT THREE-AXIS MAGNETOMETER CALIBRATION REAL-TIME ATTITUDE-INDEPENDENT THREE-AXIS MAGNETOMETER CALIBRATION John L. Crassidis and Ko-Lam Lai Department of Mechanical & Aerospace Engineering University at Buffalo, State University of New Yor Amherst,

More information

Data assimilation with and without a model

Data assimilation with and without a model Data assimilation with and without a model Tyrus Berry George Mason University NJIT Feb. 28, 2017 Postdoc supported by NSF This work is in collaboration with: Tim Sauer, GMU Franz Hamilton, Postdoc, NCSU

More information

Adaptive ensemble Kalman filtering of nonlinear systems. Tyrus Berry and Timothy Sauer George Mason University, Fairfax, VA 22030

Adaptive ensemble Kalman filtering of nonlinear systems. Tyrus Berry and Timothy Sauer George Mason University, Fairfax, VA 22030 Generated using V3.2 of the official AMS LATEX template journal page layout FOR AUTHOR USE ONLY, NOT FOR SUBMISSION! Adaptive ensemble Kalman filtering of nonlinear systems Tyrus Berry and Timothy Sauer

More information

Basic Concepts in Data Reconciliation. Chapter 6: Steady-State Data Reconciliation with Model Uncertainties

Basic Concepts in Data Reconciliation. Chapter 6: Steady-State Data Reconciliation with Model Uncertainties Chapter 6: Steady-State Data with Model Uncertainties CHAPTER 6 Steady-State Data with Model Uncertainties 6.1 Models with Uncertainties In the previous chapters, the models employed in the DR were considered

More information

Data assimilation with and without a model

Data assimilation with and without a model Data assimilation with and without a model Tim Sauer George Mason University Parameter estimation and UQ U. Pittsburgh Mar. 5, 2017 Partially supported by NSF Most of this work is due to: Tyrus Berry,

More information

Expectation propagation for signal detection in flat-fading channels

Expectation propagation for signal detection in flat-fading channels Expectation propagation for signal detection in flat-fading channels Yuan Qi MIT Media Lab Cambridge, MA, 02139 USA yuanqi@media.mit.edu Thomas Minka CMU Statistics Department Pittsburgh, PA 15213 USA

More information

Unbiased minimum variance estimation for systems with unknown exogenous inputs

Unbiased minimum variance estimation for systems with unknown exogenous inputs Unbiased minimum variance estimation for systems with unknown exogenous inputs Mohamed Darouach, Michel Zasadzinski To cite this version: Mohamed Darouach, Michel Zasadzinski. Unbiased minimum variance

More information