Non-Gaussian Estimation and Observer-Based Feedback using the Gaussian Mixture Kalman and Extended Kalman Filters


Debdipta Goswami 1 and Derek A. Paley 2

Abstract: This paper considers the problem of non-Gaussian estimation and observer-based feedback in linear and nonlinear settings. Estimation in nonlinear systems with non-Gaussian process-noise statistics is important for applications in atmospheric and oceanic sampling. Non-Gaussian filtering is, however, largely problem specific and mostly sub-optimal. This manuscript uses a Gaussian Mixture Model (GMM) to characterize the prior non-Gaussian distribution, and applies the Kalman filter update to estimate the state with uncertainty. The boundedness of the error in both the linear and nonlinear cases is analytically justified under various assumptions, and the resulting estimate is used for feedback control. To apply the GMM in nonlinear settings, we utilize a common extension of the Kalman filter: the Extended Kalman Filter (EKF). The theoretical results are illustrated by numerical simulations.

I. INTRODUCTION

Knowledge of the state variables of a dynamical system is the key component for designing state-feedback control laws that can practically stabilize a plant. In actual systems, however, lack of access to the state variables motivates observer-based feedback control, where the current state is estimated from a set of observables. The situation worsens when the system dynamics and/or the observations are corrupted by noise, and the initial state estimate may only be given in terms of an a priori probability density. Hence the need arises for a filter for state estimation. For a linear system with linear observations and Gaussian process and sensor noise, the optimal estimator is given by the celebrated Kalman filter [1]. But for a nonlinear system, or a system with non-Gaussian noise, moment propagation requires an infinite number of parameters and an optimal filter is difficult to formulate.
As a compromise, several sub-optimal solutions have been developed, e.g., the Extended Kalman Filter (EKF) and the Unscented Kalman Filter (UKF), which rely on a finite number of parameters, thus sacrificing some of the system's details. However, these filters still rely on the Gaussian process-noise model, which may not hold in a nonlinear system. The recursive Bayesian filter is the general representation of an optimal nonlinear filter that converts the prior (forecast) state probability density function (PDF) into the posterior (analysis) state PDF using the likelihood of the observation. However, in many cases the prior PDF is not explicitly known and therefore has to be estimated using an ensemble realization. For such cases, recursive Bayesian estimation becomes computationally difficult and, therefore, a semiparametric model must be realized. Alspach and Sorenson [2] have shown that any probability density function can be approximated arbitrarily closely from an ensemble realization using a weighted sum of Gaussian PDFs. The so-called Gaussian Mixture Model (GMM) gives an approximate way to explicitly calculate the posterior density of the states of a stochastic system, even in the nonlinear, non-Gaussian case. There are several approaches for time and measurement updates using a GMM, as shown in the GMM-DO filter of Sondergaard and Lermusiaux [3]. The GMM, equipped with Monte Carlo data-fitting using the Expectation-Maximization (EM) algorithm [4] and the Bayesian Information Criterion (BIC) [5], provides an accurate estimate of the prior PDF.

1 Debdipta Goswami is a graduate student in the Department of Electrical and Computer Engineering and the Institute for Systems Research, University of Maryland, College Park, MD 20742. goswamid@umd.edu
2 Derek A. Paley is the Willis H. Young Jr. Associate Professor of Aerospace Engineering Education in the Department of Aerospace Engineering and the Institute for Systems Research, University of Maryland, College Park, MD 20742. dpaley@umd.edu
The GMM represents the PDF using a weighted sum of Gaussian PDFs, each of which can be updated using an individual Kalman filter if the measurement model is linear and the measurement-noise profile is Gaussian. Data assimilation, such as with atmospheric or oceanic sampling vehicles, often uses a linear (or linearized) observation model, and the measurement noise is assumed to be additive Gaussian [3]. With these assumptions, the GMM along with Kalman filter updates (i.e., GMM-KF) is sufficient to estimate the state. For nonlinear system dynamics, the Kalman update step may be replaced by an Extended Kalman Filter (EKF), which gives the prediction up to first-order precision [6]. The state estimate thus obtained may be used for feedback stabilization. A block diagram of such an observer-based feedback system is shown in Fig. 1. In this model, x_n and y_n are the state and observation, respectively, at the nth time step. The process is inherently noisy with an additive process noise W_n, and the observation is corrupted by the measurement noise V_n.

Fig. 1: Block diagram of a discrete-time closed-loop observer-based feedback control system, with plant x_{n+1} = f(x_n, u_n) + W_n, observation y_n = C x_n + V_n, Gaussian Mixture Model observer x̂_n = O_GMM(y_n), and control u_n = K e_n.

The state estimate x̂_n is

obtained using the GMM-KF observer (denoted O_GMM), and this estimate is used to derive the feedback control signal u_n = K(r_n − x̂_n). The contributions of this work are (1) a theoretical guarantee of boundedness of the estimation error while using GMM-KF in a linear system with non-Gaussian noise; (2) ultimate boundedness of the norm of the state in a linear system with observer-based feedback; (3) extension of the error-bound analysis to a nonlinear system with non-Gaussian process noise; and (4) proof of the ultimate boundedness of the state in a nonlinear system using Lyapunov's method for observer-based feedback with GMM-EKF.

The manuscript is organized as follows. Section II provides a brief overview of the Gaussian Mixture Model. Section III presents the analytical as well as numerical results for linear settings. Section IV extends the results to nonlinear systems using GMM-EKF. Section V summarizes the manuscript and discusses possible future work.

II. GAUSSIAN MIXTURE MODEL

The GMM-KF works by representing the probability density function (PDF) using a Gaussian Mixture Model (GMM). Let w^j, j = 1, ..., M, be scalar weights such that Σ_{j=1}^M w^j = 1. Let x̄^j and P^j be the mean vector and covariance matrix, respectively, of a multivariate Gaussian N(x; x̄^j, P^j), j = 1, ..., M. The weighted sum of the M Gaussian densities [3]

p_X(x; {w^j, x̄^j, P^j}_{j=1}^M) = Σ_{j=1}^M w^j N(x; x̄^j, P^j)   (1)

is a valid probability density function (PDF) that integrates to unity and has an analytical representation. Through the selection of the weights, means, covariances, and number of mixture components, (1) can represent even highly non-Gaussian distributions. Traditional ensemble/particle-based methods represent a PDF using the sparse support of ensemble members (i.e., a Monte Carlo sampling of realizations) [7]. This representation enables nonlinear propagation of the uncertainty in the forecast step of the filter. Unfortunately, many particle filters suffer from degeneracy issues due to the sparsity of the PDF representation [7].
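The mixture density in Eq. (1) is straightforward to evaluate numerically. The following sketch (the weights, means, and covariances are illustrative values chosen here for demonstration, not taken from the paper) evaluates a bimodal two-component mixture on a grid and checks that it integrates to approximately one:

```python
import numpy as np

def gmm_pdf(x, weights, means, covs):
    """Evaluate the Gaussian-mixture density of Eq. (1) at the points x.

    x: (N, d) array of evaluation points; weights must sum to one;
    means: list of (d,) vectors; covs: list of (d, d) SPD matrices.
    """
    x = np.atleast_2d(x)
    d = x.shape[1]
    p = np.zeros(x.shape[0])
    for w, m, P in zip(weights, means, covs):
        diff = x - m
        Pinv = np.linalg.inv(P)
        norm = np.sqrt((2 * np.pi) ** d * np.linalg.det(P))
        # Quadratic form diff^T Pinv diff for every row of diff at once.
        expo = -0.5 * np.einsum("ni,ij,nj->n", diff, Pinv, diff)
        p += w * np.exp(expo) / norm
    return p

# A bimodal (hence non-Gaussian) mixture in one dimension.
weights = [0.3, 0.7]
means = [np.array([-2.0]), np.array([3.0])]
covs = [np.eye(1), 0.5 * np.eye(1)]

grid = np.linspace(-10.0, 10.0, 2001)[:, None]
density = gmm_pdf(grid, weights, means, covs)
mass = density.sum() * (grid[1, 0] - grid[0, 0])  # Riemann-sum check, close to 1
```

Because the two means are well separated, the resulting density is bimodal, which no single Gaussian can represent; this is the sense in which (1) captures highly non-Gaussian distributions.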
Kernel-based approaches address this issue by periodically creating a density estimate from the ensemble sample to maintain the full support of the state space and to facilitate resampling [7], [8], [9]. Unfortunately, such approaches require the arbitrary choice of fitting parameters such as the kernel bandwidth [3]. For Gaussian mixtures, given a specific choice of mixture complexity, an Expectation-Maximization algorithm automatically selects the weights, means, and covariances of the Gaussians to best fit the ensemble [10]. A key contribution of [3] is the use of the Bayesian Information Criterion (BIC) for the automatic selection of the mixture complexity. Let x_i be the ith sample in an ensemble realization. The BIC may be (approximately) expressed as [3]

BIC = −2 Σ_{i=1}^N log p(x_i | Ω_ML; M) + K log N,   (2)

where K is the number of parameters in the model, Ω_ML is the maximum-likelihood set of parameters (produced by the EM algorithm), and N is the number of ensemble members. For a multivariate Gaussian mixture where d is the dimension of the state vector, K = M(2d + d(d − 1)/2 + 1) is the number of free parameters [3]. The BIC has two components: the first evaluates the goodness-of-fit of the model of complexity M, and the second is a penalty on the overall model complexity [3]. By sequentially evaluating models of increasing complexity, one may identify a local minimum in the BIC. One seeks the best fit of a mixture of Gaussians to the data; the model-complexity term in the BIC ensures that a simpler model is preferred [3]. For many data-assimilation applications, the observation operator C linearly extracts the measurement from the state vector, i.e.,

y_n = C x_n + V_n with V_n ∼ N(0, K_V),   (3)

where V_n is zero-mean measurement noise with covariance K_V. For a (single) Gaussian forecast PDF, Gaussian measurement noise, and a linear observation operator, the Kalman-analysis equations represent the optimal approach to Bayesian assimilation of a measurement.
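The model-selection rule of Eq. (2) amounts to computing the BIC for candidate complexities M and keeping the minimizer. A minimal sketch, where the log-likelihood values are hypothetical stand-ins for what the EM algorithm would produce on a given ensemble:

```python
import numpy as np

def num_free_params(M, d):
    # K = M(2d + d(d-1)/2 + 1): free parameters of an M-component
    # Gaussian mixture in d dimensions, as used in Eq. (2).
    return M * (2 * d + (d * (d - 1)) // 2 + 1)

def bic(log_likelihood, M, d, N):
    # Eq. (2): goodness-of-fit term plus complexity penalty.
    return -2.0 * log_likelihood + num_free_params(M, d) * np.log(N)

# Hypothetical EM log-likelihoods for N = 500 ensemble members in d = 2:
# the richer model fits slightly better, but not enough to pay its penalty.
bic_simple = bic(-1400.0, M=2, d=2, N=500)
bic_complex = bic(-1390.0, M=5, d=2, N=500)
```

Here the BIC prefers M = 2: the 10-unit gain in log-likelihood does not offset the 18 extra free parameters, illustrating how the penalty term makes the simpler model win.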
In the case of a mixture of Gaussians, the Kalman-analysis equations may be augmented with a weight-analysis equation to yield the proper application of Bayes' rule for each component in the mixture [3], as follows. The Bayesian update of a Gaussian-mixture prior with a Gaussian observation model yields a Gaussian-mixture posterior [3]. For a prior GMM p_X(x) = Σ_{j=1}^M w^j N(x; x̄^j, P^j) and a Gaussian observation model p_{Y|X}(y|x) = N(y; Cx, K_V), the posterior PDF is [3]

p_{X|Y}(x|y) = Σ_{j=1}^M ŵ^j N(x; x̂^j, P̂^j),   (4)

where

x̂^j = x̄^j + K^j (y − C x̄^j),
K^j = P^j C^T (C P^j C^T + K_V)^{−1},
P̂^j = (I − K^j C) P^j, and   (5)
ŵ^j = w^j N(y; C x̄^j, C P^j C^T + K_V) / Σ_{m=1}^M w^m N(y; C x̄^m, C P^m C^T + K_V).

Combining the weighted Gaussians produces the posterior-mean estimate.

III. GMM-KF FOR LINEAR SYSTEMS

Usually, the GMM-KF works by invoking the Expectation-Maximization algorithm at each step and calculating

the unknown parameters. But for the sake of theoretical tractability, and to reduce the computational burden, we formulate a special case in which the GMM is calculated using EM and BIC once from the initial prior, and then updated using Kalman update steps. First consider a linear Gaussian system

x_{n+1} = A x_n + W_n and y_n = C x_n + V_n, n ≥ 1,   (6)

where x_n ∈ R^d, y_n ∈ R^q, and W_n, V_n, n ≥ 1, are orthogonal, zero-mean i.i.d. Gaussian random vectors with cov(V_n) = K_V and cov(W_n) = K_W. The best estimate of the state is obtained by the Kalman filter, i.e.,

x̄_n = A x̂_{n−1},
x̂_n = x̄_n + K_n (y_n − C x̄_n),
K_n = S_n C^T (C S_n C^T + K_V)^{−1},
S_n = A Σ_{n−1} A^T + K_W,
Σ_n = (I − K_n C) S_n,

where S_n = cov(x_n − x̄_n) and Σ_n = cov(x_n − x̂_n). The error x̃_n for the Kalman filter is given by

x̃_n = (I − K_n C)(A x̃_{n−1} + W_{n−1}) − K_n V_n.

A. Error Propagation in Linear GMM-KF

Theorem 1 [11]: Let K_W = Q Q^T. Suppose that (A, Q) is reachable and (A, C) is observable. If S_1 = 0, then Σ_n → Σ, K_n → K, S_n → S as n → ∞. The limiting matrices are the only solutions of the equations Σ = (I − KC)S, K = S C^T (C S C^T + K_V)^{−1}, and S = A Σ A^T + K_W.

Now we investigate the error dynamics of the GMM-KF in the linear non-Gaussian case. In this scenario, the system remains the same as Eq. (6), except that W_n is non-Gaussian (assumed zero mean, without loss of generality). The GMM-KF is, for j = 1, ..., M,

x̄_n^j = A x̂_{n−1}^j,
x̂_n^j = x̄_n^j + K_n^j (y_n − C x̄_n^j),
K_n^j = S_n^j C^T (C S_n^j C^T + K_V)^{−1},
S_n^j = A Σ_{n−1}^j A^T + K_W,
Σ_n^j = (I − K_n^j C) S_n^j,
w_n^j = w_{n−1}^j N(y_n; C x̄_n^j, C S_n^j C^T + K_V) / Σ_{m=1}^M w_{n−1}^m N(y_n; C x̄_n^m, C S_n^m C^T + K_V),   (7)

where

x̂_n = Σ_{j=1}^M w_n^j x̂_n^j, Σ_{j=1}^M w_n^j = 1,   (8)

S_n^j = cov(x_n − x̄_n^j), and Σ_n^j = cov(x_n − x̂_n^j). Assume that the initial values of the parameters have been set by the EM algorithm and BIC. To show that x_n − x̂_n is bounded, we proceed to the error analysis of this estimator, where the error x̃_n ≜ x_n − x̂_n can be written as

x̃_n = Σ_{j=1}^M w_n^j x̃_n^j = Σ_{j=1}^M w_n^j (x_n − x̂_n^j).   (9)

Theorem 2: Consider the linear system (6) with possibly non-Gaussian noise W_n. If the conditions of Theorem 1 are satisfied, the matrix C has full column rank, and the norm of the measurement noise V_n is bounded with probability P, then the norm of the error x̃_n is bounded with probability P for all n.
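The per-component Kalman analysis and weight update of Eqs. (5) and (7) can be written compactly. The sketch below uses a small two-component example whose matrices and observation are illustrative values chosen here for demonstration; each component's mean and covariance are updated with that component's own Kalman gain, and the components are re-weighted by their innovation likelihoods:

```python
import numpy as np

def gaussian_likelihood(y, mean, cov):
    # Multivariate normal density N(y; mean, cov), used for the weight update.
    d = y.size
    diff = y - mean
    return np.exp(-0.5 * diff @ np.linalg.solve(cov, diff)) / \
        np.sqrt((2 * np.pi) ** d * np.linalg.det(cov))

def gmm_kalman_update(y, C, K_V, weights, means, covs):
    """Per-component Kalman analysis plus the weight-analysis equation."""
    new_w, new_m, new_P = [], [], []
    for w, xbar, P in zip(weights, means, covs):
        S = C @ P @ C.T + K_V               # innovation covariance
        K = P @ C.T @ np.linalg.inv(S)      # component Kalman gain K^j
        new_m.append(xbar + K @ (y - C @ xbar))
        new_P.append((np.eye(P.shape[0]) - K @ C) @ P)
        new_w.append(w * gaussian_likelihood(y, C @ xbar, S))
    new_w = np.array(new_w)
    new_w /= new_w.sum()                    # normalize: weights sum to one
    return new_w, new_m, new_P

# Two equally weighted components; the observation sits on the second mean.
C = np.eye(2)
K_V = 0.01 * np.eye(2)
y = np.array([5.0, 5.0])
w_post, m_post, P_post = gmm_kalman_update(
    y, C, K_V,
    weights=[0.5, 0.5],
    means=[np.zeros(2), np.array([5.0, 5.0])],
    covs=[np.eye(2), np.eye(2)],
)
x_hat = sum(w * m for w, m in zip(w_post, m_post))  # posterior-mean estimate
```

With the observation near the second component's mean, nearly all posterior weight shifts to that component, and the combined posterior-mean estimate tracks the observation.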
Proof: From Theorem 1, Σ^j = (I − K^j C)S^j, K^j = S^j C^T (C S^j C^T + K_V)^{−1}, S^j = A Σ^j A^T + K_W, j = 1, ..., M, are the bounded limits of Σ_n^j, K_n^j, and S_n^j as n → ∞. Let R_n^j = C S_n^j C^T + K_V, let d be the dimensionality of x_n, and let β_n = Σ_{m=1}^M w_{n−1}^m N(y_n; C x̄_n^m, C S_n^m C^T + K_V) be the finite normalization factor. From the Kalman update equations (7),

x̃_n^j = (I − K_n^j C) r_n^j − K_n^j V_n, where r_n^j ≜ A x̃_{n−1}^j + W_{n−1}.

Using this result along with Eq. (7), Eq. (9) can be expanded as

x̃_n = Σ_{j=1}^M w_n^j x̃_n^j
  = (1/β_n) Σ_{j=1}^M w_{n−1}^j N(y_n; C x̄_n^j, C S_n^j C^T + K_V) x̃_n^j
  = (1/β_n) Σ_{j=1}^M w_{n−1}^j N(C(A x_{n−1} + W_{n−1}) + V_n; C A x̂_{n−1}^j, R_n^j) x̃_n^j
  = Σ_{j=1}^M [w_{n−1}^j / (β_n √((2π)^d det R_n^j))] e^{−(1/2)(C r_n^j + V_n)^T (R_n^j)^{−1} (C r_n^j + V_n)} ((I − K_n^j C) r_n^j − K_n^j V_n)
  = Σ_{j=1}^M [w_{n−1}^j / (β_n √((2π)^d det R_n^j))] e^{−(1/2)(C r_n^j + V_n)^T (R_n^j)^{−1} (C r_n^j + V_n)} r_n^j
  − Σ_{j=1}^M [w_{n−1}^j / (β_n √((2π)^d det R_n^j))] e^{−(1/2)(C r_n^j + V_n)^T (R_n^j)^{−1} (C r_n^j + V_n)} K_n^j (C r_n^j + V_n).   (10)

From Theorem 1, K_n^j → K^j and R_n^j → R^j as n → ∞ with finite-norm limits, and (R^j)^{−1} is positive definite. Because the norm of e^{−(1/2) x^T Q x} x is bounded for any positive definite Q (see Appendix I), the second term of Eq. (10) is bounded. For the first term, let q_n^j = C r_n^j + V_n. Since C has full column rank, r_n^j = (C^T C)^{−1} C^T (q_n^j − V_n). Substituting this into the first term of Eq. (10), we get

Σ_{j=1}^M [w_{n−1}^j / (β_n √((2π)^d det R_n^j))]

e^{−(1/2)(q_n^j)^T (R_n^j)^{−1} q_n^j} (C^T C)^{−1} C^T (q_n^j − V_n),

which is bounded (see Appendix I) with probability P if V_n is bounded with the same probability. ∎

The validity of the boundedness of the error and the practical stability [12] of observer-based feedback is illustrated by numerical simulation. The L2 norm of the error for a linear system is given in Fig. 2. The effectiveness of the observer-based feedback is shown in Fig. 3.

Fig. 2: L2 error of the GMM-KF estimate with and without feedback.

B. Observer-Based Feedback Control

To make use of the GMM-KF estimate in feedback control, consider a system x_{n+1} = A x_n + B u_n + W_n, where W_n is a zero-mean, possibly non-Gaussian process. With GMM-KF estimation and feedback u_n = −K x̂_n, K is chosen such that λ_max(A − BK) < 1, where λ_max stands for the maximum magnitude of the eigenvalues. Using the desired state x_des = 0, we get

x_{n+1} = A x_n − B K x̂_n + W_n = (A − BK) x_n + BK(x_n − x̂_n) + W_n
  = (A − BK)^n x_1 + (A − BK)^{n−1} (BK(x_1 − x̂_1) + W_1) + ··· + BK(x_n − x̂_n) + W_n.   (11)

Taking the expectation E of ||x_n − x_des|| = ||x_n|| and using the triangle inequality yields

||x_{n+1}|| ≤ λ_max^n ||x_1|| + λ_max^{n−1} ||BK|| (||x_1 − x̂_1|| + ||W_1||) + ··· + ||BK|| ||x_n − x̂_n|| + ||W_n||,
E||x_{n+1}|| ≤ λ_max^n ||x_1|| + λ_max^{n−1} ||BK|| (E||x_1 − x̂_1|| + E||W_1||) + ··· + ||BK|| E||x_n − x̂_n|| + E||W_n||.   (12)

Since λ_max(A − BK) < 1, E||x_{n+1}|| remains bounded as n → ∞ if E||x_n − x̂_n|| is bounded, which is indeed true by Theorem 2 with probability P depending on the measurement noise.

Fig. 3: Observer-based feedback effectively stabilizes the system.

Here we have used the system

x_{n+1} = [1.0 0.9; 0.5 1.2] x_n + [1.0; 1.0] u_n + W_n,

where W_n is a non-Gaussian, zero-mean i.i.d. random vector. This system is unstable, and ||x_n|| increases exponentially with u_n = 0 for all n. For feedback control, we use u_n = −[0.5 0.7] x̂_n, so that the eigenvalues of A − BK lie within the unit circle. The observer-based control effectively stabilizes the system as shown in Fig. 3.
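A quick numerical check of this example can be sketched as follows. For brevity the feedback here is driven by the true state rather than the GMM-KF estimate x̂_n used in the paper, and bounded uniform noise stands in for a generic non-Gaussian W_n:

```python
import numpy as np

# Example system: unstable open loop, stabilized by u_n = -[0.5 0.7] x_n.
rng = np.random.default_rng(0)
A = np.array([[1.0, 0.9], [0.5, 1.2]])
B = np.array([[1.0], [1.0]])
K = np.array([[0.5, 0.7]])

open_loop_radius = max(abs(np.linalg.eigvals(A)))        # spectral radius of A
closed_loop_radius = max(abs(np.linalg.eigvals(A - B @ K)))

x = np.array([1.0, 1.0])
norms = []
for _ in range(100):
    w = rng.uniform(-0.1, 0.1, size=2)   # bounded, non-Gaussian process noise
    u = -(K @ x)                          # state feedback (true state as stand-in)
    x = A @ x + B @ u + w
    norms.append(np.linalg.norm(x))
```

The open-loop spectral radius exceeds one (the state diverges with u_n = 0), while A − BK has both eigenvalues at 0.5, so the noisy closed-loop state stays bounded near the origin.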
IV. EXTENSION TO NONLINEAR SYSTEMS

To extend the GMM-KF setting to nonlinear systems, we investigate the widely used filtering framework, the Extended Kalman Filter (EKF) [6]. We apply the GMM to a nonlinear system with a linear measurement model, i.e.,

x_{n+1} = f(x_n) + W_n and y_n = C x_n + V_n, n ≥ 1,   (13)

where W_n is possibly non-Gaussian additive noise and V_n ∼ N(0, K_V). The GMM framework in this case is, for j = 1, ..., M,

x̄_n^j = f(x̂_{n−1}^j),
x̂_n^j = x̄_n^j + K_n^j (y_n − C x̄_n^j),
K_n^j = S_n^j C^T (C S_n^j C^T + K_V)^{−1},
S_n^j = A(x̂_{n−1}^j) Σ_{n−1}^j A(x̂_{n−1}^j)^T + K_W,
Σ_n^j = (I − K_n^j C) S_n^j,
w_n^j = w_{n−1}^j N(y_n; C x̄_n^j, C S_n^j C^T + K_V) / Σ_{m=1}^M w_{n−1}^m N(y_n; C x̄_n^m, C S_n^m C^T + K_V),
x̂_n = Σ_{j=1}^M w_n^j x̂_n^j,   (14)

where Σ_{j=1}^M w_n^j = 1, S_n^j = cov(x_n − x̄_n^j), Σ_n^j = cov(x_n − x̂_n^j), and A(x) = J_f(x) is the Jacobian of f evaluated at x. The initial values of the parameters have been set by the Expectation-Maximization algorithm and BIC, as in the linear case.

A. Error Propagation

The error dynamics are

x̃_n = Σ_{j=1}^M w_n^j x̃_n^j = (1/β_n) Σ_{j=1}^M [w_{n−1}^j / √((2π)^d det R_n^j)] e^{−(1/2)(C f̃_n^j + V_n)^T (R_n^j)^{−1} (C f̃_n^j + V_n)} ((I − K_n^j C) f̃_n^j − K_n^j V_n),   (15)

where f̃_n^j = f(x_{n−1}) − f(x̂_{n−1}^j) + W_{n−1} and R_n^j = C S_n^j C^T + K_V. To have x̃_n bounded in the nonlinear case, we need stricter assumptions than before, because R_n^j and S_n^j no longer converge to finite-norm matrices as n → ∞.

Assumption 1: αI ≤ Σ_n^j < βI for some α, β > 0, for all n and j = 1, ..., M.

Under Assumption 1, S_n^j and R_n^j, being the positive-definiteness-preserving bilinear transformations of Σ_{n−1}^j and S_n^j (from Eq. (14)), respectively, are always bounded above and below by positive definite matrices. Hence K_n^j C = S_n^j C^T (R_n^j)^{−1} C is also positive definite and bounded above and below. Now, proceeding in exactly the same way as in the proof of Theorem 2, x̃_n will also be bounded with probability P if C has full column rank and V_n is bounded with probability P. Let the bound on x̃_n be b_x̃(P).

B. Observer-Based Feedback Control

To control the nonlinear system (13), suppose we design a state-feedback controller u(x) and drive it with the estimated state derived by the filter. The closed-loop system is

x_{n+1} = f(x_n) + u(x̂_n) = F(x_n) + u(x̂_n) − u(x_n) + W_n,
y_n = C x_n + V_n,   (16)

where W_n is the non-Gaussian additive noise and F(x_n) = f(x_n) + u(x_n).

Assumption 2:

2.1: The nominal system x_{n+1} = F(x_n) is uniformly asymptotically stable on an open ball B_r of radius r centered at 0, with a C^1 (i.e., differentiable) Lyapunov function V : Z_+ × B_r → R that satisfies

α_1(||x_n||) ≤ V(n, x_n) ≤ α_2(||x_n||),   (17)
ΔV_N(n, x_n) ≤ −α_3(||x_n||),   (18)

where α_i, i = 1, 2, 3, are class-K functions and ΔV_N(n, x_n) = V(n + 1, x_{n+1}) − V(n, x_n) under the nominal system [12], [13].

2.2: ∃ p ∈ R_+ and M > 0 such that ||∂u/∂x|| ≤ M ||x||^{p−1}. With the application of the mean-value inequality, we get ||u(x̂_n) − u(x_n)|| ≤ M b_x̃(P) from Assumption 1.

2.3: ||W_n|| < b_W(P) with probability P.
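One forecast/analysis cycle of the GMM-EKF recursion (14) can be sketched as follows. The Duffing-type map below matches the form used in the paper's simulation section, but the covariances, initial component, and the noiseless observation are illustrative values chosen here so the check is easy to verify with a single mixture component:

```python
import numpy as np

def gmm_ekf_step(y, f, jac_f, C, K_W, K_V, weights, means, covs):
    """One forecast/analysis cycle of Eq. (14), with A(x) = J_f(x)."""
    d = means[0].size
    new_w, new_m, new_P = [], [], []
    for w, xhat_prev, Sigma in zip(weights, means, covs):
        xbar = f(xhat_prev)                 # forecast mean x̄^j_n
        Aj = jac_f(xhat_prev)               # Jacobian A(x̂^j_{n-1})
        S = Aj @ Sigma @ Aj.T + K_W         # forecast covariance S^j_n
        R = C @ S @ C.T + K_V               # innovation covariance
        Kg = S @ C.T @ np.linalg.inv(R)     # Kalman gain K^j_n
        innov = y - C @ xbar
        lik = np.exp(-0.5 * innov @ np.linalg.solve(R, innov)) / \
            np.sqrt((2 * np.pi) ** y.size * np.linalg.det(R))
        new_m.append(xbar + Kg @ innov)
        new_P.append((np.eye(d) - Kg @ C) @ S)
        new_w.append(w * lik)
    new_w = np.array(new_w)
    new_w /= new_w.sum()                    # weight update of Eq. (14)
    x_combined = sum(w * m for w, m in zip(new_w, new_m))
    return new_w, new_m, new_P, x_combined

# Duffing-type map with a = 2.75, b = 0.2, and its Jacobian.
def f(x):
    return np.array([x[1], -0.2 * x[0] + 2.75 * x[1] - x[1] ** 3])

def jac_f(x):
    return np.array([[0.0, 1.0], [-0.2, 2.75 - 3.0 * x[1] ** 2]])

C = np.eye(2)
w1, m1, P1, xhat = gmm_ekf_step(
    y=C @ f(np.array([0.5, 0.1])),          # noiseless observation for the check
    f=f, jac_f=jac_f, C=C,
    K_W=0.01 * np.eye(2), K_V=0.001 * np.eye(2),
    weights=[1.0], means=[np.array([0.5, 0.1])], covs=[0.01 * np.eye(2)],
)
```

With a noiseless observation of the propagated mean, the innovation vanishes and the combined estimate equals f(x̂), as expected; the analysis covariance remains positive definite.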
Assumption 2.3 characterizes the process noise, and b_W(P) can be readily obtained from the PDF or the cumulative distribution of the process noise (only the latter if W_n is not absolutely continuous).

Theorem 3: Consider the nonlinear system (16) with non-Gaussian process noise and GMM-EKF state estimate x̂_n. If Assumptions 1 and 2 are satisfied, then x_n is bounded with an ultimate bound b_x with probability P, where b_x is a function of M, b_x̃(P), and b_W(P).

Proof: We use V(n, x_n) from Assumption 2 for the perturbed system. Thus,

ΔV_P(n, x_n) = ΔV_N(n, x_n) + δV_P(n, x_n),

where

δV_P(n, x_n) = V(n + 1, F(x_n) + u(x̂_n) − u(x_n) + W_n) − V(n + 1, F(x_n)),

and the subscript P stands for the perturbed system. Since V ∈ C^1, ∃ l > 0 such that

δV_P(n, x_n) ≤ l ||u(x̂_n) − u(x_n) + W_n|| ≤ l (||u(x̂_n) − u(x_n)|| + ||W_n||).

Now, from Assumptions 1 and 2,

ΔV_P(n, x_n) = ΔV_N(n, x_n) + δV_P(n, x_n)
  ≤ −α_3(||x_n||) + lM(b_x̃(P) + b_W(P)), w.p. P
  = −(1 − θ)α_3(||x_n||) − θα_3(||x_n||) + lM(b_x̃(P) + b_W(P)), w.p. P and θ ∈ (0, 1)
  ≤ −(1 − θ)α_3(||x_n||), ∀ ||x_n|| ≥ α_3^{−1}(lM(b_x̃(P) + b_W(P))/θ), w.p. P.

Hence x_n for the system (16) is u.u.b. with ultimate bound

b_x(P) = α_1^{−1}(α_2(α_3^{−1}(lM(b_x̃(P) + b_W(P))/θ))), w.p. P.   (19)

C. Simulation Results

We tested the GMM-EKF on a 2D nonlinear chaotic map attributed to the discrete-time version of the Duffing

oscillator [14]. The map is

x¹_{n+1} = x²_n,
x²_{n+1} = −b x¹_n + a x²_n − (x²_n)³ + w_n.   (20)

The map depends on the two constants a and b, usually set to a = 2.75 and b = 0.2 to produce chaotic behaviour [14]. We added a zero-mean, non-Gaussian i.i.d. random variable w_n as the process noise. The resulting estimated map and the error are shown in Fig. 4.

Fig. 4: GMM-EKF estimate of the Duffing map: (a) L2 norm of the state (actual and estimated maps), (b) L2 error.

The effectiveness of the GMM-EKF in feedback control is illustrated in Fig. 5 with the system

x¹_{n+1} = x²_n,
x²_{n+1} = 1.1 x¹_n − x²_n + w_n + u_n,   (21)

where w_n is a non-Gaussian i.i.d. random variable. To stabilize the system, the feedback control is given by u_n = −0.3 x̂¹_n + x̂²_n. The observer-based feedback effectively stabilizes the system as shown in Fig. 5.

Fig. 5: Observer-based feedback (with GMM-EKF) effectively stabilizes the system.

V. CONCLUSIONS

This paper presents the Gaussian Mixture Model Kalman Filter (GMM-KF) as an effective tool to deal with non-Gaussian process noise for observer-based feedback. The boundedness of the estimation error is proven analytically and supported by simulation results. The framework is extended to nonlinear settings using the Extended Kalman Filter (EKF) in conjunction with the GMM. The GMM-KF is a dynamic Bayesian filtering technique that can be modified for lower computational complexity, and it works well with data-driven systems. The theoretical justification of the performance of the GMM-KF is shown here. In ongoing work, we seek to extend this framework to nonlinear observations, as well as to a general Bayesian filtering technique.

APPENDIX I

Proposition: ||x|| exp(−x^T Q x) with Q > 0 is bounded.

Proof:

||x exp(−x^T Q x)|| ≤ ||x|| exp(−x^T Q x) ≤ ||x|| exp(−λ_min(Q) ||x||²), since Q > 0 ⟹ λ_min(Q) > 0,
  ≤ 1/√(2 e λ_min(Q)),   (22)

from single-variable calculus. ∎

ACKNOWLEDGMENTS

The authors thank Frank Lagor and Dipankar Maity for valuable discussions pertaining to the GMM and BIC.

REFERENCES

[1] R. E.
Kalman, "A new approach to linear filtering and prediction problems," Transactions of the ASME, Journal of Basic Engineering, vol. 82, pp. 35–45, 1960.
[2] D. Alspach and H. Sorenson, "Nonlinear Bayesian estimation using Gaussian sum approximations," IEEE Transactions on Automatic Control, vol. 17, no. 4, pp. 439–448, 1972.
[3] T. Sondergaard and P. F. J. Lermusiaux, "Data assimilation with Gaussian mixture models using the dynamically orthogonal field equations. Part I: Theory and scheme," Mon. Weather Rev., vol. 141, pp. 1737–1760, 2013.
[4] A. P. Dempster, N. M. Laird, and D. B. Rubin, "Maximum likelihood from incomplete data via the EM algorithm," Journal of the Royal Statistical Society, Series B (Methodological), vol. 39, no. 1, pp. 1–38, 1977.
[5] G. Schwarz, "Estimating the dimension of a model," Ann. Statist., vol. 6, no. 2, pp. 461–464, 1978.
[6] M. I. Ribeiro, "Kalman and Extended Kalman Filters: concept, derivation and properties," Technical Report, 2004. [Online]. Available: http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.2.588
[7] M. S. Arulampalam, S. Maskell, N. Gordon, and T. Clapp, "A tutorial on particle filters for online nonlinear/non-Gaussian Bayesian tracking," IEEE Trans. Signal Proc., vol. 50, no. 2, pp. 174–188, 2002.
[8] A. Doucet, S. Godsill, and C. Andrieu, "On sequential Monte Carlo sampling methods for Bayesian filtering," Statistics and Computing, vol. 10, no. 3, pp. 197–208, 2000.
[9] G. Kitagawa, "Monte Carlo filter and smoother for non-Gaussian nonlinear state space models," Journal of Computational and Graphical Statistics, vol. 5, no. 1, pp. 1–25, 1996.
[10] C. Bishop, Pattern Recognition and Machine Learning. New York: Springer, 2006.
[11] J. Walrand, "Kalman filter: Convergence," UC Berkeley, EE 226A Lecture Notes. [Online]. Available: http://robotics.eecs.berkeley.edu/~wlr/226f06/226a-ch11.pdf
[12] H. K. Khalil, Nonlinear Systems. Upper Saddle River, NJ: Prentice Hall, 1996.
[13] C. Cruz-Hernández, J. Alvarez-Gallegos, and R.
Castro-Linares, "Stability of discrete nonlinear systems under nonvanishing perturbations: application to a nonlinear model-matching problem," IMA Journal of Mathematical Control and Information, vol. 16, no. 1, pp. 23–41, 1999.
[14] T. Kapitaniak, Chaotic Oscillators: Theory and Applications. World Scientific, 1992.