Non-Gaussian Estimation and Observer-Based Feedback using the Gaussian Mixture Kalman and Extended Kalman Filters


Debdipta Goswami¹ and Derek A. Paley²

Abstract: This paper considers the problem of non-Gaussian estimation and observer-based feedback in linear and nonlinear settings. Estimation in nonlinear systems with non-Gaussian process-noise statistics is important for applications in atmospheric and oceanic sampling. Non-Gaussian filtering is, however, largely problem specific and mostly sub-optimal. This manuscript uses a Gaussian Mixture Model (GMM) to characterize the prior non-Gaussian distribution, and applies the Kalman filter update to estimate the state with uncertainty. The boundedness of the error in both the linear and nonlinear cases is analytically justified under various assumptions, and the resulting estimate is used for feedback control. To apply the GMM in nonlinear settings, we utilize a common extension of the Kalman filter: the Extended Kalman Filter (EKF). The theoretical results are illustrated by numerical simulations.

I. INTRODUCTION

Knowledge of the state variables of a dynamical system is the key component for designing state-feedback control laws that can practically stabilize a plant. In actual systems, however, lack of access to the state variables motivates observer-based feedback control, where the current state is estimated from a set of observables. The situation worsens when the system dynamics and/or the observations are corrupted with noise, and the initial state estimate may only be given in terms of an a priori probability density. Hence the need for a filter arises for state estimation. For a linear system with linear observations and Gaussian process and sensor noise, the optimal estimator is given by the celebrated Kalman filter [1]. But for a nonlinear system, or a system with non-Gaussian noise, the moment propagation needs an infinite number of parameters and an optimal filter is difficult to formulate.
As a compromise, several sub-optimal solutions have been developed, e.g., the Extended Kalman Filter (EKF) and the Unscented Kalman Filter (UKF), which rely on a finite number of parameters, thus sacrificing some of the system's details. However, these filters still rely on the Gaussian process-noise model, which may not hold in a nonlinear system. The recursive Bayesian filter is the general representation of an optimal nonlinear filter that converts the prior (forecast) state probability density function (PDF) into the posterior (analysis) state PDF using the likelihood of the observation. However, in many cases the prior PDF is not explicitly known and therefore has to be estimated using an ensemble realization. For such cases, recursive Bayesian estimation becomes computationally difficult and, therefore, a semiparametric model must be realized. Alspach and Sorenson [2] have shown that any probability density function can be approximated arbitrarily closely from an ensemble realization using a weighted sum of Gaussian PDFs. The so-called Gaussian Mixture Model (GMM) gives an approximate way to explicitly calculate the posterior density of the states of a stochastic system, even in the nonlinear non-Gaussian case. There are several approaches for time and measurement updates using a GMM, as shown in the GMM-DO filter of Sondergaard and Lermusiaux [3]. A GMM, equipped with Monte Carlo data-fitting using the Expectation Maximization (EM) algorithm [4] and the Bayesian Information Criterion (BIC) [5], provides an accurate estimate of the prior PDF.

¹ Debdipta Goswami is a graduate student in the Department of Electrical and Computer Engineering and the Institute for Systems Research, University of Maryland, College Park, MD 20742. goswamid@umd.edu
² Derek A. Paley is the Willis H. Young Jr. Associate Professor of Aerospace Engineering Education in the Department of Aerospace Engineering and the Institute for Systems Research, University of Maryland, College Park, MD 20742. dpaley@umd.edu
The GMM represents the PDF using a weighted sum of Gaussian PDFs, each of which can be updated using an individual Kalman filter if the measurement model is linear and the measurement-noise profile is Gaussian. Data assimilation, such as with atmospheric or oceanic sampling vehicles, often uses a linear (or linearized) observation model, and the measurement noise is assumed to be additive Gaussian [3]. With these assumptions, the GMM along with Kalman filter updates (i.e., the GMM-KF) is sufficient to estimate the state. For nonlinear system dynamics, the Kalman update step may be replaced by an Extended Kalman Filter (EKF), which gives the prediction up to first-order precision [6]. The state estimate thus obtained may be used for feedback stabilization. A block diagram of such an observer-based feedback system is shown in Fig. 1. In this model, x_n and y_n are the state and observation, respectively, at the nth time step. The process is inherently noisy with an additive process noise W_n, and the observation is corrupted by the measurement noise V_n. The state estimate x̂_n is

Fig. 1: Block diagram of a discrete-time closed-loop observer-based feedback control system with a Gaussian Mixture Model observer O_GMM: e_n = r_n − x̂_n, u_n = K e_n, x_{n+1} = f(x_n, u_n) + W_n, y_n = C x_n + V_n, x̂_n = O_GMM(y_n).

obtained using the GMM-KF observer (denoted O_GMM), and this estimate is used to derive the feedback control signal u_n = K(r_n − x̂_n). The contributions of this work are (1) a theoretical guarantee of boundedness of the estimation error while using the GMM-KF in a linear system with non-Gaussian noise; (2) ultimate boundedness of the norm of the state in a linear system with observer-based feedback; (3) extension of the error-bound analysis to a nonlinear system with non-Gaussian process noise; and (4) proof of the ultimate boundedness of the state in a nonlinear system using Lyapunov's method for observer-based feedback with the GMM-EKF.

The manuscript is organized as follows. Section II provides a brief overview of the Gaussian Mixture Model. Section III explores the analytical as well as numerical results for linear settings. Section IV extends the results to nonlinear systems using the GMM-EKF. Section V summarizes the manuscript and discusses possible future work.

II. GAUSSIAN MIXTURE MODEL

The GMM-KF works by representing the probability density function (PDF) using a Gaussian Mixture Model (GMM). Let w^j, j = 1, ..., M, be scalar weights such that sum_j w^j = 1. Let x̄^j and P^j be the mean vector and covariance matrix, respectively, of the multivariate Gaussian N(x; x̄^j, P^j), j = 1, ..., M. The weighted sum of the M Gaussian densities [3]

  p_X(x; {w^j, x̄^j, P^j}_{j=1}^M) = sum_{j=1}^M w^j N(x; x̄^j, P^j)   (1)

is a valid probability density function that integrates to unity and has an analytical representation. Through the selection of the weights, means, covariances, and number of mixture components, (1) can represent even highly non-Gaussian distributions. Traditional ensemble/particle-based methods represent a PDF using the sparse support of ensemble members (i.e., a Monte Carlo sampling of realizations) [7]. This representation enables nonlinear propagation of the uncertainty in the forecast step of the filter. Unfortunately, many particle filters suffer from degeneracy issues due to the sparsity of the PDF representation [7].
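As a quick numerical sanity check of Eq. (1), the following sketch (with illustrative weights, means, and variances, not values from the paper) evaluates a two-component scalar mixture and verifies that it integrates to unity:

```python
import numpy as np
from scipy.stats import norm

# Sketch of Eq. (1): a two-component Gaussian mixture is a valid PDF that can
# represent a bimodal (highly non-Gaussian) density. All parameters are illustrative.
weights = np.array([0.3, 0.7])
means = np.array([-2.0, 3.0])
stds = np.array([0.7, 1.0])

def gmm_pdf(x):
    """p_X(x) = sum_j w^j N(x; xbar^j, P^j) for scalar x."""
    return np.sum(weights * norm.pdf(x, loc=means, scale=stds))

# The mixture integrates to unity (checked on a fine grid covering the support).
grid = np.linspace(-12.0, 14.0, 5001)
mass = np.trapz([gmm_pdf(t) for t in grid], grid)
print(round(mass, 3))  # 1.0
```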
Kernel-based approaches address this issue by periodically creating a density estimate from the ensemble sample to maintain the full support of the state space and to facilitate resampling [7], [8], [9]. Unfortunately, such approaches require the arbitrary choice of fitting parameters such as the kernel bandwidth [3]. For Gaussian mixtures, given a specific choice for the mixture complexity, an Expectation Maximization algorithm automatically selects the weights, means, and covariances of the Gaussians to best fit the ensemble [10]. A key contribution of [3] is the use of the Bayesian Information Criterion (BIC) for the automatic selection of the mixture complexity. Let x_j be the jth sample in an ensemble realization. The BIC may be (approximately) expressed as [3]

  BIC = −2 sum_{j=1}^N log p(x_j; Ω_ML, M) + K log N,   (2)

where K is the number of parameters in the model, Ω_ML is the maximum-likelihood set of parameters (produced by the EM algorithm), and N is the number of ensemble members. For a multivariate Gaussian mixture where d is the dimension of the state vector, K = M(2d + d(d−1)/2 + 1) is the number of free parameters [3]. The BIC has two components: the first component evaluates the goodness-of-fit of the model of complexity M, and the second component is a penalty on the overall model complexity [3]. By sequentially evaluating models of increasing complexity, one may identify a local minimum in the BIC. One seeks the best fit of a mixture of Gaussians to the data; the model-complexity term in the BIC ensures that a simpler model is preferred [3].

For many data-assimilation applications, the observation operator C linearly extracts the measurement from the state vector, i.e.,

  y_n = C x_n + V_n with V_n ~ N(0, K_V),   (3)

where V_n is zero-mean measurement noise with covariance K_V. For a (single) Gaussian forecast PDF, Gaussian measurement noise, and a linear observation operator, the Kalman-analysis equations represent the optimal approach to Bayesian assimilation of a measurement.
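The EM-plus-BIC model selection described above can be sketched with scikit-learn, whose `GaussianMixture.bic` computes the same −2 log-likelihood + K log N criterion as Eq. (2); the bimodal ensemble below is synthetic and illustrative:

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# Synthetic bimodal ensemble (N = 300 members, d = 2), a stand-in for a forecast ensemble.
ensemble = np.vstack([
    rng.normal(loc=[-2.0, 0.0], scale=0.5, size=(150, 2)),
    rng.normal(loc=[3.0, 1.0], scale=0.8, size=(150, 2)),
])

# Sequentially evaluate mixture complexities M = 1, 2, ... and keep the BIC minimizer.
bics = []
for M in range(1, 6):
    gmm = GaussianMixture(n_components=M, random_state=0).fit(ensemble)
    bics.append(gmm.bic(ensemble))  # -2 log-likelihood + K log N, as in Eq. (2)
best_M = int(np.argmin(bics)) + 1
print(best_M)  # the complexity penalty selects the simpler two-component model
```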
In the case of a mixture of Gaussians, the Kalman-analysis equations may be augmented with a weight-analysis equation to yield the proper application of Bayes' rule for each component in the mixture [3], as follows. The Bayesian update of a Gaussian mixture prior with a Gaussian observation model yields a Gaussian mixture posterior [3]. For a prior GMM p_X(x) = sum_j w^j N(x; x̄^j, P^j) and a Gaussian observation model p_{Y|X}(y|x) = N(y; Cx, K_V), the posterior PDF is [3]

  p_{X|Y}(x|y) = sum_j ŵ^j N(x; x̂^j, P̂^j),   (4)

where

  x̂^j = x̄^j + K^j (y − C x̄^j),
  K^j = P^j C^T (C P^j C^T + K_V)^{−1},
  P̂^j = (I − K^j C) P^j, and   (5)
  ŵ^j = w^j N(y; C x̄^j, C P^j C^T + K_V) / sum_{m=1}^M w^m N(y; C x̄^m, C P^m C^T + K_V).

Combining the weighted Gaussians produces the posterior-mean estimate.

III. GMM-KF FOR LINEAR SYSTEMS

Usually, the GMM-KF works by invoking the Expectation Maximization algorithm at each step and calculating

the unknown parameters. But for the sake of theoretical tractability, and to reduce the computational burden, we formulate a special case in which the GMM is calculated using EM and BIC once from the initial prior, and then updated using Kalman update steps. First consider a linear Gaussian system

  x_{n+1} = A x_n + W_n and y_n = C x_n + V_n, n ≥ 1,   (6)

where x_n ∈ R^d, y_n ∈ R^q, and W_n, V_n, n ≥ 1, are orthogonal, zero-mean i.i.d. Gaussian random vectors with cov(V_n) = K_V and cov(W_n) = K_W. The best estimate of the state is obtained by the Kalman filter:

  x̄_n = A x̂_{n−1},
  x̂_n = x̄_n + K_n (y_n − C x̄_n),
  K_n = S_n C^T (C S_n C^T + K_V)^{−1},   (7)
  S_n = A Σ_{n−1} A^T + K_W,
  Σ_n = (I − K_n C) S_n,

where S_n = cov(x_n − x̄_n) and Σ_n = cov(x_n − x̂_n). The error x̃_n for the Kalman filter is given by

  x̃_n = (I − K_n C)(A x̃_{n−1} + W_{n−1}) − K_n V_n.

A. Error Propagation in the Linear GMM-KF

Theorem 1 [11]: Let K_W = Q Q^T. Suppose that (A, Q) is reachable and (A, C) is observable. If S_1 = 0, then Σ_n → Σ, K_n → K, and S_n → S as n → ∞. The limiting matrices are the only solutions of the equations Σ = (I − KC)S, K = S C^T (C S C^T + K_V)^{−1}, and S = A Σ A^T + K_W.

Now we investigate the error dynamics of the GMM-KF in the linear non-Gaussian case. In this scenario, the system remains the same as Eq. (6), except that W_n is non-Gaussian (assumed zero-mean, without loss of generality). The GMM-KF is, for j = 1, ..., M,

  x̄_n^j = A x̂_{n−1}^j,
  x̂_n^j = x̄_n^j + K_n^j (y_n − C x̄_n^j),
  K_n^j = S_n^j C^T (C S_n^j C^T + K_V)^{−1},
  S_n^j = A Σ_{n−1}^j A^T + K_W,   (8)
  Σ_n^j = (I − K_n^j C) S_n^j,
  w_n^j = w_{n−1}^j N(y_n; C x̄_n^j, C S_n^j C^T + K_V) / sum_{m=1}^M w_{n−1}^m N(y_n; C x̄_n^m, C S_n^m C^T + K_V),
  x̂_n = sum_{j=1}^M w_n^j x̂_n^j,

where sum_{j=1}^M w_n^j = 1, S_n^j = cov(x_n − x̄_n^j), and Σ_n^j = cov(x_n − x̂_n^j). Assume that the initial values of the parameters have been set by the EM algorithm and BIC. To show that x_n − x̂_n is bounded, we proceed to the error analysis of this estimator, where the error x̃_n ≜ x_n − x̂_n can be written as

  x̃_n = sum_j w_n^j x̃_n^j = sum_j w_n^j (x_n − x̂_n^j).   (9)

Theorem 2: Consider the linear system (6) with possibly non-Gaussian noise W_n. If the conditions of Theorem 1 are satisfied, the matrix C has full column rank, and the norm of the measurement noise V_n is bounded with probability P, then the norm of the error x̃_n is bounded with probability P for all n.
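Theorem 2's claim can be probed numerically. The sketch below runs the recursion of Eq. (8) on an illustrative stable, observable pair (A, C) (not the paper's system) with bimodal, zero-mean process noise and checks that the estimation error stays bounded:

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative stable pair (A, C), with bimodal zero-mean (non-Gaussian)
# process noise, filtered by the GMM-KF recursion of Eq. (8).
A = np.array([[0.9, 0.1], [0.0, 0.8]])
C = np.array([[1.0, 0.0]])
K_V = np.array([[0.05]])
K_W = 0.25 * np.eye(2)

def process_noise():
    # Equal-probability +/-0.4 offset plus a small Gaussian part: bimodal, zero mean.
    offset = 0.4 if rng.random() < 0.5 else -0.4
    return rng.multivariate_normal([0.0, 0.0], 0.09 * np.eye(2)) + offset

M = 2
w = np.full(M, 1.0 / M)
xhat = [np.array([0.4, 0.4]), np.array([-0.4, -0.4])]  # initial component means
Sig = [np.eye(2) for _ in range(M)]

x = np.array([1.0, -1.0])
errs = []
for n in range(200):
    x = A @ x + process_noise()
    y = C @ x + rng.multivariate_normal([0.0], K_V)
    lik = np.empty(M)
    for j in range(M):
        xbar = A @ xhat[j]                      # time update
        S = A @ Sig[j] @ A.T + K_W
        R = C @ S @ C.T + K_V
        G = S @ C.T @ np.linalg.inv(R)          # Kalman gain
        innov = y - C @ xbar
        xhat[j] = xbar + G @ innov              # measurement update
        Sig[j] = (np.eye(2) - G @ C) @ S
        lik[j] = np.exp(-0.5 * innov @ np.linalg.solve(R, innov)) / \
                 np.sqrt(2 * np.pi * np.linalg.det(R))
    w = w * lik / np.sum(w * lik)               # weight update of Eq. (8)
    errs.append(np.linalg.norm(x - sum(wj * xj for wj, xj in zip(w, xhat))))

print(max(errs[50:]) < 5.0)  # the estimation error stays bounded
```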
Proof: From Theorem 1, Σ^j = (I − K^j C) S^j, K^j = S^j C^T (C S^j C^T + K_V)^{−1}, and S^j = A Σ^j A^T + K_W, j = 1, ..., M, are the bounded limits of Σ_n^j, K_n^j, and S_n^j as n → ∞. Let R^j = C S^j C^T + K_V, let d be the dimension of x_n, and let β_n = sum_{m=1}^M w_{n−1}^m N(y_n; C x̄_n^m, C S_n^m C^T + K_V) be the finite normalization factor. From the Kalman update equation (8),

  x̃_n^j = (I − K_n^j C) r_n^j − K_n^j V_n, where r_n^j ≜ A x̃_{n−1}^j + W_{n−1}.

Using this result along with Eq. (8), Eq. (9) can be expanded as

  x̃_n = sum_j w_n^j x̃_n^j
      = (1/β_n) sum_j w_{n−1}^j N(y_n; C x̄_n^j, C S_n^j C^T + K_V) x̃_n^j
      = (1/β_n) sum_j w_{n−1}^j N(C(A x_{n−1} + W_{n−1}) + V_n; C A x̂_{n−1}^j, R_n^j) x̃_n^j
      = sum_j [ w_{n−1}^j / (β_n sqrt((2π)^d det R_n^j)) e^{−(1/2)(C r_n^j + V_n)^T (R_n^j)^{−1} (C r_n^j + V_n)} (I − K_n^j C) r_n^j ]
        − sum_j [ w_{n−1}^j / (β_n sqrt((2π)^d det R_n^j)) e^{−(1/2)(C r_n^j + V_n)^T (R_n^j)^{−1} (C r_n^j + V_n)} K_n^j (C r_n^j + V_n) ].   (10)

From Theorem 1, K_n^j → K^j and R_n^j → R^j as n → ∞ with finite-norm limits, and (R^j)^{−1} is positive definite. Because the norm of e^{−(1/2) x^T Q x} x is bounded for any positive definite Q (see Appendix I), the second term of Eq. (10) is bounded. For the first term, let q_n^j = C r_n^j + V_n. Since C has full column rank, r_n^j = (C^T C)^{−1} C^T (q_n^j − V_n). Substituting this into the first term of Eq. (10), we get

  sum_j w_{n−1}^j / (β_n sqrt((2π)^d det R^j))

  e^{−(1/2)(q_n^j)^T (R^j)^{−1} q_n^j} (C^T C)^{−1} C^T (q_n^j − V_n),

which is bounded (see Appendix I) with probability P if ||V_n|| is bounded with the same probability. ∎

The validity of the boundedness of the error and the practical stability [12] of observer-based feedback is illustrated by numerical simulation. The L2 norm of the error for a linear system is given in Fig. 2. The effectiveness of the observer-based feedback is shown in Fig. 3.

Fig. 2: L2 error of the GMM-KF estimate with and without feedback.

B. Observer-Based Feedback Control

To make use of the GMM-KF estimate in feedback control, consider the system

  x_{n+1} = A x_n + B u_n + W_n,

where W_n is a zero-mean, possibly non-Gaussian process. With GMM-KF estimation and feedback u_n = −K x̂_n, K is chosen such that |λ(A − BK)|_max < 1, where |λ|_max stands for the maximum magnitude of the eigenvalues. Using the desired state x_des = 0, we get

  x_{n+1} = A x_n − B K x̂_n + W_n = (A − BK) x_n + BK(x_n − x̂_n) + W_n
          = (A − BK)^n x_1 + (A − BK)^{n−1} (BK(x_1 − x̂_1) + W_1) + ··· + BK(x_n − x̂_n) + W_n.   (11)

Taking the expectation of ||x_n − x_des|| = ||x_n|| and using the triangle inequality yields

  ||x_{n+1}|| ≤ λ_max^n ||x_1|| + λ_max^{n−1} (||BK|| ||x_1 − x̂_1|| + ||W_1||) + ··· + ||BK|| ||x_n − x̂_n|| + ||W_n||,
  E||x_{n+1}|| ≤ λ_max^n ||x_1|| + λ_max^{n−1} (||BK|| E||x_1 − x̂_1|| + E||W_1||) + ··· + ||BK|| E||x_n − x̂_n|| + E||W_n||.   (12)

Since |λ(A − BK)|_max < 1, E||x_{n+1}|| remains bounded as n → ∞ if E||x_n − x̂_n|| is bounded, which is indeed true by Theorem 2 with probability P depending on the measurement noise.

Fig. 3: Observer-based feedback effectively stabilizes the system.

Here we have used an unstable two-dimensional system x_{n+1} = A x_n + B u_n + W_n, where W_n is a non-Gaussian, zero-mean i.i.d. random vector. This system is unstable, and ||x_n|| increases exponentially with u_n = 0 for all n. For feedback control, we use u_n = −[0.5 0.7] x̂_n, so that the eigenvalues of A − BK lie within the unit circle. The observer-based control effectively stabilizes the system, as shown in Fig. 3.

IV. EXTENSION TO NONLINEAR SYSTEMS

To extend the GMM-KF setting to nonlinear systems, we investigate the widely used filtering framework, the Extended Kalman Filter (EKF) [6].
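Before introducing the GMM version, a single EKF predict/update step can be sketched as follows; the map f and its Jacobian are illustrative stand-ins, not a system from the paper:

```python
import numpy as np

# One plain EKF predict/update step for x_{n+1} = f(x_n) + W_n, y_n = C x_n + V_n.
def f(x):
    return np.array([x[1], 0.5 * np.sin(x[0])])  # illustrative nonlinear map

def jac_f(x):
    return np.array([[0.0, 1.0], [0.5 * np.cos(x[0]), 0.0]])  # its Jacobian J_f(x)

def ekf_step(xhat, Sigma, y, C, K_W, K_V):
    xbar = f(xhat)                      # nonlinear time update
    A = jac_f(xhat)                     # local linearization (first-order precision)
    S = A @ Sigma @ A.T + K_W           # forecast covariance
    K = S @ C.T @ np.linalg.inv(C @ S @ C.T + K_V)
    xnew = xbar + K @ (y - C @ xbar)    # measurement update
    return xnew, (np.eye(len(xhat)) - K @ C) @ S

C = np.array([[1.0, 0.0]])
xhat, Sigma = ekf_step(np.array([0.2, 0.1]), np.eye(2), np.array([0.15]),
                       C, 0.1 * np.eye(2), 0.05 * np.eye(1))
print(Sigma.shape)  # (2, 2)
```

The measurement sharply reduces the posterior variance of the observed component; the unobserved component is corrected only through the linearized coupling.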
We apply the GMM to a nonlinear system with a linear measurement model, i.e.,

  x_{n+1} = f(x_n) + W_n and y_n = C x_n + V_n, n ≥ 1,   (13)

where W_n is possibly non-Gaussian additive noise and V_n ~ N(0, K_V). The GMM framework in this case is, for j = 1, ..., M,

  x̄_n^j = f(x̂_{n−1}^j),
  x̂_n^j = x̄_n^j + K_n^j (y_n − C x̄_n^j),
  K_n^j = S_n^j C^T (C S_n^j C^T + K_V)^{−1},
  S_n^j = A(x̂_{n−1}^j) Σ_{n−1}^j A(x̂_{n−1}^j)^T + K_W,   (14)
  Σ_n^j = (I − K_n^j C) S_n^j,
  w_n^j = w_{n−1}^j N(y_n; C x̄_n^j, C S_n^j C^T + K_V) / sum_{m=1}^M w_{n−1}^m N(y_n; C x̄_n^m, C S_n^m C^T + K_V),
  x̂_n = sum_{j=1}^M w_n^j x̂_n^j,
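A minimal sketch of one GMM-EKF step following the structure of Eq. (14), with an illustrative nonlinear map f and a hypothetical two-component prior, showing the per-component linearization and the weight renormalization:

```python
import numpy as np

# The map f and its Jacobian are illustrative, not from the paper.
def f(x):
    return np.array([x[1], 0.5 * x[0] + 0.2 * x[1] ** 3])

def jac_f(x):
    return np.array([[0.0, 1.0], [0.5, 0.6 * x[1] ** 2]])

def gmm_ekf_step(y, C, K_V, K_W, weights, means, covs):
    lik, new_m, new_P = np.empty(len(weights)), [], []
    for j, (m, P) in enumerate(zip(means, covs)):
        xbar = f(m)                              # time update through f
        Aj = jac_f(m)                            # Jacobian at the previous estimate
        S = Aj @ P @ Aj.T + K_W
        R = C @ S @ C.T + K_V
        G = S @ C.T @ np.linalg.inv(R)
        innov = y - C @ xbar
        new_m.append(xbar + G @ innov)           # measurement update
        new_P.append((np.eye(len(m)) - G @ C) @ S)
        lik[j] = np.exp(-0.5 * innov @ np.linalg.solve(R, innov)) / \
                 np.sqrt((2 * np.pi) ** len(y) * np.linalg.det(R))
    w = np.asarray(weights) * lik                # Bayesian weight update
    w /= w.sum()
    return w, new_m, new_P, sum(wj * mj for wj, mj in zip(w, new_m))

C = np.array([[1.0, 0.0]])
w, m, P, xhat = gmm_ekf_step(np.array([0.9]), C, 0.1 * np.eye(1), 0.2 * np.eye(2),
                             [0.5, 0.5], [np.array([1.0, 0.5]), np.array([-1.0, 0.5])],
                             [np.eye(2), np.eye(2)])
print(abs(np.sum(w) - 1.0) < 1e-12)  # weights renormalize to one
```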

where sum_{j=1}^M w_n^j = 1, S_n^j = cov(x_n − x̄_n^j), Σ_n^j = cov(x_n − x̂_n^j), and A(x) = J_f(x) is the Jacobian of f evaluated at x. The initial values of the parameters have been set by the Expectation Maximization algorithm and BIC, as in the linear case.

A. Error Propagation

The error dynamics are

  x̃_n = sum_j w_n^j x̃_n^j = (1/β_n) sum_j w_{n−1}^j / sqrt((2π)^d det R_n^j) e^{−(1/2)(C f̃_n^j + V_n)^T (R_n^j)^{−1} (C f̃_n^j + V_n)} ((I − K_n^j C) f̃_n^j − K_n^j V_n),   (15)

where f̃_n^j = f(x_{n−1}) − f(x̂_{n−1}^j) + W_{n−1} and R_n^j = C S_n^j C^T + K_V. To have x̃_n bounded in the nonlinear case, we need stricter assumptions than before, because R_n^j and S_n^j no longer converge to finite-norm matrices as n → ∞.

Assumption 1: α I ≤ Σ_n^j < β I, with α, β > 0, for all n and j = 1, ..., M.

Under Assumption 1, S_n^j and R_n^j, being the positive-definiteness-preserving bilinear transformations of Σ_{n−1}^j and S_n^j (from Eq. (14)), respectively, are always bounded above and below by positive definite matrices. Hence K_n^j C = S_n^j C^T (R_n^j)^{−1} C is also positive definite and bounded above and below. Now, proceeding in exactly the same way as the proof of Theorem 2, x̃_n will also be bounded with probability P if C has full column rank and ||V_n|| is bounded with probability P. Let the bound on ||x̃_n|| be b_x̃(P).

B. Observer-Based Feedback Control

To control the nonlinear system (13), suppose we design a state-feedback controller u(x) and drive it with the estimated state derived by the filter. The closed-loop system is

  x_{n+1} = f(x_n) + u(x̂_n) + W_n = F(x_n) + u(x̂_n) − u(x_n) + W_n,
  y_n = C x_n + V_n,   (16)

where W_n is the non-Gaussian additive noise and F(x_n) = f(x_n) + u(x_n).

Assumption 2:

2.1: The nominal system x_{n+1} = F(x_n) is uniformly asymptotically stable on an open ball B_r of radius r centered at 0, with a C^1 (i.e., differentiable) Lyapunov function V : Z^+ × B_r → R that satisfies

  α_1(||x||) ≤ V(n, x) ≤ α_2(||x||),   (17)
  ΔV_N(n, x) ≤ −α_3(||x||),   (18)

where the α_i, i = 1, 2, 3, are class-K functions and ΔV_N(n, x) = V(n+1, x_{n+1}) − V(n, x_n) under the nominal system [12], [13].

2.2: There exist p ∈ R^+ and M > 0 such that ||∂u/∂x|| ≤ M ||x||^{p−1}. With the application of the mean value inequality, we get ||u(x̂_n) − u(x_n)|| ≤ M b_x̃(P) from Assumption 2.2.

2.3: ||W_n|| < b_W(P) with probability P.
Assumption 2.3 characterizes the process noise, and b_W(P) can be readily obtained from the PDF or cumulative distribution of the process noise (only the latter if W_n is not absolutely continuous).

Theorem 3: Consider the nonlinear system (16) with non-Gaussian process noise and GMM-EKF state estimate x̂_n. If Assumptions 1–2 are satisfied, then x_n is bounded with an ultimate bound b_x with probability P, where b_x is a function of M, b_x̃(P), and b_W(P).

Proof: We use V(n, x_n) from Assumption 2 for the perturbed system. Thus,

  ΔV_P(n, x_n) = ΔV_N(n, x_n) + δV_P(n, x_n),

where

  δV_P(n, x_n) = V(n+1, F(x_n) + u(x̂_n) − u(x_n) + W_n) − V(n+1, F(x_n)),

and the subscript P stands for the perturbed system. Since V ∈ C^1, there exists l > 0 such that

  δV_P(n, x_n) ≤ l ||u(x̂_n) − u(x_n) + W_n|| ≤ l (||u(x̂_n) − u(x_n)|| + ||W_n||).

Now, from Assumptions 1–2,

  ΔV_P(n, x_n) = ΔV_N(n, x_n) + δV_P(n, x_n)
              ≤ −α_3(||x_n||) + l M (b_x̃(P) + b_W(P)), w.p. P
              = −(1 − θ) α_3(||x_n||) − θ α_3(||x_n||) + l M (b_x̃(P) + b_W(P)), w.p. P, with θ ∈ (0, 1)
              ≤ −(1 − θ) α_3(||x_n||), for all ||x_n|| ≥ α_3^{−1}( l M (b_x̃(P) + b_W(P)) / θ ), w.p. P.

Hence x_n for the system (16) is uniformly ultimately bounded (u.u.b.), with ultimate bound

  b_x(P) = α_1^{−1}( α_2( α_3^{−1}( l M (b_x̃(P) + b_W(P)) / θ ) ) ), w.p. P.   (19)

∎

C. Simulation Results

We tested the GMM-EKF on a 2D nonlinear chaotic map attributed to the discrete-time version of the Duffing

oscillator [14]. The map is

  x_{n+1}^1 = x_n^2,
  x_{n+1}^2 = −b x_n^1 + a x_n^2 − (x_n^2)^3 + w_n.   (20)

The map depends on the two constants a and b, usually set to a = 2.75 and b = 0.2 to produce chaotic behaviour [14]. We added a zero-mean, non-Gaussian i.i.d. random variable w_n as the process noise. The resulting estimated map and the error are shown in Fig. 4.

Fig. 4: GMM-EKF estimate of the Duffing map: (a) L2 norm of the state, actual and estimated maps; (b) L2 error.

The effectiveness of the GMM-EKF in feedback control is illustrated in Fig. 5 with the system

  x_{n+1}^1 = x_n^2,
  x_{n+1}^2 = 1.1 x_n^1 − x_n^2 + w_n + u_n,   (21)

where w_n is a non-Gaussian i.i.d. random variable. To stabilize the system, the feedback control is given by u_n = −0.3 x̂_n^1 − x̂_n^2. The observer-based feedback effectively stabilizes the system, as shown in Fig. 5.

Fig. 5: Observer-based feedback (with GMM-EKF) effectively stabilizes the system.

V. CONCLUSIONS

This paper presents the Gaussian Mixture Model Kalman Filter (GMM-KF) as an effective tool to deal with non-Gaussian process noise for observer-based feedback. The boundedness of the estimation error is proven analytically and supported by simulation results. The framework is extended to nonlinear settings using the Extended Kalman Filter (EKF) in conjunction with the GMM. The GMM-KF is a dynamic Bayesian filtering technique that can be modified for lower computational complexity, and it works well with data-driven systems. The theoretical justification of the performance of the GMM-KF is shown here. In ongoing work, we seek to extend this framework to nonlinear observations, as well as to a general Bayesian filtering technique.

APPENDIX I

Proposition: ||x e^{−x^T Q x}|| with Q > 0 is bounded.

Proof:

  ||x e^{−x^T Q x}|| ≤ ||x|| e^{−x^T Q x}
                    ≤ ||x|| e^{−λ_min(Q) ||x||^2}, since Q > 0 implies λ_max(Q) ≥ λ_min(Q) > 0,
                    ≤ 1 / sqrt(2 e λ_min(Q)),   (22)

from single-variable calculus. ∎

ACKNOWLEDGMENTS

The authors thank Frank Lagor and Dipankar Maity for valuable discussions pertaining to the GMM and BIC.

REFERENCES

[1] R. E.
Kalman, "A new approach to linear filtering and prediction problems," Transactions of the ASME, Journal of Basic Engineering, vol. 82, pp. 35–45, 1960.
[2] D. Alspach and H. Sorenson, "Nonlinear Bayesian estimation using Gaussian sum approximations," IEEE Transactions on Automatic Control, vol. 17, no. 4, pp. 439–448, 1972.
[3] T. Sondergaard and P. F. J. Lermusiaux, "Data assimilation with Gaussian mixture models using the dynamically orthogonal field equations. Part I: Theory and scheme," Monthly Weather Review, vol. 141, 2013.
[4] A. P. Dempster, N. M. Laird, and D. B. Rubin, "Maximum likelihood from incomplete data via the EM algorithm," Journal of the Royal Statistical Society, Series B (Methodological), vol. 39, no. 1, pp. 1–38, 1977.
[5] G. Schwarz, "Estimating the dimension of a model," Annals of Statistics, vol. 6, no. 2, pp. 461–464, 1978.
[6] M. I. Ribeiro, "Kalman and Extended Kalman Filters: concept, derivation and properties," Technical Report, 2004. [Online].
[7] M. S. Arulampalam, S. Maskell, N. Gordon, and T. Clapp, "A tutorial on particle filters for online nonlinear/non-Gaussian Bayesian tracking," IEEE Transactions on Signal Processing, vol. 50, no. 2, pp. 174–188, 2002.
[8] A. Doucet, S. Godsill, and C. Andrieu, "On sequential Monte Carlo sampling methods for Bayesian filtering," Statistics and Computing, vol. 10, no. 3, pp. 197–208, 2000.
[9] G. Kitagawa, "Monte Carlo filter and smoother for non-Gaussian nonlinear state space models," Journal of Computational and Graphical Statistics, vol. 5, no. 1, pp. 1–25, 1996.
[10] C. Bishop, Pattern Recognition and Machine Learning. New York: Springer, 2006.
[11] J. Walrand, "Kalman filter: Convergence," UC Berkeley, EE 226A lecture notes. [Online]. Available: wlr/226f6/226a-ch11.pdf
[12] H. K. Khalil, Nonlinear Systems. Upper Saddle River, NJ: Prentice Hall, 2002.
[13] C. Cruz-Hernández, J. Alvarez-Gallegos, and R. Castro-Linares, "Stability of discrete nonlinear systems under nonvanishing perturbations: application to a nonlinear model-matching problem," IMA Journal of Mathematical Control and Information, vol. 16, no. 1, 1999.
[14] T. Kapitaniak, Chaotic Oscillators: Theory and Applications. World Scientific, 1992.


More information

Machine Learning Brett Bernstein

Machine Learning Brett Bernstein Machie Learig Brett Berstei Week 2 Lecture: Cocept Check Exercises Starred problems are optioal. Excess Risk Decompositio 1. Let X = Y = {1, 2,..., 10}, A = {1,..., 10, 11} ad suppose the data distributio

More information

Standard Bayesian Approach to Quantized Measurements and Imprecise Likelihoods

Standard Bayesian Approach to Quantized Measurements and Imprecise Likelihoods Stadard Bayesia Approach to Quatized Measuremets ad Imprecise Likelihoods Lawrece D. Stoe Metro Ic. Resto, VA USA Stoe@metsci.com Abstract I this paper we show that the stadard defiitio of likelihood fuctio

More information

Lecture 33: Bootstrap

Lecture 33: Bootstrap Lecture 33: ootstrap Motivatio To evaluate ad compare differet estimators, we eed cosistet estimators of variaces or asymptotic variaces of estimators. This is also importat for hypothesis testig ad cofidece

More information

Machine Learning for Data Science (CS 4786)

Machine Learning for Data Science (CS 4786) Machie Learig for Data Sciece CS 4786) Lecture & 3: Pricipal Compoet Aalysis The text i black outlies high level ideas. The text i blue provides simple mathematical details to derive or get to the algorithm

More information

Linear regression. Daniel Hsu (COMS 4771) (y i x T i β)2 2πσ. 2 2σ 2. 1 n. (x T i β y i ) 2. 1 ˆβ arg min. β R n d

Linear regression. Daniel Hsu (COMS 4771) (y i x T i β)2 2πσ. 2 2σ 2. 1 n. (x T i β y i ) 2. 1 ˆβ arg min. β R n d Liear regressio Daiel Hsu (COMS 477) Maximum likelihood estimatio Oe of the simplest liear regressio models is the followig: (X, Y ),..., (X, Y ), (X, Y ) are iid radom pairs takig values i R d R, ad Y

More information

A Simplified Derivation of Scalar Kalman Filter using Bayesian Probability Theory

A Simplified Derivation of Scalar Kalman Filter using Bayesian Probability Theory 6th Iteratioal Workshop o Aalysis of Dyamic Measuremets Jue -3 0 Göteorg Swede A Simplified Derivatio of Scalar Kalma Filter usig Bayesia Proaility Theory Gregory Kyriazis Imetro - Electrical Metrology

More information

ECE 901 Lecture 12: Complexity Regularization and the Squared Loss

ECE 901 Lecture 12: Complexity Regularization and the Squared Loss ECE 90 Lecture : Complexity Regularizatio ad the Squared Loss R. Nowak 5/7/009 I the previous lectures we made use of the Cheroff/Hoeffdig bouds for our aalysis of classifier errors. Hoeffdig s iequality

More information

A statistical method to determine sample size to estimate characteristic value of soil parameters

A statistical method to determine sample size to estimate characteristic value of soil parameters A statistical method to determie sample size to estimate characteristic value of soil parameters Y. Hojo, B. Setiawa 2 ad M. Suzuki 3 Abstract Sample size is a importat factor to be cosidered i determiig

More information

MASSACHUSETTS INSTITUTE OF TECHNOLOGY 6.265/15.070J Fall 2013 Lecture 3 9/11/2013. Large deviations Theory. Cramér s Theorem

MASSACHUSETTS INSTITUTE OF TECHNOLOGY 6.265/15.070J Fall 2013 Lecture 3 9/11/2013. Large deviations Theory. Cramér s Theorem MASSACHUSETTS INSTITUTE OF TECHNOLOGY 6.265/5.070J Fall 203 Lecture 3 9//203 Large deviatios Theory. Cramér s Theorem Cotet.. Cramér s Theorem. 2. Rate fuctio ad properties. 3. Chage of measure techique.

More information

Statistical Inference Based on Extremum Estimators

Statistical Inference Based on Extremum Estimators T. Rotheberg Fall, 2007 Statistical Iferece Based o Extremum Estimators Itroductio Suppose 0, the true value of a p-dimesioal parameter, is kow to lie i some subset S R p : Ofte we choose to estimate 0

More information

1. Linearization of a nonlinear system given in the form of a system of ordinary differential equations

1. Linearization of a nonlinear system given in the form of a system of ordinary differential equations . Liearizatio of a oliear system give i the form of a system of ordiary differetial equatios We ow show how to determie a liear model which approximates the behavior of a time-ivariat oliear system i a

More information

5.1 A mutual information bound based on metric entropy

5.1 A mutual information bound based on metric entropy Chapter 5 Global Fao Method I this chapter, we exted the techiques of Chapter 2.4 o Fao s method the local Fao method) to a more global costructio. I particular, we show that, rather tha costructig a local

More information

Discrete Mathematics for CS Spring 2008 David Wagner Note 22

Discrete Mathematics for CS Spring 2008 David Wagner Note 22 CS 70 Discrete Mathematics for CS Sprig 2008 David Wager Note 22 I.I.D. Radom Variables Estimatig the bias of a coi Questio: We wat to estimate the proportio p of Democrats i the US populatio, by takig

More information

Orthogonal Gaussian Filters for Signal Processing

Orthogonal Gaussian Filters for Signal Processing Orthogoal Gaussia Filters for Sigal Processig Mark Mackezie ad Kiet Tieu Mechaical Egieerig Uiversity of Wollogog.S.W. Australia Abstract A Gaussia filter usig the Hermite orthoormal series of fuctios

More information

Estimation for Complete Data

Estimation for Complete Data Estimatio for Complete Data complete data: there is o loss of iformatio durig study. complete idividual complete data= grouped data A complete idividual data is the oe i which the complete iformatio of

More information

Seunghee Ye Ma 8: Week 5 Oct 28

Seunghee Ye Ma 8: Week 5 Oct 28 Week 5 Summary I Sectio, we go over the Mea Value Theorem ad its applicatios. I Sectio 2, we will recap what we have covered so far this term. Topics Page Mea Value Theorem. Applicatios of the Mea Value

More information

3. Z Transform. Recall that the Fourier transform (FT) of a DT signal xn [ ] is ( ) [ ] = In order for the FT to exist in the finite magnitude sense,

3. Z Transform. Recall that the Fourier transform (FT) of a DT signal xn [ ] is ( ) [ ] = In order for the FT to exist in the finite magnitude sense, 3. Z Trasform Referece: Etire Chapter 3 of text. Recall that the Fourier trasform (FT) of a DT sigal x [ ] is ω ( ) [ ] X e = j jω k = xe I order for the FT to exist i the fiite magitude sese, S = x [

More information

7.1 Convergence of sequences of random variables

7.1 Convergence of sequences of random variables Chapter 7 Limit Theorems Throughout this sectio we will assume a probability space (, F, P), i which is defied a ifiite sequece of radom variables (X ) ad a radom variable X. The fact that for every ifiite

More information

Chapter 6 Sampling Distributions

Chapter 6 Sampling Distributions Chapter 6 Samplig Distributios 1 I most experimets, we have more tha oe measuremet for ay give variable, each measuremet beig associated with oe radomly selected a member of a populatio. Hece we eed to

More information

MASSACHUSETTS INSTITUTE OF TECHNOLOGY 6.265/15.070J Fall 2013 Lecture 2 9/9/2013. Large Deviations for i.i.d. Random Variables

MASSACHUSETTS INSTITUTE OF TECHNOLOGY 6.265/15.070J Fall 2013 Lecture 2 9/9/2013. Large Deviations for i.i.d. Random Variables MASSACHUSETTS INSTITUTE OF TECHNOLOGY 6.265/15.070J Fall 2013 Lecture 2 9/9/2013 Large Deviatios for i.i.d. Radom Variables Cotet. Cheroff boud usig expoetial momet geeratig fuctios. Properties of a momet

More information

Regression and generalization

Regression and generalization Regressio ad geeralizatio CE-717: Machie Learig Sharif Uiversity of Techology M. Soleymai Fall 2016 Curve fittig: probabilistic perspective Describig ucertaity over value of target variable as a probability

More information

A RANK STATISTIC FOR NON-PARAMETRIC K-SAMPLE AND CHANGE POINT PROBLEMS

A RANK STATISTIC FOR NON-PARAMETRIC K-SAMPLE AND CHANGE POINT PROBLEMS J. Japa Statist. Soc. Vol. 41 No. 1 2011 67 73 A RANK STATISTIC FOR NON-PARAMETRIC K-SAMPLE AND CHANGE POINT PROBLEMS Yoichi Nishiyama* We cosider k-sample ad chage poit problems for idepedet data i a

More information

Lecture 11 and 12: Basic estimation theory

Lecture 11 and 12: Basic estimation theory Lecture ad 2: Basic estimatio theory Sprig 202 - EE 94 Networked estimatio ad cotrol Prof. Kha March 2 202 I. MAXIMUM-LIKELIHOOD ESTIMATORS The maximum likelihood priciple is deceptively simple. Louis

More information

Economics 241B Relation to Method of Moments and Maximum Likelihood OLSE as a Maximum Likelihood Estimator

Economics 241B Relation to Method of Moments and Maximum Likelihood OLSE as a Maximum Likelihood Estimator Ecoomics 24B Relatio to Method of Momets ad Maximum Likelihood OLSE as a Maximum Likelihood Estimator Uder Assumptio 5 we have speci ed the distributio of the error, so we ca estimate the model parameters

More information

Optimally Sparse SVMs

Optimally Sparse SVMs A. Proof of Lemma 3. We here prove a lower boud o the umber of support vectors to achieve geeralizatio bouds of the form which we cosider. Importatly, this result holds ot oly for liear classifiers, but

More information

Let us give one more example of MLE. Example 3. The uniform distribution U[0, θ] on the interval [0, θ] has p.d.f.

Let us give one more example of MLE. Example 3. The uniform distribution U[0, θ] on the interval [0, θ] has p.d.f. Lecture 5 Let us give oe more example of MLE. Example 3. The uiform distributio U[0, ] o the iterval [0, ] has p.d.f. { 1 f(x =, 0 x, 0, otherwise The likelihood fuctio ϕ( = f(x i = 1 I(X 1,..., X [0,

More information

Statistical Inference (Chapter 10) Statistical inference = learn about a population based on the information provided by a sample.

Statistical Inference (Chapter 10) Statistical inference = learn about a population based on the information provided by a sample. Statistical Iferece (Chapter 10) Statistical iferece = lear about a populatio based o the iformatio provided by a sample. Populatio: The set of all values of a radom variable X of iterest. Characterized

More information

1 Introduction to reducing variance in Monte Carlo simulations

1 Introduction to reducing variance in Monte Carlo simulations Copyright c 010 by Karl Sigma 1 Itroductio to reducig variace i Mote Carlo simulatios 11 Review of cofidece itervals for estimatig a mea I statistics, we estimate a ukow mea µ = E(X) of a distributio by

More information

A Risk Comparison of Ordinary Least Squares vs Ridge Regression

A Risk Comparison of Ordinary Least Squares vs Ridge Regression Joural of Machie Learig Research 14 (2013) 1505-1511 Submitted 5/12; Revised 3/13; Published 6/13 A Risk Compariso of Ordiary Least Squares vs Ridge Regressio Paramveer S. Dhillo Departmet of Computer

More information

1 Covariance Estimation

1 Covariance Estimation Eco 75 Lecture 5 Covariace Estimatio ad Optimal Weightig Matrices I this lecture, we cosider estimatio of the asymptotic covariace matrix B B of the extremum estimator b : Covariace Estimatio Lemma 4.

More information

Random Variables, Sampling and Estimation

Random Variables, Sampling and Estimation Chapter 1 Radom Variables, Samplig ad Estimatio 1.1 Itroductio This chapter will cover the most importat basic statistical theory you eed i order to uderstad the ecoometric material that will be comig

More information

ECONOMETRIC THEORY. MODULE XIII Lecture - 34 Asymptotic Theory and Stochastic Regressors

ECONOMETRIC THEORY. MODULE XIII Lecture - 34 Asymptotic Theory and Stochastic Regressors ECONOMETRIC THEORY MODULE XIII Lecture - 34 Asymptotic Theory ad Stochastic Regressors Dr. Shalabh Departmet of Mathematics ad Statistics Idia Istitute of Techology Kapur Asymptotic theory The asymptotic

More information

CS434a/541a: Pattern Recognition Prof. Olga Veksler. Lecture 5

CS434a/541a: Pattern Recognition Prof. Olga Veksler. Lecture 5 CS434a/54a: Patter Recogitio Prof. Olga Veksler Lecture 5 Today Itroductio to parameter estimatio Two methods for parameter estimatio Maimum Likelihood Estimatio Bayesia Estimatio Itroducto Bayesia Decisio

More information

10-701/ Machine Learning Mid-term Exam Solution

10-701/ Machine Learning Mid-term Exam Solution 0-70/5-78 Machie Learig Mid-term Exam Solutio Your Name: Your Adrew ID: True or False (Give oe setece explaatio) (20%). (F) For a cotiuous radom variable x ad its probability distributio fuctio p(x), it

More information

Signal Processing. Lecture 02: Discrete Time Signals and Systems. Ahmet Taha Koru, Ph. D. Yildiz Technical University.

Signal Processing. Lecture 02: Discrete Time Signals and Systems. Ahmet Taha Koru, Ph. D. Yildiz Technical University. Sigal Processig Lecture 02: Discrete Time Sigals ad Systems Ahmet Taha Koru, Ph. D. Yildiz Techical Uiversity 2017-2018 Fall ATK (YTU) Sigal Processig 2017-2018 Fall 1 / 51 Discrete Time Sigals Discrete

More information

AN OPEN-PLUS-CLOSED-LOOP APPROACH TO SYNCHRONIZATION OF CHAOTIC AND HYPERCHAOTIC MAPS

AN OPEN-PLUS-CLOSED-LOOP APPROACH TO SYNCHRONIZATION OF CHAOTIC AND HYPERCHAOTIC MAPS http://www.paper.edu.c Iteratioal Joural of Bifurcatio ad Chaos, Vol. 1, No. 5 () 119 15 c World Scietific Publishig Compay AN OPEN-PLUS-CLOSED-LOOP APPROACH TO SYNCHRONIZATION OF CHAOTIC AND HYPERCHAOTIC

More information

Berry-Esseen bounds for self-normalized martingales

Berry-Esseen bounds for self-normalized martingales Berry-Essee bouds for self-ormalized martigales Xiequa Fa a, Qi-Ma Shao b a Ceter for Applied Mathematics, Tiaji Uiversity, Tiaji 30007, Chia b Departmet of Statistics, The Chiese Uiversity of Hog Kog,

More information

GUIDELINES ON REPRESENTATIVE SAMPLING

GUIDELINES ON REPRESENTATIVE SAMPLING DRUGS WORKING GROUP VALIDATION OF THE GUIDELINES ON REPRESENTATIVE SAMPLING DOCUMENT TYPE : REF. CODE: ISSUE NO: ISSUE DATE: VALIDATION REPORT DWG-SGL-001 002 08 DECEMBER 2012 Ref code: DWG-SGL-001 Issue

More information

Mathematical Modeling of Optimum 3 Step Stress Accelerated Life Testing for Generalized Pareto Distribution

Mathematical Modeling of Optimum 3 Step Stress Accelerated Life Testing for Generalized Pareto Distribution America Joural of Theoretical ad Applied Statistics 05; 4(: 6-69 Published olie May 8, 05 (http://www.sciecepublishiggroup.com/j/ajtas doi: 0.648/j.ajtas.05040. ISSN: 6-8999 (Prit; ISSN: 6-9006 (Olie Mathematical

More information

Problem Cosider the curve give parametrically as x = si t ad y = + cos t for» t» ß: (a) Describe the path this traverses: Where does it start (whe t =

Problem Cosider the curve give parametrically as x = si t ad y = + cos t for» t» ß: (a) Describe the path this traverses: Where does it start (whe t = Mathematics Summer Wilso Fial Exam August 8, ANSWERS Problem 1 (a) Fid the solutio to y +x y = e x x that satisfies y() = 5 : This is already i the form we used for a first order liear differetial equatio,

More information

IIT JAM Mathematical Statistics (MS) 2006 SECTION A

IIT JAM Mathematical Statistics (MS) 2006 SECTION A IIT JAM Mathematical Statistics (MS) 6 SECTION A. If a > for ad lim a / L >, the which of the followig series is ot coverget? (a) (b) (c) (d) (d) = = a = a = a a + / a lim a a / + = lim a / a / + = lim

More information

ADVANCED DIGITAL SIGNAL PROCESSING

ADVANCED DIGITAL SIGNAL PROCESSING ADVANCED DIGITAL SIGNAL PROCESSING PROF. S. C. CHAN (email : sccha@eee.hku.hk, Rm. CYC-702) DISCRETE-TIME SIGNALS AND SYSTEMS MULTI-DIMENSIONAL SIGNALS AND SYSTEMS RANDOM PROCESSES AND APPLICATIONS ADAPTIVE

More information

RAINFALL PREDICTION BY WAVELET DECOMPOSITION

RAINFALL PREDICTION BY WAVELET DECOMPOSITION RAIFALL PREDICTIO BY WAVELET DECOMPOSITIO A. W. JAYAWARDEA Departmet of Civil Egieerig, The Uiversit of Hog Kog, Hog Kog, Chia P. C. XU Academ of Mathematics ad Sstem Scieces, Chiese Academ of Scieces,

More information

Study the bias (due to the nite dimensional approximation) and variance of the estimators

Study the bias (due to the nite dimensional approximation) and variance of the estimators 2 Series Methods 2. Geeral Approach A model has parameters (; ) where is ite-dimesioal ad is oparametric. (Sometimes, there is o :) We will focus o regressio. The fuctio is approximated by a series a ite

More information

UNSCENTED KALMAN FILTER REVISITED - HERMITE-GAUSS QUADRATURE APPROACH

UNSCENTED KALMAN FILTER REVISITED - HERMITE-GAUSS QUADRATURE APPROACH UNSCENED KALMAN FILER REVISIED - HERMIE-GAUSS QUADRAURE APPROACH Ja Štecha ad Vladimír Havlea Departmet of Cotrol Egieerig Faculty of Electrical Egieerig Czech echical Uiversity i Prague Karlovo áměstí

More information

10.6 ALTERNATING SERIES

10.6 ALTERNATING SERIES 0.6 Alteratig Series Cotemporary Calculus 0.6 ALTERNATING SERIES I the last two sectios we cosidered tests for the covergece of series whose terms were all positive. I this sectio we examie series whose

More information

Time-Domain Representations of LTI Systems

Time-Domain Representations of LTI Systems 2.1 Itroductio Objectives: 1. Impulse resposes of LTI systems 2. Liear costat-coefficiets differetial or differece equatios of LTI systems 3. Bloc diagram represetatios of LTI systems 4. State-variable

More information

Algorithms for Clustering

Algorithms for Clustering CR2: Statistical Learig & Applicatios Algorithms for Clusterig Lecturer: J. Salmo Scribe: A. Alcolei Settig: give a data set X R p where is the umber of observatio ad p is the umber of features, we wat

More information

Quantile regression with multilayer perceptrons.

Quantile regression with multilayer perceptrons. Quatile regressio with multilayer perceptros. S.-F. Dimby ad J. Rykiewicz Uiversite Paris 1 - SAMM 90 Rue de Tolbiac, 75013 Paris - Frace Abstract. We cosider oliear quatile regressio ivolvig multilayer

More information

Computing the maximum likelihood estimates: concentrated likelihood, EM-algorithm. Dmitry Pavlyuk

Computing the maximum likelihood estimates: concentrated likelihood, EM-algorithm. Dmitry Pavlyuk Computig the maximum likelihood estimates: cocetrated likelihood, EM-algorithm Dmitry Pavlyuk The Mathematical Semiar, Trasport ad Telecommuicatio Istitute, Riga, 13.05.2016 Presetatio outlie 1. Basics

More information

Lecture 7: Density Estimation: k-nearest Neighbor and Basis Approach

Lecture 7: Density Estimation: k-nearest Neighbor and Basis Approach STAT 425: Itroductio to Noparametric Statistics Witer 28 Lecture 7: Desity Estimatio: k-nearest Neighbor ad Basis Approach Istructor: Ye-Chi Che Referece: Sectio 8.4 of All of Noparametric Statistics.

More information

Empirical Processes: Glivenko Cantelli Theorems

Empirical Processes: Glivenko Cantelli Theorems Empirical Processes: Gliveko Catelli Theorems Mouliath Baerjee Jue 6, 200 Gliveko Catelli classes of fuctios The reader is referred to Chapter.6 of Weller s Torgo otes, Chapter??? of VDVW ad Chapter 8.3

More information

Vector Quantization: a Limiting Case of EM

Vector Quantization: a Limiting Case of EM . Itroductio & defiitios Assume that you are give a data set X = { x j }, j { 2,,, }, of d -dimesioal vectors. The vector quatizatio (VQ) problem requires that we fid a set of prototype vectors Z = { z

More information

CS537. Numerical Analysis and Computing

CS537. Numerical Analysis and Computing CS57 Numerical Aalysis ad Computig Lecture Locatig Roots o Equatios Proessor Ju Zhag Departmet o Computer Sciece Uiversity o Ketucky Leigto KY 456-6 Jauary 9 9 What is the Root May physical system ca be

More information

Statistical and Mathematical Methods DS-GA 1002 December 8, Sample Final Problems Solutions

Statistical and Mathematical Methods DS-GA 1002 December 8, Sample Final Problems Solutions Statistical ad Mathematical Methods DS-GA 00 December 8, 05. Short questios Sample Fial Problems Solutios a. Ax b has a solutio if b is i the rage of A. The dimesio of the rage of A is because A has liearly-idepedet

More information

Mechatronics. Time Response & Frequency Response 2 nd -Order Dynamic System 2-Pole, Low-Pass, Active Filter

Mechatronics. Time Response & Frequency Response 2 nd -Order Dynamic System 2-Pole, Low-Pass, Active Filter Time Respose & Frequecy Respose d -Order Dyamic System -Pole, Low-Pass, Active Filter R 4 R 7 C 5 e i R 1 C R 3 - + R 6 - + e out Assigmet: Perform a Complete Dyamic System Ivestigatio of the Two-Pole,

More information

Preponderantly increasing/decreasing data in regression analysis

Preponderantly increasing/decreasing data in regression analysis Croatia Operatioal Research Review 269 CRORR 7(2016), 269 276 Prepoderatly icreasig/decreasig data i regressio aalysis Darija Marković 1, 1 Departmet of Mathematics, J. J. Strossmayer Uiversity of Osijek,

More information

This exam contains 19 pages (including this cover page) and 10 questions. A Formulae sheet is provided with the exam.

This exam contains 19 pages (including this cover page) and 10 questions. A Formulae sheet is provided with the exam. Probability ad Statistics FS 07 Secod Sessio Exam 09.0.08 Time Limit: 80 Miutes Name: Studet ID: This exam cotais 9 pages (icludig this cover page) ad 0 questios. A Formulae sheet is provided with the

More information

Chapter 10 Advanced Topics in Random Processes

Chapter 10 Advanced Topics in Random Processes ery Stark ad Joh W. Woods, Probability, Statistics, ad Radom Variables for Egieers, 4th ed., Pearso Educatio Ic.,. ISBN 978--3-33-6 Chapter Advaced opics i Radom Processes Sectios. Mea-Square (m.s.) Calculus

More information

Information Theory Tutorial Communication over Channels with memory. Chi Zhang Department of Electrical Engineering University of Notre Dame

Information Theory Tutorial Communication over Channels with memory. Chi Zhang Department of Electrical Engineering University of Notre Dame Iformatio Theory Tutorial Commuicatio over Chaels with memory Chi Zhag Departmet of Electrical Egieerig Uiversity of Notre Dame Abstract A geeral capacity formula C = sup I(; Y ), which is correct for

More information

Asymptotic Results for the Linear Regression Model

Asymptotic Results for the Linear Regression Model Asymptotic Results for the Liear Regressio Model C. Fli November 29, 2000 1. Asymptotic Results uder Classical Assumptios The followig results apply to the liear regressio model y = Xβ + ε, where X is

More information

REGRESSION WITH QUADRATIC LOSS

REGRESSION WITH QUADRATIC LOSS REGRESSION WITH QUADRATIC LOSS MAXIM RAGINSKY Regressio with quadratic loss is aother basic problem studied i statistical learig theory. We have a radom couple Z = X, Y ), where, as before, X is a R d

More information

Lectures on Stochastic System Analysis and Bayesian Updating

Lectures on Stochastic System Analysis and Bayesian Updating Lectures o Stochastic System Aalysis ad Bayesia Updatig Jue 29-July 13 2005 James L. Beck, Califoria Istitute of Techology Jiaye Chig, Natioal Taiwa Uiversity of Sciece & Techology Siu-Kui (Iva) Au, Nayag

More information

DIGITAL FILTER ORDER REDUCTION

DIGITAL FILTER ORDER REDUCTION DIGITAL FILTER RDER REDUTI VAHID RAISSI DEHKRDI, McGILL UIVERSITY, AADA, vahid@cim.mcgill.ca AMIR G. AGHDAM, RDIA UIVERSITY, AADA, aghdam@ece.cocordia.ca ABSTRAT I this paper, a method is proposed to reduce

More information

Slide Set 13 Linear Model with Endogenous Regressors and the GMM estimator

Slide Set 13 Linear Model with Endogenous Regressors and the GMM estimator Slide Set 13 Liear Model with Edogeous Regressors ad the GMM estimator Pietro Coretto pcoretto@uisa.it Ecoometrics Master i Ecoomics ad Fiace (MEF) Uiversità degli Studi di Napoli Federico II Versio: Friday

More information

Modified Decomposition Method by Adomian and. Rach for Solving Nonlinear Volterra Integro- Differential Equations

Modified Decomposition Method by Adomian and. Rach for Solving Nonlinear Volterra Integro- Differential Equations Noliear Aalysis ad Differetial Equatios, Vol. 5, 27, o. 4, 57-7 HIKARI Ltd, www.m-hikari.com https://doi.org/.2988/ade.27.62 Modified Decompositio Method by Adomia ad Rach for Solvig Noliear Volterra Itegro-

More information

Lecture 11: Channel Coding Theorem: Converse Part

Lecture 11: Channel Coding Theorem: Converse Part EE376A/STATS376A Iformatio Theory Lecture - 02/3/208 Lecture : Chael Codig Theorem: Coverse Part Lecturer: Tsachy Weissma Scribe: Erdem Bıyık I this lecture, we will cotiue our discussio o chael codig

More information