Particle Filtering and Smoothing Methods


1 Particle Filtering and Smoothing Methods

Arnaud Doucet
Department of Statistics, Oxford University
University College London

3rd October 2012

2 State-Space Models

Let $\{X_t\}_{t \geq 1}$ be a latent/hidden $\mathcal{X}$-valued Markov process with $X_1 \sim \mu(\cdot)$ and $X_t \mid (X_{t-1} = x) \sim f(\cdot \mid x)$. Let $\{Y_t\}_{t \geq 1}$ be a $\mathcal{Y}$-valued observation process such that the observations are conditionally independent given $\{X_t\}_{t \geq 1}$, with $Y_t \mid (X_t = x) \sim g(\cdot \mid x)$.

This is a general class of time series models, a.k.a. Hidden Markov Models (HMM), including
$$X_t = \Psi(X_{t-1}, V_t), \qquad Y_t = \Phi(X_t, W_t)$$
where $\{V_t\}$, $\{W_t\}$ are two sequences of i.i.d. random variables.

Aim: infer $\{X_t\}$ given observations $\{Y_t\}$, on-line or off-line.

3 State-Space Models

State-space models are ubiquitous in control, data mining, econometrics, geosciences, systems biology, etc. Since Jan. 2012, more than 13,500 papers have already appeared (source: Google Scholar).

Finite state-space HMM: $\mathcal{X}$ is a finite space, i.e. $\{X_t\}$ is a finite Markov chain, and $Y_t \mid (X_t = x) \sim g(\cdot \mid x)$.

Linear Gaussian state-space model:
$$X_t = A X_{t-1} + B V_t, \quad V_t \overset{\text{i.i.d.}}{\sim} \mathcal{N}(0, I),$$
$$Y_t = C X_t + D W_t, \quad W_t \overset{\text{i.i.d.}}{\sim} \mathcal{N}(0, I).$$

4 State-Space Models

Stochastic volatility model:
$$X_t = \phi X_{t-1} + \sigma V_t, \quad V_t \overset{\text{i.i.d.}}{\sim} \mathcal{N}(0, 1),$$
$$Y_t = \beta \exp(X_t / 2)\, W_t, \quad W_t \overset{\text{i.i.d.}}{\sim} \mathcal{N}(0, 1).$$

Biochemical network model:
$$\Pr\big(X^1_{t+dt} = x^1_t + 1,\; X^2_{t+dt} = x^2_t \mid x^1_t, x^2_t\big) = \alpha\, x^1_t\, dt + o(dt),$$
$$\Pr\big(X^1_{t+dt} = x^1_t - 1,\; X^2_{t+dt} = x^2_t + 1 \mid x^1_t, x^2_t\big) = \beta\, x^1_t x^2_t\, dt + o(dt),$$
$$\Pr\big(X^1_{t+dt} = x^1_t,\; X^2_{t+dt} = x^2_t - 1 \mid x^1_t, x^2_t\big) = \gamma\, x^2_t\, dt + o(dt),$$
with $Y_k = X^1_{kT} + W_k$, where $W_k \overset{\text{i.i.d.}}{\sim} \mathcal{N}(0, \sigma^2)$.

Nonlinear diffusion model:
$$dX_t = \alpha(X_t)\, dt + \beta(X_t)\, dV_t, \quad V_t \text{ Brownian motion},$$
$$Y_k = \gamma(X_{kT}) + W_k, \quad W_k \overset{\text{i.i.d.}}{\sim} \mathcal{N}(0, \sigma^2).$$
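To make the stochastic volatility model above concrete, here is a minimal simulation sketch in Python; the function name and the parameter values ($\phi = 0.91$, $\sigma = 1$, $\beta = 0.5$) are illustrative assumptions, not values from the lecture.

```python
import numpy as np

def simulate_sv(T, phi=0.91, sigma=1.0, beta=0.5, seed=0):
    """Simulate X_t = phi X_{t-1} + sigma V_t, Y_t = beta exp(X_t/2) W_t."""
    rng = np.random.default_rng(seed)
    x = np.empty(T)
    # start from the stationary distribution of the AR(1) latent process
    x[0] = rng.normal(0.0, sigma / np.sqrt(1.0 - phi**2))
    for t in range(1, T):
        x[t] = phi * x[t - 1] + sigma * rng.normal()
    y = beta * np.exp(x / 2.0) * rng.normal(size=T)
    return x, y

x_true, y_obs = simulate_sv(T=200)
```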

5 Inference in State-Space Models

Given observations $y_{1:t} := (y_1, y_2, \ldots, y_t)$, inference about $X_{1:t} := (X_1, \ldots, X_t)$ relies on the posterior
$$p(x_{1:t} \mid y_{1:t}) = \frac{p(x_{1:t}, y_{1:t})}{p(y_{1:t})}$$
where
$$p(x_{1:t}, y_{1:t}) = \underbrace{\mu(x_1) \prod_{k=2}^{t} f(x_k \mid x_{k-1})}_{p(x_{1:t})}\; \underbrace{\prod_{k=1}^{t} g(y_k \mid x_k)}_{p(y_{1:t} \mid x_{1:t})}, \qquad p(y_{1:t}) = \int p(x_{1:t}, y_{1:t})\, dx_{1:t}.$$

When $\mathcal{X}$ is finite, and for linear Gaussian models, $\{p(x_t \mid y_{1:t})\}_{t \geq 1}$ can be computed exactly. For non-linear models, approximations are required: EKF, UKF, Gaussian sum filters, etc.

Approximations of $\{p(x_t \mid y_{1:t})\}_{t=1}^{T}$ provide an approximation of $p(x_{1:T} \mid y_{1:T})$.

6 Monte Carlo Methods Basics

Assume you can generate $X^{(i)}_{1:t} \sim p(x_{1:t} \mid y_{1:t})$, $i = 1, \ldots, N$; then the MC approximation is
$$\hat{p}(x_{1:t} \mid y_{1:t}) = \frac{1}{N} \sum_{i=1}^{N} \delta_{X^{(i)}_{1:t}}(x_{1:t}).$$

Integration is straightforward:
$$\int \varphi(x_{1:t})\, p(x_{1:t} \mid y_{1:t})\, dx_{1:t} \approx \int \varphi(x_{1:t})\, \hat{p}(x_{1:t} \mid y_{1:t})\, dx_{1:t} = \frac{1}{N} \sum_{i=1}^{N} \varphi\big(X^{(i)}_{1:t}\big).$$

Marginalization is straightforward:
$$\hat{p}(x_k \mid y_{1:t}) = \int \hat{p}(x_{1:t} \mid y_{1:t})\, dx_{1:k-1}\, dx_{k+1:t} = \frac{1}{N} \sum_{i=1}^{N} \delta_{X^{(i)}_k}(x_k).$$

Basic and key property:
$$\mathbb{V}\left[\frac{1}{N} \sum_{i=1}^{N} \varphi\big(X^{(i)}_{1:t}\big)\right] = \frac{C}{N},$$
i.e. the rate of convergence to zero is independent of $\dim(\mathcal{X}^t)$.
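A small numerical illustration of this key property (a sketch under assumptions not taken from the slides: $\varphi(x) = x_1^2$ and $X \sim \mathcal{N}(0, I_d)$): the spread of the MC estimator halves when $N$ is quadrupled and does not deteriorate as the dimension $d$ grows.

```python
import numpy as np

rng = np.random.default_rng(1)
for d in (1, 10, 100):                  # dimension of the sampling space
    for N in (1_000, 4_000):            # number of Monte Carlo samples
        estimates = [(rng.normal(size=(N, d))[:, 0] ** 2).mean()
                     for _ in range(500)]
        # sd of the estimator halves when N is quadrupled, for every d
        print(f"d={d:3d}  N={N}  sd={np.std(estimates):.4f}")
```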

7 Monte Carlo Methods

Problem 1: we cannot typically generate exact samples from $p(x_{1:t} \mid y_{1:t})$ for non-linear non-Gaussian models.

Problem 2: even if we could, algorithms to generate samples from $p(x_{1:t} \mid y_{1:t})$ will have at least complexity $O(t)$.

Particle methods partially solve Problems 1 & 2 by breaking the problem of sampling from $p(x_{1:t} \mid y_{1:t})$ into a collection of simpler subproblems: first approximate $p(x_1 \mid y_1)$ and $p(y_1)$ at time 1, then $p(x_{1:2} \mid y_{1:2})$ and $p(y_{1:2})$ at time 2, and so on.

8 Bayesian Recursion on Path Space

We have
$$p(x_{1:t} \mid y_{1:t}) = \frac{p(x_{1:t}, y_{1:t})}{p(y_{1:t})} = \frac{g(y_t \mid x_t)\, \overbrace{f(x_t \mid x_{t-1})\, p(x_{1:t-1} \mid y_{1:t-1})}^{\text{predictive } p(x_{1:t} \mid y_{1:t-1})}}{p(y_t \mid y_{1:t-1})}$$
where
$$p(y_t \mid y_{1:t-1}) = \int g(y_t \mid x_t)\, p(x_{1:t} \mid y_{1:t-1})\, dx_{1:t}.$$

Prediction-Update formulation:
$$p(x_{1:t} \mid y_{1:t-1}) = f(x_t \mid x_{t-1})\, p(x_{1:t-1} \mid y_{1:t-1}),$$
$$p(x_{1:t} \mid y_{1:t}) = \frac{g(y_t \mid x_t)\, p(x_{1:t} \mid y_{1:t-1})}{p(y_t \mid y_{1:t-1})}.$$

9 Monte Carlo Implementation of Prediction Step

Assume you have at time $t-1$
$$\hat{p}(x_{1:t-1} \mid y_{1:t-1}) = \frac{1}{N} \sum_{i=1}^{N} \delta_{X^{(i)}_{1:t-1}}(x_{1:t-1}).$$

By sampling $X^{(i)}_t \sim f\big(x_t \mid X^{(i)}_{t-1}\big)$ and setting $X^{(i)}_{1:t} = \big(X^{(i)}_{1:t-1}, X^{(i)}_t\big)$, we obtain
$$\hat{p}(x_{1:t} \mid y_{1:t-1}) = \frac{1}{N} \sum_{i=1}^{N} \delta_{X^{(i)}_{1:t}}(x_{1:t}).$$

Sampling from $f(x_t \mid x_{t-1})$ is usually straightforward, and is feasible even if $f(x_t \mid x_{t-1})$ does not admit any analytical expression; e.g. biochemical network models.

10 Importance Sampling Implementation of Updating Step

Our target at time $t$ is
$$p(x_{1:t} \mid y_{1:t}) = \frac{g(y_t \mid x_t)\, p(x_{1:t} \mid y_{1:t-1})}{p(y_t \mid y_{1:t-1})}$$
so by substituting $\hat{p}(x_{1:t} \mid y_{1:t-1})$ for $p(x_{1:t} \mid y_{1:t-1})$ we obtain
$$\hat{p}(y_t \mid y_{1:t-1}) = \int g(y_t \mid x_t)\, \hat{p}(x_{1:t} \mid y_{1:t-1})\, dx_{1:t} = \frac{1}{N} \sum_{i=1}^{N} g\big(y_t \mid X^{(i)}_t\big).$$

We now have
$$\overline{p}(x_{1:t} \mid y_{1:t}) = \frac{g(y_t \mid x_t)\, \hat{p}(x_{1:t} \mid y_{1:t-1})}{\hat{p}(y_t \mid y_{1:t-1})} = \sum_{i=1}^{N} W^{(i)}_t\, \delta_{X^{(i)}_{1:t}}(x_{1:t})$$
with $W^{(i)}_t \propto g\big(y_t \mid X^{(i)}_t\big)$, $\sum_{i=1}^{N} W^{(i)}_t = 1$.

11 Multinomial Resampling

We have a weighted approximation $\overline{p}(x_{1:t} \mid y_{1:t})$ of $p(x_{1:t} \mid y_{1:t})$:
$$\overline{p}(x_{1:t} \mid y_{1:t}) = \sum_{i=1}^{N} W^{(i)}_t\, \delta_{X^{(i)}_{1:t}}(x_{1:t}).$$

To obtain $N$ samples $X^{(i)}_{1:t}$ approximately distributed according to $p(x_{1:t} \mid y_{1:t})$, resample $N$ times with replacement:
$$\hat{p}(x_{1:t} \mid y_{1:t}) = \frac{1}{N} \sum_{i=1}^{N} N^{(i)}_t\, \delta_{X^{(i)}_{1:t}}(x_{1:t})$$
where $\big\{N^{(i)}_t\big\}$ follow a multinomial distribution of parameters $\big(N, \big\{W^{(i)}_t\big\}\big)$.

This can be achieved in $O(N)$.
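One way to see that the $O(N)$ claim is achievable (a sketch, not the lecture's algorithm; the function name is illustrative): generate the $N$ uniforms already sorted, via normalized cumulative exponential spacings, then invert the weight CDF in a single linear merge pass.

```python
import numpy as np

def multinomial_resample(weights, rng):
    """Draw N ancestor indices i.i.d. from `weights` in O(N) overall."""
    N = len(weights)
    z = np.cumsum(rng.exponential(size=N + 1))
    u = z[:-1] / z[-1]                  # N already-sorted Uniform(0,1) draws
    cdf = np.cumsum(weights)
    idx = np.empty(N, dtype=np.int64)
    j = 0
    for i in range(N):                  # single linear merge against the CDF
        while u[i] > cdf[j] and j < N - 1:
            j += 1
        idx[i] = j
    return idx

rng = np.random.default_rng(2)
print(multinomial_resample(np.array([0.1, 0.2, 0.3, 0.4]), rng))
```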

12 Vanilla Particle Filter

At time $t = 1$:
- Sample $X^{(i)}_1 \sim \mu(x_1)$, then
$$\overline{p}(x_1 \mid y_1) = \sum_{i=1}^{N} W^{(i)}_1\, \delta_{X^{(i)}_1}(x_1), \qquad W^{(i)}_1 \propto g\big(y_1 \mid X^{(i)}_1\big).$$
- Resample $X^{(i)}_1 \sim \overline{p}(x_1 \mid y_1)$ to obtain $\hat{p}(x_1 \mid y_1) = \frac{1}{N} \sum_{i=1}^{N} \delta_{X^{(i)}_1}(x_1)$.

At time $t \geq 2$:
- Sample $X^{(i)}_t \sim f\big(x_t \mid X^{(i)}_{t-1}\big)$, set $X^{(i)}_{1:t} = \big(X^{(i)}_{1:t-1}, X^{(i)}_t\big)$ and
$$\overline{p}(x_{1:t} \mid y_{1:t}) = \sum_{i=1}^{N} W^{(i)}_t\, \delta_{X^{(i)}_{1:t}}(x_{1:t}), \qquad W^{(i)}_t \propto g\big(y_t \mid X^{(i)}_t\big).$$
- Resample $X^{(i)}_{1:t} \sim \overline{p}(x_{1:t} \mid y_{1:t})$ to obtain $\hat{p}(x_{1:t} \mid y_{1:t}) = \frac{1}{N} \sum_{i=1}^{N} \delta_{X^{(i)}_{1:t}}(x_{1:t})$.
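A compact runnable sketch of this vanilla (bootstrap) filter applied to the stochastic volatility model of slide 4; it also accumulates the marginal likelihood estimate of slide 13. The function name, $N$ and the parameter values are illustrative assumptions.

```python
import numpy as np

def bootstrap_filter(y, N=1000, phi=0.91, sigma=1.0, beta=0.5, seed=0):
    """Bootstrap PF for the SV model; returns filtering means
    E[X_t | y_{1:t}] and the log marginal likelihood estimate."""
    rng = np.random.default_rng(seed)
    T = len(y)
    means = np.empty(T)
    loglik = 0.0
    # t = 1: sample from mu, the stationary distribution of the AR(1)
    x = rng.normal(0.0, sigma / np.sqrt(1 - phi**2), size=N)
    for t in range(T):
        if t > 0:  # prediction: propagate through f(x_t | x_{t-1})
            x = phi * x + sigma * rng.normal(size=N)
        # update: unnormalized log-weights log g(y_t | x), g Gaussian
        sd = beta * np.exp(x / 2.0)
        logw = -0.5 * np.log(2 * np.pi * sd**2) - 0.5 * (y[t] / sd) ** 2
        w = np.exp(logw - logw.max())
        loglik += logw.max() + np.log(w.mean())   # log p(y_t | y_{1:t-1})
        W = w / w.sum()
        means[t] = np.sum(W * x)
        x = x[rng.choice(N, size=N, p=W)]         # multinomial resampling

    return means, loglik

# e.g. with the data simulated in the slide 4 sketch:
# means, ll = bootstrap_filter(y_obs, N=2000)
```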

13 Particle Estimates

At time $t$, we get
$$\hat{p}(x_{1:t} \mid y_{1:t}) = \frac{1}{N} \sum_{i=1}^{N} \delta_{X^{(i)}_{1:t}}(x_{1:t}).$$

The marginal likelihood estimate is given by
$$\hat{p}(y_{1:t}) = \prod_{k=1}^{t} \hat{p}(y_k \mid y_{1:k-1}) = \prod_{k=1}^{t} \left( \frac{1}{N} \sum_{i=1}^{N} g\big(y_k \mid X^{(i)}_k\big) \right).$$

Computational complexity is $O(N)$ at each time step and memory requirements are $O(tN)$. If we are only interested in $p(x_t \mid y_{1:t})$, or in $p(s_t(x_{1:t}) \mid y_{1:t})$ where $s_t(x_{1:t}) = \Psi_t\big(x_t, s_{t-1}(x_{1:t-1})\big)$ is fixed-dimensional, e.g. $s_t(x_{1:t}) = \sum_{k=1}^{t} x_k^2$, then memory requirements are $O(N)$.

14 Some Convergence Results

Numerous convergence results are available; see (Del Moral, 2004; Del Moral, D. & Singh, 2013). Let $\varphi : \mathcal{X}^t \to \mathbb{R}$ and consider
$$\varphi_t = \int \varphi(x_{1:t})\, p(x_{1:t} \mid y_{1:t})\, dx_{1:t}, \qquad \hat{\varphi}_t = \int \varphi(x_{1:t})\, \hat{p}(x_{1:t} \mid y_{1:t})\, dx_{1:t} = \frac{1}{N} \sum_{i=1}^{N} \varphi\big(X^{(i)}_{1:t}\big).$$

We can prove that for any bounded function $\varphi$ and any $p \geq 1$
$$\mathbb{E}\big[\,|\hat{\varphi}_t - \varphi_t|^p\,\big]^{1/p} \leq \frac{B(t)\, c(p)\, \|\varphi\|_\infty}{\sqrt{N}}, \qquad \lim_{N \to \infty} \sqrt{N}\,(\hat{\varphi}_t - \varphi_t) \Rightarrow \mathcal{N}\big(0, \sigma_t^2\big).$$

Very weak results: for a path-dependent $\varphi(x_{1:t})$, $B(t)$ and $\sigma_t^2$ typically increase with $t$.

15 Particles on Path-Space (figures by O. Cappé)

Figure: $p(x_1 \mid y_1)$ and $\hat{\mathbb{E}}[X_1 \mid y_1]$ (top) and particle approximation of $p(x_1 \mid y_1)$ (bottom). Axes: state against time index.

16 Figure: $p(x_1 \mid y_1)$, $p(x_2 \mid y_{1:2})$ and $\hat{\mathbb{E}}[X_1 \mid y_1]$, $\hat{\mathbb{E}}[X_2 \mid y_{1:2}]$ (top) and particle approximation of $p(x_{1:2} \mid y_{1:2})$ (bottom). Axes: state against time index.

17 Figure: $p(x_t \mid y_{1:t})$ and $\hat{\mathbb{E}}[X_t \mid y_{1:t}]$ for $t = 1, 2, 3$ (top) and particle approximation of $p(x_{1:3} \mid y_{1:3})$ (bottom). Axes: state against time index.

18 Figure: $p(x_t \mid y_{1:t})$ and $\hat{\mathbb{E}}[X_t \mid y_{1:t}]$ for $t = 1, \ldots, 10$ (top) and particle approximation of $p(x_{1:10} \mid y_{1:10})$ (bottom). Axes: state against time index.

19 Figure: $p(x_t \mid y_{1:t})$ and $\hat{\mathbb{E}}[X_t \mid y_{1:t}]$ for $t = 1, \ldots, 24$ (top) and particle approximation of $p(x_{1:24} \mid y_{1:24})$ (bottom). Axes: state against time index.

20 Remarks

Empirically, this particle method provides good approximations of the marginals $\{p(x_t \mid y_{1:t})\}_{t \geq 1}$; thankfully, this is all that is necessary in many applications.

The joint distribution $p(x_{1:t} \mid y_{1:t})$ is poorly estimated when $t$ is large; e.g. in the previous example we have $\hat{p}(x_{1:11} \mid y_{1:24}) = \delta_{X_{1:11}}(x_{1:11})$: the degeneracy problem.

For any $N$ and any $k$, there exists $t(k, N)$ such that for any $t \geq t(k, N)$
$$\hat{p}(x_{1:k} \mid y_{1:t}) = \delta_{X_{1:k}}(x_{1:k});$$
$\hat{p}(x_{1:t} \mid y_{1:t})$ is an unreliable approximation of $p(x_{1:t} \mid y_{1:t})$ as $t$ increases.

21 Another Illustration of the Degeneracy Phenomenon

For the linear Gaussian model, we can compute exactly $S_t / t$, where
$$S_t = \int \left( \sum_{k=1}^{t} x_k^2 \right) p(x_{1:t} \mid y_{1:t})\, dx_{1:t},$$
using Kalman techniques. We compute the particle estimate of this quantity using $\hat{S}_t / t$, where
$$\hat{S}_t = \int \left( \sum_{k=1}^{t} x_k^2 \right) \hat{p}(x_{1:t} \mid y_{1:t})\, dx_{1:t}$$
can be computed sequentially.
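A sketch of how $\hat{S}_t$ can be computed sequentially (hypothetical helper, assuming a scalar linear Gaussian model): each particle carries its running sum $\sum_{k \leq t} x_k^2$, which is resampled together with the state, so the path degeneracy feeds directly into the estimate.

```python
import numpy as np

def filter_with_additive_functional(y, N=500, a=0.9, s_v=1.0, s_w=1.0, seed=0):
    """Bootstrap PF for X_t = a X_{t-1} + s_v V_t, Y_t = X_t + s_w W_t,
    carrying s_t = sum_{k<=t} x_k^2 through the resampling steps."""
    rng = np.random.default_rng(seed)
    x = rng.normal(0.0, s_v / np.sqrt(1 - a**2), size=N)
    s = np.zeros(N)                       # per-particle running sum
    S_hat = []
    for t, yt in enumerate(y):
        if t > 0:
            x = a * x + s_v * rng.normal(size=N)
        s = s + x**2                      # update the additive functional
        w = np.exp(-0.5 * ((yt - x) / s_w) ** 2)
        W = w / w.sum()
        S_hat.append(np.sum(W * s))       # estimate of S_t
        idx = rng.choice(N, size=N, p=W)  # resample state AND functional
        x, s = x[idx], s[idx]
    return np.array(S_hat)
```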

22 Another Illustration of the Degeneracy Phenomenon

Figure: $S_t / t$ obtained through the Kalman smoother (blue) and its particle estimate $\hat{S}_t / t$ (red).

23 Stronger Convergence Results

Assume the following exponential stability assumption: for any $x_1, x'_1$,
$$\frac{1}{2} \int \big|\, p(x_t \mid y_{2:t}, X_1 = x_1) - p(x_t \mid y_{2:t}, X_1 = x'_1)\,\big|\, dx_t \leq \alpha^t \quad \text{for some } 0 \leq \alpha < 1.$$

Marginal distribution. For $\varphi_t(x_{1:t}) = \varphi(x_{t-L:t})$, there exist $B_1, B_2 < \infty$ s.t.
$$\mathbb{E}\big[\,|\hat{\varphi}_t - \varphi_t|^p\,\big]^{1/p} \leq \frac{B_1\, c(p)\, \|\varphi\|_\infty}{\sqrt{N}}, \qquad \lim_{N \to \infty} \sqrt{N}\,(\hat{\varphi}_t - \varphi_t) \Rightarrow \mathcal{N}\big(0, \sigma_t^2\big)$$
where $\sigma_t^2 \leq B_2$, i.e. there is no accumulation of numerical errors over time.

Relative variance bound. There exists $B_3 < \infty$ such that
$$\mathbb{E}\left[\left( \frac{\hat{p}(y_{1:t})}{p(y_{1:t})} - 1 \right)^2\right] \leq \frac{B_3\, t}{N}$$
for $t/N$ not too large.

24 Summary

Particle methods provide consistent estimates under weak assumptions.

Under stability assumptions, we have uniform-in-time stability of $\{\hat{p}(x_t \mid y_{1:t})\}_{t \geq 1}$, and the relative variance of $\{\hat{p}(y_{1:t})\}_{t \geq 1}$ only increases linearly with $t$.

Even under stability assumptions, one does not have uniform-in-time stability for $\{\hat{p}(x_{1:t} \mid y_{1:t})\}_{t \geq 1}$.

Is it possible to eliminate and/or mitigate the degeneracy problem?

25 Better Resampling Schemes

Resampling selects integers $\big\{N^{(i)}_t\big\}$ such that
$$\hat{p}(x_{1:t} \mid y_{1:t}) = \sum_{i=1}^{N} \frac{N^{(i)}_t}{N}\, \delta_{X^{(i)}_{1:t}}(x_{1:t}).$$

Multinomial resampling:
$$\mathbb{E}\big[N^{(i)}_t\big] = N W^{(i)}_t, \qquad \mathbb{V}\big[N^{(i)}_t\big] = N W^{(i)}_t \big(1 - W^{(i)}_t\big).$$

Residual resampling. Set $\widetilde{N}^{(i)}_t = \big\lfloor N W^{(i)}_t \big\rfloor$, sample $\big\{\overline{N}^{(i)}_t\big\}$ from a multinomial of parameters $\big(N - \sum_j \widetilde{N}^{(j)}_t,\, \overline{W}^{(1:N)}_t\big)$ where $\overline{W}^{(i)}_t \propto W^{(i)}_t - N^{-1} \widetilde{N}^{(i)}_t$, then set $N^{(i)}_t = \widetilde{N}^{(i)}_t + \overline{N}^{(i)}_t$.

Systematic resampling. Sample $U_1 \sim \mathcal{U}\big(0, \tfrac{1}{N}\big)$ and let $U_i = U_1 + \tfrac{i-1}{N}$ for $i = 2, \ldots, N$, then
$$N^{(i)}_t = \#\left\{ U_j : \sum_{k=1}^{i-1} W^{(k)}_t \leq U_j < \sum_{k=1}^{i} W^{(k)}_t \right\}.$$
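A minimal sketch of systematic resampling as defined above (function name illustrative): one uniform draw, a stratified grid, and an inversion of the weight CDF; each resulting count $N^{(i)}_t$ is within one of $N W^{(i)}_t$.

```python
import numpy as np

def systematic_resample(weights, rng):
    """Systematic resampling: U_1 ~ U(0, 1/N), U_i = U_1 + (i-1)/N,
    inverted through the cumulative weight function."""
    N = len(weights)
    u = (rng.uniform() + np.arange(N)) / N
    return np.searchsorted(np.cumsum(weights), u)

rng = np.random.default_rng(3)
W = np.array([0.05, 0.05, 0.6, 0.3])
print(systematic_resample(W, rng))  # counts N^(i) stay close to N * W^(i)
```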

26 Dynamic Resampling

To measure the variation of the weights, we can use the Effective Sample Size (ESS):
$$\mathrm{ESS} = \left( \sum_{i=1}^{N} \big(W^{(i)}_t\big)^2 \right)^{-1}.$$

We have $\mathrm{ESS} = N$ if $W^{(i)}_t = 1/N$ for all $i$, and $\mathrm{ESS} = 1$ if $W^{(i)}_t = 1$ for some $i$ and $W^{(j)}_t = 0$ for $j \neq i$.

Dynamic resampling: if the variation of the weights as measured by the ESS is too high, e.g. $\mathrm{ESS} < N/2$, then resample the particles (Liu & Chen, 1995).
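The ESS and the resulting adaptive rule fit in a few lines; an illustrative sketch, with multinomial resampling standing in for any unbiased scheme.

```python
import numpy as np

def ess(W):
    """Effective Sample Size of normalized weights W."""
    return 1.0 / np.sum(W**2)

def maybe_resample(x, W, rng):
    """Resample only when the ESS drops below N/2 (Liu & Chen, 1995)."""
    N = len(W)
    if ess(W) < N / 2:
        idx = rng.choice(N, size=N, p=W)   # any unbiased scheme works here
        return x[idx], np.full(N, 1.0 / N)
    return x, W
```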

27 Improving the Sampling Step

Bootstrap filter. Sample particles blindly according to the prior, without taking the observation into account: very inefficient for a vague prior/peaky likelihood.

Optimal proposal/perfect adaptation. Implement the following alternative update-propagate Bayesian recursion:
$$\text{Update: } p(x_{1:t-1} \mid y_{1:t}) = \frac{p(y_t \mid x_{t-1})\, p(x_{1:t-1} \mid y_{1:t-1})}{p(y_t \mid y_{1:t-1})},$$
$$\text{Propagate: } p(x_{1:t} \mid y_{1:t}) = p(x_{1:t-1} \mid y_{1:t})\, p(x_t \mid y_t, x_{t-1})$$
where
$$p(x_t \mid y_t, x_{t-1}) = \frac{f(x_t \mid x_{t-1})\, g(y_t \mid x_t)}{p(y_t \mid x_{t-1})}.$$

Much more efficient when applicable; e.g. $f(x_t \mid x_{t-1}) = \mathcal{N}(x_t; \varphi(x_{t-1}), \Sigma_v)$, $g(y_t \mid x_t) = \mathcal{N}(y_t; x_t, \Sigma_w)$.
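A sketch of one update-propagate step with the optimal proposal in the Gaussian case mentioned above (scalar state assumed; names hypothetical): by conjugacy, $p(y_t \mid x_{t-1}) = \mathcal{N}(y_t; \varphi(x_{t-1}), \sigma_v^2 + \sigma_w^2)$ and $p(x_t \mid y_t, x_{t-1})$ is Gaussian.

```python
import numpy as np

def fully_adapted_step(x_prev, y_t, phi_fn, s_v, s_w, rng):
    """One step with the optimal proposal for
    f = N(x_t; phi_fn(x_{t-1}), s_v^2), g = N(y_t; x_t, s_w^2)."""
    m = phi_fn(x_prev)                       # prior mean of x_t per particle
    # update weights: p(y_t | x_{t-1}) = N(y_t; m, s_v^2 + s_w^2)
    s2 = s_v**2 + s_w**2
    logw = -0.5 * np.log(2 * np.pi * s2) - 0.5 * (y_t - m) ** 2 / s2
    W = np.exp(logw - logw.max()); W /= W.sum()
    idx = rng.choice(len(W), size=len(W), p=W)   # resample (update step)
    # propagate: p(x_t | y_t, x_{t-1}) = N(post_mean, post_var) by conjugacy
    post_var = 1.0 / (1.0 / s_v**2 + 1.0 / s_w**2)
    post_mean = post_var * (m[idx] / s_v**2 + y_t / s_w**2)
    return post_mean + np.sqrt(post_var) * rng.normal(size=len(W))
```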

28 A General Recursion

Introduce an arbitrary proposal distribution $q(x_t \mid y_t, x_{t-1})$, i.e. an approximation of $p(x_t \mid y_t, x_{t-1})$. We have seen that
$$p(x_{1:t} \mid y_{1:t}) = \frac{g(y_t \mid x_t)\, f(x_t \mid x_{t-1})\, p(x_{1:t-1} \mid y_{1:t-1})}{p(y_t \mid y_{1:t-1})}$$
so clearly
$$p(x_{1:t} \mid y_{1:t}) = w(x_{t-1}, x_t, y_t)\, \frac{q(x_t \mid y_t, x_{t-1})\, p(x_{1:t-1} \mid y_{1:t-1})}{p(y_t \mid y_{1:t-1})}$$
where
$$w(x_{t-1}, x_t, y_t) = \frac{g(y_t \mid x_t)\, f(x_t \mid x_{t-1})}{q(x_t \mid y_t, x_{t-1})}.$$

This suggests a more general particle algorithm.

29 A General Particle Algorithm

Assume we have $N$ weighted particles $\big\{W^{(i)}_{t-1}, X^{(i)}_{1:t-1}\big\}$ approximating $p(x_{1:t-1} \mid y_{1:t-1})$; then at time $t$:

Sample $X^{(i)}_t \sim q\big(x_t \mid y_t, X^{(i)}_{t-1}\big)$, set $X^{(i)}_{1:t} = \big(X^{(i)}_{1:t-1}, X^{(i)}_t\big)$ and
$$\overline{p}(x_{1:t} \mid y_{1:t}) = \sum_{i=1}^{N} W^{(i)}_t\, \delta_{X^{(i)}_{1:t}}(x_{1:t}), \qquad W^{(i)}_t \propto W^{(i)}_{t-1}\, \frac{f\big(X^{(i)}_t \mid X^{(i)}_{t-1}\big)\, g\big(y_t \mid X^{(i)}_t\big)}{q\big(X^{(i)}_t \mid y_t, X^{(i)}_{t-1}\big)}.$$

If $\mathrm{ESS} < N/2$, resample $X^{(i)}_{1:t} \sim \overline{p}(x_{1:t} \mid y_{1:t})$ and set $W^{(i)}_t = \tfrac{1}{N}$ to obtain $\hat{p}(x_{1:t} \mid y_{1:t}) = \frac{1}{N} \sum_{i=1}^{N} \delta_{X^{(i)}_{1:t}}(x_{1:t})$.

30 Building Proposals

Our aim is to select $q(x_t \mid y_t, x_{t-1})$ as close as possible to $p(x_t \mid y_t, x_{t-1})$, as this minimizes the variance of
$$w(x_{t-1}, x_t, y_t) = \frac{g(y_t \mid x_t)\, f(x_t \mid x_{t-1})}{q(x_t \mid y_t, x_{t-1})}.$$

Any standard suboptimal filtering method can be used to approximate $p(x_t \mid y_t, x_{t-1})$ and $p(y_t \mid x_{t-1})$.

Example: local linearisation proposal. Let
$$X_t = \varphi(X_{t-1}) + V_t, \qquad Y_t = \Psi(X_t) + W_t,$$
with $V_t \sim \mathcal{N}(0, \Sigma_v)$, $W_t \sim \mathcal{N}(0, \Sigma_w)$. We perform the local linearization
$$Y_t \approx \Psi(\varphi(X_{t-1})) + \left.\frac{\partial \Psi}{\partial x}\right|_{\varphi(X_{t-1})} \big(X_t - \varphi(X_{t-1})\big) + W_t$$
and use as a proposal
$$q(x_t \mid y_t, x_{t-1}) \propto \hat{g}(y_t \mid x_t)\, f(x_t \mid x_{t-1}).$$

31 Block Sampling Particle Filter

Problem: we only sample $X_t$ at time $t$, so even if you use $p(x_t \mid y_t, x_{t-1})$, particle estimates could have high variance if $\mathbb{V}_{p(x_{t-1} \mid y_{1:t-1})}\big[p(y_t \mid X_{t-1})\big]$ is high.

Block sampling idea: allow yourself to sample again $X_{t-L+1:t-1}$ as well as $X_t$ in light of $y_t$:
$$p(x_{1:t-L} \mid y_{1:t}) = \frac{p(y_t \mid y_{t-L+1:t-1}, x_{t-L})\, \overbrace{\int p(x_{1:t-1} \mid y_{1:t-1})\, dx_{t-L+1:t-1}}^{p(x_{1:t-L} \mid y_{1:t-1})}}{p(y_t \mid y_{t-L+1:t-1})},$$
$$p(x_{1:t} \mid y_{1:t}) = p(x_{t-L+1:t} \mid y_{t-L+1:t}, x_{t-L})\, p(x_{1:t-L} \mid y_{1:t}).$$

When $p(x_{t-L+1:t} \mid y_{t-L+1:t}, x_{t-L})$ and $p(y_t \mid y_{t-L+1:t-1}, x_{t-L})$ are not available, one can use approximations (D., Briers & Sénécal, 2006; Whiteley & Lee, 2012).

32 Block Sampling Proposals

Figure: variance of the incremental weight $p(y_t \mid y_{t-L+1:t-1}, x_{t-L})$ with respect to $p(x_{1:t-L} \mid y_{1:t-1})$.

33 Fighting Degeneracy Using MCMC Steps

The design of good proposals can be complicated and/or time consuming. A standard and generic way to partially limit degeneracy is the Resample-Move algorithm (Gilks & Berzuini, 2001), i.e. using MCMC kernels as a principled way to jitter the particle locations.

An MCMC kernel $K(x'_{1:t} \mid x_{1:t})$ of invariant distribution $p(x_{1:t} \mid y_{1:t})$ is a Markov transition kernel with the property that
$$p(x'_{1:t} \mid y_{1:t}) = \int p(x_{1:t} \mid y_{1:t})\, K(x'_{1:t} \mid x_{1:t})\, dx_{1:t};$$
i.e. if $X_{1:t} \sim p(x_{1:t} \mid y_{1:t})$ and $X'_{1:t} \mid X_{1:t} \sim K(x'_{1:t} \mid X_{1:t})$, then marginally $X'_{1:t} \sim p(x_{1:t} \mid y_{1:t})$.

Contrary to MCMC, we typically do not use ergodic kernels, as on-line methods are required.

34 Example: Bearings-only Tracking

The target is modelled using a standard constant velocity model
$$X_t = A X_{t-1} + V_t$$
where $V_t \overset{\text{i.i.d.}}{\sim} \mathcal{N}(0, \Sigma)$. The state vector $X_t = \big(X^1_t\; X^2_t\; X^3_t\; X^4_t\big)^{\mathsf{T}}$ contains location and velocity components.

One only receives observations of the bearings of the target:
$$Y_t = \tan^{-1}\left(\frac{X^3_t}{X^1_t}\right) + W_t$$
where $W_t \overset{\text{i.i.d.}}{\sim} \mathcal{N}(0, 10^{-4})$; i.e. the observations are almost noiseless.

We compare the bootstrap filter, the Particle-EKF with $L = 5, 10$, and MCMC moves with $L = 10$, using dynamic resampling.

35 Degeneracy for Various Proposals

Figure: average number of unique particles $X^{(i)}_t$ approximating $p(x_t \mid y_{1:100})$ for the Bootstrap, RMFL(10), EKF(5) and EKF(10) proposals; time on the x-axis, average number of unique particles on the y-axis.

36 Summary

Particle methods provide consistent estimates under weak assumptions.

We can estimate $\{p(x_t \mid y_{1:t})\}_{t \geq 1}$ satisfactorily, but our approximations of $\{p(x_{1:t} \mid y_{1:t})\}_{t \geq 1}$ degenerate as $t$ increases because of the resampling steps.

We can mitigate, but not eliminate, the degeneracy problem through the design of clever proposals.

Smoothing methods to estimate $p(x_{1:T} \mid y_{1:T})$ can come to the rescue.

37 Smoothing in State-Space Models

Smoothing problem: given a fixed time $T$, we are interested in $p(x_{1:T} \mid y_{1:T})$ or some of its marginals, e.g. $\{p(x_t \mid y_{1:T})\}_{t=1}^{T}$. Smoothing is crucial for parameter estimation.

Direct SMC approximations of $p(x_{1:T} \mid y_{1:T})$ and of its marginals $p(x_k \mid y_{1:T})$ are poor if $T$ is large.

SMC provides good approximations of the marginals $\{p(x_t \mid y_{1:t})\}_{t \geq 1}$; this can be used to develop efficient smoothing estimates.

38 Fixed-Lag Smoothing

The fixed-lag smoothing approximation relies on
$$p(x_t \mid y_{1:T}) \approx p(x_t \mid y_{1:t+\Delta}) \quad \text{for } \Delta \text{ large enough},$$
and quantitative bounds can be established under stability assumptions. This can be exploited by SMC methods (Kitagawa & Sato, 2001).

Algorithmically: stop resampling the components $X^{(i)}_t$ beyond time $t + \Delta$ (Kitagawa & Sato, 2001); a sketch follows below.

Computational cost is $O(N)$ but there is a non-vanishing bias as $N \to \infty$ (Olsson et al., 2008).

Picking $\Delta$ is difficult: too small a $\Delta$ results in $p(x_t \mid y_{1:t+\Delta})$ being a poor approximation of $p(x_t \mid y_{1:T})$; too large a $\Delta$ improves the approximation, but degeneracy creeps in.
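A sketch of the Kitagawa-Sato idea under simplifying assumptions (user-supplied `propagate` and `loglik` functions, scalar state, multinomial resampling; all names hypothetical): the last $\Delta$ states of each particle are resampled together with the current state, and a state is frozen, and averaged, once it leaves the lag window.

```python
import numpy as np

def fixed_lag_smoother(y, propagate, loglik, x0, delta=10, seed=0):
    """Freeze each state delta steps after it is generated and use the
    frozen particle average as a proxy for E[x_t | y_{1:T}]."""
    rng = np.random.default_rng(seed)
    x = x0.copy()                       # (N,) particles, assumed ~ mu
    N = len(x)
    window = [x.copy()]                 # per-particle recent history
    out = []
    for t, yt in enumerate(y):
        if t > 0:
            x = propagate(x, rng)
            window.append(x.copy())
        logw = loglik(yt, x)
        W = np.exp(logw - logw.max()); W /= W.sum()
        idx = rng.choice(N, size=N, p=W)
        x = x[idx]
        window = [w[idx] for w in window]   # resample whole lag window
        if len(window) > delta + 1:         # oldest state exits the window
            out.append(window.pop(0).mean())
    out.extend(w.mean() for w in window)    # states still inside the window
    return np.array(out)
```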

39 Forward Filtering Backward Smoothing

Assume you want to compute the marginal smoothing distributions $\{p(x_t \mid y_{1:T})\}_{t=1}^{T}$ instead of sampling from them.

Forward filtering backward smoothing (FFBS):
$$\underbrace{p(x_t \mid y_{1:T})}_{\text{smoother at } t} = \int p(x_t, x_{t+1} \mid y_{1:T})\, dx_{t+1} = \int p(x_{t+1} \mid y_{1:T})\, p(x_t \mid y_{1:t}, x_{t+1})\, dx_{t+1} = \underbrace{p(x_t \mid y_{1:t})}_{\text{filter at } t} \int \frac{f(x_{t+1} \mid x_t)\, \overbrace{p(x_{t+1} \mid y_{1:T})}^{\text{smoother at } t+1}}{p(x_{t+1} \mid y_{1:t})}\, dx_{t+1},$$
where $f(x_{t+1} \mid x_t)\, p(x_t \mid y_{1:t}) / p(x_{t+1} \mid y_{1:t})$ is the backward transition $p(x_t \mid y_{1:t}, x_{t+1})$.

Conditioned upon $y_{1:T}$, $\{X_t\}_{t=1}^{T}$ is a backward Markov chain with initial distribution $p(x_T \mid y_{1:T})$ and inhomogeneous Markov transitions $\{p(x_t \mid y_{1:t}, x_{t+1})\}_{t=1}^{T-1}$.

40 Particle Forward Filtering Backward Smoothing

Forward filtering: compute and store $\{\hat{p}(x_t \mid y_{1:t})\}_{t=1}^{T}$ using your favourite PF.

Backward smoothing: for $t = T-1, \ldots, 1$, we have $\hat{p}(x_T \mid y_{1:T}) = \sum_{i=1}^{N} W^{(i)}_{T|T}\, \delta_{X^{(i)}_T}(x_T)$ with $W^{(i)}_{T|T} = 1/N$, and
$$\hat{p}(x_t \mid y_{1:T}) = \sum_{i=1}^{N} W^{(i)}_{t|T}\, \delta_{X^{(i)}_t}(x_t),$$
obtained by plugging $\hat{p}(x_t \mid y_{1:t}) = \frac{1}{N} \sum_{i=1}^{N} \delta_{X^{(i)}_t}(x_t)$ and $\hat{p}(x_{t+1} \mid y_{1:T}) = \sum_{j=1}^{N} W^{(j)}_{t+1|T}\, \delta_{X^{(j)}_{t+1}}(x_{t+1})$ into the FFBS recursion, where
$$W^{(i)}_{t|T} = \sum_{j=1}^{N} W^{(j)}_{t+1|T}\, \frac{f\big(X^{(j)}_{t+1} \mid X^{(i)}_t\big)}{\sum_{l=1}^{N} f\big(X^{(j)}_{t+1} \mid X^{(l)}_t\big)}.$$

Computational complexity is $O(TN^2)$, but a sampling approximation runs in $O(TN)$ (Douc et al., 2011).
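The $O(TN^2)$ backward pass is a direct transcription of the weight recursion above; this sketch assumes equally weighted (resampled) filter particles and a user-supplied transition density, both of which are illustrative conventions rather than the lecture's code.

```python
import numpy as np

def ffbs_marginal_weights(particles, f_transition):
    """O(T N^2) FFBS backward pass. `particles[t]` holds the resampled,
    equally weighted filter particles X_t^(1:N); `f_transition(xp, x)`
    evaluates f(xp | x) elementwise (broadcasting over arrays).
    Returns smoothing weights W[t, i] approximating p(x_t | y_{1:T})."""
    T, N = len(particles), len(particles[0])
    W = np.full((T, N), 1.0 / N)
    for t in range(T - 2, -1, -1):
        # F[j, i] = f(X_{t+1}^(j) | X_t^(i))
        F = f_transition(particles[t + 1][:, None], particles[t][None, :])
        W[t] = (W[t + 1][:, None] * F / F.sum(axis=1, keepdims=True)).sum(axis=0)
    return W

# e.g. for X_t = 0.9 X_{t-1} + N(0, 1) (unnormalized density suffices,
# since the constant cancels in the ratio):
# f = lambda xp, x: np.exp(-0.5 * (xp - 0.9 * x) ** 2)
```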

41 Two-Filter Smoothing

An alternative to FB smoothing is the Two-Filter (TF) formula
$$p(x_t, x_{t+1} \mid y_{1:T}) \propto \underbrace{p(x_t \mid y_{1:t})}_{\text{forward filter}}\, f(x_{t+1} \mid x_t)\, \underbrace{p(y_{t+1:T} \mid x_{t+1})}_{\text{backward filter}}.$$

The backward information filter satisfies $p(y_T \mid x_T) = g(y_T \mid x_T)$ and
$$p(y_{t:T} \mid x_t) = g(y_t \mid x_t) \int p(y_{t+1:T} \mid x_{t+1})\, f(x_{t+1} \mid x_t)\, dx_{t+1}.$$

Various particle methods have been proposed to approximate $\{p(y_{t:T} \mid x_t)\}_{t=1}^{T}$, but they rely implicitly on $\int p(y_{t:T} \mid x_t)\, dx_t < \infty$ and try to come up with a backward dynamics; e.g. solve $X_{t+1} = \varphi(X_t, V_{t+1})$ for $X_t = \varphi^{-1}(X_{t+1}, V_{t+1})$.

42 Generalized Two-Filter Smoothing

Generalized Two-Filter smoothing (Briers, D. & Maskell, 2010):
$$p(x_t, x_{t+1} \mid y_{1:T}) \propto \underbrace{p(x_t \mid y_{1:t})}_{\text{forward filter}}\, f(x_{t+1} \mid x_t)\, \frac{\overbrace{\widetilde{p}(x_{t+1} \mid y_{t+1:T})}^{\text{backward filter}}}{\underbrace{\widetilde{p}(x_{t+1})}_{\text{artificial prior}}}$$
where $\widetilde{p}(x_{t+1} \mid y_{t+1:T}) \propto p(y_{t+1:T} \mid x_{t+1})\, \widetilde{p}(x_{t+1})$.

By construction, we now have an integrable $\widetilde{p}(x_{t+1} \mid y_{t+1:T})$, which we can approximate using a backward SMC algorithm targeting $\{\widetilde{p}(x_{t+1:T} \mid y_{t+1:T})\}_{t=T-1}^{1}$ where
$$\widetilde{p}(x_{t:T} \mid y_{t:T}) \propto \widetilde{p}(x_t) \prod_{k=t+1}^{T} f(x_k \mid x_{k-1}) \prod_{k=t}^{T} g(y_k \mid x_k).$$

43 Particle Generalized Two-Filter Smoothing

Forward filter: compute and store $\{\hat{p}(x_t \mid y_{1:t})\}_{t=1}^{T}$ using your favourite PF.

Backward filter: compute and store $\big\{\hat{\widetilde{p}}(x_t \mid y_{t:T})\big\}_{t=1}^{T}$ using your favourite PF.

Combination step: for any $t \in \{1, \ldots, T\}$ we have
$$\hat{p}(x_t, x_{t+1} \mid y_{1:T}) \propto \hat{p}(x_t \mid y_{1:t})\, f(x_{t+1} \mid x_t)\, \frac{\hat{\widetilde{p}}(x_{t+1} \mid y_{t+1:T})}{\widetilde{p}(x_{t+1})} \propto \sum_{i=1}^{N} \sum_{j=1}^{N} \frac{f\big(X^{(j)}_{t+1} \mid X^{(i)}_t\big)}{\widetilde{p}\big(X^{(j)}_{t+1}\big)}\, \delta_{\big(X^{(i)}_t,\, X^{(j)}_{t+1}\big)}(x_t, x_{t+1}),$$
where the $X^{(i)}_t$ are forward filter particles and the $X^{(j)}_{t+1}$ backward filter particles.

Cost is $O(N^2 T)$, but $O(NT)$ through importance sampling (Briers, D. & Singh, 2005; Fearnhead, Wyncoll & Tawn, 2010).

44 Comparison: Direct Method vs Fixed-lag, FB and TF

Assume the model is exponentially stable and we are interested in approximating $\varphi_{t|T} = \int \varphi(x_t)\, p(x_t \mid y_{1:T})\, dx_t$.

| Method | Fixed-lag | Direct | FB/TF |
| # particles | $N$ | $N$ | $N$ |
| Cost | $O(TN)$ | $O(TN)$ | $O(TN^2)$, $O(TN)$ |
| Variance | $O(1/N)$ | $O((T-t+1)/N)$ | $O(1/N)$ |
| Bias | $\delta$ | $O(1/N)$ | $O(1/N)$ |
| MSE = Bias² + Var | $\delta^2 + O(1/N)$ | $O((T-t+1)/N)$ | $O(1/N)$ |

FB/TF provide uniformly good approximations of $\{p(x_t \mid y_{1:T})\}_{t=1}^{T}$, whereas the direct method provides good approximations only when $T - t$ is small. Fast implementations of FB and TF, of computational complexity $O(NT)$, outperform the other approaches in terms of MSE.

45 Summary

Particle smoothing techniques allow us to solve the degeneracy problem.

Particle fixed-lag smoothing is the simplest one, but it has a non-vanishing bias that is difficult to quantify.

Particle FB and TF algorithms provide uniformly good approximations of the marginal smoothing distributions, contrary to the direct method.

46 Some References and Resources

A.D., J.F.G. De Freitas & N.J. Gordon (editors), Sequential Monte Carlo Methods in Practice, Springer-Verlag: New York, 2001.

P. Del Moral, Feynman-Kac Formulae: Genealogical and Interacting Particle Systems with Applications, Springer-Verlag: New York, 2004.

O. Cappé, E. Moulines & T. Rydén, Hidden Markov Models, Springer-Verlag: New York, 2005.

Webpage with links to papers and codes: http://...
