Maximum Likelihood Parameter Estimation in State-Space Models


1 Maximum Likelihood Parameter Estimation in State-Space Models

Arnaud Doucet, Department of Statistics, Oxford University
University College London, 4th October 2012

A. Doucet (UCL Masterclass Oct. 2012) 4th October 2012 1 / 32

2 State-Space Models

Let {X_t}_{t≥1} be a latent/hidden X-valued Markov process with X_1 ~ μ(·) and X_t | (X_{t-1} = x) ~ f(·|x). Let {Y_t}_{t≥1} be a Y-valued observation process such that Y_t | (X_t = x) ~ g(·|x).

Particle filters estimate {p(x_{1:t} | y_{1:t})}_{t≥1} on-line, but only the estimates of {p(x_t | y_{1:t})}_{t≥1} and {p(y_{1:t})}_{t≥1} are reliable.

Particle smoothing methods allow us to obtain reliable estimates of {p(x_t | y_{1:T})}_{t=1}^{T}.
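
The filter described above can be sketched as a generic bootstrap particle filter. This is a minimal illustration, not the lecture's code; all names (`bootstrap_filter`, the sampler/log-density callables) are mine, and multinomial resampling at every step is one choice among several.

```python
import numpy as np

def bootstrap_filter(y, mu_sample, f_sample, g_logpdf, N=1000, rng=None):
    """Generic bootstrap particle filter.

    Returns the final filtering particles (approximating p(x_T | y_{1:T}))
    and an estimate of the log-marginal likelihood log p(y_{1:T}).
    """
    rng = np.random.default_rng(rng)
    x = mu_sample(N, rng)                    # X_1^(i) ~ mu
    log_lik = 0.0
    for t in range(len(y)):
        if t > 0:
            x = f_sample(x, rng)             # X_t^(i) ~ f(. | x_{t-1}^(i))
        logw = g_logpdf(y[t], x)             # weight by g(y_t | x_t)
        m = logw.max()
        w = np.exp(logw - m)
        log_lik += m + np.log(w.mean())      # estimate of log p(y_t | y_{1:t-1})
        x = x[rng.choice(N, size=N, p=w / w.sum())]   # multinomial resampling
    return x, log_lik

# Toy run on X_t = 0.9 X_{t-1} + V_t, Y_t = X_t + W_t, V_t, W_t ~ N(0, 1).
rng = np.random.default_rng(0)
T = 50
x_true = np.empty(T)
x_true[0] = rng.normal()
for t in range(1, T):
    x_true[t] = 0.9 * x_true[t - 1] + rng.normal()
y = x_true + rng.normal(size=T)

parts, ll = bootstrap_filter(
    y,
    mu_sample=lambda N, r: r.normal(size=N),
    f_sample=lambda x, r: 0.9 * x + r.normal(size=x.shape),
    g_logpdf=lambda yt, x: -0.5 * np.log(2 * np.pi) - 0.5 * (yt - x) ** 2,
    N=500, rng=1)
```

Only the final-time particles are returned here; storing full paths x_{1:t} is exactly what degenerates, as the slide notes.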

3 State-Space Models with Unknown Parameters

In most scenarios of interest, the state-space model contains an unknown static parameter θ ∈ Θ, so that X_1 ~ μ_θ(·) and X_t | (X_{t-1} = x) ~ f_θ(·|x). The observations {Y_t}_{t≥1} are conditionally independent given {X_t}_{t≥1} and θ, with Y_t | (X_t = x) ~ g_θ(·|x).

Aim: we would like to infer θ either on-line or off-line.

4 Examples

Stochastic volatility model, where θ = (φ, σ², β):

    X_t = φ X_{t-1} + σ V_t,  V_t ~ i.i.d. N(0, 1)
    Y_t = β exp(X_t / 2) W_t,  W_t ~ i.i.d. N(0, 1)

Biochemical network model, where θ = (α, β, γ):

    Pr(X¹_{t+dt} = x¹_t + 1, X²_{t+dt} = x²_t | x¹_t, x²_t) = α x¹_t dt + o(dt),
    Pr(X¹_{t+dt} = x¹_t − 1, X²_{t+dt} = x²_t + 1 | x¹_t, x²_t) = β x¹_t x²_t dt + o(dt),
    Pr(X¹_{t+dt} = x¹_t, X²_{t+dt} = x²_t − 1 | x¹_t, x²_t) = γ x²_t dt + o(dt),

with observations Y_k = X¹_{kT} + W_k, where W_k ~ i.i.d. N(0, σ²).
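
A simulator for the stochastic volatility model above is a one-liner per equation. This is a sketch: the slides leave μ_θ unspecified, so the stationary N(0, σ²/(1−φ²)) initialisation used here is an assumption, and `simulate_sv` is my own name.

```python
import numpy as np

def simulate_sv(T, phi, sigma, beta, rng=None):
    """Simulate X_t = phi X_{t-1} + sigma V_t, Y_t = beta exp(X_t/2) W_t.

    Assumption: X_1 drawn from the stationary law N(0, sigma^2/(1-phi^2));
    the slides do not specify mu_theta.
    """
    rng = np.random.default_rng(rng)
    x = np.empty(T)
    x[0] = rng.normal(0.0, sigma / np.sqrt(1.0 - phi ** 2))
    for t in range(1, T):
        x[t] = phi * x[t - 1] + sigma * rng.normal()
    y = beta * np.exp(x / 2.0) * rng.normal(size=T)   # observation equation
    return x, y

x, y = simulate_sv(500, phi=0.95, sigma=0.3, beta=0.7, rng=0)
```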

5 Parameter Inference in State-Space Models

- Online Bayesian parameter inference.
- Offline maximum likelihood parameter inference.
- Online maximum likelihood parameter inference.

6 Bayesian Parameter Inference in State-Space Models

Set a prior p(θ) on θ, so inference now relies on

    p(θ, x_{1:t} | y_{1:t}) = p(θ, x_{1:t}, y_{1:t}) / p(y_{1:t})

where p(θ, x_{1:t}, y_{1:t}) = p(θ) p_θ(x_{1:t}, y_{1:t}) with

    p_θ(x_{1:t}, y_{1:t}) = μ_θ(x_1) ∏_{k=2}^{t} f_θ(x_k | x_{k-1}) ∏_{k=1}^{t} g_θ(y_k | x_k).

We have p(θ, x_{1:t} | y_{1:t}) = p(θ | y_{1:t}) p_θ(x_{1:t} | y_{1:t}).

Standard and more sophisticated particle methods to sample from {p(θ, x_{1:t} | y_{1:t})}_{t≥1} are ALL unreliable.
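
The factorisation of p_θ(x_{1:t}, y_{1:t}) translates directly into a log-density computation. A minimal sketch, with my own names; the density callables are hypothetical and carry θ implicitly (e.g. via closures).

```python
import numpy as np

def log_joint(x, y, log_mu, log_f, log_g):
    """log p_theta(x_{1:t}, y_{1:t}) via the factorisation above:
    log mu(x_1) + sum_{k=2}^t log f(x_k | x_{k-1}) + sum_{k=1}^t log g(y_k | x_k).
    """
    out = log_mu(x[0]) + log_g(y[0], x[0])
    for k in range(1, len(x)):
        out += log_f(x[k], x[k - 1]) + log_g(y[k], x[k])
    return out

# Hypothetical instantiation: standard-normal mu, Gaussian random walk f,
# Gaussian observation g, evaluated at the all-zero trajectory.
log_n = lambda z: -0.5 * np.log(2 * np.pi) - 0.5 * z ** 2
val = log_joint(
    x=np.zeros(3), y=np.zeros(3),
    log_mu=lambda x1: log_n(x1),
    log_f=lambda xk, xkm1: log_n(xk - xkm1),
    log_g=lambda yk, xk: log_n(yk - xk))
```

At the zero trajectory every one of the 1 + 2 + 3 = 6 Gaussian terms contributes −½ log 2π, which gives a quick correctness check.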

7 Online Bayesian Parameter Inference

At time t = 1: sample (θ^(i)_1, X^(i)_1) ~ p(θ) μ_θ(x_1), then

    p̂(θ, x_1 | y_1) = Σ_{i=1}^{N} W^(i)_1 δ_{(θ^(i)_1, X^(i)_1)}(θ, x_1),  W^(i)_1 ∝ g_{θ^(i)_1}(y_1 | X^(i)_1).

Resample (θ̄^(i)_1, X̄^(i)_1) ~ p̂(θ, x_1 | y_1) to obtain

    p̄(θ, x_1 | y_1) = (1/N) Σ_{i=1}^{N} δ_{(θ̄^(i)_1, X̄^(i)_1)}(θ, x_1).

At time t ≥ 2: set θ^(i)_t = θ̄^(i)_{t-1}, sample X^(i)_t ~ f_{θ^(i)_t}(· | X̄^(i)_{t-1}), then

    p̂(θ, x_{1:t} | y_{1:t}) = Σ_{i=1}^{N} W^(i)_t δ_{(θ^(i)_t, X^(i)_{1:t})}(θ, x_{1:t}),  W^(i)_t ∝ g_{θ^(i)_t}(y_t | X^(i)_t),

and resample (θ̄^(i)_t, X̄^(i)_{1:t}) ~ p̂(θ, x_{1:t} | y_{1:t}) to obtain

    p̄(θ, x_{1:t} | y_{1:t}) = (1/N) Σ_{i=1}^{N} δ_{(θ̄^(i)_t, X̄^(i)_{1:t})}(θ, x_{1:t}).

8 Online Bayesian Parameter Inference

This provides consistent estimates but is remarkably inefficient (Chopin, 2002): particles in Θ-space are only sampled at time 1, so {θ^(i)_1} suffers from a degeneracy problem!

Consider the extended state Z_t = (X_t, θ_t); then

    ν(z_1) = p(θ_1) μ_{θ_1}(x_1),  f(z_t | z_{t-1}) = δ_{θ_{t-1}}(θ_t) f_{θ_t}(x_t | x_{t-1}),  g(y_t | z_t) = g_{θ_t}(y_t | x_t);

i.e. θ_t = θ_1 for any t, with θ_1 from the prior. The exponential stability assumption on {p(z_t | y_{1:t})}_{t≥1} cannot be satisfied.

Use MCMC steps on θ so as to jitter {θ^(i)_t}; e.g. Andrieu, De Freitas & D. (1999); Fearnhead (2002); Gilks & Berzuini (2001); Carvalho et al. (2010). When p(θ | y_{1:t}, x_{1:t}) = p(θ | s_t(x_{1:t}, y_{1:t})), where s_t(x_{1:t}, y_{1:t}) is a fixed-dimensional vector, this is elegant but still implicitly relies on p(x_{1:t} | y_{1:t}), so degeneracy will creep in.

9 Online Bayesian Parameter Inference

At time t ≥ 1, we have p̄(θ, x_{1:t-1} | y_{1:t-1}) = (1/N) Σ_{i=1}^{N} δ_{(θ̄^(i)_{t-1}, X̄^(i)_{1:t-1})}(θ, x_{1:t-1}).

Set θ^(i)_t = θ̄^(i)_{t-1}, sample X^(i)_t ~ f_{θ^(i)_t}(· | X̄^(i)_{t-1}), and set X^(i)_{1:t} = (X̄^(i)_{1:t-1}, X^(i)_t); then

    p̂(θ, x_{1:t} | y_{1:t}) = Σ_{i=1}^{N} W^(i)_t δ_{(θ^(i)_t, X^(i)_{1:t})}(θ, x_{1:t}),  W^(i)_t ∝ g_{θ^(i)_t}(y_t | X^(i)_t).

Resample (θ̃^(i)_t, X̄^(i)_{1:t}) ~ p̂(θ, x_{1:t} | y_{1:t}), then sample θ̄^(i)_t ~ p(θ | y_{1:t}, X̄^(i)_{1:t}) to obtain

    p̄(θ, x_{1:t} | y_{1:t}) = (1/N) Σ_{i=1}^{N} δ_{(θ̄^(i)_t, X̄^(i)_{1:t})}(θ, x_{1:t}).

10 A Toy Example

Linear Gaussian state-space model:

    X_t = θ X_{t-1} + σ_V V_t,  V_t ~ i.i.d. N(0, 1)
    Y_t = X_t + σ_W W_t,  W_t ~ i.i.d. N(0, 1).

We set p(θ) ∝ 1_{(−1,1)}(θ), so

    p(θ | y_{1:t}, x_{1:t}) ∝ N(θ; m_t, σ²_t) 1_{(−1,1)}(θ)

where σ²_t = σ²_V S⁻¹_{2,t} and m_t = S⁻¹_{2,t} S_{1,t}, with

    S_{1,t} = Σ_{k=2}^{t} x_{k-1} x_k,  S_{2,t} = Σ_{k=2}^{t} x²_{k-1}.
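
The sufficient statistics S_{1,t}, S_{2,t} above are trivial to accumulate. A sketch (my own function name), computing the Gaussian posterior parameters before truncation to (−1, 1):

```python
import numpy as np

def theta_posterior(x, sigma_v=1.0):
    """Posterior mean and variance of theta given a state path x_{1:t}
    in the toy model above (before truncation to (-1, 1)):
    m_t = S_{1,t} / S_{2,t},  sigma_t^2 = sigma_v^2 / S_{2,t}."""
    s1 = np.sum(x[:-1] * x[1:])      # S_{1,t} = sum_{k=2}^t x_{k-1} x_k
    s2 = np.sum(x[:-1] ** 2)         # S_{2,t} = sum_{k=2}^t x_{k-1}^2
    return s1 / s2, sigma_v ** 2 / s2

# Check on a simulated path with theta = 0.8, sigma_V = 1:
# the posterior mean should concentrate near 0.8.
rng = np.random.default_rng(0)
T = 5000
x = np.empty(T)
x[0] = rng.normal()
for t in range(1, T):
    x[t] = 0.8 * x[t - 1] + rng.normal()
m, var = theta_posterior(x)
```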

11 Illustration of the Degeneracy Problem

Figure: SMC estimate of E[θ | y_{1:t}]; as t increases, the degeneracy creeps in.

12 Another Toy Example

Linear Gaussian state-space model:

    X_t = ρ X_{t-1} + V_t,  V_t ~ i.i.d. N(0, 1)
    Y_t = X_t + σ W_t,  W_t ~ i.i.d. N(0, 1).

We set ρ ~ U(−1, 1) and σ² ~ IG(1, 1). We use a particle filter with perfect adaptation and Gibbs moves, with N = 10⁴ particles; particle learning (Andrieu, D. & De Freitas, 1999; Carvalho et al., 2010).

50 runs of the particle method vs the ground truth obtained using a Kalman filter on the states and a grid on the parameters.

13 Another Illustration of Degeneracy for Particle Learning

Figure: Estimates of p(ρ | y_{1:t}) and p(σ² | y_{1:t}) (panels for t = 10³, 2·10³, ...) over 50 runs (red) vs ground truth (blue), for N = 10⁴ (Kantas et al., 2012).

14 Stepping Back

For fixed θ, V[ p̂_θ(y_{1:t}) / p_θ(y_{1:t}) ] is in O(t/N).

In a Bayesian context, p(θ | y_{1:t}) ∝ p_θ(y_{1:t}) p(θ), so we implicitly need to compute p_θ(y_{1:t}) at each particle location θ^(i).

It appears impossible to obtain time-uniformly stable estimates of {p(θ | y_{1:t})}_{t≥1} for a fixed N. However, for a given time horizon T, we can use the PF to sample efficiently from p(θ | y_{1:T}); see Lecture 3.

15 Likelihood Function Estimation

Let y_{1:T} be given; the log-(marginal) likelihood is l(θ) = log p_θ(y_{1:T}).

For any θ ∈ Θ, one can estimate l(θ) using particle methods, with variance O(T/N).

Direct maximization of l(θ) is difficult, as the estimate of l(θ) is not a smooth function of θ, even for a fixed random seed.

For dim(X) = 1, we can obtain a smooth estimate of the log-likelihood function by using a smoothed resampling step (e.g. Pitt, 2011); i.e. a piecewise linear approximation of Pr(X_t < x | y_{1:t}).

For dim(X) > 1, we can obtain estimates of l(θ) that are highly positively correlated for neighbouring values in Θ (e.g. Lee, 2008).

16 Gradient Ascent

To maximize l(θ) w.r.t. θ, use at iteration k + 1

    θ_{k+1} = θ_k + γ_k ∇l(θ)|_{θ=θ_k}

where ∇l(θ)|_{θ=θ_k} is the so-called score vector. It can be estimated using finite differences, but more efficiently using Fisher's identity

    ∇l(θ) = ∫ ∇ log p_θ(x_{1:T}, y_{1:T}) p_θ(x_{1:T} | y_{1:T}) dx_{1:T}

where

    log p_θ(x_{1:T}, y_{1:T}) = log μ_θ(x_1) + Σ_{t=2}^{T} log f_θ(x_t | x_{t-1}) + Σ_{t=1}^{T} log g_θ(y_t | x_t).
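
The update rule itself can be illustrated on a toy concave objective, with the exact score standing in for a particle score estimate. This is only a sketch of the iteration θ_{k+1} = θ_k + γ_k ∇l(θ_k); the function names are mine.

```python
def gradient_ascent(score, theta0, gamma, iters=200):
    """The update theta_{k+1} = theta_k + gamma_k * score(theta_k),
    where `score` plays the role of the score vector grad l(theta)."""
    theta = theta0
    for k in range(1, iters + 1):
        theta = theta + gamma(k) * score(theta)
    return theta

# Toy objective l(theta) = -(theta - 2)^2, maximised at theta = 2,
# with exact score -2 (theta - 2) and a constant step size.
theta_hat = gradient_ascent(score=lambda th: -2.0 * (th - 2.0),
                            theta0=0.0, gamma=lambda k: 0.1)
```

With step 0.1 the error contracts by a factor 0.8 per iteration, so 200 iterations drive it to numerical zero.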

17 Particle Calculation of the Score Vector

We have

    ∇l(θ) = ∫ { ∇ log μ_θ(x_1) + ∇ log g_θ(y_1 | x_1) } p_θ(x_1 | y_{1:T}) dx_1
          + Σ_{t=2}^{T} ∫ { ∇ log f_θ(x_t | x_{t-1}) + ∇ log g_θ(y_t | x_t) } p_θ(x_{t-1}, x_t | y_{1:T}) dx_{t-1} dx_t.

To approximate ∇l(θ), we just need particle approximations of the marginals {p_θ(x_{t-1}, x_t | y_{1:T})}_{t=2}^{T}. All the particle smoothing methods detailed before can be applied.

Similar smoothed additive functionals have to be computed when implementing the Expectation-Maximization algorithm.

18 Comparison of the Direct Method vs FB

We want to estimate

    ϕ_T = Σ_{t=1}^{T} ∫ ϕ(x_{t-1}, x_t, y_t) p(x_{t-1}, x_t | y_{1:T}) dx_{t-1} dx_t.

    Method              Direct      FB
    # particles         N           N
    Cost                O(TN)       O(TN²), O(TN)
    Variance            O(T²/N)     O(T/N)
    Bias                O(T/N)      O(T/N)
    MSE = Bias² + Var   O(T²/N)     O(T²/N²)

Fast implementations of FB, of computational complexity O(NT), outperform the direct approach, as their MSE is O(T²/N²) whereas it is O(T²/N) for direct SMC.

Naive implementations of FB and TF (two-filter) have an MSE of the same order as the direct method for fixed computational complexity, but the MSE is bias-dominated for FB/TF, whereas it is variance-dominated for direct SMC.

19 Experimental Results

Consider a linear Gaussian model

    X_t = φ X_{t-1} + σ_v V_t,  V_t ~ i.i.d. N(0, 1)
    Y_t = c X_t + σ_w W_t,  W_t ~ i.i.d. N(0, 1).

We simulate 10,000 observations and compute particle estimates of ∫ ϕ_T(x_{1:T}) p(x_{1:T} | y_{1:T}) dx_{1:T} for 4 different additive functionals ϕ_t(x_{1:t}) = ϕ_{t-1}(x_{1:t-1}) + ϕ(x_{t-1}, x_t, y_t), including ϕ¹(x_{t-1}, x_t, y_t) = x_{t-1} x_t and ϕ²(x_{t-1}, x_t, y_t) = x²_t. [Ground truth can be computed using the Kalman smoother.]

We use 100 replications on the same dataset to estimate the empirical variance.

20 Boxplots of Direct vs FB Estimates

Figure: Score estimates over time steps for parameters σ_v and φ. Direct (left) vs FB (right).

21 Empirical Variance for Direct vs FB Estimates

Figure: Variance of the score estimates w.r.t. σ_v, φ, σ_w and c over time steps. Direct (left) vs FB (right); the vertical scale is different.

22 Online ML Parameter Inference

Recursive maximum likelihood (Titterington, 1984; LeGland & Mevel, 1997) proceeds as follows:

    θ_{t+1} = θ_t + γ_{t+1} ∇ log p_{θ_{1:t}}(y_t | y_{1:t-1})

where p_{θ_{1:t}}(y_t | y_{1:t-1}) is computed using θ_k at time k, and Σ_t γ_t = ∞, Σ_t γ²_t < ∞. Under regularity conditions, this converges towards a local maximum of the (average) log-likelihood.

Note that

    ∇ log p_{θ_{1:t}}(y_t | y_{1:t-1}) = ∇ log p_{θ_{1:t}}(y_{1:t}) − ∇ log p_{θ_{1:t-1}}(y_{1:t-1})

is given by the difference of two pseudo-score vectors, where

    ∇ log p_{θ_{1:t}}(y_{1:t}) := ∫ ( Σ_{k=2}^{t} ∇ log f_θ(x_k | x_{k-1})|_{θ_k} + Σ_{k=1}^{t} ∇ log g_θ(y_k | x_k)|_{θ_k} ) p_{θ_{1:t}}(x_{1:t} | y_{1:t}) dx_{1:t}.
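
The stochastic-approximation mechanics of this update can be seen on a toy analogue that needs no filter at all: i.i.d. Y_t ~ N(θ*, 1), where the score of the predictive density is available in closed form. This is only an analogue of the recursion, not the state-space algorithm; the names and the step-size exponent are my choices.

```python
import numpy as np

def rml_gaussian_mean(y, gamma_exp=0.6):
    """Robbins-Monro analogue of the RML update for i.i.d. Y_t ~ N(theta*, 1):
    theta_{t+1} = theta_t + gamma_{t+1} * d/dtheta log N(y_t; theta_t, 1)
                = theta_t + gamma_{t+1} * (y_t - theta_t),
    with gamma_t = t^{-gamma_exp}, so sum gamma_t = inf, sum gamma_t^2 < inf."""
    theta = 0.0
    for t, yt in enumerate(y, start=1):
        theta += t ** -gamma_exp * (yt - theta)
    return theta

rng = np.random.default_rng(0)
theta_hat = rml_gaussian_mean(rng.normal(loc=1.5, scale=1.0, size=20000))
```

The iterate drifts towards the maximiser θ* = 1.5 of the average log-likelihood, with fluctuations shrinking as γ_t decays.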

23 Online ML Particle Parameter Inference

The particle approximation follows:

    θ_{t+1} = θ_t + γ_{t+1} ∇̂ log p_{θ_{1:t}}(y_t | y_{1:t-1})

where ∇̂ log p_{θ_{1:t}}(y_t | y_{1:t-1}) = ∇̂ log p_{θ_{1:t}}(y_{1:t}) − ∇̂ log p_{θ_{1:t-1}}(y_{1:t-1}) is given by the difference of particle estimates of the pseudo-score vectors (Poyiadjis, D. & Singh, 2011).

The asymptotic variance of ∇̂ log p_{θ_{1:t}}(y_t | y_{1:t-1}) is uniformly bounded in O(1/N) for the FB estimate, whereas it is O(t/N) for the direct particle method (Del Moral, D. & Singh, 2011). The bias is O(1/N) in both cases.

Major problem: if we use FB, this is not an online algorithm any more, as it requires a backward pass of order O(t) to approximate ∇ log p_{θ_{1:t}}(y_{1:t})...

24 Variance of the Gradient Estimate for Direct vs FB

Figure: Empirical variance of the gradient estimate for standard versus FB approximations (SV model).

25 Online Particle ML Inference using the Direct Approach

Figure: N = 1,000 particles, online parameter estimates for the SV model.

26 Online Particle ML Inference using FB

Figure: N = 50 particles, online parameter estimates for the SV model.

27 Forward-only Smoothing

Dynamic programming allows us to compute in a single forward pass the FB estimates of

    ϕ_t^θ = ∫ ϕ_t(x_{1:t}) p_θ(x_{1:t} | y_{1:t}) dx_{1:t},  where  ϕ_t(x_{1:t}) = Σ_{k=1}^{t} ϕ(x_{k-1}, x_k, y_k).

The forward-backward (FB) decomposition states

    p_θ(x_{1:T} | y_{1:T}) = p_θ(x_T | y_{1:T}) ∏_{t=1}^{T-1} p_θ(x_t | y_{1:t}, x_{t+1})

where

    p_θ(x_t | y_{1:t}, x_{t+1}) = f_θ(x_{t+1} | x_t) p_θ(x_t | y_{1:t}) / p_θ(x_{t+1} | y_{1:t}).

Conditioned upon y_{1:T}, {X_t}_{t=1}^{T} is a backward Markov chain with initial distribution p(x_T | y_{1:T}) and inhomogeneous Markov transitions {p_θ(x_t | y_{1:t}, x_{t+1})}_{t=1}^{T-1}, which are independent of T.

28 Forward-only Smoothing

We have

    ϕ_t^θ = ∫ [ ∫ ϕ_t(x_{1:t}) p_θ(x_{1:t-1} | y_{1:t-1}, x_t) dx_{1:t-1} ] p_θ(x_t | y_{1:t}) dx_t
          = ∫ V_t^θ(x_t) p_θ(x_t | y_{1:t}) dx_t.

Forward smoothing recursion:

    V_t^θ(x_t) = ∫ [ V_{t-1}^θ(x_{t-1}) + ϕ(x_{t-1}, x_t, y_t) ] p_θ(x_{t-1} | y_{1:t-1}, x_t) dx_{t-1}.

This appears implicitly in Elliott, Aggoun & Moore (1996), Ford (1998), and was rediscovered a few times... The presentation here follows (Del Moral, D. & Singh, 2009).

29 Forward-only Smoothing

Forward smoothing recursion:

    V_t^θ(x_t) = ∫ [ V_{t-1}^θ(x_{t-1}) + ϕ(x_{t-1}, x_t, y_t) ] p_θ(x_{t-1} | y_{1:t-1}, x_t) dx_{t-1}.

The proof is trivial:

    V_t^θ(x_t) = ∫ ϕ_t(x_{1:t}) p_θ(x_{1:t-1} | y_{1:t-1}, x_t) dx_{1:t-1}
               = ∫ [ ϕ_{t-1}(x_{1:t-1}) + ϕ(x_{t-1}, x_t, y_t) ] p_θ(x_{1:t-2} | y_{1:t-2}, x_{t-1}) p_θ(x_{t-1} | y_{1:t-1}, x_t) dx_{1:t-1}
               = ∫ { ∫ ϕ_{t-1}(x_{1:t-1}) p_θ(x_{1:t-2} | y_{1:t-2}, x_{t-1}) dx_{1:t-2} + ϕ(x_{t-1}, x_t, y_t) } p_θ(x_{t-1} | y_{1:t-1}, x_t) dx_{t-1}
               = ∫ [ V_{t-1}^θ(x_{t-1}) + ϕ(x_{t-1}, x_t, y_t) ] p_θ(x_{t-1} | y_{1:t-1}, x_t) dx_{t-1}.

An exact implementation is possible for finite state-space and linear Gaussian models.
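
For a finite state space the recursion above is exact, and can be checked against brute-force path enumeration. A sketch with my own names and two simplifications flagged in the docstring (ϕ without y-dependence, additive functional starting at t = 2):

```python
import itertools
import numpy as np

def forward_only_smoothing(P, g, mu, y, phi):
    """Exact forward-only smoothing recursion for a finite-state HMM.

    P[i, j] = f(x_t = j | x_{t-1} = i),  g[i, k] = g(y = k | x = i),
    mu = law of X_1.  phi[i, j] = phi(x_{t-1} = i, x_t = j): no y-dependence,
    and the additive functional starts at t = 2 (both simplifications).
    Returns E[ sum_{t=2}^T phi(X_{t-1}, X_t) | y_{1:T} ].
    """
    filt = mu * g[:, y[0]]
    filt = filt / filt.sum()              # p(x_1 | y_1)
    V = np.zeros(len(mu))                 # V_1 = 0
    for t in range(1, len(y)):
        back = filt[:, None] * P          # prop. to p(x_{t-1} | y_{1:t-1}, x_t)
        back = back / back.sum(axis=0, keepdims=True)
        V = ((V[:, None] + phi) * back).sum(axis=0)   # forward recursion
        filt = (filt @ P) * g[:, y[t]]    # filter update
        filt = filt / filt.sum()
    return float(V @ filt)                # integrate V_T against p(x_T | y_{1:T})

# Sanity check on a 2-state chain against brute-force path enumeration.
P = np.array([[0.9, 0.1], [0.2, 0.8]])
g = np.array([[0.7, 0.3], [0.4, 0.6]])
mu = np.array([0.5, 0.5])
phi = np.array([[0.0, 1.0], [1.0, 0.0]])   # counts state switches
y = [0, 1, 1, 0]

est = forward_only_smoothing(P, g, mu, y, phi)

num = den = 0.0
for path in itertools.product(range(2), repeat=len(y)):
    w = mu[path[0]] * g[path[0], y[0]]
    add = 0.0
    for t in range(1, len(y)):
        w *= P[path[t - 1], path[t]] * g[path[t], y[t]]
        add += phi[path[t - 1], path[t]]
    num += w * add
    den += w
exact = num / den
```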

30 Particle Forward-only Smoothing

At time t − 1, we have p̂_θ(x_{t-1} | y_{1:t-1}) = (1/N) Σ_{i=1}^{N} δ_{X^(i)_{t-1}}(x_{t-1}) and { V̂_{t-1}^θ(X^(i)_{t-1}) } (1 ≤ i ≤ N).

At time t, compute p̂_θ(x_t | y_{1:t}) = Σ_{i=1}^{N} W^(i)_t δ_{X^(i)_t}(x_t) and set

    V̂_t^θ(X^(i)_t) = ∫ { V̂_{t-1}^θ(x_{t-1}) + ϕ(x_{t-1}, x_t, y_t) } p̂_θ(x_{t-1} | y_{1:t-1}, X^(i)_t) dx_{t-1}
                   = Σ_{j=1}^{N} f_θ(X^(i)_t | X^(j)_{t-1}) [ V̂_{t-1}^θ(X^(j)_{t-1}) + ϕ(X^(j)_{t-1}, X^(i)_t, y_t) ] / Σ_{j=1}^{N} f_θ(X^(i)_t | X^(j)_{t-1}),

    ϕ̂_t^θ = (1/N) Σ_{i=1}^{N} V̂_t^θ(X^(i)_t).

This estimate is exactly the same as the particle FB estimate, with computational complexity O(N²).
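
A sketch of this O(N²) recursion, under assumptions of mine: multinomial resampling at every step (the slide leaves the filter unspecified), weighted particles {X^(j)_{t-1}, W^(j)_{t-1}} in the backward kernel, and my own names throughout. The structural check at the end uses the fact that with ϕ ≡ 1 the output must be exactly T − 1, whatever the weights.

```python
import numpy as np

def particle_forward_smoothing(y, mu_sample, f_sample, f_logpdf, g_logpdf,
                               phi, N=500, rng=None):
    """O(N^2) particle forward-only smoothing sketch for the additive
    functional sum_{t=2}^T phi(x_{t-1}, x_t, y_t)."""
    rng = np.random.default_rng(rng)
    x = mu_sample(N, rng)
    V = np.zeros(N)
    logw = g_logpdf(y[0], x)
    W = np.exp(logw - logw.max()); W = W / W.sum()
    for t in range(1, len(y)):
        xp, Vp = x, V                                   # time t-1 quantities
        x = f_sample(xp[rng.choice(N, size=N, p=W)], rng)   # resample + move
        # backward weights prop. to W_{t-1}^(j) f(x_t^(i) | x_{t-1}^(j))
        lf = f_logpdf(x[:, None], xp[None, :])          # (i, j)
        b = W[None, :] * np.exp(lf - lf.max(axis=1, keepdims=True))
        b = b / b.sum(axis=1, keepdims=True)
        V = (b * (Vp[None, :] + phi(xp[None, :], x[:, None], y[t]))).sum(axis=1)
        logw = g_logpdf(y[t], x)
        W = np.exp(logw - logw.max()); W = W / W.sum()
    return float(V @ W)

# AR(1) + noise toy model; smoothed sum of x_{t-1} x_t.
rng = np.random.default_rng(0)
T = 10
xs = np.empty(T)
xs[0] = rng.normal()
for t in range(1, T):
    xs[t] = 0.9 * xs[t - 1] + rng.normal()
ys = xs + rng.normal(size=T)

kw = dict(
    mu_sample=lambda N, r: r.normal(size=N),
    f_sample=lambda x, r: 0.9 * x + r.normal(size=x.shape),
    f_logpdf=lambda xt, xtm1: -0.5 * (xt - 0.9 * xtm1) ** 2,  # up to a constant
    g_logpdf=lambda yt, x: -0.5 * (yt - x) ** 2,              # up to a constant
    N=200, rng=1)
est = particle_forward_smoothing(ys, phi=lambda a, b, yt: a * b, **kw)
# Structural check: with phi = 1 the estimate is exactly T - 1.
ones = particle_forward_smoothing(ys, phi=lambda a, b, yt: a * 0 + b * 0 + 1.0, **kw)
```

The additive constants dropped from the log-densities cancel in the normalised weights, which is why they can be omitted here.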

31 Online Particle ML Inference

At time t − 1, we have p̂_{θ_{1:t-1}}(x_{t-1} | y_{1:t-1}), { V̂_{t-1}^{θ_{1:t-1}}(X^(i)_{t-1}) },

    ∇̂ log p_{θ_{1:t-1}}(y_{1:t-1}) = ∫ V̂_{t-1}^{θ_{1:t-1}}(x_{t-1}) p̂_{θ_{1:t-1}}(x_{t-1} | y_{1:t-1}) dx_{t-1},

and get θ_t.

At time t, use your favourite PF to compute p̂_{θ_{1:t}}(x_t | y_{1:t}) and

    V̂_t^{θ_{1:t}}(X^(i)_t) = ∫ { V̂_{t-1}^{θ_{1:t-1}}(x_{t-1}) + ϕ(x_{t-1}, x_t, y_t) } p̂_{θ_{1:t}}(x_{t-1} | y_{1:t-1}, X^(i)_t) dx_{t-1},

with ϕ(x_{t-1:t}, y_t) = ∇ log f_θ(x_t | x_{t-1})|_{θ_t} + ∇ log g_θ(y_t | x_t)|_{θ_t}, and

    ∇̂ log p_{θ_{1:t}}(y_{1:t}) = ∫ V̂_t^{θ_{1:t}}(x_t) p̂_{θ_{1:t}}(x_t | y_{1:t}) dx_t.

Parameter update:

    θ_{t+1} = θ_t + γ_{t+1} ( ∇̂ log p_{θ_{1:t}}(y_{1:t}) − ∇̂ log p_{θ_{1:t-1}}(y_{1:t-1}) ).

32 Summary

Online Bayesian parameter inference using particle methods is as yet an unsolved problem.

Particle smoothing techniques can be used to perform off-line and on-line ML parameter estimation.

The observed information matrix can also be evaluated online in a stable manner.

For online inference, the computational complexity is O(N²) at each time step, and the method requires evaluating f_θ(x_t | x_{t-1}) pointwise.


More information

Appendix to Creating Work Breaks From Available Idleness

Appendix to Creating Work Breaks From Available Idleness Appendix o Creaing Work Breaks From Available Idleness Xu Sun and Ward Whi Deparmen of Indusrial Engineering and Operaions Research, Columbia Universiy, New York, NY, 127; {xs2235,ww24}@columbia.edu Sepember

More information

Modeling and Forecasting Volatility Autoregressive Conditional Heteroskedasticity Models. Economic Forecasting Anthony Tay Slide 1

Modeling and Forecasting Volatility Autoregressive Conditional Heteroskedasticity Models. Economic Forecasting Anthony Tay Slide 1 Modeling and Forecasing Volailiy Auoregressive Condiional Heeroskedasiciy Models Anhony Tay Slide 1 smpl @all line(m) sii dl_sii S TII D L _ S TII 4,000. 3,000.1.0,000 -.1 1,000 -. 0 86 88 90 9 94 96 98

More information

Simulating models with heterogeneous agents

Simulating models with heterogeneous agents Simulaing models wih heerogeneous agens Wouer J. Den Haan London School of Economics c by Wouer J. Den Haan Individual agen Subjec o employmen shocks (ε i, {0, 1}) Incomplee markes only way o save is hrough

More information

Lecture 1 Overview. course mechanics. outline & topics. what is a linear dynamical system? why study linear systems? some examples

Lecture 1 Overview. course mechanics. outline & topics. what is a linear dynamical system? why study linear systems? some examples EE263 Auumn 27-8 Sephen Boyd Lecure 1 Overview course mechanics ouline & opics wha is a linear dynamical sysem? why sudy linear sysems? some examples 1 1 Course mechanics all class info, lecures, homeworks,

More information

Lie Derivatives operator vector field flow push back Lie derivative of

Lie Derivatives operator vector field flow push back Lie derivative of Lie Derivaives The Lie derivaive is a mehod of compuing he direcional derivaive of a vecor field wih respec o anoher vecor field We already know how o make sense of a direcional derivaive of real valued

More information

Approximation Algorithms for Unique Games via Orthogonal Separators

Approximation Algorithms for Unique Games via Orthogonal Separators Approximaion Algorihms for Unique Games via Orhogonal Separaors Lecure noes by Konsanin Makarychev. Lecure noes are based on he papers [CMM06a, CMM06b, LM4]. Unique Games In hese lecure noes, we define

More information

Notes on Kalman Filtering

Notes on Kalman Filtering Noes on Kalman Filering Brian Borchers and Rick Aser November 7, Inroducion Daa Assimilaion is he problem of merging model predicions wih acual measuremens of a sysem o produce an opimal esimae of he curren

More information

Hidden Markov Models

Hidden Markov Models Hidden Markov Models Probabilisic reasoning over ime So far, we ve mosly deal wih episodic environmens Excepions: games wih muliple moves, planning In paricular, he Bayesian neworks we ve seen so far describe

More information

EMS SCM joint meeting. On stochastic partial differential equations of parabolic type

EMS SCM joint meeting. On stochastic partial differential equations of parabolic type EMS SCM join meeing Barcelona, May 28-30, 2015 On sochasic parial differenial equaions of parabolic ype Isván Gyöngy School of Mahemaics and Maxwell Insiue Edinburgh Universiy 1 I. Filering problem II.

More information

Lecture 10 Estimating Nonlinear Regression Models

Lecture 10 Estimating Nonlinear Regression Models Lecure 0 Esimaing Nonlinear Regression Models References: Greene, Economeric Analysis, Chaper 0 Consider he following regression model: y = f(x, β) + ε =,, x is kx for each, β is an rxconsan vecor, ε is

More information

SEIF, EnKF, EKF SLAM. Pieter Abbeel UC Berkeley EECS

SEIF, EnKF, EKF SLAM. Pieter Abbeel UC Berkeley EECS SEIF, EnKF, EKF SLAM Pieer Abbeel UC Berkeley EECS Informaion Filer From an analyical poin of view == Kalman filer Difference: keep rack of he inverse covariance raher han he covariance marix [maer of

More information

Predator - Prey Model Trajectories and the nonlinear conservation law

Predator - Prey Model Trajectories and the nonlinear conservation law Predaor - Prey Model Trajecories and he nonlinear conservaion law James K. Peerson Deparmen of Biological Sciences and Deparmen of Mahemaical Sciences Clemson Universiy Ocober 28, 213 Ouline Drawing Trajecories

More information

A Sequential Smoothing Algorithm with Linear Computational Cost

A Sequential Smoothing Algorithm with Linear Computational Cost A Sequenial Smoohing Algorihm wih Linear Compuaional Cos Paul Fearnhead David Wyncoll Jonahan Tawn May 9, 2008 Absrac In his paper we propose a new paricle smooher ha has a compuaional complexiy of O(N),

More information

Recursive Estimation and Identification of Time-Varying Long- Term Fading Channels

Recursive Estimation and Identification of Time-Varying Long- Term Fading Channels Recursive Esimaion and Idenificaion of ime-varying Long- erm Fading Channels Mohammed M. Olama, Kiran K. Jaladhi, Seddi M. Djouadi, and Charalambos D. Charalambous 2 Universiy of ennessee Deparmen of Elecrical

More information

Empirical Process Theory

Empirical Process Theory Empirical Process heory 4.384 ime Series Analysis, Fall 27 Reciaion by Paul Schrimpf Supplemenary o lecures given by Anna Mikusheva Ocober 7, 28 Reciaion 7 Empirical Process heory Le x be a real-valued

More information

Generalized Least Squares

Generalized Least Squares Generalized Leas Squares Augus 006 1 Modified Model Original assumpions: 1 Specificaion: y = Xβ + ε (1) Eε =0 3 EX 0 ε =0 4 Eεε 0 = σ I In his secion, we consider relaxing assumpion (4) Insead, assume

More information

SMC in Estimation of a State Space Model

SMC in Estimation of a State Space Model SMC in Esimaion of a Sae Space Model Dong-Whan Ko Deparmen of Economics Rugers, he Sae Universiy of New Jersey December 31, 2012 Absrac I briefly summarize procedures for macroeconomic Dynamic Sochasic

More information

Part III: Chap. 2.5,2.6 & 12

Part III: Chap. 2.5,2.6 & 12 Survival Analysis Mah 434 Fall 2011 Par III: Chap. 2.5,2.6 & 12 Jimin Ding Mah Dep. www.mah.wusl.edu/ jmding/mah434/index.hml Jimin Ding, Ocober 4, 2011 Survival Analysis, Fall 2011 - p. 1/14 Jimin Ding,

More information

Lecture 33: November 29

Lecture 33: November 29 36-705: Inermediae Saisics Fall 2017 Lecurer: Siva Balakrishnan Lecure 33: November 29 Today we will coninue discussing he boosrap, and hen ry o undersand why i works in a simple case. In he las lecure

More information

Convergence of Sequential Monte Carlo Methods

Convergence of Sequential Monte Carlo Methods Convergence of Sequenial Mone Carlo Mehods by Dan Crisan - Arnaud Douce * Saisical Laboraory, DPMMS, Universiy of Cambridge, 16 Mill Lane, Cambridge, CB2 1SB, UK. Email: d.crisan@saslab.cam.ac.uk Signal

More information

Speech and Language Processing

Speech and Language Processing Speech and Language rocessing Lecure 4 Variaional inference and sampling Informaion and Communicaions Engineering Course Takahiro Shinozaki 08//5 Lecure lan (Shinozaki s par) I gives he firs 6 lecures

More information

Understanding the asymptotic behaviour of empirical Bayes methods

Understanding the asymptotic behaviour of empirical Bayes methods Undersanding he asympoic behaviour of empirical Bayes mehods Boond Szabo, Aad van der Vaar and Harry van Zanen EURANDOM, 11.10.2011. Conens 2/20 Moivaion Nonparameric Bayesian saisics Signal in Whie noise

More information

R t. C t P t. + u t. C t = αp t + βr t + v t. + β + w t

R t. C t P t. + u t. C t = αp t + βr t + v t. + β + w t Exercise 7 C P = α + β R P + u C = αp + βr + v (a) (b) C R = α P R + β + w (c) Assumpions abou he disurbances u, v, w : Classical assumions on he disurbance of one of he equaions, eg. on (b): E(v v s P,

More information

Random Walk with Anti-Correlated Steps

Random Walk with Anti-Correlated Steps Random Walk wih Ani-Correlaed Seps John Noga Dirk Wagner 2 Absrac We conjecure he expeced value of random walks wih ani-correlaed seps o be exacly. We suppor his conjecure wih 2 plausibiliy argumens and

More information

Math 10B: Mock Mid II. April 13, 2016

Math 10B: Mock Mid II. April 13, 2016 Name: Soluions Mah 10B: Mock Mid II April 13, 016 1. ( poins) Sae, wih jusificaion, wheher he following saemens are rue or false. (a) If a 3 3 marix A saisfies A 3 A = 0, hen i canno be inverible. True.

More information

5.1 - Logarithms and Their Properties

5.1 - Logarithms and Their Properties Chaper 5 Logarihmic Funcions 5.1 - Logarihms and Their Properies Suppose ha a populaion grows according o he formula P 10, where P is he colony size a ime, in hours. When will he populaion be 2500? We

More information

Financial Econometrics Kalman Filter: some applications to Finance University of Evry - Master 2

Financial Econometrics Kalman Filter: some applications to Finance University of Evry - Master 2 Financial Economerics Kalman Filer: some applicaions o Finance Universiy of Evry - Maser 2 Eric Bouyé January 27, 2009 Conens 1 Sae-space models 2 2 The Scalar Kalman Filer 2 21 Presenaion 2 22 Summary

More information

Machine Learning 4771

Machine Learning 4771 ony Jebara, Columbia Universiy achine Learning 4771 Insrucor: ony Jebara ony Jebara, Columbia Universiy opic 20 Hs wih Evidence H Collec H Evaluae H Disribue H Decode H Parameer Learning via JA & E ony

More information

Air Traffic Forecast Empirical Research Based on the MCMC Method

Air Traffic Forecast Empirical Research Based on the MCMC Method Compuer and Informaion Science; Vol. 5, No. 5; 0 ISSN 93-8989 E-ISSN 93-8997 Published by Canadian Cener of Science and Educaion Air Traffic Forecas Empirical Research Based on he MCMC Mehod Jian-bo Wang,

More information

Anno accademico 2006/2007. Davide Migliore

Anno accademico 2006/2007. Davide Migliore Roboica Anno accademico 2006/2007 Davide Migliore migliore@ele.polimi.i Today Eercise session: An Off-side roblem Robo Vision Task Measuring NBA layers erformance robabilisic Roboics Inroducion The Bayesian

More information

Optimal Investment under Dynamic Risk Constraints and Partial Information

Optimal Investment under Dynamic Risk Constraints and Partial Information Opimal Invesmen under Dynamic Risk Consrains and Parial Informaion Wolfgang Puschögl Johann Radon Insiue for Compuaional and Applied Mahemaics (RICAM) Ausrian Academy of Sciences www.ricam.oeaw.ac.a 2

More information

Robert Kollmann. 6 September 2017

Robert Kollmann. 6 September 2017 Appendix: Supplemenary maerial for Tracable Likelihood-Based Esimaion of Non- Linear DSGE Models Economics Leers (available online 6 Sepember 207) hp://dx.doi.org/0.06/j.econle.207.08.027 Rober Kollmann

More information

Robust estimation based on the first- and third-moment restrictions of the power transformation model

Robust estimation based on the first- and third-moment restrictions of the power transformation model h Inernaional Congress on Modelling and Simulaion, Adelaide, Ausralia, 6 December 3 www.mssanz.org.au/modsim3 Robus esimaion based on he firs- and hird-momen resricions of he power ransformaion Nawaa,

More information

STAD57 Time Series Analysis. Lecture 14

STAD57 Time Series Analysis. Lecture 14 STAD57 Time Series Analysis Lecure 14 1 Maximum Likelihood AR(p) Esimaion Insead of Yule-Walker (MM) for AR(p) model, can use Maximum Likelihood (ML) esimaion Likelihood is join densiy of daa {x 1,,x n

More information

Smoothing Algorithms for State-Space Models

Smoothing Algorithms for State-Space Models IEEE TRANSACTIONS ON SIGNAL PROCESSING, VOL. XX, NO. XX, 200X 1 Smoohing Algorihms for Sae-Space Models Mark Briers, Arnaud Douce, and Simon Maskell Absrac A prevalen problem in saisical signal processing,

More information

Diebold, Chapter 7. Francis X. Diebold, Elements of Forecasting, 4th Edition (Mason, Ohio: Cengage Learning, 2006). Chapter 7. Characterizing Cycles

Diebold, Chapter 7. Francis X. Diebold, Elements of Forecasting, 4th Edition (Mason, Ohio: Cengage Learning, 2006). Chapter 7. Characterizing Cycles Diebold, Chaper 7 Francis X. Diebold, Elemens of Forecasing, 4h Ediion (Mason, Ohio: Cengage Learning, 006). Chaper 7. Characerizing Cycles Afer compleing his reading you should be able o: Define covariance

More information

III. Module 3. Empirical and Theoretical Techniques

III. Module 3. Empirical and Theoretical Techniques III. Module 3. Empirical and Theoreical Techniques Applied Saisical Techniques 3. Auocorrelaion Correcions Persisence affecs sandard errors. The radiional response is o rea he auocorrelaion as a echnical

More information

Stable approximations of optimal filters

Stable approximations of optimal filters Sable approximaions of opimal filers Joaquin Miguez Deparmen of Signal Theory & Communicaions, Universidad Carlos III de Madrid. E-mail: joaquin.miguez@uc3m.es Join work wih Dan Crisan (Imperial College

More information

EXERCISES FOR SECTION 1.5

EXERCISES FOR SECTION 1.5 1.5 Exisence and Uniqueness of Soluions 43 20. 1 v c 21. 1 v c 1 2 4 6 8 10 1 2 2 4 6 8 10 Graph of approximae soluion obained using Euler s mehod wih = 0.1. Graph of approximae soluion obained using Euler

More information

Optima and Equilibria for Traffic Flow on a Network

Optima and Equilibria for Traffic Flow on a Network Opima and Equilibria for Traffic Flow on a Nework Albero Bressan Deparmen of Mahemaics, Penn Sae Universiy bressan@mah.psu.edu Albero Bressan (Penn Sae) Opima and equilibria for raffic flow 1 / 1 A Traffic

More information

Recent Developments in the Unit Root Problem for Moving Averages

Recent Developments in the Unit Root Problem for Moving Averages Recen Developmens in he Uni Roo Problem for Moving Averages Richard A. Davis Colorado Sae Universiy Mei-Ching Chen Chaoyang Insiue of echnology homas Miosch Universiy of Groningen Non-inverible MA() Model

More information

What Ties Return Volatilities to Price Valuations and Fundamentals? On-Line Appendix

What Ties Return Volatilities to Price Valuations and Fundamentals? On-Line Appendix Wha Ties Reurn Volailiies o Price Valuaions and Fundamenals? On-Line Appendix Alexander David Haskayne School of Business, Universiy of Calgary Piero Veronesi Universiy of Chicago Booh School of Business,

More information

Richard A. Davis Colorado State University Bojan Basrak Eurandom Thomas Mikosch University of Groningen

Richard A. Davis Colorado State University Bojan Basrak Eurandom Thomas Mikosch University of Groningen Mulivariae Regular Variaion wih Applicaion o Financial Time Series Models Richard A. Davis Colorado Sae Universiy Bojan Basrak Eurandom Thomas Mikosch Universiy of Groningen Ouline + Characerisics of some

More information

U( θ, θ), U(θ 1/2, θ + 1/2) and Cauchy (θ) are not exponential families. (The proofs are not easy and require measure theory. See the references.

U( θ, θ), U(θ 1/2, θ + 1/2) and Cauchy (θ) are not exponential families. (The proofs are not easy and require measure theory. See the references. Lecure 5 Exponenial Families Exponenial families, also called Koopman-Darmois families, include a quie number of well known disribuions. Many nice properies enjoyed by exponenial families allow us o provide

More information

Continuous Time. Time-Domain System Analysis. Impulse Response. Impulse Response. Impulse Response. Impulse Response. ( t) + b 0.

Continuous Time. Time-Domain System Analysis. Impulse Response. Impulse Response. Impulse Response. Impulse Response. ( t) + b 0. Time-Domain Sysem Analysis Coninuous Time. J. Robers - All Righs Reserved. Edied by Dr. Rober Akl 1. J. Robers - All Righs Reserved. Edied by Dr. Rober Akl 2 Le a sysem be described by a 2 y ( ) + a 1

More information