STAD57 Time Series Analysis. Lecture 14

1 STAD57 Time Series Analysis Lecture 14

2 Maximum Likelihood AR(p) Estimation
Instead of Yule-Walker (MM) for the AR(p) model, we can use Maximum Likelihood (ML) estimation.
The likelihood is the joint density of the data {x_1, ..., x_n}, viewed as a function of the parameters:
$L(\mu, \phi, \sigma_w^2) = f(x_1, \ldots, x_n; \mu, \phi, \sigma_w^2)$
The ML estimators are the values of the parameters that maximize the likelihood function.
What is the likelihood of the AR(p) model?

3 Maximum Likelihood AR(p) Estimation
For ML, assume the AR(p) model is Gaussian:
$X_t - \mu = \phi_1 (X_{t-1} - \mu) + \cdots + \phi_p (X_{t-p} - \mu) + W_t, \quad \{W_t\} \sim N(0, \sigma_w^2)$
Thus, the conditional distribution of X_t becomes
$X_t \mid X_{t-1}, \ldots, X_{t-p} \sim N\big(\mu + \phi_1 (X_{t-1} - \mu) + \cdots + \phi_p (X_{t-p} - \mu),\ \sigma_w^2\big)$
This allows us to write the likelihood as:
$L(\mu, \phi, \sigma_w^2) = f(x_1, \ldots, x_n; \mu, \phi, \sigma_w^2)$
$= f(x_1, \ldots, x_p)\, f(x_{p+1} \mid x_p, \ldots, x_1) \cdots f(x_n \mid x_{n-1}, \ldots, x_{n-p})$
$= f(x_1, \ldots, x_p) \prod_{t=p+1}^{n} f(x_t \mid x_{t-1}, \ldots, x_{t-p})$

4 Maximum Likelihood AR(p) Estimation
The densities $f(x_t \mid x_{t-1}, \ldots, x_{t-p})$, $t = p+1, \ldots, n$, are trivial to find, but the initial density $f(x_1, \ldots, x_p)$ can be a complicated function of the parameters.
Since all densities are Normal, the likelihood function will have the form
$L(\mu, \phi, \sigma_w^2) = (2\pi \sigma_w^2)^{-n/2}\, g(\phi)\, \exp\!\left( -\frac{S(\mu, \phi)}{2 \sigma_w^2} \right)$
where $S(\mu, \phi)$ is the sum of squares
$S(\mu, \phi) = h(\mu, \phi) + \sum_{t=p+1}^{n} \big[ (x_t - \mu) - \phi_1 (x_{t-1} - \mu) - \cdots - \phi_p (x_{t-p} - \mu) \big]^2$
and $g(\phi)$, $h(\mu, \phi)$ are some functions of $\mu$ & $\phi$.

5 Example
Find the likelihood of the AR(1) model:
$X_t - \mu = \phi (X_{t-1} - \mu) + W_t, \quad \{W_t\} \sim N(0, \sigma_w^2)$

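One way to work the example (a sketch; this derivation is not part of the transcribed slides): under stationarity, $X_1 \sim N\!\left(\mu, \frac{\sigma_w^2}{1-\phi^2}\right)$, and each conditional density is Normal, so
$f(x_1) = \left( \frac{2\pi \sigma_w^2}{1-\phi^2} \right)^{-1/2} \exp\!\left( -\frac{(1-\phi^2)(x_1 - \mu)^2}{2\sigma_w^2} \right)$
$f(x_t \mid x_{t-1}) = (2\pi \sigma_w^2)^{-1/2} \exp\!\left( -\frac{[(x_t - \mu) - \phi (x_{t-1} - \mu)]^2}{2\sigma_w^2} \right), \quad t = 2, \ldots, n$
Multiplying these factors gives
$L(\mu, \phi, \sigma_w^2) = (2\pi \sigma_w^2)^{-n/2} (1-\phi^2)^{1/2} \exp\!\left( -\frac{S(\mu, \phi)}{2\sigma_w^2} \right)$
with $S(\mu, \phi) = (1-\phi^2)(x_1 - \mu)^2 + \sum_{t=2}^{n} [(x_t - \mu) - \phi (x_{t-1} - \mu)]^2$, so that $g(\phi) = (1-\phi^2)^{1/2}$ and $h(\mu, \phi) = (1-\phi^2)(x_1 - \mu)^2$ in the notation of slide 4.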

7 Maximum Likelihood AR(p) Estimation
To find the ML estimators $\hat\mu, \hat\phi, \hat\sigma_w^2$ we need to maximize the likelihood $L(\mu, \phi, \sigma_w^2)$.
Can show $\hat\sigma_w^2$ is always given by $\hat\sigma_w^2 = S(\hat\mu, \hat\phi)/n$.
But there is no closed-form solution for $\hat\mu, \hat\phi$; that's because of the complicated form of $g(\phi)$, $h(\mu, \phi)$.
In practice, use numerical techniques for finding $\hat\mu, \hat\phi$ (maximizing L). Common methods are the Newton-Raphson or Fisher scoring algorithms.
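To make the numerical step concrete, here is a minimal R sketch (not from the slides) that maximizes the exact Gaussian AR(1) likelihood from the example above with optim(); the series name x is a placeholder for the observed data, and arima(x, order=c(1,0,0)) is the built-in equivalent.

negloglik <- function(par, x) {
  mu <- par[1]; phi <- tanh(par[2]); sig2 <- exp(par[3])   # reparametrize so |phi| < 1, sig2 > 0
  n <- length(x)
  S <- (1 - phi^2) * (x[1] - mu)^2 +
       sum(((x[-1] - mu) - phi * (x[-n] - mu))^2)          # S(mu, phi) from the AR(1) example
  0.5 * (n * log(2 * pi * sig2) - log(1 - phi^2) + S / sig2)
}
fit <- optim(c(mean(x), atanh(0.5), log(var(x))), negloglik, x = x, method = "BFGS")
c(mu = fit$par[1], phi = tanh(fit$par[2]), sigma2 = exp(fit$par[3]))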

8 ML Estimation in R
The R function ar.mle() performs ML estimation of an AR(p) model. E.g.:
ml.fit = ar.mle( rec, order=2 )   # rec = data, order = AR order (i.e. p)
To check the estimates:
ml.fit$x.mean        # mean estimate (mu hat)
ml.fit$ar            # AR coefficient estimates (phi hat)
ml.fit$var.pred      # noise variance estimate (sigma_w^2 hat)
ml.fit$asy.var.coef  # asymptotic covariance matrix Cov(phi hat), here 2x2
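A small follow-up sketch (assuming the ml.fit object above): approximate standard errors of the AR coefficients come from the diagonal of the asymptotic covariance matrix.

se.phi <- sqrt(diag(ml.fit$asy.var.coef))        # standard errors of phi hat
cbind(estimate = ml.fit$ar, std.error = se.phi)  # coefficient table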

9 Conditional Least Squares AR(p) Estimation
The AR(p) model likelihood is
$L(\mu, \phi, \sigma_w^2) = f(x_1, \ldots, x_p) \prod_{t=p+1}^{n} f(x_t \mid x_{t-1}, \ldots, x_{t-p})$
What complicates things is the initial term $f(x_1, \ldots, x_p)$.
For large n, the effect of the initial density is small relative to the product $\prod_{t=p+1}^{n} f(x_t \mid x_{t-1}, \ldots, x_{t-p})$.
In this case, can look at the conditional likelihood
$L(\mu, \phi, \sigma_w^2 \mid x_1, \ldots, x_p) = \prod_{t=p+1}^{n} f(x_t \mid x_{t-1}, \ldots, x_{t-p})$
i.e. condition on the first p values to remove $f(x_1, \ldots, x_p)$.

10 Conditional Least Squares AR(p) Estimation
The conditional likelihood simplifies to
$L(\mu, \phi, \sigma_w^2 \mid x_1, \ldots, x_p) = (2\pi \sigma_w^2)^{-(n-p)/2} \exp\!\left( -\frac{S_c(\mu, \phi)}{2 \sigma_w^2} \right)$
where $S_c(\mu, \phi)$ is the conditional sum of squares
$S_c(\mu, \phi) = \sum_{t=p+1}^{n} \big[ (x_t - \mu) - \phi_1 (x_{t-1} - \mu) - \cdots - \phi_p (x_{t-p} - \mu) \big]^2$
Maximize the conditional likelihood by minimizing $S_c(\mu, \phi)$ using ordinary least squares (OLS) from regression.
In TS, this is called conditional least squares (LS).

11 Conditional Least Squares AR(p) Estimation
The conditional LS estimators for the AR(p) model
$X_t - \mu = \phi_1 (X_{t-1} - \mu) + \cdots + \phi_p (X_{t-p} - \mu) + W_t$
are given by OLS estimation of the regression
$X_t = \beta_0 + \beta_1 X_{t-1} + \cdots + \beta_p X_{t-p} + W_t, \quad t = p+1, \ldots, n$
where:
$\hat\phi_j = \hat\beta_j, \quad j = 1, \ldots, p$
$\hat\mu = \hat\beta_0 / (1 - \hat\phi_1 - \cdots - \hat\phi_p)$
$\hat\sigma_w^2 = S_c(\hat\mu, \hat\phi) / (n - p)$
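As an illustration (not from the slides) of this regression formulation for p = 2, with x standing in for the observed series; the ar.ols() call on the next slide automates the same computation.

n   <- length(x)
fit <- lm(x[3:n] ~ x[2:(n-1)] + x[1:(n-2)])    # regress X_t on X_{t-1} and X_{t-2}, t = 3, ..., n
b   <- coef(fit)                               # (beta0 hat, beta1 hat, beta2 hat)
phi.hat  <- b[2:3]                             # phi hat = (beta1 hat, beta2 hat)
mu.hat   <- b[1] / (1 - sum(phi.hat))          # mu hat = beta0 hat / (1 - phi1 hat - phi2 hat)
sig2.hat <- sum(resid(fit)^2) / (n - 2)        # S_c(mu hat, phi hat) / (n - p)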

12 Conditional LS Estimation in R
The R function ar.ols() performs conditional LS estimation of an AR(p) model. E.g.:
ls.fit = ar.ols( rec, order=2 )   # rec = data, order = AR order (i.e. p)
To check the estimates:
ls.fit$x.mean     # mean estimate (mu hat)
ls.fit$ar         # AR coefficient estimates (phi hat)
ls.fit$var.pred   # noise variance estimate (sigma_w^2 hat)

13 AR(p) Estimation
As n → ∞, all AR(p) estimation methods (Yule-Walker, ML, conditional LS) give the same results.
For large n, prefer Yule-Walker / conditional LS: they are faster than ML (no need for numerical optimization) and their results are not very different.
However, for small sample sizes, and especially if the TS is Gaussian, ML estimation performs better than the other two methods.
For small n, prefer ML estimation.

14 ARMA Estimation
Have seen estimation for the pure AR(p) model using:
- Yule-Walker (MM) and Conditional Least Squares (LS): better for large n
- Maximum Likelihood (ML): better for small n
For estimation in the general ARMA model, we rely only on ML & LS.
MM can be problematic / suboptimal for models that contain an MA component.

15 Example
Consider the MA(1) model:
$X_t = W_t + \theta W_{t-1}$
MM: estimate $(\theta, \sigma_w^2)$ by matching the first two sample & theoretical 2nd order moments.
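A sketch of the matching step (the worked solution is not part of the transcribed slides): the theoretical second-order moments of the MA(1) are
$\gamma(0) = \sigma_w^2 (1 + \theta^2), \qquad \gamma(1) = \theta \sigma_w^2, \qquad \text{so } \rho(1) = \frac{\theta}{1 + \theta^2}$
Equating $\rho(1)$ to the sample value $\hat\rho(1)$ and solving the resulting quadratic, the invertible root (which exists when $|\hat\rho(1)| \le 1/2$) is
$\hat\theta = \frac{1 - \sqrt{1 - 4\hat\rho(1)^2}}{2\hat\rho(1)}, \qquad \hat\sigma_w^2 = \frac{\hat\gamma(0)}{1 + \hat\theta^2}$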

16 Maximum Likelihood ARMA Estimation
Consider the Gaussian ARMA(p,q) model:
$X_t = \phi_1 X_{t-1} + \cdots + \phi_p X_{t-p} + W_t + \theta_1 W_{t-1} + \cdots + \theta_q W_{t-q}$
Likelihood function, with parameter vector $\beta = (\mu, \phi_1, \ldots, \phi_p, \theta_1, \ldots, \theta_q)$:
$L(\beta, \sigma_w^2) = f(x_1, \ldots, x_n; \beta, \sigma_w^2)$
There are two complications with L(β):
- The WN sequence {W_t} is not observed, so we cannot use the conditional distribution
$X_t \mid X_{t-1}, \ldots, X_{t-p}, W_{t-1}, \ldots, W_{t-q} \sim N\big(\mu + \phi_1 (X_{t-1} - \mu) + \cdots + \phi_p (X_{t-p} - \mu) + \theta_1 W_{t-1} + \cdots + \theta_q W_{t-q},\ \sigma_w^2\big)$
- W_t depends on the entire past {X_1, ..., X_{t-1}} (by invertibility).

17 Maximum Likelihood ARMA Estimation
Write the ARMA likelihood as:
$L(\beta, \sigma_w^2) = f(x_1, \ldots, x_n) = f(x_1)\, f(x_2 \mid x_1)\, f(x_3 \mid x_2, x_1) \cdots f(x_n \mid x_{n-1}, \ldots, x_1)$
For Gaussian ARMA models, $E[X_t \mid X_{t-1}, \ldots, X_1]$ is equal to the BLP $X_t^{t-1}$, which is Normally distributed (as a linear function of Normals) with variance $P_t^{t-1}$:
$X_t \mid X_{t-1}, \ldots, X_1 \sim N\big(X_t^{t-1}, P_t^{t-1}\big)$
$\Rightarrow\ L(\beta, \sigma_w^2) = \prod_{t=1}^{n} f(x_t \mid x_{t-1}, \ldots, x_1) = \prod_{t=1}^{n} (2\pi P_t^{t-1})^{-1/2} \exp\!\left( -\frac{(x_t - x_t^{t-1})^2}{2 P_t^{t-1}} \right)$

18 Maximum Likelihood ARMA Estimation
Given data {x_1, ..., x_n}, every BLP $X_t^{t-1}$ is a function of the parameters β; write $x_t^{t-1}(\beta)$.
Also $P_t^{t-1} = \gamma(0) \prod_{j=1}^{t-1} (1 - \phi_{jj}^2) = \sigma_w^2\, r_t(\beta)$, where $r_{t+1}(\beta) = (1 - \phi_{tt}^2)\, r_t(\beta)$ and $r_1(\beta) = \gamma(0)/\sigma_w^2$.
Thus:
$L(\beta, \sigma_w^2) = \prod_{t=1}^{n} f(x_t \mid x_{t-1}, \ldots, x_1) = (2\pi \sigma_w^2)^{-n/2} \big[ r_1(\beta)\, r_2(\beta) \cdots r_n(\beta) \big]^{-1/2} \exp\!\left( -\frac{S(\beta)}{2 \sigma_w^2} \right)$
where
$S(\beta) = \sum_{t=1}^{n} \frac{[x_t - x_t^{t-1}(\beta)]^2}{r_t(\beta)}$

19 Maximum Likelihood ARMA Estimation
To find the ML estimators $\hat\beta, \hat\sigma_w^2$ we need to maximize the likelihood $L(\beta, \sigma_w^2)$.
Can show $\hat\sigma_w^2$ is always given by $\hat\sigma_w^2 = S(\hat\beta)/n$.
There is no closed-form solution for $\hat\beta$.
In practice, use numerical techniques for finding $\hat\beta$ (maximizing L). Common methods are the Newton-Raphson or Fisher scoring algorithms.

20 Conditional Least Squares ARMA Estimation
The likelihood function can be complicated: we need the 1-step-ahead BLPs $X_t^{t-1}$, $t = 1, \ldots, n$.
Can simplify estimation by conditioning on the first p values & using truncated prediction:
$\tilde W_t(\beta) = (x_t - \mu) - \sum_{j=1}^{p} \phi_j (x_{t-j} - \mu) - \sum_{k=1}^{q} \theta_k \tilde W_{t-k}(\beta), \quad t = p+1, \ldots, n$
where $\tilde W_p(\beta) = \tilde W_{p-1}(\beta) = \cdots = \tilde W_{p+1-q}(\beta) = 0$.
Estimate $\hat\beta$ by minimizing the conditional sum of squares
$S_c(\beta) = \sum_{t=p+1}^{n} \tilde W_t(\beta)^2$
and $\hat\sigma_w^2$ by $S_c(\hat\beta)/(n - p)$.
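A minimal R sketch (not from the slides) of this truncated recursion for an ARMA(1,1), with x a placeholder series; arima(x, order=c(1,0,1), method="CSS") automates the same idea.

css <- function(par, x) {                      # par = (mu, phi, theta)
  mu <- par[1]; phi <- par[2]; theta <- par[3]
  n <- length(x)
  w <- numeric(n)                              # truncation: w[1] = 0
  for (t in 2:n)
    w[t] <- (x[t] - mu) - phi * (x[t-1] - mu) - theta * w[t-1]
  sum(w[2:n]^2)                                # S_c(beta)
}
fit <- optim(c(mean(x), 0.5, 0.1), css, x = x)   # minimize S_c over (mu, phi, theta)
sig2.hat <- fit$value / (length(x) - 1)          # S_c(beta hat) / (n - p), here p = 1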

21 Large Sample Behavior of ARMA Estimation
As n → ∞, the ARMA estimators (either ML or LS) behave as
$(\hat\phi, \hat\theta) \sim N\!\left( (\phi, \theta),\ \frac{\sigma_w^2}{n}\, \Gamma_{p,q}^{-1} \right)$
where
$\Gamma_{p,q} = \begin{pmatrix} \Gamma_{\phi\phi} & \Gamma_{\phi\theta} \\ \Gamma_{\theta\phi} & \Gamma_{\theta\theta} \end{pmatrix}$
and
$\Gamma_{\phi\phi} = \{ \gamma_Y(i-j) \}_{i,j=1}^{p}$ for the AR(p) model $\phi(B) Y_t = W_t$
$\Gamma_{\theta\theta} = \{ \gamma_Z(i-j) \}_{i,j=1}^{q}$ for the AR(q) model $\theta(B) Z_t = W_t$
$\Gamma_{\phi\theta} = \{ \gamma_{YZ}(i-j) \}_{i=1,\ldots,p;\ j=1,\ldots,q}$ for the cross-covariance $\gamma_{YZ}(h)$ between $Y_t$ and $Z_t$, with $\Gamma_{\theta\phi} = \Gamma_{\phi\theta}'$
Note: $\Gamma_{\phi\phi}$ is p×p, $\Gamma_{\theta\theta}$ is q×q, and $\Gamma_{\phi\theta}$ is p×q.

22 Example
Find $\Gamma_{p,q}^{-1}$ for the ARMA(1,1) model.
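A sketch of the solution (not transcribed in the slides): for the ARMA(1,1), the auxiliary processes $\phi(B) Y_t = W_t$ and $\theta(B) Z_t = W_t$ are both AR(1), with
$\gamma_Y(0) = \frac{\sigma_w^2}{1 - \phi^2}, \qquad \gamma_Z(0) = \frac{\sigma_w^2}{1 - \theta^2}, \qquad \gamma_{YZ}(0) = \frac{\sigma_w^2}{1 + \phi\theta}$
so
$\Gamma_{1,1} = \sigma_w^2 \begin{pmatrix} (1-\phi^2)^{-1} & (1+\phi\theta)^{-1} \\ (1+\phi\theta)^{-1} & (1-\theta^2)^{-1} \end{pmatrix}$
and the asymptotic covariance matrix of $(\hat\phi, \hat\theta)$ is
$\frac{\sigma_w^2}{n}\, \Gamma_{1,1}^{-1} = \frac{1}{n} \begin{pmatrix} (1-\phi^2)^{-1} & (1+\phi\theta)^{-1} \\ (1+\phi\theta)^{-1} & (1-\theta^2)^{-1} \end{pmatrix}^{-1}$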

23 ML Estimation in R
The R function arima() performs ARMA estimation. For ML estimation, give the ARMA order as (p,0,q):
ml.fit = arima( soi, order=c(2,0,2) )
To check the estimates:
ml.fit$coef       # estimates beta hat: ar1, ar2, ma1, ma2, intercept
ml.fit$sigma2     # noise variance estimate (sigma_w^2 hat)
ml.fit$var.coef   # estimated covariance matrix Cov(beta hat), here 5x5
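A small follow-up sketch (assuming the ml.fit object above): standard errors and rough 95% intervals for the coefficients come from the diagonal of var.coef.

se <- sqrt(diag(ml.fit$var.coef))           # standard errors of beta hat
cbind(lower    = ml.fit$coef - 1.96 * se,
      estimate = ml.fit$coef,
      upper    = ml.fit$coef + 1.96 * se)   # approximate 95% confidence intervals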

24 LS Estimation in R
For large n, perform conditional LS estimation: similar results to ML, but simpler (& faster).
To run conditional LS, minimize the Conditional Sum of Squares (CSS):
ls.fit = arima( soi, order=c(2,0,2), method="CSS" )
To check the estimates:
ls.fit$coef       # estimates beta hat: ar1, ar2, ma1, ma2, intercept
ls.fit$sigma2     # noise variance estimate (sigma_w^2 hat)
ls.fit$var.coef   # estimated covariance matrix Cov(beta hat)
