Akaike criterion: Kullback-Leibler discrepancy


Model choice. Akaike's criterion: Kullback-Leibler discrepancy

Given a family of probability densities $\{f(\cdot;\psi),\ \psi \in \Psi\}$, the Kullback-Leibler index of $f(\cdot;\psi)$ relative to $f(\cdot;\theta)$ is
\[
\Delta(\psi \mid \theta) = E_{\theta}\big({-2}\log f(X;\psi)\big) = -2\int_{\mathbb{R}^n} \log\big(f(x;\psi)\big)\, f(x;\theta)\, dx .
\]
The Kullback-Leibler discrepancy between $f(\cdot;\psi)$ and $f(\cdot;\theta)$ is
\[
d(\psi \mid \theta) = \Delta(\psi \mid \theta) - \Delta(\theta \mid \theta) = -2\int_{\mathbb{R}^n} \log\!\left(\frac{f(x;\psi)}{f(x;\theta)}\right) f(x;\theta)\, dx .
\]
Jensen's inequality implies $E(\log Y) \le \log(E(Y))$ for any random variable $Y$. Hence
\[
d(\psi \mid \theta) \ge -2\log \int_{\mathbb{R}^n} \frac{f(x;\psi)}{f(x;\theta)}\, f(x;\theta)\, dx = -2\log 1 = 0,
\]
with equality only if $f(x;\psi) = f(x;\theta)$ a.e. $[f(\cdot;\theta)]$.
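As a concrete illustration (not in the original slides), here is a short R sketch that evaluates $d(\psi \mid \theta)$ by numerical integration for two univariate Gaussian densities and confirms it vanishes only when the densities coincide:

    # d(psi | theta) = -2 * E_theta[ log( f(x; psi) / f(x; theta) ) ]
    kl_d <- function(mu_psi, sd_psi, mu_theta, sd_theta) {
      integrand <- function(x) {
        -2 * (dnorm(x, mu_psi, sd_psi, log = TRUE) -
              dnorm(x, mu_theta, sd_theta, log = TRUE)) * dnorm(x, mu_theta, sd_theta)
      }
      integrate(integrand, -Inf, Inf)$value
    }
    kl_d(1, 2, 0, 1)   # positive: the densities differ
    kl_d(0, 1, 0, 1)   # zero: psi = theta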

Approximating the Kullback-Leibler discrepancy

Given observations $X_1,\dots,X_n$, we would like to minimize $d(\psi \mid \theta)$ among all candidate models, given the true model. As the true model is unknown, we estimate $d(\psi \mid \theta)$.

Let $\psi = (\phi,\vartheta,\sigma^2)$ be the parameters of an ARMA(p,q) model and $\hat\psi$ the MLE based on $X_1,\dots,X_n$. Let $Y$ be an independent realization of the same process. Then
\[
-2\log L_Y(\hat\phi,\hat\vartheta,\hat\sigma^2) = n\log(2\pi) + n\log(\hat\sigma^2) + \log(r_0 \cdots r_{n-1}) + \frac{S_Y(\hat\phi,\hat\vartheta)}{\hat\sigma^2} .
\]
Indeed, recall that for an ARMA(p,q) process
\[
L(\phi,\vartheta,\sigma^2) = (2\pi\sigma^2)^{-n/2} (r_0 \cdots r_{n-1})^{-1/2} \exp\!\left(-\frac{S(\phi,\vartheta)}{2\sigma^2}\right),
\qquad
S(\phi,\vartheta) = \sum_{j=1}^{n} \frac{(x_j - \hat x_j)^2}{r_{j-1}} .
\]
The quantities $r_0,\dots,r_{n-1}$ depend only on the parameters $(\phi,\vartheta)$ and not on the observed data; the data enter the likelihood only through the terms $(x_j - \hat x_j)^2$ in $S(\phi,\vartheta)$. Hence
\[
-2\log L_Y(\hat\phi,\hat\vartheta,\hat\sigma^2)
= -2\log L_X(\hat\phi,\hat\vartheta,\hat\sigma^2) + \frac{S_Y(\hat\phi,\hat\vartheta)}{\hat\sigma^2} - \frac{S_X(\hat\phi,\hat\vartheta)}{\hat\sigma^2}
= -2\log L_X(\hat\phi,\hat\vartheta,\hat\sigma^2) + \frac{S_Y(\hat\phi,\hat\vartheta)}{\hat\sigma^2} - n,
\]
since $n\hat\sigma^2 = S_X(\hat\phi,\hat\vartheta)$. Taking expectations,
\[
E_\theta\big(\Delta(\hat\psi \mid \theta)\big)
= E_{(\phi,\vartheta,\sigma^2)}\big({-2}\log L_X(\hat\phi,\hat\vartheta,\hat\sigma^2)\big)
+ E_{(\phi,\vartheta,\sigma^2)}\!\left(\frac{S_Y(\hat\phi,\hat\vartheta)}{\hat\sigma^2}\right) - n .
\]

Kullback-Leibler discrepancy and AICC

Using linear approximations and the asymptotic distributions of the estimators, one arrives at
\[
E_{(\phi,\vartheta,\sigma^2)}\big(S_Y(\hat\phi,\hat\vartheta)\big) \approx \sigma^2 (n+p+q).
\]
Similarly, $n\hat\sigma^2 = S_X(\hat\phi,\hat\vartheta)$ is, for large $n$, distributed as $\sigma^2\,\chi^2(n-p-q-2)$ and is asymptotically independent of $(\hat\phi,\hat\vartheta)$. Hence
\[
E_{(\phi,\vartheta,\sigma^2)}\!\left(\frac{S_Y(\hat\phi,\hat\vartheta)}{\hat\sigma^2}\right)
\approx \frac{\sigma^2 (n+p+q)}{\sigma^2 (n-p-q-2)/n}
= \frac{(n+p+q)\, n}{n-p-q-2} .
\]
Substituting into
\[
E_\theta\big(\Delta(\hat\psi \mid \theta)\big) = E_{(\phi,\vartheta,\sigma^2)}\big({-2}\log L_X(\hat\phi,\hat\vartheta,\hat\sigma^2)\big) + E_{(\phi,\vartheta,\sigma^2)}\!\left(\frac{S_Y(\hat\phi,\hat\vartheta)}{\hat\sigma^2}\right) - n,
\]
and using $\frac{(n+p+q)n}{n-p-q-2} - n = \frac{2(p+q+1)n}{n-p-q-2}$, it follows that
\[
\mathrm{AICC} = -2\log L_X(\hat\phi,\hat\vartheta,\hat\sigma^2) + \frac{2(p+q+1)\,n}{n-p-q-2}
\]
is an approximately unbiased estimate of $\Delta(\hat\psi \mid \theta)$.

Criteria for model choice

The order is chosen by minimizing the value of the AICC (corrected Akaike Information Criterion):
\[
\mathrm{AICC} = -2\log L_X(\hat\phi,\hat\vartheta,\hat\sigma^2) + \frac{2(p+q+1)\,n}{n-p-q-2} .
\]
The second term can be considered a penalty for models with a large number of parameters.

For large $n$ it is approximately the same as Akaike's Information Criterion (AIC), $-2\log L_X(\hat\phi,\hat\vartheta,\hat\sigma^2) + 2(p+q+1)$, but it carries a higher penalty for finite $n$ and is thus somewhat less likely to overfit. In R:

    AICC <- AIC(myfit, k = 2*n/(n-p-q-2))

A rule of thumb is that the fits of model 1 and model 2 are not significantly different if $|\mathrm{AICC}_1 - \mathrm{AICC}_2| < 2$ (only the difference matters, not the absolute value of the AICC). Hence we may decide to choose model 1 if it is simpler than model 2 (or if its residuals are closer to white noise), even if $\mathrm{AICC}_1 > \mathrm{AICC}_2$, as long as $\mathrm{AICC}_1 < \mathrm{AICC}_2 + 2$.
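To make the R line above concrete, here is a minimal sketch (the data are simulated and the orders are a hypothetical choice; a zero-mean fit is used so that R's parameter count matches $p+q+1$):

    set.seed(1)
    x <- arima.sim(list(ar = 0.7), n = 200)     # simulated AR(1) data
    p <- 1; q <- 0; n <- length(x)
    myfit <- arima(x, order = c(p, 0, q), include.mean = FALSE)
    AIC(myfit)                                  # -2 log L + 2(p+q+1)
    AICC <- AIC(myfit, k = 2*n/(n-p-q-2))       # corrected penalty, as above
    AICC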

Tests on residuals

Let $\hat X_t(\hat\phi,\hat\vartheta)$ be the predicted value of $X_t$ given the estimates $(\hat\phi,\hat\vartheta)$, and let
\[
\hat W_t = \frac{X_t - \hat X_t(\hat\phi,\hat\vartheta)}{\big(r_{t-1}(\hat\phi,\hat\vartheta)\big)^{1/2}}
\]
be the standardized residuals. One then applies: portmanteau tests on the ACF of $\hat W_t$ (Box-Pierce; Ljung-Box); tests on turning points; rank tests.
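For instance, the Ljung-Box portmanteau test is available in base R (a sketch continuing the hypothetical fit above; the number of lags is an arbitrary choice, and fitdf adjusts the degrees of freedom for the estimated parameters):

    W <- residuals(myfit)   # innovation residuals from the fitted model
    Box.test(W, lag = 20, type = "Ljung-Box", fitdf = p + q)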

Autocovariance

A multivariate stochastic process $\{X_t \in \mathbb{R}^m\}$, $t \in \mathbb{Z}$, is weakly stationary if
\[
E(X_{t,i}^2) < \infty \ \ \forall t, i, \qquad E(X_t) \equiv \mu, \qquad \mathrm{Cov}(X_{t+h}, X_t) \equiv \Gamma(h).
\]
In particular,
\[
\gamma_{ij}(h) = \mathrm{Cov}(X_{t+h,i}, X_{t,j}) = E\big((X_{t+h,i} - \mu_i)(X_{t,j} - \mu_j)\big).
\]
Note that in general $\gamma_{ij}(h) \ne \gamma_{ji}(h)$, while
\[
\gamma_{ij}(h) = \mathrm{Cov}(X_{t+h,i}, X_{t,j})
\overset{\text{(stationarity)}}{=} \mathrm{Cov}(X_{t,i}, X_{t-h,j})
\overset{\text{(symmetry)}}{=} \mathrm{Cov}(X_{t-h,j}, X_{t,i}) = \gamma_{ji}(-h).
\]
Another simple property is $|\gamma_{ij}(h)| \le \big(\gamma_{ii}(0)\,\gamma_{jj}(0)\big)^{1/2}$. The ACF is
\[
\rho_{ij}(h) = \frac{\gamma_{ij}(h)}{\big(\gamma_{ii}(0)\,\gamma_{jj}(0)\big)^{1/2}} .
\]

Multivariate white noise and MA

A multivariate stochastic process $\{Z_t \in \mathbb{R}^m\}$ is a white noise with covariance $S$, written $\{Z_t\} \sim WN(0, S)$, if $\{Z_t\}$ is stationary with mean 0 and ACVF
\[
\Gamma(h) = \begin{cases} S & h = 0 \\ 0 & h \ne 0. \end{cases}
\]
$\{X_t \in \mathbb{R}^m\}$ is a linear process if
\[
X_t = \sum_{k=-\infty}^{+\infty} C_k Z_{t-k}, \qquad \{Z_t\} \sim WN(0, S),
\]
where the $C_k$ are matrices such that $\sum_{k=-\infty}^{+\infty} |(C_k)_{ij}| < +\infty$ for all $i, j = 1, \dots, m$.
A linear process $\{X_t\}$ is stationary, with
\[
\Gamma_X(h) = \sum_{k=-\infty}^{+\infty} C_{k+h}\, S\, C_k^{t} .
\]

Estimation of the mean

The mean $\mu$ can be estimated through $\bar X_n$. From the univariate theory we know that $E(\bar X_n) = \mu$, that
\[
V\big((\bar X_n)_i\big) \to 0 \ (n \to \infty) \quad \text{if } \gamma_{ii}(h) \xrightarrow{h\to\infty} 0,
\]
and that
\[
n\, V\big((\bar X_n)_i\big) \to \sum_{h=-\infty}^{+\infty} \gamma_{ii}(h) \quad \text{if } \sum_{h=-\infty}^{+\infty} |\gamma_{ii}(h)| < +\infty.
\]
Moreover, $(\bar X_n)_i$ is asymptotically normal. Stronger assumptions are required for the vector $\bar X_n$ to be asymptotically normal.

Theorem. If
\[
X_t = \mu + \sum_{k=-\infty}^{+\infty} C_k Z_{t-k}, \qquad \{Z_t\} \sim WN(0, S),
\]
then
\[
n^{1/2}(\bar X_n - \mu) \Longrightarrow N\Big(0,\ \sum_{h=-\infty}^{+\infty} \Gamma(h)\Big), \qquad \Gamma(h) = \sum_{k=-\infty}^{+\infty} C_{k+h}\, S\, C_k^{t} .
\]

Confidence intervals for the mean

In principle, from $\bar X_n \approx N\big(\mu,\ \tfrac{1}{n}\sum_{h} \Gamma(h)\big)$ one could build an $m$-dimensional confidence ellipsoid. But this is not intuitive, and $C_k$ and $S$ are not known and would have to be estimated. Instead, one builds confidence intervals from
\[
(\bar X_n)_i \approx N\Big(\mu_i,\ \frac{1}{n}\sum_{h=-\infty}^{+\infty} \gamma_{ii}(h)\Big).
\]
The quantity $\sum_{h=-\infty}^{+\infty} \gamma_{ii}(h) = 2\pi f_i(0)$ can be consistently estimated by
\[
2\pi \hat f_i(0) = \sum_{|h| \le r_n} \Big(1 - \frac{|h|}{r_n}\Big)\, \hat\gamma_{ii}(h), \qquad \text{where } r_n \to \infty \text{ and } \frac{r_n}{n} \to 0.
\]
Componentwise confidence intervals can be combined. If we find $u_i(\alpha)$ such that $P\big(|\mu_i - (\bar X_n)_i| \ge u_i(\alpha)\big) \le \alpha$, then
\[
P\big(|\mu_i - (\bar X_n)_i| < u_i(\alpha),\ i = 1, \dots, m\big)
\ge 1 - \sum_{i=1}^{m} P\big(|\mu_i - (\bar X_n)_i| \ge u_i(\alpha)\big)
\ge 1 - m\alpha.
\]
Choosing $\alpha = \frac{0.05}{m}$, one has a 95%-confidence $m$-rectangle.
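A minimal R sketch of this componentwise interval for one coordinate (the window length r is a hypothetical choice; asymptotically any r with r growing and r/n vanishing works):

    ci_mean <- function(x, r = floor(sqrt(length(x))), level = 0.95) {
      n <- length(x)
      # sample autocovariances at lags 0, ..., r-1
      g <- acf(x, lag.max = r - 1, type = "covariance", plot = FALSE)$acf[, 1, 1]
      w <- 1 - (0:(r - 1)) / r                    # triangular weights (1 - |h|/r)
      v <- w[1] * g[1] + 2 * sum(w[-1] * g[-1])   # estimate of sum_h gamma_ii(h)
      z <- qnorm(1 - (1 - level) / 2)
      mean(x) + c(-1, 1) * z * sqrt(v / n)
    }
    ci_mean(as.numeric(x))   # e.g. on the simulated series above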

Estimation of the ACVF (bivariate case, m = 2)

\[
\hat\Gamma(h) =
\begin{cases}
\dfrac{1}{n} \displaystyle\sum_{t=1}^{n-h} (X_{t+h} - \bar X_n)(X_t - \bar X_n)^{t} & 0 \le h < n, \\[8pt]
\dfrac{1}{n} \displaystyle\sum_{t=-h+1}^{n} (X_{t+h} - \bar X_n)(X_t - \bar X_n)^{t} & -n < h < 0,
\end{cases}
\qquad
\hat\rho_{ij}(h) = \hat\gamma_{ij}(h)\big(\hat\gamma_{ii}(0)\,\hat\gamma_{jj}(0)\big)^{-1/2} .
\]

Theorem. If
\[
X_t = \mu + \sum_{k=-\infty}^{+\infty} C_k Z_{t-k}, \qquad \{Z_t\} \sim IID(0, S),
\]
then for all $h$
\[
\hat\gamma_{ij}(h) \xrightarrow{P} \gamma_{ij}(h), \qquad \hat\rho_{ij}(h) \xrightarrow{P} \rho_{ij}(h) \qquad \text{as } n \to \infty.
\]
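In R these sample cross-correlations are produced directly by acf() applied to a multi-column series, or by ccf() for a single pair (a small sketch on simulated, independent series):

    set.seed(2)
    x <- arima.sim(list(ar = 0.5), n = 200)
    y <- arima.sim(list(ar = 0.5), n = 200)
    acf(ts.union(x, y))      # matrix of ACFs and cross-correlations rho_ij(h)
    ccf(x, y, lag.max = 20)  # rho_12(h) alone, over positive and negative lags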

An example: Southern Oscillation Index

The Southern Oscillation Index (an environmental measure) compared to fish recruitment in the South Pacific (1950 to 1985).

[Figure: time series plots of the Southern Oscillation Index and of Recruitment.]

ACF of Southern Oscillation Index

[Figure: matrix of sample ACFs and cross-correlations of soi and rec against lag: panels soi, soi & rec, rec & soi, rec.] The bottom-left panel shows the negative lags.

An example from Box and Jenkins

Sales (V2) with a leading indicator (V1).

[Figure: time series plots of V1 and V2.]

ACF of sales data

[Figure: matrix of sample ACFs and cross-correlations of V1 and V2.] The data are not stationary.

Differenced sales data

[Figure: time series plots of the differenced V1 and V2.]

ACF of differenced sales data

[Figure: matrix of sample ACFs and cross-correlations of the differenced V1 and V2.] The cross-correlation is relevant only at lags 2 and 3.

Testing for independence of time series: basis

In general, the asymptotic distribution of $\hat\rho_{ij}(h)$ is complicated. But:

Theorem. Let
\[
X_{t,1} = \sum_{j=-\infty}^{\infty} \alpha_j Z_{t-j,1}, \qquad X_{t,2} = \sum_{j=-\infty}^{\infty} \beta_j Z_{t-j,2},
\]
with $\{Z_{t,1}\} \sim WN(0, \sigma_1^2)$ and $\{Z_{t,2}\} \sim WN(0, \sigma_2^2)$ independent. Then
\[
n\, V\big(\hat\rho_{12}(h)\big) \xrightarrow{n\to\infty} \sum_{j=-\infty}^{\infty} \rho_{11}(j)\, \rho_{22}(j),
\]
and
\[
n^{1/2}\, \hat\rho_{12}(h) \Longrightarrow N\Big(0,\ \sum_{j=-\infty}^{\infty} \rho_{11}(j)\, \rho_{22}(j)\Big).
\]

Testing for independence of time series: an example

Suppose $\{X_{t,1}\}$ and $\{X_{t,2}\}$ are independent AR(1) processes with $\rho_{ii}(h) = 0.8^{|h|}$. Then the asymptotic variance of $\hat\rho_{12}(h)$ is
\[
\frac{1}{n} \sum_{h=-\infty}^{+\infty} 0.8^{2|h|} = \frac{1}{n} \cdot \frac{1 + 0.64}{1 - 0.64} \approx \frac{4.56}{n} .
\]
Values of $|\hat\rho_{12}(h)|$ quite a bit larger than $1.96\, n^{-1/2}$ will therefore be common even if the two series are independent. Instead, if one of the series is white noise, then $V\big(\hat\rho_{12}(h)\big) \approx \frac{1}{n}$. Hence, in testing for independence, it is often recommended to prewhiten one of the series.
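A small simulation sketch of this variance inflation (illustrative only): for independent AR(1) series with parameter 0.8, the sample cross-correlation at lag 0 has standard deviation close to $\sqrt{4.56/n}$, roughly twice the white-noise value $n^{-1/2}$:

    set.seed(3)
    n <- 400
    r12 <- replicate(1000, {
      x <- arima.sim(list(ar = 0.8), n = n)
      y <- arima.sim(list(ar = 0.8), n = n)   # independent of x
      ccf(x, y, lag.max = 0, plot = FALSE)$acf[1]
    })
    c(empirical = sd(r12),
      theory = sqrt(4.56 / n),
      white_noise = sqrt(1 / n))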

Pre-whitening a time series

Instead of testing $\hat\rho_{12}(h)$ of the original series, one transforms them into white noise. If $\{X_{t,1}\}$ and $\{X_{t,2}\}$ are invertible ARMA processes, then
\[
Z_{t,i} = \sum_{k=0}^{\infty} \pi_k^{(i)} X_{t-k,i} \sim WN(0, \sigma_i^2), \qquad i = 1, 2,
\]
where
\[
\sum_{k=0}^{\infty} \pi_k^{(i)} z^k = \pi^{(i)}(z) = \phi^{(i)}(z) / \vartheta^{(i)}(z).
\]
$\{X_{t,1}\}$ and $\{X_{t,2}\}$ are independent if and only if $\{Z_{t,1}\}$ and $\{Z_{t,2}\}$ are; hence one tests $\hat\rho_{Z_1,Z_2}(h)$.

As $\phi^{(i)}(z)$ and $\vartheta^{(i)}(z)$ are not known, one fits an ARMA model to each series and uses the residuals $\hat W_{t,i}$ in place of $Z_{t,i}$. It may be enough to do this for just one of the series.
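One reasonable prewhitening recipe in R (a sketch, not the only possible choice: an AR model with AIC-selected order approximates the ARMA filter, and its residuals stand in for the $Z_{t,i}$):

    prewhiten_ccf <- function(x, y, lag.max = 20) {
      rx <- na.omit(ar(x)$resid)   # AR fit, order chosen by AIC; residuals ~ Z_{t,1}
      ry <- na.omit(ar(y)$resid)
      m  <- min(length(rx), length(ry))
      ccf(tail(as.numeric(rx), m), tail(as.numeric(ry), m), lag.max = lag.max)
    }
    prewhiten_ccf(x, y)   # e.g. on the two simulated series above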

Simulated data

The first series is AR(1) with $\phi = 0.9$; the second series is AR(2) with $\phi_1 = 0.7$, $\phi_2 = 0.27$.

[Figure: time series plots of the two simulated series.]

ACF of simulated data

[Figure: matrix of sample ACFs and cross-correlations of dat1 and dat2.]

ACF of residuals

MLE fits of the correct model to both series.

[Figure: matrix of sample ACFs and cross-correlations of the residuals fitunk1$res and fitunk2$res.] A few cross-correlation coefficients may appear slightly significant.

Bartlett's formula

More generally:

Theorem. If $\{X_t\}$ is a bivariate Gaussian time series with $\sum_{h=-\infty}^{+\infty} |\gamma_{ij}(h)| < \infty$ for $i, j = 1, 2$, then
\[
\lim_{n\to\infty} n\, \mathrm{Cov}\big(\hat\rho_{12}(h), \hat\rho_{12}(k)\big)
= \sum_{j=-\infty}^{+\infty} \Big[ \rho_{11}(j)\rho_{22}(j+k-h) + \rho_{12}(j+k)\rho_{21}(j-h)
\]
\[
\qquad - \rho_{12}(h)\big(\rho_{11}(j)\rho_{12}(j+k) + \rho_{22}(j)\rho_{21}(j-k)\big)
- \rho_{12}(k)\big(\rho_{11}(j)\rho_{12}(j+h) + \rho_{22}(j)\rho_{21}(j-h)\big)
\]
\[
\qquad + \rho_{12}(h)\rho_{12}(k)\big(\tfrac12 \rho_{11}^2(j) + \rho_{12}^2(j) + \tfrac12 \rho_{22}^2(j)\big) \Big].
\]

Spectral density of multivariate series

If $\sum_{h=-\infty}^{+\infty} |\gamma_{ij}(h)| < \infty$, one can define
\[
f(\lambda) = \frac{1}{2\pi} \sum_{h=-\infty}^{\infty} e^{-ih\lambda}\, \Gamma(h), \qquad \lambda \in [-\pi, \pi],
\]
and one obtains
\[
\Gamma(h) = \int_{-\pi}^{\pi} e^{ih\lambda} f(\lambda)\, d\lambda
\]
and
\[
X_t = \int_{-\pi}^{\pi} e^{it\lambda}\, dZ(\lambda),
\]
where the $Z_i(\lambda)$ are (complex-valued) processes with independent increments such that
\[
\int_{\lambda_1}^{\lambda_2} f_{ij}(\lambda)\, d\lambda = E\Big[\big(Z_i(\lambda_2) - Z_i(\lambda_1)\big)\, \overline{\big(Z_j(\lambda_2) - Z_j(\lambda_1)\big)}\Big].
\]

Coherence of series

For a bivariate series, the coherence at frequency $\lambda$ is
\[
\mathcal{X}_{12}(\lambda) = \frac{f_{12}(\lambda)}{\big[f_{11}(\lambda)\, f_{22}(\lambda)\big]^{1/2}}
\]
and represents the correlation between $dZ_1(\lambda)$ and $dZ_2(\lambda)$. The squared coherency function $|\mathcal{X}_{12}(\lambda)|^2$ satisfies
\[
0 \le |\mathcal{X}_{12}(\lambda)|^2 \le 1.
\]
Remark. If $X_{t,2} = \sum_{k=-\infty}^{+\infty} \psi_k X_{t-k,1}$, then $|\mathcal{X}_{12}(\lambda)|^2 \equiv 1$.

Periodogram

Define
\[
J(\omega_j) = n^{-1/2} \sum_{t=1}^{n} X_t\, e^{-it\omega_j}, \qquad \omega_j = 2\pi j / n,
\]
for $j$ between $-[(n-1)/2]$ and $[n/2]$. Then $I_n(\omega_j) = J(\omega_j)\, J^*(\omega_j)$, where $*$ denotes transpose and complex conjugation. In particular,
\[
I_{12}(\omega_j) = \frac{1}{n} \Big(\sum_{t=1}^{n} X_{t1}\, e^{-it\omega_j}\Big)\, \overline{\Big(\sum_{t=1}^{n} X_{t2}\, e^{-it\omega_j}\Big)}
\]
is the cross-periodogram.

Estimation of spectral density and coherence

Again, one estimates $f(\lambda)$ by
\[
\hat f(\lambda) = \frac{1}{2\pi} \sum_{k=-m_n}^{m_n} W_n(k)\, I_n\big(g(n, \lambda) + 2\pi k / n\big).
\]
If $X_t = \sum_{k=-\infty}^{+\infty} C_k Z_{t-k}$ with $\{Z_t\} \sim IID(0, S)$, then $\hat f_{ij}(\lambda)$ is approximately normal with mean $f_{ij}(\lambda)$ and variance of order $f_{ij}^2(\lambda) \sum_{k=-m_n}^{m_n} W_n^2(k)$, for $0 < \lambda < \pi$. The natural estimator of $|\mathcal{X}_{12}(\lambda)|^2$ is
\[
|\hat{\mathcal{X}}_{12}(\lambda)|^2 = \frac{|\hat f_{12}(\lambda)|^2}{\hat f_{11}(\lambda)\, \hat f_{22}(\lambda)} .
\]

An example of coherency estimation

[Figure: estimated squared coherency between SOI and recruitment, plotted against frequency.] The horizontal line represents a (conservative) test of the hypothesis $|\mathcal{X}_{12}(\lambda)|^2 = 0$. There is strong coherency at periods of one year and longer.
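In R, a smoothed-periodogram estimate of the squared coherency is returned by spec.pgram() applied to a bivariate series (a sketch on simulated data; the smoothing spans and taper are hypothetical choices):

    set.seed(4)
    z <- arima.sim(list(ar = 0.8), n = 512)
    x <- z + rnorm(512, sd = 0.5)                   # two series driven by a common signal
    y <- stats::filter(z, rep(1/3, 3), sides = 1)   # y is a lagged filter of z
    d <- na.omit(ts.union(x = ts(x), y = ts(y)))
    sp <- spec.pgram(d, spans = c(7, 7), taper = 0.1, plot = FALSE)
    plot(sp$freq, sp$coh, type = "l",
         xlab = "frequency", ylab = "squared coherency")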

