Akaike criterion: Kullback-Leibler discrepancy


Model choice. Akaike's criterion

Akaike criterion: Kullback-Leibler discrepancy

Given a family of probability densities $\{f(\cdot\,;\psi),\ \psi \in \Psi\}$, the Kullback-Leibler index of $f(\cdot\,;\psi)$ relative to $f(\cdot\,;\theta)$ is
$$\Delta(\psi \mid \theta) = E_\theta\bigl(-2\log f(X;\psi)\bigr) = -2\int_{\mathbb{R}^n} \log\bigl(f(x;\psi)\bigr)\, f(x;\theta)\, dx.$$
The Kullback-Leibler discrepancy between $f(\cdot\,;\psi)$ and $f(\cdot\,;\theta)$ is
$$d(\psi \mid \theta) = \Delta(\psi \mid \theta) - \Delta(\theta \mid \theta) = -2\int_{\mathbb{R}^n} \log\left(\frac{f(x;\psi)}{f(x;\theta)}\right) f(x;\theta)\, dx.$$

Jensen's inequality implies $E(\log Y) \le \log E(Y)$ for any positive random variable $Y$. Hence
$$d(\psi \mid \theta) \ge -2\log \int_{\mathbb{R}^n} \frac{f(x;\psi)}{f(x;\theta)}\, f(x;\theta)\, dx = -2\log 1 = 0,$$
with equality only if $f(x;\psi) = f(x;\theta)$ a.e. $[f(\cdot\,;\theta)]$.
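
As a small numerical illustration (not from the slides; the two univariate Gaussian models and their parameters are arbitrary choices), the discrepancy can be computed by numerical integration, and it behaves as Jensen's inequality predicts:

## d(psi | theta) = -2 * E_theta[ log( f(X; psi) / f(X; theta) ) ],
## computed here for two N(mean, sd) models by numerical integration.
kl_discrepancy <- function(psi, theta) {
  integrand <- function(x) {
    -2 * (dnorm(x, psi[1], psi[2], log = TRUE) -
          dnorm(x, theta[1], theta[2], log = TRUE)) * dnorm(x, theta[1], theta[2])
  }
  integrate(integrand, -Inf, Inf)$value
}
kl_discrepancy(c(0.5, 1), c(0, 1))  # positive, by Jensen's inequality
kl_discrepancy(c(0, 1), c(0, 1))    # zero when psi = theta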

Approximating the Kullback-Leibler discrepancy

Given observations $X_1, \dots, X_n$, we would like to minimize $d(\psi \mid \theta)$ among all candidate models $\psi$, given the true model $\theta$. As the true model is unknown, we estimate $d(\psi \mid \theta)$.

Let $\psi = (\phi, \vartheta, \sigma^2)$ be the parameters of an ARMA(p,q) model and $\hat\psi$ the MLE based on $X_1, \dots, X_n$. Let $Y$ be an independent realization of the same process. Then
$$-2\log L_Y(\hat\phi, \hat\vartheta, \hat\sigma^2) = n\log(2\pi) + n\log\hat\sigma^2 + \log(r_0 \cdots r_{n-1}) + \frac{S_Y(\hat\phi, \hat\vartheta)}{\hat\sigma^2}.$$

Indeed, remember that for an ARMA(p,q) process
$$L(\phi, \vartheta, \sigma^2) = (2\pi\sigma^2)^{-n/2} (r_0 \cdots r_{n-1})^{-1/2} \exp\left\{-\frac{1}{2\sigma^2} S(\phi, \vartheta)\right\}, \qquad S(\phi, \vartheta) = \sum_{j=1}^{n} \frac{(x_j - \hat x_j)^2}{r_{j-1}}.$$
The quantities $r_0, \dots, r_{n-1}$ depend only on the parameters $(\phi, \vartheta)$ and not on the observed data; the data enter the likelihood only through the terms $(x_j - \hat x_j)^2$ in $S(\phi, \vartheta)$. Hence
$$-2\log L_Y(\hat\phi, \hat\vartheta, \hat\sigma^2) = -2\log L_X(\hat\phi, \hat\vartheta, \hat\sigma^2) + \frac{S_Y(\hat\phi, \hat\vartheta)}{\hat\sigma^2} - \frac{S_X(\hat\phi, \hat\vartheta)}{\hat\sigma^2} = -2\log L_X(\hat\phi, \hat\vartheta, \hat\sigma^2) + \frac{S_Y(\hat\phi, \hat\vartheta)}{\hat\sigma^2} - n,$$
since $n\hat\sigma^2 = S_X(\hat\phi, \hat\vartheta)$. Taking expectations under the true model,
$$E_\theta\bigl(\Delta(\hat\psi \mid \theta)\bigr) = E_{(\phi,\vartheta,\sigma^2)}\bigl(-2\log L_X(\hat\phi, \hat\vartheta, \hat\sigma^2)\bigr) + E_{(\phi,\vartheta,\sigma^2)}\left(\frac{S_Y(\hat\phi, \hat\vartheta)}{\hat\sigma^2}\right) - n.$$
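
For reference, $-2\log L_X(\hat\phi, \hat\vartheta, \hat\sigma^2)$ is the quantity that R's arima() maximizes (up to sign); a minimal sketch on a simulated ARMA(1,1) series, where the model and sample size are arbitrary choices for illustration:

set.seed(1)
x   <- arima.sim(model = list(ar = 0.7, ma = 0.4), n = 200)  # simulated ARMA(1,1)
fit <- arima(x, order = c(1, 0, 1))                          # Gaussian ML fit
-2 * fit$loglik                                              # -2 log L_X at the MLE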

Kullback-Leibler discrepancy and AICC

Using linear approximations and the asymptotic distributions of the estimators, one arrives at
$$E_{(\phi,\vartheta,\sigma^2)}\bigl(S_Y(\hat\phi, \hat\vartheta)\bigr) \approx \sigma^2 (n + p + q).$$
Similarly, $n\hat\sigma^2 = S_X(\hat\phi, \hat\vartheta)$ is for large $n$ distributed as $\sigma^2 \chi^2(n - p - q - 2)$ and is asymptotically independent of $(\hat\phi, \hat\vartheta)$. Hence
$$E_{(\phi,\vartheta,\sigma^2)}\left(\frac{S_Y(\hat\phi, \hat\vartheta)}{\hat\sigma^2}\right) \approx \frac{\sigma^2 (n + p + q)}{\sigma^2 (n - p - q - 2)/n} = \frac{(n + p + q)\, n}{n - p - q - 2}.$$
Substituting into $E_\theta(\Delta(\hat\psi \mid \theta)) = E(-2\log L_X(\hat\phi, \hat\vartheta, \hat\sigma^2)) + E(S_Y(\hat\phi, \hat\vartheta)/\hat\sigma^2) - n$, and noting that $\frac{(n+p+q)n}{n-p-q-2} - n = \frac{2(p+q+1)n}{n-p-q-2}$, it follows that
$$\mathrm{AICC} = -2\log L_X(\hat\phi, \hat\vartheta, \hat\sigma^2) + \frac{2(p+q+1)n}{n - p - q - 2}$$
is an approximately unbiased estimate of $\Delta(\hat\psi \mid \theta)$.

Criteria for model choice

The order is chosen by minimizing the value of the AICC (corrected Akaike information criterion):
$$\mathrm{AICC} = -2\log L_X(\hat\phi, \hat\vartheta, \hat\sigma^2) + \frac{2(p+q+1)n}{n - p - q - 2}.$$
The second term can be considered a penalty for models with a large number of parameters.

For large $n$ it is approximately the same as Akaike's information criterion (AIC), $-2\log L_X(\hat\phi, \hat\vartheta, \hat\sigma^2) + 2(p+q+1)$, but it carries a higher penalty for finite $n$ and is thus somewhat less likely to overfit. In R:

AICC <- AIC(myfit, k = 2*n/(n-p-q-2))

A rule of thumb is that the fits of model 1 and model 2 are not significantly different if $|\mathrm{AICC}_1 - \mathrm{AICC}_2| < 2$ (only the difference matters, not the absolute value of the AICC). Hence we may decide to choose model 1 if it is simpler than model 2 (or if its residuals are closer to white noise) even if $\mathrm{AICC}_1 > \mathrm{AICC}_2$, as long as $\mathrm{AICC}_1 < \mathrm{AICC}_2 + 2$.
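
A minimal order-selection sketch along these lines, using the AIC() call from the slides on a simulated series (the series, the sample size, and the grid of orders are arbitrary choices):

set.seed(1)
x <- arima.sim(model = list(ar = 0.7, ma = 0.4), n = 200)
n <- length(x)
best <- NULL
for (p in 0:2) for (q in 0:2) {
  fit  <- arima(x, order = c(p, 0, q))
  aicc <- AIC(fit, k = 2 * n / (n - p - q - 2))  # the correction from the slides
  if (is.null(best) || aicc < best$aicc) best <- list(p = p, q = q, aicc = aicc)
}
best  # order minimizing AICC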

Tests on residuals

Let $\hat X_t(\hat\phi, \hat\vartheta)$ be the predicted value of $X_t$ given the estimates $(\hat\phi, \hat\vartheta)$, and let
$$\hat W_t = \frac{X_t - \hat X_t(\hat\phi, \hat\vartheta)}{\bigl(r_{t-1}(\hat\phi, \hat\vartheta)\bigr)^{1/2}}$$
be the standardized residuals. If the model is adequate, $\{\hat W_t\}$ should behave approximately like white noise. Checks include: portmanteau tests on the ACF of $\hat W_t$ (Box-Pierce; Ljung-Box); the test on turning points; rank tests.
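
A sketch of the Ljung-Box portmanteau test in R on the fit from the previous sketches (the lag and orders are arbitrary; fitdf = p + q accounts for the estimated ARMA coefficients):

set.seed(1)
x   <- arima.sim(model = list(ar = 0.7, ma = 0.4), n = 200)
fit <- arima(x, order = c(1, 0, 1))
Box.test(residuals(fit), lag = 20, type = "Ljung-Box", fitdf = 2)  # fitdf = p + q
tsdiag(fit)  # standardized residuals, their ACF, Ljung-Box p-values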

Autocovariance

A multivariate stochastic process $\{X_t \in \mathbb{R}^m\}$, $t \in \mathbb{Z}$, is weakly stationary if
$$E(X_{t,i}^2) < \infty \quad \forall t, i, \qquad E(X_t) \equiv \mu, \qquad \mathrm{Cov}(X_{t+h}, X_t) \equiv \Gamma(h).$$
In particular $\gamma_{ij}(h) = \mathrm{Cov}(X_{t+h,i}, X_{t,j}) = E\bigl((X_{t+h,i} - \mu_i)(X_{t,j} - \mu_j)\bigr)$.

Note that in general $\gamma_{ij}(h) \ne \gamma_{ji}(h)$, while
$$\gamma_{ij}(h) = \mathrm{Cov}(X_{t+h,i}, X_{t,j}) \stackrel{\text{(stationarity)}}{=} \mathrm{Cov}(X_{t,i}, X_{t-h,j}) \stackrel{\text{(symmetry)}}{=} \mathrm{Cov}(X_{t-h,j}, X_{t,i}) = \gamma_{ji}(-h).$$

Another simple property is $|\gamma_{ij}(h)| \le (\gamma_{ii}(0)\gamma_{jj}(0))^{1/2}$. The ACF is
$$\rho_{ij}(h) = \frac{\gamma_{ij}(h)}{(\gamma_{ii}(0)\gamma_{jj}(0))^{1/2}}.$$
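
A quick sketch of estimating the matrix ACF in R (the two simulated AR(1) series and their coefficients are arbitrary; acf() on a multivariate ts produces the full panel of auto- and cross-correlations):

set.seed(2)
z <- ts(cbind(x1 = arima.sim(list(ar = 0.5), n = 200),
              x2 = arima.sim(list(ar = -0.3), n = 200)))
acf(z, lag.max = 20)  # panels: rho_11, rho_12 (rho_21 appears at negative lags), rho_22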

Multivariate white noise and MA

A multivariate stochastic process $\{Z_t \in \mathbb{R}^m\}$ is a white noise with covariance $S$, written $\{Z_t\} \sim \mathrm{WN}(0, S)$, if $\{Z_t\}$ is stationary with mean $0$ and ACVF
$$\Gamma(h) = \begin{cases} S & h = 0 \\ 0 & h \ne 0. \end{cases}$$

$\{X_t \in \mathbb{R}^m\}$ is a linear process if
$$X_t = \sum_{k=-\infty}^{+\infty} C_k Z_{t-k}, \qquad \{Z_t\} \sim \mathrm{WN}(0, S),$$
where the $C_k$ are matrices such that $\sum_{k=-\infty}^{+\infty} |(C_k)_{ij}| < +\infty$ for all $i, j = 1, \dots, m$.

Then $\{X_t\}$ is stationary and
$$\Gamma_X(h) = \sum_{k=-\infty}^{+\infty} C_{k+h}\, S\, C_k^t.$$

Estimation of the mean

The mean $\mu$ can be estimated through $\bar X_n$. From the univariate theory, we know that
$$E(\bar X_n) = \mu; \qquad V\bigl((\bar X_n)_i\bigr) \to 0 \ (\text{as } n \to \infty) \ \text{if } \gamma_{ii}(h) \to 0 \ \text{as } h \to \infty; \qquad n\,V\bigl((\bar X_n)_i\bigr) \to \sum_{h=-\infty}^{+\infty} \gamma_{ii}(h) \ \text{if } \sum_{h=-\infty}^{+\infty} |\gamma_{ii}(h)| < +\infty.$$
Moreover $(\bar X_n)_i$ is asymptotically normal. Stronger assumptions are required for the vector $\bar X_n$ to be asymptotically normal.

Theorem. If $X_t = \mu + \sum_{k=-\infty}^{+\infty} C_k Z_{t-k}$ with $\{Z_t\} \sim \mathrm{WN}(0, S)$, then
$$n^{1/2}(\bar X_n - \mu) \Rightarrow N\Bigl(0, \sum_{h=-\infty}^{+\infty} \Gamma_X(h)\Bigr), \qquad \Gamma_X(h) = \sum_{k=-\infty}^{+\infty} C_{k+h}\, S\, C_k^t.$$

Confidence intervals for the mean

In principle, from $\bar X_n \approx N\bigl(\mu, \frac{1}{n}\sum_{h=-\infty}^{+\infty} \Gamma_X(h)\bigr)$ one could build an $m$-dimensional confidence ellipsoid. But this is not intuitive, and the $C_k$ and $S$ are not known and would have to be estimated. Instead, build confidence intervals from
$$(\bar X_n)_i \approx N\Bigl(\mu_i, \frac{1}{n}\sum_{h=-\infty}^{+\infty} \gamma_{ii}(h)\Bigr).$$

Here $\sum_{h=-\infty}^{+\infty} \gamma_{ii}(h) = 2\pi f_i(0)$ can be consistently estimated by
$$2\pi \hat f_i(0) = \sum_{h=-r}^{r} \Bigl(1 - \frac{|h|}{r}\Bigr) \hat\gamma_{ii}(h), \qquad \text{where } r = r_n \to \infty \text{ and } r_n/n \to 0.$$

Componentwise confidence intervals can be combined. If we find $u_i(\alpha)$ such that $P\bigl(|\mu_i - (\bar X_n)_i| < u_i(\alpha)\bigr) \ge 1 - \alpha$, then
$$P\bigl(|\mu_i - (\bar X_n)_i| < u_i(\alpha),\ i = 1, \dots, m\bigr) \ge 1 - \sum_{i=1}^{m} P\bigl(|\mu_i - (\bar X_n)_i| \ge u_i(\alpha)\bigr) \ge 1 - m\alpha.$$
Choosing $\alpha = 0.05/m$, one has a 95% confidence $m$-rectangle.
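
A sketch of the resulting Bonferroni rectangle in R, under the assumptions above (the bivariate series z is the simulated example from the earlier sketch; the truncation r = sqrt(n) is an arbitrary choice):

set.seed(2)
z <- cbind(arima.sim(list(ar = 0.5), n = 200), arima.sim(list(ar = -0.3), n = 200))
m <- ncol(z); n <- nrow(z); r <- floor(sqrt(n))
xbar <- colMeans(z)
ci <- sapply(1:m, function(i) {
  g <- acf(z[, i], lag.max = r, type = "covariance", plot = FALSE)$acf
  v <- g[1] + 2 * sum((1 - (1:r)/r) * g[-1])  # estimate of sum_h gamma_ii(h)
  u <- qnorm(1 - 0.05/(2 * m)) * sqrt(v/n)    # alpha = 0.05/m per component
  xbar[i] + c(-u, u)
})
ci  # columns: 95% Bonferroni confidence intervals for mu_1, ..., mu_m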

Estimation of the ACVF (bivariate case, m = 2)

$$\hat\Gamma(h) = \begin{cases} \dfrac{1}{n} \displaystyle\sum_{t=1}^{n-h} (X_{t+h} - \bar X_n)(X_t - \bar X_n)^t & 0 \le h < n, \\[8pt] \dfrac{1}{n} \displaystyle\sum_{t=-h+1}^{n} (X_{t+h} - \bar X_n)(X_t - \bar X_n)^t & -n < h < 0, \end{cases}$$
and $\hat\rho_{ij}(h) = \hat\gamma_{ij}(h)\,(\hat\gamma_{ii}(0)\hat\gamma_{jj}(0))^{-1/2}$.

Theorem. If $X_t = \mu + \sum_{k=-\infty}^{+\infty} C_k Z_{t-k}$ with $\{Z_t\} \sim \mathrm{IID}(0, S)$, then for all $h$
$$\hat\gamma_{ij}(h) \xrightarrow{P} \gamma_{ij}(h) \quad \text{and} \quad \hat\rho_{ij}(h) \xrightarrow{P} \rho_{ij}(h) \qquad \text{as } n \to \infty.$$

An example: Southern Oscillation Index

The Southern Oscillation Index (an environmental measure) compared to fish recruitment in the South Pacific (1950 to 1985).

[Figure: time series plots of the Southern Oscillation Index and of Recruitment.]

ACF of Southern Oscillation Index

[Figure: 2x2 panel of sample correlations: soi; soi & rec; rec & soi; rec. The bottom-left panel shows $\hat\gamma_{12}$ at negative lags.]
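
A sketch reproducing this panel in R; it assumes the astsa package, which distributes the soi and rec series used in this example:

library(astsa)
acf(cbind(soi, rec))  # 2x2 panel of auto- and cross-correlations
ccf(soi, rec)         # the cross-correlation function alone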

An example from Box and Jenkins

Sales (V2) with a leading indicator (V1).

[Figure: time series plots of V1 and V2.]

ACF of sales data

[Figure: 2x2 panel of ACFs and cross-correlations of V1 and V2.]

The data are not stationary.

Differenced sales data

[Figure: time series plots of the differenced series V1 and V2.]

ACF of differenced sales data

[Figure: 2x2 panel of ACFs and cross-correlations of the differenced series.]

The cross-correlation is relevant only at lags 2 and 3.

Testing for independence of time series: basis

In general the asymptotic distribution of $\hat\gamma_{ij}(h)$ is complicated. But:

Theorem. Let
$$X_{t,1} = \sum_{j=-\infty}^{\infty} \alpha_j Z_{t-j,1}, \qquad X_{t,2} = \sum_{j=-\infty}^{\infty} \beta_j Z_{t-j,2}$$
with $\{Z_{t,1}\} \sim \mathrm{WN}(0, \sigma_1^2)$ and $\{Z_{t,2}\} \sim \mathrm{WN}(0, \sigma_2^2)$ independent. Then
$$n\,V(\hat\gamma_{12}(h)) \xrightarrow[n \to \infty]{} \sum_{j=-\infty}^{\infty} \gamma_{11}(j)\gamma_{22}(j)$$
and
$$n^{1/2}\,\hat\rho_{12}(h) \Rightarrow N\Bigl(0, \sum_{j=-\infty}^{\infty} \rho_{11}(j)\rho_{22}(j)\Bigr).$$

Testing for independence of time series: an example

Suppose $\{X_{t,1}\}$ and $\{X_{t,2}\}$ are independent AR(1) processes with $\rho_{ii}(h) = 0.8^{|h|}$. Then the asymptotic variance of $\hat\rho_{12}(h)$ is
$$n^{-1} \sum_{h=-\infty}^{\infty} 0.64^{|h|} \approx 4.556\, n^{-1}.$$
Values of $|\hat\rho_{12}(h)|$ considerably larger than $1.96\, n^{-1/2}$ should therefore be common even if the two series are independent.

Instead, if one series is white noise, then $V(\hat\rho_{12}(h)) \approx \frac{1}{n}$. Hence, in testing for independence, it is often recommended to prewhiten one of the series.
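
A small simulation sketch of this effect (coefficients as in the example; the sample size is arbitrary): the sample cross-correlations of two independent AR(1) series routinely exceed the naive white-noise bounds.

set.seed(3)
n  <- 200
x1 <- arima.sim(list(ar = 0.8), n = n)
x2 <- arima.sim(list(ar = 0.8), n = n)
ccf(x1, x2)  # dashed bounds +/- 1.96/sqrt(n) assume white noise
abline(h = c(-1, 1) * 1.96 * sqrt(4.556/n), col = "red", lty = 2)  # corrected bounds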

Pre-whitening a time series

Instead of testing $\hat\rho_{12}(h)$ of the original series, one transforms them into white noise. If $\{X_{t,1}\}$ and $\{X_{t,2}\}$ are invertible ARMA processes, then
$$Z_{t,i} = \sum_{k=0}^{\infty} \pi_k^{(i)} X_{t-k,i} \sim \mathrm{WN}(0, \sigma_i^2), \qquad i = 1, 2,$$
where $\sum_{k=0}^{\infty} \pi_k^{(i)} z^k = \pi^{(i)}(z) = \phi^{(i)}(z)/\theta^{(i)}(z)$.

$\{X_{t,1}\}$ and $\{X_{t,2}\}$ are independent if and only if $\{Z_{t,1}\}$ and $\{Z_{t,2}\}$ are, hence one tests $\hat\rho_{Z_1 Z_2}(h)$.

As $\phi^{(i)}(z)$ and $\theta^{(i)}(z)$ are not known, one fits an ARMA model to each series and uses the residuals $\hat W_{t,i}$ in place of $Z_{t,i}$. It may be enough to do this for just one of the series.
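
A sketch of this procedure in R on the simulated pair from the previous sketch (the AR(1) orders are the true ones here; in practice they would have to be identified first):

set.seed(3)
x1 <- arima.sim(list(ar = 0.8), n = 200)
x2 <- arima.sim(list(ar = 0.8), n = 200)
fit1 <- arima(x1, order = c(1, 0, 0))
fit2 <- arima(x2, order = c(1, 0, 0))
ccf(residuals(fit1), residuals(fit2))  # the +/- 1.96/sqrt(n) bounds now apply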

Simulated data

The 1st series is AR(1) with $\phi = 0.9$; the 2nd series is AR(2) with $\phi_1 = 0.7$, $\phi_2 = \dots$

[Figure: time series plots of the two simulated series, dat1 and dat2.]

ACF of simulated data

[Figure: 2x2 panel of ACFs and cross-correlations of dat1 and dat2.]

ACF of residuals

MLE fits the correct model to both series.

[Figure: 2x2 panel of ACFs and cross-correlations of the residuals fitunk1$res and fitunk2$res.]

A few cross-correlation coefficients may appear slightly significant.

Bartlett's formula

More generally:

Theorem. If $\{X_t\}$ is a bivariate Gaussian time series with $\sum_{h=-\infty}^{+\infty} |\gamma_{ij}(h)| < \infty$, then
$$\begin{aligned}
\lim_{n \to \infty} n\,\mathrm{Cov}\bigl(\hat\rho_{12}(h), \hat\rho_{12}(k)\bigr) = \sum_{j=-\infty}^{+\infty} \Bigl[ & \rho_{11}(j)\rho_{22}(j+k-h) + \rho_{12}(j+k)\rho_{21}(j-h) \\
& - \rho_{12}(h)\bigl(\rho_{11}(j)\rho_{12}(j+k) + \rho_{22}(j)\rho_{21}(j-k)\bigr) \\
& - \rho_{12}(k)\bigl(\rho_{11}(j)\rho_{12}(j+h) + \rho_{22}(j)\rho_{21}(j-h)\bigr) \\
& + \rho_{12}(h)\rho_{12}(k)\Bigl(\tfrac{1}{2}\rho_{11}^2(j) + \rho_{12}^2(j) + \tfrac{1}{2}\rho_{22}^2(j)\Bigr) \Bigr].
\end{aligned}$$

Spectral density of multivariate series

If $\sum_{h=-\infty}^{+\infty} |\gamma_{ij}(h)| < \infty$, one can define
$$f(\lambda) = \frac{1}{2\pi} \sum_{h=-\infty}^{\infty} e^{-ih\lambda}\, \Gamma(h), \qquad \lambda \in [-\pi, \pi],$$
and one obtains
$$\Gamma(h) = \int_{-\pi}^{\pi} e^{i\lambda h} f(\lambda)\, d\lambda$$
and
$$X_t = \int_{-\pi}^{\pi} e^{i\lambda t}\, dZ(\lambda),$$
where the $Z_i(\cdot)$ are (complex) processes with independent increments such that
$$\int_{\lambda_1}^{\lambda_2} f_{ij}(\lambda)\, d\lambda = E\Bigl( \bigl(Z_i(\lambda_2) - Z_i(\lambda_1)\bigr)\, \overline{\bigl(Z_j(\lambda_2) - Z_j(\lambda_1)\bigr)} \Bigr).$$

Coherence of series

For a bivariate series the coherence at frequency $\lambda$ is
$$\mathcal{X}_{12}(\lambda) = \frac{f_{12}(\lambda)}{\bigl[f_{11}(\lambda) f_{22}(\lambda)\bigr]^{1/2}}$$
and represents the correlation between $dZ_1(\lambda)$ and $dZ_2(\lambda)$. The squared coherency function $|\mathcal{X}_{12}(\lambda)|^2$ satisfies $0 \le |\mathcal{X}_{12}(\lambda)|^2 \le 1$.

Remark. If $X_{t,2} = \sum_{k=-\infty}^{+\infty} \psi_k X_{t-k,1}$, then $|\mathcal{X}_{12}(\lambda)|^2 = 1$.

Periodogram

Define
$$J(\omega_j) = n^{-1/2} \sum_{t=1}^{n} X_t e^{-it\omega_j}, \qquad \omega_j = 2\pi j/n,$$
for $j$ between $-[(n-1)/2]$ and $[n/2]$. Then $I_n(\omega_j) = J(\omega_j) J^*(\omega_j)$, where $*$ denotes transpose and complex conjugation. In particular
$$I_{12}(\omega_j) = \frac{1}{n} \Bigl( \sum_{t=1}^{n} X_{t1} e^{-it\omega_j} \Bigr) \overline{\Bigl( \sum_{t=1}^{n} X_{t2} e^{-it\omega_j} \Bigr)}$$
is the cross-periodogram.

Estimation of spectral density and coherence

Again, one estimates $f(\lambda)$ by
$$\hat f(\lambda) = \frac{1}{2\pi} \sum_{k=-m_n}^{m_n} W_n(k)\, I_n\Bigl(g(n, \lambda) + \frac{2\pi k}{n}\Bigr).$$
If $X_t = \sum_{k=-\infty}^{+\infty} C_k Z_{t-k}$ with $\{Z_t\} \sim \mathrm{IID}(0, S)$, then
$$\hat f_{ij}(\lambda) \sim \mathrm{AN}\Bigl(f_{ij}(\lambda),\ f_{ij}(\lambda) \sum_{k=-m_n}^{m_n} W_n^2(k)\Bigr), \qquad 0 < \lambda < \pi.$$
The natural estimator of $|\mathcal{X}_{12}(\lambda)|^2$ is
$$\hat\chi_{12}^2(\lambda) = \frac{|\hat f_{12}(\lambda)|^2}{\hat f_{11}(\lambda)\, \hat f_{22}(\lambda)}.$$
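
A sketch of coherency estimation in R via the smoothed periodogram (it assumes the astsa soi and rec series again; the Daniell smoothing spans are arbitrary choices):

library(astsa)
sr <- ts.union(soi, rec)
sp <- spec.pgram(sr, spans = c(7, 7), taper = 0, plot = FALSE)  # smoothed periodogram
plot(sp$freq, sp$coh, type = "l",
     xlab = "frequency", ylab = "squared coherency")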

An example of coherency estimation

[Figure: estimated squared coherency between SOI and recruitment, plotted against frequency.]

The horizontal line represents a (conservative) test of the hypothesis $|\mathcal{X}_{12}(\lambda)|^2 = 0$. There is strong coherency at the period of one year and at longer periods.
