NANYANG TECHNOLOGICAL UNIVERSITY
SEMESTER II EXAMINATION 2012-2013
MAS451/MTH451 Time Series Analysis
May 2013
TIME ALLOWED: 2 HOURS

INSTRUCTIONS TO CANDIDATES
1. This examination paper contains FOUR (4) questions and comprises SEVEN (7) printed pages.
2. Answer ALL questions. The marks for each question are indicated at the beginning of each question.
3. Answer each question beginning on a FRESH page of the answer book.
4. This IS NOT an OPEN BOOK exam.
5. Candidates may use calculators. However, they should systematically write down the steps in their working.
QUESTION 1. (30 marks)

(a) Suppose that a stochastic process $X_t$ satisfies $EX_t = t^2$ and $\mathrm{Cov}(X_t, X_{t-h}) = \gamma_h$, where $\{\gamma_h, h = 0, 1, \dots\}$ are independent of the time $t$.

(i) Is $X_t$ stationary? Justify your answer.
(ii) Let $Y_t = \nabla^2 X_t$. Is $Y_t$ stationary? Justify your answer.
(iii) Let $U_t = 1 + t^2 - X_t$. Is $U_t$ stationary? Justify your answer.

Solution

(i) Not stationary, because its mean $EX_t = t^2$ depends on the time $t$.

(ii) A direct calculation yields
$$EY_t = E(\nabla^2 X_t) = EX_t - 2EX_{t-1} + EX_{t-2} = t^2 - 2(t-1)^2 + (t-2)^2 = 2$$
and
$$\mathrm{Cov}(Y_t, Y_{t-k}) = \mathrm{Cov}(\nabla^2 X_t, \nabla^2 X_{t-k}) = \mathrm{Cov}(X_t - 2X_{t-1} + X_{t-2},\; X_{t-k} - 2X_{t-1-k} + X_{t-2-k}),$$
which is independent of the time $t$ by the fact that $\mathrm{Cov}(X_t, X_{t-h}) = \gamma_h$ is independent of $t$. It follows that $Y_t$ is stationary.

(iii) Obviously $EU_t = 1 + t^2 - EX_t = 1$ and
$$\mathrm{Cov}(U_t, U_{t-k}) = \mathrm{Cov}\big(1 + t^2 - X_t,\; 1 + (t-k)^2 - X_{t-k}\big) = \mathrm{Cov}(X_t, X_{t-k}) = \gamma_k.$$
It follows that $U_t$ is stationary.

(b) Suppose that $Y_t$ is stationary with autocovariance function $\gamma_k$. Let
$$\bar{Y} = \frac{1}{n}\sum_{t=1}^{n} Y_t.$$

(i) Prove that
$$\mathrm{Var}(\bar{Y}) = \frac{1}{n}\sum_{k=-n+1}^{n-1}\Big(1 - \frac{|k|}{n}\Big)\gamma_k.$$
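As an illustrative check on part (a)(ii) (a sketch, not part of the original paper; the simulated noise process is an assumption), one can simulate a process with mean $t^2$ and stationary deviations and verify that its second difference has constant mean 2:

```python
import numpy as np

# Simulate X_t = t^2 + W_t, where W_t is iid standard normal noise,
# so that EX_t = t^2 as in part (a).
rng = np.random.default_rng(0)
t = np.arange(500.0)
x = t**2 + rng.normal(size=t.size)

# Second difference: Y_t = X_t - 2X_{t-1} + X_{t-2}. The deterministic
# part differences away to the constant 2, matching part (ii).
y = np.diff(x, n=2)
print(y.mean())
```

The noise contribution to the sample mean telescopes to four boundary terms divided by the sample size, so the printed mean is very close to 2.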
Solution

Proof:
$$\mathrm{Var}(\bar{Y}) = \frac{1}{n^2}\sum_{i,j=1}^{n}\mathrm{Cov}(Y_i, Y_j) = \frac{1}{n^2}\sum_{i,j=1}^{n}\gamma(i-j) = \frac{1}{n^2}\sum_{k=-n+1}^{n-1}(n - |k|)\gamma_k = \frac{1}{n}\sum_{k=-n+1}^{n-1}\Big(1 - \frac{|k|}{n}\Big)\gamma_k,$$
since for each lag $k = i - j$ there are exactly $n - |k|$ pairs $(i, j)$ with $1 \le i, j \le n$.

(ii) Define the sample variance as
$$s^2 = \frac{1}{n-1}\sum_{t=1}^{n}(Y_t - \bar{Y})^2.$$
Find $Es^2$.

Solution: Writing $\mu = EY_t$, we have $\sum_{t=1}^{n}(Y_t - \bar{Y})^2 = \sum_{t=1}^{n} Y_t^2 - n\bar{Y}^2$, so
$$E\sum_{t=1}^{n}(Y_t - \bar{Y})^2 = n(\gamma_0 + \mu^2) - n\big(\mathrm{Var}(\bar{Y}) + \mu^2\big) = n\gamma_0 - n\,\mathrm{Var}(\bar{Y}),$$
and hence, by part (i),
$$Es^2 = \frac{n}{n-1}\big(\gamma_0 - \mathrm{Var}(\bar{Y})\big) = \frac{n}{n-1}\gamma_0 - \frac{1}{n-1}\sum_{k=-n+1}^{n-1}\Big(1 - \frac{|k|}{n}\Big)\gamma_k.$$
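The variance formula in (b)(i) can be verified numerically against the brute-force double sum. Here $\gamma_k = 0.8^{|k|}$ is an assumed example autocovariance (that of an AR(1) with $\phi = 0.8$, scaled so $\gamma_0 = 1$), not one given in the paper:

```python
import numpy as np

# Compare the double-sum definition of Var(Ybar) with the closed form
# (1/n) * sum_{k=-(n-1)}^{n-1} (1 - |k|/n) * gamma_k.
n = 25
gamma = lambda k: 0.8 ** abs(k)  # assumed example autocovariance

# Direct definition: (1/n^2) * sum over all pairs (i, j) of gamma(i - j).
direct = sum(gamma(i - j) for i in range(n) for j in range(n)) / n**2

# Closed-form expression from part (b)(i).
ks = np.arange(-(n - 1), n)
formula = np.sum((1 - np.abs(ks) / n) * 0.8 ** np.abs(ks)) / n

print(direct, formula)
```

The two numbers agree to floating-point precision, reflecting the counting argument in the proof.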
QUESTION 2. (30 marks)

Let $\{Z_t\}$ be white noise with mean zero and $\mathrm{Var}(Z_t) = 1$.

(a) Suppose that
$$X_t = \exp\big(t + 2t^2 + S_t + Z_t\big),$$
where $S_t = S_{t-12}$. Suggest a transformation for $X_t$ so that the transformed series is stationary.

Solution

Let $Y_t = \log X_t = t + 2t^2 + S_t + Z_t$ and then set
$$U_t = \nabla_{12} Y_t = Y_t - Y_{t-12} = (t + 2t^2 + S_t + Z_t) - \big(t - 12 + 2(t-12)^2 + S_{t-12} + Z_{t-12}\big) = 12 + 24(2t - 12) + \nabla_{12} Z_t.$$
Finally, let
$$V_t = U_t - U_{t-1} = 48 + \nabla\nabla_{12} Z_t,$$
which is stationary and is the transformed series we are looking for.

(b) Consider the following ARMA(p, q) model:
$$(1 - 0.5B)X_t = (1 + B + 0.25B^2)Z_t.$$

(i) Is it stationary and invertible? Justify your answer.

Solution: It is stationary because the root of the equation $1 - 0.5z = 0$ is $z = 2$, which lies outside the unit circle. Also, the root of the equation $1 + z + 0.25z^2 = 0$ is $z = -2$ (a double root), which lies outside the unit circle since $|-2| = 2 > 1$. It follows that the model is invertible.
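The root arguments in (b)(i) can be confirmed numerically; this is an illustrative sketch, not part of the paper. Note that `np.roots` expects coefficients ordered from the highest degree down:

```python
import numpy as np

# AR polynomial 1 - 0.5z: written highest-degree-first as [-0.5, 1].
ar_roots = np.roots([-0.5, 1.0])

# MA polynomial 1 + z + 0.25z^2: highest-degree-first as [0.25, 1, 1].
# Factoring 0.25(z + 2)^2 shows a double root at z = -2.
ma_roots = np.roots([0.25, 1.0, 1.0])

print(ar_roots, ma_roots)
```

Both polynomials have all roots of modulus 2, outside the unit circle, confirming stationarity and invertibility.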
(ii) Find its ACF.

Solution: Multiplying both sides of the model by $X_{t-k}$ and taking expectations, we have
$$E(X_t X_{t-k}) = 0.5E(X_{t-1}X_{t-k}) + E(Z_t X_{t-k}) + E(Z_{t-1}X_{t-k}) + 0.25E(Z_{t-2}X_{t-k}).$$
When $k = 0$ we have
$$\gamma(0) = 0.5\gamma(1) + E(Z_t X_t) + E(Z_{t-1}X_t) + 0.25E(Z_{t-2}X_t). \quad (1)$$
When $k > 2$,
$$\gamma(k) = 0.5\gamma(k-1). \quad (2)$$
When $k = 2$, we have
$$\gamma(2) = 0.5\gamma(1) + 0.25E(Z_{t-2}X_{t-2}), \quad (3)$$
and when $k = 1$,
$$\gamma(1) = 0.5\gamma(0) + E(Z_{t-1}X_{t-1}) + 0.25E(Z_{t-2}X_{t-1}). \quad (4)$$
Multiplying both sides of the model by $Z_t$ and taking expectations, we obtain
$$E(X_t Z_t) = EZ_t^2 = 1, \quad (5)$$
which is the same as $E(Z_{t-1}X_{t-1})$ and $E(Z_{t-2}X_{t-2})$ due to stationarity. Multiplying both sides by $Z_{t-1}$ and taking expectations, we obtain
$$E(X_t Z_{t-1}) = 0.5E(Z_{t-1}X_{t-1}) + EZ_{t-1}^2 = 0.5 \times 1 + 1 = 1.5. \quad (6)$$
Multiplying both sides by $Z_{t-2}$ and taking expectations, we obtain
$$E(X_t Z_{t-2}) = 0.5E(Z_{t-2}X_{t-1}) + 0.25EZ_{t-2}^2 = 0.5 \times 1.5 + 0.25 = 1.0, \quad (7)$$
using $E(Z_{t-2}X_{t-1}) = E(Z_{t-1}X_t) = 1.5$ by stationarity. Plugging the above values into (3), (4) and (1) yields
$$\gamma(2) = 0.5\gamma(1) + 0.25, \qquad \gamma(1) = 0.5\gamma(0) + 1.375, \qquad \gamma(0) = 0.5\gamma(1) + 2.75.$$
Solving the last two equations gives $\gamma(0) = 55/12$ and $\gamma(1) = 11/3$, hence $\gamma(2) = 25/12$, and the ACF is
$$\rho(1) = \frac{\gamma(1)}{\gamma(0)} = 0.8, \qquad \rho(2) = \frac{\gamma(2)}{\gamma(0)} = \frac{5}{11}, \qquad \rho(k) = 0.5\,\rho(k-1) \text{ for } k > 2.$$
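As a numerical check on part (b)(ii) (a sketch, not part of the paper): with the AR coefficient $0.5$, the cross-moments are $E(X_tZ_t) = 1$, $E(X_tZ_{t-1}) = 1.5$ and $E(X_tZ_{t-2}) = 0.5 \times 1.5 + 0.25 = 1.0$, so equations (1) and (4) read $\gamma(0) = 0.5\gamma(1) + 2.75$ and $\gamma(1) = 0.5\gamma(0) + 1.375$, a $2 \times 2$ linear system:

```python
import numpy as np

# Solve the pair of equations
#   gamma(0) - 0.5 gamma(1) = 2.75
#  -0.5 gamma(0) + gamma(1) = 1.375
A = np.array([[1.0, -0.5],
              [-0.5, 1.0]])
b = np.array([2.75, 1.375])
g0, g1 = np.linalg.solve(A, b)

# gamma(2) from equation (3), then the autocorrelations.
g2 = 0.5 * g1 + 0.25
print(g0, g1, g2, g1 / g0, g2 / g0)
```

This gives $\gamma(0) = 55/12$, $\gamma(1) = 11/3$, $\gamma(2) = 25/12$, and hence $\rho(1) = 0.8$, $\rho(2) = 5/11 \approx 0.45$, with $\rho(k) = 0.5\rho(k-1)$ for $k > 2$.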
QUESTION 3. (20 marks)

A time series $X_t$ with 72 observations is differenced at lag 12 and then at lag 1 to produce a zero-mean series $Y_t$ with the following sample ACF: $r(12) = -0.335$, $r(24) = 0.08$, $r(36) = 0.012$, $r(48) = 0.009$, and $r(j) = (-0.5)^j$ for $0 < j < 12$.

(i) Suggest a seasonal ARIMA model for $X_t$ and justify your answer.
(ii) Estimate the unknown parameters involved in the model.

Solution

(i) Since $1.96/\sqrt{72} = 0.23$, we have $|r(12)| > 0.23$ and $|r(12j)| < 0.23$ for $j = 2, 3, 4$. This suggests $Q = 1$ and $P = 0$. Note that the time series is differenced at lag 12 and then at lag 1, which suggests that $D = d = 1$. Moreover, $r(j) = (-0.5)^j$ for $0 < j < 12$ indicates that $p = 1$ and $q = 0$ with corresponding coefficient $\phi_1 = -0.5$. The model is a seasonal ARIMA$(1, 1, 0) \times (0, 1, 1)_{12}$.

(ii) The estimated model is
$$Y_t + 0.5Y_{t-1} = Z_t - 0.384Z_{t-12}, \qquad \text{with } Y_t = (1 - B)(1 - B^{12})X_t,$$
where the seasonal MA coefficient $-0.384$ is the invertible root of the moment equation $\Theta/(1 + \Theta^2) = r(12) = -0.335$.
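The seasonal MA estimate in (ii) follows from the method of moments: for an MA(1) structure at the seasonal lag, $\rho(12) = \Theta/(1 + \Theta^2)$, and equating this to the sample value gives a quadratic whose root inside $(-1, 1)$ is the invertible estimate. A sketch (not part of the paper):

```python
import math

# Moment equation: theta / (1 + theta^2) = r12, i.e.
#   r12 * theta^2 - theta + r12 = 0.
r12 = -0.335
disc = 1 - 4 * r12**2
theta = (1 - math.sqrt(disc)) / (2 * r12)  # root with |theta| < 1

print(theta)
```

The other root of the quadratic ($\approx -2.60$) lies outside the unit interval and is discarded as non-invertible; the retained root is approximately $-0.384$, matching the fitted coefficient.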
QUESTION 4. (20 marks)

The real data set airline passenger has been fitted by a seasonal ARIMA model with the following summary:

(i) Identify the possible components of the time series x according to its time plot below.

Solution: Trend and seasonality.

(ii) There are three competing models given below (indicated by the ARIMA Procedure (I, II, III) respectively). Which one is the best? Write down the model and justify your answer.

Solution: Model I is the best, because its AIC ($-66.69$) is the smallest. The model is
$$X_t - X_{t-4} = (1 + 0.52832B)(1 - 0.91401B^4)Z_t.$$

(iii) Is the fit adequate at the 5% level of significance?

Solution: It is adequate, because the p-value for the Ljung-Box statistic (0.1500 at lag 6) is bigger than 0.05.

Name of Variable = xlog
Period(s) of Differencing                  4
Mean of Working Series              0.092453
Standard Deviation                  0.038748
Number of Observations                    16
Observation(s) eliminated by differencing  4
The ARIMA Procedure (I)

Conditional Least Squares Estimation

Parameter   Estimate   Standard Error   t Value   Approx Pr > |t|   Lag
MU           0.08960        0.0041821     21.43            <.0001     0
MA1,1       -0.52832          0.24063     -2.20            0.0469     1
MA2,1        0.91401          0.23379      3.91            0.0018     4

Constant Estimate     0.089601
Variance Estimate     0.000767
Std Error Estimate    0.027694
AIC                   -66.6859
SBC                   -64.3681
Number of Residuals         16
* AIC and SBC do not include log determinant.

Autocorrelation Check of Residuals

To Lag   Chi-Square   DF   Pr > ChiSq   Autocorrelations
     6         6.74    4       0.1500    0.044  0.105 -0.004 -0.030 -0.119 -0.460
    12         8.31   10       0.5990   -0.077 -0.034 -0.114 -0.074  0.052  0.073

Moving Average Factors
Factor 1:  1 + 0.52832 B**(1)
Factor 2:  1 - 0.91401 B**(4)
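The lag-6 row of the "Autocorrelation Check of Residuals" for Model I can be approximately reproduced from the printed residual autocorrelations (a sketch, not part of the paper; small discrepancies come from the rounding of the printed values):

```python
import math

# Ljung-Box statistic: Q = n(n+2) * sum_{k=1}^{6} r_k^2 / (n - k),
# with n = 16 residuals and the printed autocorrelations at lags 1..6.
n = 16
r = [0.044, 0.105, -0.004, -0.030, -0.119, -0.460]
Q = n * (n + 2) * sum(rk**2 / (n - k) for k, rk in enumerate(r, start=1))

# df = 6 lags - 2 estimated MA parameters = 4. For even df the chi-square
# tail probability has the closed form exp(-Q/2) * sum_{j<df/2} (Q/2)^j / j!.
p = math.exp(-Q / 2) * (1 + Q / 2)

print(Q, p)
```

This reproduces the printed chi-square 6.74 and p-value 0.1500 to within rounding, confirming the adequacy conclusion in Question 4(iii).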
The ARIMA Procedure (II)

Conditional Least Squares Estimation

Parameter   Estimate   Standard Error   t Value   Approx Pr > |t|   Lag
MU           0.08953          0.01240      7.22            <.0001     0
AR1,1        0.64710          0.24510      2.64            0.0204     1
AR2,1       -0.57153          0.27888     -2.05            0.0612     4

Constant Estimate     0.049652
Variance Estimate     0.000853
Std Error Estimate    0.029205
AIC                   -64.9857
SBC                    -62.668
Number of Residuals         16
* AIC and SBC do not include log determinant.

Autocorrelation Check of Residuals

To Lag   Chi-Square   DF   Pr > ChiSq   Autocorrelations
     6         5.17    4       0.2707   -0.036  0.197  0.010 -0.126 -0.069 -0.365
    12         7.18   10       0.7087    0.033 -0.203 -0.082 -0.002  0.049  0.031

Autoregressive Factors
Factor 1:  1 - 0.6471 B**(1)
Factor 2:  1 + 0.57153 B**(4)
The ARIMA Procedure (III)

Conditional Least Squares Estimation

Parameter   Estimate   Standard Error   t Value   Approx Pr > |t|   Lag
MU           0.08898        0.0064187     13.86            <.0001     0
MA1,1       -0.12647          0.68861     -0.18            0.8576     1
MA2,1        0.70853          0.52132      1.36            0.2013     4
AR1,1        0.44064          0.64287      0.69            0.5073     1
AR2,1       -0.15695          0.61680     -0.25            0.8038     4

Constant Estimate     0.057585
Variance Estimate     0.000882
Std Error Estimate    0.029697
AIC                   -63.1235
SBC                   -59.2605
Number of Residuals         16
* AIC and SBC do not include log determinant.

Autocorrelation Check of Residuals

To Lag   Chi-Square   DF   Pr > ChiSq   Autocorrelations
     6         6.36    2       0.0416    0.001  0.014  0.024 -0.007 -0.088 -0.462
    12         7.55    8       0.4785   -0.004 -0.034 -0.109 -0.070  0.052  0.062

Autoregressive Factors
Factor 1:  1 - 0.44064 B**(1)
Factor 2:  1 + 0.15695 B**(4)

Moving Average Factors
Factor 1:  1 + 0.12647 B**(1)
Factor 2:  1 - 0.70853 B**(4)

END OF PAPER