

NANYANG TECHNOLOGICAL UNIVERSITY
SEMESTER II EXAMINATION 2012-2013
MAS451/MTH451 Time Series Analysis
May 2013                                          TIME ALLOWED: 2 HOURS

INSTRUCTIONS TO CANDIDATES

1. This examination paper contains FOUR (4) questions and comprises SEVEN (7) printed pages.
2. Answer ALL questions. The marks for each question are indicated at the beginning of the question.
3. Begin each question on a FRESH page of the answer book.
4. This IS NOT an OPEN BOOK exam.
5. Candidates may use calculators. However, they should write down the steps of their working systematically.

QUESTION 1. (30 marks)

(a) Suppose that a stochastic process X_t satisfies E[X_t] = t^2 and Cov(X_t, X_{t-h}) = γ_h, where {γ_h, h = 0, 1, ...} do not depend on the time t.
(i) Is X_t stationary? Justify your answer.
(ii) Let Y_t = ∇^2 X_t. Is Y_t stationary? Justify your answer.
(iii) Let U_t = 1 + t^2 - X_t. Is U_t stationary? Justify your answer.

Solution
(i) Not stationary, because the mean E[X_t] = t^2 depends on the time t.
(ii) A direct calculation yields
E[Y_t] = E[∇^2 X_t] = E[X_t] - 2E[X_{t-1}] + E[X_{t-2}] = t^2 - 2(t-1)^2 + (t-2)^2 = 2,
and
Cov(Y_t, Y_{t-k}) = Cov(∇^2 X_t, ∇^2 X_{t-k}) = Cov(X_t - 2X_{t-1} + X_{t-2}, X_{t-k} - 2X_{t-1-k} + X_{t-2-k})
does not depend on t, because every term is of the form Cov(X_t, X_{t-h}) = γ_h, which does not depend on t. It follows that Y_t is stationary.
(iii) Clearly E[U_t] = 1 + t^2 - E[X_t] = 1, and
Cov(U_t, U_{t-k}) = Cov(1 + t^2 - X_t, 1 + (t-k)^2 - X_{t-k}) = Cov(X_t, X_{t-k}) = γ_k.
It follows that U_t is stationary.

(b) Suppose that Y_t is stationary with autocovariance function γ_k. Let
Ȳ = (1/n) Σ_{t=1}^n Y_t.
(i) Prove that
Var(Ȳ) = (1/n) Σ_{k=-n+1}^{n-1} (1 - |k|/n) γ_k.
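As a quick numerical sanity check of part (a)(ii), the second difference of the quadratic mean function m(t) = t^2 is identically 2. This is a minimal sketch, not part of the exam; only the mean function is evaluated, the process X_t itself is not simulated.

```python
import numpy as np

# Second difference of the mean function m(t) = t^2:
# m(t) - 2*m(t-1) + m(t-2) should equal 2 for every t.
t = np.arange(3, 200)
second_diff_mean = t**2 - 2 * (t - 1) ** 2 + (t - 2) ** 2

print(second_diff_mean[:5])  # [2 2 2 2 2]
```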

Solution
Proof:
Var(Ȳ) = (1/n^2) Σ_{i=1}^n Σ_{j=1}^n Cov(Y_i, Y_j) = (1/n^2) Σ_{i,j} γ(i - j).
For each fixed k = i - j with -n+1 <= k <= n-1 there are exactly n - |k| pairs (i, j), so
Var(Ȳ) = (1/n^2) Σ_{k=-n+1}^{n-1} (n - |k|) γ_k = (1/n) Σ_{k=-n+1}^{n-1} (1 - |k|/n) γ_k.

(ii) Define the sample variance as
s^2 = (1/(n-1)) Σ_{t=1}^n (Y_t - Ȳ)^2.
Find E[s^2].
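The counting argument behind the proof can be verified numerically for a concrete autocovariance function. Here γ_k = 0.8^|k| (that of an AR(1) up to scale) is an illustrative choice of mine, not from the exam.

```python
# Compare the double-sum form of Var(Ybar) with the (1 - |k|/n) form,
# for an illustrative autocovariance gamma(k) = 0.8^|k| and n = 25.
n = 25
gamma = lambda k: 0.8 ** abs(k)

# (1/n^2) * sum over all pairs (i, j) of gamma(i - j)
lhs = sum(gamma(i - j) for i in range(n) for j in range(n)) / n**2

# (1/n) * sum_{k=-n+1}^{n-1} (1 - |k|/n) * gamma(k)
rhs = sum((1 - abs(k) / n) * gamma(k) for k in range(-n + 1, n)) / n

print(abs(lhs - rhs) < 1e-12)  # True
```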

QUESTION 2. (30 marks)
Let {Z_t} be white noise with mean zero and Var(Z_t) = 1.

(a) Suppose that
X_t = exp(t + 2t^2 + S_t + Z_t),
where S_t = S_{t-12}. Suggest a transformation of X_t so that the transformed series is stationary.

Solution
Let
Y_t = log X_t = t + 2t^2 + S_t + Z_t,
and then set
U_t = ∇_12 Y_t = Y_t - Y_{t-12}
    = (t + 2t^2 + S_t + Z_t) - ((t-12) + 2(t-12)^2 + S_{t-12} + Z_{t-12})
    = 12 + 2(t^2 - (t-12)^2) + ∇_12 Z_t
    = 12 + 24(2t - 12) + ∇_12 Z_t.
Finally, let
V_t = U_t - U_{t-1} = 48 + ∇∇_12 Z_t,
which is stationary and is the transformed series we are looking for.

(b) Consider the following ARMA(p, q) model:
(1 - 0.5B)X_t = (1 + B + 0.25B^2)Z_t.
(i) Is it stationary and invertible? Justify your answer.

Solution
It is stationary because the root of the equation 1 - 0.5z = 0 is z = 2, which lies outside the unit circle. The roots of the equation 1 + z + 0.25z^2 = 0.25(z + 2)^2 = 0 are z = -2 (a double root), which also lie outside the unit circle. It follows that it is invertible.
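The root conditions in (b)(i) can be checked with numpy.roots, which takes polynomial coefficients from the highest power down. This is a sketch of mine, not part of the exam paper.

```python
import numpy as np

# (1 - 0.5B)X_t = (1 + B + 0.25B^2)Z_t:
# stationarity and invertibility require all roots outside the unit circle.
ar_roots = np.roots([-0.5, 1.0])       # roots of -0.5z + 1 = 0
ma_roots = np.roots([0.25, 1.0, 1.0])  # roots of 0.25z^2 + z + 1 = 0

print(ar_roots)  # [2.]
print(ma_roots)  # both roots at (or numerically near) -2
print(all(abs(z) > 1 for z in list(ar_roots) + list(ma_roots)))  # True
```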

(ii) Find its ACF.

Solution
Multiplying both sides of the model by X_{t-k} and taking expectations, we have
E(X_t X_{t-k}) = 0.5E(X_{t-1} X_{t-k}) + E(Z_t X_{t-k}) + E(Z_{t-1} X_{t-k}) + 0.25E(Z_{t-2} X_{t-k}).
When k = 0 we have
γ(0) = 0.5γ(1) + E(Z_t X_t) + E(Z_{t-1} X_t) + 0.25E(Z_{t-2} X_t).   (1)
When k > 2,
γ(k) = 0.5γ(k-1).   (2)
When k = 2,
γ(2) = 0.5γ(1) + 0.25E(Z_{t-2} X_{t-2}),   (3)
and when k = 1,
γ(1) = 0.5γ(0) + E(Z_{t-1} X_{t-1}) + 0.25E(Z_{t-2} X_{t-1}).   (4)
Multiplying both sides of the model by Z_t and taking expectations, we obtain
E(X_t Z_t) = E[Z_t^2] = 1,   (5)
and by stationarity E(Z_{t-1} X_{t-1}) = E(Z_{t-2} X_{t-2}) = 1 as well.
Multiplying both sides by Z_{t-1} and taking expectations, we obtain
E(X_t Z_{t-1}) = 0.5E(Z_{t-1} X_{t-1}) + E[Z_{t-1}^2] = 0.5 + 1 = 1.5,   (6)
and by stationarity E(Z_{t-2} X_{t-1}) = 1.5 as well.
Multiplying both sides by Z_{t-2} and taking expectations, we obtain
E(X_t Z_{t-2}) = 0.5E(Z_{t-2} X_{t-1}) + 0.25E[Z_{t-2}^2] = 0.5(1.5) + 0.25 = 1.   (7)
Plugging these values into (3), (4) and (1) yields
γ(2) = 0.5γ(1) + 0.25,
γ(1) = 0.5γ(0) + 1.375,
γ(0) = 0.5γ(1) + 2.75.
Solving the last two equations gives γ(0) = 55/12 and γ(1) = 11/3, hence γ(2) = 25/12, so the ACF is
ρ(1) = 0.8,  ρ(2) = 5/11,  and ρ(k) = 0.5ρ(k-1) for k > 2.
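These autocovariances can be cross-checked through the MA(∞) representation X_t = Σ_j ψ_j Z_{t-j}, with γ(k) = Σ_j ψ_j ψ_{j+k} since σ^2 = 1. A numerical sketch, not part of the exam:

```python
import numpy as np

# psi-weights of (1 - 0.5B)X_t = (1 + B + 0.25B^2)Z_t:
# psi_0 = 1, psi_1 = 0.5*psi_0 + 1, psi_2 = 0.5*psi_1 + 0.25,
# and psi_j = 0.5*psi_{j-1} for j > 2.
m = 200  # truncation point; the geometric tail is negligible
psi = np.zeros(m)
psi[0] = 1.0
psi[1] = 0.5 * psi[0] + 1.0
psi[2] = 0.5 * psi[1] + 0.25
for j in range(3, m):
    psi[j] = 0.5 * psi[j - 1]

# gamma(k) = sum_j psi_j * psi_{j+k}   (white-noise variance sigma^2 = 1)
def gamma(k):
    return float(np.dot(psi[: m - k], psi[k:]))

print(gamma(0), gamma(1), gamma(2))  # approximately 55/12, 11/3, 25/12
```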

QUESTION 3. (20 marks)
A time series X_t with 72 observations is differenced at lag 12 and then at lag 1 to produce a zero-mean series Y_t with the following sample ACF: r(12) = 0.335, r(24) = 0.08, r(36) = 0.012, r(48) = 0.009, and r(j) = 0.5^j for 0 < j < 12.
(i) Suggest a seasonal ARIMA model for X_t and justify your answer.
(ii) Estimate the unknown parameters involved in the model.

Solution
(i) Since 1.96/sqrt(72) ≈ 0.23, we have r(12) > 0.23 while |r(12j)| < 0.23 for j = 2, 3, 4. This suggests Q = 1 and P = 0. The time series was differenced at lag 12 and then at lag 1, which gives D = d = 1. Moreover, r(j) = 0.5^j for 0 < j < 12 indicates p = 1 and q = 0 with AR coefficient φ_1 = 0.5. The model is therefore a seasonal ARIMA(1, 1, 0) × (0, 1, 1)_12.
(ii) The estimated model is
Y_t - 0.5Y_{t-1} = Z_t + 0.384Z_{t-12},
with Y_t = (1 - B)(1 - B^12)X_t, where the seasonal MA coefficient Θ = 0.384 is the invertible solution of Θ/(1 + Θ^2) = r(12) = 0.335.
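The seasonal MA estimate in (ii) comes from the standard MA(1) moment equation ρ = Θ/(1 + Θ^2), solved for the invertible root. A sketch of that calculation, not part of the exam:

```python
import math

# MA(1) moment estimate: rho = Theta / (1 + Theta^2) rearranges to
# rho*Theta^2 - Theta + rho = 0, whose invertible root (|Theta| < 1) is
# Theta = (1 - sqrt(1 - 4*rho^2)) / (2*rho).
rho = 0.335  # = r(12)
theta = (1 - math.sqrt(1 - 4 * rho**2)) / (2 * rho)

print(theta)  # ~0.3845, i.e. the 0.384 used in the solution
```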

QUESTION 4. (20 marks)
The real data set "airline passenger" has been fitted by a seasonal ARIMA model, with the fitting summaries given below.
(i) Identify the possible components of the time series x according to its time plot. [Time plot omitted from the transcription.]
Solution: Trend and seasonality.
(ii) There are three competing models given below, labelled The ARIMA Procedure (I), (II) and (III) respectively. Which one is the best? Write down the model and justify your answer.
Solution: Model I is the best, because its AIC is the smallest. The model is
X_t - X_{t-4} = (1 + 0.52832B)(1 - 0.91401B^4)Z_t.
(iii) Is the fit adequate at the 5% level of significance?
Solution: It is adequate, because the p-value of the Ljung-Box statistic is 0.15, which is greater than 0.05.

Name of Variable = xlog

    Period(s) of Differencing                         4
    Mean of Working Series                     0.092453
    Standard Deviation                         0.038748
    Number of Observations                           16
    Observation(s) eliminated by differencing         4

The ARIMA Procedure (I)

              Conditional Least Squares Estimation

                           Standard               Approx
Parameter     Estimate        Error    t Value    Pr > |t|    Lag
MU             0.08960    0.0041821      21.43      <.0001      0
MA1,1         -0.52832      0.24063      -2.20      0.0469      1
MA2,1          0.91401      0.23379       3.91      0.0018      4

    Constant Estimate      0.089601
    Variance Estimate      0.000767
    Std Error Estimate     0.027694
    AIC                    -66.6859
    SBC                    -64.3681
    Number of Residuals          16
    * AIC and SBC do not include log determinant.

              Autocorrelation Check of Residuals

 To    Chi-            Pr >
Lag  Square   DF     ChiSq   ----------Autocorrelations----------
  6    6.74    4    0.1500    0.044  0.105 -0.004 -0.030 -0.119 -0.460
 12    8.31   10    0.5990   -0.077 -0.034 -0.114 -0.074  0.052  0.073

        Moving Average Factors
    Factor 1:  1 + 0.52832 B**(1)
    Factor 2:  1 - 0.91401 B**(4)
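The lag-6 chi-square value printed above for Model I can be reproduced from its residual autocorrelations via the Ljung-Box formula Q = n(n+2) Σ_{k=1}^{6} r_k^2/(n-k) with n = 16 residuals. This is a sketch of mine; the small discrepancy against the printed value comes from the autocorrelations being shown to only three decimals.

```python
# Ljung-Box statistic for Model I's residuals at lag 6, n = 16 residuals.
n = 16
r = [0.044, 0.105, -0.004, -0.030, -0.119, -0.460]

Q = n * (n + 2) * sum(rk**2 / (n - k) for k, rk in enumerate(r, start=1))
print(round(Q, 2))  # ~6.75, matching the printed 6.74 up to rounding
```

The degrees of freedom, 4, are the 6 lags minus the 2 estimated MA parameters.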

The ARIMA Procedure (II)

              Conditional Least Squares Estimation

                           Standard               Approx
Parameter     Estimate        Error    t Value    Pr > |t|    Lag
MU             0.08953      0.01240       7.22      <.0001      0
AR1,1          0.64710      0.24510       2.64      0.0204      1
AR2,1         -0.57153      0.27888      -2.05      0.0612      4

    Constant Estimate      0.049652
    Variance Estimate      0.000853
    Std Error Estimate     0.029205
    AIC                    -64.9857
    SBC                     -62.668
    Number of Residuals          16
    * AIC and SBC do not include log determinant.

              Autocorrelation Check of Residuals

 To    Chi-            Pr >
Lag  Square   DF     ChiSq   ----------Autocorrelations----------
  6    5.17    4    0.2707   -0.036  0.197  0.010 -0.126 -0.069 -0.365
 12    7.18   10    0.7087    0.033 -0.203 -0.082 -0.002  0.049  0.031

        Autoregressive Factors
    Factor 1:  1 - 0.6471 B**(1)
    Factor 2:  1 + 0.57153 B**(4)

The ARIMA Procedure (III)

              Conditional Least Squares Estimation

                           Standard               Approx
Parameter     Estimate        Error    t Value    Pr > |t|    Lag
MU             0.08898    0.0064187      13.86      <.0001      0
MA1,1         -0.12647      0.68861      -0.18      0.8576      1
MA2,1          0.70853      0.52132       1.36      0.2013      4
AR1,1          0.44064      0.64287       0.69      0.5073      1
AR2,1         -0.15695      0.61680      -0.25      0.8038      4

    Constant Estimate      0.057585
    Variance Estimate      0.000882
    Std Error Estimate     0.029697
    AIC                    -63.1235
    SBC                    -59.2605
    Number of Residuals          16
    * AIC and SBC do not include log determinant.

              Autocorrelation Check of Residuals

 To    Chi-            Pr >
Lag  Square   DF     ChiSq   ----------Autocorrelations----------
  6    6.36    2    0.0416    0.001  0.014  0.024 -0.007 -0.088 -0.462
 12    7.55    8    0.4785   -0.004 -0.034 -0.109 -0.070  0.052  0.062

        Autoregressive Factors
    Factor 1:  1 - 0.44064 B**(1)
    Factor 2:  1 + 0.15695 B**(4)

        Moving Average Factors
    Factor 1:  1 + 0.12647 B**(1)
    Factor 2:  1 - 0.70853 B**(4)

END OF PAPER