Time Series Solutions HT 2009


1. Let {X_t} be the ARMA(1, 1) process

   X_t − φX_{t−1} = ε_t + θε_{t−1},   {ε_t} ∼ WN(0, σ²),

where |φ| < 1 and |θ| < 1. Show that the autocorrelation function of {X_t} is given by

   ρ(1) = (1 + φθ)(φ + θ) / (1 + θ² + 2φθ),   ρ(h) = φ^{h−1} ρ(1) for h ≥ 1.

Solution: Taking expectations gives E(X_t) = φ E(X_{t−1}), and using |φ| < 1 and stationarity we get E(X_t) = E(X_{t−1}) = 0.

For k ≥ 2: multiplying X_t = φX_{t−1} + ε_t + θε_{t−1} by X_{t−k} and taking expectations gives γ_k = φγ_{k−1}, and hence γ_k = φ^{k−1} γ_1 for k ≥ 2.

Multiplying the same equation by X_t and taking expectations gives

   γ_0 = φγ_1 + E[X_t (ε_t + θε_{t−1})].

Also

   X_t = φX_{t−1} + ε_t + θε_{t−1}
       = φ[φX_{t−2} + ε_{t−1} + θε_{t−2}] + ε_t + θε_{t−1}
       = φ²X_{t−2} + φε_{t−1} + φθε_{t−2} + ε_t + θε_{t−1},

so

   γ_0 = φγ_1 + E[(φ²X_{t−2} + φε_{t−1} + φθε_{t−2} + ε_t + θε_{t−1})(ε_t + θε_{t−1})]
       = φγ_1 + σ²(1 + θ² + φθ).

Similarly,

   γ_1 = E(X_t X_{t+1}) = E[X_t (φX_t + ε_{t+1} + θε_t)]
       = φγ_0 + E[(φX_{t−1} + ε_t + θε_{t−1})(ε_{t+1} + θε_t)]
       = φγ_0 + θσ².

We can now solve these two equations for γ_0 and γ_1, and then find γ_k, and hence ρ_k, as required.
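The pair of linear equations for γ_0 and γ_1 above can be solved numerically and compared with the stated closed form for ρ(1). A minimal sketch, with illustrative parameter values (φ = 0.5, θ = 0.3, σ² = 1 are assumptions, not part of the problem):

```python
import numpy as np

# Assumed illustrative ARMA(1,1) parameters with |phi| < 1, |theta| < 1
phi, theta, sigma2 = 0.5, 0.3, 1.0

# Solve the two equations derived in the solution:
#   gamma_0 = phi*gamma_1 + sigma2*(1 + theta**2 + phi*theta)
#   gamma_1 = phi*gamma_0 + theta*sigma2
A = np.array([[1.0, -phi],
              [-phi, 1.0]])
b = np.array([sigma2 * (1 + theta**2 + phi * theta),
              theta * sigma2])
gamma0, gamma1 = np.linalg.solve(A, b)

# Closed form for rho(1) claimed in the problem
rho1_closed = (1 + phi * theta) * (phi + theta) / (1 + theta**2 + 2 * phi * theta)
print(abs(gamma1 / gamma0 - rho1_closed) < 1e-12)  # the two agree
```

The same check works for any admissible (φ, θ), since the linear system encodes exactly the two moment equations in the derivation.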

2. Consider a process consisting of a linear trend plus an additive noise term, that is,

   X_t = β_0 + β_1 t + ε_t,

where β_0 and β_1 are fixed constants, and where the ε_t are independent random variables with zero means and variances σ². Show that X_t is non-stationary, but that the first difference series ∇X_t = X_t − X_{t−1} is second-order stationary, and find the acf of ∇X_t.

Solution: E(X_t) = E(β_0 + β_1 t + ε_t) = β_0 + β_1 t, which depends on t, hence X_t is non-stationary.

Let Y_t = ∇X_t = X_t − X_{t−1}. Then

   Y_t = β_0 + β_1 t + ε_t − {β_0 + β_1 (t − 1) + ε_{t−1}} = β_1 + ε_t − ε_{t−1}.

So

   cov(Y_t, Y_{t+k}) = cov(ε_t − ε_{t−1}, ε_{t+k} − ε_{t+k−1})
                     = E(ε_t ε_{t+k} − ε_{t−1} ε_{t+k} − ε_t ε_{t+k−1} + ε_{t−1} ε_{t+k−1})
                     = 2σ² if k = 0,  −σ² if |k| = 1,  0 if |k| ≥ 2.

Hence Y_t is stationary and its acf is

   ρ_k = 1 if k = 0,  −1/2 if |k| = 1,  0 if |k| ≥ 2.

3. Let {S_t, t = 0, 1, 2, …} be the random walk with constant drift μ, defined by S_0 = 0 and

   S_t = μ + S_{t−1} + ε_t,   t = 1, 2, …,

where ε_1, ε_2, … are independent and identically distributed random variables with mean 0 and variance σ². Compute the mean of S_t and the autocovariance of the process {S_t}. Show that {∇S_t} is stationary and compute its mean and autocovariance function.
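A quick simulation sanity check of Problem 2: difference a simulated trend-plus-noise series and compare the sample lag-1 autocorrelation with the theoretical value −1/2 (the values of β_0, β_1, σ and the sample size below are arbitrary assumptions for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
n, beta0, beta1, sigma = 200_000, 2.0, 0.5, 1.0  # assumed illustrative values

t = np.arange(n)
x = beta0 + beta1 * t + rng.normal(0.0, sigma, size=n)  # linear trend + noise
y = np.diff(x)  # first differences: beta1 + e_t - e_{t-1}

yc = y - y.mean()
rho1 = np.sum(yc[1:] * yc[:-1]) / np.sum(yc**2)  # sample lag-1 autocorrelation
print(rho1)  # close to the theoretical value -0.5
```

The sampling error of the lag-1 sample autocorrelation is of order 1/√n, so with n = 200,000 the estimate should sit within a few thousandths of −1/2.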

Solution: Iterating the recursion,

   S_t = ε_t + μ + S_{t−1}
       = ε_t + μ + ε_{t−1} + μ + S_{t−2}
       = ε_t + ε_{t−1} + 2μ + S_{t−2}
       = …
       = Σ_{j=0}^{t−1} ε_{t−j} + tμ + S_0,

so E(S_t) = 0 + tμ + 0 = tμ.

For the autocovariance of S_t, at lag k ≥ 0,

   E[{S_t − tμ}{S_{t+k} − (t + k)μ}] = E( Σ_{j=0}^{t−1} ε_{t−j} · Σ_{i=0}^{t+k−1} ε_{t+k−i} )
                                     = Σ_{j=0}^{t−1} E(ε_{t−j} ε_{t−j})
                                     = tσ²,

since, when moving from the first line to the second line of the above display, E(ε_{t−j} ε_{t+k−i}) = 0 unless i = j + k.

Next,

   Y_t = ∇S_t = S_t − S_{t−1} = μ + ε_t,

which is clearly stationary, with E(Y_t) = μ. For the autocovariance of Y_t, note Y_t − μ = ε_t, and similarly Y_{t′} − μ = ε_{t′}, so for t ≠ t′ each Y_t depends on a different ε_t, and therefore cov(Y_t, Y_{t′}) = 0 for all t ≠ t′. So the autocovariance function is σ² at lag 0, and is zero at all other lags.

4. If

   X_t = a cos(λt) + ε_t,

where ε_t ∼ WN(0, σ²), and where a and λ are constants, show that {X_t} is not stationary. Now consider the process X_t = a cos(λt + Θ), where Θ is uniformly distributed on (0, 2π), and where a and λ are constants. Is this process stationary? Find the autocorrelations and the spectrum of X_t.
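The moments computed in Problem 3, E(S_t) = tμ and Var(S_t) = tσ², can be checked by Monte Carlo over many independent paths (μ, σ, the horizon and the number of paths below are assumed illustrative values):

```python
import numpy as np

rng = np.random.default_rng(1)
mu, sigma, t_max, n_paths = 0.3, 1.0, 20, 50_000  # assumed illustrative values

# Simulate many independent drifted random walks: S_t = mu + S_{t-1} + e_t, S_0 = 0
eps = rng.normal(0.0, sigma, size=(n_paths, t_max))
S = np.cumsum(mu + eps, axis=1)  # S[:, k-1] holds S_k for each path

t = t_max
print(abs(S[:, t - 1].mean() - t * mu) < 0.1)       # E(S_t) = t * mu
print(abs(S[:, t - 1].var() - t * sigma**2) < 0.5)  # Var(S_t) = t * sigma^2
```

Both tolerances are several standard errors wide at these sample sizes, so the checks pass with a fixed seed.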

[To find the autocorrelations you may want to use the identity cos α cos β = ½{cos(α + β) + cos(α − β)}.]

Solution: E(X_t) = E(a cos(λt) + ε_t) = a cos(λt), which depends on t, so X_t is not stationary.

Now for X_t = a cos(λt + Θ) we need to consider the joint distributions of (X(t_1), …, X(t_k)) and of (X(t_1 + τ), …, X(t_k + τ)). Since shifting time by τ is equivalent to shifting Θ by λτ, and since Θ is uniform on (0, 2π), these two joint distributions are the same, and so X_t is stationary.

   E(X_t) = a E(cos(λt + Θ))
          = (a/2π) ∫_0^{2π} cos(λt + θ) dθ
          = (a/2π) [sin(λt + θ)]_0^{2π}
          = 0.

   γ_t = E(X_t X_0) = a² E(cos(Θ) cos(λt + Θ))
       = a² E[ ½{cos(λt + 2Θ) + cos(λt)} ]
       = (a²/2) [ (1/2π) ∫_0^{2π} cos(λt + 2θ) dθ + cos(λt) ]
       = (a²/2) cos(λt).

So ρ_t = cos(λt).

The spectrum is F, where γ_t = ∫_{−π}^{π} e^{itω} dF(ω). Try the discrete distribution for F that places mass c at λ and at −λ, and no mass elsewhere. Then

   γ_t = e^{itλ} c + e^{−itλ} c = c[cos(tλ) + i sin(tλ) + cos(tλ) − i sin(tλ)] = 2c cos(λt).

So we want 2c = a²/2, that is c = a²/4: the spectral distribution places mass a²/4 at each of ±λ.

5. Find the Yule-Walker equations for the AR(2) process

   X_t = (1/3) X_{t−1} + (2/9) X_{t−2} + ε_t

where ε_t ∼ WN(0, σ²). Hence show that this process has autocorrelation function

   ρ_k = (16/21) (2/3)^k + (5/21) (−1/3)^k.

[To solve an equation of the form aρ_k + bρ_{k−1} + cρ_{k−2} = 0, try ρ_k = Aλ^k for some constants A and λ: solve the resulting quadratic equation for λ and deduce that ρ_k is of the form ρ_k = Aλ_1^k + Bλ_2^k, where A and B are constants.]

Solution: The Yule-Walker equations are

   ρ_k = (1/3) ρ_{k−1} + (2/9) ρ_{k−2}.

So, as in the hint, to solve

   ρ_k − (1/3) ρ_{k−1} − (2/9) ρ_{k−2} = 0,

try ρ_k = Aλ^k. Substituting this into the above equation, and cancelling a factor of λ^{k−2}, we get

   λ² − (1/3) λ − 2/9 = 0,

which has roots λ = 2/3 and λ = −1/3, so ρ_k = A (2/3)^k + B (−1/3)^k. We also require ρ_0 = 1 and ρ_1 = 1/3 + (2/9) ρ_1, so ρ_1 = 3/7. Hence we can solve for A and B: A = 16/21 and B = 5/21. So

   ρ_k = (16/21) (2/3)^k + (5/21) (−1/3)^k.

6. Let {Y_t} be a stationary process with mean zero and let a and b be constants.

(a) If X_t = a + bt + s_t + Y_t, where s_t is a seasonal component with period 12, show that ∇∇_{12} X_t = (1 − B)(1 − B^{12}) X_t is stationary.

(b) If X_t = (a + bt) s_t + Y_t, where s_t is again a seasonal component with period 12, show that ∇²_{12} X_t = (1 − B^{12})(1 − B^{12}) X_t is stationary.

Solution: (a)

   ∇X_t = a + bt + s_t + Y_t − [a + b(t − 1) + s_{t−1} + Y_{t−1}]
        = b + s_t − s_{t−1} + Y_t − Y_{t−1},

   ∇∇_{12} X_t = b + s_t − s_{t−1} + Y_t − Y_{t−1} − [b + s_{t−12} − s_{t−13} + Y_{t−12} − Y_{t−13}]
               = Y_t − Y_{t−1} − Y_{t−12} + Y_{t−13},

and this is a stationary process since Y_t is stationary. (We have used the fact that s_t = s_{t−12} for all t.)
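The closed form in Problem 5 can be checked exactly against the Yule-Walker recursion using rational arithmetic; a minimal sketch:

```python
from fractions import Fraction

# Yule-Walker recursion for the AR(2): rho_k = (1/3) rho_{k-1} + (2/9) rho_{k-2},
# with rho_0 = 1 and rho_1 = 3/7 (from rho_1 = 1/3 + (2/9) rho_1)
rho = [Fraction(1), Fraction(3, 7)]
for k in range(2, 10):
    rho.append(Fraction(1, 3) * rho[k - 1] + Fraction(2, 9) * rho[k - 2])

# Closed form derived in the solution
def rho_closed(k):
    return Fraction(16, 21) * Fraction(2, 3) ** k + Fraction(5, 21) * Fraction(-1, 3) ** k

print(all(rho[k] == rho_closed(k) for k in range(10)))  # prints True: exact agreement
```

Using Fraction avoids floating-point rounding, so the recursion and the closed form match term for term rather than merely to within tolerance.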

(b)

   ∇_{12} X_t = (a + bt) s_t + Y_t − [(a + b(t − 12)) s_{t−12} + Y_{t−12}]
              = Y_t + 12b s_{t−12} − Y_{t−12},

   ∇²_{12} X_t = Y_t + 12b s_{t−12} − Y_{t−12} − [Y_{t−12} + 12b s_{t−24} − Y_{t−24}]
               = Y_t − 2Y_{t−12} + Y_{t−24},

and this is stationary since Y_t is stationary (again using s_t = s_{t−12} for all t).

7. Consider the univariate state-space model given by the state equations X_0 = W_0, X_t = X_{t−1} + W_t, and the observations Y_t = X_t + V_t, t = 1, 2, …, where {V_t} and {W_t} are independent Gaussian white noise processes with var(V_t) = σ²_V and var(W_t) = σ²_W. Show that the data follow an ARIMA(0,1,1) model, that is, that ∇Y_t follows an MA(1) model. Include in your answer an expression for the autocorrelation function of ∇Y_t in terms of σ²_V and σ²_W.

Solution:

   ∇Y_t = Y_t − Y_{t−1} = (X_t + V_t) − (X_{t−1} + V_{t−1}) = X_t − X_{t−1} + V_t − V_{t−1} = W_t + V_t − V_{t−1},

and so ∇Y_t is an MA(1). As V_t, V_{t−1} and W_t are independent,

   γ_0 = Var(∇Y_t) = σ²_W + 2σ²_V.

Furthermore,

   γ_1 = Cov(∇Y_t, ∇Y_{t+1}) = Cov(W_t + V_t − V_{t−1}, W_{t+1} + V_{t+1} − V_t) = −σ²_V,

and, from the independence, γ_k = 0 for k ≥ 2. Hence the acf is ρ_0 = 1,

   ρ_1 = −σ²_V / (σ²_W + 2σ²_V),

and ρ_k = 0 for k ≥ 2.
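The ARIMA(0,1,1) conclusion of Problem 7 can be checked by simulating the state-space model directly and inspecting the sample acf of ∇Y_t (the noise scales and the sample size below are assumed illustrative values):

```python
import numpy as np

rng = np.random.default_rng(2)
n, sigma_w, sigma_v = 400_000, 1.0, 0.8  # assumed illustrative values

w = rng.normal(0.0, sigma_w, size=n)
v = rng.normal(0.0, sigma_v, size=n)
x = np.cumsum(w)     # state: X_t = X_{t-1} + W_t, with X_0 = W_0
y = x + v            # observations: Y_t = X_t + V_t
dy = np.diff(y)      # should behave as an MA(1): W_t + V_t - V_{t-1}

dyc = dy - dy.mean()

def acf(k):
    """Sample autocorrelation of dy at lag k >= 1."""
    return np.sum(dyc[k:] * dyc[:-k]) / np.sum(dyc**2)

rho1_theory = -sigma_v**2 / (sigma_w**2 + 2 * sigma_v**2)
print(abs(acf(1) - rho1_theory) < 0.02)  # lag 1 matches -sigma_V^2/(sigma_W^2 + 2 sigma_V^2)
print(abs(acf(2)) < 0.02)                # lag 2 near zero, as for an MA(1)
```

With n = 400,000 the sampling error of each sample autocorrelation is of order 1/√n ≈ 0.0016, so the 0.02 tolerance is comfortably wide.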