
ACF and PACF of an AR(p)

We will only present the general ideas on how to obtain the ACF and PACF of an AR(p) model, since the details follow closely the AR(1) and AR(2) cases presented before. Recall that the AR(p) model is given by the equation

$$X_t = \phi_1 X_{t-1} + \phi_2 X_{t-2} + \dots + \phi_p X_{t-p} + \omega_t$$

For the ACF, we first multiply both sides of the autoregressive model equation by $X_{t-k}$ to obtain

$$X_{t-k} X_t = \phi_1 X_{t-k} X_{t-1} + \dots + \phi_p X_{t-k} X_{t-p} + \omega_t X_{t-k}$$

Taking expectations on both sides of this equation and dividing by $\gamma_0$, we get

$$\rho_k = \phi_1 \rho_{k-1} + \phi_2 \rho_{k-2} + \dots + \phi_p \rho_{k-p},$$

or, written with the lag operator, $\Phi(B)\rho_k = 0$ with $\Phi(B) = 1 - \phi_1 B - \phi_2 B^2 - \dots - \phi_p B^p$. The implied set of equations for the different values $k = 1, 2, \dots$ is known as the Yule-Walker equations. We again try a solution of the form $\rho_k = \lambda^k$, which leads to the equation

$$\lambda^p - \phi_1 \lambda^{p-1} - \phi_2 \lambda^{p-2} - \dots - \phi_p = 0$$

The solutions of this equation are the reciprocal roots

$\alpha_1, \alpha_2, \dots, \alpha_p$ of the characteristic polynomial $\Phi(B)$. The general solution of the difference equation is

$$\rho_k = \sum_{i=1}^{p} A_i (\alpha_i)^k = A_1 (\alpha_1)^k + A_2 (\alpha_2)^k + \dots + A_p (\alpha_p)^k$$

Thus $\rho_k$ has exponential behavior, and cyclical patterns (damped sine waves) may appear if some of the $\alpha_j$'s are complex numbers.

Theorem. Given a general difference equation of the form $C(B) Z_t = 0$, where $C(B) = 1 + C_1 B + C_2 B^2 + \dots + C_n B^n$ and $C(B) = \prod_{i=1}^{n} (1 - R_i B)$, so that the $R_i$'s are the reciprocal roots of the equation $C(B) = 0$, the solution is $Z_t = \sum_{i=1}^{n} A_i R_i^t$ (stated without proof).
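The Yule-Walker recursion above can be turned into a short computation. The sketch below (in Python with numpy, as an illustration alongside the R code used later in these notes) solves the first $p$ Yule-Walker equations for $\rho_1, \dots, \rho_p$, extends the ACF by the recursion, and checks the damped-sine behavior for an AR(2) with complex reciprocal roots; the function name ar_acf and the parameters $\phi = (1.5, -0.75)$ are illustrative, not from the notes.

```python
import numpy as np

def ar_acf(phi, nlags):
    """Theoretical ACF of a stationary AR(p) from the Yule-Walker equations.

    Solves rho_k = phi_1 rho_{k-1} + ... + phi_p rho_{k-p}, k = 1..p
    (with rho_0 = 1 and rho_{-k} = rho_k), then extends by the same recursion.
    """
    phi = np.asarray(phi, dtype=float)
    p = len(phi)
    M = np.eye(p)                              # coefficients of rho_1..rho_p
    b = np.zeros(p)
    for k in range(1, p + 1):
        for j in range(1, p + 1):
            lag = abs(k - j)
            if lag == 0:
                b[k - 1] += phi[j - 1]         # phi_j * rho_0 goes to the right-hand side
            else:
                M[k - 1, lag - 1] -= phi[j - 1]
    rho = np.ones(nlags + 1)
    rho[1 : p + 1] = np.linalg.solve(M, b)
    for k in range(p + 1, nlags + 1):
        rho[k] = phi @ rho[k - p : k][::-1]    # rho_k = sum_j phi_j rho_{k-j}
    return rho

# AR(2) with complex reciprocal roots: the ACF is a damped sine wave
phi = [1.5, -0.75]
rho = ar_acf(phi, 20)

# Reciprocal roots: solutions of lambda^2 - phi_1 lambda - phi_2 = 0;
# stationarity requires all |alpha_i| < 1
alphas = np.roots([1.0, -phi[0], -phi[1]])
```

For this AR(2), the first two Yule-Walker equations give $\rho_1 = \phi_1/(1-\phi_2)$ and $\rho_2 = \phi_1\rho_1 + \phi_2$, and the complex pair of reciprocal roots makes the ACF oscillate with decaying amplitude.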

For the PACF, we can apply Cramer's rule for $k = 1, \dots, p$, which gives us an expression for $P_{kk}$. If $k > p$, then $P_{kk} = 0$, so the PACF of an AR(p) must cut off to zero after lag $k = p$, where $p$ is the order of the AR model.

ACF and PACF for Moving Average models

Let's start with the MA(1), given by the equation $X_t = \omega_t + \theta \omega_{t-1}$, with $\theta$ the model parameter and $\omega_t \sim N(0, \sigma^2)$. Let's find an expression for the ACF, $\rho_k$. For lag $k = 0$,

$$\gamma_0 = Var(X_t) = Var(\omega_t) + \theta^2 Var(\omega_{t-1}) = \sigma^2(1 + \theta^2)$$

For lag 1, consider the product $X_t X_{t-1}$. Using the MA(1) equation, we obtain

$$X_t X_{t-1} = (\omega_t + \theta\omega_{t-1})(\omega_{t-1} + \theta\omega_{t-2}) = \omega_t\omega_{t-1} + \theta\omega_{t-1}^2 + \theta\omega_t\omega_{t-2} + \theta^2\omega_{t-1}\omega_{t-2}$$

If we take expected values on both sides of the equation, $E(X_t X_{t-1}) = \gamma_1 = \theta\sigma^2$. Additionally, for any value of $k$,

$$X_t X_{t-k} = \omega_t\omega_{t-k} + \theta\omega_t\omega_{t-k-1} + \theta\omega_{t-1}\omega_{t-k} + \theta^2\omega_{t-1}\omega_{t-k-1}$$

Then, if $k > 1$, $E(X_t X_{t-k}) = \gamma_k = 0$.

The ACF of an MA(1) is therefore given by

$$\rho_k = \begin{cases} \theta/(1+\theta^2), & k = 1 \\ 0, & k > 1 \end{cases}$$

Using $\rho_k = 0$ for $k > 1$, we can show that the PACF is

$$P_{kk} = \phi_{kk} = -\frac{(-\theta)^k (1-\theta^2)}{1-\theta^{2(k+1)}}, \quad k \geq 1$$

Contrary to its ACF, which cuts off after lag 1, the PACF of an MA(1) model decays exponentially. For a general MA(q) process, the ACF cuts off to zero after lag $q$, and the PACF has exponential behavior depending on the characteristic roots of $\Theta(B) = 1 + \theta_1 B + \theta_2 B^2 + \dots + \theta_q B^q = 0$.
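One way to check the closed-form PACF is to compute the PACF numerically from the ACF using the Durbin-Levinson recursion (a standard algorithm, not derived in these notes) and compare. A minimal Python sketch with the illustrative value $\theta = 0.9$:

```python
import numpy as np

def pacf_from_acf(rho):
    """PACF phi_kk from the ACF rho_1, rho_2, ... via the Durbin-Levinson recursion."""
    rho = np.asarray(rho, dtype=float)
    pacf = []
    phi = np.array([rho[0]])                      # phi_{1,1} = rho_1
    pacf.append(phi[0])
    for k in range(2, len(rho) + 1):
        num = rho[k - 1] - phi @ rho[k - 2 :: -1]  # rho_k - sum_j phi_{k-1,j} rho_{k-j}
        den = 1.0 - phi @ rho[: k - 1]             # 1 - sum_j phi_{k-1,j} rho_j
        phi_kk = num / den
        phi = np.r_[phi - phi_kk * phi[::-1], phi_kk]
        pacf.append(phi_kk)
    return np.array(pacf)

theta = 0.9
K = 10
rho = np.zeros(K)
rho[0] = theta / (1 + theta**2)                   # MA(1): rho_1 only; rho_k = 0 for k > 1

pacf = pacf_from_acf(rho)
ks = np.arange(1, K + 1)
closed_form = -(-theta) ** ks * (1 - theta**2) / (1 - theta ** (2 * (ks + 1)))
```

The recursion and the closed form agree lag by lag, and the alternating exponential decay of the MA(1) PACF is visible in either computation.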

Instead of trying to find equations in the general case, we will look at some examples using simulation. First, we consider an MA(1) process with parameters $\theta = 0.9$ and $\theta = -0.9$. Then, we consider an MA(2) process with parameters $(\theta_1 = 0.85, \theta_2 = 0.5)$ and with parameters $(\theta_1 = -0.85, \theta_2 = 0.5)$. Finally, we consider an MA(4) process with parameters $(\theta_1 = 0.9, \theta_2 = 0.8, \theta_3 = 0.75, \theta_4 = 0.4)$. Again, we use the R function arima.sim:

xt = arima.sim(n = 1000, model = list(ma = 0.9))
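For readers not using R, a rough Python equivalent of arima.sim for moving average models can be written directly from the MA(q) definition; the helper names below (ma_sim, sample_acf) are ad hoc, not from the notes. With a long simulated series, the sample ACF at lag 1 should be close to the theoretical $\theta/(1+\theta^2)$ and near zero at higher lags:

```python
import numpy as np

rng = np.random.default_rng(0)

def ma_sim(theta, n, sigma=1.0):
    """Simulate n values of X_t = w_t + theta_1 w_{t-1} + ... + theta_q w_{t-q}."""
    q = len(theta)
    w = rng.normal(0.0, sigma, n + q)            # white noise, with q warm-up values
    coeffs = np.r_[1.0, theta]                   # (1, theta_1, ..., theta_q)
    return np.convolve(w, coeffs, mode="full")[q : q + n]

def sample_acf(x, nlags):
    """Sample autocorrelations r_1, ..., r_nlags of the centered series."""
    x = x - x.mean()
    g0 = x @ x / len(x)
    return np.array([(x[:-k] @ x[k:]) / len(x) / g0 for k in range(1, nlags + 1)])

theta = 0.9
x = ma_sim([theta], 100_000)
rho1_theory = theta / (1 + theta**2)
rho = sample_acf(x, 3)
```

With 100000 observations the sampling error of the ACF estimates is of order $1/\sqrt{n} \approx 0.003$, so the lag-1 estimate sits very close to the theoretical value and the higher lags close to zero.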

[Figure: MA(1) process with $\theta = 0.9$, $\sigma^2 = 1$: simulated series, sample ACF, and sample PACF]

[Figure: MA(1) process with $\theta = -0.9$, $\sigma^2 = 1$: simulated series, sample ACF, and sample PACF]

[Figure: MA(2) process with $\theta_1 = 0.85$, $\theta_2 = 0.5$: simulated series, sample ACF, and sample PACF]

[Figure: MA(2) process with $\theta_1 = -0.85$, $\theta_2 = 0.5$: simulated series, sample ACF, and sample PACF]

[Figure: MA(4) process with $\theta_1 = 0.9$, $\theta_2 = 0.8$, $\theta_3 = 0.75$, $\theta_4 = 0.4$: simulated series, sample ACF, and sample PACF]

To obtain the ACF and PACF of an ARMA(p,q) process, we follow the same strategy used to obtain the ACFs and PACFs of the AR and MA models. Recall that an ARMA(p,q) process is defined by the equation

$$X_t = \phi_1 X_{t-1} + \dots + \phi_p X_{t-p} + \omega_t + \theta_1 \omega_{t-1} + \dots + \theta_q \omega_{t-q}$$

As before, if we multiply both sides of the equation by $X_{t-k}$ and take expected values, we obtain

$$\gamma_k = \phi_1 \gamma_{k-1} + \dots + \phi_p \gamma_{k-p} + E(X_{t-k}\omega_t) + \theta_1 E(X_{t-k}\omega_{t-1}) + \dots + \theta_q E(X_{t-k}\omega_{t-q})$$

Since $E(X_{t-k}\omega_{t-i}) = 0$ for $k > i$,

we obtain

$$\gamma_k = \phi_1 \gamma_{k-1} + \dots + \phi_p \gamma_{k-p}, \quad k \geq q + 1$$

Dividing by $\gamma_0$, this is equivalent to

$$\rho_k = \phi_1 \rho_{k-1} + \dots + \phi_p \rho_{k-p}, \quad k \geq q + 1,$$

which gives the Yule-Walker equations, but with the restriction $k \geq q + 1$. For lags $k \geq q + 1$, the autocorrelation function of an ARMA(p,q) process therefore behaves like the ACF of a pure AR(p) process. However, the first $q$ autocorrelations $\rho_1, \rho_2, \dots, \rho_q$ depend on both the autoregressive and the moving average parameters. The PACF of an ARMA(p,q) is complicated and usually not needed.
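A quick numerical check of this AR-like behavior uses the MA($\infty$) representation of an ARMA(1,1), $\psi_0 = 1$ and $\psi_j = \phi^{j-1}(\phi + \theta)$ for $j \geq 1$ (a standard result, not derived in these notes), with illustrative parameters $\phi = 0.6$, $\theta = 0.5$. The first autocorrelation depends on both $\phi$ and $\theta$, while $\rho_k = \phi\,\rho_{k-1}$ holds from $k = 2$ on:

```python
import numpy as np

phi, theta = 0.6, 0.5

# MA(infinity) weights of the ARMA(1,1): psi_0 = 1, psi_j = phi**(j-1) * (phi + theta)
J = 400                                           # truncation point; phi**J is negligible
psi = np.r_[1.0, (phi + theta) * phi ** np.arange(J)]

# Autocovariances gamma_k = sum_j psi_j psi_{j+k}  (taking sigma^2 = 1)
gamma = np.array([psi[: len(psi) - k] @ psi[k:] for k in range(5)])
rho = gamma / gamma[0]

# rho[1] matches the known ARMA(1,1) formula (1 + phi*theta)(phi + theta) / (1 + 2*phi*theta + theta^2),
# and rho[k] = phi * rho[k-1] for k >= 2, as the Yule-Walker restriction predicts
```

This makes the statement above concrete: beyond lag $q = 1$ the ACF decays exactly like that of an AR(1) with coefficient $\phi$.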

This PACF behaves similarly to the PACF of an MA(q) process. Let's look at some examples with simulated data from ARMA(1,1) processes. The examples use 1000 simulated observations each; the AR coefficient is 0.95 (respectively 0.6) and the MA coefficient is 0.5. We will also consider an ARMA(2,1) process whose AR part is built from complex reciprocal roots with modulus $r = 0.95$ and frequency $\omega = 0.42$, and whose MA parameter is $\theta = 0.7$. Finally, we will show an ARMA(10,2) process whose AR part is defined by 10 complex reciprocal roots.
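The notes do not spell out how the AR(2) coefficients are obtained from the root modulus $r$ and frequency $\omega$. Assuming the reciprocal roots are the complex pair $\alpha = r e^{\pm i\omega}$, the factorization $(1 - \alpha B)(1 - \bar{\alpha} B) = 1 - 2r\cos(\omega)B + r^2 B^2$ gives $\phi_1 = 2r\cos(\omega)$ and $\phi_2 = -r^2$, which a few lines of Python confirm:

```python
import numpy as np

# Reciprocal roots alpha = r*exp(+iw), r*exp(-iw) give the AR(2) polynomial
#   (1 - alpha B)(1 - conj(alpha) B) = 1 - 2 r cos(w) B + r^2 B^2,
# so, matching 1 - phi1 B - phi2 B^2:  phi1 = 2 r cos(w),  phi2 = -r^2.
r, w = 0.95, 0.42
phi1 = 2 * r * np.cos(w)
phi2 = -r**2

# Check: the reciprocal roots of lambda^2 - phi1*lambda - phi2 = 0
# form a complex pair with modulus r and argument +/- w
alphas = np.roots([1.0, -phi1, -phi2])
```

Since $r < 1$ the process is stationary, and the frequency $\omega$ controls the period of the damped oscillation seen in the ACF of the ARMA(2,1) example below.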

[Figure: ARMA(1,1) process with $\phi = 0.95$, $\theta = 0.5$: simulated series, sample ACF, and sample PACF]

[Figure: ARMA(1,1) process with $\phi = 0.6$, $\theta = 0.5$: simulated series, sample ACF, and sample PACF]

[Figure: ARMA(2,1) process: simulated series, sample ACF, and sample PACF]

[Figure: ARMA(10,2) process: simulated series, sample ACF, and sample PACF]