
Time Series Analysis
Christopher Ting
http://mysmu.edu.sg/faculty/christophert/
christopherting@smu.edu.sg
Quantitative Finance, Singapore Management University
March 3, 2017

Table of Contents

1 Introduction
2 Stationary Processes
3 AR, MA, & ARMA
4 Sample ACF
5 Yule-Walker Equations
6 Partial ACF

Introduction: Learning Objectives

- Understand thoroughly the nature of white noise
- Describe AR, MA, and ARMA time series using white noise as the building block
- Compute the unconditional and conditional means, as well as the variances, of these stationary processes
- Describe the autocorrelation function and how it is estimated and tested for statistical significance (Box-Pierce Q-statistic, Ljung-Box statistic)
- Define the backshift operator B^n and describe the relationship between MA and AR(∞) processes
- Thoroughly understand the Yule-Walker equations and their application to obtaining the partial autocorrelation function

Introduction: Components of a Time Series

In general, a time series Y_t has the following components:
(a) trend-cycle (TC) component: the combined long-term and growth-cycle movement of the time series
(b) seasonal (S) component: the systematic variations of the time series
(c) irregular (I) component: the random fluctuations and short-term movements of the time series

Introduction: Decomposition of a Time Series

[Diagram: a time series splits into Trend-Cycle, Seasonal, and Irregular components; the Seasonal component comprises Seasonality and Calendar Effects (Moving-Holiday Effect, Trading-Day Effect).]

Introduction: Example: Nike's EPS

[Figure: Nike's earnings-per-share time series.]

Stationary Processes: Basic Building Block: White Noise

Properties of white noise:
- Independently and identically distributed process {u_t}
- Stationary
- E(u_t) = 0
- V(u_t) = σ_u^2
- C(u_t, u_{t+k}) = 0 for any k ≠ 0
- Stronger definition: u_t is independent of u_{t+k} for all k ≠ 0

What does white noise sound like? In Matlab:
- Generate white noise: u = randn(8192*10, 1);
- Create an audioplayer object: pu = audioplayer(u, 8192);
- Listen to white noise: play(pu);

Stationary Processes: AR, MA, and ARMA

With the basic white noise process {u_t}, t = -∞, ..., ∞, the following processes are generated:

Autoregressive of order one: AR(1) process

    Y_t = θ + λ Y_{t-1} + u_t

Moving average of order one: MA(1) process

    Y_t = θ + u_t + α u_{t-1}

Autoregressive moving average of order one: ARMA(1,1) process

    Y_t = θ + λ Y_{t-1} + u_t + α u_{t-1}

AR, MA, & ARMA: Autoregressive Process AR(1)

Repeated substitution for the lagged values in the AR(1) process leads to

    Y_t = θ + λ(θ + λ Y_{t-2} + u_{t-1}) + u_t
        = (1 + λ + λ^2 + ...)θ + (u_t + λ u_{t-1} + λ^2 u_{t-2} + ...)

For each t, provided |λ| < 1,

    E(Y_t) = (1 + λ + λ^2 + ...)θ = θ/(1 - λ)

    V(Y_t) = V(u_t + λ u_{t-1} + λ^2 u_{t-2} + ...) = σ_u^2 (1 + λ^2 + λ^4 + ...) = σ_u^2/(1 - λ^2)

    C(Y_t, Y_{t-1}) = C(θ + λ Y_{t-1} + u_t, Y_{t-1}) = λ σ_u^2/(1 - λ^2)

AR, MA, & ARMA: Autocorrelation Coefficients of AR(1)

The autocovariance at lag k is a function of k only:

    C(Y_t, Y_{t-k}) = C(θ Σ_{j=0}^{k-1} λ^j + λ^k Y_{t-k} + Σ_{j=0}^{k-1} λ^j u_{t-j}, Y_{t-k}) = λ^k V(Y_t) ≡ γ(k)

Accordingly, the serial correlation is

    Corr(Y_t, Y_{t-k}) = C(Y_t, Y_{t-k})/V(Y_t) = λ^k

For a symmetrical time shift t → t + k,

    Corr(Y_{t+k}, Y_t) = C(Y_{t+k}, Y_t)/V(Y_{t+k}) = λ^k

Hence ρ(k) ≡ Corr(Y_t, Y_{t±k}) = λ^k.

AR, MA, & ARMA: Illustrative Example of an AR(1) Process

Suppose Y_t = 2.5 + 0.5 Y_{t-1} + u_t, where E(u_t) = 0 and V(u_t) = 3.

1 What is the mean of Y_t?
2 What is the variance of Y_t?
3 What is the first-order autocovariance of this AR(1) process?
4 What is the first-order autocorrelation of this AR(1) process?
5 What is the third-order autocorrelation of this AR(1) process?
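
The five answers follow directly from the AR(1) formulas derived above; as a quick numerical check (a sketch added here, not in the original slides):

    # AR(1) example: theta = 2.5, lambda = 0.5, sigma_u^2 = 3
    theta, lam, su2 = 2.5, 0.5, 3.0

    mean = theta / (1 - lam)       # E(Y_t) = θ/(1-λ) = 5.0
    var = su2 / (1 - lam**2)       # V(Y_t) = σ_u²/(1-λ²) = 4.0
    gamma1 = lam * var             # γ(1) = λ V(Y_t) = 2.0
    rho1 = lam                     # ρ(1) = λ = 0.5
    rho3 = lam**3                  # ρ(3) = λ³ = 0.125

    print(mean, var, gamma1, rho1, rho3)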

AR, MA, & ARMA: Simulating an AR(1) Process

Simulation parameters: number of samples 5,000; σ_u^2 = 3; θ = 2.5, λ_1 = 0.5.

    import numpy as np

    n = 5000
    su2 = 3
    theta, lambda1 = 2.5, 0.5
    np.random.seed(20170303)
    u = np.random.normal(0, 1, n)
    mu, su = np.mean(u), np.std(u, ddof=1)
    u -= mu                # ensure that the mean of u is really 0
    u /= su                # ensure that the variance of u is really 1
    u *= np.sqrt(su2)      # scale to variance su2
    Y = np.zeros(n, dtype=float)
    Y[0] = theta + u[0]
    for t in range(1, n):
        Y[t] = theta + lambda1 * Y[t-1] + u[t]

AR, MA, & ARMA: Sample Autocorrelation Function of AR(1)

[Figure: sample ACF of the simulated AR(1) series, decaying geometrically as λ^k.]

AR, MA, & ARMA: Another Example of AR(1), λ = -0.5

[Figure: sample ACF of an AR(1) series with negative λ, alternating in sign.]

AR, MA, & ARMA: Moving Average Process MA(1)

For the MA(1) process Y_t = θ + u_t + α u_{t-1},

    E(Y_t) = θ

    V(Y_t) = (1 + α^2) σ_u^2

    C(Y_t, Y_{t-1}) = C(θ + u_t + α u_{t-1}, θ + u_{t-1} + α u_{t-2}) = α σ_u^2

    Corr(Y_t, Y_{t-1}) = α/(1 + α^2)

    Corr(Y_t, Y_{t-k}) = 0 for k > 1

Hence the MA(1) process is covariance-stationary, with constant mean θ, constant variance σ_u^2 (1 + α^2), and an autocorrelation function that is nonzero only at lag k = 1.

AR, MA, & ARMA: Illustrative Example of an MA(1) Process

Suppose Y_t = 2.5 + u_t + 0.5 u_{t-1}, where E(u_t) = 0 and V(u_t) = 3.

1 What is the mean of Y_t?
2 What is the variance of Y_t?
3 What is the first-order autocovariance of this MA(1) process?
4 What is the first-order autocorrelation of this MA(1) process?
5 What is the third-order autocorrelation of this MA(1) process?
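
As with the AR(1) example, a quick numerical check of these answers using the MA(1) formulas above (a sketch, not in the original slides):

    # MA(1) example: theta = 2.5, alpha = 0.5, sigma_u^2 = 3
    theta, alpha, su2 = 2.5, 0.5, 3.0

    mean = theta                     # E(Y_t) = θ = 2.5
    var = (1 + alpha**2) * su2       # V(Y_t) = (1 + α²)σ_u² = 3.75
    gamma1 = alpha * su2             # γ(1) = α σ_u² = 1.5
    rho1 = alpha / (1 + alpha**2)    # ρ(1) = α/(1 + α²) = 0.4
    rho3 = 0.0                       # ρ(k) = 0 for all k > 1

    print(mean, var, gamma1, rho1, rho3)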

AR, MA, & ARMA: Simulating an MA(1) Process

Simulation parameters: number of samples 5,000; σ_u^2 = 3; θ = 2.5, α = 0.5.

    import numpy as np

    n = 5000
    su2 = 3
    theta, alpha = 2.5, 0.5
    np.random.seed(20170303)
    u = np.random.normal(0, 1, n)
    mu, su = np.mean(u), np.std(u, ddof=1)
    u -= mu                # ensure that the mean of u is really 0
    u /= su                # ensure that the variance of u is really 1
    u *= np.sqrt(su2)      # scale to variance su2
    Y = np.zeros(n, dtype=float)
    Y[0] = theta + u[0]
    for t in range(1, n):
        Y[t] = theta + alpha * u[t-1] + u[t]

AR, MA, & ARMA: Sample ACF of MA(1) Process

[Figure: sample ACF of the simulated MA(1) series; only the lag-1 autocorrelation is clearly nonzero.]

AR, MA, & ARMA: Autoregressive Moving Average Process ARMA(1,1)

Repeated substitution in Y_t = θ + λ Y_{t-1} + u_t + α u_{t-1} leads to

    Y_t = θ Σ_{i=0}^{∞} λ^i + u_t + (λ + α) Σ_{i=0}^{∞} λ^i u_{t-1-i}

For each t, provided |λ| < 1,

    E(Y_t) = θ/(1 - λ)

    V(Y_t) = σ_u^2 (1 + (λ + α)^2 Σ_{i=0}^{∞} λ^{2i}) = σ_u^2 (1 + (λ + α)^2/(1 - λ^2))

    C(Y_t, Y_{t-1}) = λ V(Y_{t-1}) + α σ_u^2

    C(Y_t, Y_{t-k}) = λ^{k-1} C(Y_t, Y_{t-1}), for k > 1

AR, MA, & ARMA: Autoregressive Moving Average Process ARMA(1,1) (cont'd)

Hence the ARMA(1,1) process is covariance-stationary, with constant mean θ/(1 - λ), constant variance σ_u^2 (1 + (λ + α)^2/(1 - λ^2)), and an autocovariance that is a function of k only.
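
The slides simulate AR(1) and MA(1) but not ARMA(1,1); a minimal sketch in the same style, assuming the parameters θ = 2.5, λ = 0.5, α = 0.5, σ_u^2 = 3 carry over from the earlier examples:

    import numpy as np

    n = 5000
    su2 = 3
    theta, lambda1, alpha = 2.5, 0.5, 0.5   # assumed values, mirroring the AR(1)/MA(1) slides
    np.random.seed(20170303)
    u = np.random.normal(0, 1, n)
    u = (u - np.mean(u)) / np.std(u, ddof=1) * np.sqrt(su2)  # mean exactly 0, variance su2
    Y = np.zeros(n, dtype=float)
    Y[0] = theta + u[0]
    for t in range(1, n):
        Y[t] = theta + lambda1 * Y[t-1] + alpha * u[t-1] + u[t]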

AR, MA, & ARMA: Sample ACF of ARMA(1,1) Process

[Figure: sample ACF of a simulated ARMA(1,1) series.]

AR, MA, & ARMA: Autocorrelation Function of ARMA(1,1)

Question 1: Derive the autocorrelation function of the ARMA(1,1) process.
Question 2: If λ = -α, what is the ACF of ARMA(1,1) at lag 1?

AR, MA, & ARMA: Changing Conditional Means

AR(1)
- At time t+1, the value of Y_t is already known, so for the AR(1) process the conditional mean is a random variable:
      E(Y_{t+1} | Y_t) = θ + λ Y_t + E(u_{t+1} | Y_t) = θ + λ Y_t
  which varies with Y_t, unlike the constant unconditional mean θ/(1 - λ).
- The conditional variance is constant, and smaller than the unconditional variance:
      V(Y_{t+1} | Y_t) = V(u_{t+1} | Y_t) = σ_u^2 < σ_u^2/(1 - λ^2)

MA(1)
- The conditional mean is stochastic but the conditional variance is constant:
      E(Y_{t+1} | Y_t) = θ + α u_t
      V(Y_{t+1} | Y_t) = σ_u^2 < σ_u^2 (1 + α^2)
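
For instance (an added illustration using the earlier AR(1) example with θ = 2.5 and λ = 0.5): observing Y_t = 6 gives E(Y_{t+1} | Y_t) = 2.5 + 0.5 × 6 = 5.5, whereas the unconditional mean remains θ/(1 - λ) = 5.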

Sample ACF: Sample Autocorrelation Function

The sample autocovariance at lag k, for k = 0, 1, 2, ..., p, with p < T/4, is

    c(k) = (1/T) Σ_{t=1}^{T-k} (Y_t - Ȳ)(Y_{t+k} - Ȳ)

The estimator c(k) is consistent: lim_{T→∞} c(k) = C(Y_t, Y_{t+k}).

The sample autocorrelation at lag k is then

    r(k) = c(k)/c(0) = Σ_{t=1}^{T-k} (Y_t - Ȳ)(Y_{t+k} - Ȳ) / Σ_{t=1}^{T} (Y_t - Ȳ)^2

which is also consistent: lim_{T→∞} r(k) = ρ(k).

The variance of r(k) for an AR(1) process is

    V(r(k)) ≈ (1/T) ((1 + λ^2)(1 - λ^{2k})/(1 - λ^2) - 2k λ^{2k})
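
A direct implementation of these two estimators (a sketch, not part of the slides; the helper name sample_acf is my own):

    import numpy as np

    def sample_acf(Y, max_lag):
        """Sample autocovariances c(k) and autocorrelations r(k), k = 0..max_lag."""
        Y = np.asarray(Y, dtype=float)
        T = len(Y)
        d = Y - Y.mean()
        c = np.array([np.sum(d[:T - k] * d[k:]) / T for k in range(max_lag + 1)])
        r = c / c[0]
        return c, r

Applied to the simulated AR(1) series above, sample_acf(Y, 20)[1] should decay roughly like 0.5^k.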

Sample ACF: Test for ρ = 0, AR(1)

Test the hypothesis H_0: ρ(j) = 0 for all j > 0, i.e., H_0: λ = 0. Then

    V(r(k)) ≈ 1/T

To test that the j-th autocorrelation is zero, the test statistic is

    z_j = (r(j) - 0)/√(1/T) →d N(0, 1)

Reject H_0: ρ(j) = 0 at the 5% significance level if |z_j| > 1.96.

Sample ACF: Test for ρ = 0, MA(q)

For MA(q) processes, the variance of r(k) for k > q is

    V(r(k)) ≈ (1/T) (1 + 2 Σ_{i=1}^{q} ρ(i)^2)

Thus for an MA(1) process, V(r(k)) for k > 1 is estimated by

    (1/T) (1 + 2 r(1)^2) > 1/T

and rejection of the null hypothesis allows the identification of MA(1).

Sample ACF: Joint Test Statistic

To test whether all autocorrelations are jointly zero, the null hypothesis is

    H_0: ρ(1) = ρ(2) = ... = ρ(m) = 0

The Box-Pierce Q-statistic for the asymptotic test is

    Q_m = T Σ_{k=1}^{m} r(k)^2 = Σ_{k=1}^{m} (√T r(k))^2 = Σ_{k=1}^{m} z_k^2 →d χ²_m

The Ljung-Box test statistic provides an approximate correction for finite samples:

    Q_m = T(T + 2) Σ_{k=1}^{m} r(k)^2/(T - k) →d χ²_m
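
Both statistics follow in a few lines from the sample_acf sketch above (scipy is assumed only for the chi-squared tail probability):

    import numpy as np
    from scipy.stats import chi2

    def q_statistics(Y, m):
        """Box-Pierce and Ljung-Box statistics for lags 1..m, with chi-squared p-values."""
        T = len(Y)
        _, r = sample_acf(Y, m)      # sample_acf from the earlier sketch
        r = r[1:]                    # drop lag 0
        q_bp = T * np.sum(r**2)
        q_lb = T * (T + 2) * np.sum(r**2 / (T - np.arange(1, m + 1)))
        return q_bp, q_lb, chi2.sf(q_bp, m), chi2.sf(q_lb, m)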

Sample ACF: ARMA(p, q) Processes

An AR(p) process is ARMA(p, 0); an MA(q) process is ARMA(0, q).

Backward shift operator B: B Y_t ≡ Y_{t-1}. Applying the backward shift operator k times: B^k Y_t = Y_{t-k}.

The AR(p) process

    Y_t = θ + λ_1 Y_{t-1} + λ_2 Y_{t-2} + ... + λ_p Y_{t-p} + u_t = θ + Σ_{i=1}^{p} λ_i B^i Y_t + u_t

can be written as

    φ(B) Y_t ≡ (1 - Σ_{i=1}^{p} λ_i B^i) Y_t = θ + u_t

The equation φ(B) = 0 is called the characteristic equation of the AR(p) process. For AR(p) to be stationary, the roots of the characteristic equation must lie outside the unit circle.
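
The root condition is easy to check numerically; a small sketch (the AR(2) coefficients here are hypothetical, chosen only for illustration):

    import numpy as np

    # Characteristic polynomial of an AR(2): phi(z) = 1 - lambda1*z - lambda2*z^2
    lambda1, lambda2 = 0.5, 0.3                  # hypothetical coefficients
    roots = np.roots([-lambda2, -lambda1, 1])    # np.roots expects highest degree first
    print(roots, np.all(np.abs(roots) > 1))      # True => all roots outside unit circle => stationary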

Sample ACF: AR(∞) Representation of MA(1)

The MA(1) process is representable as an AR(∞) process:

    Y_t = θ + u_t + α u_{t-1} = θ + (1 + αB) u_t

    (1 + αB)^{-1} Y_t = (1 + αB)^{-1} θ + u_t

    (1 - αB + α^2 B^2 - α^3 B^3 + ...) Y_t = c + u_t

where c ≡ (1 + αB)^{-1} θ. Hence

    Y_t = c + α Y_{t-1} - α^2 Y_{t-2} + α^3 Y_{t-3} - α^4 Y_{t-4} + ... + u_t

This representation requires the root of 1 + αB = 0 to lie outside the unit circle, which is satisfied if |α| < 1.

A stationary MA(q) process is said to be invertible if it can be represented as a stationary AR(∞) process. What's the point of inverting an MA process?

Yule-Walker Equations: Yule-Walker Equations (1)

Multiply both sides of the AR(p) process by Y_{t-k}:

    Y_{t-k} Y_t = Y_{t-k} θ + λ_1 Y_{t-k} Y_{t-1} + λ_2 Y_{t-k} Y_{t-2} + ... + λ_p Y_{t-k} Y_{t-p} + Y_{t-k} u_t

Taking unconditional expectations on both sides, and noting that E(Y_{t-k} Y_t) = γ(k) + µ^2 for any k,

    γ(k) + µ^2 = µθ + λ_1 (γ(k-1) + µ^2) + λ_2 (γ(k-2) + µ^2) + ... + λ_p (γ(k-p) + µ^2)

The unconditional mean of an AR(p) process is µ = θ + λ_1 µ + ... + λ_p µ, so µ^2 = µθ + λ_1 µ^2 + ... + λ_p µ^2. Hence, for k ≥ 1,

    γ(k) = λ_1 γ(k-1) + λ_2 γ(k-2) + ... + λ_p γ(k-p)

Dividing both sides by γ(0) to obtain correlations:

    ρ(k) = λ_1 ρ(k-1) + λ_2 ρ(k-2) + ... + λ_p ρ(k-p)

Yule-Walker Equations: Yule-Walker Equations (2)

Note that ρ(0) = 1 and ρ(j) = ρ(-j). For each k = 1, ..., p there is a corresponding equation in the p parameters λ_1, λ_2, ..., λ_p, resulting in the Yule-Walker equations:

    ρ(1) = λ_1 + λ_2 ρ(1) + ... + λ_p ρ(p-1)
    ρ(2) = λ_1 ρ(1) + λ_2 + ... + λ_p ρ(p-2)
    ρ(3) = λ_1 ρ(2) + λ_2 ρ(1) + ... + λ_p ρ(p-3)
    ...
    ρ(p) = λ_1 ρ(p-1) + λ_2 ρ(p-2) + ... + λ_p

Yule-Walker Equations: Yule-Walker Equations in Matrix Form (1)

Replacing the ACF ρ(k) by the sample ACF r(k), the p linear equations can be solved. Define

    R ≡ [r(1), r(2), ..., r(p)]'

    Φ ≡ [ 1        r(1)     r(2)     ...  r(p-1) ]
        [ r(1)     1        r(1)     ...  r(p-2) ]
        [ r(2)     r(1)     1        ...  r(p-3) ]
        [ ...                                    ]
        [ r(p-1)   r(p-2)   r(p-3)   ...  1      ]

    Λ ≡ [λ_1, λ_2, ..., λ_p]'

Yule-Walker Equations: Yule-Walker Equations in Matrix Form (2)

Written compactly as R = ΦΛ, the parameters can be estimated as

    Λ = Φ^{-1} R

along with

    µ̂ = (1/T) Σ_{t=1}^{T} Y_t
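
In code this is a single linear solve; a minimal sketch with numpy/scipy, reusing the sample_acf helper from the earlier sketch:

    import numpy as np
    from scipy.linalg import toeplitz

    def yule_walker(Y, p):
        """Estimate AR(p) coefficients by solving the Yule-Walker system R = Phi Lambda."""
        _, r = sample_acf(Y, p)           # r[0] = 1, r[1..p] = sample ACF
        Phi = toeplitz(r[:p])             # p x p Toeplitz matrix with 1 on the diagonal
        R = r[1:p + 1]
        return np.linalg.solve(Phi, R)    # Lambda = Phi^{-1} R

Applied to the simulated AR(1) series, yule_walker(Y, 1) should return a value close to 0.5.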

Partial ACF: Partial Autocorrelation Function

- The sample ACF allows the identification of either an AR or an MA process, depending on whether the sample r(k) decays slowly or is clearly zero after some lag k.
- However, even if an AR(p) is identified, it is still difficult to identify the order of the lag, p, since all AR(p) processes show similar decay patterns in the ACF.
- Is it possible to identify the order of an AR process?
- Yes: use the Yule-Walker equations to generate a sequence λ_{kk}, k = 1, 2, ..., p, which is the PACF.

Partial ACF: Sample PACF Calculation (1)

Suppose the true process is AR(p). Take k = 1 and fit an AR(1):

    Y_t = θ + λ_{11} Y_{t-1} + u_t

The Yule-Walker equation for k = 1 gives λ_{11} = r(1).

Next take k = 2, i.e., assume the process is AR(2). The Yule-Walker system is

    [λ_{21}]   [ 1     r(1) ]^{-1} [r(1)]
    [λ_{22}] = [ r(1)  1    ]      [r(2)]

The solution for λ_{22} is

    λ_{22} = (r(2) - r(1)^2)/(1 - r(1)^2)

Partial ACF: Sample PACF Calculation (2)

In general, to calculate the p-th value, take k = p, for which the AR(p) is

    Y_t = θ + λ_{p1} Y_{t-1} + λ_{p2} Y_{t-2} + ... + λ_{pp} Y_{t-p} + u_t

and solve the Yule-Walker equations Λ = Φ^{-1} R.

If the true order is p, then theoretically

    λ_{11} ≠ 0, λ_{22} ≠ 0, ..., λ_{pp} ≠ 0
    λ_{p+1,p+1} = λ_{p+2,p+2} = ... = 0
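
This successive-fitting recipe translates directly into code; a sketch built on the hypothetical yule_walker helper above:

    def sample_pacf(Y, max_lag):
        """Sample PACF: lambda_kk is the last Yule-Walker coefficient of a fitted AR(k)."""
        return [yule_walker(Y, k)[-1] for k in range(1, max_lag + 1)]

For the simulated AR(1) series, the first value should be near 0.5 and the remaining values near zero.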

Partial ACF: Duality between AR and MA, ACF and PACF

While an AR(p) process has a decaying ACF that is infinite in extent, its PACF cuts off after lag p.

Recall that an MA(1) process is invertible into an AR(∞) process. In general, this property holds for MA(q) processes. So while the ACF of an MA(q) process cuts off after lag q, its PACF is infinite in extent.