Time Series Analysis

Time Series Analysis
hm@imm.dtu.dk
Informatics and Mathematical Modelling, Technical University of Denmark, DK-2800 Kgs. Lyngby

Outline of the lecture: Chapter 9, Multivariate time series

Transfer function models with ARMA input

The model is
$$Y_t = \frac{\omega(B)}{\delta(B)} B^b X_t + \frac{\theta_\varepsilon(B)}{\varphi_\varepsilon(B)} \varepsilon_t, \qquad X_t = \frac{\theta_\eta(B)}{\varphi_\eta(B)} \eta_t,$$
where we require $\{\varepsilon_t\}$ and $\{\eta_t\}$ to be mutually uncorrelated.

Multiplying the first equation through by $\delta(B)\varphi_\varepsilon(B)$ and the second by $\varphi_\eta(B)$ gives
$$\delta(B)\varphi_\varepsilon(B) Y_t = \varphi_\varepsilon(B)\omega(B) B^b X_t + \delta(B)\theta_\varepsilon(B)\varepsilon_t, \qquad \varphi_\eta(B) X_t = \theta_\eta(B)\eta_t.$$

The term including $X_t$ on the RHS is moved to the LHS:
$$\delta(B)\varphi_\varepsilon(B) Y_t - \varphi_\varepsilon(B)\omega(B) B^b X_t = \delta(B)\theta_\varepsilon(B)\varepsilon_t, \qquad \varphi_\eta(B) X_t = \theta_\eta(B)\eta_t,$$
which can be written in matrix notation:
$$\begin{bmatrix} \delta(B)\varphi_\varepsilon(B) & -\varphi_\varepsilon(B)\omega(B)B^b \\ 0 & \varphi_\eta(B) \end{bmatrix} \begin{bmatrix} Y_t \\ X_t \end{bmatrix} = \begin{bmatrix} \delta(B)\theta_\varepsilon(B) & 0 \\ 0 & \theta_\eta(B) \end{bmatrix} \begin{bmatrix} \varepsilon_t \\ \eta_t \end{bmatrix}.$$

For multivariate ARMA models we replace the zeroes by polynomials in $B$, allow non-zero correlation between $\varepsilon_t$ and $\eta_t$, and generalize to more dimensions.

Multivariate ARMA models

The model can be written $\phi(B)(Y_t - c) = \theta(B)\epsilon_t$, where:

- The individual time series may have been transformed and differenced.
- The variance-covariance matrix of the multivariate white noise process $\{\epsilon_t\}$ is denoted $\Sigma$.
- The matrices $\phi(B)$ and $\theta(B)$ have elements which are polynomials in the backshift operator.
- The diagonal elements have leading terms of unity.
- The off-diagonal elements have leading terms of zero (i.e. they normally start in $B$).

Air pollution in cities: NO and NO$_2$

$$\begin{bmatrix} X_{1,t} \\ X_{2,t} \end{bmatrix} = \begin{bmatrix} 0.9 & -0.1 \\ 0.4 & 0.8 \end{bmatrix} \begin{bmatrix} X_{1,t-1} \\ X_{2,t-1} \end{bmatrix} + \begin{bmatrix} \xi_{1,t} \\ \xi_{2,t} \end{bmatrix}, \qquad \Sigma = \begin{bmatrix} 30 & 21 \\ 21 & 23 \end{bmatrix}$$

Or, written out:
$$X_{1,t} - 0.9X_{1,t-1} + 0.1X_{2,t-1} = \xi_{1,t}$$
$$-0.4X_{1,t-1} + X_{2,t} - 0.8X_{2,t-1} = \xi_{2,t}$$

The LHS can be written using a matrix whose elements are polynomials in $B$. Formulation using the backshift operator:
$$\begin{bmatrix} 1 - 0.9B & 0.1B \\ -0.4B & 1 - 0.8B \end{bmatrix} X_t = \xi_t \quad \text{or} \quad \phi(B) X_t = \xi_t$$

Alternative formulation:
$$X_t - \begin{bmatrix} 0.9 & -0.1 \\ 0.4 & 0.8 \end{bmatrix} X_{t-1} = \xi_t \quad \text{or} \quad X_t - \phi_1 X_{t-1} = \xi_t$$
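The VAR(1) example above is easy to simulate. A minimal sketch in Python/NumPy (not the S-PLUS used later in the slides), with the coefficient matrix and noise covariance taken from the example; all variable names are our own:

```python
import numpy as np

# VAR(1) from the NO/NO2 example: X_t = phi1 X_{t-1} + xi_t, Var(xi_t) = Sigma
phi1 = np.array([[0.9, -0.1],
                 [0.4,  0.8]])
Sigma = np.array([[30.0, 21.0],
                  [21.0, 23.0]])

rng = np.random.default_rng(1)
L = np.linalg.cholesky(Sigma)      # draw xi_t ~ N(0, Sigma) as L @ z
n = 1000
X = np.zeros((n, 2))
for t in range(1, n):
    X[t] = phi1 @ X[t - 1] + L @ rng.standard_normal(2)

print(X.shape)   # one simulated path: n observations of (NO, NO2)
```

The Cholesky factor turns independent standard normals into noise with the required covariance $\Sigma$; any other square root of $\Sigma$ would do as well.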

Stationarity and Invertibility

The multivariate ARMA process $\phi(B)(Y_t - c) = \theta(B)\epsilon_t$

- is stationary if $\det(\phi(z^{-1})) = 0 \Rightarrow |z| < 1$
- is invertible if $\det(\theta(z^{-1})) = 0 \Rightarrow |z| < 1$
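For a pure VAR(1), $\phi(z^{-1}) = I - \phi_1 z^{-1}$, so the roots of $\det(\phi(z^{-1})) = 0$ are exactly the eigenvalues of $\phi_1$, and stationarity reduces to all eigenvalues having modulus below one. A quick numerical check for the NO/NO$_2$ example (a Python sketch, not from the slides):

```python
import numpy as np

# Stationarity of X_t = phi1 X_{t-1} + xi_t:
# roots of det(I - phi1/z) = 0 are the eigenvalues of phi1.
phi1 = np.array([[0.9, -0.1],
                 [0.4,  0.8]])

moduli = np.abs(np.linalg.eigvals(phi1))
print(moduli)                       # both approx 0.87
print(bool(np.all(moduli < 1)))    # True: the example VAR(1) is stationary
```

Here the eigenvalues are the complex pair $0.85 \pm 0.194i$ with common modulus $\sqrt{0.76} \approx 0.87$, comfortably inside the unit circle.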

Two formulations (centered data)

Matrices with polynomials in $B$ as elements:
$$\phi(B) Y_t = \theta(B)\epsilon_t$$

Without $B$, but with matrices as coefficients:
$$Y_t - \phi_1 Y_{t-1} - \ldots - \phi_p Y_{t-p} = \epsilon_t - \theta_1 \epsilon_{t-1} - \ldots - \theta_q \epsilon_{t-q}$$

Autocovariance Matrix Functions

$$\Gamma_k = E[(Y_{t-k} - \mu_Y)(Y_t - \mu_Y)^T], \qquad \Gamma_{-k} = \Gamma_k^T$$

Example for the bivariate case $Y_t = (Y_{1,t}\ Y_{2,t})^T$:
$$\Gamma_k = \begin{bmatrix} \gamma_{11}(k) & \gamma_{12}(k) \\ \gamma_{21}(k) & \gamma_{22}(k) \end{bmatrix} = \begin{bmatrix} \gamma_{11}(k) & \gamma_{12}(k) \\ \gamma_{12}(-k) & \gamma_{22}(k) \end{bmatrix}$$

Therefore we will plot autocovariance or autocorrelation functions for $k = 0, 1, 2, \ldots$ and one of each pair of cross-covariance or cross-correlation functions for $k = 0, \pm 1, \pm 2, \ldots$
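The sample version $\hat\Gamma_k$ and the symmetry relation $\Gamma_{-k} = \Gamma_k^T$ can be illustrated with a short NumPy helper. This is our own sketch; dividing by $N$ (rather than $N-k$) is one common convention:

```python
import numpy as np

def gamma_hat(Y, k):
    # Sample Gamma_k = (1/N) sum_t (Y_{t-k} - Ybar)(Y_t - Ybar)^T for k >= 0;
    # negative lags via the identity Gamma_{-k} = Gamma_k^T.
    Y = np.asarray(Y, float)
    if k < 0:
        return gamma_hat(Y, -k).T
    N = len(Y)
    Yc = Y - Y.mean(axis=0)
    return Yc[:N - k].T @ Yc[k:] / N

rng = np.random.default_rng(0)
Y = rng.standard_normal((500, 2))
G0, G1 = gamma_hat(Y, 0), gamma_hat(Y, 1)
print(np.allclose(gamma_hat(Y, -1), G1.T))  # True: Gamma_{-1} = Gamma_1^T
print(np.allclose(G0, G0.T))                # True: Gamma_0 is symmetric
```

The off-diagonal entry of `G1` is the sample cross-covariance $\hat\gamma_{12}(1)$; plotting these entries over $k$ gives the cross-covariance functions mentioned above.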

The Theoretical Autocovariance Matrix Functions

Using the matrix coefficients $\phi_1, \ldots, \phi_p$ and $\theta_1, \ldots, \theta_q$, together with $\Sigma$, the theoretical $\Gamma_k$ can be calculated:

- Pure autoregressive models: $\Gamma_k$ is found from a multivariate version of Theorem 5.10 in the book, which leads to the Yule-Walker equations.
- Pure moving average models: $\Gamma_k$ is found from a multivariate version of (5.65) in the book.
- Autoregressive moving average models: $\Gamma_k$ is found from multivariate versions of (5.100) and (5.101) in the book.

Examples can be found in the book; note the VAR(1)!
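For the VAR(1) case the Yule-Walker equations have a closed form: $\Gamma_0$ solves the Lyapunov equation $\Gamma_0 = \phi_1 \Gamma_0 \phi_1^T + \Sigma$, which can be solved via the standard vec/Kronecker identity (our own sketch for the NO/NO$_2$ example; the identity is textbook linear algebra, not a formula from these slides):

```python
import numpy as np

phi1 = np.array([[0.9, -0.1],
                 [0.4,  0.8]])
Sigma = np.array([[30.0, 21.0],
                  [21.0, 23.0]])

# Gamma_0 = phi1 Gamma_0 phi1^T + Sigma
# => vec(Gamma_0) = (I - phi1 (x) phi1)^{-1} vec(Sigma)
d = phi1.shape[0]
vecG0 = np.linalg.solve(np.eye(d * d) - np.kron(phi1, phi1), Sigma.ravel())
G0 = vecG0.reshape(d, d)

print(np.allclose(phi1 @ G0 @ phi1.T + Sigma, G0))  # Lyapunov equation holds
```

Higher lags then follow recursively from $\Gamma_0$ and $\phi_1$, which is the multivariate Yule-Walker recursion the bullet above refers to.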

Identification using Autocovariance Matrix Functions

- Sample Correlation Matrix Function: $R_k$ near zero for pure moving average processes of order $q$ when $k > q$.
- Sample Partial Correlation Matrix Function: $S_k$ near zero for pure autoregressive processes of order $p$ when $k > p$.
- Sample $q$-conditioned Partial Correlation Matrix Function: $S_k(q)$ near zero for autoregressive moving average processes of order $(p,q)$ when $k > p$; can be used for univariate processes also.

Identification using (multivariate) prewhitening

- Fit univariate models to each individual series.
- Investigate the residuals as a multivariate time series.
- The cross-correlations can then be compared with $\pm 2/\sqrt{N}$.

This is not the same form of prewhitening as in Chapter 8. The multivariate model $\phi(B)Y_t = \theta(B)\epsilon_t$ is equivalent to
$$\mathrm{diag}(\det(\phi(B))) Y_t = \mathrm{adj}(\phi(B))\,\theta(B)\epsilon_t.$$
Therefore the corresponding univariate models will have much higher order, so although this approach is often used in the literature: don't use this approach!
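The recipe above (fit a univariate model to each series, then cross-correlate the residuals against the $\pm 2/\sqrt{N}$ band) can be sketched as follows. The AR-fitting helper is our own Yule-Walker implementation for illustration, not a function from the slides:

```python
import numpy as np

def ar_yw(x, p):
    # Univariate Yule-Walker fit of an AR(p) model (hypothetical helper)
    x = np.asarray(x, float) - np.mean(x)
    n = len(x)
    r = np.array([x[: n - k] @ x[k:] / n for k in range(p + 1)])
    R = np.array([[r[abs(i - j)] for j in range(p)] for i in range(p)])
    return np.linalg.solve(R, r[1:])

def ar_resid(x, phi):
    # Residuals e_t = x_t - phi_1 x_{t-1} - ... - phi_p x_{t-p}
    x = np.asarray(x, float) - np.mean(x)
    p = len(phi)
    return np.array([x[t] - phi @ x[t - p:t][::-1] for t in range(p, len(x))])

rng = np.random.default_rng(3)
x = rng.standard_normal(400)   # stand-ins for the two observed series
y = rng.standard_normal(400)

ex = ar_resid(x, ar_yw(x, 2))
ey = ar_resid(y, ar_yw(y, 2))
N = min(len(ex), len(ey))
ex, ey = ex[:N] - ex[:N].mean(), ey[:N] - ey[:N].mean()

r0 = (ex @ ey / N) / (ex.std() * ey.std())  # lag-0 residual cross-correlation
print(abs(r0) <= 2 / np.sqrt(N))            # compare with the +/- 2/sqrt(N) band
```

In practice one computes this cross-correlation at all lags of interest; values outside the band indicate a relationship between the series that the univariate models cannot explain.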

Example: Muskrat and Mink skins traded

Raw data (maybe not exactly as in the paper): [Figure: number of Muskrat and Mink skins traded per year, 1850-1910]

Stationary and Gaussian data: [Figure: skins traded, log-transformed and the muskrat series differenced, 1850-1910]

[Figure: SACF and SPACF matrix functions for the series w1 and w2, lags 0-14]

Yule-Walker Estimates

> fit.ar3yw <- ar(mmdata.tr, order=3)
> fit.ar3yw$ar[1,,] # lag 1
          [,1]       [,2]
[1,] 0.2307413 -0.7228413
[2,] 0.4172876  0.7417832
> fit.ar3yw$ar[2,,] # lag 2
           [,1]      [,2]
[1,] -0.1846027 0.2534646
[2,] -0.2145025 0.1920450
> fit.ar3yw$ar[3,,] # lag 3
           [,1]        [,2]
[1,] 0.08742842  0.09047402
[2,] 0.14015902 -0.36146002
> acf(fit.ar3yw$resid)

[Figure: sample ACF matrix of "Multivariate Series: fit.ar3yw$resid", lags 0-14]

Maximum likelihood estimates

> ## Load module
> module(finmetrics)
S+FinMetrics Version 2.0.2 for Linux 2.4.21 : 2005
> ## Means:
> colMeans(mmdata.tr)
        w1      w2
0.02792121 10.7961
> ## Center the non-differenced data:
> tmp.dat <- mmdata.tr
> tmp.dat[,2] <- tmp.dat[,2] - mean(tmp.dat[,2])
> colMeans(tmp.dat)
        w1            w2
0.02792121 -1.514271e-15

Maximum likelihood estimates (cont'd)

> mgarch(-1 + ar(3), dvec(0,0), series=tmp.dat, armatype="full")
... [deleted] ...
Convergence reached.
Call:
mgarch(formula.mean = -1 + ar(3), formula.var = dvec(0, 0),
       series = tmp.dat, armatype = "full")
Mean Equation: -1 + ar(3)
Conditional Variance Equation: dvec(0, 0)
Coefficients:
AR(1; 1, 1)  0.23213
...

Maximum likelihood estimates (cont'd)

Coefficients:
AR(1; 1, 1)  0.23213
AR(1; 2, 1)  0.43642
AR(1; 1, 2) -0.73288
AR(1; 2, 2)  0.74650
AR(2; 1, 1) -0.17461
AR(2; 2, 1) -0.25434
AR(2; 1, 2)  0.26512
AR(2; 2, 2)  0.21456
AR(3; 1, 1)  0.07007
AR(3; 2, 1)  0.17788
AR(3; 1, 2)  0.12217
AR(3; 2, 2) -0.41734
A(1, 1)      0.06228
A(2, 1)      0.01473
A(2, 2)      0.06381

Model Validation

For the individual residual series: all the methods from Chapter 6 in the book, with the extension for the cross-correlation as mentioned in Chapter 8 in the book.

Forecasting

The model:
$$Y_{t+\ell} = \phi_1 Y_{t+\ell-1} + \ldots + \phi_p Y_{t+\ell-p} + \epsilon_{t+\ell} - \theta_1 \epsilon_{t+\ell-1} - \ldots - \theta_q \epsilon_{t+\ell-q}$$

1-step:
$$\hat{Y}_{t+1|t} = \phi_1 Y_{t+1-1} + \ldots + \phi_p Y_{t+1-p} + 0 - \theta_1 \epsilon_{t+1-1} - \ldots - \theta_q \epsilon_{t+1-q}$$

2-step:
$$\hat{Y}_{t+2|t} = \phi_1 \hat{Y}_{t+2-1|t} + \ldots + \phi_p Y_{t+2-p} + 0 - \theta_1 \cdot 0 - \ldots - \theta_q \epsilon_{t+2-q}$$

and so on. In S-PLUS:

> predict(fit.ml, 10) # fit.ml from mgarch() above

However, this does not calculate the variance-covariance matrix of the forecast errors; use the hint given in the text book.
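For a pure VAR model the recursion above is easy to code: each future noise term is replaced by its mean of zero, and already-computed forecasts feed back into later steps. A sketch for the VAR(1) example (Python; function and variable names are our own):

```python
import numpy as np

phi1 = np.array([[0.9, -0.1],
                 [0.4,  0.8]])

def forecast_var1(y_t, phi1, steps):
    # l-step forecasts: Yhat_{t+l|t} = phi1 @ Yhat_{t+l-1|t}, future noise -> 0
    preds, y = [], np.asarray(y_t, float)
    for _ in range(steps):
        y = phi1 @ y
        preds.append(y)
    return np.array(preds)

f = forecast_var1([1.0, 2.0], phi1, 3)
print(f[0])  # phi1 @ [1, 2] = [0.7, 2.0]
```

For a pure VAR(1) this collapses to $\hat{Y}_{t+\ell|t} = \phi_1^\ell Y_t$; with MA terms present, the stored residuals $\epsilon_t, \epsilon_{t-1}, \ldots$ would also enter the first $q$ steps, as in the equations above.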