FE570 Financial Markets and Trading. Stevens Institute of Technology


FE570 Financial Markets and Trading
Lecture 5. Linear Time Series Analysis and Its Applications
(Ref.: Joel Hasbrouck, Empirical Market Microstructure)
Steve Yang, Stevens Institute of Technology, 9/25/2012

Outline
1. Moving Average Models
2. ARMA Models
3. Unit-Root Nonstationary Time Series

Moving Average Model
We start, at least in theory, with an AR model of infinite order:

    r_t = φ_0 + φ_1 r_{t-1} + φ_2 r_{t-2} + ... + a_t.  (1)

However, such an AR model is not realistic because it has infinitely many parameters. One way to make the model practical is to assume that the coefficients φ_i satisfy some constraints, so that they are determined by a finite number of parameters. A special case of this idea is to let the coefficients depend on a single parameter θ_1 via φ_i = -θ_1^i for i ≥ 1:

    r_t = φ_0 - θ_1 r_{t-1} - θ_1^2 r_{t-2} - θ_1^3 r_{t-3} - ... + a_t.  (2)

For the model to be stationary, |θ_1| < 1 must hold; otherwise θ_1^i diverges and the series explodes.

Moving Average Model
The previous representation can be rewritten in a rather compact form:

    r_t + θ_1 r_{t-1} + θ_1^2 r_{t-2} + θ_1^3 r_{t-3} + ... = φ_0 + a_t.  (3)

The model for r_{t-1} is then

    r_{t-1} + θ_1 r_{t-2} + θ_1^2 r_{t-3} + ... = φ_0 + a_{t-1}.  (4)

Multiplying Eq. (4) by θ_1 and subtracting the result from Eq. (3), we obtain

    r_t = φ_0 (1 - θ_1) + a_t - θ_1 a_{t-1},  (5)

which says that, except for the constant term, r_t is a weighted average of the shocks a_t and a_{t-1}. Therefore, the model is called an MA model of order 1, or MA(1) for short.

General Moving Average Model
The general form of an MA(1) model is

    r_t = c_0 + a_t - θ_1 a_{t-1},  or  r_t = c_0 + (1 - θ_1 B) a_t,  (6)

where c_0 is a constant, a_t is a white noise series, and B is the back-shift operator (B a_t = a_{t-1}). Similarly, an MA(2) model has the form

    r_t = c_0 + a_t - θ_1 a_{t-1} - θ_2 a_{t-2}.  (7)

The general MA(q) model is

    r_t = c_0 + a_t - θ_1 a_{t-1} - θ_2 a_{t-2} - ... - θ_q a_{t-q},  (8)

or

    r_t = c_0 + (1 - θ_1 B - θ_2 B^2 - ... - θ_q B^q) a_t,  where q > 0.  (9)

Properties of MA Models
MA models are always weakly stationary because they are finite linear combinations of a white noise sequence, whose first two moments are time-invariant. For example, for the MA(1) model we have E(r_t) = c_0, which is time-invariant. Taking the variance of the MA(1) model in Eq. (6), we have

    Var(r_t) = σ_a^2 + θ_1^2 σ_a^2 = (1 + θ_1^2) σ_a^2,

so Var(r_t) is time-invariant as well. For the general MA(q) model,

    E(r_t) = c_0,    Var(r_t) = (1 + θ_1^2 + θ_2^2 + ... + θ_q^2) σ_a^2.
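As a quick numerical check of these moment formulas, here is a minimal R sketch (the values θ_1 = 0.6, c_0 = 0.5, and σ_a = 1 are illustrative assumptions, not from the lecture):

    # Simulate a long MA(1) path: r_t = c0 + a_t - 0.6 a_{t-1}, sigma_a = 1
    set.seed(42)
    theta1 <- 0.6
    c0     <- 0.5
    # R's arima.sim() uses the convention r_t = a_t + ma[1]*a_{t-1},
    # so pass ma = -theta1 to match the sign convention used here
    r <- c0 + arima.sim(model = list(ma = -theta1), n = 100000)
    mean(r)   # close to c0 = 0.5
    var(r)    # close to (1 + 0.6^2) * 1 = 1.36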

Autocorrelation Function of MA(q) Models
Assume for simplicity that c_0 = 0 in the MA(1) model. Multiplying the model by r_{t-l} gives

    r_{t-l} r_t = r_{t-l} a_t - θ_1 r_{t-l} a_{t-1}.

Taking expectations, we obtain

    γ_1 = -θ_1 σ_a^2,  and  γ_l = 0 for l > 1.

Using this result and the fact that Var(r_t) = (1 + θ_1^2) σ_a^2, we have

    ρ_0 = 1,    ρ_1 = -θ_1 / (1 + θ_1^2),    ρ_l = 0 for l > 1.

For the MA(2) model, the autocorrelation coefficients are

    ρ_1 = (-θ_1 + θ_1 θ_2) / (1 + θ_1^2 + θ_2^2),    ρ_2 = -θ_2 / (1 + θ_1^2 + θ_2^2),    ρ_l = 0 for l > 2.

Autocorrelation Function of MA(q) Models
We can conclude that for an MA(1) model the lag-1 ACF is nonzero, but all higher-order ACFs are zero. For the MA(2) model, the ACF cuts off at lag 2. For an MA(q) model, the lag-q ACF is nonzero, but ρ_l = 0 for l > q. Consequently, an MA(q) series is linearly related only to its first q lagged values; it is a finite-memory model.

Invertibility
Rewriting the zero-mean MA(1) model as a_t = r_t + θ_1 a_{t-1} and substituting repeatedly, we get

    a_t = r_t + θ_1 r_{t-1} + θ_1^2 r_{t-2} + θ_1^3 r_{t-3} + ...

Intuitively, θ_1^j should go to zero as j increases, because the remote return r_{t-j} should have very little impact, if any, on the current shock. We therefore require |θ_1| < 1, so that the MA(1) model is invertible.
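The ACF cutoff is easy to see empirically. A minimal R sketch (the MA(2) parameters θ_1 = 0.8, θ_2 = -0.4 are illustrative assumptions):

    # Simulate an MA(2) and inspect the sample ACF/PACF
    set.seed(1)
    r <- arima.sim(model = list(ma = c(-0.8, 0.4)), n = 2000)
    acf(r, lag.max = 10)    # significant spikes at lags 1-2 only
    pacf(r, lag.max = 10)   # decays gradually, typical of an MA series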

Identifying the MA(q) Order
For a time series r_t with ACF ρ_l: if ρ_q ≠ 0 but ρ_l = 0 for l > q, then r_t follows an MA(q) model.

Estimation
Maximum likelihood estimation is commonly used to estimate MA models. There are two approaches to evaluating the likelihood function of an MA model: the conditional likelihood method and the exact likelihood method. (We are not going to get into the details; just note that the latter is preferred over the former.)

Forecasting Using MA Models
Assume that the forecast origin is h, and let F_h denote the information available at time h. For the 1-step ahead forecast of an MA(1) process, the model says

    r_{h+1} = c_0 + a_{h+1} - θ_1 a_h.
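In R, both likelihood approaches are available through the method argument of stats::arima() ("CSS" fits by conditional sum-of-squares, "ML" by exact maximum likelihood). A minimal sketch on simulated data (θ_1 = 0.5 and mean 0.2 are illustrative assumptions):

    set.seed(7)
    r <- arima.sim(model = list(ma = -0.5), n = 500) + 0.2
    # Conditional likelihood (conditional sum-of-squares)
    fit_css <- arima(r, order = c(0, 0, 1), method = "CSS")
    # Exact maximum likelihood (preferred, but more computing-intensive)
    fit_ml  <- arima(r, order = c(0, 0, 1), method = "ML")
    fit_css$coef
    fit_ml$coef   # the two sets of estimates typically agree closely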

Forecasting Using MA Models
Taking the conditional expectation, we have

    r̂_h(1) = E(r_{h+1} | F_h) = c_0 - θ_1 a_h,
    e_h(1) = r_{h+1} - r̂_h(1) = a_{h+1}.

The variance of the 1-step ahead forecast error is Var[e_h(1)] = σ_a^2. In practice, the quantity a_h can be obtained in several ways. For instance, assume that a_0 = 0; then a_1 = r_1 - c_0, and we can compute a_t for 2 ≤ t ≤ h recursively using a_t = r_t - c_0 + θ_1 a_{t-1}. For the 2-step ahead forecast, the model gives

    r_{h+2} = c_0 + a_{h+2} - θ_1 a_{h+1}.

Forecasting Using MA Models
Taking the conditional expectation, we have

    r̂_h(2) = E(r_{h+2} | F_h) = c_0,
    e_h(2) = r_{h+2} - r̂_h(2) = a_{h+2} - θ_1 a_{h+1}.

The variance of the 2-step ahead forecast error is Var[e_h(2)] = (1 + θ_1^2) σ_a^2, which is the variance of the model and is greater than or equal to the variance of the 1-step ahead forecast error. More generally, r̂_h(l) = c_0 for l ≥ 2. Similarly, for an MA(2) model we have

    r_{h+l} = c_0 + a_{h+l} - θ_1 a_{h+l-1} - θ_2 a_{h+l-2},

so that

    r̂_h(1) = c_0 - θ_1 a_h - θ_2 a_{h-1},
    r̂_h(2) = c_0 - θ_2 a_h,
    r̂_h(l) = c_0, for l > 2.
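A minimal R sketch of this behavior (the MA(2) parameters and mean are illustrative assumptions): after the first two steps, every forecast equals the estimated mean and the forecast standard errors stop growing:

    set.seed(3)
    r   <- arima.sim(model = list(ma = c(-0.7, 0.3)), n = 1000) + 1.0
    fit <- arima(r, order = c(0, 0, 2), method = "ML")
    fc  <- predict(fit, n.ahead = 5)
    fc$pred   # steps 3-5 all equal the fitted mean (the "intercept" term)
    fc$se     # constant from step 3 onward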

Forecasting Using MA Models
In general, for an MA(q) model, multi-step ahead forecasts revert to the mean after the first q steps.

Summary of AR and MA Models
- For MA models, the ACF is useful in specifying the order, because the ACF cuts off at lag q for an MA(q) series.
- For AR models, the PACF is useful in order determination, because the PACF cuts off at lag p for an AR(p) process.
- An MA series is always stationary, but for an AR series to be stationary, all of its characteristic roots must be less than 1 in modulus.
- For a stationary series, the multi-step ahead forecasts converge to the mean of the series, and the variances of the forecast errors converge to the variance of the series.

Building an MA Model
- Specification: use the sample ACF. Sample ACFs are all small after lag q for an MA(q) series. Constant term? Check the sample mean.
- Estimation: use the maximum likelihood method. (The exact method is preferred, but it is more computing-intensive.)
- Model checking: examine the residuals (they should be white noise).
- Forecasting: use the residuals as {a_t}, obtained from the data and the fitted parameters, to perform forecasts.
R examples demonstrate the ACF and PACF and the building of MA models (see the sketch below):
1. Simulated AR(1), AR(2), MA(1), and MA(2) series.
2. IBM stock returns and E-mini S&P 500 time series analysis.
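Along the lines of the in-class R examples, a minimal end-to-end sketch of the four steps above on simulated data (the actual IBM and E-mini S&P 500 series used in the lecture are not reproduced here; all parameter values are illustrative):

    # 1. Specification: simulate an MA(2), inspect ACF/PACF and the mean
    set.seed(11)
    r <- arima.sim(model = list(ma = c(-0.6, 0.25)), n = 1500) + 0.1
    acf(r); pacf(r)       # sample ACF cuts off at lag 2
    mean(r)               # clearly nonzero -> keep a constant term
    # 2. Estimation: exact maximum likelihood
    fit <- arima(r, order = c(0, 0, 2), method = "ML")
    # 3. Model checking: residuals should behave like white noise
    Box.test(residuals(fit), lag = 10, type = "Ljung-Box", fitdf = 2)
    # 4. Forecasting from the fitted model
    predict(fit, n.ahead = 5)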

ARMA Models
In some applications, the AR and MA models discussed in the previous sections become cumbersome, because one may need a high-order model with many parameters to adequately describe the dynamic structure of the data. To overcome this difficulty, autoregressive moving-average (ARMA) models were introduced. An ARMA model combines the ideas of AR and MA models into a compact form, so that the number of parameters used is kept small.

ARMA(1,1) Model
A time series r_t follows an ARMA(1,1) model if it satisfies

    r_t - φ_1 r_{t-1} = φ_0 + a_t - θ_1 a_{t-1},  (10)

where {a_t} is a white noise series.

Properties of ARMA(1,1) Models
Taking the expectation of Eq. (10), we have

    E(r_t) - φ_1 E(r_{t-1}) = φ_0 + E(a_t) - θ_1 E(a_{t-1}).  (11)

Because E(a_i) = 0 for all i, the mean of r_t is

    E(r_t) = μ = φ_0 / (1 - φ_1),  (12)

provided that the series is weakly stationary. Next, we consider the autocovariance function of r_t. Setting φ_0 = 0 for simplicity, the model becomes

    r_t = φ_1 r_{t-1} + a_t - θ_1 a_{t-1}.

Multiplying it by a_t and taking expectations,

    E(r_t a_t) = E(a_t^2) - θ_1 E(a_t a_{t-1}) = E(a_t^2) = σ_a^2.

Properties of ARMA(1,1) Models
Taking the variance of the prior equation,

    Var(r_t) = φ_1^2 Var(r_{t-1}) + σ_a^2 + θ_1^2 σ_a^2 - 2 φ_1 θ_1 E(r_{t-1} a_{t-1}).

If the series r_t is weakly stationary, then Var(r_t) = Var(r_{t-1}); using E(r_{t-1} a_{t-1}) = σ_a^2 and requiring φ_1^2 < 1, we obtain

    Var(r_t) = (1 - 2 φ_1 θ_1 + θ_1^2) σ_a^2 / (1 - φ_1^2).

To obtain the autocovariance function of r_t, we assume φ_0 = 0 and multiply the model in Eq. (10) by r_{t-l} to obtain

    r_t r_{t-l} - φ_1 r_{t-1} r_{t-l} = a_t r_{t-l} - θ_1 a_{t-1} r_{t-l}.

Properties of ARMA(1,1) Models
For l = 1, taking expectations yields

    γ_1 - φ_1 γ_0 = -θ_1 σ_a^2,  and  γ_l - φ_1 γ_{l-1} = 0 for l > 1.

In terms of the ACF, we have

    ρ_1 = φ_1 - θ_1 σ_a^2 / γ_0,  and  ρ_l = φ_1 ρ_{l-1} for l > 1.

The ACF of an ARMA(1,1) model does not cut off at any finite lag. The stationarity condition of an ARMA(1,1) model is the same as that of an AR(1) model, and the ACF of an ARMA(1,1) model exhibits a pattern similar to that of an AR(1) model, except that the pattern starts at lag 2.
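These two properties (no finite cutoff, geometric decay governed by φ_1 from lag 2 onward) can be checked against the exact ACF that base R computes with ARMAacf(). A minimal sketch (φ_1 = 0.8 and θ_1 = 0.4 are illustrative values; note ma = -θ_1 under R's sign convention):

    phi1   <- 0.8
    theta1 <- 0.4
    rho <- ARMAacf(ar = phi1, ma = -theta1, lag.max = 10)  # lags 0..10
    rho                     # nonzero at every lag: no finite cutoff
    rho[3:11] / rho[2:10]   # each ratio rho_l / rho_{l-1} = phi1 for l > 1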

General ARMA Models
A general ARMA(p,q) model has the form

    r_t = φ_0 + Σ_{i=1}^{p} φ_i r_{t-i} + a_t - Σ_{i=1}^{q} θ_i a_{t-i},  (13)

where {a_t} is a white noise series and p and q are nonnegative integers. The AR and MA models are special cases of the ARMA(p,q) model. Using the back-shift operator, the model can be written as

    (1 - φ_1 B - ... - φ_p B^p) r_t = φ_0 + (1 - θ_1 B - ... - θ_q B^q) a_t.

The polynomial 1 - φ_1 B - ... - φ_p B^p is the AR polynomial of the model; similarly, 1 - θ_1 B - ... - θ_q B^q is the MA polynomial. We require that the AR and MA polynomials have no common factors.

Identifying ARMA Models
The ACF and PACF are not informative in determining the order of an ARMA model. There are two ways to determine the order of ARMA models:
- The extended autocorrelation function (EACF): if we can obtain a consistent estimate of the AR component of an ARMA model, then we can derive the MA component. From the derived MA series, we can use the ACF to identify the order of the MA component.
- Information criteria (AIC or BIC): typically, for some prespecified positive integers P and Q, one computes the AIC (or BIC) for the ARMA(p,q) models with 0 ≤ p ≤ P and 0 ≤ q ≤ Q, and selects the model that gives the minimum AIC (or BIC); see the sketch below.
Once an ARMA(p,q) model is specified, its parameters can be estimated by either the conditional or the exact likelihood method. The Ljung-Box statistics of the residuals can be used to check the adequacy of a fitted model.
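A minimal R sketch of the information-criterion approach, assuming P = Q = 3 and simulated ARMA(1,1) data (both illustrative choices; an eacf() implementation is available separately in the third-party TSA package):

    # AIC grid search over ARMA(p, q) with 0 <= p, q <= 3
    set.seed(5)
    r  <- arima.sim(model = list(ar = 0.7, ma = -0.4), n = 1000)
    ic <- matrix(NA, 4, 4, dimnames = list(p = 0:3, q = 0:3))
    for (p in 0:3) for (q in 0:3)
      ic[p + 1, q + 1] <- AIC(arima(r, order = c(p, 0, q), method = "ML"))
    ic
    which(ic == min(ic), arr.ind = TRUE)  # expected to select p = 1, q = 1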

Forecasting Using ARMA Models
Denote the forecast origin by h and the available information by F_h. The 1-step ahead forecast of r_{h+1} is

    r̂_h(1) = E(r_{h+1} | F_h) = φ_0 + Σ_{i=1}^{p} φ_i r_{h+1-i} - Σ_{i=1}^{q} θ_i a_{h+1-i},

and the forecast error is e_h(1) = r_{h+1} - r̂_h(1) = a_{h+1}. The variance of the 1-step ahead forecast error is Var[e_h(1)] = σ_a^2. For the l-step ahead forecast, we have

    r̂_h(l) = E(r_{h+l} | F_h) = φ_0 + Σ_{i=1}^{p} φ_i r̂_h(l-i) - Σ_{i=1}^{q} θ_i a_h(l-i),

where it is understood that r̂_h(l-i) = r_{h+l-i} if l-i ≤ 0, and that a_h(l-i) = 0 if l-i > 0 while a_h(l-i) = a_{h+l-i} if l-i ≤ 0.
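A minimal R sketch of this recursion for an ARMA(1,1) model without a constant, checked against predict() (simulated data with illustrative parameters; the residual-based recursion and R's exact Kalman-filter forecasts should agree closely, though not to machine precision):

    set.seed(9)
    r   <- arima.sim(model = list(ar = 0.6, ma = -0.3), n = 800)
    fit <- arima(r, order = c(1, 0, 1), include.mean = FALSE, method = "ML")
    phi1   <- fit$coef["ar1"]
    theta1 <- -fit$coef["ma1"]         # convert from R's sign convention
    a_h <- tail(residuals(fit), 1)     # estimate of the last shock
    h   <- length(r)
    # Recursion: rhat(1) = phi1*r_h - theta1*a_h; rhat(l) = phi1*rhat(l-1)
    rhat <- numeric(5)
    rhat[1] <- phi1 * r[h] - theta1 * a_h
    for (l in 2:5) rhat[l] <- phi1 * rhat[l - 1]
    cbind(manual = rhat, kalman = predict(fit, n.ahead = 5)$pred)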

Forecasting Using ARMA Models
Thus, the multi-step ahead forecasts of an ARMA model can be computed recursively. The associated forecast error is e_h(l) = r_{h+l} - r̂_h(l).

Three Model Representations for an ARMA Model
The three representations of a stationary ARMA(p,q) model serve different purposes, and knowing them leads to a better understanding of the model. The first representation is the ARMA(p,q) model in Eq. (13). This representation is compact and useful in parameter estimation; it is also useful for computing multi-step ahead forecasts of r_t recursively. For the other two representations, we use long division of two polynomials.

Three Model Representations for an ARMA Model
Given the two polynomials

    φ(B) = 1 - Σ_{i=1}^{p} φ_i B^i,    θ(B) = 1 - Σ_{i=1}^{q} θ_i B^i,

we can obtain, by long division,

    θ(B) / φ(B) = 1 + ψ_1 B + ψ_2 B^2 + ... ≡ ψ(B),  (14)
    φ(B) / θ(B) = 1 + π_1 B + π_2 B^2 + ... ≡ π(B),  (15)

and from these definitions, ψ(B) π(B) = 1.

Three Model Representations for an ARMA Model
For instance, if θ(B) = 1 - θ_1 B and φ(B) = 1 - φ_1 B, long division gives ψ_i = φ_1^{i-1} (φ_1 - θ_1) for i ≥ 1. The constant terms of the two representations below are φ_0 / θ(1) = φ_0 / (1 - θ_1 - ... - θ_q) and φ_0 / φ(1) = φ_0 / (1 - φ_1 - ... - φ_p).

AR representation (shows the dependence of the current return r_t on the past returns r_{t-i}, i > 0):

    r_t = φ_0 / (1 - θ_1 - ... - θ_q) + π_1 r_{t-1} + π_2 r_{t-2} + π_3 r_{t-3} + ... + a_t.  (16)

MA representation (shows explicitly the impact of the past shocks a_{t-i}, i > 0, on the current return r_t):

    r_t = φ_0 / (1 - φ_1 - ... - φ_p) + a_t + ψ_1 a_{t-1} + ψ_2 a_{t-2} + ψ_3 a_{t-3} + ...  (17)
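Base R computes the ψ-weights of the MA representation directly via ARMAtoMA(). A minimal sketch for the ARMA(1,1) case above (φ_1 = 0.8 and θ_1 = 0.4 are illustrative values; ma = -θ_1 under R's sign convention):

    phi1 <- 0.8; theta1 <- 0.4
    psi <- ARMAtoMA(ar = phi1, ma = -theta1, lag.max = 6)
    psi
    # Agrees with the closed form psi_i = phi1^(i-1) * (phi1 - theta1)
    phi1^(0:5) * (phi1 - theta1)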

Unit-Root Nonstationary Time Series
Consider an ARMA model. If one extends the model by allowing the AR polynomial to have 1 as a characteristic root, the model becomes the well-known autoregressive integrated moving-average (ARIMA) model. A conventional approach for handling unit-root nonstationarity is to use differencing: a time series y_t is said to be an ARIMA(p, 1, q) process if the change series

    c_t = y_t - y_{t-1} = (1 - B) y_t

follows a stationary and invertible ARMA(p, q) model. In finance, price series are commonly believed to be nonstationary, but the log return series

    r_t = ln(p_t) - ln(p_{t-1})

is stationary.
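A minimal R sketch of this transformation, using a simulated random-walk price path as a stand-in for real prices:

    # Nonstationary log prices, stationary log returns
    set.seed(2)
    logp <- cumsum(rnorm(1000, sd = 0.01))  # simulated log price (random walk)
    p    <- exp(logp)
    r    <- diff(log(p))                    # r_t = ln(p_t) - ln(p_{t-1})
    acf(log(p))   # decays very slowly: unit-root behavior
    acf(r)        # no significant autocorrelation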

Unit-Root Nonstationary Time Series
In some scientific fields, a time series y_t may contain multiple unit roots and must be differenced multiple times to become stationary; the more general ARIMA(p, d, q) model can then be applied.

Unit-Root Test
To test whether the log price p_t of an asset follows a random walk or a random walk with drift, consider the models

    p_t = φ_1 p_{t-1} + e_t,
    p_t = φ_0 + φ_1 p_{t-1} + e_t,

and test the null hypothesis H_0: φ_1 = 1 against the alternative hypothesis H_a: φ_1 < 1. This is the well-known Dickey-Fuller test. Differencing within an ARIMA model also removes a trend, yielding a stationary series.
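In R, an augmented Dickey-Fuller test is provided by the third-party tseries package (this sketch assumes it is installed), applied here to a simulated random walk:

    library(tseries)               # install.packages("tseries") if needed
    set.seed(4)
    p <- cumsum(rnorm(500))        # random walk: contains a unit root
    adf.test(p)                    # large p-value: cannot reject the unit root
    adf.test(diff(p))              # small p-value: differenced series stationary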

Example
Use R's ARIMA routines to fit an ARMA model:
1. Identify the model.
2. Estimate the parameters.
3. Check the model.
4. Forecast using the fitted model.
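A minimal end-to-end sketch of these four steps on simulated data (the in-class data set is not reproduced here; all settings are illustrative):

    # 1. Identify: difference to stationarity, then inspect ACF/PACF
    set.seed(6)
    y  <- cumsum(arima.sim(model = list(ar = 0.5), n = 600))  # I(1) series
    dy <- diff(y)
    acf(dy); pacf(dy)        # PACF cuts off at lag 1 -> AR(1) for dy
    # 2. Estimate: fit ARIMA(1,1,0) to the level series by exact ML
    fit <- arima(y, order = c(1, 1, 0), method = "ML")
    # 3. Check: residuals should be white noise
    Box.test(residuals(fit), lag = 10, type = "Ljung-Box", fitdf = 1)
    tsdiag(fit)              # standard residual diagnostic plots
    # 4. Forecast from the fitted model
    predict(fit, n.ahead = 10)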