Quantitative Finance I


Linear AR and MA Models (Lecture 4)
Winter Semester 2012/2013, by Lukas Vacha

* If viewed in .pdf format: for full functionality use the Mathematica 7 (or higher) notebook (.nb) version (QF_I_Lecture3.nb) of this .pdf.

Simple Autoregressive (AR) models

AR(p) model

A simple autoregressive model of order p, or simply AR(p) model, is represented by:

r_t = φ_0 + Σ_{i=1}^p φ_i r_{t-i} + ε_t,

where p is a non-negative integer and ε_t ~ WN(0, σ²); this generalizes the AR(1) model to order p. Equivalently (without the intercept):

φ(L) r_t = ε_t, where φ(L) = 1 − φ_1 L − φ_2 L² − ... − φ_p L^p.

Wold's Decomposition Theorem

Using Wold's decomposition we can rewrite a stationary AR(p) model as an MA(∞): for the AR(p) model φ(L) r_t = ε_t (ignoring the intercept), the Wold decomposition is

r_t = ψ(L) ε_t, where ψ(L) = (1 − φ_1 L − φ_2 L² − ... − φ_p L^p)^(−1).

Example of AR(1) model

r_t = φ_0 + φ_1 r_{t-1} + ε_t,

where {ε_t} is assumed to be white noise, ε_t ~ WN(0, σ²).

The simple autoregressive model of order 1, or AR(1) model, has the same form as a simple linear regression model, with r_t as the dependent and r_{t-1} as the explanatory variable, but it has different properties. The mean and variance conditional on the past return are:

E(r_t | r_{t-1}) = φ_0 + φ_1 r_{t-1},   Var(r_t | r_{t-1}) = Var(ε_t) = σ²_ε.

Thus, given the past return r_{t-1}, the current return is centered around φ_0 + φ_1 r_{t-1} with variability σ²_ε.

Properties of AR(1) model

Assuming the series is weakly stationary, we have E(r_t) = μ, Var(r_t) = γ_0 and Cov(r_t, r_{t-j}) = γ_j, where μ and γ_0 are constants. Thus the (unconditional) mean and variance of the AR(1) series are:

E(r_t) = μ = φ_0 / (1 − φ_1),   Var(r_t) = γ_0 = σ²_ε / (1 − φ_1²).

The mean of r_t exists if φ_1 ≠ 1, and is zero if and only if φ_0 = 0. A necessary and sufficient condition for the AR(1) model to be weakly stationary is |φ_1| < 1.

The ACF of r_t satisfies ρ_ℓ = φ_1 ρ_{ℓ−1} for ℓ ≥ 1; since ρ_0 = 1, we get ρ_ℓ = φ_1^ℓ, so the ACF of a weakly stationary AR(1) series decays exponentially with rate φ_1.

How do we compute E(r_t)?

E(r_t) = E(φ_0 + φ_1 r_{t-1}) = φ_0 + φ_1 E(r_{t-1})
       = φ_0 + φ_1 (φ_0 + φ_1 E(r_{t-2})) = φ_0 + φ_1 φ_0 + φ_1² E(r_{t-2})
       ⋮
E(r_t) = φ_0 (1 + φ_1 + φ_1² + ...) + φ_1^t y_0,

where y_0 is the initial value. As long as the model is stationary, φ_1^t → 0, so

E(r_t) = φ_0 (1 + φ_1 + φ_1² + ...) = φ_0 / (1 − φ_1).

How do we compute Var(r_t)?

Using Wold's decomposition, r_t = (1 − φ_1 L)^(−1) ε_t, and thus r_t = ε_t + φ_1 ε_{t-1} + φ_1² ε_{t-2} + ...

Var(r_t) = γ_0 = E[(r_t − E(r_t))²]; since φ_0 = 0 implies E(r_t) = 0,

Var(r_t) = E(r_t²) = E[(ε_t + φ_1 ε_{t-1} + φ_1² ε_{t-2} + ...)²]
         = E(ε_t² + φ_1² ε_{t-1}² + φ_1⁴ ε_{t-2}² + ... + cross-products)
         = σ²_ε + φ_1² σ²_ε + φ_1⁴ σ²_ε + ...
         = σ²_ε (1 + φ_1² + φ_1⁴ + ...) = σ²_ε / (1 − φ_1²).

How do we compute Cov(r_t, r_{t-s})?

γ_1 = Cov(r_t, r_{t-1}) = E[(r_t − E(r_t)) (r_{t-1} − E(r_{t-1}))]
γ_1 = E[(ε_t + φ_1 ε_{t-1} + φ_1² ε_{t-2} + ...) (ε_{t-1} + φ_1 ε_{t-2} + φ_1² ε_{t-3} + ...)]
    ⋮
γ_1 = φ_1 σ²_ε + φ_1³ σ²_ε + φ_1⁵ σ²_ε + ... = φ_1 σ²_ε / (1 − φ_1²) = φ_1 γ_0.
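These closed forms (zero mean when φ_0 = 0, γ_0 = σ²_ε / (1 − φ_1²), and γ_1 = φ_1 γ_0) can be checked by simulation. A minimal Python sketch; the parameter values, seed and sample size are arbitrary illustrative choices, not taken from the lecture:

```python
import random

# Simulate AR(1) with phi0 = 0, phi1 = 0.5, sigma = 1
rng = random.Random(42)
phi1, n = 0.5, 200_000
r, prev = [], 0.0
for _ in range(n):
    prev = phi1 * prev + rng.gauss(0.0, 1.0)
    r.append(prev)

# Sample moments
mean = sum(r) / n
var = sum((x - mean) ** 2 for x in r) / n
gamma1 = sum((r[t] - mean) * (r[t - 1] - mean) for t in range(1, n)) / n

print(round(mean, 2))          # theory: 0
print(round(var, 2))           # theory: 1 / (1 - 0.5^2) = 1.333...
print(round(gamma1 / var, 2))  # theory: rho_1 = phi_1 = 0.5
```

The sample variance lands close to σ²_ε / (1 − φ_1²) and the lag-1 sample autocorrelation close to φ_1, in line with the derivation above.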

The second autocovariance, γ_2 = φ_1² γ_0, and autocovariances of higher orders, γ_s = φ_1^s γ_0, are computed similarly.

How do we compute the ACF?

ρ_0 = γ_0 / γ_0 = 1
ρ_1 = γ_1 / γ_0 = φ_1
    ⋮
ρ_s = γ_s / γ_0 = φ_1^s γ_0 / γ_0 = φ_1^s.

Examples of AR(1) artificial processes and their ACFs

[Interactive Mathematica panel: a sample AR(1) path with its ACF and PACF; sliders for φ_0 and φ_1, e.g. φ_0 = 0, φ_1 = 0.6, or φ_1 = −0.99.]

Example of AR(2) model

A simple autoregressive model of order 2, or simply AR(2) model, is represented by:

r_t = φ_0 + φ_1 r_{t-1} + φ_2 r_{t-2} + ε_t,

where {ε_t} is assumed to be white noise, ε_t ~ WN(0, σ²).

Properties of AR(2) model

The mean of the AR(2) model is E(r_t) = φ_0 / (1 − φ_1 − φ_2), provided φ_1 + φ_2 ≠ 1.

A necessary condition for the AR(2) model to be weakly stationary is φ_1 + φ_2 < 1 (the full set of conditions is φ_1 + φ_2 < 1, φ_2 − φ_1 < 1 and |φ_2| < 1). For a stationary AR(2) process the characteristic roots of the second-order difference equation (1 − φ_1 L − φ_2 L²) = 0 lie outside the unit circle; L denotes the lag operator, L r_t = r_{t-1}. The solutions of the characteristic second-order polynomial equation x² − φ_1 x − φ_2 = 0 are the characteristic roots of the AR(2) model. If they are real numbers, the AR(2) model is just one AR(1) model operating on top of another AR(1); in this case the ACF is a mixture of two exponential decays. If the characteristic roots are complex, the ACF shows damped sine and cosine waves. Complex characteristic roots are important in business cycles.

Examples of AR(2) artificial processes and their ACFs

Set φ_1 = 0.6 and φ_2 = −0.4 for an example of a stationary AR(2) process with complex characteristic roots; the second-order difference equation is (1 − 0.6 L + 0.4 L²) r_t = ε_t, and the ACF shows damped sine and cosine waves. Set φ_1 = −0.2 and φ_2 = 0.35 for an example of a stationary AR(2) process with real characteristic roots; its ACF decays exponentially. Note that φ_1 + φ_2 must be less than 1, otherwise the AR(2) process would not be stationary.

[Interactive Mathematica panel: a sample AR(2) path with its ACF and PACF; e.g. φ_0 = 0, φ_1 = 0.3, φ_2 = 0.3.]
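Whether the AR(2) roots are real or complex follows from the discriminant of x² − φ_1 x − φ_2 = 0. A small Python sketch classifying the two parameter sets used above:

```python
import cmath

def ar2_roots(phi1, phi2):
    """Characteristic roots of AR(2): solutions of x^2 - phi1*x - phi2 = 0."""
    disc = phi1 ** 2 + 4 * phi2
    x1 = (phi1 + cmath.sqrt(disc)) / 2
    x2 = (phi1 - cmath.sqrt(disc)) / 2
    return x1, x2

for phi1, phi2 in [(0.6, -0.4), (-0.2, 0.35)]:
    x1, x2 = ar2_roots(phi1, phi2)
    kind = "complex" if (phi1 ** 2 + 4 * phi2) < 0 else "real"
    stationary = abs(x1) < 1 and abs(x2) < 1   # roots less than one in modulus
    print(phi1, phi2, kind, stationary)
```

With (0.6, −0.4) the product of the roots is −φ_2 = 0.4, so both roots have modulus √0.4 ≈ 0.63: complex and stationary, matching the damped-sine-wave ACF above. With (−0.2, 0.35) the roots are real (0.5 and −0.7) and again inside the unit circle of this parametrization.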

Properties of AR(p) model

The mean of the AR(p) model is E(r_t) = φ_0 / (1 − φ_1 − ... − φ_p), provided that the denominator is nonzero. The characteristic equation of the model is x^p − φ_1 x^(p−1) − ... − φ_p = 0, and if all roots of the equation are less than one in modulus, the series is stationary.

PACF of AR(p) model

The estimate at the second lag of the PACF shows the added contribution of r_{t-2} to r_t over the AR(1) model r_t = φ_1 r_{t-1} + ε_t; the lag-3 PACF shows the added contribution of r_{t-3} over an AR(2), and so on. Thus for an AR(p) model the lag-p PACF should not be zero, but all lags j > p should be close to zero. We can therefore identify an AR(p) process from its PACF.

Examples of AR(p) artificial processes and their ACFs

Note that φ_1 + φ_2 + ... + φ_p must be less than 1, otherwise the AR(p) process would not be stationary.

[Interactive Mathematica panel: a sample AR(p) path with its ACF and PACF; e.g. φ_1 = 0.3, φ_2 = −0.4, φ_3 = 0.8, φ_4 = φ_5 = 0.]

Simple moving average (MA) models

MA(q) model

A simple moving-average model of order q, or MA(q) model, is represented by:

r_t = θ_0 + ε_t + Σ_{i=1}^q θ_i ε_{t-i},

where θ_0 is a constant, {ε_t} is a white noise series, and q > 0.

Properties of MA(q) model

MA models are always weakly stationary because they are finite linear combinations of a white noise sequence.

E(r_t) = θ_0,   Var(r_t) = (1 + θ_1² + θ_2² + ... + θ_q²) σ²,
γ_s = (θ_s + θ_{s+1} θ_1 + θ_{s+2} θ_2 + ... + θ_q θ_{q−s}) σ² for s = 1, 2, ..., q, and γ_s = 0 for s > q.

The lag-q ACF is not zero, but ρ_ℓ = 0 for all ℓ > q.

The Invertibility Condition

An MA(q) model is typically required to have the roots of the characteristic equation θ(z) = 0 greater than one in absolute value (outside the unit circle). The invertibility condition is mathematically the same as the stationarity condition for AR(p) models.

Example of MA(1) model

A simple moving-average model of order 1, or MA(1) model, is represented by:

r_t = θ_0 + ε_t − θ_1 ε_{t-1},

where θ_0 is a constant and {ε_t} is a white noise series.

Properties of MA(1) model

MA models are always weakly stationary because they are finite linear combinations of a white noise sequence.

E(r_t) = θ_0,   Var(r_t) = (1 + θ_1²) σ².

The lag-1 ACF is not zero, but all higher-order ACFs are zero.
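The ACF cut-off after lag q is easy to see in simulation. A Python sketch for MA(1) (parameter values, seed and sample size are illustrative; for the model r_t = ε_t − θ_1 ε_{t-1}, the theoretical lag-1 autocorrelation is −θ_1 / (1 + θ_1²)):

```python
import random

# Simulate MA(1): r_t = eps_t - 0.7 * eps_{t-1}
rng = random.Random(7)
theta1, n = 0.7, 200_000
eps = [rng.gauss(0.0, 1.0) for _ in range(n + 1)]
r = [eps[t] - theta1 * eps[t - 1] for t in range(1, n + 1)]

mean = sum(r) / n
var = sum((x - mean) ** 2 for x in r) / n

def acf(series, lag, mean, var):
    """Sample autocorrelation at a given lag."""
    m = len(series)
    c = sum((series[t] - mean) * (series[t - lag] - mean) for t in range(lag, m)) / m
    return c / var

# theory: rho_1 = -0.7 / (1 + 0.49) = -0.4698..., rho_s = 0 for s > 1
print(round(acf(r, 1, mean, var), 2), round(acf(r, 2, mean, var), 2))
```

The lag-1 autocorrelation comes out near −0.47 while the lag-2 value hovers near zero, the cut-off pattern described above.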

Examples of MA(1) artificial processes

[Interactive Mathematica panel: a sample MA(1) path with its ACF and PACF; e.g. θ_0 = 0, θ_1 = 0.7.]

Example of MA(2) model

A simple moving-average model of order 2, or MA(2) model, is represented by:

r_t = θ_0 + ε_t − θ_1 ε_{t-1} − θ_2 ε_{t-2},

where θ_0 is a constant and {ε_t} is a white noise series.

Properties of MA(2) model

MA models are always weakly stationary because they are finite linear combinations of a white noise sequence.

E(r_t) = θ_0,   Var(r_t) = (1 + θ_1² + θ_2²) σ².

The ACFs at lags 1 and 2 are non-zero; all others are zero.

How do we compute Var(r_t)?

In the case θ_0 = 0, i.e. E(r_t) = 0,

Var(r_t) = E(r_t²) = E[(ε_t + θ_1 ε_{t-1} + θ_2 ε_{t-2})²]
         = E(ε_t² + θ_1² ε_{t-1}² + θ_2² ε_{t-2}² + cross-products).

Since Cov(ε_t, ε_{t-s}) = 0 for s ≠ 0, E(cross-products) = 0, and

Var(r_t) = γ_0 = σ² + θ_1² σ² + θ_2² σ² = (1 + θ_1² + θ_2²) σ².

How do we compute Cov(r_t, r_{t-s})?

γ_1 = Cov(r_t, r_{t-1}) = E[(r_t − E(r_t)) (r_{t-1} − E(r_{t-1}))]
γ_1 = (θ_1 + θ_1 θ_2) σ²
γ_2 = θ_2 σ²
γ_3 = 0,

and autocovariances of higher orders are γ_s = 0 for s > 2.

How do we compute the ACF?

ρ_0 = γ_0 / γ_0 = 1
ρ_1 = γ_1 / γ_0 = (θ_1 + θ_1 θ_2) σ² / [(1 + θ_1² + θ_2²) σ²] = (θ_1 + θ_1 θ_2) / (1 + θ_1² + θ_2²)
ρ_2 = γ_2 / γ_0 = θ_2 σ² / [(1 + θ_1² + θ_2²) σ²] = θ_2 / (1 + θ_1² + θ_2²)
ρ_s = γ_s / γ_0 = 0 for all s > q.
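A quick numerical check of the ρ_1 and ρ_2 formulas, using the plus-sign convention of the derivation above (r_t = ε_t + θ_1 ε_{t-1} + θ_2 ε_{t-2}); the parameter values and sample size are illustrative:

```python
import random

# Simulate MA(2) with theta1 = 0.5, theta2 = 0.3
rng = random.Random(3)
th1, th2, n = 0.5, 0.3, 300_000
eps = [rng.gauss(0.0, 1.0) for _ in range(n + 2)]
r = [eps[t] + th1 * eps[t - 1] + th2 * eps[t - 2] for t in range(2, n + 2)]

mean = sum(r) / n
gamma = lambda s: sum((r[t] - mean) * (r[t - s] - mean) for t in range(s, n)) / n
rho1, rho2 = gamma(1) / gamma(0), gamma(2) / gamma(0)

# theory: rho_1 = (th1 + th1*th2) / (1 + th1^2 + th2^2) = 0.65 / 1.34 = 0.4851...
#         rho_2 = th2 / (1 + th1^2 + th2^2)             = 0.30 / 1.34 = 0.2238...
print(round(rho1, 2), round(rho2, 2))
```

Both sample autocorrelations land close to the closed-form values, and lags beyond 2 would hover near zero.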

Examples of MA(2) artificial processes

[Interactive Mathematica panel: a sample MA(2) path with its ACF and PACF; e.g. θ_0 = 0, θ_1 = 1.5, θ_2 = 1.58.]

Examples of MA(q) artificial processes

Example of a random MA(5) process; note the changes in the ACF.

[Interactive Mathematica panel: a sample MA(5) path with its ACF and PACF.]

Simple Autoregressive Moving-Average (ARMA) models

ARMA models combine AR and MA models into a compact form. For return series the ARMA models do not seem to be a very useful tool, but in volatility modeling they are considered highly relevant.

ARMA(1,1)

A time series r_t follows an ARMA(1,1) model if it satisfies:

r_t − φ_1 r_{t-1} = φ_0 + ε_t − θ_1 ε_{t-1},

where {ε_t} is a white noise series; the left-hand side is the AR(1) part and the right-hand side is the MA(1) part of the model.

Properties of ARMA(1,1) model

Provided the series is weakly stationary, the mean is the same as the mean of the AR(1) model, because E(ε_t) = 0:

E(r_t) = μ = φ_0 / (1 − φ_1),   Var(r_t) = (1 − 2 φ_1 θ_1 + θ_1²) σ² / (1 − φ_1²),

and the condition |φ_1| < 1 is needed for stationarity.

The ACF of a stationary ARMA(1,1) series satisfies ρ_1 = φ_1 − θ_1 σ² / γ_0 and ρ_ℓ = φ_1 ρ_{ℓ−1} for ℓ > 1. Thus the ACF of an ARMA(1,1) behaves very much like the ACF of an AR(1), while the PACF of an ARMA(1,1) does not cut off at any lag.
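The geometric decay ρ_ℓ = φ_1 ρ_{ℓ−1} for ℓ > 1 can be checked by simulation. A Python sketch (parameters, seed and sample size are illustrative choices):

```python
import random

# Simulate ARMA(1,1): r_t = phi1 * r_{t-1} + eps_t - th1 * eps_{t-1}
rng = random.Random(11)
phi1, th1, n = 0.8, 0.4, 300_000
r, prev_r, prev_e = [], 0.0, 0.0
for _ in range(n):
    e = rng.gauss(0.0, 1.0)
    prev_r = phi1 * prev_r + e - th1 * prev_e
    prev_e = e
    r.append(prev_r)

mean = sum(r) / n
gamma = lambda s: sum((r[t] - mean) * (r[t - s] - mean) for t in range(s, n)) / n
rho = [gamma(s) / gamma(0) for s in range(4)]

# ratios of consecutive autocorrelations beyond lag 1 should be near phi1 = 0.8
print(round(rho[2] / rho[1], 2), round(rho[3] / rho[2], 2))
```

Beyond lag 1 the sample ACF decays at rate φ_1, exactly the AR(1)-like behavior described above.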

Examples of ARMA(1,1) artificial processes

[Interactive Mathematica panel: a sample ARMA(1,1) path with its ACF and PACF; e.g. φ_0 = 0, φ_1 = 0.8, θ_1 = −0.7.]

General ARMA(p,q) models

A general ARMA(p,q) model has the form:

r_t = φ_0 + Σ_{i=1}^p φ_i r_{t-i} + ε_t − Σ_{i=1}^q θ_i ε_{t-i},

where {ε_t} is a white noise series and p and q are non-negative integers. The AR and MA models are special cases of the ARMA(p,q) model. Using the back-shift operator, the model can be written as:

(1 − φ_1 L − ... − φ_p L^p) r_t = φ_0 + (1 − θ_1 L − ... − θ_q L^q) ε_t.

The ACF and PACF are not very informative in determining the order of an ARMA model.
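The PACF used throughout can be computed from the ACF by the standard Durbin-Levinson recursion (a textbook algorithm, not taken from the lecture notebook). A sketch contrasting the clean lag-1 cut-off for an AR(1) with the non-cut-off for an ARMA(1,1); the parameter values are illustrative:

```python
def pacf_from_acf(rho):
    """Durbin-Levinson recursion: partial autocorrelations from rho[0..m], rho[0] = 1."""
    pacf = [1.0]
    phi = {}                              # phi[(k, j)]: j-th coefficient of the fitted AR(k)
    for k in range(1, len(rho)):
        if k == 1:
            phi[(1, 1)] = rho[1]
        else:
            num = rho[k] - sum(phi[(k - 1, j)] * rho[k - j] for j in range(1, k))
            den = 1.0 - sum(phi[(k - 1, j)] * rho[j] for j in range(1, k))
            phi[(k, k)] = num / den
            for j in range(1, k):
                phi[(k, j)] = phi[(k - 1, j)] - phi[(k, k)] * phi[(k - 1, k - j)]
        pacf.append(phi[(k, k)])
    return pacf

# AR(1) with phi1 = 0.8: rho_s = 0.8^s, so the PACF cuts off after lag 1
ar1 = pacf_from_acf([0.8 ** s for s in range(5)])

# ARMA(1,1) with phi1 = 0.8, theta1 = 0.4: rho_1 from the closed form, then geometric decay
rho = [1.0, (0.8 - 0.4) * (1 - 0.8 * 0.4) / (1 + 0.4 ** 2 - 2 * 0.8 * 0.4)]
for _ in range(3):
    rho.append(0.8 * rho[-1])
arma = pacf_from_acf(rho)

print([round(v, 3) for v in ar1])    # AR(1): zero beyond lag 1
print([round(v, 3) for v in arma])   # ARMA(1,1): no cut-off at any lag
```

This is exactly why the PACF identifies pure AR orders but, as noted above, is not very informative for mixed ARMA models.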

Examples of ARMA(p,q) artificial processes

[Interactive Mathematica panel: a sample ARMA(p,q) path with its ACF and PACF, with an option to export the simulated series; e.g. AR parameters {0.6, −0.4}.]

ARIMA(p,d,q) model

If we extend the ARMA model by allowing the AR polynomial to have 1 as a characteristic root, the model becomes an AutoRegressive Integrated Moving-Average (ARIMA) model. An ARIMA model is unit-root nonstationary because its AR polynomial has a unit root. Like a random-walk model, an ARIMA model has strong memory. If we want to handle nonstationarity, differencing is the common approach.

Differencing

A series is said to be ARIMA(p,1,q) when the differenced process y_t − y_{t-1} = (1 − L) y_t follows a stationary and invertible ARMA(p,q) model. The parameter d equals 1 and means that the ARMA(p,q) series was differenced once. This approach is common in finance, as prices are nonstationary while their log-return series r_t = ln(p_t) − ln(p_{t-1}) is stationary. Thus if we find the prices of a stock to follow an ARIMA(p,1,q) process, their returns will follow an ARMA(p,q) process. Sometimes we need to difference more than once, but with each additional difference we lose some information.
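The differencing step itself is one line of code. A Python sketch that simulates a random-walk log price (unit-root nonstationary) and recovers a stationary log-return series by taking the first difference; all parameter values are illustrative:

```python
import math
import random

# Simulate a random-walk log price: ln p_t = ln p_{t-1} + eps_t (unit-root nonstationary)
rng = random.Random(5)
log_p = [math.log(100.0)]
for _ in range(1000):
    log_p.append(log_p[-1] + rng.gauss(0.0, 0.01))

# First difference (d = 1): log returns r_t = ln p_t - ln p_{t-1}, a stationary series
returns = [log_p[t] - log_p[t - 1] for t in range(1, len(log_p))]
print(len(returns))
```

While the log-price level wanders arbitrarily far from its starting point, the differenced series fluctuates around zero, which is the ARIMA(p,1,q) → ARMA(p,q) transformation described above.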

Examples of ARIMA(1,1,1) artificial processes

[Interactive Mathematica panel: a sample ARIMA(1,1,1) path with its ACF and PACF; e.g. d = 1, φ_1 = 0.946, θ_1 = 0.1. The process is unit-root nonstationary.]

Examples of ARIMA(p,d,q) artificial processes

[Interactive Mathematica panel: a sample ARIMA(p,d,q) path with its ACF and PACF, with an option to export the simulated series; e.g. d = 0, AR parameters {0.6, −0.4}. With d = 0 the process has no unit root and is stationary.]

Estimation

Box and Jenkins (1970) were the first to approach the task of estimating an ARMA model in a systematic manner. There are three steps to their approach:

1. Identification
2. Estimation
3. Model diagnostic checking

The Information Criteria for Model Selection

The information criteria vary according to how stiff the penalty term is. The three most popular criteria are Akaike's (1974) information criterion (AIC), Schwarz's (1978) Bayesian information criterion (SBIC), and the Hannan-Quinn criterion (HQIC):

AIC = ln(σ̂²) + 2k/T,
SBIC = ln(σ̂²) + (k/T) ln(T),
HQIC = ln(σ̂²) + (2k/T) ln(ln(T)),

where k = p + q + 1 and T is the sample size. We choose the orders p and q that minimize the information criterion.

Example of ARMA estimation on real-world data

Homework #3

Deadline: Tue 6.11, 12:00 PM. Send the homework via email to vachal@utia.cas.cz.

Exercise 1: Simulate an ARMA(1,1) model and then estimate the parameters of the simulated process.

Exercise 2: Estimate an AR, MA or ARMA model on daily closing prices of DAX 30 and FTSE 100 for the two time periods 01/1998-12/2007 and 01/2007-10/2011. Compare and discuss the results.
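Full ARMA(1,1) estimation needs a numerical optimizer (e.g. the ARIMA estimator in a statistics package such as statsmodels). As a simplified warm-up for Exercise 1, not a full solution, here is a Python sketch that simulates an AR(1), estimates φ_1 by least squares, and evaluates the information criteria above for the fitted model; all parameter values are illustrative:

```python
import math
import random

# Simulate AR(1): r_t = 0.6 * r_{t-1} + eps_t
rng = random.Random(9)
phi1, n = 0.6, 50_000
r, prev = [], 0.0
for _ in range(n):
    prev = phi1 * prev + rng.gauss(0.0, 1.0)
    r.append(prev)

# OLS estimate without intercept: phi1_hat = sum r_t r_{t-1} / sum r_{t-1}^2
num = sum(r[t] * r[t - 1] for t in range(1, n))
den = sum(r[t - 1] ** 2 for t in range(1, n))
phi1_hat = num / den

# Residual variance and information criteria with k = p + q + 1 = 2, as in the lecture
resid = [r[t] - phi1_hat * r[t - 1] for t in range(1, n)]
sigma2_hat = sum(e ** 2 for e in resid) / len(resid)
k, T = 2, n
aic = math.log(sigma2_hat) + 2 * k / T
sbic = math.log(sigma2_hat) + k / T * math.log(T)
print(round(phi1_hat, 2), round(sigma2_hat, 2))
```

The estimate lands close to the true φ_1 = 0.6 with residual variance near σ² = 1; comparing the criteria across several candidate orders (p, q) and picking the minimum is the model-selection step of the Box-Jenkins approach.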