Quantitative Finance I
Linear AR and MA Models (Lecture 4)
Winter Semester 2012/2013, by Lukas Vacha

* If viewed in .pdf format - for full functionality use the Mathematica 7 (or higher) notebook (.nb) version of this .pdf.

Simple Autoregressive (AR) models

AR(p) model

A simple autoregressive model of order p, or simply an AR(p) model, is represented by:

r_t = φ_0 + Σ_{i=1}^{p} φ_i r_{t-i} + ε_t,

where p is a non-negative integer and ε_t ~ WN(0, σ_ε²); it is the generalization of the AR(1) model to order p. Or equivalently (without intercept):

φ(L) r_t = ε_t, where φ(L) = 1 − φ_1 L − φ_2 L² − … − φ_p L^p.

Wold's Decomposition Theorem

Using Wold's decomposition we can rewrite a stationary AR(p) model as an MA(∞). For the AR(p) model φ(L) r_t = ε_t, ignoring the intercept, the Wold decomposition is:

r_t = ψ(L) ε_t, where ψ(L) = (1 − φ_1 L − φ_2 L² − … − φ_p L^p)^(−1).

Example of AR(1) model

r_t = φ_0 + φ_1 r_{t-1} + ε_t,
QF_I_Lecture3.nb

where {ε_t} is assumed to be white noise, ε_t ~ WN(0, σ_ε²). The simple autoregressive model of order 1, or simply the AR(1) model, has the same form as a simple linear regression model, with r_t as the dependent and r_{t-1} as the explanatory variable, but it has different properties. The mean and variance conditional on past returns are:

E(r_t | r_{t-1}) = φ_0 + φ_1 r_{t-1},  Var(r_t | r_{t-1}) = Var(ε_t) = σ_ε².

Thus given the past return r_{t-1}, the current return is centered around φ_0 + φ_1 r_{t-1} with variability σ_ε².

Properties of AR(1) model

Assuming the series is weakly stationary, we have E(r_t) = μ, Var(r_t) = γ_0 and Cov(r_t, r_{t-j}) = γ_j, where μ and γ_0 are constants. Thus the (unconditional) mean and variance of the AR(1) series are:

E(r_t) = μ = φ_0 / (1 − φ_1),  Var(r_t) = γ_0 = σ_ε² / (1 − φ_1²).

The mean of r_t exists if φ_1 ≠ 1, and is zero if and only if φ_0 = 0. The necessary and sufficient condition for the AR(1) model to be weakly stationary is |φ_1| < 1.

The ACF of r_t satisfies ρ_ℓ = φ_1 ρ_{ℓ−1} for ℓ ≥ 1; as ρ_0 = 1, ρ_ℓ = φ_1^ℓ, and the ACF of a weakly stationary AR(1) series decays exponentially with rate φ_1.

How do we compute E(r_t)?

E(r_t) = E(φ_0 + φ_1 r_{t-1}) = φ_0 + φ_1 E(r_{t-1})
E(r_t) = φ_0 + φ_1 (φ_0 + φ_1 E(r_{t-2})) = φ_0 + φ_1 φ_0 + φ_1² E(r_{t-2})
⋮
E(r_t) = φ_0 (1 + φ_1 + φ_1² + …) + φ_1^k E(r_{t-k})

As long as the model is stationary, φ_1^k → 0, so

E(r_t) = φ_0 (1 + φ_1 + φ_1² + …) = φ_0 / (1 − φ_1).

How do we compute Var(r_t)?

Using Wold's decomposition we get r_t = (1 − φ_1 L)^(−1) ε_t, and thus

r_t = ε_t + φ_1 ε_{t-1} + φ_1² ε_{t-2} + …

Var(r_t) = γ_0 = E[(r_t − E(r_t))(r_t − E(r_t))]; since φ_0 = 0 implies E(r_t) = 0,

Var(r_t) = E(r_t r_t) = E[(ε_t + φ_1 ε_{t-1} + φ_1² ε_{t-2} + …)(ε_t + φ_1 ε_{t-1} + φ_1² ε_{t-2} + …)]
         = E(ε_t² + φ_1² ε_{t-1}² + φ_1⁴ ε_{t-2}² + … + cross-products)
         = σ_ε² + φ_1² σ_ε² + φ_1⁴ σ_ε² + …
         = σ_ε² (1 + φ_1² + φ_1⁴ + …) = σ_ε² / (1 − φ_1²).

How do we compute Cov(r_t, r_{t-s})?

γ_1 = Cov(r_t, r_{t-1}) = E[(r_t − E(r_t))(r_{t-1} − E(r_{t-1}))]
γ_1 = E[(ε_t + φ_1 ε_{t-1} + φ_1² ε_{t-2} + …)(ε_{t-1} + φ_1 ε_{t-2} + φ_1² ε_{t-3} + …)]
⋮
γ_1 = φ_1 σ_ε² + φ_1³ σ_ε² + φ_1⁵ σ_ε² + … = φ_1 σ_ε² / (1 − φ_1²) = φ_1 γ_0
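The moments derived above can be checked by simulation. Below is a minimal numpy sketch (not part of the original notebook, which uses Mathematica) with illustrative parameters φ_0 = 0.5, φ_1 = 0.6: it compares the sample mean, variance, and low-order autocorrelations with the closed-form values φ_0/(1−φ_1), σ_ε²/(1−φ_1²), and φ_1^ℓ.

```python
import numpy as np

# Simulate r_t = phi0 + phi1 * r_{t-1} + eps_t (illustrative parameters)
rng = np.random.default_rng(0)
phi0, phi1, sigma = 0.5, 0.6, 1.0
n = 200_000
eps = rng.normal(0.0, sigma, n)
r = np.empty(n)
r[0] = phi0 / (1 - phi1)               # start at the unconditional mean
for t in range(1, n):
    r[t] = phi0 + phi1 * r[t - 1] + eps[t]

mean_theory = phi0 / (1 - phi1)        # E(r_t)   = 1.25
var_theory = sigma**2 / (1 - phi1**2)  # Var(r_t) = 1.5625

def acf(x, lag):
    """Sample autocorrelation at the given lag."""
    x = x - x.mean()
    return np.dot(x[:-lag], x[lag:]) / np.dot(x, x)

rho1 = acf(r, 1)                       # should be close to phi1
```

Since ρ_ℓ = φ_1^ℓ for a weakly stationary AR(1), `acf(r, 2)` should likewise be close to φ_1² = 0.36.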
The second autocovariance, γ_2 = φ_1² γ_0, and autocovariances of higher orders, γ_s = φ_1^s γ_0, are computed similarly.

How do we compute the ACF?

ρ_0 = γ_0/γ_0 = 1
ρ_1 = γ_1/γ_0 = φ_1
⋮
ρ_s = γ_s/γ_0 = φ_1^s γ_0 / γ_0 = φ_1^s

Examples of AR(1) artificial processes and their ACFs

Try, for instance, setting φ_1 = −0.99.

[Interactive panel: sample AR(1) path with its ACF and PACF; sliders φ_0 = 0, φ_1 = 0.6]

Example of AR(2) model

The simple autoregressive model of order 2, or simply the AR(2) model, is represented by:

r_t = φ_0 + φ_1 r_{t-1} + φ_2 r_{t-2} + ε_t,
where {ε_t} is assumed to be white noise, ε_t ~ WN(0, σ_ε²).

Properties of AR(2) model

The mean of the AR(2) model is:

E(r_t) = φ_0 / (1 − φ_1 − φ_2), provided φ_1 + φ_2 ≠ 1.

The AR(2) model is weakly stationary if and only if φ_1 + φ_2 < 1, φ_2 − φ_1 < 1 and |φ_2| < 1; in particular φ_1 + φ_2 < 1 is necessary. For a stationary AR(2) process the roots of the second-order difference equation (1 − φ_1 L − φ_2 L²) = 0 lie outside the unit circle. L denotes the lag operator such that L r_t = r_{t-1}.

The solutions of the characteristic second-order polynomial equation x² − φ_1 x − φ_2 = 0 are the characteristic roots of the AR(2) model (for stationarity they must be less than one in modulus). If they are real numbers, the AR(2) model is just an AR(1) model operating on top of another AR(1), and the ACF is a mixture of two exponential decays. If the characteristic roots are complex, the ACF is a damped sine and cosine wave. Complex characteristic roots are important for business cycles.

Examples of AR(2) artificial processes and their ACFs

Set φ_1 = 0.6 and φ_2 = −0.4 for an example of a stationary AR(2) process with complex characteristic roots. The second-order difference equation is (1 − 0.6 L + 0.4 L²) r_t = ε_t, and the ACF is a damped sine and cosine wave.

Set φ_1 = −0.2 and φ_2 = 0.35 for an example of a stationary AR(2) process with real characteristic roots; the ACF decays exponentially.

Note that φ_1 + φ_2 must be less than 1, otherwise the AR(2) model would not be stationary.

[Interactive panel: sample AR(2) path with its ACF and PACF; sliders φ_0 = 0, φ_1 = 0.3, φ_2 = 0.3]
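The two examples above can be checked with a small numpy sketch (assumed, not from the notebook). It computes the characteristic roots of x² − φ_1 x − φ_2 = 0 for both parameter pairs, and estimates the lag-2 and lag-3 partial autocorrelations of a simulated AR(2) (with the panel's slider values φ_1 = φ_2 = 0.3) via successive OLS regressions, where the lag-k PACF is the coefficient on r_{t−k} in a regression of r_t on its first k lags.

```python
import numpy as np

def char_roots(phi1, phi2):
    # Roots of x**2 - phi1*x - phi2 = 0; stationarity needs both moduli < 1.
    return np.roots([1.0, -phi1, -phi2])

complex_case = char_roots(0.6, -0.4)   # complex pair -> damped sine/cosine ACF
real_case = char_roots(-0.2, 0.35)     # two real roots -> mixture of decays

# Simulate an AR(2) with phi1 = phi2 = 0.3 (the slider values above)
rng = np.random.default_rng(1)
phi1, phi2, n = 0.3, 0.3, 100_000
eps = rng.normal(size=n)
r = np.zeros(n)
for t in range(2, n):
    r[t] = phi1 * r[t - 1] + phi2 * r[t - 2] + eps[t]

def pacf(x, k):
    # Lag-k PACF: last coefficient of an OLS fit of x_t on x_{t-1},...,x_{t-k}.
    y = x[k:]
    X = np.column_stack([x[k - j : len(x) - j] for j in range(1, k + 1)])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta[-1]
```

Here `pacf(r, 2)` should be close to φ_2 = 0.3, while `pacf(r, 3)` should be close to zero — the cutoff that identifies the AR order.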
Properties of AR(p) model

The mean of the AR(p) model is:

E(r_t) = φ_0 / (1 − φ_1 − … − φ_p), provided that the denominator is nonzero.

The characteristic equation of the model is x^p − φ_1 x^{p−1} − … − φ_p = 0; if all roots of this equation are less than one in modulus, the series is stationary.

PACF of AR(p) model

The estimate at the second lag of the PACF shows the added contribution of r_{t-2} to r_t over the AR(1) model r_t = φ_1 r_{t-1} + ε_t; the lag-3 PACF shows the added contribution of r_{t-3} over an AR(2), and so on. Thus for an AR(p) model the lag-p PACF should not be zero, while all lags j > p should be close to zero. Hence we can identify an AR(p) process with the PACF.

Examples of AR(p) artificial processes and their ACFs

Note that φ_1 + φ_2 + … + φ_p must be less than 1, otherwise the AR(p) model would not be stationary.

[Interactive panel: sample AR(p) path with its ACF and PACF; sliders φ_1 = 0.3, φ_2 = −0.4, φ_3 = 0.8, φ_4 = 0, φ_5 = 0]

Simple moving average (MA) models

MA(q) model

The simple moving average model of order q, or MA(q) model, is represented by:
r_t = θ_0 + ε_t − Σ_{i=1}^{q} θ_i ε_{t-i},

where θ_0 is a constant, {ε_t} is a white noise series, and q > 0.

Properties of MA(q) model

MA models are always weakly stationary because they are finite linear combinations of a white noise sequence.

E(r_t) = θ_0,  Var(r_t) = (1 + θ_1² + θ_2² + … + θ_q²) σ_ε²,

γ_s = (−θ_s + θ_1 θ_{s+1} + θ_2 θ_{s+2} + … + θ_{q−s} θ_q) σ_ε²  for s = 1, 2, …, q, and γ_s = 0 for s > q.

Hence the lag-q ACF is not zero, but ρ_ℓ = 0 for all ℓ > q.

The Invertibility Condition

An MA(q) model is typically required to have the roots of the characteristic equation θ(z) = 0 greater than one in absolute value (outside the unit circle). The invertibility condition has the same mathematical form as the stationarity condition for AR(p) models.

Example of MA(1) model

The simple moving average model of order 1, or MA(1) model, is represented by:

r_t = θ_0 + ε_t − θ_1 ε_{t-1},

where θ_0 is a constant and {ε_t} is a white noise series.

Properties of MA(1) model

MA models are always weakly stationary because they are finite linear combinations of a white noise sequence.

E(r_t) = θ_0,  Var(r_t) = (1 + θ_1²) σ_ε²;

the lag-1 ACF is not zero (ρ_1 = −θ_1 / (1 + θ_1²)), but all higher-order ACFs are zero.
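A quick numerical check of the MA(1) ACF cutoff (a numpy sketch under an illustrative parameter θ_1 = 0.7, not part of the notebook): the lag-1 sample autocorrelation should be near −θ_1/(1 + θ_1²) ≈ −0.47, and the lag-2 autocorrelation near zero.

```python
import numpy as np

rng = np.random.default_rng(2)
theta1, n = 0.7, 200_000
eps = rng.normal(size=n)
r = eps[1:] - theta1 * eps[:-1]             # MA(1): r_t = eps_t - theta1*eps_{t-1}

def acf(x, lag):
    """Sample autocorrelation at the given lag."""
    x = x - x.mean()
    return np.dot(x[:-lag], x[lag:]) / np.dot(x, x)

rho1_theory = -theta1 / (1 + theta1**2)     # about -0.47
```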
Examples of MA(1) artificial processes

[Interactive panel: sample MA(1) path with its ACF and PACF; sliders θ_0 = 0, θ_1 = 0.7]

Example of MA(2) model

The simple moving average model of order 2, or MA(2) model, is represented by:

r_t = θ_0 + ε_t − θ_1 ε_{t-1} − θ_2 ε_{t-2},

where θ_0 is a constant and {ε_t} is a white noise series.

Properties of MA(2) model

MA models are always weakly stationary because they are finite linear combinations of a white noise sequence.

E(r_t) = θ_0,  Var(r_t) = (1 + θ_1² + θ_2²) σ_ε²;

the ACFs at lags 1 and 2 are non-zero, all others are zero.

How do we compute Var(r_t)?

In case θ_0 = 0, i.e. E(r_t) = 0,

Var(r_t) = E(r_t r_t) = E[(ε_t − θ_1 ε_{t-1} − θ_2 ε_{t-2})(ε_t − θ_1 ε_{t-1} − θ_2 ε_{t-2})]
         = E(ε_t² + θ_1² ε_{t-1}² + θ_2² ε_{t-2}² + cross-products)
Since Cov(ε_t, ε_{t-s}) = 0 for s ≠ 0, E(cross-products) = 0, and

Var(r_t) = γ_0 = σ_ε² + θ_1² σ_ε² + θ_2² σ_ε² = (1 + θ_1² + θ_2²) σ_ε².

How do we compute Cov(r_t, r_{t-s})?

γ_1 = Cov(r_t, r_{t-1}) = E[(r_t − E(r_t))(r_{t-1} − E(r_{t-1}))]
γ_1 = (−θ_1 + θ_1 θ_2) σ_ε²
γ_2 = −θ_2 σ_ε²
γ_3 = 0

and the autocovariances of higher orders, γ_s = 0, for s > 2.

How do we compute the ACF?

ρ_0 = γ_0/γ_0 = 1
ρ_1 = γ_1/γ_0 = (−θ_1 + θ_1 θ_2) σ_ε² / ((1 + θ_1² + θ_2²) σ_ε²) = (−θ_1 + θ_1 θ_2) / (1 + θ_1² + θ_2²)
ρ_2 = γ_2/γ_0 = −θ_2 σ_ε² / ((1 + θ_1² + θ_2²) σ_ε²) = −θ_2 / (1 + θ_1² + θ_2²)
⋮
ρ_s = γ_s/γ_0 = 0 ∀ s > q
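The MA(2) autocorrelations above can also be verified numerically. The numpy sketch below (illustrative values θ_1 = 0.5, θ_2 = 0.3, not from the text) compares the sample ACF at lags 1-3 with the closed-form expressions.

```python
import numpy as np

rng = np.random.default_rng(3)
theta1, theta2, n = 0.5, 0.3, 200_000
eps = rng.normal(size=n)
r = eps[2:] - theta1 * eps[1:-1] - theta2 * eps[:-2]   # MA(2), theta0 = 0

def acf(x, lag):
    """Sample autocorrelation at the given lag."""
    x = x - x.mean()
    return np.dot(x[:-lag], x[lag:]) / np.dot(x, x)

denom = 1 + theta1**2 + theta2**2
rho1_theory = (-theta1 + theta1 * theta2) / denom      # about -0.26
rho2_theory = -theta2 / denom                          # about -0.22
```

The lag-3 sample ACF should be close to zero, illustrating the cutoff at lag q = 2.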
Examples of MA(2) artificial processes

[Interactive panel: sample MA(2) path with its ACF and PACF; sliders θ_0 = 0, θ_1 = 1.5, θ_2 = 1.58]

Examples of MA(q) artificial processes

Example of a random MA(5) process - note the changes in the ACF.
[Interactive panel: sample MA(5) path with its ACF and PACF; sliders for θ_0 through θ_5]

Simple Autoregressive Moving-Average (ARMA) models

ARMA models combine the AR and MA models into a compact form. For return series, ARMA models do not seem to be a very useful tool, but in volatility modeling they are considered highly relevant.

ARMA(1,1)

A time series r_t follows an ARMA(1,1) model if it satisfies:

r_t − φ_1 r_{t-1} = φ_0 + ε_t − θ_1 ε_{t-1},

where {ε_t} is a white noise series; the left-hand side is the AR(1) part and the right-hand side is the MA(1) part.

Properties of ARMA(1,1) model

Provided the series is weakly stationary, the mean is the same as the mean of an AR(1), because E(ε_t) = 0:

E(r_t) = μ = φ_0 / (1 − φ_1),  Var(r_t) = (1 − 2 φ_1 θ_1 + θ_1²) σ_ε² / (1 − φ_1²),

and |φ_1| < 1 is needed for stationarity. The ACF of a stationary ARMA(1,1) series satisfies:

ρ_1 = φ_1 − θ_1 σ_ε² / γ_0,  ρ_ℓ = φ_1 ρ_{ℓ−1} for ℓ > 1.

Thus the ACF of an ARMA(1,1) behaves very much like the ACF of an AR(1), while the PACF of an ARMA(1,1) does not cut off at any lag.
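These properties can again be checked by simulation. The numpy sketch below (illustrative parameters φ_1 = 0.5, θ_1 = 0.3, not from the text) verifies the variance formula and the AR(1)-like recursion ρ_2 = φ_1 ρ_1.

```python
import numpy as np

rng = np.random.default_rng(4)
phi1, theta1, n = 0.5, 0.3, 200_000
eps = rng.normal(size=n)
r = np.zeros(n)
for t in range(1, n):
    r[t] = phi1 * r[t - 1] + eps[t] - theta1 * eps[t - 1]

# Var(r_t) = (1 - 2*phi1*theta1 + theta1**2) * sigma_eps**2 / (1 - phi1**2)
var_theory = (1 - 2 * phi1 * theta1 + theta1**2) / (1 - phi1**2)

def acf(x, lag):
    """Sample autocorrelation at the given lag."""
    x = x - x.mean()
    return np.dot(x[:-lag], x[lag:]) / np.dot(x, x)
```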
Examples of ARMA(1,1) artificial processes

[Interactive panel: sample ARMA(1,1) path with its ACF and PACF; sliders φ_0 = 0, φ_1 = 0.8, θ_1 = −0.7]

General ARMA(p,q) models

A general ARMA(p,q) model has the form:

r_t = φ_0 + Σ_{i=1}^{p} φ_i r_{t-i} + ε_t − Σ_{i=1}^{q} θ_i ε_{t-i},

where {ε_t} is a white noise series and p and q are non-negative integers. The AR and MA models are special cases of the ARMA(p,q) model. Using the back-shift operator, the model is written as:

(1 − φ_1 L − … − φ_p L^p) r_t = φ_0 + (1 − θ_1 L − … − θ_q L^q) ε_t.

The ACF and PACF are not very informative in determining the order of an ARMA model.
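The back-shift form can be illustrated directly. The numpy sketch below (not from the notebook; AR parameters {0.6, −0.4} borrowed from the panel below, θ_1 = 0.1 illustrative) generates an ARMA(2,1) series and checks that applying the AR polynomial (1 − φ_1 L − φ_2 L²) to r_t reproduces the MA side (1 − θ_1 L) ε_t.

```python
import numpy as np

rng = np.random.default_rng(5)
phi1, phi2, theta1, n = 0.6, -0.4, 0.1, 1_000
eps = rng.normal(size=n)
r = np.zeros(n)
for t in range(2, n):
    r[t] = phi1 * r[t - 1] + phi2 * r[t - 2] + eps[t] - theta1 * eps[t - 1]

lhs = r[2:] - phi1 * r[1:-1] - phi2 * r[:-2]   # (1 - phi1 L - phi2 L^2) r_t
rhs = eps[2:] - theta1 * eps[1:-1]             # (1 - theta1 L) eps_t
```

`lhs` and `rhs` agree exactly (up to floating-point error), since the identity is just the model equation rearranged.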
Examples of ARMA(p,q) artificial processes

[Interactive panel: sample ARMA(p,q) path with its ACF and PACF; sliders φ_0 = 0, AR parameters {0.6, −0.4}, MA parameters {0.1, …}; "Export Simulated Series" button]

ARIMA(p,d,q) model

If we extend the ARMA model by allowing the AR polynomial to have 1 as a characteristic root, the model becomes an AutoRegressive Integrated Moving-Average (ARIMA) model. An ARIMA model is unit-root nonstationary because its AR polynomial has a unit root. Like a random-walk model, an ARIMA model has strong memory. If we want to handle nonstationarity, differencing is the common approach.

Differencing

A series is said to be ARIMA(p,1,q) when the differenced process y_t − y_{t-1} = (1 − L) y_t follows a stationary and invertible ARMA(p,q). The parameter d equals 1, meaning the series is differenced once. This approach is common in finance: prices are nonstationary, but their log return series r_t = ln(p_t) − ln(p_{t-1}) is stationary. Thus if we find the log prices of a stock to follow an ARIMA(p,1,q) process, their log returns will follow an ARMA(p,q). Sometimes we need to difference more than once, but with each additional difference we lose some information.
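A sketch of the differencing idea in numpy (assumed, not from the notebook): log prices that follow a random walk — an ARIMA(0,1,0) — are unit-root nonstationary, while their first difference, the log return series, is white noise with no autocorrelation.

```python
import numpy as np

rng = np.random.default_rng(6)
eps = rng.normal(0.0, 0.01, 100_000)   # daily log-return innovations
log_p = np.cumsum(eps)                 # log prices: unit-root nonstationary
ret = np.diff(log_p)                   # difference once (d = 1) -> log returns

def acf(x, lag):
    """Sample autocorrelation at the given lag."""
    x = x - x.mean()
    return np.dot(x[:-lag], x[lag:]) / np.dot(x, x)
```

Here `np.diff` recovers the innovations exactly, so `ret` is stationary white noise; for an ARIMA(p,1,q) the differenced series would instead follow the ARMA(p,q).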
Examples of ARIMA(1,1,1) artificial processes

[Interactive panel: sample ARIMA(1,1,1) path with its ACF and PACF; difference d = 1, φ_1 = 0.946, θ_1 = 0.1; note: the process is unit-root nonstationary]
Examples of ARIMA(p,d,q) artificial processes

[Interactive panel: sample ARIMA(p,d,q) path with its ACF and PACF; difference d = 0, AR parameters {0.6, −0.4}, MA parameters {0.1, …}; "Export Simulated Series" button; note: the process is unit-root stationary]

Estimation

Box and Jenkins (1970) were the first to approach the task of estimating an ARMA model in a systematic manner. There are 3 steps to their approach:

1. Identification
2. Estimation
3. Model diagnostic checking

The Information Criteria for Model Selection

The information criteria vary according to how stiff the penalty term is. The three most popular criteria are Akaike's (1974) information criterion (AIC), Schwarz's (1978) Bayesian information criterion (SBIC), and the Hannan-Quinn criterion (HQIC):

AIC = ln(σ̂²) + 2k/T
SBIC = ln(σ̂²) + (k/T) ln(T)
HQIC = ln(σ̂²) + (2k/T) ln(ln(T))
where k = p + q + 1 and T is the sample size. Thus we choose the orders p and q that minimize the information criterion.

Example of ARMA estimation on real-world data

Homework #3

Send homework via email to vachal@utia.cas.cz. Deadline: Tue 6.11, 12:00 PM.

:] Exercise 1 [:

Simulate an ARMA(1,1) model and then estimate the parameters of the simulated process.

:] Exercise 2 [:

Estimate an AR, MA or ARMA model on daily closing prices of DAX30 and FTSE 100 for the two time periods: 01/1998-12/2007 and 01/2007-10/2011. Compare and discuss the results.
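As a possible starting point for Exercise 1, here is a numpy sketch (not the required solution — the exercise is meant to be done in Mathematica): it simulates an ARMA(1,1) with illustrative parameters φ_1 = 0.5, θ_1 = −0.4 and recovers them with the Hannan-Rissanen two-step regression, a simple consistent alternative to full maximum-likelihood estimation.

```python
import numpy as np

# Step 0: simulate r_t = phi1*r_{t-1} + eps_t - theta1*eps_{t-1}
rng = np.random.default_rng(7)
phi1, theta1, n = 0.5, -0.4, 200_000
eps = rng.normal(size=n)
r = np.zeros(n)
for t in range(1, n):
    r[t] = phi1 * r[t - 1] + eps[t] - theta1 * eps[t - 1]

# Step 1: long AR(20) regression; its residuals proxy the innovations eps_t.
p = 20
y = r[p:]
X = np.column_stack([r[p - j : n - j] for j in range(1, p + 1)])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
e_hat = np.zeros(n)
e_hat[p:] = y - X @ beta

# Step 2: regress r_t on r_{t-1} and -e_hat_{t-1};
# the coefficients estimate (phi1, theta1).
y2 = r[p + 1:]
X2 = np.column_stack([r[p : n - 1], -e_hat[p : n - 1]])
(phi_hat, theta_hat), *_ = np.linalg.lstsq(X2, y2, rcond=None)
```

With this sample size both estimates should land within a few hundredths of the true values; a full solution would refine them by maximum likelihood and compare model orders via the information criteria above.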