Introduction to ARMA and GARCH processes


1 Introduction to ARMA and GARCH processes
Fulvio Corsi, SNS Pisa, 3 March 2010

2 Stationarity

Strict stationarity: (X_1, X_2, ..., X_n) =_d (X_{1+k}, X_{2+k}, ..., X_{n+k}) for any integers n >= 1 and k.

Weak/second-order/covariance stationarity:
E[X_t] = µ
E[(X_t - µ)²] = σ² < +∞ (i.e. constant and independent of t)
E[(X_t - µ)(X_{t+k} - µ)] = γ(|k|) (i.e. independent of t for each k)

Interpretation: mean and variance are constant; there is mean reversion; shocks are transient; the covariance between X_t and X_{t-k} tends to 0 as k → ∞.
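As a quick numerical illustration of the difference between a stationary process and a unit-root one (not from the original slides; a minimal NumPy sketch), iterate the exact variance recursion Var(X_t) = φ² Var(X_{t-1}) + σ² of an AR(1) started at zero: for |φ| < 1 the variance settles at the constant σ²/(1 - φ²), while for φ = 1 (a random walk) it grows without bound.

```python
import numpy as np

def variance_path(phi, sigma2=1.0, T=200):
    """Variance of X_t = phi*X_{t-1} + eps_t with X_0 = 0:
    Var(X_t) = phi^2 * Var(X_{t-1}) + sigma2, iterated exactly."""
    v = np.empty(T)
    v[0] = sigma2
    for t in range(1, T):
        v[t] = phi**2 * v[t - 1] + sigma2
    return v

stat = variance_path(0.5)   # |phi| < 1: converges to sigma2/(1-phi^2) = 4/3
rw = variance_path(1.0)     # phi = 1 (random walk): variance grows linearly in t
print(stat[-1], rw[-1])
```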


5 White noise

weak (uncorrelated): E(ε_t) = 0 ∀t, V(ε_t) = σ² ∀t, ρ(ε_t, ε_s) = 0 ∀s ≠ t, where ρ ≡ γ(|t - s|)/γ(0)
strong (independence): ε_t ~ i.i.d.(0, σ²)
Gaussian (weak = strong): ε_t ~ N.i.d.(0, σ²)

6 Lag operator

The lag operator is defined as L X_t ≡ X_{t-1}.

It is a linear operator:
L(β X_t) = β L X_t = β X_{t-1}
L(X_t + Y_t) = L X_t + L Y_t = X_{t-1} + Y_{t-1}

and admits integer powers, for instance:
L² X_t = L(L X_t) = L X_{t-1} = X_{t-2}
L^k X_t = X_{t-k}
L^{-1} X_t = X_{t+1}

Some examples:
ΔX_t = X_t - X_{t-1} = X_t - L X_t = (1 - L) X_t
Y_t = (θ_1 + θ_2 L) L X_t = (θ_1 L + θ_2 L²) X_t = θ_1 X_{t-1} + θ_2 X_{t-2}

Expressions like (θ_0 + θ_1 L + θ_2 L² + ... + θ_n L^n), possibly with n = ∞, are called lag polynomials and are denoted θ(L).
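On sampled data the lag operator is just an index shift. A minimal NumPy sketch (names are illustrative, not from the slides) reproducing the two examples above, the first difference (1 - L)X_t and the polynomial (θ_1 L + θ_2 L²)X_t:

```python
import numpy as np

def lag(x, k=1):
    """Apply L^k to a series: (L^k x)_t = x_{t-k}; entries with no
    predecessor are NaN. Negative k gives the lead operator L^{-|k|}."""
    out = np.full_like(x, np.nan, dtype=float)
    if k > 0:
        out[k:] = x[:-k]
    elif k < 0:
        out[:k] = x[-k:]
    else:
        out[:] = x
    return out

x = np.arange(1.0, 7.0)                  # 1, 2, ..., 6
dx = x - lag(x)                          # (1 - L) X_t: first difference
y = 0.5 * lag(x, 1) + 0.2 * lag(x, 2)    # (0.5 L + 0.2 L^2) X_t
print(dx, y)
```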


10 Moving Average (MA) process

The simplest way to construct a stationary process is to use a lag polynomial θ(L) with Σ_{j=0}^∞ θ_j² < ∞ to build a weighted moving average of white noise ε_t, i.e. the MA(q)

Y_t = θ(L) ε_t = ε_t + θ_1 ε_{t-1} + θ_2 ε_{t-2} + ... + θ_q ε_{t-q}

Example: MA(1), Y_t = ε_t + θ ε_{t-1} = (1 + θL) ε_t. Since E[Y_t] = 0,

γ(0) = E[Y_t Y_t] = E[(ε_t + θ ε_{t-1})(ε_t + θ ε_{t-1})] = σ²(1 + θ²)
γ(1) = E[Y_t Y_{t-1}] = E[(ε_t + θ ε_{t-1})(ε_{t-1} + θ ε_{t-2})] = σ² θ
γ(k) = E[Y_t Y_{t-k}] = E[(ε_t + θ ε_{t-1})(ε_{t-k} + θ ε_{t-k-1})] = 0 for k > 1

ρ(1) = γ(1)/γ(0) = θ/(1 + θ²),   ρ(k) = γ(k)/γ(0) = 0 for k > 1

Hence, while white noise is 0-correlated, an MA(1) is 1-correlated (only the first autocorrelation ρ(1) differs from zero).
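The MA(1) autocorrelations can be checked by simulation. A short sketch (seed and sample size are arbitrary choices, not from the slides): simulate Y_t = ε_t + θ ε_{t-1} and compare the sample ACF with ρ(1) = θ/(1 + θ²) and ρ(2) = 0.

```python
import numpy as np

rng = np.random.default_rng(0)
theta, n = 0.6, 200_000
eps = rng.normal(0.0, 1.0, n + 1)
y = eps[1:] + theta * eps[:-1]        # MA(1): Y_t = eps_t + theta * eps_{t-1}

def acf(x, k):
    """Sample autocorrelation at lag k."""
    x = x - x.mean()
    return (x[k:] * x[:-k]).sum() / (x * x).sum()

rho1_theory = theta / (1 + theta**2)  # = 0.4412...
print(acf(y, 1), rho1_theory, acf(y, 2))
```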


12 Properties of MA(q)

In general, for an MA(q) process (with θ_0 = 1) we have

γ(0) = σ²(1 + θ_1² + θ_2² + ... + θ_q²)
γ(k) = σ² Σ_{j=0}^{q-k} θ_j θ_{j+k} for k ≤ q,  γ(k) = 0 for k > q
ρ(k) = Σ_{j=0}^{q-k} θ_j θ_{j+k} / (1 + Σ_{j=1}^q θ_j²) for k ≤ q,  ρ(k) = 0 for k > q

Hence an MA(q) is q-correlated, and it can also be shown that any stationary q-correlated process can be represented as an MA(q).

Wold theorem: any zero-mean covariance stationary process can be represented as an MA(∞) plus a deterministic component, the two being uncorrelated.

But, given a q-correlated process, is the MA(q) representation unique? In general no: for a q-correlated process there are 2^q possible MA(q) with the same autocovariance structure. However, only one of them is invertible.


15 Invertibility conditions for MA

First consider the MA(1) case Y_t = (1 + θL) ε_t. Given the result

(1 + θL)^{-1} = 1 - θL + θ²L² - θ³L³ + θ⁴L⁴ - ... = Σ_{i=0}^∞ (-θL)^i

inverting the lag polynomial θ(L) we can write

(1 - θL + θ²L² - θ³L³ + ...) Y_t = ε_t

which can be viewed as an AR(∞) process. If an MA process can be written as an AR(∞) of this type, the MA representation is said to be invertible. For an MA(1) process the invertibility condition is |θ| < 1.

For a general MA(q) process Y_t = (1 + θ_1 L + θ_2 L² + ... + θ_q L^q) ε_t, the invertibility condition is that all roots of the lag polynomial equation 1 + θ_1 z + θ_2 z² + ... + θ_q z^q = 0 lie outside the unit circle. The MA(q) can then be written as an AR(∞) by inverting θ(L).

Invertibility also has an important practical consequence in applications: since the ε_t are not observable, they must be reconstructed from the observed Y's through the AR(∞) representation.


17 Auto-Regressive (AR) process

A general AR process is defined as φ(L) Y_t = ε_t. It is always invertible but not always stationary.

Example: AR(1), (1 - φL) Y_t = ε_t, or Y_t = φ Y_{t-1} + ε_t. Inverting the lag polynomial (1 - φL), the AR(1) can be written as

Y_t = (1 - φL)^{-1} ε_t = Σ_{i=0}^∞ (φL)^i ε_t = Σ_{i=0}^∞ φ^i ε_{t-i} = MA(∞)

hence the stationarity condition is |φ| < 1. From this representation we can apply the general MA formulas to compute γ(·) and ρ(·). In particular ρ(k) = φ^k, i.e. monotonic exponential decay for φ > 0 and exponentially damped oscillatory decay for φ < 0.

In general, an AR(p) process Y_t = φ_1 Y_{t-1} + φ_2 Y_{t-2} + ... + φ_p Y_{t-p} + ε_t is stationary if all roots of the characteristic equation of the lag polynomial, 1 - φ_1 z - φ_2 z² - ... - φ_p z^p = 0, lie outside the unit circle.
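The claim ρ(k) = φ^k can be verified directly from the MA(∞) weights, γ(k) = σ² Σ_i φ^i φ^{i+k}, without any simulation. A sketch (truncating the infinite sum; function name ours):

```python
import numpy as np

def ar1_acf(phi, k, n_terms=2000):
    """rho(k) of a stationary AR(1) from its MA(inf) form Y_t = sum_i phi^i eps_{t-i}:
    gamma(k) = sigma^2 * sum_i phi^i * phi^{i+k}, truncated at n_terms."""
    psi = phi ** np.arange(n_terms)                          # MA(inf) weights
    gamma0 = np.sum(psi * psi)
    gammak = np.sum(psi[:-k] * psi[k:]) if k > 0 else gamma0
    return gammak / gamma0

print(ar1_acf(0.8, 3))    # phi^3 = 0.512: monotonic decay
print(ar1_acf(-0.8, 3))   # (-0.8)^3 = -0.512: damped oscillatory decay
```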


19 State-space representation of AR(p)

To gain more intuition on the AR stationarity conditions, write an AR(p) in its state-space (companion) form X_t = F X_{t-1} + v_t, with X_t = (Y_t, Y_{t-1}, ..., Y_{t-p+1})', v_t = (ε_t, 0, ..., 0)', and

F = [ φ_1  φ_2  φ_3  ...  φ_{p-1}  φ_p
       1    0    0   ...    0       0
       0    1    0   ...    0       0
      ...
       0    0    0   ...    1       0 ]

Hence the expected value of X_t satisfies E[X_t] = F X_{t-1} and E[X_{t+j}] = F^{j+1} X_{t-1}, a linear map in R^p whose dynamic properties are governed by the eigenvalues of the matrix F. The eigenvalues of F solve the characteristic equation

λ^p - φ_1 λ^{p-1} - φ_2 λ^{p-2} - ... - φ_{p-1} λ - φ_p = 0

Comparing this with the characteristic equation of the lag polynomial φ(L),

1 - φ_1 z - φ_2 z² - ... - φ_{p-1} z^{p-1} - φ_p z^p = 0

we see that the roots of the two equations satisfy z_1 = λ_1^{-1}, z_2 = λ_2^{-1}, ..., z_p = λ_p^{-1}.
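The reciprocal relationship between the eigenvalues of F and the roots of the lag polynomial can be checked numerically. A sketch for an illustrative AR(2) (coefficients chosen by us):

```python
import numpy as np

phis = [0.5, 0.3]          # AR(2): Y_t = 0.5 Y_{t-1} + 0.3 Y_{t-2} + eps_t
p = len(phis)

# Companion matrix F of the state-space form X_t = F X_{t-1} + v_t
F = np.zeros((p, p))
F[0, :] = phis
F[1:, :-1] = np.eye(p - 1)

eig = np.sort(np.abs(np.linalg.eigvals(F)))

# Roots of 1 - phi_1 z - phi_2 z^2 = 0 (np.roots expects descending powers)
z = np.roots([-phis[1], -phis[0], 1.0])
inv_roots = np.sort(1.0 / np.abs(z))

print(eig, inv_roots)   # eigenvalues of F = reciprocals of the polynomial roots
# Stationarity: all |eigenvalues| < 1  <=>  all |roots| > 1
```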


21 ARMA(p,q)

An ARMA(p,q) process is defined as φ(L) Y_t = θ(L) ε_t, where φ(·) and θ(·) are p-th and q-th order lag polynomials.

The process is stationary if all roots of φ(z) = 1 - φ_1 z - φ_2 z² - ... - φ_p z^p = 0 lie outside the unit circle; it then admits the MA(∞) representation Y_t = φ(L)^{-1} θ(L) ε_t.

The process is invertible if all roots of θ(z) = 1 + θ_1 z + θ_2 z² + ... + θ_q z^q = 0 lie outside the unit circle; it then admits the AR(∞) representation ε_t = θ(L)^{-1} φ(L) Y_t.

22 Estimation

For an AR(p), Y_t = φ_1 Y_{t-1} + φ_2 Y_{t-2} + ... + φ_p Y_{t-p} + ε_t, OLS is consistent and, under Gaussianity, asymptotically equivalent to MLE and asymptotically efficient.

For example, the full likelihood of an AR(1) can be written as

L(θ) ≡ f(y_T, y_{T-1}, ..., y_1; θ) = f_{Y_1}(y_1; θ) · Π_{t=2}^T f_{Y_t|Y_{t-1}}(y_t | y_{t-1}; θ)

where the first factor is the marginal of the first observation and the product is the conditional likelihood; under normality, OLS = (conditional) MLE.

For a general ARMA(p,q), Y_t = φ_1 Y_{t-1} + ... + φ_p Y_{t-p} + ε_t + θ_1 ε_{t-1} + ... + θ_q ε_{t-q}, the regressor Y_{t-1} is correlated with ε_{t-1}, ..., ε_{t-q}, so OLS is not consistent; use MLE with numerical optimization procedures.
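The AR(1) case can be sketched in a few lines: simulate the process and run the OLS regression of Y_t on Y_{t-1}, which coincides with the conditional MLE under normality (seed and sample size are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(42)
phi_true, n = 0.7, 20_000
y = np.zeros(n)
eps = rng.normal(size=n)
for t in range(1, n):
    y[t] = phi_true * y[t - 1] + eps[t]   # simulate AR(1)

# OLS of Y_t on Y_{t-1}: phi_hat = (X'Y)/(X'X), consistent for phi
X, Y = y[:-1], y[1:]
phi_hat = (X @ Y) / (X @ X)
print(phi_hat)   # close to the true value 0.7
```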


24 Prediction

Write the model in its AR(∞) representation η(L)(Y_t - µ) = ε_t. Then the optimal prediction of Y_{t+s} is given by

E[Y_{t+s} | Y_t, Y_{t-1}, ...] = µ + [η(L)^{-1} / L^s]_+ η(L)(Y_t - µ),  with [L^k]_+ = 0 for k < 0

which is known as the Wiener-Kolmogorov prediction formula.

In the case of an AR(p) process the prediction formula can also be written as

E[Y_{t+s} | Y_t, Y_{t-1}, ...] = µ + f^(s)_{11}(Y_t - µ) + f^(s)_{12}(Y_{t-1} - µ) + ... + f^(s)_{1p}(Y_{t-p+1} - µ)

where f^(s)_{1i} denotes the (1, i) element of the matrix F^s. The easiest way to compute predictions from an AR(p) model is, however, through recursive methods.
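The recursive method mentioned above can be sketched directly: each forecast feeds back into the AR recursion in place of the unobserved future value. A minimal sketch for a zero-mean AR(p) (function name ours; for an AR(1) the s-step forecast must reduce to φ^s Y_t):

```python
def ar_forecast(phis, history, s):
    """s-step-ahead forecasts of a zero-mean AR(p) by recursion:
    each step applies Y_hat = sum_i phi_i * (lagged observed or forecast values)."""
    p = len(phis)
    buf = list(history[-p:])       # most recent p values, oldest first
    out = []
    for _ in range(s):
        f = sum(ph * v for ph, v in zip(phis, reversed(buf)))
        out.append(f)
        buf = buf[1:] + [f]        # the forecast becomes the newest "observation"
    return out

fc = ar_forecast([0.8], [2.0], 3)
print(fc)   # AR(1): [0.8*2, 0.8^2*2, 0.8^3*2] = [1.6, 1.28, 1.024]
```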


26 Box-Jenkins approach

Check for stationarity: if the series is not stationary, try a transformation (e.g. differencing, leading to ARIMA models).

Identification:
- Check the autocorrelation function (ACF): a q-correlated process suggests an MA(q) model.
- Check the partial autocorrelation function (PACF): for an AR(p) process, while the lag-k ACF can be interpreted as the simple regression Y_t = ρ(k) Y_{t-k} + error, the lag-k PACF can be seen as the multiple regression Y_t = b_1 Y_{t-1} + b_2 Y_{t-2} + ... + b_k Y_{t-k} + error. It can be computed by solving the Yule-Walker system:

(b_1, b_2, ..., b_k)' = [ γ(0)   γ(1)   ... γ(k-1)
                          γ(1)   γ(0)   ... γ(k-2)
                          ...
                          γ(k-1) γ(k-2) ... γ(0) ]^{-1} (γ(1), γ(2), ..., γ(k))'

Importantly, AR(p) processes are p-partially-correlated, which identifies the AR order.

Validation: check the appropriateness of the model by some measure of fit,

AIC (Akaike) = T log σ̂²_e + 2m
BIC (Schwarz) = T log σ̂²_e + m log T

with σ̂²_e the estimated error variance, m = p + q + 1 the number of parameters, and T the number of observations. Finally, perform diagnostic checking of the residuals.
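The Yule-Walker computation of the PACF can be sketched in a few lines (function name ours). Feeding it the theoretical autocovariances of an AR(1), γ(k) ∝ φ^k, the lag-1 PACF equals φ and the lag-2 PACF equals zero, exactly as the identification logic requires:

```python
import numpy as np

def pacf_from_gamma(gamma, k):
    """Lag-k PACF: solve the Yule-Walker system Gamma * b = (gamma(1),...,gamma(k))'
    and return the last coefficient b_k."""
    Gamma = np.array([[gamma[abs(i - j)] for j in range(k)] for i in range(k)])
    rhs = np.array([gamma[i + 1] for i in range(k)])
    b = np.linalg.solve(Gamma, rhs)
    return b[-1]

phi = 0.6
gamma = [phi**k for k in range(10)]     # AR(1) autocovariances (up to scale)
print(pacf_from_gamma(gamma, 1))        # = phi = 0.6
print(pacf_from_gamma(gamma, 2))        # = 0: an AR(1) is 1-partially-correlated
```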


29 ARIMA

Integrated ARMA model: ARIMA(p,1,q) denotes a nonstationary process Y_t whose first difference Y_t - Y_{t-1} = (1 - L) Y_t is a stationary ARMA(p,q) process; Y_t is then said to be integrated of order 1, or I(1). If two differencings of Y_t are necessary to obtain a stationary process, i.e. (1 - L)² Y_t, then Y_t is said to be integrated of order 2, or I(2). I(0) indicates a stationary process.


31 ARFIMA

The k-th difference operator (1 - L)^n with integer n can be generalized to a fractional difference operator (1 - L)^d with 0 < d < 1, defined by the binomial expansion

(1 - L)^d = 1 - dL + d(d-1)L²/2! - d(d-1)(d-2)L³/3! + ...

obtaining a fractionally integrated process of order d, i.e. I(d). If d < 0.5 the process is covariance stationary and admits an AR(∞) representation.

The usefulness of the fractional filter (1 - L)^d is that it produces hyperbolically decaying autocorrelations, the so-called long memory. In fact, for ARFIMA(p,d,q) processes φ(L)(1 - L)^d Y_t = θ(L) ε_t, the autocorrelation function is proportional to ρ(k) ≈ c k^{2d-1}.
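The binomial expansion above has a simple recursive form for its coefficients, π_0 = 1 and π_j = π_{j-1}(j - 1 - d)/j, which is how the fractional filter is usually applied in practice. A sketch (function name ours) checking the first terms against 1, -d, d(d-1)/2:

```python
import numpy as np

def fracdiff_weights(d, n):
    """Coefficients pi_j of (1 - L)^d = sum_j pi_j L^j, via the recursion
    pi_0 = 1, pi_j = pi_{j-1} * (j - 1 - d) / j."""
    w = np.empty(n)
    w[0] = 1.0
    for j in range(1, n):
        w[j] = w[j - 1] * (j - 1 - d) / j
    return w

w = fracdiff_weights(0.4, 6)
print(w[:3])   # 1, -d, d(d-1)/2 = 1.0, -0.4, -0.12
# The weights decay hyperbolically, not exponentially: the source of long memory.
```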


33 ARCH and GARCH models

34 Basic structure and properties of ARMA models

Standard time series models have:

Y_t = E[Y_t | Ω_{t-1}] + ε_t
E[Y_t | Ω_{t-1}] = f(Ω_{t-1}; θ)
Var[Y_t | Ω_{t-1}] = E[ε_t² | Ω_{t-1}] = σ²

Hence:
- Conditional mean: varies with Ω_{t-1}
- Conditional variance: constant (unfortunately)
- k-step-ahead mean forecasts: generally depend on Ω_{t-1}
- k-step-ahead variance forecasts: depend only on k, not on Ω_{t-1} (again unfortunately)
- Unconditional mean: constant
- Unconditional variance: constant

35 AutoRegressive Conditional Heteroskedasticity (ARCH) model

Engle (1982, Econometrica) introduced the ARCH model:

Y_t = E[Y_t | Ω_{t-1}] + ε_t
E[Y_t | Ω_{t-1}] = f(Ω_{t-1}; θ)
Var[Y_t | Ω_{t-1}] = E[ε_t² | Ω_{t-1}] = σ(Ω_{t-1}; θ) ≡ σ_t²

Hence:
- Conditional mean: varies with Ω_{t-1}
- Conditional variance: varies with Ω_{t-1}
- k-step-ahead mean forecasts: generally depend on Ω_{t-1}
- k-step-ahead variance forecasts: generally depend on Ω_{t-1}
- Unconditional mean: constant
- Unconditional variance: constant

36 ARCH(q)

How to parameterize E[ε_t² | Ω_{t-1}] = σ(Ω_{t-1}; θ) ≡ σ_t²? The ARCH(q) model postulates that the conditional variance is a linear function of the past q squared innovations:

σ_t² = ω + Σ_{i=1}^q α_i ε²_{t-i} = ω + α(L) ε²_{t-1}

Defining v_t = ε_t² - σ_t², the ARCH(q) model can be written as

ε_t² = ω + α(L) ε²_{t-1} + v_t

Since E_{t-1}(v_t) = 0, the model corresponds directly to an AR(q) model for the squared innovations ε_t². The process is covariance stationary if and only if the sum of the (positive) AR parameters is less than 1; the unconditional variance of ε_t is then Var(ε_t) = σ² = ω/(1 - α_1 - α_2 - ... - α_q).
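An ARCH(1) is straightforward to simulate, and the sample variance can be checked against the unconditional value ω/(1 - α). A sketch (seed and parameter values are arbitrary choices, not from the slides):

```python
import numpy as np

rng = np.random.default_rng(1)
omega, alpha, n = 0.2, 0.5, 400_000
eps = np.empty(n)
sig2 = omega / (1 - alpha)              # initialize at the unconditional variance
for t in range(n):
    eps[t] = np.sqrt(sig2) * rng.normal()
    sig2 = omega + alpha * eps[t]**2    # ARCH(1) recursion for sigma^2_{t+1}

print(eps.var(), omega / (1 - alpha))   # sample variance near omega/(1-alpha) = 0.4
```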


38 AR(1)-ARCH(1)

Example: the AR(1)-ARCH(1) model

Y_t = φ Y_{t-1} + ε_t,   ε_t ~ N(0, σ_t²),   σ_t² = ω + α ε²_{t-1}

- Conditional mean: E(Y_t | Ω_{t-1}) = φ Y_{t-1}
- Conditional variance: E([Y_t - E(Y_t | Ω_{t-1})]² | Ω_{t-1}) = ω + α ε²_{t-1}
- Unconditional mean: E(Y_t) = 0
- Unconditional variance: E[Y_t - E(Y_t)]² = ω / [(1 - φ²)(1 - α)]

Note that the unconditional distribution of ε_t has fat tails. In fact, for the unconditional kurtosis E(ε_t⁴)/E(ε_t²)² we have

E[ε_t⁴] = E[E(ε_t⁴ | Ω_{t-1})] = 3 E[σ_t⁴] = 3[Var(σ_t²) + E(σ_t²)²] = 3[Var(σ_t²) + E(ε_t²)²] > 3 E(ε_t²)²

since Var(σ_t²) > 0. Hence Kurtosis(ε_t) = E(ε_t⁴)/E(ε_t²)² > 3.
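The excess kurtosis is visible in simulation. A sketch (our parameter choices; for a Gaussian ARCH(1) with 3α² < 1 the theoretical kurtosis is 3(1 - α²)/(1 - 3α²), about 3.74 for α = 0.3):

```python
import numpy as np

rng = np.random.default_rng(7)
omega, alpha, n = 0.2, 0.3, 400_000
eps = np.empty(n)
sig2 = omega / (1 - alpha)
for t in range(n):
    eps[t] = np.sqrt(sig2) * rng.normal()
    sig2 = omega + alpha * eps[t]**2

kurt = np.mean(eps**4) / np.mean(eps**2)**2
print(kurt)   # clearly above the Gaussian value 3: fat tails
```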


40 GARCH(p,q)

Problem: empirical volatility is very persistent, which requires a large q, i.e. too many α's. Bollerslev (1986, J. of Econometrics) proposed the Generalized ARCH model. The GARCH(p,q) is defined as

σ_t² = ω + Σ_{i=1}^q α_i ε²_{t-i} + Σ_{j=1}^p β_j σ²_{t-j} = ω + α(L) ε²_{t-1} + β(L) σ²_{t-1}

As before, the GARCH(p,q) can be rewritten as

ε_t² = ω + [α(L) + β(L)] ε²_{t-1} - β(L) v_{t-1} + v_t

which defines an ARMA[max(p,q), p] model for ε_t².


42 GARCH(1,1)

By far the most commonly used is the GARCH(1,1):

σ_t² = ω + α ε²_{t-1} + β σ²_{t-1},   with ω > 0, α > 0, β > 0.

By recursive substitution, the GARCH(1,1) may be written as the following ARCH(∞):

σ_t² = ω/(1 - β) + α Σ_{i=1}^∞ β^{i-1} ε²_{t-i}

which reduces to an exponentially weighted moving average filter for ω = 0 and α + β = 1 (sometimes referred to as Integrated GARCH, or IGARCH(1,1)).

Moreover, the GARCH(1,1) implies an ARMA(1,1) representation in ε_t²:

ε_t² = ω + (α + β) ε²_{t-1} - β v_{t-1} + v_t

Forecasting: denoting the unconditional variance σ² ≡ ω(1 - α - β)^{-1}, we have

σ̂²_{t+h|t} = σ² + (α + β)^{h-1} (σ²_{t+1} - σ²)

showing that forecasts of the conditional variance revert to the long-run unconditional variance at an exponential rate dictated by α + β.
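The forecast formula can be checked against the underlying recursion σ̂²_{t+h} = ω + (α + β) σ̂²_{t+h-1}, since future shocks drop out in expectation. A sketch with illustrative parameter values (ours, not from the slides):

```python
omega, alpha, beta = 0.1, 0.08, 0.90
sigma2_lr = omega / (1 - alpha - beta)   # long-run (unconditional) variance = 5.0
s2_next = 8.0                            # sigma^2_{t+1}: current volatility above the long run

# h-step forecasts by recursion: E_t[sigma^2_{t+h}] = omega + (alpha+beta) E_t[sigma^2_{t+h-1}]
fc, s2 = [], s2_next
for h in range(1, 21):
    fc.append(s2)
    s2 = omega + (alpha + beta) * s2

# Closed form from the slide: sigma^2 + (alpha+beta)^{h-1} (sigma^2_{t+1} - sigma^2)
closed = [sigma2_lr + (alpha + beta)**(h - 1) * (s2_next - sigma2_lr)
          for h in range(1, 21)]
print(fc[0], fc[19], closed[19])   # forecasts decay geometrically toward 5.0
```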


44 Asymmetric GARCH

In the standard GARCH model, σ_t² = ω + α r²_{t-1} + β σ²_{t-1}, σ_t² responds symmetrically to past returns: the so-called news impact curve is a parabola.

[Figure: news impact curves, conditional variance vs standardized lagged shocks, for a symmetric and an asymmetric GARCH]

Empirically, negative r_{t-1} impact more than positive ones: an asymmetric news impact curve.

GJR or T-GARCH:

σ_t² = ω + α r²_{t-1} + γ r²_{t-1} D_{t-1} + β σ²_{t-1},   with D_t = 1 if r_t < 0 and 0 otherwise

- Positive returns (good news): impact α
- Negative returns (bad news): impact α + γ
- Empirically γ > 0: leverage effect

Exponential GARCH (EGARCH):

ln(σ_t²) = ω + α |r_{t-1}| / σ_{t-1} + γ r_{t-1} / σ_{t-1} + β ln(σ²_{t-1})
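The GJR news impact curve is easy to evaluate directly: hold σ²_{t-1} fixed and vary the lagged return. A sketch with illustrative parameter values (ours):

```python
def gjr_news_impact(r, omega=0.05, alpha=0.04, gamma=0.08, beta=0.9, sigma2=1.0):
    """Next-period variance of a GJR/T-GARCH as a function of the lagged return r,
    holding sigma^2_{t-1} fixed at sigma2 (the news impact curve)."""
    D = 1.0 if r < 0 else 0.0
    return omega + alpha * r**2 + gamma * r**2 * D + beta * sigma2

good = gjr_news_impact(1.0)    # positive shock: curvature alpha        -> 0.99
bad = gjr_news_impact(-1.0)    # negative shock: curvature alpha+gamma  -> 1.07
print(good, bad)               # equal-sized bad news raises variance more: leverage effect
```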


46 Estimation

A GARCH process with Gaussian innovations, r_t | Ω_{t-1} ~ N(µ_t(θ), σ_t²(θ)), has conditional densities

f(r_t | Ω_{t-1}; θ) = (1 / (σ_t(θ) √(2π))) exp( -(1/2) (r_t - µ_t(θ))² / σ_t²(θ) )

Using the prediction error decomposition of the likelihood,

L(r_T, r_{T-1}, ..., r_1; θ) = f(r_T | Ω_{T-1}; θ) · f(r_{T-1} | Ω_{T-2}; θ) · ... · f(r_1 | Ω_0; θ)

the log-likelihood becomes

log L(r_T, r_{T-1}, ..., r_1; θ) = -(T/2) log(2π) - Σ_{t=1}^T log σ_t(θ) - (1/2) Σ_{t=1}^T (r_t - µ_t(θ))² / σ_t²(θ)

This is a nonlinear function of θ: numerical optimization techniques are required.
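The log-likelihood itself is cheap to evaluate; it is only its maximization over θ that needs a numerical optimizer. A sketch of the objective for a zero-mean GARCH(1,1) (function name and the choice to initialize σ²_1 at the unconditional variance are ours):

```python
import numpy as np

def garch11_loglik(params, r):
    """Gaussian log-likelihood of a zero-mean GARCH(1,1) via the prediction
    error decomposition; sigma^2_1 is initialized at the unconditional variance."""
    omega, alpha, beta = params
    T = len(r)
    sig2 = np.empty(T)
    sig2[0] = omega / (1 - alpha - beta)
    for t in range(1, T):
        sig2[t] = omega + alpha * r[t - 1]**2 + beta * sig2[t - 1]
    return (-0.5 * T * np.log(2 * np.pi)
            - 0.5 * np.sum(np.log(sig2))
            - 0.5 * np.sum(r**2 / sig2))

r = np.array([0.5, -1.0, 0.8])
ll = garch11_loglik((0.1, 0.1, 0.8), r)
print(ll)   # one finite number, to be maximized numerically over (omega, alpha, beta)
```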


More information

Autoregressive Moving Average (ARMA) Models and their Practical Applications

Autoregressive Moving Average (ARMA) Models and their Practical Applications Autoregressive Moving Average (ARMA) Models and their Practical Applications Massimo Guidolin February 2018 1 Essential Concepts in Time Series Analysis 1.1 Time Series and Their Properties Time series:

More information

Lecture 1: Fundamental concepts in Time Series Analysis (part 2)

Lecture 1: Fundamental concepts in Time Series Analysis (part 2) Lecture 1: Fundamental concepts in Time Series Analysis (part 2) Florian Pelgrin University of Lausanne, École des HEC Department of mathematics (IMEA-Nice) Sept. 2011 - Jan. 2012 Florian Pelgrin (HEC)

More information

ECON 616: Lecture 1: Time Series Basics

ECON 616: Lecture 1: Time Series Basics ECON 616: Lecture 1: Time Series Basics ED HERBST August 30, 2017 References Overview: Chapters 1-3 from Hamilton (1994). Technical Details: Chapters 2-3 from Brockwell and Davis (1987). Intuition: Chapters

More information

Heteroskedasticity in Time Series

Heteroskedasticity in Time Series Heteroskedasticity in Time Series Figure: Time Series of Daily NYSE Returns. 206 / 285 Key Fact 1: Stock Returns are Approximately Serially Uncorrelated Figure: Correlogram of Daily Stock Market Returns.

More information

Marcel Dettling. Applied Time Series Analysis SS 2013 Week 05. ETH Zürich, March 18, Institute for Data Analysis and Process Design

Marcel Dettling. Applied Time Series Analysis SS 2013 Week 05. ETH Zürich, March 18, Institute for Data Analysis and Process Design Marcel Dettling Institute for Data Analysis and Process Design Zurich University of Applied Sciences marcel.dettling@zhaw.ch http://stat.ethz.ch/~dettling ETH Zürich, March 18, 2013 1 Basics of Modeling

More information

Econometrics II Heij et al. Chapter 7.1

Econometrics II Heij et al. Chapter 7.1 Chapter 7.1 p. 1/2 Econometrics II Heij et al. Chapter 7.1 Linear Time Series Models for Stationary data Marius Ooms Tinbergen Institute Amsterdam Chapter 7.1 p. 2/2 Program Introduction Modelling philosophy

More information

Introduction to Estimation Methods for Time Series models Lecture 2

Introduction to Estimation Methods for Time Series models Lecture 2 Introduction to Estimation Methods for Time Series models Lecture 2 Fulvio Corsi SNS Pisa Fulvio Corsi Introduction to Estimation () Methods for Time Series models Lecture 2 SNS Pisa 1 / 21 Estimators:

More information

STAT Financial Time Series

STAT Financial Time Series STAT 6104 - Financial Time Series Chapter 4 - Estimation in the time Domain Chun Yip Yau (CUHK) STAT 6104:Financial Time Series 1 / 46 Agenda 1 Introduction 2 Moment Estimates 3 Autoregressive Models (AR

More information

Univariate Time Series Analysis; ARIMA Models

Univariate Time Series Analysis; ARIMA Models Econometrics 2 Fall 24 Univariate Time Series Analysis; ARIMA Models Heino Bohn Nielsen of4 Outline of the Lecture () Introduction to univariate time series analysis. (2) Stationarity. (3) Characterizing

More information

1 Linear Difference Equations

1 Linear Difference Equations ARMA Handout Jialin Yu 1 Linear Difference Equations First order systems Let {ε t } t=1 denote an input sequence and {y t} t=1 sequence generated by denote an output y t = φy t 1 + ε t t = 1, 2,... with

More information

Lecture 2: Univariate Time Series

Lecture 2: Univariate Time Series Lecture 2: Univariate Time Series Analysis: Conditional and Unconditional Densities, Stationarity, ARMA Processes Prof. Massimo Guidolin 20192 Financial Econometrics Spring/Winter 2017 Overview Motivation:

More information

Nonlinear time series

Nonlinear time series Based on the book by Fan/Yao: Nonlinear Time Series Robert M. Kunst robert.kunst@univie.ac.at University of Vienna and Institute for Advanced Studies Vienna October 27, 2009 Outline Characteristics of

More information

Some Time-Series Models

Some Time-Series Models Some Time-Series Models Outline 1. Stochastic processes and their properties 2. Stationary processes 3. Some properties of the autocorrelation function 4. Some useful models Purely random processes, random

More information

Classic Time Series Analysis

Classic Time Series Analysis Classic Time Series Analysis Concepts and Definitions Let Y be a random number with PDF f Y t ~f,t Define t =E[Y t ] m(t) is known as the trend Define the autocovariance t, s =COV [Y t,y s ] =E[ Y t t

More information

Ch. 15 Forecasting. 1.1 Forecasts Based on Conditional Expectations

Ch. 15 Forecasting. 1.1 Forecasts Based on Conditional Expectations Ch 15 Forecasting Having considered in Chapter 14 some of the properties of ARMA models, we now show how they may be used to forecast future values of an observed time series For the present we proceed

More information

University of Oxford. Statistical Methods Autocorrelation. Identification and Estimation

University of Oxford. Statistical Methods Autocorrelation. Identification and Estimation University of Oxford Statistical Methods Autocorrelation Identification and Estimation Dr. Órlaith Burke Michaelmas Term, 2011 Department of Statistics, 1 South Parks Road, Oxford OX1 3TG Contents 1 Model

More information

Lecture 3: Autoregressive Moving Average (ARMA) Models and their Practical Applications

Lecture 3: Autoregressive Moving Average (ARMA) Models and their Practical Applications Lecture 3: Autoregressive Moving Average (ARMA) Models and their Practical Applications Prof. Massimo Guidolin 20192 Financial Econometrics Winter/Spring 2018 Overview Moving average processes Autoregressive

More information

Permanent Income Hypothesis (PIH) Instructor: Dmytro Hryshko

Permanent Income Hypothesis (PIH) Instructor: Dmytro Hryshko Permanent Income Hypothesis (PIH) Instructor: Dmytro Hryshko 1 / 36 The PIH Utility function is quadratic, u(c t ) = 1 2 (c t c) 2 ; borrowing/saving is allowed using only the risk-free bond; β(1 + r)

More information

Module 3. Descriptive Time Series Statistics and Introduction to Time Series Models

Module 3. Descriptive Time Series Statistics and Introduction to Time Series Models Module 3 Descriptive Time Series Statistics and Introduction to Time Series Models Class notes for Statistics 451: Applied Time Series Iowa State University Copyright 2015 W Q Meeker November 11, 2015

More information

Lecture 1: Stationary Time Series Analysis

Lecture 1: Stationary Time Series Analysis Syllabus Stationarity ARMA AR MA Model Selection Estimation Lecture 1: Stationary Time Series Analysis 222061-1617: Time Series Econometrics Spring 2018 Jacek Suda Syllabus Stationarity ARMA AR MA Model

More information

Review Session: Econometrics - CLEFIN (20192)

Review Session: Econometrics - CLEFIN (20192) Review Session: Econometrics - CLEFIN (20192) Part II: Univariate time series analysis Daniele Bianchi March 20, 2013 Fundamentals Stationarity A time series is a sequence of random variables x t, t =

More information

1 Class Organization. 2 Introduction

1 Class Organization. 2 Introduction Time Series Analysis, Lecture 1, 2018 1 1 Class Organization Course Description Prerequisite Homework and Grading Readings and Lecture Notes Course Website: http://www.nanlifinance.org/teaching.html wechat

More information

Autoregressive and Moving-Average Models

Autoregressive and Moving-Average Models Chapter 3 Autoregressive and Moving-Average Models 3.1 Introduction Let y be a random variable. We consider the elements of an observed time series {y 0,y 1,y2,...,y t } as being realizations of this randoms

More information

Stationary Stochastic Time Series Models

Stationary Stochastic Time Series Models Stationary Stochastic Time Series Models When modeling time series it is useful to regard an observed time series, (x 1,x,..., x n ), as the realisation of a stochastic process. In general a stochastic

More information

Econometría 2: Análisis de series de Tiempo

Econometría 2: Análisis de series de Tiempo Econometría 2: Análisis de series de Tiempo Karoll GOMEZ kgomezp@unal.edu.co http://karollgomez.wordpress.com Segundo semestre 2016 III. Stationary models 1 Purely random process 2 Random walk (non-stationary)

More information

at least 50 and preferably 100 observations should be available to build a proper model

at least 50 and preferably 100 observations should be available to build a proper model III Box-Jenkins Methods 1. Pros and Cons of ARIMA Forecasting a) need for data at least 50 and preferably 100 observations should be available to build a proper model used most frequently for hourly or

More information

TIME SERIES AND FORECASTING. Luca Gambetti UAB, Barcelona GSE Master in Macroeconomic Policy and Financial Markets

TIME SERIES AND FORECASTING. Luca Gambetti UAB, Barcelona GSE Master in Macroeconomic Policy and Financial Markets TIME SERIES AND FORECASTING Luca Gambetti UAB, Barcelona GSE 2014-2015 Master in Macroeconomic Policy and Financial Markets 1 Contacts Prof.: Luca Gambetti Office: B3-1130 Edifici B Office hours: email:

More information

Stat 5100 Handout #12.e Notes: ARIMA Models (Unit 7) Key here: after stationary, identify dependence structure (and use for forecasting)

Stat 5100 Handout #12.e Notes: ARIMA Models (Unit 7) Key here: after stationary, identify dependence structure (and use for forecasting) Stat 5100 Handout #12.e Notes: ARIMA Models (Unit 7) Key here: after stationary, identify dependence structure (and use for forecasting) (overshort example) White noise H 0 : Let Z t be the stationary

More information

5 Transfer function modelling

5 Transfer function modelling MSc Further Time Series Analysis 5 Transfer function modelling 5.1 The model Consider the construction of a model for a time series (Y t ) whose values are influenced by the earlier values of a series

More information

Chapter 4: Models for Stationary Time Series

Chapter 4: Models for Stationary Time Series Chapter 4: Models for Stationary Time Series Now we will introduce some useful parametric models for time series that are stationary processes. We begin by defining the General Linear Process. Let {Y t

More information

2. An Introduction to Moving Average Models and ARMA Models

2. An Introduction to Moving Average Models and ARMA Models . An Introduction to Moving Average Models and ARMA Models.1 White Noise. The MA(1) model.3 The MA(q) model..4 Estimation and forecasting of MA models..5 ARMA(p,q) models. The Moving Average (MA) models

More information

Midterm Suggested Solutions

Midterm Suggested Solutions CUHK Dept. of Economics Spring 2011 ECON 4120 Sung Y. Park Midterm Suggested Solutions Q1 (a) In time series, autocorrelation measures the correlation between y t and its lag y t τ. It is defined as. ρ(τ)

More information

Ch. 19 Models of Nonstationary Time Series

Ch. 19 Models of Nonstationary Time Series Ch. 19 Models of Nonstationary Time Series In time series analysis we do not confine ourselves to the analysis of stationary time series. In fact, most of the time series we encounter are non stationary.

More information

Univariate ARIMA Models

Univariate ARIMA Models Univariate ARIMA Models ARIMA Model Building Steps: Identification: Using graphs, statistics, ACFs and PACFs, transformations, etc. to achieve stationary and tentatively identify patterns and model components.

More information

Volatility. Gerald P. Dwyer. February Clemson University

Volatility. Gerald P. Dwyer. February Clemson University Volatility Gerald P. Dwyer Clemson University February 2016 Outline 1 Volatility Characteristics of Time Series Heteroskedasticity Simpler Estimation Strategies Exponentially Weighted Moving Average Use

More information

Time Series Analysis

Time Series Analysis Time Series Analysis Christopher Ting http://mysmu.edu.sg/faculty/christophert/ christopherting@smu.edu.sg Quantitative Finance Singapore Management University March 3, 2017 Christopher Ting Week 9 March

More information

Lecture note 2 considered the statistical analysis of regression models for time

Lecture note 2 considered the statistical analysis of regression models for time DYNAMIC MODELS FOR STATIONARY TIME SERIES Econometrics 2 LectureNote4 Heino Bohn Nielsen March 2, 2007 Lecture note 2 considered the statistical analysis of regression models for time series data, and

More information

Lecture 1: Stationary Time Series Analysis

Lecture 1: Stationary Time Series Analysis Syllabus Stationarity ARMA AR MA Model Selection Estimation Forecasting Lecture 1: Stationary Time Series Analysis 222061-1617: Time Series Econometrics Spring 2018 Jacek Suda Syllabus Stationarity ARMA

More information

Quantitative Finance I

Quantitative Finance I Quantitative Finance I Linear AR and MA Models (Lecture 4) Winter Semester 01/013 by Lukas Vacha * If viewed in.pdf format - for full functionality use Mathematica 7 (or higher) notebook (.nb) version

More information

4. MA(2) +drift: y t = µ + ɛ t + θ 1 ɛ t 1 + θ 2 ɛ t 2. Mean: where θ(l) = 1 + θ 1 L + θ 2 L 2. Therefore,

4. MA(2) +drift: y t = µ + ɛ t + θ 1 ɛ t 1 + θ 2 ɛ t 2. Mean: where θ(l) = 1 + θ 1 L + θ 2 L 2. Therefore, 61 4. MA(2) +drift: y t = µ + ɛ t + θ 1 ɛ t 1 + θ 2 ɛ t 2 Mean: y t = µ + θ(l)ɛ t, where θ(l) = 1 + θ 1 L + θ 2 L 2. Therefore, E(y t ) = µ + θ(l)e(ɛ t ) = µ 62 Example: MA(q) Model: y t = ɛ t + θ 1 ɛ

More information

Time Series Analysis. James D. Hamilton PRINCETON UNIVERSITY PRESS PRINCETON, NEW JERSEY

Time Series Analysis. James D. Hamilton PRINCETON UNIVERSITY PRESS PRINCETON, NEW JERSEY Time Series Analysis James D. Hamilton PRINCETON UNIVERSITY PRESS PRINCETON, NEW JERSEY & Contents PREFACE xiii 1 1.1. 1.2. Difference Equations First-Order Difference Equations 1 /?th-order Difference

More information

Econometric Forecasting

Econometric Forecasting Robert M. Kunst robert.kunst@univie.ac.at University of Vienna and Institute for Advanced Studies Vienna October 1, 2014 Outline Introduction Model-free extrapolation Univariate time-series models Trend

More information

Class 1: Stationary Time Series Analysis

Class 1: Stationary Time Series Analysis Class 1: Stationary Time Series Analysis Macroeconometrics - Fall 2009 Jacek Suda, BdF and PSE February 28, 2011 Outline Outline: 1 Covariance-Stationary Processes 2 Wold Decomposition Theorem 3 ARMA Models

More information

Ch. 14 Stationary ARMA Process

Ch. 14 Stationary ARMA Process Ch. 14 Stationary ARMA Process A general linear stochastic model is described that suppose a time series to be generated by a linear aggregation of random shock. For practical representation it is desirable

More information

Problem Set 6 Solution

Problem Set 6 Solution Problem Set 6 Solution May st, 009 by Yang. Causal Expression of AR Let φz : αz βz. Zeros of φ are α and β, both of which are greater than in absolute value by the assumption in the question. By the theorem

More information

FE570 Financial Markets and Trading. Stevens Institute of Technology

FE570 Financial Markets and Trading. Stevens Institute of Technology FE570 Financial Markets and Trading Lecture 5. Linear Time Series Analysis and Its Applications (Ref. Joel Hasbrouck - Empirical Market Microstructure ) Steve Yang Stevens Institute of Technology 9/25/2012

More information

Lecture on ARMA model

Lecture on ARMA model Lecture on ARMA model Robert M. de Jong Ohio State University Columbus, OH 43210 USA Chien-Ho Wang National Taipei University Taipei City, 104 Taiwan ROC October 19, 2006 (Very Preliminary edition, Comment

More information

Prof. Dr. Roland Füss Lecture Series in Applied Econometrics Summer Term Introduction to Time Series Analysis

Prof. Dr. Roland Füss Lecture Series in Applied Econometrics Summer Term Introduction to Time Series Analysis Introduction to Time Series Analysis 1 Contents: I. Basics of Time Series Analysis... 4 I.1 Stationarity... 5 I.2 Autocorrelation Function... 9 I.3 Partial Autocorrelation Function (PACF)... 14 I.4 Transformation

More information

Trend-Cycle Decompositions

Trend-Cycle Decompositions Trend-Cycle Decompositions Eric Zivot April 22, 2005 1 Introduction A convenient way of representing an economic time series y t is through the so-called trend-cycle decomposition y t = TD t + Z t (1)

More information

Time Series: Theory and Methods

Time Series: Theory and Methods Peter J. Brockwell Richard A. Davis Time Series: Theory and Methods Second Edition With 124 Illustrations Springer Contents Preface to the Second Edition Preface to the First Edition vn ix CHAPTER 1 Stationary

More information

Ch 4. Models For Stationary Time Series. Time Series Analysis

Ch 4. Models For Stationary Time Series. Time Series Analysis This chapter discusses the basic concept of a broad class of stationary parametric time series models the autoregressive moving average (ARMA) models. Let {Y t } denote the observed time series, and {e

More information

3 Theory of stationary random processes

3 Theory of stationary random processes 3 Theory of stationary random processes 3.1 Linear filters and the General linear process A filter is a transformation of one random sequence {U t } into another, {Y t }. A linear filter is a transformation

More information

EASTERN MEDITERRANEAN UNIVERSITY ECON 604, FALL 2007 DEPARTMENT OF ECONOMICS MEHMET BALCILAR ARIMA MODELS: IDENTIFICATION

EASTERN MEDITERRANEAN UNIVERSITY ECON 604, FALL 2007 DEPARTMENT OF ECONOMICS MEHMET BALCILAR ARIMA MODELS: IDENTIFICATION ARIMA MODELS: IDENTIFICATION A. Autocorrelations and Partial Autocorrelations 1. Summary of What We Know So Far: a) Series y t is to be modeled by Box-Jenkins methods. The first step was to convert y t

More information

Time Series 3. Robert Almgren. Sept. 28, 2009

Time Series 3. Robert Almgren. Sept. 28, 2009 Time Series 3 Robert Almgren Sept. 28, 2009 Last time we discussed two main categories of linear models, and their combination. Here w t denotes a white noise: a stationary process with E w t ) = 0, E

More information

Lecture 6: Univariate Volatility Modelling: ARCH and GARCH Models

Lecture 6: Univariate Volatility Modelling: ARCH and GARCH Models Lecture 6: Univariate Volatility Modelling: ARCH and GARCH Models Prof. Massimo Guidolin 019 Financial Econometrics Winter/Spring 018 Overview ARCH models and their limitations Generalized ARCH models

More information

TIME SERIES ANALYSIS. Forecasting and Control. Wiley. Fifth Edition GWILYM M. JENKINS GEORGE E. P. BOX GREGORY C. REINSEL GRETA M.

TIME SERIES ANALYSIS. Forecasting and Control. Wiley. Fifth Edition GWILYM M. JENKINS GEORGE E. P. BOX GREGORY C. REINSEL GRETA M. TIME SERIES ANALYSIS Forecasting and Control Fifth Edition GEORGE E. P. BOX GWILYM M. JENKINS GREGORY C. REINSEL GRETA M. LJUNG Wiley CONTENTS PREFACE TO THE FIFTH EDITION PREFACE TO THE FOURTH EDITION

More information

Problem Set 2: Box-Jenkins methodology

Problem Set 2: Box-Jenkins methodology Problem Set : Box-Jenkins methodology 1) For an AR1) process we have: γ0) = σ ε 1 φ σ ε γ0) = 1 φ Hence, For a MA1) process, p lim R = φ γ0) = 1 + θ )σ ε σ ε 1 = γ0) 1 + θ Therefore, p lim R = 1 1 1 +

More information

Time Series 2. Robert Almgren. Sept. 21, 2009

Time Series 2. Robert Almgren. Sept. 21, 2009 Time Series 2 Robert Almgren Sept. 21, 2009 This week we will talk about linear time series models: AR, MA, ARMA, ARIMA, etc. First we will talk about theory and after we will talk about fitting the models

More information

3. ARMA Modeling. Now: Important class of stationary processes

3. ARMA Modeling. Now: Important class of stationary processes 3. ARMA Modeling Now: Important class of stationary processes Definition 3.1: (ARMA(p, q) process) Let {ɛ t } t Z WN(0, σ 2 ) be a white noise process. The process {X t } t Z is called AutoRegressive-Moving-Average

More information

1 Time Series Concepts and Challenges

1 Time Series Concepts and Challenges Forecasting Time Series Data Notes from Rebecca Sela Stern Business School Spring, 2004 1 Time Series Concepts and Challenges The linear regression model (and most other models) assume that observations

More information

{ } Stochastic processes. Models for time series. Specification of a process. Specification of a process. , X t3. ,...X tn }

{ } Stochastic processes. Models for time series. Specification of a process. Specification of a process. , X t3. ,...X tn } Stochastic processes Time series are an example of a stochastic or random process Models for time series A stochastic process is 'a statistical phenomenon that evolves in time according to probabilistic

More information

STAT 443 Final Exam Review. 1 Basic Definitions. 2 Statistical Tests. L A TEXer: W. Kong

STAT 443 Final Exam Review. 1 Basic Definitions. 2 Statistical Tests. L A TEXer: W. Kong STAT 443 Final Exam Review L A TEXer: W Kong 1 Basic Definitions Definition 11 The time series {X t } with E[X 2 t ] < is said to be weakly stationary if: 1 µ X (t) = E[X t ] is independent of t 2 γ X

More information

ECONOMETRICS Part II PhD LBS

ECONOMETRICS Part II PhD LBS ECONOMETRICS Part II PhD LBS Luca Gambetti UAB, Barcelona GSE February-March 2014 1 Contacts Prof.: Luca Gambetti email: luca.gambetti@uab.es webpage: http://pareto.uab.es/lgambetti/ Description This is

More information

A time series is called strictly stationary if the joint distribution of every collection (Y t

A time series is called strictly stationary if the joint distribution of every collection (Y t 5 Time series A time series is a set of observations recorded over time. You can think for example at the GDP of a country over the years (or quarters) or the hourly measurements of temperature over a

More information

Lecture 2: ARMA(p,q) models (part 2)

Lecture 2: ARMA(p,q) models (part 2) Lecture 2: ARMA(p,q) models (part 2) Florian Pelgrin University of Lausanne, École des HEC Department of mathematics (IMEA-Nice) Sept. 2011 - Jan. 2012 Florian Pelgrin (HEC) Univariate time series Sept.

More information

Time Series Models and Inference. James L. Powell Department of Economics University of California, Berkeley

Time Series Models and Inference. James L. Powell Department of Economics University of California, Berkeley Time Series Models and Inference James L. Powell Department of Economics University of California, Berkeley Overview In contrast to the classical linear regression model, in which the components of the

More information

Chapter 6: Model Specification for Time Series

Chapter 6: Model Specification for Time Series Chapter 6: Model Specification for Time Series The ARIMA(p, d, q) class of models as a broad class can describe many real time series. Model specification for ARIMA(p, d, q) models involves 1. Choosing

More information

Time Series Analysis. James D. Hamilton PRINCETON UNIVERSITY PRESS PRINCETON, NEW JERSEY

Time Series Analysis. James D. Hamilton PRINCETON UNIVERSITY PRESS PRINCETON, NEW JERSEY Time Series Analysis James D. Hamilton PRINCETON UNIVERSITY PRESS PRINCETON, NEW JERSEY PREFACE xiii 1 Difference Equations 1.1. First-Order Difference Equations 1 1.2. pth-order Difference Equations 7

More information

The autocorrelation and autocovariance functions - helpful tools in the modelling problem

The autocorrelation and autocovariance functions - helpful tools in the modelling problem The autocorrelation and autocovariance functions - helpful tools in the modelling problem J. Nowicka-Zagrajek A. Wy lomańska Institute of Mathematics and Computer Science Wroc law University of Technology,

More information

Univariate Nonstationary Time Series 1

Univariate Nonstationary Time Series 1 Univariate Nonstationary Time Series 1 Sebastian Fossati University of Alberta 1 These slides are based on Eric Zivot s time series notes available at: http://faculty.washington.edu/ezivot Introduction

More information

Ch 6. Model Specification. Time Series Analysis

Ch 6. Model Specification. Time Series Analysis We start to build ARIMA(p,d,q) models. The subjects include: 1 how to determine p, d, q for a given series (Chapter 6); 2 how to estimate the parameters (φ s and θ s) of a specific ARIMA(p,d,q) model (Chapter

More information

ECON/FIN 250: Forecasting in Finance and Economics: Section 7: Unit Roots & Dickey-Fuller Tests

ECON/FIN 250: Forecasting in Finance and Economics: Section 7: Unit Roots & Dickey-Fuller Tests ECON/FIN 250: Forecasting in Finance and Economics: Section 7: Unit Roots & Dickey-Fuller Tests Patrick Herb Brandeis University Spring 2016 Patrick Herb (Brandeis University) Unit Root Tests ECON/FIN

More information

Applied time-series analysis

Applied time-series analysis Robert M. Kunst robert.kunst@univie.ac.at University of Vienna and Institute for Advanced Studies Vienna October 18, 2011 Outline Introduction and overview Econometric Time-Series Analysis In principle,

More information

TMA4285 December 2015 Time series models, solution.

TMA4285 December 2015 Time series models, solution. Norwegian University of Science and Technology Department of Mathematical Sciences Page of 5 TMA4285 December 205 Time series models, solution. Problem a) (i) The slow decay of the ACF of z t suggest that

More information

18.S096 Problem Set 4 Fall 2013 Time Series Due Date: 10/15/2013

18.S096 Problem Set 4 Fall 2013 Time Series Due Date: 10/15/2013 18.S096 Problem Set 4 Fall 2013 Time Series Due Date: 10/15/2013 1. Covariance Stationary AR(2) Processes Suppose the discrete-time stochastic process {X t } follows a secondorder auto-regressive process

More information

ARIMA Models. Jamie Monogan. January 16, University of Georgia. Jamie Monogan (UGA) ARIMA Models January 16, / 27

ARIMA Models. Jamie Monogan. January 16, University of Georgia. Jamie Monogan (UGA) ARIMA Models January 16, / 27 ARIMA Models Jamie Monogan University of Georgia January 16, 2018 Jamie Monogan (UGA) ARIMA Models January 16, 2018 1 / 27 Objectives By the end of this meeting, participants should be able to: Argue why

More information

GARCH processes probabilistic properties (Part 1)

GARCH processes probabilistic properties (Part 1) GARCH processes probabilistic properties (Part 1) Alexander Lindner Centre of Mathematical Sciences Technical University of Munich D 85747 Garching Germany lindner@ma.tum.de http://www-m1.ma.tum.de/m4/pers/lindner/

More information

6.3 Forecasting ARMA processes

6.3 Forecasting ARMA processes 6.3. FORECASTING ARMA PROCESSES 123 6.3 Forecasting ARMA processes The purpose of forecasting is to predict future values of a TS based on the data collected to the present. In this section we will discuss

More information

Time Series Econometrics 4 Vijayamohanan Pillai N

Time Series Econometrics 4 Vijayamohanan Pillai N Time Series Econometrics 4 Vijayamohanan Pillai N Vijayamohan: CDS MPhil: Time Series 5 1 Autoregressive Moving Average Process: ARMA(p, q) Vijayamohan: CDS MPhil: Time Series 5 2 1 Autoregressive Moving

More information

Univariate Volatility Modeling

Univariate Volatility Modeling Univariate Volatility Modeling Kevin Sheppard http://www.kevinsheppard.com Oxford MFE This version: January 10, 2013 January 14, 2013 Financial Econometrics (Finally) This term Volatility measurement and

More information

Problem Set 1 Solution Sketches Time Series Analysis Spring 2010

Problem Set 1 Solution Sketches Time Series Analysis Spring 2010 Problem Set 1 Solution Sketches Time Series Analysis Spring 2010 1. Construct a martingale difference process that is not weakly stationary. Simplest e.g.: Let Y t be a sequence of independent, non-identically

More information

A Data-Driven Model for Software Reliability Prediction

A Data-Driven Model for Software Reliability Prediction A Data-Driven Model for Software Reliability Prediction Author: Jung-Hua Lo IEEE International Conference on Granular Computing (2012) Young Taek Kim KAIST SE Lab. 9/4/2013 Contents Introduction Background

More information

Econometrics of financial markets, -solutions to seminar 1. Problem 1

Econometrics of financial markets, -solutions to seminar 1. Problem 1 Econometrics of financial markets, -solutions to seminar 1. Problem 1 a) Estimate with OLS. For any regression y i α + βx i + u i for OLS to be unbiased we need cov (u i,x j )0 i, j. For the autoregressive

More information

We will only present the general ideas on how to obtain. follow closely the AR(1) and AR(2) cases presented before.

We will only present the general ideas on how to obtain. follow closely the AR(1) and AR(2) cases presented before. ACF and PACF of an AR(p) We will only present the general ideas on how to obtain the ACF and PACF of an AR(p) model since the details follow closely the AR(1) and AR(2) cases presented before. Recall that

More information

MAT 3379 (Winter 2016) FINAL EXAM (SOLUTIONS)

MAT 3379 (Winter 2016) FINAL EXAM (SOLUTIONS) MAT 3379 (Winter 2016) FINAL EXAM (SOLUTIONS) 15 April 2016 (180 minutes) Professor: R. Kulik Student Number: Name: This is closed book exam. You are allowed to use one double-sided A4 sheet of notes.

More information

Non-Stationary Time Series and Unit Root Testing

Non-Stationary Time Series and Unit Root Testing Econometrics II Non-Stationary Time Series and Unit Root Testing Morten Nyboe Tabor Course Outline: Non-Stationary Time Series and Unit Root Testing 1 Stationarity and Deviation from Stationarity Trend-Stationarity

More information

ARIMA Models. Richard G. Pierse

ARIMA Models. Richard G. Pierse ARIMA Models Richard G. Pierse 1 Introduction Time Series Analysis looks at the properties of time series from a purely statistical point of view. No attempt is made to relate variables using a priori

More information

ECON3327: Financial Econometrics, Spring 2016

ECON3327: Financial Econometrics, Spring 2016 ECON3327: Financial Econometrics, Spring 2016 Wooldridge, Introductory Econometrics (5th ed, 2012) Chapter 11: OLS with time series data Stationary and weakly dependent time series The notion of a stationary

More information

LECTURE 10 LINEAR PROCESSES II: SPECTRAL DENSITY, LAG OPERATOR, ARMA. In this lecture, we continue to discuss covariance stationary processes.

LECTURE 10 LINEAR PROCESSES II: SPECTRAL DENSITY, LAG OPERATOR, ARMA. In this lecture, we continue to discuss covariance stationary processes. MAY, 0 LECTURE 0 LINEAR PROCESSES II: SPECTRAL DENSITY, LAG OPERATOR, ARMA In this lecture, we continue to discuss covariance stationary processes. Spectral density Gourieroux and Monfort 990), Ch. 5;

More information

Lecture 3: Autoregressive Moving Average (ARMA) Models and their Practical Applications

Lecture 3: Autoregressive Moving Average (ARMA) Models and their Practical Applications Lecture 3: Autoregressive Moving Average (ARMA) Models and their Practical Applications Prof. Massimo Guidolin 20192 Financial Econometrics Winter/Spring 2018 Overview Moving average processes Autoregressive

More information

Ross Bettinger, Analytical Consultant, Seattle, WA

Ross Bettinger, Analytical Consultant, Seattle, WA ABSTRACT DYNAMIC REGRESSION IN ARIMA MODELING Ross Bettinger, Analytical Consultant, Seattle, WA Box-Jenkins time series models that contain exogenous predictor variables are called dynamic regression

More information