Review Session: Econometrics - CLEFIN (20192)
Part II: Univariate time series analysis
Daniele Bianchi
March 20, 2013
Fundamentals
Stationarity

A time series is a sequence of random variables x_t, t = 1, ..., T, usually measured at equal intervals. The building block of time series analysis is the concept of stationarity. There are two main formulations:

Strict stationarity: a time series x_t is said to be strictly stationary if the joint distribution of (x_{t_1}, ..., x_{t_k}) is identical to that of (x_{t_1+m}, ..., x_{t_k+m}) for all m and for any set of indices t_1, ..., t_k, with k an arbitrary positive integer. This is a very strong assumption which is hard to verify empirically.

Weak (or covariance) stationarity: a time series x_t is said to be covariance stationary if

  E[x_t] = μ for all t,    Cov[x_t, x_{t−l}] = γ_l for all t and all l,

i.e. the mean is constant and the autocovariance depends only on the lag l.
Fundamentals
Stationarity

A simple example of a stationary process is the Gaussian white noise:

  x_t = ε_t,  ε_t ~ N(0, σ²)

  E[x_t] = 0,  Var(x_t) = σ²,  γ_l = 0 for all l ≠ 0
[Figure: a simulated Gaussian white noise path (T = 1000) and its sample autocorrelation function, lags 0-20.]
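To illustrate, here is a minimal sketch in Python (numpy and statsmodels assumed available; the series length and seed are arbitrary choices) that simulates Gaussian white noise and verifies the moments and ACF stated above:

```python
import numpy as np
from statsmodels.tsa.stattools import acf

rng = np.random.default_rng(0)          # arbitrary seed
sigma = 1.0
x = rng.normal(0.0, sigma, size=1000)   # Gaussian white noise, T = 1000

print(x.mean())                         # close to E[x_t] = 0
print(x.var())                          # close to Var(x_t) = sigma^2
print(acf(x, nlags=20))                 # ~1 at lag 0, ~0 at all other lags
```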
A simple autoregressive model
Formulation and distribution properties

A simple AR(1) model is defined as

  x_t = α + φ x_{t−1} + ε_t,  with ε_t ~ iid(0, σ²)

This simple model entails a first-order Markov dependence structure, since

  E_{t−1}[x_t] = α + φ x_{t−1},  Var_{t−1}(x_t) = σ²

Let us define E(x_t) = μ and Var(x_t) = γ_0. Under stationarity the unconditional moments are given by

  μ = α + φμ,  so that μ = α / (1 − φ)
  γ_0 = φ² γ_0 + σ²,  so that γ_0 = σ² / (1 − φ²)
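A quick numerical check of the unconditional moments (a sketch; the parameter values α = 0.5, φ = 0.8 are arbitrary assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)
alpha, phi, sigma = 0.5, 0.8, 1.0     # arbitrary stationary parameters (|phi| < 1)
T = 100_000

x = np.empty(T)
x[0] = alpha / (1 - phi)              # start at the unconditional mean
for t in range(1, T):
    x[t] = alpha + phi * x[t - 1] + rng.normal(0.0, sigma)

print(x.mean(), alpha / (1 - phi))        # sample vs. theoretical mean
print(x.var(), sigma**2 / (1 - phi**2))   # sample vs. theoretical variance
```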
A simple autoregressive model
Properties: Stationarity

The standard AR(1) model is stationary if |φ| < 1. Indeed, writing the model in lag-operator form (setting α = 0 for simplicity),

  (1 − φL) x_t = ε_t

solving 1 − φz = 0 gives the root z = 1/φ, and |1/φ| > 1 (the root lies outside the unit circle) whenever |φ| < 1.

Given stationarity, the AR(1) can be decomposed as a linear combination of white noise terms:

  x_t = α / (1 − φ) + Σ_{i=0}^{∞} φ^i ε_{t−i}

This can easily be proved by starting with x_2 = α + φ x_1 + ε_2 and then iterating forward.
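The equivalence of the recursive formulation and the moving-average representation can be checked numerically; this is a sketch under assumed parameter values, with the infinite sum truncated at a lag where φ^i is negligible:

```python
import numpy as np

rng = np.random.default_rng(1)
alpha, phi, sigma, T = 0.5, 0.8, 1.0, 500
eps = rng.normal(0.0, sigma, size=T)

# Recursive simulation, started at the unconditional mean plus the first shock
x = np.empty(T)
x[0] = alpha / (1 - phi) + eps[0]
for t in range(1, T):
    x[t] = alpha + phi * x[t - 1] + eps[t]

# Truncated MA(infinity) representation: alpha/(1-phi) + sum_i phi^i eps_{t-i}
K = 200                                  # truncation lag; phi**K is negligible
weights = phi ** np.arange(K)
x_ma = alpha / (1 - phi) + np.convolve(eps, weights)[:T]

print(np.max(np.abs(x - x_ma)))          # essentially zero
```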
[Figure: simulated AR(1) paths, T = 1000, with φ = 0.9 (top panel) and φ = 0.1 (bottom panel).]
A simple autoregressive model
Properties: Autocorrelation

The autocovariance function can easily be derived as

  γ_l = φγ_1 + σ²   if l = 0
  γ_l = φγ_{l−1}    if l > 0

Using the standard definition of correlation, the autocorrelation function is

  ρ_l = γ_l / γ_0 = φ γ_{l−1} / γ_0 = φ ρ_{l−1}   for l > 0

and since ρ_0 = 1, we have ρ_l = φ^l. The autocorrelation, i.e. the persistence of the process, depends on the autoregressive coefficient in the simple AR(1) model.
[Figure: ACF ρ_l for φ = 0.8 (top panel, geometric decay) and φ = −0.8 (bottom panel, oscillating decay), lags 0-20.]
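A sketch comparing the sample ACF of a simulated AR(1) with the theoretical ρ_l = φ^l (the parameter value and sample size are assumptions for illustration):

```python
import numpy as np
from statsmodels.tsa.arima_process import ArmaProcess
from statsmodels.tsa.stattools import acf

phi = 0.8
process = ArmaProcess(ar=[1, -phi], ma=[1])   # (1 - phi L) x_t = eps_t
x = process.generate_sample(nsample=5000)

lags = np.arange(21)
print(phi ** lags)                            # theoretical ACF: rho_l = phi^l
print(acf(x, nlags=20))                       # sample ACF, close to phi^l
```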
The general AR(p) model

The AR(1) model can be generalized to p lags as

  x_t = α + Σ_{i=1}^{p} φ_i x_{t−i} + ε_t

The unconditional moments are defined as

  E(x_t) = α / (1 − Σ_{i=1}^{p} φ_i)

  Var(x_t) = γ_0 = Σ_{i=1}^{p} φ_i γ_i + σ²

  Cov(x_t, x_{t−j}) = γ_j = φ_1 γ_{j−1} + φ_2 γ_{j−2} + ... + φ_p γ_{j−p}

Dividing the autocovariances by γ_0, we get the ACF as

  ρ_j = Σ_{i=1}^{p} φ_i ρ_{j−i}
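These recursions are the Yule-Walker equations, which can also be used to estimate the coefficients. A sketch (the AR(2) coefficients are arbitrary assumptions):

```python
import numpy as np
from statsmodels.regression.linear_model import yule_walker
from statsmodels.tsa.arima_process import ArmaProcess

phi = np.array([0.5, 0.3])                        # arbitrary stationary AR(2)
process = ArmaProcess(ar=np.r_[1, -phi], ma=[1])
x = process.generate_sample(nsample=10_000)

print(process.acf(lags=6))                        # theoretical ACF from the recursion
phi_hat, sigma_hat = yule_walker(x, order=2)
print(phi_hat)                                    # Yule-Walker estimates, close to phi
```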
A simple Moving Average model
Formulation and distribution properties

A simple MA(1) model is defined as

  x_t = α + ε_t + θ ε_{t−1},  with ε_t ~ iid(0, σ²)

The conditional moments can easily be defined as

  E_{t−1}[x_t] = α + θ ε_{t−1},  Var_{t−1}(x_t) = σ²

The unconditional moments can be defined as

  E(x_t) = α
  Var(x_t) = γ_0 = σ² + θ²σ² = (1 + θ²)σ²
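A numerical check of the unconditional moments (a sketch with arbitrary illustrative values for α, θ and σ):

```python
import numpy as np

rng = np.random.default_rng(2)
alpha, theta, sigma, T = 0.1, 0.9, 1.0, 100_000

eps = rng.normal(0.0, sigma, size=T + 1)
x = alpha + eps[1:] + theta * eps[:-1]      # x_t = alpha + eps_t + theta * eps_{t-1}

print(x.mean(), alpha)                      # sample vs. theoretical mean
print(x.var(), (1 + theta**2) * sigma**2)   # sample vs. (1 + theta^2) sigma^2
```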
[Figure: simulated MA(1) path with θ = 0.1 (top panel) and MA(12) path with θ_i = 0.9 for i = 1, ..., 12 (bottom panel), T = 1000.]
A simple Moving Average model
Properties: Autocorrelation

The autocovariance function can be defined as

  γ_1 = E[(x_t − α)(x_{t−1} − α)] = E[(ε_t + θε_{t−1})(ε_{t−1} + θε_{t−2})] = σ²θ

with γ_l = 0 for l > 1, so that the autocorrelation function is

  ρ_1 = γ_1 / γ_0 = σ²θ / ((1 + θ²)σ²) = θ / (1 + θ²)

The autocorrelation depends on the moving average parameter θ.
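Checking ρ_1 = θ/(1 + θ²) on simulated data (a sketch; θ = 0.9 is an arbitrary assumption):

```python
import numpy as np
from statsmodels.tsa.stattools import acf

rng = np.random.default_rng(3)
theta, T = 0.9, 100_000
eps = rng.normal(size=T + 1)
x = eps[1:] + theta * eps[:-1]      # MA(1) with alpha = 0

print(theta / (1 + theta**2))       # theoretical rho_1, about 0.497
print(acf(x, nlags=3))              # sample rho_1 matches; rho_l ~ 0 for l > 1
```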
[Figure: sample ACF of an MA(1): ρ_1 for θ = 0.1 (top panel) and θ = 0.9 (bottom panel), lags 0-20.]
The ARMA(1,1) model
Properties

We can combine the AR(1) and the MA(1) as

  x_t = c_1 x_{t−1} + ε_t + a_1 ε_{t−1}
  (1 − c_1 L) x_t = (1 + a_1 L) ε_t

  x_t = (1 + a_1 L) / (1 − c_1 L) ε_t
      = (1 + a_1 L)(1 + c_1 L + c_1² L² + ...) ε_t
      = [1 + (a_1 + c_1)L + c_1(a_1 + c_1)L² + c_1²(a_1 + c_1)L³ + ...] ε_t

Now the unconditional moments can be derived under the assumption of weak stationarity:

  Var(x_t) = [1 + (a_1 + c_1)² + c_1²(a_1 + c_1)² + ...] σ²_ε
           = [1 + (a_1 + c_1)² / (1 − c_1²)] σ²_ε

We can clearly see that Var(x_t) → ∞ as c_1 → 1.
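A simulation check of the variance formula (a sketch; c_1 = 0.7 and a_1 = 0.4 are assumptions):

```python
import numpy as np

rng = np.random.default_rng(4)
c1, a1, sigma, T = 0.7, 0.4, 1.0, 200_000

eps = rng.normal(0.0, sigma, size=T)
x = np.zeros(T)
for t in range(1, T):
    x[t] = c1 * x[t - 1] + eps[t] + a1 * eps[t - 1]

var_theory = (1 + (a1 + c1)**2 / (1 - c1**2)) * sigma**2
print(x.var(), var_theory)      # sample variance vs. the formula above
```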
The General ARMA(p,q) model

The ARMA(1,1) model can be generalized to an order p for the autoregressive part and an order q for the moving average part, such that

  x_t = ρ_1 x_{t−1} + ρ_2 x_{t−2} + ... + ρ_p x_{t−p} + ε_t + θ_1 ε_{t−1} + θ_2 ε_{t−2} + ... + θ_q ε_{t−q}

  (1 − ρ_1 L − ρ_2 L² − ... − ρ_p L^p) x_t = (1 + θ_1 L + θ_2 L² + ... + θ_q L^q) ε_t

so that, writing ρ(L) x_t = θ(L) ε_t,

  x_t = θ(L)/ρ(L) ε_t = Φ(L) ε_t
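The weights of Φ(L) = θ(L)/ρ(L) can be computed numerically; a sketch with arbitrary ARMA(2,1) coefficients:

```python
from statsmodels.tsa.arima_process import arma2ma

ar = [1, -0.5, -0.2]             # rho(L) = 1 - 0.5 L - 0.2 L^2 (illustrative values)
ma = [1, 0.4]                    # theta(L) = 1 + 0.4 L

psi = arma2ma(ar, ma, lags=10)   # first 10 weights of Phi(L) = theta(L)/rho(L)
print(psi)                       # psi[0] = 1; weights decay under stationarity
```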
Estimating the ARMA model
The Box-Jenkins approach

Step 1: Make sure that the time series is stationary (e.g. with the Augmented Dickey-Fuller test). If it is not stationary, take first differences.
Step 2: Model selection (information criteria).
Step 3: Model checking (residual tests).
Example: Step 1
Check for stationarity

Test for stationarity of the x_t variable (i.e. US stock market returns) by using the Augmented Dickey-Fuller regression

  x_t = c + δ̂ x_{t−1} + Σ_{i=0}^{k} φ̂_i Δx_{t−i−1} + ε̂_t

                          t-statistic    Prob.
  ADF test statistic        -19.068      0.0000
  Test critical values:
    1% level                 -3.445
    5% level                 -2.868
    10% level                -2.570

Notice the t-stat is defined as t = (δ̂ − 1) / SE(δ̂), where the null hypothesis (a unit root) is H_0: δ = 1.
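In Python this step could be carried out as in the following sketch (statsmodels' adfuller; the return series here is a random placeholder, not the actual data):

```python
import numpy as np
from statsmodels.tsa.stattools import adfuller

rng = np.random.default_rng(5)
returns = rng.normal(0.008, 0.05, size=432)   # placeholder for US stock returns

stat, pvalue, usedlag, nobs, crit, icbest = adfuller(returns, autolag="AIC")
print(stat, pvalue)   # strongly negative statistic / tiny p-value: reject a unit root
print(crit)           # 1%, 5%, 10% critical values
```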
Example: Step 2
Model selection

The ACF and the PACF might be misleading; the lag structure of the ARMA(p,q) can instead be investigated by using information criteria.

AIC: the Akaike information criterion is defined as AIC = −2 log(L) + 2(p + q)
SBC: the Schwarz Bayesian information criterion is defined as SBC = −2 log(L) + log(T)(p + q)

where L is the value of the maximized likelihood and T the number of observations.

  Model        AIC     SBC
  ARMA(1,1)   -3.38   -3.35
  ARMA(1,2)   -3.35   -3.33
  ARMA(1,3)   -3.29   -3.31
  ARMA(2,1)   -3.37   -3.33
  ARMA(3,1)   -3.37   -3.33

The ARMA(1,1) minimizes both criteria.
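A sketch of such a grid search with statsmodels (the return series is again a placeholder; note that statsmodels reports AIC/BIC on the −2 log L scale rather than per observation):

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(6)
returns = rng.normal(0.008, 0.05, size=432)   # placeholder for the return series

# Fit a small grid of ARMA(p,q) models and compare information criteria
for p, q in [(1, 1), (1, 2), (1, 3), (2, 1), (3, 1)]:
    res = ARIMA(returns, order=(p, 0, q)).fit()
    print(f"ARMA({p},{q}): AIC={res.aic:.2f}  BIC={res.bic:.2f}")
```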
Example: Step 3
Model estimates and checking

  Variable    Coefficient   Std. Error   t-statistic    Prob.
  C               0.008        0.002        3.57        0.0004
  AR(1)          -0.649        0.214       -3.029       0.0026
  MA(1)           0.737        0.190        3.876       0.0001

  Adjusted R-squared        0.014
  Log likelihood          729.80
  Akaike info criterion    -3.381
  Schwarz criterion        -3.352
  Hannan-Quinn criter.     -3.369
[Figure: distribution (histogram) of the ARMA(1,1) residuals.]
[Figure: residuals autocorrelation function, lags 0-20.]
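The residual checks could look like the following sketch (Ljung-Box test plus residual ACF; the fitted model and placeholder data mirror the previous steps and are assumptions):

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA
from statsmodels.stats.diagnostic import acorr_ljungbox
from statsmodels.tsa.stattools import acf

rng = np.random.default_rng(7)
returns = rng.normal(0.008, 0.05, size=432)    # placeholder for the return series

res = ARIMA(returns, order=(1, 0, 1)).fit()    # the ARMA(1,1) selected in Step 2
resid = res.resid

print(acf(resid, nlags=20))                    # should be ~0 at all lags > 0
print(acorr_ljungbox(resid, lags=[10, 20]))    # high p-values: residuals look like white noise
```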