1 ECO 6375 Econometrics III
Daniel L. Millimet
Southern Methodist University
Fall 2018
DL Millimet (SMU) ECO 6375 Fall / 150

2 Time Series: Introduction
TS models can be grouped into two categories
- Models aimed at forecasting
  Focus is (should be) on ŷ, not β
  Causation is not a concern; the goal is only obtaining the most accurate forecasts
- Models aimed at estimating dynamic causal effects
  Causation is (typically) not the same as in microeconometrics
  A prediction-based approach to causality is used, attributable to Granger (1969) and Sims (1972)
  One variable causes another if the current value of the variable in question helps predict future values of the outcome variable
  Thus, the definition of causation is explicitly tied to the information set available at each point in time; additional information added to the information set may render a previously causal relationship spurious
  Some movement in time series toward a micro view of causation (structural VARs, etc.)

3 Regardless of the aim, TS analysis is a bit more complex than CS analysis since...
- The notion of repeated sampling vanishes
- Many CS results rely on independent observations, which is very unlikely with TS data
Consider a scalar random variable, Y, defined at integer points in time, denoted t = 0, ±1, ±2, ...
- A time series process is a sequence of observations, {y_t}_{t=-∞}^{∞}, regarded as one realization of a stochastic process (i.e., for each value of t, y_t is drawn from a distribution or population of y_t's)
- A time series is a single realization of a random event
- Statistical results are not based on the notion of random sampling from a population as in CS, but on the dbn of statistics constructed from this realization in a time window, t = 1, ..., T
- Asymptotic results are based on increasing the length of the time window

4 Let f_{Y_t}(y_t) denote the pdf of Y_t, which in general may depend on t
- If this pdf depends on t, then each element in the realization is drawn from a different dbn
- Population moments
  E[y_t] = μ_t = ∫ y_t f_{Y_t}(y_t) dy_t
  E[(y_t - μ_t)²] = γ_{0t} = ∫ (y_t - μ_t)² f_{Y_t}(y_t) dy_t

Definition. The jth autocovariance of a time series process, {y_t}_{t=-∞}^{∞}, denoted γ_{jt}, is Cov(y_t, y_{t-j}), where y_{t-j} is the jth lag. The jth autocorrelation (or serial correlation coefficient) is Corr(y_t, y_{t-j}) = Cov(y_t, y_{t-j}) / [√Var(y_t) √Var(y_{t-j})].

5 Notes:
- In principle, all population moments are indexed by t
- This implies estimation would require a sample of observations on Y for each t, which is not possible
- In practice, one wishes to make inferences about the statistical properties of the variable Y from a single, finite realization or set of T observations, {y_t}_{t=1}^T
- To proceed requires more structure

6 Time Series: Stationarity & Ergodicity
Definition. A time series process, {y_t}_{t=-∞}^{∞}, is strongly stationary (or strictly stationary or stationary) if the joint dbn of any set of k observations in the sequence, {y_t, y_{t+1}, ..., y_{t+k}}, is the same regardless of the origin period, t.

Definition. A time series process, {y_t}_{t=-∞}^{∞}, is weakly stationary (or covariance stationary or second-order stationary) if E[y_t] is finite and identical for all t and if the covariance between any two observations (i.e., the autocovariance), Cov(y_t, y_{t-k}), is a finite function only of model parameters and their distance apart in time, k, but not of t. In other words, μ_t and γ_{jt}, j = 0, ±1, ±2, ..., exist for all t and j and are independent of t.

7 Notes:
- Stationarity begins to place sufficient structure to allow estimation, as μ_t = μ and γ_{jt} = γ_j for all t
- Weak stationarity is implied by stationarity
- While there is little practical difference between weak stationarity and stationarity, technically weak stationarity is all that is required in TS analysis
- A Gaussian (or normal) process is stationary if it is weakly stationary, since the dbn is completely characterized by the first two moments
- Joint stationarity refers to the multivariate dbn of multiple processes: {y_{1t}, y_{2t}}_{t=-∞}^{∞}
- Autocovariance refers to covariances between a variable and its lags; cross-covariance refers to covariances between processes

8 Consistency requires additional structure
Definition. A stationary time series process, {y_t}_{t=-∞}^{∞}, is ergodic if for functions f : R^a → R^1 and g : R^b → R^1,
  lim_{k→∞} E[f(y_t, y_{t+1}, ..., y_{t+a}) g(y_{t+k}, y_{t+k+1}, ..., y_{t+k+b})]
    = E[f(y_t, y_{t+1}, ..., y_{t+a})] E[g(y_{t+k}, y_{t+k+1}, ..., y_{t+k+b})]
- Ergodicity is less intuitive
- It states that if observations are separated by enough time, they are asymptotically independent
- This implies that each observation contains at least some unique information, which is necessary for consistent estimation
- Stationarity does not guarantee ergodicity
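The practical payoff of ergodicity is that a time average over one long realization converges to the population mean. A minimal simulation sketch (parameter values are illustrative, not from the slides): a stationary AR(1) has unconditional mean α/(1 - γ), and the average of a single realization should approach it.

```python
import random

# Sketch: one long realization of a stationary AR(1),
# y_t = a + g*y_{t-1} + e_t, with illustrative values a = 1, g = 0.5.
# Ergodicity implies the time average converges to the
# ensemble (population) mean a/(1-g) = 2.0.
random.seed(42)
a, g, sigma = 1.0, 0.5, 1.0
T = 200_000
y = a / (1 - g)                      # start at the unconditional mean
total = 0.0
for _ in range(T):
    y = a + g * y + random.gauss(0.0, sigma)
    total += y
time_average = total / T
population_mean = a / (1 - g)
```

For a nonstationary process such as a random walk, no such convergence occurs, which is why the structure above is needed before estimation from a single realization makes sense.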

9 Time Series: Model Types
Models and methods vary depending on the type of data involved
- How many time series processes are being modeled?
  1 Univariate time series: {y_t}
  2 Multivariate time series: {y_{1t}, ..., y_{pt}}
- Are other covariates introduced into the model?
  1 Univariate time series: {y_t, x_t}
  2 Multivariate time series: {y_{1t}, ..., y_{pt}, x_t}
Discussion that follows proceeds accordingly

10 Time Series: Common Univariate Time Series Processes
To start, consider univariate processes with no other covariates
Basic building block: white noise
Definition. A time series process, {y_t}_{t=-∞}^{∞}, is white noise if E[y_t] = 0, E[y_t²] = σ_y², and E[y_t y_s] = 0 ∀ t ≠ s. Note, the final requirement is weaker than independence between y_t and y_s. The process is independent white noise if y_t and y_s are also independent for all t ≠ s. The process is Gaussian white noise if y_t ~ N(0, σ_y²).

11 More complex processes
1 A moving average process, denoted MA(q), is given by
  y_t = μ + Σ_{s=0}^q θ_s ε_{t-s}
  where θ_0 = 1 and {ε_t}_{t=-∞}^{∞} is a time series process (and thus stochastic)
2 An autoregressive process, denoted AR(p), is given by
  y_t = α + Σ_{s=1}^p γ_s y_{t-s} + ε_t
  where {ε_t}_{t=-∞}^{∞} is a time series process (and thus stochastic)
3 Models can be combined, denoted as ARMA(p, q):
  y_t = α + Σ_{s=1}^p γ_s y_{t-s} + Σ_{s=0}^q θ_s ε_{t-s}
  where {ε_t}_{t=-∞}^{∞} is a time series process (and thus stochastic)
4 A random walk process is given by
  y_t = y_{t-1} + ε_t
  where {ε_t}_{t=-∞}^{∞} is a time series process (and thus stochastic)
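The four building blocks above can be simulated directly from their defining recursions. A minimal sketch, with Gaussian white noise and illustrative coefficients (0.5 for the AR and MA terms):

```python
import random

# Simulate the building blocks: white noise, MA(1), AR(1), random walk.
# Coefficient value 0.5 is illustrative; mu = alpha = 0 throughout.
random.seed(0)
T = 500
e = [random.gauss(0.0, 1.0) for _ in range(T)]       # white noise

ma1 = [e[t] + 0.5 * e[t - 1] for t in range(1, T)]   # MA(1)

ar1 = [0.0]                                          # AR(1)
for t in range(1, T):
    ar1.append(0.5 * ar1[-1] + e[t])

rw = [0.0]                                           # random walk
for t in range(1, T):
    rw.append(rw[-1] + e[t])
```

Note the random walk is the AR(1) recursion with coefficient 1, i.e., the accumulated sum of the shocks, which is why its variance grows with t.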

12 Roadmap
1 Recognizing processes from their time series plots, autocorrelation structure, and partial autocorrelation structure
2 Under what circumstances are they stationary?
3 Estimation

13 [Figure: simulated time series plots of six AR(1)-type processes: y = 0.9L.y + e; y = 0.5L.y + e; y = -0.5L.y + e; y = 1.2L.y + e; y = -1.2L.y + e; y = L.y + e (random walk). Note: e ~ N(0,0.1).]

14 [Figure: simulated time series plots of MA(1) and ARMA(1,1) processes: y = e + 0.5L.e; y = e - 0.5L.e; y = e - L.e; y = e + 1.2L.e; y = 0.5L.y + e + 0.5L.e; y = -0.5L.y + e + 0.5L.e. Note: e ~ N(0,0.1).]

15 Time Series: Autocorrelation and Partial Autocorrelation Functions
- Preliminary analysis of a time series variable can be accomplished by simply plotting the variable over time
  Stata: -tsline-
- The dependence across time in a variable can be assessed using the autocorrelation (AC) and partial autocorrelation (PAC) functions
Definition. A correlogram is a plot of the autocorrelation function against time. Convergence to zero is a necessary but not sufficient condition for a process to be stationary.

16 The estimated autocovariance fn for a variable, y_t, t = 1, ..., T, is defined for 0 ≤ v < T as
  R(v) = (1/T) Σ_{s=1}^{T-v} (y_s - ȳ)(y_{s+v} - ȳ)
where division is typically by T
The estimated AC fn is
  ρ̂_v = R(v)/R(0) = Σ_{s=1}^{T-v} (y_s - ȳ)(y_{s+v} - ȳ) / Σ_{s=1}^T (y_s - ȳ)²
with variance given by
  Var(ρ̂_v) = 1/T                              if v = 1
  Var(ρ̂_v) = (1/T)[1 + 2 Σ_{s=1}^{v-1} ρ̂_s²]  if v > 1
based on Bartlett's (1946) formula for MA(q) processes, which assumes stationarity and normal errors
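The estimated AC fn above translates directly into code. A minimal sketch (division by T and deviations from the full-sample mean, exactly as in the formula):

```python
# Sample ACF: rho_hat_v = R(v)/R(0), with R(v) the sample autocovariance
# using division by T, as on the slide.
def acf(y, max_lag):
    T = len(y)
    ybar = sum(y) / T
    d = [yi - ybar for yi in y]
    R = [sum(d[s] * d[s + v] for s in range(T - v)) / T
         for v in range(max_lag + 1)]
    return [R[v] / R[0] for v in range(max_lag + 1)]

# Tiny worked example: for y = 1..5, rho_hat_1 = (4/5) / 2 = 0.4
rho = acf([1.0, 2.0, 3.0, 4.0, 5.0], 1)
```

By construction ρ̂_0 = 1, and each ρ̂_v can be compared against the Bartlett standard errors to judge significance.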

17 The PAC at lag v measures the correlation between y_t and y_{t-v} holding y_{t-1}, ..., y_{t-(v-1)} fixed
- One estimation procedure is based on the OLS regression
  ỹ_t = Σ_{s=1}^{v-1} β_s ỹ_{t-s} + φ_{vv} ỹ_{t-v} + ε_t
  where ỹ denotes the standardized y
- Variance given by Var(φ̂_{vv}) = 1/T
- Distribution theory underlying the AC and PAC fns assumes that y is stationary
- Stata: -corrgram-, -ac-, -pac-
- Enders (2004) has examples of true values
Examples follow...
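The slide's PAC estimator runs one OLS regression per lag. A common matrix-free alternative (a sketch, not the slide's exact procedure) is the Durbin-Levinson recursion, which maps the autocorrelations ρ_1, ..., ρ_m into the partial autocorrelations φ_11, ..., φ_mm:

```python
# Durbin-Levinson recursion: PACF from the ACF.
# rho[j] is the autocorrelation at lag j, with rho[0] = 1.
def pacf_from_acf(rho, max_lag):
    phi_prev = [0.0, rho[1]]            # phi_{1,1} = rho_1
    out = [rho[1]]
    for k in range(2, max_lag + 1):
        num = rho[k] - sum(phi_prev[j] * rho[k - j] for j in range(1, k))
        den = 1.0 - sum(phi_prev[j] * rho[j] for j in range(1, k))
        phi_kk = num / den
        phi = [0.0] * (k + 1)
        for j in range(1, k):           # update the intermediate coefficients
            phi[j] = phi_prev[j] - phi_kk * phi_prev[k - j]
        phi[k] = phi_kk
        out.append(phi_kk)
        phi_prev = phi
    return out                          # [phi_11, phi_22, ..., phi_mm]

# For an AR(1) with gamma = 0.5, rho_j = 0.5**j, so the PACF should
# spike at lag 1 (0.5) and be zero afterwards.
pacf = pacf_from_acf([0.5 ** j for j in range(4)], 3)
```

This reproduces the textbook pattern previewed in the Enders table below: an AR(1) has a single PACF spike at lag 1.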

18 [Figure: time series plot and ACF/PACF, white noise process. Note: y = e; e ~ N(0,0.25).]

19 [Figure: time series plot and ACF/PACF, MA(1) process. Note: y = e + 0.5L.e; e ~ N(0,0.25).]

20 [Figure: time series plot and ACF/PACF, MA(5) process. Note: y = e + 0.5L.e + 0.2L2.e - 0.1L3.e - 0.2L4.e - 0.5L5.e; e ~ N(0,0.25).]

21 [Figure: time series plot and ACF/PACF, AR(1) process. Note: y = 0.5L.y + e; e ~ N(0,0.25).]

22 [Figure: time series plot and ACF/PACF, AR(1) process. Note: y = -0.5L.y + e; e ~ N(0,0.25).]

23 [Figure: time series plot and ACF/PACF, ARMA(2,1) process. Note: y = 0.5L.y + 0.2L2.y + e + 0.2L.e; e ~ N(0,0.25).]

24 Properties of AC and PAC fns (Enders 2004, p. 66)

Process             ACF                                          PACF
White noise         ρ_s = 0 ∀ s                                  φ_ss = 0 ∀ s
AR(1): γ_1 > 0      Direct exponential decay: ρ_s = γ_1^s        φ_11 = ρ_1; φ_ss = 0 for s > 1
AR(1): γ_1 < 0      Oscillating decay: ρ_s = γ_1^s               φ_11 = ρ_1; φ_ss = 0 for s > 1
AR(p)               Decay to zero; may oscillate                 Spikes thru lag p; φ_ss = 0 for s > p
MA(1): θ_1 > 0      Positive spike at lag 1; ρ_s = 0 for s > 1   Oscillating decay; φ_11 > 0
MA(1): θ_1 < 0      Negative spike at lag 1; ρ_s = 0 for s > 1   Geometric decay; φ_11 < 0
ARMA(1,1): γ_1 > 0  Exponential decay beginning at lag 1;        Oscillating decay beginning at lag 1
                    sgn(ρ_1) = sgn(γ_1 + θ_1)
ARMA(1,1): γ_1 < 0  Oscillating decay beginning at lag 1;        Exponential decay beginning at lag 1
                    sgn(ρ_1) = sgn(γ_1 + θ_1)
ARMA(p,q)           Decay beginning after lag q; may oscillate   Decay beginning after lag p; may oscillate

25 Joint tests of significance
- Estimating ρ̂_s for a sufficient # of lags s, one will always find some statistically significant estimates due to Type I error
- Q-statistics test the null that a set of ACs are jointly zero
- Box-Pierce (1970):
  Q = T Σ_{k=1}^s ρ̂_k² ~ χ²_s under H_o: ρ_1 = ... = ρ_s = 0
- Better small sample performance by Ljung-Box (1978):
  Q = T(T + 2) Σ_{k=1}^s ρ̂_k²/(T - k) ~ χ²_s under H_o: ρ_1 = ... = ρ_s = 0
- Rejection of H_o suggests that at least one ρ is non-zero
- Tests also used to test for white noise residuals after estimation of an ARMA(p, q) model (discussed next), but dof = s - p - q
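Both Q-statistics above are simple functions of the sample autocorrelations. A minimal sketch (the ρ̂ values passed in are illustrative; in practice they would come from the estimated ACF):

```python
# Box-Pierce and Ljung-Box Q-statistics; compare each against a
# chi-squared critical value with s degrees of freedom (or s - p - q
# when applied to ARMA residuals).
def box_pierce(rho_hat, T):
    return T * sum(r * r for r in rho_hat)

def ljung_box(rho_hat, T):
    return T * (T + 2) * sum(r * r / (T - k)
                             for k, r in enumerate(rho_hat, start=1))

# Mathematically, Q_bp = 100 * (0.1**2 + 0.05**2) = 1.25 here.
Q_bp = box_pierce([0.1, 0.05], T=100)
Q_lb = ljung_box([0.1, 0.05], T=100)
```

Note Ljung-Box up-weights each term by (T + 2)/(T - k), so Q_lb > Q_bp in finite samples, converging as T grows.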

26 Time Series: Stationarity: Notation
Define the lag operator, L, as Lz_t = z_{t-1}
Some properties
1 Constants: La = a, where a is a constant
2 Zero lags: L^0 z_t = z_t
3 Higher lags: L^p z_t = z_{t-p}
4 Negative lags: L^{-p} z_t = z_{t+p}
5 Distributive property: (L^q + L^p) z_t = z_{t-q} + z_{t-p}
6 Associative property: L^q(L^p) z_t = z_{t-q-p}

27 Frequently appearing is the following polynomial in the lag operator
  A(L) = Σ_{s=0}^∞ (aL)^s = 1 + aL + (aL)² + ...
which, if |a| < 1, simplifies to
  A(L) = 1/(1 - aL)
Note, this identity is used going both ways; frequently one substitutes
  [1/(1 - aL)] z_t = Σ_{s=0}^∞ (aL)^s z_t = Σ_{s=0}^∞ a^s z_{t-s}
implying that a variable divided by 1 - aL depends on the complete history of realizations

28 Time Series: Stationarity: Common Univariate Processes
TS processes are expressed as difference equations containing stochastic elements
Definition. A difference equation expresses the value of a variable as a function of its past values, time, and other variables.
- Assessing stationarity requires solving the difference equation and examining the properties of this solution
- The solution to a difference equation expresses the value of a variable as a function of time, exogenous variables, errors, and perhaps an initial value of the variable; no past values of the variable appear

29 MA(q) is given by y_t = μ + Σ_{s=0}^q θ_s ε_{t-s}, where θ_0 = 1
- Difference equation is already solved since lags of y are not in the eqtn
- If ε_t is white noise, then
  E[y_t] = μ
  Cov(y_t, y_{t-j}) = σ² Σ_{s=0}^q θ_s²                        if j = 0
  Cov(y_t, y_{t-j}) = σ² (θ_j + Σ_{s=1}^{q-j} θ_{s+j} θ_s)     if j = 1, ..., q
  Cov(y_t, y_{t-j}) = 0                                        if j > q
- Thus, correlation is zero after q periods
- Process is stationary; strongly stationary if ε_t is Gaussian white noise
- Process is not white noise if q > 0

30 AR(p) is given by y_t = α + Σ_{s=1}^p γ_s y_{t-s} + ε_t
- To assess the dbn of y_t, need to solve the difference equation so that only stochastic elements with known properties are on the RHS
- Assume ε_t is white noise
- Solving implies that an AR(p) process can be expressed as an MA(∞) process, referred to as a moving-average representation of y_t:
  y_t = α + Σ_{s=1}^p γ_s y_{t-s} + ε_t
  (1 - Σ_{s=1}^p γ_s L^s) y_t = A(L) y_t = α + ε_t
  y_t = α/A(L) + [1/A(L)] ε_t
  y_t = α/(1 - Σ_{s=1}^p γ_s) + Σ_{s=0}^∞ φ_s ε_{t-s}
  where φ_0 = 1
- AR(p) process approximates an infinite # of parameters with p AR coeffs

31 Stationarity of AR(p)
- First moment
  E[y_t] = α/(1 - Σ_{s=1}^p γ_s)
- Second moments
  Cov(y_t, y_{t-j}) = σ²(1 + φ_1² + φ_2² + ...)                     if j = 0
  Cov(y_t, y_{t-j}) = σ²(φ_j + φ_1 φ_{j+1} + φ_2 φ_{j+2} + ...)     if j = 1, ...
  which are finite if
  - Σ_j φ_j² is finite
  - φ_j + φ_1 φ_{j+1} + φ_2 φ_{j+2} + ... is finite for all j ≥ 0 (implying the first condition is redundant)
- For AR(1) this becomes
  E[y_t] = α/(1 - γ)
  Var(y_t) = σ²/(1 - γ²)
- For AR(p) use Yule-Walker equations or the method of undetermined coefficients to solve for the φ_s

32 Stationarity of AR(p) continued...
- Alternatively, AR(p) processes are stationary if ε_t is white noise and the roots of the inverse characteristic equation lie outside the unit circle
- Example: p = 1
  γ(m) = 1 - Σ_{s=1}^p γ_s m^s = 0
  γ(m) = 1 - γ_1 m = 0 ⇒ m = 1/γ_1
  So, |γ_1| < 1 is required such that |m| > 1
- Implies that random walks are nonstationary
- Roots of real-valued polynomials can occur in complex conjugate pairs; stationarity requires their modulus to lie outside the unit circle
  E.g., if m = x + iy is a root, where i² = -1, then stationarity requires |m| = √(x² + y²) > 1
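The root condition is mechanical to check. A minimal sketch for the AR(2) case (assuming γ_2 ≠ 0 so the quadratic formula applies; the modulus test handles complex conjugate roots automatically):

```python
import cmath

# Stationarity check for an AR(2): roots m of the inverse characteristic
# equation 1 - g1*m - g2*m**2 = 0 must all satisfy |m| > 1.
# Assumes g2 != 0 (otherwise the equation is linear with root 1/g1).
def ar2_is_stationary(g1, g2):
    a, b, c = -g2, -g1, 1.0                      # a*m^2 + b*m + c = 0
    disc = cmath.sqrt(b * b - 4 * a * c)         # complex-safe square root
    roots = [(-b + disc) / (2 * a), (-b - disc) / (2 * a)]
    return all(abs(m) > 1.0 for m in roots)
```

For example, (γ_1, γ_2) = (0.5, 0.2) gives two real roots outside the unit circle (stationary), (0.9, 0.3) puts one root inside (nonstationary), and (0.5, -0.8) gives a complex conjugate pair whose modulus exceeds one (stationary).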

33 Notes...
- Stationarity of ARMA(p, q)
  y_t = α + Σ_{s=1}^p γ_s y_{t-s} + Σ_{s=0}^q θ_s ε_{t-s}
- Follows from the fact that there still exists a moving-average representation of y_t, so the same conditions for finite second moments apply
- Alternatively
  1 The roots of the inverse characteristic equation γ(m) = 1 - Σ_{s=1}^p γ_s m^s = 0 lie outside the unit circle
  2 x_t ≡ Σ_{s=0}^q θ_s ε_{t-s} must be stationary (which follows if ε_t is white noise)
- If {y_t} is nonstationary, then if the process is integrated it is referred to as an ARIMA model (discussed later)

34 Wold's Decomposition Theorem states that any weakly stationary process has an MA(∞) representation given by
  y_t = μ + Σ_{s=0}^∞ φ_s ε_{t-s}
  φ_0 = 1; Σ_s φ_s² < ∞
  ε_t ~ WN(0, σ²)
- Moments
  E[y_t] = μ
  Cov(y_t, y_{t-j}) = σ² Σ_s φ_s²           if j = 0
  Cov(y_t, y_{t-j}) = σ² Σ_s φ_s φ_{s+j}    if j = 1, ...
- The moving average weights in the Wold form are known as the impulse responses
  ∂y_{t+s}/∂ε_t = ∂y_t/∂ε_{t-s} = φ_s, s = 1, 2, ...
- For stationary and ergodic processes, lim_{s→∞} φ_s = 0
- A plot of the impulse responses vs. s is known as an Impulse Response Function (IRF)
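The impulse responses can be computed by direct simulation: feed a one-time unit shock through the recursion and record the path of y. A sketch for an ARMA(1,1) with illustrative values γ = 0.7, θ = 0.3 (for this model the Wold weights are φ_0 = 1, φ_1 = γ + θ, and φ_s = γ φ_{s-1} thereafter):

```python
# Impulse responses of an ARMA(1,1) by simulation:
# y_t = gamma*y_{t-1} + eps_t + theta*eps_{t-1}, unit shock at s = 0.
def impulse_responses(gamma, theta, horizon):
    eps = [1.0] + [0.0] * horizon        # one-time unit innovation
    y, out = 0.0, []
    for t in range(horizon + 1):
        y = gamma * y + eps[t] + theta * (eps[t - 1] if t > 0 else 0.0)
        out.append(y)                    # out[s] = phi_s
    return out

irf = impulse_responses(0.7, 0.3, 5)
```

Because the process is stationary, the responses die out geometrically, which is exactly the lim φ_s = 0 property above.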

35 Time Series: Univariate Time Series: Box-Jenkins Methodology
- Complex TS models often fare poorly in terms of forecasting relative to simpler, univariate models
  See Giacomini (EJ 2015)
- The Wold Representation is useful theoretically, but not for estimation purposes

36 ARMA(p, q) models are the primary model specifications
- Popular in marketing for forecasting product demand

37 ARMA(p, q) model given by
  y_t = α + Σ_{s=1}^p γ_s y_{t-s} [AR(p) part] + ε_t + Σ_{s=1}^q θ_s ε_{t-s} [MA(q) part]
where the ε_t are referred to as innovations, since this is the only new information in period t that determines y_t
The model can be re-written more compactly as
  γ(L) y_t = α + θ(L) ε_t
where
  γ(L) = 1 - γ_1 L - γ_2 L² - ... - γ_p L^p
  θ(L) = 1 + θ_1 L + θ_2 L² + ... + θ_q L^q

38 Estimation requires y_t to be stationary and the model to be invertible
Definition. A time series process is invertible if y_t can be written as a finite order AR process or an infinite but convergent AR process.
- Invertibility requires the roots of the characteristic equation, θ(z) = 0, to lie outside the unit circle
- Non-invertible models are not problematic per se, but defy estimation since the AC and PAC fns never decay over time
- Stationarity does not guarantee invertibility
- Example:
  y_t = ε_t - ε_{t-1} = (1 - L) ε_t
  [1/(1 - L)] y_t = Σ_{s=0}^∞ y_{t-s} = ε_t
  which is not convergent. However,
  y_t = ε_t - θ ε_{t-1} = (1 - θL) ε_t
  [1/(1 - θL)] y_t = Σ_{s=0}^∞ θ^s y_{t-s} = ε_t
  which is convergent if |θ| < 1
- An ARMA(p, 0) is invertible since it is a finite order AR process

39 Invertibility allows one to multiply both sides by [θ(L)]^{-1} to obtain a convergent AR(∞) process
  y_t = α + Σ_{s=1}^p γ_s y_{t-s} + Σ_{s=0}^q θ_s ε_{t-s}
  γ(L) y_t = α + θ(L) ε_t
  [γ(L)/θ(L)] y_t = (1 - Σ_{s=1}^∞ α_s L^s) y_t = α/(1 + Σ_{s=1}^q θ_s) + ε_t
where
  γ(L) = 1 - γ_1 L - γ_2 L² - ... - γ_p L^p
  θ(L) = 1 + θ_1 L + θ_2 L² + ... + θ_q L^q
- This is not a finite AR process, but it is convergent if the roots of θ(z) lie outside the unit circle; at some point s, α_s is so small that α_t ≈ 0 ∀ t ≥ s
- Note: Invertibility is not tested, but rather is assumed/imposed during estimation

40 Notes...
- All ARMA(p, 0) models are invertible, but not necessarily stationary
- All ARMA(0, q) models are stationary if ε_t is stationary, but not necessarily invertible
- ARMA(p, q) models are stationary if the MA(q) portion is stationary and the roots of the inverse characteristic equation of γ(z) lie outside the unit circle
- ARMA(p, q) models are invertible if the MA(q) portion is invertible; i.e., the roots of the characteristic equation θ(z) = 0 lie outside the unit circle

41 Notes (cont.)...
- If the inverse characteristic equations of γ(z) and θ(z) share a common root (or factor), then the model is not identified and should be reduced to an ARMA(p - 1, q - 1)
- Example: Consider ARMA(1, 1)
  (1 - γ_1 L) y_t = (1 - θ_1 L)(α + ε_t)
  ⇒ y_t = α + ε_t if γ_1 = θ_1, which is an ARMA(0, 0)

42 Estimation is by OLS for ARMA(p, 0) models, ML otherwise
- OLS is biased (known as Hurwicz bias); an explicit expression is not possible, but the bias is negative
  y_t = α + γ y_{t-1} + ε_t
  E[γ̂_ols] = γ + E[ Σ_t (y_{t-1} - ȳ_{-1})(ε_t - ε̄) / Σ_t (y_{t-1} - ȳ_{-1})² ] ≠ γ
  even though E[y_{t-1} ε_t] = 0
- OLS is consistent unless ε_t is serially correlated, as the above bias goes to zero as T → ∞
- Estimation by ML (details to follow)
  Is complex since observations are not iid
  Modeled not with the joint density of {y_t}_{t=0}^T, but by factoring this into the product of conditional densities and a marginal density for the initial value:
  f(y_T | y_{T-1}, ..., y_0) f(y_{T-1} | y_{T-2}, ..., y_0) ··· f(y_0)
- Stata: -arima-
- Stata: -arimafit-, -armadiag-, -armaroots- for post-estimation diagnostics
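The Hurwicz bias is easy to see in a small Monte Carlo sketch (γ = 0.5 and T = 25 are illustrative choices): the average OLS estimate of γ sits noticeably below the truth in short samples, even though E[y_{t-1} ε_t] = 0.

```python
import random

# Monte Carlo illustration of the (negative) Hurwicz bias in OLS
# estimation of y_t = gamma*y_{t-1} + e_t with a short sample.
random.seed(1)
gamma, T, reps = 0.5, 25, 2000
est = []
for _ in range(reps):
    y = [0.0]
    for _ in range(T):
        y.append(gamma * y[-1] + random.gauss(0.0, 1.0))
    x, z = y[:-1], y[1:]                 # regressor = lag, outcome = level
    xbar, zbar = sum(x) / T, sum(z) / T
    num = sum((xi - xbar) * (zi - zbar) for xi, zi in zip(x, z))
    den = sum((xi - xbar) ** 2 for xi in x)
    est.append(num / den)                # OLS slope with intercept
mean_gamma_hat = sum(est) / reps         # noticeably below 0.5
```

Rerunning with a larger T shrinks the gap, consistent with the bias vanishing as T → ∞.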

43 ML details... AR(1) Model
Model given by
  y_t = α + γ y_{t-1} + ε_t, t = 0, 1, ..., T
where ε_t ~ iid N(0, σ²)
With y_t stationary, we know
  E[y_0] = α/(1 - γ); Var(y_0) = σ²/(1 - γ²)
The density of y_0 is
  f(y_0) = [1/√(2πσ²/(1 - γ²))] exp{ -[y_0 - α/(1 - γ)]² / [2σ²/(1 - γ²)] }

44 Next, consider the density of y_1. This is not independent of y_0. The conditional dbn is
  y_1 | y_0 ~ N(α + γ y_0, σ²)
implying
  f(y_1 | y_0) = [1/√(2πσ²)] exp[ -(y_1 - α - γ y_0)² / (2σ²) ]
Extending this through period T yields the likelihood fn
  L(θ) = f(y_0; θ) Π_{t=1}^T f(y_t | y_{t-1}; θ)

45 As noted above, OLS estimation of this model is also feasible given the assumption that Cov(y_{t-1}, ε_t) = 0
- However, OLS is not equivalent to the MLE from the previous slide
- OLS is equivalent to the conditional MLE of the model, where the likelihood fn is conditional on the observed value of y_0 (y_0 is treated as a constant)
- The conditional likelihood fn is
  L_c(θ) = Π_{t=1}^T f(y_t | y_{t-1}; θ)
  which yields estimates of α, γ that minimize the SSE, identical to OLS
- OLS is less efficient since it ignores one obs; asymptotically irrelevant
- OLS is consistent, however, if ε_t is non-normal (Hayashi 2000)

46 ML details... AR(p) Model
Model given by
  y_t = α + Σ_{s=1}^p γ_s y_{t-s} + ε_t, t = 0, 1, ..., T
where ε_t ~ N(0, σ²)
Now, y_0, ..., y_{p-1} are treated as initial values with joint density
  f(y_0, ..., y_{p-1}) ~ N( [α/(1 - Σ_{s=1}^p γ_s)] ι, Σ )
where Σ is the p × p Toeplitz matrix of autocovariances with (i, j) element σ_{|i-j|},
  Σ = [ σ_0      σ_1  ...  σ_{p-1}
        σ_1      σ_0  ...
        ...
        σ_{p-1}  ...  σ_1  σ_0 ]
and σ_j = E[(y_t - E[y_t])(y_{t-j} - E[y_{t-j}])]

47 Conditional density of y_t, t = p, ..., T, is
  f(y_t | y_{t-1}, ..., y_{t-p}) = [1/√(2πσ²)] exp[ -(y_t - α - Σ_{s=1}^p γ_s y_{t-s})² / (2σ²) ]
with the final likelihood fn given by
  L(θ) = f(y_0, ..., y_{p-1}; θ) Π_{t=p}^T f(y_t | y_{t-1}, ..., y_{t-p}; θ)
OLS remains equivalent to the conditional MLE

48 ML details... MA(1) Model
Model given by
  y_t = α + ε_t + θ ε_{t-1}, t = 1, ..., T
where ε_t ~ N(0, σ²)
- If we knew ε_0, then the density of y_1 would be
  f(y_1 | ε_0) = [1/√(2πσ²)] exp[ -(y_1 - α - θ ε_0)² / (2σ²) ]
  and the full vector of ε_t, t = 1, ..., T, would be known conditional on α, θ
- So, derive the likelihood fn assuming ε_0 = 0

49 Assuming ε_0 = 0,
  ε_1 = y_1 - α
  ε_2 = y_2 - α - θ ε_1 = (y_2 - α) - θ(y_1 - α)
  ...
  ε_T = y_T - α - θ ε_{T-1} = Σ_{s=1}^T (-θ)^{T-s} (y_s - α)
and the likelihood fn is
  L = f(y_1 | ε_0 = 0) Π_{t=2}^T f(y_t | y_{t-1}, ..., y_1, ε_0 = 0)
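The recursion above turns directly into a conditional log-likelihood: back out each innovation from the previous one, then add up normal density terms. A minimal sketch (function name and interface are ours):

```python
import math

# Conditional log-likelihood of an MA(1), y_t = alpha + eps_t + theta*eps_{t-1},
# with the initialization eps_0 = 0 as on the slide.
def ma1_cond_loglik(y, alpha, theta, sigma2):
    ll, eps_prev = 0.0, 0.0
    for yt in y:
        eps = yt - alpha - theta * eps_prev          # recover innovation
        ll += (-0.5 * math.log(2 * math.pi * sigma2)
               - eps * eps / (2 * sigma2))           # normal density term
        eps_prev = eps
    return ll
```

In practice this function would be handed to a numerical optimizer over (α, θ, σ²); maximizing it is the conditional ML estimator described above.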

50 Notes:
- Assuming ε_0 = 0 does not induce much bias if |θ| < 1, since the role of ε_0 in ε_t dies out as t → ∞
- If |θ| > 1, then one must use the exact (unconditional) likelihood; given in Hamilton (1994)
- All MA(q) models must be estimated by ML
- Extension to MA(q) models requires invertibility to derive the likelihood
- Extension to ARMA(p, q) models is trivial after going through all these cases
- GMM estimation is also feasible

51 How to choose lag lengths, p?
1 Information criteria (ICs)
  AIC = 2K - 2 ln(L)
  BIC = ln(T) K - 2 ln(L)
2 Simple-to-general approach
  For example, start with p = 1; if γ̂_1 is statistically significant, move to p = 2
  Continue until the last lag is statistically insignificant
3 General-to-simple approach
  For example, start with arbitrarily large p; if γ̂_p is statistically insignificant, move to p = p - 1
  Continue dropping lags until the last lag is statistically significant
  S-to-G (G-to-S) will systematically under- (over-)specify the model due to Type II (Type I) error
4 Lag to white noise: examine the ACF/PACF and use prior Q-statistics to test for white noise residuals
  Precludes the need to model serially correlated errors
5 Compare out-of-sample forecast performance

52 Forecasts are different than in CS models
- In CS models, one typically observes the regressors out-of-sample and forecasts y
- In ARMA models, this only works for the one-step-ahead forecast, ŷ_{T+1}
- Dynamic forecasts can be obtained by forecasting ŷ_{T+1} and then using this in the forecast for ŷ_{T+2}, ...
- Forecast errors will be serially correlated in all likelihood
- Stata: -dpredict-

53 Improved forecasting
- Competing models can be compared in terms of forecast ability (discussed on next slides)
- However, the literature also examines optimal combinations of competing forecasts (e.g., Bayesian model averaging techniques)

54 Comparing out-of-sample forecast performance...
MSPE given by
  MSPE = (1/T_1) Σ_{s=T+1}^{T+T_1} (ŷ_s - y_s)²
where T_1 is the number of out-of-sample periods
Can test formally
  H_o: Models forecast equally well
  H_a: Not H_o

55 F-test
Under the following assumptions
1 Forecast errors are mean zero and normally distributed
2 Forecast errors are serially uncorrelated
3 Forecast errors are contemporaneously uncorrelated across models
test H_o that the MSPEs from two competing models are equal using an F-statistic
  F = Σ_{s=T+1}^{T+T_1} (ŷ_{s1} - y_s)² / Σ_{s=T+1}^{T+T_1} (ŷ_{s2} - y_s)²
where 1 and 2 index competing forecasts and this has an F-dbn with (T_1, T_1) dof

56 Granger-Newbold (1976) test
- Relaxes assumption #3
- Define
  x_s = (ŷ_{s1} - y_s) + (ŷ_{s2} - y_s) = ŷ_{s1} + ŷ_{s2} - 2y_s
  z_s = (ŷ_{s1} - y_s) - (ŷ_{s2} - y_s) = ŷ_{s1} - ŷ_{s2}
- Under H_o, ρ_{xz} = 0
- Test statistic
  ρ̂_{xz} / √[(1 - ρ̂²_{xz})/(T_1 - 1)] ~ t_{T_1-1}
  where a significant test statistic, along with its sign, indicates the dominance of one model
  ρ̂_{xz} > 0 ⇒ model 2 is preferred
  ρ̂_{xz} < 0 ⇒ model 1 is preferred
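The test is just a correlation between the sum and the difference of the two forecast-error series, since x_s z_s = e_{s1}² - e_{s2}². A minimal sketch (function name is ours; inputs are the two models' forecast errors):

```python
import math

# Granger-Newbold comparison of two forecast-error series e1, e2:
# correlate x_s = e1_s + e2_s with z_s = e1_s - e2_s and form a
# t-statistic with n - 1 degrees of freedom.
def granger_newbold(e1, e2):
    x = [a + b for a, b in zip(e1, e2)]
    z = [a - b for a, b in zip(e1, e2)]
    n = len(x)
    xbar, zbar = sum(x) / n, sum(z) / n
    sxz = sum((a - xbar) * (b - zbar) for a, b in zip(x, z))
    sxx = sum((a - xbar) ** 2 for a in x)
    szz = sum((b - zbar) ** 2 for b in z)
    r = sxz / math.sqrt(sxx * szz)
    t_stat = r / math.sqrt((1 - r * r) / (n - 1))    # compare to t_{n-1}
    return r, t_stat
```

When the two models have equal MSPE, x and z are uncorrelated and the statistic is near zero; a positive correlation indicates model 1 has the larger squared errors.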

57 Diebold-Mariano (1995) test relaxes assumptions #1-3
- While very popular, it is not without criticism
- Given the forecasts from two competing models, {ŷ_{s1}}_{s=T+1}^{T+T_1} and {ŷ_{s2}}_{s=T+1}^{T+T_1}, begin by choosing a loss function, L(ŷ_s - y_s), to evaluate the cost of forecast errors
  1 Squared Error Loss: L(ŷ_s - y_s) = (ŷ_s - y_s)²
  2 Absolute Error Loss: L(ŷ_s - y_s) = |ŷ_s - y_s|

58 Diebold-Mariano (1995) continued...
Goal is to test
  H_o: E[L(ŷ_{s1} - y_s)] = E[L(ŷ_{s2} - y_s)]
  H_a: E[L(ŷ_{s1} - y_s)] ≠ E[L(ŷ_{s2} - y_s)]
Define
  d_s = L(ŷ_{s1} - y_s) - L(ŷ_{s2} - y_s)
  H_o: E[d_s] = 0
Only requirement is that d_s is weakly stationary

59 Diebold-Mariano (1995) continued...
Test statistic is
  S = d̄ / √(AVar(d̄)) ~ N(0, 1)
where
  d̄ = (1/T_1) Σ_s d_s
  AVar(d̄) = (1/T_1)(γ̂_0 + 2 Σ_{j≥1} γ̂_j), with γ̂_j = Cov(d_s, d_{s-j})
- Equivalent to testing the statistical significance of the OLS estimate of the intercept in the regression of d_s on an intercept only, using HAC standard errors (discussed below)
- Stata: -dmariano-
- See Diebold (2015) for updated discussion
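A minimal sketch of the statistic with squared-error loss (function name and the truncation lag h are ours; h = 1 keeps only γ̂_0, appropriate for one-step-ahead forecasts):

```python
import math

# Diebold-Mariano statistic with squared-error loss.
# d_s = e1_s^2 - e2_s^2; the long-run variance truncates the
# autocovariance sum at lag h - 1 (a user choice).
def diebold_mariano(e1, e2, h=1):
    d = [a * a - b * b for a, b in zip(e1, e2)]      # loss differential
    n = len(d)
    dbar = sum(d) / n
    dev = [di - dbar for di in d]
    gamma = [sum(dev[t] * dev[t - j] for t in range(j, n)) / n
             for j in range(h)]
    lrv = gamma[0] + 2.0 * sum(gamma[1:h])           # long-run variance
    return dbar / math.sqrt(lrv / n)                 # compare to N(0, 1)

stat = diebold_mariano([1.0, 2.0, 1.0, 2.0], [0.5, 1.0, 0.5, 1.0])
```

A large positive value says model 1's losses are systematically larger, mirroring the sign logic of the Granger-Newbold test.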

60 Time Series: Univariate Time Series: ARDL (ARMAX) Models
- Sometimes forecasts can be improved by incorporating additional variables besides lags of y_t
- Moreover, when the goal is more than forecasting, we move to more complex specifications
- Framework is known as the autoregressive distributed lag model, denoted as ADL(p, q) or ARDL(p, q), given by
  y_t = α + Σ_{s=1}^p γ_s y_{t-s} + Σ_{s=0}^q x_{t-s}'β_s + ε_t
  E[εε'] = σ²Ω

61 The model can be re-written more compactly as
  γ(L) y_t = α + β(L) x_t + ε_t
where
  γ(L) = 1 - γ_1 L - γ_2 L² - ... - γ_p L^p
  β(L) = β_0 + β_1 L + β_2 L² + ... + β_q L^q
Special cases of the ARDL model include
1 Distributed Lag Model: ARDL(0, q)
  y_t = α + Σ_{s=0}^q x_{t-s}'β_s + ε_t
2 Autoregressive Model: ARDL(p, 0)
  y_t = α + Σ_{s=1}^p γ_s y_{t-s} + x_t'β + ε_t

62 Notes
- Distributed lag model is a finite lag model since q < ∞
  Implies the effect of x is zero after q periods
- Autoregressive model is an ARDL(0, ∞) model, since repeated substitution for the lagged y's yields
  y_t = α + Σ_{s=0}^∞ x_{t-s}'δ_s + Σ_{s=0}^∞ θ_s ε_{t-s}
  where δ_s is a fn of the β's and γ's, with the exact identity depending on p and q
  Implies the complete history of x is relevant
- Estimation requires y_t, x_t to be stationary

63 A frequently used, alternative representation of the ARDL model is the error correction form (ECM)
- Introduced in A.W. Phillips (EJ, 1954 & 1957)
- Consider the ARDL(1, 1) model
  y_t = α + γ_1 y_{t-1} + β_0 x_t + β_1 x_{t-1} + ε_t
- Re-arranging, one gets
  Δy_t = α + β_0 Δx_t - (1 - γ_1)(y_{t-1} - θ x_{t-1}) + ε_t
  where θ = (β_0 + β_1)/(1 - γ_1)
- Thus, Δy_t is decomposed into two parts
  α + β_0 Δx_t + ε_t   [derivative effect]
  -(1 - γ_1)(y_{t-1} - θ x_{t-1})   [error correction term]
  where θ is the slope coefficient in the LR relationship

64 Interpretation
- To see the LR relationship between y and x, set E[y_t] = y_{t-1} = ȳ and x_t = x_{t-1} = x̄ and solve
  ȳ = α + (β_0 + β_1) x̄ + γ_1 ȳ
  ȳ = α/(1 - γ_1) + [(β_0 + β_1)/(1 - γ_1)] x̄
- Thus, the derivative is
  ∂ȳ/∂x̄ = (β_0 + β_1)/(1 - γ_1) = θ
- Estimation requires that
  1 y_t, x_t are stationary
  2 y_{t-1} - θ x_{t-1} is stationary
  (Discussed later)
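The LR algebra above can be checked numerically: hold x fixed, iterate the ARDL(1,1) recursion, and the path settles at α/(1 - γ_1) + θx. A minimal sketch with illustrative coefficients:

```python
# Numerical check of the long-run relationship in an ARDL(1,1).
# Coefficients are illustrative; theta = (b0 + b1)/(1 - g1) = 2.0 here.
alpha, g1, b0, b1 = 1.0, 0.6, 0.5, 0.3
theta = (b0 + b1) / (1 - g1)                  # long-run multiplier
x = 4.0                                       # hold x_t = x_{t-1} fixed
y = 0.0
for _ in range(200):                          # iterate to the steady state
    y = alpha + g1 * y + b0 * x + b1 * x
long_run = alpha / (1 - g1) + theta * x       # = 2.5 + 8.0
```

The iteration converges because |γ_1| < 1, which is the same condition that keeps the error correction term pulling y back toward the LR relationship.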

65 Time Series: Univariate Time Series: ARDL Estimation (Part I)
Consider the simplest ARDL model, ARDL(0, 0), given by
  y_t = x_t'β + ε_t, t = 1, ..., T
  E[εε'] = σ²Ω
where x is a vector of contemporaneous regressors and ε is the disturbance term
Assume [y_t, x_t, ε_t] is a stationary and ergodic process
OLS properties
- Unbiased under the usual CLRM assumptions
- BLUE if Ω = I_T; otherwise GLS is efficient
- F- and t-tests are appropriate as well if E[εε'] = σ²I_T (as these rely on independent obs)
- Asymptotic normality
  If Ω = I_T, then straightforward
  If Ω ≠ I_T, then x and ε must satisfy additional requirements

66 If Ω ≠ I_T, recall from the analysis of heteroskedasticity that...
- OLS remains unbiased under the usual CLRM assumptions, but is not efficient
- The variance of β̂ is σ²(x'x)^{-1} x'Ωx (x'x)^{-1}
Two approaches
1 OLS with a different estimator for the variance
2 GLS/FGLS

67 Approach #1: Newey-West standard errors
- Estimator of the variance is robust to arbitrary autocorrelation
- Estimator given by
  (x'x)^{-1} { Σ_t ε̂_t² x_t x_t' + Σ_{l=1}^L Σ_{t=l+1}^T w_l ε̂_t ε̂_{t-l} (x_t x_{t-l}' + x_{t-l} x_t') } (x'x)^{-1}
  where w_l = 1 - l/(L + 1), L is the lag length, and current practice is to set L ≈ T^{1/4}
Notes
- Thus, unlike robust standard errors, a maximum lag length must be specified by the user
- Also referred to as HAC standard errors
- Stata: -newey-
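A minimal sketch of the Newey-West weighting in the simplest special case, a regression on a constant only (x_t = 1), where the sandwich collapses to an HAC estimate of the variance of the sample mean (function name is ours):

```python
# Newey-West (Bartlett-weighted) variance of a sample mean:
# (1/T^2) * [ sum_t e_t^2 + 2 * sum_l w_l * sum_t e_t * e_{t-l} ],
# with w_l = 1 - l/(L+1), the same weights as on the slide.
def nw_var_of_mean(y, L):
    T = len(y)
    ybar = sum(y) / T
    e = [yi - ybar for yi in y]
    s = sum(ei * ei for ei in e)
    for l in range(1, L + 1):
        w = 1.0 - l / (L + 1)
        s += 2.0 * w * sum(e[t] * e[t - l] for t in range(l, T))
    return s / (T * T)
```

With L = 0 this reduces to the usual (heteroskedasticity-only) variance of the mean; positive autocorrelation in the series inflates the estimate, which is exactly the correction HAC standard errors deliver.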

68 Approach #2: FGLS
- Recall... to devise an efficient estimator, transform the model s.t. OLS applied to the transformed model is now efficient
- To proceed, pre-multiply the model by a T × T matrix P s.t.
  Py = Pxβ + Pε
  y* = x*β + ε*
  where E[ε*ε*' | x] = σ²PΩP' = σ²I
- This follows if P'P = Ω^{-1}
- Intuition
  Transform the data s.t. the original assumption of spherical disturbances holds
  OLS applied to the new model will yield an efficient estimator

69 OLS yields
  β̂_GLS = (x*'x*)^{-1} x*'y* = (x'Ω^{-1}x)^{-1} x'Ω^{-1}y
- GLS minimizes SSE = ε*'ε* = (y - xβ)'Ω^{-1}(y - xβ), the weighted sum of squared residuals
- Requires Ω to be known, but this is typically never the case
- Instead FGLS is utilized
  Proceeds by assuming Ω = Ω(θ), where θ is unknown
  Then use Ω̂ = Ω(θ̂) (Aitken estimator)

70 Common forms of Ω(θ)
1 AR(p)
  ε_t = u_t + Σ_{s=1}^p ρ_s ε_{t-s}, where u_t ~ WN(0, σ_u²)
2 MA(q)
  ε_t = u_t + Σ_{s=1}^q θ_s u_{t-s}, where u_t ~ WN(0, σ_u²)
3 ARMA(p, q)
  ε_t = u_t + Σ_{s=1}^p ρ_s ε_{t-s} + Σ_{s=1}^q θ_s u_{t-s}
Stata: -arima-

71 Most common form used is AR(1)
  ε_t = ρ ε_{t-1} + u_t
With repeated substitution
  ε_t = u_t + ρ u_{t-1} + ρ² u_{t-2} + ...
implying that ε_t depends on the complete history of the u's
Since the u's are independent,
  Var(ε_t) = σ_u²(1 + ρ² + ρ⁴ + ...)
To prevent Var(ε_t) → ∞, we must restrict |ρ| < 1
This implies
  E[ε_t] = 0
  Var(ε_t) = σ_ε² = σ_u²/(1 - ρ²)
  E[ε_t ε_{t-s}] = ρ^s σ_u²/(1 - ρ²)
  Corr(ε_t, ε_{t-s}) = ρ^s
which means that ε_t is weakly stationary

72 Thus, in the AR(1) model
  σ²Ω = [σ_u²/(1 - ρ²)] ×
    [ 1        ρ        ρ²       ρ³   ...  ρ^{T-1}
      ρ        1        ρ        ρ²   ...  ρ^{T-2}
      ρ²       ρ        1        ρ    ...  ρ^{T-3}
      ...
      ρ^{T-1}  ρ^{T-2}  ρ^{T-3}       ...  1       ]
Estimates given by
  β̂_GLS = (x'Ω^{-1}x)^{-1} x'Ω^{-1}y

73 Equivalently, estimation obtained by transforming the model s.t. disturbances are iid
In the AR(1) model, the transformation is
Py = Pxβ + Pε
where
P = [ √(1−ρ²)   0    0   ⋯   0 ]
    [ −ρ        1    0   ⋯   0 ]
    [ 0        −ρ    1   ⋯   0 ]
    [ ⋮                      ⋮ ]
    [ 0         0    ⋯  −ρ   1 ]

74 Transformation yields quasi first-differenced data; for example,
Py = [ √(1−ρ²) y_1, y_2 − ρy_1, …, y_T − ρy_{T−1} ]'
Pε = [ √(1−ρ²) ε_1, ε_2 − ρε_1, …, ε_T − ρε_{T−1} ]'
where
Var(ε_s − ρε_{s−1}) = Var(u_s) = σ²_u, s > 1
Var(√(1−ρ²) ε_1) = (1 − ρ²) σ²_u/(1 − ρ²) = σ²_u
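The whitening claim can be verified directly: with R the AR(1) correlation matrix (R_ij = ρ^|i−j|), the transformed covariance P R P' should be proportional to the identity. A small numerical sketch:

```python
import numpy as np

rho, T = 0.6, 6
# Prais-Winsten transformation matrix for an AR(1) disturbance
P = np.eye(T)
P[0, 0] = np.sqrt(1 - rho ** 2)
for t in range(1, T):
    P[t, t - 1] = -rho
# AR(1) correlation matrix: R[i, j] = rho^|i - j|
R = rho ** np.abs(np.subtract.outer(np.arange(T), np.arange(T)))
out = P @ R @ P.T   # should equal (1 - rho^2) * I: homoskedastic, uncorrelated
```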

75 FGLS estimation requires an estimate of ρ
Obtained in a variety of ways using the OLS residuals, ε̂_t
ρ̂ = Σ_{t=2}^T ε̂_t ε̂_{t−1} / Σ_{t=1}^T ε̂²_t
ρ̂_Theil = [(T − k)/(T − 1)] ρ̂
ρ̂_DW = 1 − d/2, where d is the DW statistic
or estimated from the following artificial OLS regressions
ε̂_t = ρ ε̂_{t−1} + υ_t
ε̂_t = ρ ε̂_{t+1} + υ_t
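A sketch of the three closed-form estimates (function and variable names are mine; residuals simulated from a known AR(1) so the targets are visible):

```python
import numpy as np

def ar1_rho_estimates(resid, k):
    """Three textbook estimates of rho from OLS residuals (sketch)."""
    e = np.asarray(resid, dtype=float)
    T = e.size
    rho_hat = (e[1:] * e[:-1]).sum() / (e ** 2).sum()
    rho_theil = rho_hat * (T - k) / (T - 1)
    d = ((e[1:] - e[:-1]) ** 2).sum() / (e ** 2).sum()  # Durbin-Watson statistic
    rho_dw = 1 - d / 2
    return rho_hat, rho_theil, rho_dw

# residuals simulated from an AR(1) with rho = 0.5
rng = np.random.default_rng(4)
T = 5000
e = np.zeros(T)
for t in range(1, T):
    e[t] = 0.5 * e[t - 1] + rng.normal()
r1, r2, r3 = ar1_rho_estimates(e, k=2)
```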

76 Historically, after estimating ρ, two FGLS estimators based on the transformed data
Prais-Winsten: typical FGLS using all T obs
Cochrane-Orcutt: FGLS omitting the first obs (for computational ease)
Iterative FGLS does not produce any gains, nor does it converge to the MLE
Stata
-prais-
-arima- when errors are more complex than an AR(1)

77 Time Series Univariate Time Series: Tests for Autocorrelation
Preliminary analysis can be performed by examining the AC and PAC functions for the estimated residuals

78 Formal tests...
Breusch-Godfrey LM Test
Hypotheses
H_0: No autocorrelation
H_a: AR(p) or MA(p)
Test statistic
LM = T R²_0 ∼ χ²_p
where R²_0 is the R² from the OLS regression of ε̂_t on x_{t0} = [x_t, ε̂_{t−1}, ε̂_{t−2}, …, ε̂_{t−p}], where the missing (pre-sample) values are replaced with zeros
Intuition: high R² can only be explained by correlation between current and lagged residuals
Stata: -estat bgodfrey-
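A sketch of the statistic (illustrative data; a simulated AR(1) residual series should reject, a white-noise series should not):

```python
import numpy as np

def breusch_godfrey(X, resid, p):
    """Breusch-Godfrey LM statistic: T * R^2 from regressing residuals on
    X and p of their own lags, pre-sample lags set to zero (sketch)."""
    T = resid.size
    lags = np.zeros((T, p))
    for j in range(1, p + 1):
        lags[j:, j - 1] = resid[:-j]
    Z = np.column_stack([X, lags])
    coef = np.linalg.lstsq(Z, resid, rcond=None)[0]
    u = resid - Z @ coef
    r2 = 1 - (u ** 2).sum() / ((resid - resid.mean()) ** 2).sum()
    return T * r2      # compare to a chi2(p) critical value

rng = np.random.default_rng(5)
T = 500
X = np.ones((T, 1))
e_ar = np.zeros(T)
for t in range(1, T):
    e_ar[t] = 0.9 * e_ar[t - 1] + rng.normal()
lm_auto = breusch_godfrey(X, e_ar, p=2)               # large: autocorrelation
lm_iid = breusch_godfrey(X, rng.normal(size=T), p=2)  # small under H0
```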

79 Box & Pierce's Q-test
Asymptotically equivalent to the LM test under H_0 when x does not contain lagged values of y
Test statistic
Q = T Σ_{j=1}^p r²_j ∼ χ²_p
where
r_j = Σ_{t=j+1}^T ε̂_t ε̂_{t−j} / Σ_{t=1}^T ε̂²_t
Ljung & Box (1978) suggested the following version
Q' = T(T + 2) Σ_{j=1}^p r²_j/(T − j) ∼ χ²_p
While the LM test is based on partial correlations, the Q-test is based on simple correlations
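The Ljung-Box version is a one-liner per lag (sketch; series and lag count are illustrative):

```python
import numpy as np

def ljung_box(resid, p):
    """Ljung-Box Q statistic on a residual series (sketch)."""
    e = np.asarray(resid, dtype=float)
    T = e.size
    denom = (e ** 2).sum()
    q = 0.0
    for j in range(1, p + 1):
        r_j = (e[j:] * e[:-j]).sum() / denom   # simple autocorrelation at lag j
        q += r_j ** 2 / (T - j)
    return T * (T + 2) * q                     # compare to a chi2(p) critical value

rng = np.random.default_rng(6)
T = 500
e_ar = np.zeros(T)
for t in range(1, T):
    e_ar[t] = 0.8 * e_ar[t - 1] + rng.normal()
q_auto = ljung_box(e_ar, p=4)               # large under autocorrelation
q_iid = ljung_box(rng.normal(size=T), p=4)  # small under white noise
```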

80 Durbin-Watson Test
Hypotheses
H_0: ρ = 0
H_1: ρ > 0
H_2: ρ < 0
Test statistic
d = Σ_{t=2}^T (ε̂_t − ε̂_{t−1})² / Σ_{t=1}^T ε̂²_t = 2(1 − r) − (ε̂²_1 + ε̂²_T)/Σ_{t=1}^T ε̂²_t ≈ 2(1 − r)
where
r = Σ_{t=2}^T ε̂_t ε̂_{t−1} / Σ_{t=1}^T ε̂²_t
For H_0 v. H_1, reject H_0 if d < d_L, do not reject H_0 if d > d_U, and draw no conclusion if d ∈ (d_L, d_U)
For H_0 v. H_2, define d' = 4 − d and proceed as above
Critical values are reported in Greene; based on T and K
Alternative version for models that include lagged values of y
Stata: -estat dwatson-, -estat durbinalt-

81 Time Series Univariate Time Series: ARDL Estimation (Part II)
A crucial condition for estimation of the ARDL(0, 0), given by
y_t = x_t'β + ε_t, t = 1, …, T
E[εε'] = σ²Ω
is that [y_t, x_t, ε_t] is assumed to be a stationary and ergodic process
If y_t and/or x_t are nonstationary, then each must be transformed prior to estimation (discussed later)

82 Time Series Univariate Time Series: ARDL Estimation (Part III)
Now, consider the ARDL(0, q), given by
y_t = α + Σ_{s=0}^q x_{t−s}'β_s + ε_t, t = 1, …, T
E[εε'] = σ²Ω
Assume [y_t, x_t, ε_t] is a stationary and ergodic process
OLS properties are as in the ARDL(0, 0) model with one exception: a stronger version of exogeneity is needed
E[ε_t | x_t, x_{t−1}, x_{t−2}, …, x_{t−q}] = 0
Interpretation of parameters is different than in CS models
Short-run multiplier or impact multiplier, β_0, reflects the immediate change in y from a unit change in x_t; β_s reflects the change in y_{t+s}
Cumulated effect τ periods later, Σ_{s=0}^τ β_s, reflects the cumulative change in y through period t + τ
Long-run multiplier or equilibrium multiplier, Σ_{s=0}^q β_s, is the long-run change in y from a permanent unit change in x
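The three multipliers are simple functions of the lag coefficients. A sketch with hypothetical ARDL(0, 3) coefficients (the numbers are mine, purely illustrative):

```python
# hypothetical ARDL(0, 3) coefficients beta_0, ..., beta_3
betas = [0.5, 0.3, 0.15, 0.05]

impact = betas[0]                              # short-run (impact) multiplier
cumulated = [sum(betas[:tau + 1])              # cumulated effect tau periods out
             for tau in range(len(betas))]
long_run = sum(betas)                          # long-run (equilibrium) multiplier
```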

83 How to choose q?
1 Simple-to-general approach
2 General-to-simple approach
3 Other model selection criteria discussed previously (e.g., R², AIC, BIC)
4 Alternative approach: Add lagged terms until ε̂_t is white noise
White noise approach ensures OLS is the efficient estimator

84 Time Series Univariate Time Series: ARDL Estimation (Part IV)
Now, consider the ARDL(p, q), given by
y_t = α + Σ_{s=0}^q x_{t−s}'β_s + Σ_{s=1}^p γ_s y_{t−s} + ε_t, t = 1, …, T
E[εε'] = σ²Ω
Assume [y_t, x_t, ε_t] is a stationary and ergodic process
OLS properties are as before with one exception: an even stronger version of exogeneity is needed
E[ε_t | y_{t−1}, y_{t−2}, …, y_{t−p}, x_t, x_{t−1}, x_{t−2}, …, x_{t−q}] = 0
Autocorrelation in ε_t precludes this assumption
OLS is biased and inconsistent
IV estimation is needed
Lag length(s) chosen as discussed above
White noise approach precludes the need to use IV-FGLS

85 Time Series Univariate Time Series: Structural Breaks
All TS processes specified to this point assume stability in the DGP over time
Violation of this assumption implies a structural break
Notation
Sample period: t = 1, …, T
Breakdate: T_1
Breakdate fraction: τ_1 = T_1/T
Pre-break sample: t = 1, …, T_1
Post-break sample: t = T_1 + 1, …, T

86 Terminology
Full structural break
y_t = x_t'β_1 + ε_t, t ≤ T_1
y_t = x_t'β_2 + ε_t, t > T_1
or
y_t = x_t'β_1 I[t ≤ T_1] + x_t'β_2 I[t > T_1] + ε_t
Partial structural break
y_t = z_t'β_0 + x_t'β_1 I[t ≤ T_1] + x_t'β_2 I[t > T_1] + ε_t
Variance break
y_t = x_t'β + ε_t, Var(ε_t) = σ²_1 if t ≤ T_1, σ²_2 if t > T_1

87 Several issues arise
1 Statistical testing for the presence of a break conditional on a fixed possible break date
2 Statistical testing for the presence of a break conditional on an unknown break date
3 Forecasting in the presence of structural breaks
4 Testing for nonstationarity in the presence of structural breaks (discussed later)

88 Time Series Univariate Time Series: Structural Breaks (Known Break Date)
If the alleged T_1 is known, then the problem is the classic Chow test
1 Estimate the unrestricted model
y_t = z_t'β_0 + x_t'β_1 I[t ≤ T_1] + x_t'β_2 I[t > T_1] + ε_t
where z may or may not be empty
2 Test H_0: β_1 = β_2
F-test if errors are homoskedastic
Wald test with non-iid errors
3 Rejection provides evidence in favor of a break at time T_1

89 Time Series Univariate Time Series: Structural Breaks (Unknown Break Date)
Modern approach takes the break date as unknown; T_1 (or τ_1) is a parameter to estimate
Sup tests (Andrews 1993)
1 Define interval of candidate break dates, [t_1, t_2], where t_1 >> 1 and t_2 << T
Rule-of-thumb: t_1 = 0.15T, t_2 = 0.85T
2 For each T_1 ∈ [t_1, t_2], perform a Chow test and compute either the F-test or Wald test, denoted F(T_1) and W(T_1), respectively
3 Define the supremum of these test statistics
Sup F = max_{T_1} F(T_1)
Sup W = max_{T_1} W(T_1)
4 Compare Sup-statistics to new critical values that account for multiple testing
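The grid-search logic above can be sketched in a few lines (homoskedastic full-break case; the simulated mean shift, trimming bounds, and function name are illustrative):

```python
import numpy as np

def sup_f(X, y, lo, hi):
    """Chow F statistics over candidate break dates [lo, hi); returns the
    supremum and the date attaining it (sketch)."""
    T, k = X.shape
    def sse(Xs, ys):
        b = np.linalg.lstsq(Xs, ys, rcond=None)[0]
        return ((ys - Xs @ b) ** 2).sum()
    sse_r = sse(X, y)                     # restricted: no break
    best_F, best_T1 = -np.inf, lo
    for T1 in range(lo, hi):
        sse_u = sse(X[:T1], y[:T1]) + sse(X[T1:], y[T1:])
        F = ((sse_r - sse_u) / k) / (sse_u / (T - 2 * k))
        if F > best_F:
            best_F, best_T1 = F, T1
    return best_F, best_T1

# simulated mean shift at t = 120; trim to [0.15T, 0.85T] per the rule of thumb
rng = np.random.default_rng(8)
T = 200
X = np.ones((T, 1))
y = rng.normal(scale=0.5, size=T) + np.where(np.arange(T) >= 120, 2.0, 0.0)
F_sup, T1_hat = sup_f(X, y, lo=30, hi=170)
```

The Sup statistic itself must still be compared against Andrews' non-standard critical values, not the usual F table.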

90 To determine if a break actually occurred in T_1, need to compare the value of the Sup-statistic to the appropriate critical value
These Sup-statistics have non-standard asymptotic dbns that depend on
k = # parameters tested for stability
π_1 = t_1/T
π_2 = t_2/T
Critical values in Andrews (2003); these are much larger than traditional critical values based on the F- or χ²-dbn, to account for multiple testing
Note (Hansen 2000)
These critical values assume the possibility of a structural break in the parameters, not the covariates; covariates are assumed strictly stationary
If x is nonstationary in that there is a structural break in its mean or variance, then the Andrews critical values are invalid


92 If a break is found to occur, the estimated breakdate, T̂_1, is equivalently given by the value that minimizes the SSE in the model
y_t = z_t'β_0 + x_t'β_1 I[t ≤ T_1] + x_t'β_2 I[t > T_1] + ε_t
Formally
min_{β, T_1} Σ_t (y_t − z_t'β_0 − x_t'β_1 I[t ≤ T_1] − x_t'β_2 I[t > T_1])²
Can break this up into a two-step problem by minimizing the SSE over β conditional on T_1 and then doing a grid search over T_1
Worthwhile to plot SSE(T_1) to see how sharp the drop is at T̂_1
T̂_1 is the same as the value that maximizes Sup F
Bai (1997), Elliott & Mueller (2007) discuss CIs for T̂_1

93 Time Series Univariate Time Series: Structural Breaks (Error Variance)
Testing for an unknown break in variance only
1 Estimate initial model and obtain ε̂_t
y_t = x_t'β + ε_t
2 Estimate model
ε̂²_t = β_1 I[t ≤ T_1] + β_2 I[t > T_1] + u_t
3 Apply Andrews' Sup W test where k = 1

94 Time Series Univariate Time Series: Structural Breaks (End of Sample)
End-of-sample breaks
Tests here are powerful when T_1 ∈ [0.15T, 0.85T]
Not so powerful for breaks near the endpoints of the sample
This is especially problematic for forecasting
Andrews (2003) provides an end-of-sample instability test

95 Time Series Univariate Time Series: Structural Breaks (Forecasting)
Forecasting with breaks
Pesaran & Timmermann (2007), Pesaran et al. (2013) discuss forecasting with structural breaks
No definitive solution
Solution #1: Estimate the breakdate, retain only the sample with t > T̂_1, and forecast from there
Solution #2: Augment the sample with some observations prior to T̂_1 to add efficiency at the expense of bias

96 Time Series Univariate Time Series: Structural Breaks (Multiple Breaks)
With multiple unknown breaks, there are two options
1 Joint estimation
Posit, say, two breaks versus no breaks
Conduct a bivariate grid search over the two possible breakdates, T_1 and T_2
Compute similar Sup-statistics
Estimate the breakdates by minimizing the SSE across all combinations of T_1 and T_2
2 Sequential estimation
Test for one break date as above
If there is evidence of a break at T̂_1, then test for a second break conditional on T̂_1
The estimator is consistent for one of the breaks on the first pass, and thus will (asymptotically) find both breaks
One may iterate: Estimate T̂_1, then T̂_2 | T̂_1, then T̂_1 | T̂_2, etc.

97 Time Series Multivariate Time Series: VARs
Vector autoregressions (VARs) model several time series processes simultaneously
Differs from the ARDL(p, q) model in that only lagged variables are included as regressors
Differs from the ARMA(p, q) model in that the disturbances cannot be autocorrelated
Let y_t = [y_1t y_2t ⋯ y_nt]' denote a vector of n variables at time t

98 The VAR(p) model is given by
y_1t = α_1 + φ¹_11 y_{1,t−1} + ⋯ + φ¹_n1 y_{n,t−1} + ⋯ + φ¹_1p y_{1,t−p} + ⋯ + φ¹_np y_{n,t−p} + ε_1t
y_2t = α_2 + φ²_11 y_{1,t−1} + ⋯ + φ²_n1 y_{n,t−1} + ⋯ + φ²_1p y_{1,t−p} + ⋯ + φ²_np y_{n,t−p} + ε_2t
⋮
y_nt = α_n + φⁿ_11 y_{1,t−1} + ⋯ + φⁿ_n1 y_{n,t−1} + ⋯ + φⁿ_1p y_{1,t−p} + ⋯ + φⁿ_np y_{n,t−p} + ε_nt
Or, more compactly, the model is given by
y_t = α + Φ_1 y_{t−1} + Φ_2 y_{t−2} + ⋯ + Φ_p y_{t−p} + ε_t
where ε_t ∼ iid N(0, Σ), α is an n × 1 vector, and Σ and the Φ_s are n × n matrices
Φ_s = [ φ¹_1s  φ¹_2s  ⋯  φ¹_ns ]      y_{t−s} = [ y_{1,t−s} ]
      [ φ²_1s  φ²_2s  ⋯  φ²_ns ]                [ y_{2,t−s} ]
      [  ⋮                 ⋮   ]                [     ⋮     ]
      [ φⁿ_1s  φⁿ_2s  ⋯  φⁿ_ns ]                [ y_{n,t−s} ]

99 Notes
Non-iid errors lead to a VARMA model, which is much more difficult and rarely used (Fruet Dias & Kapetanios 2018)
VAR(p) models can be consistently estimated applying OLS one eqtn at a time if the regressors are exogenous and the model is stationary
In this case, the model is just SUR with identical covariates in each eqtn: no efficiency gain to joint estimation
Σ is estimated by Σ̂ = (1/T) Σ_t ε̂_t ε̂_t'
Stata: -sureg-
Hamilton (1994) provides detailed coverage of ML estimation
Stationarity requires the roots of
|I_n − Φ_1 z − Φ_2 z² − ⋯ − Φ_p z^p| = 0
to lie outside the unit circle
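In practice the root condition is checked through the companion matrix: stacking a VAR(p) into a VAR(1), the roots of the determinantal polynomial lie outside the unit circle iff the companion matrix's eigenvalues lie inside it. A sketch with illustrative coefficient matrices:

```python
import numpy as np

# companion matrix of a bivariate VAR(2):
#   [ Phi_1  Phi_2 ]
#   [  I      0    ]
# eigenvalues inside the unit circle <=> roots of
# |I - Phi_1 z - Phi_2 z^2| = 0 outside it (stationarity)
Phi1 = np.array([[0.5, 0.1],
                 [0.2, 0.3]])
Phi2 = np.array([[0.1, 0.0],
                 [0.0, 0.1]])
n = 2
companion = np.vstack([
    np.hstack([Phi1, Phi2]),
    np.hstack([np.eye(n), np.zeros((n, n))]),
])
eigvals = np.linalg.eigvals(companion)
stationary = bool(np.all(np.abs(eigvals) < 1))
```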

100 Granger Causality
Definition
The variable x fails to Granger cause the variable y if
MSE[ Ê[y_{t+s} | y_t, y_{t−1}, …] ] = MSE[ Ê[y_{t+s} | y_t, y_{t−1}, …, x_t, x_{t−1}, …] ]
for s > 0
Notes
Differs dramatically from the definition of causality in microeconometrics (Lechner 2010)
Causality here is synonymous with predictive ability of a variable conditional on the complete history of the variable being forecasted
In a VAR, an F-test that all the coeffs on a particular y_i are jointly zero in the eqtn for y_j is a test for whether y_i Granger causes y_j
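The F-test amounts to comparing a regression of y on its own history against one that also includes x's history. A simulated sketch where x does help predict y (all numbers illustrative):

```python
import numpy as np

# simulate a bivariate system in which lagged x enters the y equation
rng = np.random.default_rng(7)
T = 500
x = np.zeros(T)
y = np.zeros(T)
for t in range(1, T):
    x[t] = 0.5 * x[t - 1] + rng.normal()
    y[t] = 0.4 * y[t - 1] + 0.3 * x[t - 1] + rng.normal()

def sse(Z, target):
    b = np.linalg.lstsq(Z, target, rcond=None)[0]
    return ((target - Z @ b) ** 2).sum()

ones = np.ones(T - 1)
sse_r = sse(np.column_stack([ones, y[:-1]]), y[1:])          # own history only
sse_u = sse(np.column_stack([ones, y[:-1], x[:-1]]), y[1:])  # add x's history
F = (sse_r - sse_u) / (sse_u / (T - 1 - 3))                  # one restriction
```

A large F rejects the null that x fails to Granger cause y.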

101 Impulse Response Functions
As in a univariate AR(p) process, a VAR(p) may be expressed as an MA(∞) system
The coeffs in this representation are known as impulse responses
The IRF gives the response of y_{i,t+s} to a one-time, unit change in ε_jt (where i, j refer to one of the n variables in the system)
IRF = ∂y_{i,t+s}/∂ε_jt
However, IRFs should be with respect to independent shocks
If ε_t ∼ iid N(0, Σ), then the errors are decomposed into true shocks, v_t ∼ iid N(0, diag(σ²)), via a Cholesky decomposition
IRFs are then defined as
IRF = ∂y_{i,t+s}/∂v_jt
Note: Cholesky decomp results are not invariant to the ordering of the variables, y_1, …, y_n. Generalized IRFs address this issue.
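For a VAR(1) the MA(∞) coefficients are powers of Φ, so the orthogonalized horizon-s responses are Φ^s times the Cholesky factor of Σ. A sketch with illustrative parameter values:

```python
import numpy as np

# orthogonalized IRFs for a bivariate VAR(1): y_t = Phi y_{t-1} + eps_t,
# eps_t ~ (0, Sigma); with P lower triangular and P P' = Sigma, the
# horizon-s response to the orthogonal shocks v_t is Phi^s @ P
Phi = np.array([[0.5, 0.1],
                [0.2, 0.3]])
Sigma = np.array([[1.0, 0.4],
                  [0.4, 0.5]])
P = np.linalg.cholesky(Sigma)
horizons = 6
irf = np.empty((horizons, 2, 2))
Phi_s = np.eye(2)
for s in range(horizons):
    irf[s] = Phi_s @ P   # irf[s][i, j]: response of y_i at t+s to shock j at t
    Phi_s = Phi_s @ Phi
```

Because P is lower triangular, the first variable does not respond to the second shock on impact, which is exactly the ordering dependence flagged above.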

102 Example (n = 2)

103 Structural VARs (SVARs)
Differ from VARs in that contemporaneous values of regressors appear
Simultaneous eqtn setup implies that OLS is no longer consistent since the contemporaneous regressors cannot be exogenous
Identification relies on external or internal instruments
External instruments are exclusion restrictions as typically construed in IV estimation
Internal instruments rely on alternative restrictions
Estimation proceeds by estimating the system of reduced form eqtns
Zero or sign restrictions used for identification
Identification via heteroskedasticity
Stata: -var-, -varbasic-, -svar-, -irf-

104 Time Series Nonstationary Time Series Processes
Outline
1 Common nonstationary processes
2 Consequences of ignoring nonstationarity
3 Testing for nonstationarity
4 Estimation with nonstationary data

105 Time Series Nonstationary Time Series Processes: Unit Roots
Many economic variables are nonstationary
Most common reason is the presence of a strong trend over time
Two categories of trends
Deterministic
Stochastic

106 Three common models for nonstationary variables
1 Random walk: y_t = y_{t−1} + ε_t
2 Random walk with drift: y_t = α + y_{t−1} + ε_t
3 Trend stationary process: y_t = α + βt + ε_t
The first two are examples of stochastic trends, while the last is an example of a deterministic trend

107 Each is a special case of
(1 − L)y_t = µ + υ_t
where µ = 0, α, or β and υ_t is stationary if ε_t is stationary
Recall, the model is stationary if the roots of the characteristic equation, C(z) = 0, lie outside the unit circle
For each case, the characteristic equation is
1 − z = 0 ⟹ z = 1
which is referred to as a unit root

108 Consider the random walk and random walk with drift models
y_t = α + y_{t−1} + ε_t = (α + ε_t)/(1 − L) = Σ_{s=0}^∞ (α + ε_{t−s})
where α = 0 in the model without drift
If ε_t is white noise, then Var(y_t) → ∞ and Cov(y_t, y_{t−s}) = (t − s)σ² depends on t and s regardless of α
Autocorrelations given by ρ_s = Corr(y_t, y_{t−s}) = √((t − s)/t), which starts close to unity and decays slowly
If α ≠ 0, then E[y_t] → ±∞ as well
Consider the trend stationary process
ε's lead to only temporary departures from the trend line
E[y_t] = α + βt if E[ε_t] = 0, which depends on t
AC and PAC plots of examples follow...
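The linear variance growth is easy to see by simulating many random-walk paths and computing the cross-path variance at each date (a sketch; σ and the dimensions are arbitrary choices):

```python
import numpy as np

# cross-sectional check that a random walk's variance grows linearly in t
rng = np.random.default_rng(3)
sigma = 0.5
T, n_paths = 400, 5000
steps = rng.normal(scale=sigma, size=(n_paths, T))
paths = steps.cumsum(axis=1)   # y_t = sum of shocks (y_0 = 0, alpha = 0)
var_t = paths.var(axis=0)      # variance across paths at each date
# theory: Var(y_t) = t * sigma^2, so var_t[t - 1] should be near t * 0.25
```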

109 Figure: Time series plot, AC function, and PAC function for a random walk process. Note: y = L.y + e; e ~ N(0, 0.25).

110 Figure: Time series plot, AC function, and PAC function for a random walk with drift process. Note: y = L.y + e; e ~ N(0, 0.25).

111 Figure: Time series plot, AC function, and PAC function for a trend stationary process. Note: y = t + e; e ~ N(0, 0.25).

112 Time Series Nonstationary Time Series Processes: Consequences
1 Coefficients on autoregressive terms, γ_s, s = 1, …, p, are biased downward in absolute value
For example, in the ARMA(1, 0) model, if γ = 1, then E[γ̂] < 1, with a bias that vanishes at rate 1/T, so γ̂ is biased but consistent
Thus, a standard t-test will over-reject H_0: γ = 1
2 The usual OLS t-statistic can have a non-normal dbn even as T → ∞, invalidating conventional CIs and hypothesis testing


More information

FE570 Financial Markets and Trading. Stevens Institute of Technology

FE570 Financial Markets and Trading. Stevens Institute of Technology FE570 Financial Markets and Trading Lecture 5. Linear Time Series Analysis and Its Applications (Ref. Joel Hasbrouck - Empirical Market Microstructure ) Steve Yang Stevens Institute of Technology 9/25/2012

More information

Empirical Macroeconomics

Empirical Macroeconomics Empirical Macroeconomics Francesco Franco Nova SBE April 21, 2015 Francesco Franco Empirical Macroeconomics 1/33 Growth and Fluctuations Supply and Demand Figure : US dynamics Francesco Franco Empirical

More information

at least 50 and preferably 100 observations should be available to build a proper model

at least 50 and preferably 100 observations should be available to build a proper model III Box-Jenkins Methods 1. Pros and Cons of ARIMA Forecasting a) need for data at least 50 and preferably 100 observations should be available to build a proper model used most frequently for hourly or

More information

ECON3327: Financial Econometrics, Spring 2016

ECON3327: Financial Econometrics, Spring 2016 ECON3327: Financial Econometrics, Spring 2016 Wooldridge, Introductory Econometrics (5th ed, 2012) Chapter 11: OLS with time series data Stationary and weakly dependent time series The notion of a stationary

More information

Lecture 1: Stationary Time Series Analysis

Lecture 1: Stationary Time Series Analysis Syllabus Stationarity ARMA AR MA Model Selection Estimation Lecture 1: Stationary Time Series Analysis 222061-1617: Time Series Econometrics Spring 2018 Jacek Suda Syllabus Stationarity ARMA AR MA Model

More information

Ch. 14 Stationary ARMA Process

Ch. 14 Stationary ARMA Process Ch. 14 Stationary ARMA Process A general linear stochastic model is described that suppose a time series to be generated by a linear aggregation of random shock. For practical representation it is desirable

More information

Heteroskedasticity. We now consider the implications of relaxing the assumption that the conditional

Heteroskedasticity. We now consider the implications of relaxing the assumption that the conditional Heteroskedasticity We now consider the implications of relaxing the assumption that the conditional variance V (u i x i ) = σ 2 is common to all observations i = 1,..., In many applications, we may suspect

More information

ARIMA Modelling and Forecasting

ARIMA Modelling and Forecasting ARIMA Modelling and Forecasting Economic time series often appear nonstationary, because of trends, seasonal patterns, cycles, etc. However, the differences may appear stationary. Δx t x t x t 1 (first

More information

Econometrics. 9) Heteroscedasticity and autocorrelation

Econometrics. 9) Heteroscedasticity and autocorrelation 30C00200 Econometrics 9) Heteroscedasticity and autocorrelation Timo Kuosmanen Professor, Ph.D. http://nomepre.net/index.php/timokuosmanen Today s topics Heteroscedasticity Possible causes Testing for

More information

Autoregressive and Moving-Average Models

Autoregressive and Moving-Average Models Chapter 3 Autoregressive and Moving-Average Models 3.1 Introduction Let y be a random variable. We consider the elements of an observed time series {y 0,y 1,y2,...,y t } as being realizations of this randoms

More information

Trend-Cycle Decompositions

Trend-Cycle Decompositions Trend-Cycle Decompositions Eric Zivot April 22, 2005 1 Introduction A convenient way of representing an economic time series y t is through the so-called trend-cycle decomposition y t = TD t + Z t (1)

More information

Title. Description. var intro Introduction to vector autoregressive models

Title. Description. var intro Introduction to vector autoregressive models Title var intro Introduction to vector autoregressive models Description Stata has a suite of commands for fitting, forecasting, interpreting, and performing inference on vector autoregressive (VAR) models

More information

Econometrics Summary Algebraic and Statistical Preliminaries

Econometrics Summary Algebraic and Statistical Preliminaries Econometrics Summary Algebraic and Statistical Preliminaries Elasticity: The point elasticity of Y with respect to L is given by α = ( Y/ L)/(Y/L). The arc elasticity is given by ( Y/ L)/(Y/L), when L

More information

Univariate Nonstationary Time Series 1

Univariate Nonstationary Time Series 1 Univariate Nonstationary Time Series 1 Sebastian Fossati University of Alberta 1 These slides are based on Eric Zivot s time series notes available at: http://faculty.washington.edu/ezivot Introduction

More information

1 Class Organization. 2 Introduction

1 Class Organization. 2 Introduction Time Series Analysis, Lecture 1, 2018 1 1 Class Organization Course Description Prerequisite Homework and Grading Readings and Lecture Notes Course Website: http://www.nanlifinance.org/teaching.html wechat

More information

10. Time series regression and forecasting

10. Time series regression and forecasting 10. Time series regression and forecasting Key feature of this section: Analysis of data on a single entity observed at multiple points in time (time series data) Typical research questions: What is the

More information

Lecture 1: Fundamental concepts in Time Series Analysis (part 2)

Lecture 1: Fundamental concepts in Time Series Analysis (part 2) Lecture 1: Fundamental concepts in Time Series Analysis (part 2) Florian Pelgrin University of Lausanne, École des HEC Department of mathematics (IMEA-Nice) Sept. 2011 - Jan. 2012 Florian Pelgrin (HEC)

More information

Midterm Suggested Solutions

Midterm Suggested Solutions CUHK Dept. of Economics Spring 2011 ECON 4120 Sung Y. Park Midterm Suggested Solutions Q1 (a) In time series, autocorrelation measures the correlation between y t and its lag y t τ. It is defined as. ρ(τ)

More information

7 Introduction to Time Series Time Series vs. Cross-Sectional Data Detrending Time Series... 15

7 Introduction to Time Series Time Series vs. Cross-Sectional Data Detrending Time Series... 15 Econ 495 - Econometric Review 1 Contents 7 Introduction to Time Series 3 7.1 Time Series vs. Cross-Sectional Data............ 3 7.2 Detrending Time Series................... 15 7.3 Types of Stochastic

More information

Elements of Multivariate Time Series Analysis

Elements of Multivariate Time Series Analysis Gregory C. Reinsel Elements of Multivariate Time Series Analysis Second Edition With 14 Figures Springer Contents Preface to the Second Edition Preface to the First Edition vii ix 1. Vector Time Series

More information

E 4160 Autumn term Lecture 9: Deterministic trends vs integrated series; Spurious regression; Dickey-Fuller distribution and test

E 4160 Autumn term Lecture 9: Deterministic trends vs integrated series; Spurious regression; Dickey-Fuller distribution and test E 4160 Autumn term 2016. Lecture 9: Deterministic trends vs integrated series; Spurious regression; Dickey-Fuller distribution and test Ragnar Nymoen Department of Economics, University of Oslo 24 October

More information

A time series is called strictly stationary if the joint distribution of every collection (Y t

A time series is called strictly stationary if the joint distribution of every collection (Y t 5 Time series A time series is a set of observations recorded over time. You can think for example at the GDP of a country over the years (or quarters) or the hourly measurements of temperature over a

More information

Stat 5100 Handout #12.e Notes: ARIMA Models (Unit 7) Key here: after stationary, identify dependence structure (and use for forecasting)

Stat 5100 Handout #12.e Notes: ARIMA Models (Unit 7) Key here: after stationary, identify dependence structure (and use for forecasting) Stat 5100 Handout #12.e Notes: ARIMA Models (Unit 7) Key here: after stationary, identify dependence structure (and use for forecasting) (overshort example) White noise H 0 : Let Z t be the stationary

More information

Freeing up the Classical Assumptions. () Introductory Econometrics: Topic 5 1 / 94

Freeing up the Classical Assumptions. () Introductory Econometrics: Topic 5 1 / 94 Freeing up the Classical Assumptions () Introductory Econometrics: Topic 5 1 / 94 The Multiple Regression Model: Freeing Up the Classical Assumptions Some or all of classical assumptions needed for derivations

More information

Lecture 1: Stationary Time Series Analysis

Lecture 1: Stationary Time Series Analysis Syllabus Stationarity ARMA AR MA Model Selection Estimation Forecasting Lecture 1: Stationary Time Series Analysis 222061-1617: Time Series Econometrics Spring 2018 Jacek Suda Syllabus Stationarity ARMA

More information

Time Series Econometrics 4 Vijayamohanan Pillai N

Time Series Econometrics 4 Vijayamohanan Pillai N Time Series Econometrics 4 Vijayamohanan Pillai N Vijayamohan: CDS MPhil: Time Series 5 1 Autoregressive Moving Average Process: ARMA(p, q) Vijayamohan: CDS MPhil: Time Series 5 2 1 Autoregressive Moving

More information

7. MULTIVARATE STATIONARY PROCESSES

7. MULTIVARATE STATIONARY PROCESSES 7. MULTIVARATE STATIONARY PROCESSES 1 1 Some Preliminary Definitions and Concepts Random Vector: A vector X = (X 1,..., X n ) whose components are scalar-valued random variables on the same probability

More information

Discrete time processes

Discrete time processes Discrete time processes Predictions are difficult. Especially about the future Mark Twain. Florian Herzog 2013 Modeling observed data When we model observed (realized) data, we encounter usually the following

More information

Economics 536 Lecture 7. Introduction to Specification Testing in Dynamic Econometric Models

Economics 536 Lecture 7. Introduction to Specification Testing in Dynamic Econometric Models University of Illinois Fall 2016 Department of Economics Roger Koenker Economics 536 Lecture 7 Introduction to Specification Testing in Dynamic Econometric Models In this lecture I want to briefly describe

More information

Econometrics I. Professor William Greene Stern School of Business Department of Economics 25-1/25. Part 25: Time Series

Econometrics I. Professor William Greene Stern School of Business Department of Economics 25-1/25. Part 25: Time Series Econometrics I Professor William Greene Stern School of Business Department of Economics 25-1/25 Econometrics I Part 25 Time Series 25-2/25 Modeling an Economic Time Series Observed y 0, y 1,, y t, What

More information

7 Introduction to Time Series

7 Introduction to Time Series Econ 495 - Econometric Review 1 7 Introduction to Time Series 7.1 Time Series vs. Cross-Sectional Data Time series data has a temporal ordering, unlike cross-section data, we will need to changes some

More information

ECON/FIN 250: Forecasting in Finance and Economics: Section 6: Standard Univariate Models

ECON/FIN 250: Forecasting in Finance and Economics: Section 6: Standard Univariate Models ECON/FIN 250: Forecasting in Finance and Economics: Section 6: Standard Univariate Models Patrick Herb Brandeis University Spring 2016 Patrick Herb (Brandeis University) Standard Univariate Models ECON/FIN

More information

Cointegration, Stationarity and Error Correction Models.

Cointegration, Stationarity and Error Correction Models. Cointegration, Stationarity and Error Correction Models. STATIONARITY Wold s decomposition theorem states that a stationary time series process with no deterministic components has an infinite moving average

More information

Some Time-Series Models

Some Time-Series Models Some Time-Series Models Outline 1. Stochastic processes and their properties 2. Stationary processes 3. Some properties of the autocorrelation function 4. Some useful models Purely random processes, random

More information

Single Equation Linear GMM with Serially Correlated Moment Conditions

Single Equation Linear GMM with Serially Correlated Moment Conditions Single Equation Linear GMM with Serially Correlated Moment Conditions Eric Zivot November 2, 2011 Univariate Time Series Let {y t } be an ergodic-stationary time series with E[y t ]=μ and var(y t )

More information

Advanced Econometrics

Advanced Econometrics Advanced Econometrics Marco Sunder Nov 04 2010 Marco Sunder Advanced Econometrics 1/ 25 Contents 1 2 3 Marco Sunder Advanced Econometrics 2/ 25 Music Marco Sunder Advanced Econometrics 3/ 25 Music Marco

More information

Lecture 4a: ARMA Model

Lecture 4a: ARMA Model Lecture 4a: ARMA Model 1 2 Big Picture Most often our goal is to find a statistical model to describe real time series (estimation), and then predict the future (forecasting) One particularly popular model

More information

7. GENERALIZED LEAST SQUARES (GLS)

7. GENERALIZED LEAST SQUARES (GLS) 7. GENERALIZED LEAST SQUARES (GLS) [1] ASSUMPTIONS: Assume SIC except that Cov(ε) = E(εε ) = σ Ω where Ω I T. Assume that E(ε) = 0 T 1, and that X Ω -1 X and X ΩX are all positive definite. Examples: Autocorrelation:

More information

Chapter 4: Models for Stationary Time Series

Chapter 4: Models for Stationary Time Series Chapter 4: Models for Stationary Time Series Now we will introduce some useful parametric models for time series that are stationary processes. We begin by defining the General Linear Process. Let {Y t

More information

New Introduction to Multiple Time Series Analysis

New Introduction to Multiple Time Series Analysis Helmut Lütkepohl New Introduction to Multiple Time Series Analysis With 49 Figures and 36 Tables Springer Contents 1 Introduction 1 1.1 Objectives of Analyzing Multiple Time Series 1 1.2 Some Basics 2

More information

B y t = γ 0 + Γ 1 y t + ε t B(L) y t = γ 0 + ε t ε t iid (0, D) D is diagonal

B y t = γ 0 + Γ 1 y t + ε t B(L) y t = γ 0 + ε t ε t iid (0, D) D is diagonal Structural VAR Modeling for I(1) Data that is Not Cointegrated Assume y t =(y 1t,y 2t ) 0 be I(1) and not cointegrated. That is, y 1t and y 2t are both I(1) and there is no linear combination of y 1t and

More information

Cointegrated VAR s. Eduardo Rossi University of Pavia. November Rossi Cointegrated VAR s Financial Econometrics / 56

Cointegrated VAR s. Eduardo Rossi University of Pavia. November Rossi Cointegrated VAR s Financial Econometrics / 56 Cointegrated VAR s Eduardo Rossi University of Pavia November 2013 Rossi Cointegrated VAR s Financial Econometrics - 2013 1 / 56 VAR y t = (y 1t,..., y nt ) is (n 1) vector. y t VAR(p): Φ(L)y t = ɛ t The

More information

Week 5 Quantitative Analysis of Financial Markets Characterizing Cycles

Week 5 Quantitative Analysis of Financial Markets Characterizing Cycles Week 5 Quantitative Analysis of Financial Markets Characterizing Cycles Christopher Ting http://www.mysmu.edu/faculty/christophert/ Christopher Ting : christopherting@smu.edu.sg : 6828 0364 : LKCSB 5036

More information

ECON 4160, Spring term Lecture 12

ECON 4160, Spring term Lecture 12 ECON 4160, Spring term 2013. Lecture 12 Non-stationarity and co-integration 2/2 Ragnar Nymoen Department of Economics 13 Nov 2013 1 / 53 Introduction I So far we have considered: Stationary VAR, with deterministic

More information

ARIMA Models. Jamie Monogan. January 16, University of Georgia. Jamie Monogan (UGA) ARIMA Models January 16, / 27

ARIMA Models. Jamie Monogan. January 16, University of Georgia. Jamie Monogan (UGA) ARIMA Models January 16, / 27 ARIMA Models Jamie Monogan University of Georgia January 16, 2018 Jamie Monogan (UGA) ARIMA Models January 16, 2018 1 / 27 Objectives By the end of this meeting, participants should be able to: Argue why

More information

3. ARMA Modeling. Now: Important class of stationary processes

3. ARMA Modeling. Now: Important class of stationary processes 3. ARMA Modeling Now: Important class of stationary processes Definition 3.1: (ARMA(p, q) process) Let {ɛ t } t Z WN(0, σ 2 ) be a white noise process. The process {X t } t Z is called AutoRegressive-Moving-Average

More information

Univariate linear models

Univariate linear models Univariate linear models The specification process of an univariate ARIMA model is based on the theoretical properties of the different processes and it is also important the observation and interpretation

More information

Linear Regression with Time Series Data

Linear Regression with Time Series Data Econometrics 2 Linear Regression with Time Series Data Heino Bohn Nielsen 1of21 Outline (1) The linear regression model, identification and estimation. (2) Assumptions and results: (a) Consistency. (b)

More information

Single Equation Linear GMM with Serially Correlated Moment Conditions

Single Equation Linear GMM with Serially Correlated Moment Conditions Single Equation Linear GMM with Serially Correlated Moment Conditions Eric Zivot October 28, 2009 Univariate Time Series Let {y t } be an ergodic-stationary time series with E[y t ]=μ and var(y t )

More information

7. Integrated Processes

7. Integrated Processes 7. Integrated Processes Up to now: Analysis of stationary processes (stationary ARMA(p, q) processes) Problem: Many economic time series exhibit non-stationary patterns over time 226 Example: We consider

More information

E 4101/5101 Lecture 9: Non-stationarity

E 4101/5101 Lecture 9: Non-stationarity E 4101/5101 Lecture 9: Non-stationarity Ragnar Nymoen 30 March 2011 Introduction I Main references: Hamilton Ch 15,16 and 17. Davidson and MacKinnon Ch 14.3 and 14.4 Also read Ch 2.4 and Ch 2.5 in Davidson

More information