Lecture 1: Stationary Time Series Analysis
Time Series Econometrics, Spring 2018
Jacek Suda
Website
Class website:
Office hours: by appointment
Description
The purpose of this course is to familiarize students with current techniques used in macroeconomic time series models, with applications in macroeconomics, international finance, and finance, with the ultimate aim of providing the necessary tools to conduct original research in the area. The focus is on implementation of the models presented in the course. Topics include ARMA models; VARs and impulse response functions; unit roots and structural breaks; spurious regressions; cointegration and VECM; ARCH models of volatility; and trend/cycle decomposition methods, including Kalman filtering. We will mostly work in the classical framework in the time domain but will touch upon Bayesian and frequency-domain frameworks.
Prerequisites
I will assume some familiarity with matrix algebra and introductory statistics and econometrics. The course is a continuation of the Econometrics course, but we will also review univariate time series analysis.
Requirements
Attending lectures and participating in classroom discussions is essential to the learning process. There will be two or three homework assignments. The class ends with an in-class, two-hour exam. The assignments and exam will require the use of econometrics software. The weights in determining your grade are as follows:
Class participation 10%
Homework assignments 40%
Final exam 50%
Readings
The book that covers most of the material is Time Series Analysis by James D. Hamilton, Princeton University Press. Other texts:
- State-Space Models with Regime Switching, Chang-Jin Kim and Charles R. Nelson, MIT Press.
- New Introduction to Multiple Time Series Analysis, Helmut Lütkepohl, Springer-Verlag.
- Introduction to Bayesian Econometrics, Edward Greenberg, Cambridge University Press.
- Time Series and Panel Data Econometrics, M. Hashem Pesaran, Oxford University Press.
- Applied Econometric Time Series, Walter Enders, Wiley.
The readings also include journal articles and chapters from the above books.
Outline
1. Stationary Time Series Analysis: overview of ARMA models, state-space representation, Kalman filter
2. Structural Analysis: Granger causality, VARs, IRFs, estimation, variance decomposition; reduced-form and structural VAR models; distributed lag models
3. Unit Roots and Structural Breaks: unit root tests, structural break tests, trend/cycle decomposition, cointegration, VEC models
4. Nonlinearity (time permitting): ARCH/GARCH models, Markov switching, time-varying parameters, Gibbs sampling, threshold models
Today
Outline:
1. Covariance-Stationary Processes
2. Wold Decomposition Theorem
3. ARMA Models
   - Auto-Correlation Function (ACF)
   - Partial Auto-Correlation Function (PACF)
   - Model Selection
   - Estimation
   - Forecasting
STATIONARITY
Stochastic Process
Stochastic process: a collection of random variables {..., Y_{-1}, Y_0, Y_1, Y_2, ..., Y_T, ...} = {Y_t}.
An observed series {y_1, y_2, ..., y_T} is a realization of the stochastic process.
We want a model for {Y_t} to explain the observed realizations {y_t}_{t=1}^T.
Covariance-Stationarity
Definition: {Y_t} is covariance-stationary (weakly stationary) if
(i) E[Y_t] = µ for all t,
(ii) Cov(Y_t, Y_{t-j}) = E[(Y_t - µ)(Y_{t-j} - µ)] = γ_j for all t, j.
Notes:
- The mean is time-invariant.
- The covariance depends only on j, not on t.
- Var(Y_t) = γ_0, so the variance is also constant.
It is called weak stationarity because it only restricts the first two moments; higher moments can be time-varying.
Examples:
1. {Y_t} with E[Y_t] = 0, Var(Y_t) = σ², and E[Y_t Y_τ] = 0 for t ≠ τ: white noise (WN).
2. Y_t ~ iid(0, σ²): independent white noise.
3. Y_t ~ iid N(0, σ²): Gaussian white noise.
Strict (Strong) Stationarity
Definition: {Y_t} is strictly (strongly) stationary if for any values j_1, j_2, ..., j_n the joint distribution of (Y_t, Y_{t+j_1}, Y_{t+j_2}, ..., Y_{t+j_n}) depends only on the intervals separating the dates (j_1, j_2, ..., j_n) and not on the date t itself. For all τ, t_1, t_2, ..., t_n:
F_Y(y_{t_1}, y_{t_2}, ..., y_{t_n}) = F_Y(y_{t_1+τ}, y_{t_2+τ}, ..., y_{t_n+τ}).
If a process is strictly stationary with finite second moments, it is also covariance-stationary.
For Gaussian processes, covariance stationarity implies strong stationarity: the whole distribution depends only on the first two moments.
Nonstationary Processes
Examples:
1. Y_t = βt + ε_t, ε_t ~ WN, where t is a deterministic time trend: βt is the deterministic part and ε_t the stochastic component. E[Y_t] = βt depends on t, so Y_t is not covariance stationary. But X_t = Y_t - βt is covariance stationary.
2. Y_t = Y_{t-1} + ε_t, ε_t ~ WN, Y_0 constant: a random walk. Solving recursively:
Y_t = Σ_{j=1}^t ε_j + Y_0.
E[Y_t] = Y_0, a time-invariant mean, but Var(Y_t) = tσ² depends on t. X_t = Y_t - Y_{t-1} = ε_t is covariance stationary.
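Both facts about the random walk are easy to see in a small simulation; a minimal sketch, assuming unit-variance Gaussian white noise (all parameter choices illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
n_paths, T = 200, 500
eps = rng.normal(0.0, 1.0, size=(n_paths, T))   # WN(0, 1) innovations

# Random walk Y_t = Y_{t-1} + eps_t with Y_0 = 0, so Var(Y_t) = t * sigma^2
Y = np.cumsum(eps, axis=1)
var_t = Y.var(axis=0)            # cross-path variance at each date t

# The level's variance grows with t (nonstationary) ...
grows = bool(var_t[449] > 4 * var_t[49])
# ... but the first difference X_t = Y_t - Y_{t-1} = eps_t has constant variance.
X = np.diff(Y, axis=1)
```

The variance at t = 450 is roughly nine times the variance at t = 50, while the differenced series has variance near 1 at every date.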
Wold's Decomposition Theorem
Any covariance-stationary {Y_t} has an infinite-order moving-average representation:
Y_t = Σ_{j=0}^∞ ψ_j ε_{t-j} + κ_t,
with ψ_0 = 1, Σ_{j=0}^∞ ψ_j² < ∞, ε_t ~ WN(0, σ²), E(ε_t κ_s) = 0 for all t, s, and κ_t a deterministic term (perfectly forecastable).
Examples of the deterministic term: κ_t = µ, a constant mean; κ_t = a cos(λt) + b sin(λt), with λ a fixed number and Ea = Eb = Eab = 0.
The sum is a linear combination of the ε's (innovations over time), where
ε_t = Y_t - E(Y_t | Y_{t-1}, Y_{t-2}, ...).
The weights ψ_j do not depend on time t; they depend only on j, i.e. on how long ago the shock ε occurred.
Wold's Decomposition Theorem: Illustration
Let X_t = Y_t - κ_t (a purely indeterministic process). Then
E[X_t] = Σ_{j=0}^∞ ψ_j E[ε_{t-j}] = 0,
E[X_t²] = Σ_{j=0}^∞ ψ_j² E[ε²_{t-j}] = σ² Σ_{j=0}^∞ ψ_j² < ∞,
as the ε_t are uncorrelated; we have a constant, finite variance. Also
E[X_t X_{t-j}] = E[(ε_t + ψ_1 ε_{t-1} + ψ_2 ε_{t-2} + ...)(ε_{t-j} + ψ_1 ε_{t-j-1} + ψ_2 ε_{t-j-2} + ...)]
= σ²(ψ_j + ψ_{j+1} ψ_1 + ψ_{j+2} ψ_2 + ...) = σ² Σ_{k=0}^∞ ψ_k ψ_{k+j},
which depends on j, not t. So the process is covariance stationary: constant mean and variance, with autocovariances that depend only on j.
ARMA
ARMA Models
Approximate the Wold form with a finite number of parameters.
Wold form: Y_t - µ = Σ_{j=0}^∞ ψ_j ε_{t-j}, ε_t ~ WN.
ARMA(p, q): Y_t - µ = φ_1(Y_{t-1} - µ) + ... + φ_p(Y_{t-p} - µ) + ε_t + θ_1 ε_{t-1} + ... + θ_q ε_{t-q}.
Lag Operator
Define the operator L by L X_t ≡ X_{t-1}, so L² X_t = L(L X_t) = X_{t-2} and, in general, L^k X_t = X_{t-k}.
If c is a constant, Lc = c.
Also, L^{-1} X_t ≡ X_{t+1} and ΔX_t = (1 - L)X_t = X_t - X_{t-1}.
Lag Operator
It satisfies
L(αX_t + βY_t) = αX_{t-1} + βY_{t-1},
(aL + bL²) X_t = aX_{t-1} + bX_{t-2},
and, when |φ| < 1,
lim_{j→∞} (1 + φL + φ²L² + ... + φ^j L^j) = (1 - φL)^{-1}.
ARMA Models in Lag Notation
ARMA(p, q):
Y_t - µ = φ_1(Y_{t-1} - µ) + ... + φ_p(Y_{t-p} - µ) + ε_t + θ_1 ε_{t-1} + ... + θ_q ε_{t-q},
or
(Y_t - µ) - φ_1(Y_{t-1} - µ) - ... - φ_p(Y_{t-p} - µ) = ε_t + θ_1 ε_{t-1} + ... + θ_q ε_{t-q}.
With the lag operator: φ(L)(Y_t - µ) = θ(L)ε_t, where
φ(L) = 1 - φ_1 L - φ_2 L² - ... - φ_p L^p,
θ(L) = 1 + θ_1 L + θ_2 L² + ... + θ_q L^q.
Stochastic Difference Equation (SDE) Representation
Let X_t = Y_t - µ and w_t = θ(L)ε_t. Then φ(L)X_t = w_t, or
X_t = φ_1 X_{t-1} + ... + φ_p X_{t-p} + w_t,
a pth-order stochastic difference equation.
SDE Representation: AR(1) Example
First-order SDE (AR(1)): X_t = φX_{t-1} + ε_t, ε_t ~ WN.
Solve for the Wold form by recursive substitution:
X_t = φ^{t+1} X_{-1} + φ^t ε_0 + φ^{t-1} ε_1 + ... + φε_{t-1} + ε_t = φ^{t+1} X_{-1} + Σ_{i=0}^t ψ_i ε_{t-i},
where X_{-1} is an initial condition and ψ_i = φ^i. We approximated the Wold form with a one-parameter form for AR(1).
Dynamic Multiplier
The dynamic multiplier measures the effect of ε_t on subsequent values of X:
∂X_{t+j}/∂ε_t = ∂X_j/∂ε_0 = ψ_j.
For X_t an AR(1) process,
∂X_{t+j}/∂ε_t = ψ_j = φ^j.
The dynamic multiplier for any linear difference equation depends only on the horizon j, not on time t.
Impulse Response Function
The impulse-response function is the sequence of dynamic multipliers ψ_s as a function of the time s elapsed since a one-time impulse to ε_t.
[Figure: impulse response function, ψ_s plotted against time.]
Cumulative Impact
Consider a permanent increase in ε from time t on, i.e. ε_t = 1, ε_{t+1} = 1, ε_{t+2} = 1, .... The effect on X_{t+j} is
∂X_{t+j}/∂ε_t + ∂X_{t+j}/∂ε_{t+1} + ∂X_{t+j}/∂ε_{t+2} + ... + ∂X_{t+j}/∂ε_{t+j} = ψ_j + ψ_{j-1} + ... + ψ_1 + 1.
In the limit, as j → ∞,
lim_{j→∞} [∂X_{t+j}/∂ε_t + ∂X_{t+j}/∂ε_{t+1} + ... + ∂X_{t+j}/∂ε_{t+j}] = Σ_{j=0}^∞ ψ_j = ψ(1),
where ψ(1) = ψ(L)|_{L=1} = 1 + ψ_1 + ψ_2 + ....
AUTOREGRESSIVE PROCESS
AR(1) Model
Recall
X_t = φX_{t-1} + ε_t, ε_t ~ WN, so X_t = Σ_{j=0}^∞ φ^j ε_{t-j} = Σ_{j=0}^∞ ψ_j ε_{t-j}.
The Wold coefficients satisfy
ψ_j = φψ_{j-1}, with ψ_j = ∂X_{t+j}/∂ε_t.
If |φ| < 1, X_t is a stable and stationary solution to the first-order SDE.
If φ = 1, then ψ_j = 1 for all j, X_t = X_{-1} + Σ_{j=0}^t ε_j is neither a stationary nor a stable solution, and ψ(1) is infinite.
AR(1): Lag Notation
AR(1): X_t = φX_{t-1} + ε_t, ε_t ~ WN, i.e. (1 - φL)X_t = ε_t.
Multiply both sides by (1 - φL)^{-1}:
X_t = (1 - φL)^{-1} ε_t = (1 + φL + φ²L² + φ³L³ + ...)ε_t = Σ_{j=0}^∞ φ^j ε_{t-j} = Σ_{j=0}^∞ ψ_j ε_{t-j},
so ψ(L) = (1 - φL)^{-1}.
AR(1): Long-Run Effects
For AR(1) with |φ| < 1, the effect of a permanent increase in ε is
∂X_{t+j}/∂ε_t + ... + ∂X_{t+j}/∂ε_{t+j} = 1 + φ + φ² + φ³ + ... + φ^j,
and as j → ∞,
ψ(1) = 1 + φ + φ² + ... = 1/(1 - φ),
the cumulative consequence for X of a one-time change in ε:
Σ_{j=0}^∞ ∂X_{t+j}/∂ε_t = 1/(1 - φ).
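This limit is easy to verify numerically; a small sketch with an illustrative φ = 0.8:

```python
import numpy as np

phi = 0.8                            # illustrative AR(1) coefficient, |phi| < 1
psi = phi ** np.arange(200)          # Wold/IRF coefficients psi_j = phi^j
cumulative = psi.sum()               # truncated sum approximating psi(1)
closed_form = 1.0 / (1.0 - phi)      # long-run multiplier 1/(1 - phi) = 5
```

The truncation error is phi^200/(1 - phi), which is negligible for |φ| < 1.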
AR(1): Mean
Intercept representation for Y_t ≡ X_t + µ:
Y_t = c + φY_{t-1} + ε_t.
Taking expectations: E[Y_t] = c + φE[Y_{t-1}] + E[ε_t]. Since the process is covariance stationary, E[Y_t] = E[Y_{t-1}], so
E[Y_t] = c/(1 - φ) ≡ µ, i.e. c = µ(1 - φ).
AR(1): Variance
Var(Y_t) = E[(Y_t - µ)²] = E[(φ(Y_{t-1} - µ) + ε_t)²]
= φ² E[(Y_{t-1} - µ)²] + 2φ E[(Y_{t-1} - µ)ε_t] + E[ε_t²].
Since Y_t is covariance stationary, Var(Y_t) = Var(Y_{t-1}), and since ε_t is uncorrelated with past values of Y, E[(Y_{t-1} - µ)ε_t] = 0. Hence
Var(Y_t) = σ²/(1 - φ²) ≡ γ_0.
AR(1): Covariance
Cov(Y_t, Y_{t-j}) = E[(Y_t - µ)(Y_{t-j} - µ)] = φE[(Y_{t-1} - µ)(Y_{t-j} - µ)] + E[ε_t(Y_{t-j} - µ)],
so Cov(Y_t, Y_{t-j}) ≡ γ_j = φγ_{j-1} for j ≥ 1.
AR(1): Moments
Summing up:
E[Y_t] = c/(1 - φ) ≡ µ,
Var(Y_t) = σ²/(1 - φ²) ≡ γ_0,
Cov(Y_t, Y_{t-j}) ≡ γ_j = φγ_{j-1}.
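These three formulas can be checked against a long simulated path; a sketch with illustrative parameters c = 2, φ = 0.5, σ = 1 (so µ = 4, γ_0 = 4/3, and γ_1/γ_0 = 0.5):

```python
import numpy as np

rng = np.random.default_rng(1)
c, phi, sigma, T = 2.0, 0.5, 1.0, 100_000   # illustrative AR(1) parameters
eps = rng.normal(0.0, sigma, T)

y = np.empty(T)
y[0] = c / (1.0 - phi)                      # start at the unconditional mean
for t in range(1, T):
    y[t] = c + phi * y[t - 1] + eps[t]

mu_hat = y.mean()                           # approaches c/(1 - phi) = 4
gamma0_hat = y.var()                        # approaches sigma^2/(1 - phi^2) = 4/3
d = y - mu_hat
rho1_hat = (d[1:] * d[:-1]).mean() / gamma0_hat   # approaches phi = 0.5
```

With T = 100,000 observations the sample moments match the theoretical ones to about two decimal places.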
Auto-Correlation Function (ACF)
Define ρ_j ≡ γ_j/γ_0, the jth autocorrelation corr(Y_t, Y_{t-j}).
For AR(1), ρ_j = φρ_{j-1} = φ^j, so for AR(1) the ACF and the IRF coincide. In general this is not true.
ACF values lie in [-1, 1].
AR(1): Auto-Correlation Function (ACF)
[Figures: ACF of an AR(1) for 0 < φ < 1 (smooth geometric decay) and for -1 < φ < 0 (oscillating decay).]
Pth-Order SDE: AR(p)
An AR(p) process
Y_t - µ = φ_1(Y_{t-1} - µ) + φ_2(Y_{t-2} - µ) + ... + φ_p(Y_{t-p} - µ) + v_t
can be rewritten as a first-order system:

[ Y_t - µ       ]   [ φ_1 φ_2 ... φ_{p-1} φ_p ] [ Y_{t-1} - µ ]   [ v_t ]
[ Y_{t-1} - µ   ]   [ 1   0   ... 0       0   ] [ Y_{t-2} - µ ]   [ 0   ]
[ Y_{t-2} - µ   ] = [ 0   1   ... 0       0   ] [ Y_{t-3} - µ ] + [ 0   ]
[ ...           ]   [ ... ... ... ...     ... ] [ ...         ]   [ ... ]
[ Y_{t-p+1} - µ ]   [ 0   0   ... 1       0   ] [ Y_{t-p} - µ ]   [ 0   ]

In state-space (companion form) notation, β_t = Fβ_{t-1} + ε_t, and we are back in a first-order system.
AR(p): Stability
From the state-space form,
β_{t+j} = F^{j+1} β_{t-1} + F^j ε_t + ... + Fε_{t+j-1} + ε_{t+j}.
AR(p) is stable and stationary if lim_{j→∞} F^j = 0, i.e. when the eigenvalues of F are inside the unit circle (have modulus < 1). Shocks die out.
Eigenvalues
Consider the equation Fx = λx, where x is an eigenvector and λ a corresponding eigenvalue. To compute the eigenvalues, write it as
(F - λI)x = 0.
If x is a non-zero vector, then F - λI is singular, i.e. det(F - λI) = 0.
Example
Consider the AR(2)
Y_t - µ = φ_1(Y_{t-1} - µ) + φ_2(Y_{t-2} - µ) + v_t.
In matrix notation, β_t = Fβ_{t-1} + ε_t, where
β_t = [Y_t - µ; Y_{t-1} - µ], F = [φ_1 φ_2; 1 0].
The eigenvalues λ of the matrix F solve
det(F - λI) = λ² - φ_1 λ - φ_2 = 0,
λ_{1,2} = (φ_1 ± √(φ_1² + 4φ_2))/2.
If |λ_i| < 1 for i = 1, 2, the AR(2) is stable.
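The same calculation can be done numerically on the companion matrix; a sketch with illustrative coefficients φ_1 = 0.6, φ_2 = 0.2:

```python
import numpy as np

phi1, phi2 = 0.6, 0.2                  # illustrative AR(2) coefficients
F = np.array([[phi1, phi2],
              [1.0,  0.0]])            # companion matrix

lam = np.linalg.eigvals(F)             # numerical eigenvalues

# Quadratic formula from det(F - lam*I) = lam^2 - phi1*lam - phi2 = 0
disc = np.sqrt(phi1**2 + 4.0 * phi2)
lam_formula = np.array([(phi1 + disc) / 2.0, (phi1 - disc) / 2.0])

stable = bool(np.all(np.abs(lam) < 1.0))   # all eigenvalues inside unit circle?
```

For these coefficients both eigenvalues have modulus below one, so the process is stable.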
AR(p): Stability
For the pth-order SDE, solve
λ^p - φ_1 λ^{p-1} - ... - φ_{p-1} λ - φ_p = 0.
In general, the solution involves complex and real roots. The AR(p) system is stable if all eigenvalues are inside the unit circle. Note: complex eigenvalues imply periodic behavior.
[Figure: simulated AR(2) path, φ_1 = +0.4.]
Characteristic Equation
Recall that we can write the AR(p) process
Y_t - µ = φ_1(Y_{t-1} - µ) + φ_2(Y_{t-2} - µ) + ... + φ_p(Y_{t-p} - µ) + ε_t
in lag-polynomial form φ(L)(Y_t - µ) = ε_t, where φ(L) = 1 - φ_1 L - φ_2 L² - ... - φ_p L^p.
Stability can then be studied through the characteristic equation
φ(z) = 1 - φ_1 z - φ_2 z² - ... - φ_p z^p = 0.
Roots of the Characteristic Equation
The roots z of the characteristic equation, φ(z) = 0, are the inverses of the eigenvalues λ of the companion matrix F, z = 1/λ:
1 - φ_1 z - φ_2 z² - ... - φ_p z^p = (1 - λ_1 z)(1 - λ_2 z)···(1 - λ_p z)
= z^p (1/z - λ_1)(1/z - λ_2)···(1/z - λ_p).
If all roots satisfy |z| > 1, the stochastic difference equation is stable and stationary.
Stability
φ(L) can be decomposed as
1 - φ_1 L - φ_2 L² - ... - φ_p L^p = (1 - λ_1 L)(1 - λ_2 L)···(1 - λ_p L),
factoring the polynomial into AR(1) elements. If |λ_i| < 1 for all i, then each (1 - λ_i L)^{-1} exists, and so φ(L)^{-1} exists: ψ(L) = φ(L)^{-1} for AR(p).
Yule-Walker Equations
The variance, covariances, autocorrelations, and dynamic multipliers all satisfy the same pth-order recursion:
Var(Y_t) = γ_0 = φ_1 γ_1 + φ_2 γ_2 + ... + φ_p γ_p + σ²,
Cov(Y_t, Y_{t-j}) = γ_j = φ_1 γ_{j-1} + φ_2 γ_{j-2} + ... + φ_p γ_{j-p},
corr(Y_t, Y_{t-j}) = ρ_j = φ_1 ρ_{j-1} + φ_2 ρ_{j-2} + ... + φ_p ρ_{j-p},
ψ_j = φ_1 ψ_{j-1} + φ_2 ψ_{j-2} + ... + φ_p ψ_{j-p}.
Yule-Walker Equations: AR(2) Example
AR(2) example: φ(L)^{-1} = ψ(L), so φ(L)ψ(L) = 1:
(1 - φ_1 L - φ_2 L²)(1 + ψ_1 L + ψ_2 L² + ...) = 1
1 + (ψ_1 - φ_1)L + (ψ_2 - φ_1 ψ_1 - φ_2)L² + ... = 1.
Matching coefficients:
ψ_1 = φ_1,
ψ_2 = φ_1 ψ_1 + φ_2,
...
ψ_j = φ_1 ψ_{j-1} + φ_2 ψ_{j-2}.
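The recursion can be verified by multiplying the truncated polynomials back together; a sketch with illustrative φ_1 = 0.6, φ_2 = 0.2:

```python
import numpy as np

phi1, phi2 = 0.6, 0.2                  # illustrative AR(2) coefficients

# Build psi_j by the recursion psi_0 = 1, psi_1 = phi1,
# psi_j = phi1*psi_{j-1} + phi2*psi_{j-2}
psi = [1.0, phi1]
for j in range(2, 20):
    psi.append(phi1 * psi[j - 1] + phi2 * psi[j - 2])

# Check: phi(L) * psi(L) should equal 1 up to the truncation order.
# Coefficient lists are in ascending powers of L.
product = np.convolve([1.0, -phi1, -phi2], psi)
```

All product coefficients beyond the constant term vanish (up to the truncation order), confirming ψ(L) = φ(L)^{-1}.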
Example: AR(2) Moments
AR(2) process: Y_t = c + φ_1 Y_{t-1} + φ_2 Y_{t-2} + ε_t.
Mean: µ = c/(1 - φ_1 - φ_2).
Covariance:
γ_j = E[(Y_t - µ)(Y_{t-j} - µ)] = E[(φ_1(Y_{t-1} - µ) + φ_2(Y_{t-2} - µ) + ε_t)(Y_{t-j} - µ)] = φ_1 γ_{j-1} + φ_2 γ_{j-2},
γ_1 = φ_1 γ_0 + φ_2 γ_1 = γ_0 φ_1/(1 - φ_2),
γ_2 = φ_1 γ_1 + φ_2 γ_0 = γ_0 (φ_1²/(1 - φ_2) + φ_2).
Example: AR(2) Moments
Variance:
γ_0 = E[(Y_t - µ)(φ_1(Y_{t-1} - µ) + φ_2(Y_{t-2} - µ) + ε_t)] = φ_1 γ_1 + φ_2 γ_2 + σ²
= [φ_1²/(1 - φ_2) + φ_1² φ_2/(1 - φ_2) + φ_2²] γ_0 + σ²,
γ_0 = (1 - φ_2)σ² / {(1 + φ_2)[(1 - φ_2)² - φ_1²]}.
Autocorrelation:
ρ_j = φ_1 ρ_{j-1} + φ_2 ρ_{j-2},
ρ_1 = φ_1 + φ_2 ρ_1, so ρ_1 = φ_1/(1 - φ_2),
ρ_2 = φ_1 ρ_1 + φ_2.
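As a cross-check, the closed form for γ_0 can be compared with a direct linear solve of the Yule-Walker system (illustrative φ_1 = 0.6, φ_2 = 0.2, σ² = 1):

```python
import numpy as np

phi1, phi2, sigma2 = 0.6, 0.2, 1.0     # illustrative parameters

# Yule-Walker system in (gamma_0, gamma_1, gamma_2):
#   gamma_0 = phi1*gamma_1 + phi2*gamma_2 + sigma2
#   gamma_1 = phi1*gamma_0 + phi2*gamma_1
#   gamma_2 = phi1*gamma_1 + phi2*gamma_0
A = np.array([[ 1.0,  -phi1,       -phi2],
              [-phi1,  1.0 - phi2,  0.0 ],
              [-phi2, -phi1,        1.0 ]])
b = np.array([sigma2, 0.0, 0.0])
gamma = np.linalg.solve(A, b)          # (gamma_0, gamma_1, gamma_2)

gamma0_formula = (1 - phi2) * sigma2 / ((1 + phi2) * ((1 - phi2)**2 - phi1**2))
```

The linear solve and the closed form agree to machine precision.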
MOVING AVERAGE (MA) PROCESSES
Moving Average (MA) Processes
Recall the Wold form: Y_t = µ + Σ_{j=0}^∞ ψ_j ε_{t-j}, ε_t ~ WN.
An MA(q) process is a truncated Wold form:
Y_t = µ + ε_t + θ_1 ε_{t-1} + θ_2 ε_{t-2} + ... + θ_q ε_{t-q},
Y_t = µ + θ(L)ε_t, θ(L) = 1 + θ_1 L + θ_2 L² + ... + θ_q L^q.
It is called a moving average because Y_t is constructed from a weighted sum of the q most recent values of ε.
MA(1) Example
MA(1): Y_t = µ + ε_t + θε_{t-1} = µ + (1 + θL)ε_t.
Mean: E[Y_t] = µ.
Variance:
Var(Y_t) = γ_0 = E[(Y_t - µ)²] = E[(ε_t + θε_{t-1})²] = E[ε_t² + 2θε_t ε_{t-1} + θ²ε_{t-1}²] = σ² + θ²σ² = (1 + θ²)σ².
MA(1) Example
Covariance:
Cov(Y_t, Y_{t-1}) = γ_1 = E[(Y_t - µ)(Y_{t-1} - µ)] = E[(ε_t + θε_{t-1})(ε_{t-1} + θε_{t-2})] = θσ²,
Cov(Y_t, Y_{t-j}) = γ_j = 0 for j > 1.
Autocorrelation function:
ρ_1 = γ_1/γ_0 = θσ²/((1 + θ²)σ²) = θ/(1 + θ²),
ρ_j = 0 for j > 1.
Invertibility of MA Processes
Consider the MA(1) Y_t = µ + (1 + θL)ε_t. The values θ and 1/θ give the same ACF:
ρ_1 = γ_1/γ_0 = θ/(1 + θ²); replacing θ with 1/θ gives (1/θ)/(1 + 1/θ²) = θ/(θ² + 1).
We normalize θ by using the invertible MA representation.
Example: both Y_t = ε_t + 0.5ε_{t-1} and Y_t = ε_t + 2ε_{t-1} have the same autocorrelation function:
ρ_1 = 0.5/(1 + 0.5²) = 2/(1 + 2²) = 0.4.
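A quick numerical check of the two parameterizations (θ = 0.5 vs. θ = 2):

```python
def ma1_rho1(theta):
    """First autocorrelation of an MA(1): rho_1 = theta / (1 + theta^2)."""
    return theta / (1.0 + theta**2)

rho_a = ma1_rho1(0.5)   # invertible parameterization
rho_b = ma1_rho1(2.0)   # non-invertible parameterization with the same ACF
```

Both return 0.4, illustrating that the ACF alone cannot distinguish the two MA(1) representations.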
Invertibility of MA Processes
For an MA(1) with |θ| < 1, there exists an infinite-order AR representation:
Y_t - µ = (1 + θL)ε_t, |θ| < 1,
which can be rewritten as
(1 + θL)^{-1}(Y_t - µ) = ε_t,
(1 - θL + θ²L² - θ³L³ + ...)(Y_t - µ) = ε_t,
Y_t - µ = θ(Y_{t-1} - µ) - θ²(Y_{t-2} - µ) + θ³(Y_{t-3} - µ) - ... + ε_t,
i.e. an AR(∞) with φ(L) = 1 - θL + θ²L² - θ³L³ + ....
Note that |θ| < 1 is not a stability requirement: an MA system is always stable. It ensures invertibility and the AR(∞) representation.
Partial Autocorrelation Function (PACF)
Definition: the kth-order partial autocorrelation is the (population) regression coefficient φ_kk in the kth-order autoregression
Y_t = c + φ_k1 Y_{t-1} + φ_k2 Y_{t-2} + ... + φ_kk Y_{t-k} + ε_t.
It measures how important the last lag is.
MODEL SELECTION
Box-Jenkins Approach
Matching the model with actual data:
1. Transform the data to appear covariance stationary.
We may have data that is not covariance stationary, e.g. GDP. The Box-Jenkins view: maybe GDP itself is not covariance-stationary, but some transformation of it is, and we can forecast from the transformed series. Typical transformations:
- take (natural) logs
- take differences
- detrend
GDP
[Figure: GDP in levels, trending upward.]
We can't use an ARMA model here: the graph shows the series is not covariance stationary. It is also not linear.
Log of GDP
Take the natural logarithm.
[Figure: ln(GDP), roughly linear trend.]
Difference in Log of GDP = Growth Rate
If we take the first difference of the log series, we get the growth rate of the level series.
[Figure: Δln(GDP), fluctuating around a small positive mean.]
It looks much more like an AR or MA process. But just looking at the graph may not be enough.
Simulated AR and MA Processes
[Figures: simulated sample paths of AR(1), AR(2), MA(1), and MA(2) processes, for comparison with the GDP growth series. The AR(1) paths are not that bad a match.]
Box-Jenkins Approach
Matching the model with actual data:
1. Transform the data to appear covariance stationary (take logs, differences, detrend).
2. Examine the sample ACF and PACF.
We know that there is a one-to-one mapping between the ACF and the process.
Theoretical ACFs
[Figures: theoretical ACFs of AR(1), AR(2), MA(1), and MA(2) processes.]
Sample ACF
ȳ = (1/T) Σ_{t=1}^T y_t,
γ̂_j = (1/T) Σ_{t=j+1}^T (y_t - ȳ)(y_{t-j} - ȳ), the sample autocovariance estimate,
ρ̂_j = γ̂_j/γ̂_0, the sample autocorrelation.
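These estimators take only a few lines to implement; a sketch following the 1/T convention above:

```python
import numpy as np

def sample_acf(y, nlags):
    """Sample autocorrelations rho_hat_j = gamma_hat_j / gamma_hat_0, j = 0..nlags."""
    y = np.asarray(y, dtype=float)
    T = len(y)
    d = y - y.mean()
    # gamma_hat_j = (1/T) * sum_{t=j+1}^{T} (y_t - ybar)(y_{t-j} - ybar)
    gamma = np.array([(d[j:] * d[:T - j]).sum() / T for j in range(nlags + 1)])
    return gamma / gamma[0]
```

For a white-noise sample the estimated autocorrelations at j ≥ 1 should fall inside the ±1.96/√T band.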
99 Sample PACF
Use OLS: y_t = c + φ_{1k} y_{t−1} + … + φ_{kk} y_{t−k} + ε_t, for k = 1, 2, …; the lag-k sample partial autocorrelation is φ̂_kk.
If y_t ~ iid(μ, σ²), then ρ_j = 0 for all j ≠ 0 (no correlation across observations), and
v̂ar(ρ̂_j) ≈ 1/T and v̂ar(φ̂_kk) ≈ 1/T.
These values are used when reporting the bounds in ACF and PACF plots; the 5%–95% bands are ±1.96 √v̂ar(ρ̂_j) or ±1.96 √v̂ar(φ̂_kk).
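The OLS definition of the PACF translates directly into code (my own sketch, assuming the regression-based definition above; the function name is an arbitrary choice):

```python
import numpy as np

def sample_pacf(y, nlags):
    """Sample PACF via OLS: for each k, regress y_t on
    (1, y_{t-1}, ..., y_{t-k}); the last slope is phi_hat_kk."""
    y = np.asarray(y, dtype=float)
    T = len(y)
    pacf = []
    for k in range(1, nlags + 1):
        # rows are t = k..T-1; column j holds the lag-j values y_{t-j}
        X = np.column_stack(
            [np.ones(T - k)] + [y[k - j:T - j] for j in range(1, k + 1)])
        beta, *_ = np.linalg.lstsq(X, y[k:], rcond=None)
        pacf.append(beta[-1])  # phi_hat_kk
    return np.array(pacf)
```

Under iid data, roughly 95% of these estimates should fall inside ±1.96/√T, matching the bands above.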
100 Model selection
Assume we can reject iid, i.e. the PACF and ACF show some significant lags. How do we determine which model is correct?

Process      ACF                                PACF
AR(p)        exponential or oscillatory decay   φ_kk = 0 for k > p
MA(q)        ρ_k = 0 for k > q                  exponential or oscillatory decay
ARMA(p,q)    decay begins at lag q              decay begins at lag p

But with many candidate models we can't really discriminate between them just by looking.
101 Box-Jenkins
Matching a model to the actual data:
1. Transform the data to appear covariance stationary: take (natural) logs, differences, detrend.
2. Examine the sample ACF and PACF.
3. Estimate ARMA models.
4. Perform diagnostic analysis to confirm that the model is consistent with the data.
102 Box-Ljung Statistics
Use the modified Box-Ljung statistic (Q-stat) to test the joint statistical significance of ρ̂_1, …, ρ̂_k:
Q*(k) = T(T + 2) Σ_{i=1}^k ρ̂_i² / (T − i).
Under H₀: Y_t ~ iid(μ, σ²), Q*(k) →A χ²(k).
For significance level α, the critical region for rejecting the hypothesis of randomness (i.e., ρ_1 = … = ρ_k = 0) is Q*(k) > χ²_{1−α}(k).
In smaller samples the empirical distribution has fatter tails, and one needs to move the critical point to the right.
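The Q-statistic is straightforward to compute (my own sketch; it leans on SciPy only for the χ² tail probability, and the `n_params` argument anticipates the degrees-of-freedom adjustment used later for ARMA residuals):

```python
import numpy as np
from scipy import stats

def ljung_box(y, k, n_params=0):
    """Box-Ljung Q*(k) = T(T+2) * sum_{i=1}^k rho_hat_i^2 / (T - i),
    compared with chi^2(k - n_params); n_params = p + q for ARMA residuals."""
    y = np.asarray(y, dtype=float)
    T = len(y)
    d = y - y.mean()
    gamma0 = (d ** 2).sum() / T
    rho = np.array([(d[i:] * d[:T - i]).sum() / T / gamma0
                    for i in range(1, k + 1)])
    q = T * (T + 2) * np.sum(rho ** 2 / (T - np.arange(1, k + 1)))
    pval = stats.chi2.sf(q, df=k - n_params)
    return q, pval
```

A small p-value rejects the hypothesis of randomness, ρ_1 = … = ρ_k = 0.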
103 Akaike Information Criterion (AIC)
Akaike Information Criterion (AIC):
AIC(p, q) = ln(σ̂²) + (2/T)(p + q),
where the first term captures the fit of the model and the second is a penalty for the number of parameters.
104 Schwarz (Bayesian) Information Criterion
Schwarz (Bayesian) Information Criterion (BIC):
BIC(p, q) = ln(σ̂²) + (ln(T)/T) 2(p + q),
where the penalty is now related to the sample size.
105 Selection: AIC vs BIC
Calculate the selection criterion for a set of estimated models and pick the one that minimizes the AIC or BIC.
Result: if the p, q considered are larger than the true orders,
(i) BIC picks the true ARMA(p,q) model as T → ∞;
(ii) AIC picks an overparameterized model as T → ∞.
In smaller samples the BIC choice may be a problem. There is a trade-off between efficiency and consistency: it is better to overparameterize than to obtain an incorrect result.
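A minimal order-selection loop, restricted to pure AR models so that plain OLS suffices (my own sketch, using the slides' penalty forms; the function name and the use of the p-dependent effective sample size are my choices):

```python
import numpy as np

def ar_order_select(y, pmax):
    """Fit AR(p) by OLS for p = 1..pmax and evaluate the slides' criteria:
    AIC(p) = ln(sigma2_hat) + 2p/T,  BIC(p) = ln(sigma2_hat) + ln(T)*2p/T."""
    y = np.asarray(y, dtype=float)
    out = {}
    for p in range(1, pmax + 1):
        T = len(y) - p                      # effective sample size
        X = np.column_stack([np.ones(T)] +
                            [y[p - j:p - j + T] for j in range(1, p + 1)])
        beta, *_ = np.linalg.lstsq(X, y[p:], rcond=None)
        resid = y[p:] - X @ beta
        s2 = (resid ** 2).mean()            # MLE-style variance estimate
        out[p] = (np.log(s2) + 2 * p / T,
                  np.log(s2) + np.log(T) * 2 * p / T)
    return out
```

Picking the p that minimizes the second entry implements BIC selection; for T > e the BIC penalty exceeds the AIC penalty, which is why BIC chooses (weakly) smaller models.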
106 Residual Diagnostics
Use the sample ACF and PACF of the residuals.
Compute the Box-Ljung statistic for the residuals:
Q*(k) →A χ²(k − (p + q)),
where (p + q) is a degrees-of-freedom adjustment.
Use the LM test (it has higher power): T·R² ~ χ²(k), with R² computed from the regression
ε̂_t = c + α₁ ε̂_{t−1} + … + α_k ε̂_{t−k} + v_t.
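The T·R² statistic can be sketched as follows (my own illustration; I implement the serial-correlation version, regressing residuals on their own lags — the ARCH variant would use squared residuals in the same regression):

```python
import numpy as np

def lm_test(resid, k):
    """LM statistic T*R^2 from regressing eps_hat_t on a constant
    and its first k lags; compare with chi^2(k)."""
    e = np.asarray(resid, dtype=float)
    T = len(e) - k                          # usable observations
    X = np.column_stack([np.ones(T)] +
                        [e[k - j:k - j + T] for j in range(1, k + 1)])
    yv = e[k:]
    beta, *_ = np.linalg.lstsq(X, yv, rcond=None)
    fit = X @ beta
    r2 = 1 - ((yv - fit) ** 2).sum() / ((yv - yv.mean()) ** 2).sum()
    return T * r2
```

Perfectly autocorrelated residuals drive R² toward 1 and the statistic toward T, far into the χ²(k) rejection region.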
107 ESTIMATION
108 ARMA
Wold representation: Y_t = κ_t + Σ_{j=0}^∞ ψ_j ε_{t−j}, ε_t ~ WN(0, σ²), Σ_{j=0}^∞ ψ_j² < ∞.
AR(p): Y_t − μ = φ₁(Y_{t−1} − μ) + φ₂(Y_{t−2} − μ) + … + φ_p(Y_{t−p} − μ) + ε_t, ε_t ~ WN.
MA(q): Y_t − μ = ε_t + θ₁ε_{t−1} + θ₂ε_{t−2} + … + θ_q ε_{t−q}, ε_t ~ WN.
ARMA(p,q): Y_t − μ = φ₁(Y_{t−1} − μ) + … + φ_p(Y_{t−p} − μ) + ε_t + θ₁ε_{t−1} + … + θ_q ε_{t−q}.
109 Estimation
OLS
MLE
AR(1)
MA(1)
110 OLS
For AR(p), OLS is equivalent to conditional MLE.
Model: y_t = c + φ₁y_{t−1} + … + φ_p y_{t−p} + ε_t = x_t′β + ε_t, ε_t ~ WN(0, σ²),
with β = (c, φ₁, φ₂, …, φ_p)′ and x_t = (1, y_{t−1}, y_{t−2}, …, y_{t−p})′.
OLS:
β̂ = (Σ_{t=1}^T x_t x_t′)⁻¹ Σ_{t=1}^T x_t y_t,
σ̂² = (1/(T − (p + 1))) Σ_{t=1}^T (y_t − x_t′β̂)².
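A direct implementation of these two formulas (my own sketch, not course code; the function name is an arbitrary choice):

```python
import numpy as np

def ar_ols(y, p):
    """OLS for AR(p): beta_hat = (sum x_t x_t')^{-1} sum x_t y_t,
    sigma2_hat = sum (y_t - x_t' beta_hat)^2 / (T - (p+1))."""
    y = np.asarray(y, dtype=float)
    n = len(y) - p                          # number of usable observations
    X = np.column_stack([np.ones(n)] +
                        [y[p - j:p - j + n] for j in range(1, p + 1)])
    beta = np.linalg.solve(X.T @ X, X.T @ y[p:])   # (X'X)^{-1} X'y
    resid = y[p:] - X @ beta
    sigma2 = (resid ** 2).sum() / (n - (p + 1))    # df-corrected variance
    return beta, sigma2
```

The returned vector is β̂ = (ĉ, φ̂₁, …, φ̂_p)′; dropping the degrees-of-freedom correction gives the conditional-MLE variance estimate instead.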
111–116 Properties
Note: E[β̂] ≠ β, because x_t is random and E(ε | Y) ≠ 0.
But if |z| > 1 for all roots of φ(z) = 0 (equivalently, |λ| < 1 for the eigenvalues of F), then
β̂ →p β,  σ̂² →p σ².
The estimator may be biased but it is consistent: it converges in probability.
Figures: Monte Carlo sampling distributions of the OLS estimate for φ = 0.7 at increasing sample sizes n, with the median of φ̂ reported on each panel.
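The finite-sample behavior illustrated on these slides can be reproduced with a small Monte Carlo (my own sketch; the sample size and replication count are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(0)

def ols_phi(y):
    """Slope from regressing y_t on (1, y_{t-1})."""
    X = np.column_stack([np.ones(len(y) - 1), y[:-1]])
    beta, *_ = np.linalg.lstsq(X, y[1:], rcond=None)
    return beta[1]

# Monte Carlo: AR(1) with phi = 0.7, small sample T = 50
phi, T, reps = 0.7, 50, 2000
est = []
for _ in range(reps):
    y = np.zeros(T)                     # start at zero for simplicity
    for t in range(1, T):
        y[t] = phi * y[t - 1] + rng.standard_normal()
    est.append(ols_phi(y))
print(np.median(est))   # typically below 0.7: downward finite-sample bias
```

The median estimate sits below the true φ, consistent with the downward finite-sample bias discussed on the next slide, yet it approaches 0.7 as T grows, illustrating consistency.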
117 Hypothesis testing
Hypothesis testing:
√T(β̂ − β) →d N(0, σ²V⁻¹), where V = plim (1/T) Σ_{t=1}^T x_t x_t′.
If we have enough data (T → ∞), the Student-t distribution converges to the normal:
t_{β=β₀} = (β̂ − β₀)/ŜE(β̂) ~ t(T − 1) → N(0, 1).
See Hamilton for the downward bias in a finite sample, i.e. E[β̂] < β.
118–119 MLE
If Y_t is iid, the joint likelihood is the product of the marginal likelihoods (pdfs):
L(θ | y₁, …, y_T) = ∏_{t=1}^T L(θ | y_t) = ∏_{t=1}^T f(y_t | θ).
Different parameter values have different probabilities of generating {y₁, …, y_T}.
But if Y_t is not independent, this factorization is invalid. Instead, factor into conditional distributions:
f(Y₁, Y₂ | θ) = f(Y₂ | Y₁, θ) f(Y₁ | θ),
f(Y₁, Y₂, Y₃ | θ) = f(Y₃ | Y₂, Y₁, θ) f(Y₂, Y₁ | θ),
L(θ | y₁, …, y_T) = f(Y₁, …, Y_T | θ) = [∏_{t=2}^T f(Y_t | Y_{t−1}, …, Y₂, Y₁, θ)] f(Y₁ | θ).
120 MLE
Conditional MLE treats Y₁ as fixed (not random); it is just a simplifying assumption:
L_c(θ | y₁, …, y_T) = ∏_{t=2}^T f(Y_t | Y_{t−1}, …, Y₁, θ).
Given normality, the first-order conditions for maximizing L_c are linear in θ. For AR models, φ̂_CMLE ≡ φ̂_OLS.
Conditional MLE is consistent. (It is not as efficient, since we ignore the randomness of Y₁.) Exact MLE requires non-linear optimization.
Often we don't know the distribution of the data; all that OLS assumes is ε ~ WN. Assuming normality in CMLE is not a bad assumption, as we still get a consistent estimator: quasi-MLE (see Davidson and MacKinnon).
We do not get unbiasedness from any version of MLE. What can be done is to compute what the bias is and correct for it.
121 Estimation AR(1)
Recall the AR(1) model: Y_t = c + φY_{t−1} + ε_t, ε_t ~ iid N(0, σ²), |φ| < 1.
If we don't believe in the normality of ε_t, we have quasi-MLE. If we do: MLE.
If ε_t is normal, so is Y_t | Y_{t−1}: Y_t | Y_{t−1} ~ N(c + φY_{t−1}, σ²).
122 Conditional MLE
If we know Y_{t−1}, the only term that is random is ε_t:
f(y_t | y_{t−1}, θ) = (1/√(2πσ²)) exp(−(y_t − c − φy_{t−1})²/(2σ²)),
where θ = (c, φ, σ²).
What about Y₁? Y₁ ~ N(E[Y_t], var(Y_t)) = N(μ, σ²/(1 − φ²)), with μ = c/(1 − φ).
If φ = 1, no unconditional mean or variance exists.
123 Exact MLE
Maximum likelihood:
L(θ | y₁, …, y_T) = f(y₁ | θ) ∏_{t=2}^T f(y_t | y_{t−1}, θ),
i.e.
L(θ | y₁, …, y_T) = (1/√(2π σ²/(1 − φ²))) exp(−(y₁ − c/(1 − φ))² / (2σ²/(1 − φ²))) × ∏_{t=2}^T (1/√(2πσ²)) exp(−(y_t − c − φy_{t−1})²/(2σ²)).
We need to solve a non-linear optimization problem.
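The conditional and exact likelihoods can be maximized numerically; a minimal sketch using SciPy's optimizer (my own illustration, not course code — the starting values and the log-variance parameterization are arbitrary choices):

```python
import numpy as np
from scipy.optimize import minimize

def neg_loglik(params, y, exact=True):
    """AR(1) negative log-likelihood; exact=True adds the stationary
    density of Y_1 ~ N(c/(1-phi), sigma2/(1-phi^2))."""
    c, phi, log_s2 = params
    s2 = np.exp(log_s2)                     # enforce sigma2 > 0
    e = y[1:] - c - phi * y[:-1]
    ll = -0.5 * np.sum(np.log(2 * np.pi * s2) + e ** 2 / s2)
    if exact and abs(phi) < 1:
        mu, v = c / (1 - phi), s2 / (1 - phi ** 2)
        ll += -0.5 * (np.log(2 * np.pi * v) + (y[0] - mu) ** 2 / v)
    return -ll

# Simulated AR(1) with c = 1, phi = 0.7, sigma = 1
rng = np.random.default_rng(1)
y = np.zeros(200)
for t in range(1, 200):
    y[t] = 1.0 + 0.7 * y[t - 1] + rng.standard_normal()

res = minimize(neg_loglik, x0=np.array([0.0, 0.5, 0.0]), args=(y,),
               method="Nelder-Mead")
```

Dropping the `exact` term reproduces conditional MLE, whose first-order conditions coincide with OLS; the two estimates differ only through the treatment of Y₁.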
124 MA(1)
Recall: Y_t = μ + ε_t + θε_{t−1}, ε_t ~ iid N(0, σ²), |θ| < 1.
|θ| < 1 is assumed only for the invertible representation.
Figure: bimodal likelihood function for an MA process.
125 Estimation MA(1)
Y_t | ε_{t−1} ~ N(μ + θε_{t−1}, σ²),
f(y_t | ε_{t−1}, θ) = (1/√(2πσ²)) exp(−(y_t − μ − θε_{t−1})²/(2σ²)), θ = (μ, θ, σ²).
Problem: without knowing ε_{t−2} we don't observe ε_{t−1}. We need ε_{t−2} to get ε_{t−1} = y_{t−1} − μ − θε_{t−2}, but ε_{t−2} is unobservable.
Assume ε₀ = 0: make it non-random, just fix it at the number 0. The trick works with any number.
126 Estimation MA(1)
Y₁ | (ε₀ = 0) ~ N(μ, σ²):
Y₁ = μ + ε₁ ⟹ ε₁ = Y₁ − μ,
Y₂ = μ + ε₂ + θε₁ ⟹ ε₂ = Y₂ − μ − θ(Y₁ − μ),
⋮
ε_t = Y_t − μ − θ(Y_{t−1} − μ) + … + (−1)^{t−1} θ^{t−1} (Y₁ − μ).
Conditional likelihood (given ε₀ = 0):
L(θ | y₁, …, y_T, ε₀ = 0) = ∏_{t=1}^T (1/√(2πσ²)) exp(−ε_t²/(2σ²)).
If |θ| < 1 (much less than 1), ε₀ doesn't matter and CMLE is consistent. Exact MLE requires the Kalman filter.
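The recursion and the conditional likelihood can be sketched as follows (my own illustration, assuming ε₀ = 0 as on the slide; function names are arbitrary choices):

```python
import numpy as np

def ma1_resid(y, mu, theta):
    """Recover shocks recursively with eps_0 = 0:
    eps_t = y_t - mu - theta * eps_{t-1}."""
    eps = np.zeros(len(y))
    prev = 0.0
    for t, yt in enumerate(y):
        eps[t] = yt - mu - theta * prev
        prev = eps[t]
    return eps

def ma1_cloglik(y, mu, theta, s2):
    """Conditional log-likelihood of MA(1) given eps_0 = 0."""
    e = ma1_resid(np.asarray(y, dtype=float), mu, theta)
    return -0.5 * np.sum(np.log(2 * np.pi * s2) + e ** 2 / s2)
```

Maximizing `ma1_cloglik` over (μ, θ, σ²) gives the CMLE; because past shocks enter with weight θ^{t−1}, the fixed ε₀ is forgotten quickly when |θ| is well below one.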
More informationLecture 4a: ARMA Model
Lecture 4a: ARMA Model 1 2 Big Picture Most often our goal is to find a statistical model to describe real time series (estimation), and then predict the future (forecasting) One particularly popular model
More informationChapter 3 - Temporal processes
STK4150 - Intro 1 Chapter 3 - Temporal processes Odd Kolbjørnsen and Geir Storvik January 23 2017 STK4150 - Intro 2 Temporal processes Data collected over time Past, present, future, change Temporal aspect
More informationChapter 12: An introduction to Time Series Analysis. Chapter 12: An introduction to Time Series Analysis
Chapter 12: An introduction to Time Series Analysis Introduction In this chapter, we will discuss forecasting with single-series (univariate) Box-Jenkins models. The common name of the models is Auto-Regressive
More informationEstimating Moving Average Processes with an improved version of Durbin s Method
Estimating Moving Average Processes with an improved version of Durbin s Method Maximilian Ludwig this version: June 7, 4, initial version: April, 3 arxiv:347956v [statme] 6 Jun 4 Abstract This paper provides
More informationDefine y t+h t as the forecast of y t+h based on I t known parameters. The forecast error is. Forecasting
Forecasting Let {y t } be a covariance stationary are ergodic process, eg an ARMA(p, q) process with Wold representation y t = X μ + ψ j ε t j, ε t ~WN(0,σ 2 ) j=0 = μ + ε t + ψ 1 ε t 1 + ψ 2 ε t 2 + Let
More informationVolatility. Gerald P. Dwyer. February Clemson University
Volatility Gerald P. Dwyer Clemson University February 2016 Outline 1 Volatility Characteristics of Time Series Heteroskedasticity Simpler Estimation Strategies Exponentially Weighted Moving Average Use
More informationTime Series (Part II)
Time Series (Part II) Luca Gambetti UAB IDEA Winter 2011 1 Contacts Prof.: Luca Gambetti Office: B3-1130 Edifici B Office hours: email: luca.gambetti@uab.es webpage: http://pareto.uab.es/lgambetti/ 2 Goal
More information{ } Stochastic processes. Models for time series. Specification of a process. Specification of a process. , X t3. ,...X tn }
Stochastic processes Time series are an example of a stochastic or random process Models for time series A stochastic process is 'a statistical phenomenon that evolves in time according to probabilistic
More informationA time series is called strictly stationary if the joint distribution of every collection (Y t
5 Time series A time series is a set of observations recorded over time. You can think for example at the GDP of a country over the years (or quarters) or the hourly measurements of temperature over a
More informationLecture note 2 considered the statistical analysis of regression models for time
DYNAMIC MODELS FOR STATIONARY TIME SERIES Econometrics 2 LectureNote4 Heino Bohn Nielsen March 2, 2007 Lecture note 2 considered the statistical analysis of regression models for time series data, and
More informationEcon 423 Lecture Notes: Additional Topics in Time Series 1
Econ 423 Lecture Notes: Additional Topics in Time Series 1 John C. Chao April 25, 2017 1 These notes are based in large part on Chapter 16 of Stock and Watson (2011). They are for instructional purposes
More informationTime Series 2. Robert Almgren. Sept. 21, 2009
Time Series 2 Robert Almgren Sept. 21, 2009 This week we will talk about linear time series models: AR, MA, ARMA, ARIMA, etc. First we will talk about theory and after we will talk about fitting the models
More informationNonlinear time series
Based on the book by Fan/Yao: Nonlinear Time Series Robert M. Kunst robert.kunst@univie.ac.at University of Vienna and Institute for Advanced Studies Vienna October 27, 2009 Outline Characteristics of
More informationE 4101/5101 Lecture 9: Non-stationarity
E 4101/5101 Lecture 9: Non-stationarity Ragnar Nymoen 30 March 2011 Introduction I Main references: Hamilton Ch 15,16 and 17. Davidson and MacKinnon Ch 14.3 and 14.4 Also read Ch 2.4 and Ch 2.5 in Davidson
More informationTIME SERIES ANALYSIS. Forecasting and Control. Wiley. Fifth Edition GWILYM M. JENKINS GEORGE E. P. BOX GREGORY C. REINSEL GRETA M.
TIME SERIES ANALYSIS Forecasting and Control Fifth Edition GEORGE E. P. BOX GWILYM M. JENKINS GREGORY C. REINSEL GRETA M. LJUNG Wiley CONTENTS PREFACE TO THE FIFTH EDITION PREFACE TO THE FOURTH EDITION
More informationUnivariate linear models
Univariate linear models The specification process of an univariate ARIMA model is based on the theoretical properties of the different processes and it is also important the observation and interpretation
More information