EC402: Serial Correlation. Danny Quah Economics Department, LSE Lent 2015
2 OUTLINE 1. Stationarity 1.1 Covariance stationarity 1.2 Explicit Models. Special cases: ARMA processes 2. Some complex numbers. Spectral density 3. Wold Representation. Innovations 4. Central Limit Theory and Laws of Large Numbers 5. Examples 6. Conclusion
3 1. Stationarity Outline
4 1. Stationarity Four economies Long run growth
5 1. Stationarity log US GDP Long run growth
6 1. Stationarity US GDP Long run growth
7 1. Stationarity G7, EMDE Convergence
8 1. Stationarity G7, EMDE Not decoupling
9 1. Stationarity Trend and cycle? Annual growth rates. Subsample: cross-correlation between G7 and EMDE. Table: The cross-correlation between G7 and EMDE growth rates has increased four-fold (data from IMF WEO Oct 2014). Y_t = X_t + Z_t.
10 1. Stationarity G7, EMDE Long-term decoupling?
11 1. Stationarity Outline
12 1. Stationarity Covariance stationarity Say a process W = { W_t : t = ..., -1, 0, 1, ... } is covariance stationary (or second-order stationary) when 1. E W_t = E W_0 for all t; 2. E[W_t W_{t-s}] for any s is finite and independent of t. Without loss of generality take E W_0 = 0. Therefore:
G_W(s) := E[W_t W_{t-s}] = E[W_{t+s} W_t] = E[W_t W_{t+s}] = G_W(-s).
13 1. Stationarity Long Run Economic Growth
14 1. Stationarity US Economic Growth: Early
15 1. Stationarity US Economic Growth: Later
16 1. Stationarity 1. Call G_W the covariogram: symmetric about 0, i.e., G_W(j) = G_W(-j) for all j. 2. The vector (W_t, W_{t-1}, W_{t-2}, ..., W_{t-n}) has VCV
[ G_W(0)    G_W(1)    ...  G_W(n-1)  G_W(n)
  G_W(1)    G_W(0)    ...  G_W(n-2)  G_W(n-1)
  ...
  G_W(n)    G_W(n-1)  ...  G_W(1)    G_W(0) ]
The matrix just formed must be positive semidefinite (since it is a VCV); we then say the sequence G_W itself is positive semidefinite.
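This positive-semidefiniteness can be checked mechanically: build the Toeplitz VCV from a covariogram and inspect its eigenvalues. A minimal sketch in Python, using the AR(1) covariogram G(j) = ν² ρ^|j| / (1 - ρ²) as the example sequence (the values ρ = 0.8, ν² = 1 are purely illustrative):

```python
import numpy as np

def ar1_covariogram(rho, nu2, n):
    # G(j) = nu2 * rho^|j| / (1 - rho^2) for an AR(1) with |rho| < 1
    return nu2 * rho ** np.arange(n + 1) / (1.0 - rho ** 2)

def vcv_from_covariogram(G):
    # Toeplitz matrix with (i, k) entry G(|i - k|)
    n = len(G)
    idx = np.abs(np.subtract.outer(np.arange(n), np.arange(n)))
    return G[idx]

G = ar1_covariogram(rho=0.8, nu2=1.0, n=10)
V = vcv_from_covariogram(G)

# A VCV matrix must be symmetric and positive semidefinite
print(np.allclose(V, V.T), np.linalg.eigvalsh(V).min() >= -1e-10)
```

The same construction works for any candidate covariogram; a negative eigenvalue would show the sequence cannot be the covariogram of any covariance stationary process.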
17 1. Stationarity 3. Often we take the process as given and just write G rather than G_W. Often we find it convenient to write the lag as a subscript rather than as an argument in parentheses. Thus the covariogram G = { G_j : j = ..., -2, -1, 0, 1, 2, ... } is a doubly-infinite positive semidefinite sequence of numbers, symmetric about 0.
18 1. Stationarity Covariogram US Economic Growth: Early
19 1. Stationarity Covariogram US Economic Growth: Later
20 1. Stationarity Say the process W = { W_t : t = ..., -1, 0, 1, ... } is stationary (or strictly stationary) when for all n and all t_1, t_2, ..., t_n the joint distributions F(W_{t_1+s}, W_{t_2+s}, ..., W_{t_n+s}) are independent of s. Neither covariance stationarity nor strict stationarity is implied by the other.
21 1. Stationarity 1. Suppose W_t is iid Cauchy (infinite variance): strictly stationary but not covariance stationary. 2. Covariance stationarity does not specify moments higher than second order, and thus does not specify the entire distribution. 3. If W is strictly stationary and has finite second-order moments, then W is also covariance stationary. 4. If W is covariance stationary and normally distributed, then W is also strictly stationary.
22 1. Stationarity Outline
23 1. Stationarity Special cases I 1. [White noise] Process W = { W_t : t = ..., -1, 0, 1, ... } such that E[W_t] = 0 and
G_W(s) = E[W_t W_{t-s}] = ν² for s = 0; 0 otherwise.
Call W w.n.(0, ν²).
24 1. Stationarity Special cases II 2. [Autoregressive (AR) process] Suppose ɛ = { ɛ_t : t = ..., -1, 0, 1, ... } is w.n.(0, ν²) and W_t = ρ W_{t-1} + ɛ_t. Such a W is said to be a first-order autoregressive process, a first-order autoregression, or an AR(1).
25 1. Stationarity Special cases III 3. Iterate the AR(1) to see:
W_t = ɛ_t + ρ W_{t-1}
    = ɛ_t + ρ [ɛ_{t-1} + ρ W_{t-2}]
    = ɛ_t + ρ ɛ_{t-1} + ρ² W_{t-2}
    = ɛ_t + ρ ɛ_{t-1} + ρ² ɛ_{t-2} + ρ³ W_{t-3}
    = Σ_{s=0}^{t-1} ρ^s ɛ_{t-s} + ρ^t W_0.
When |ρ| < 1, consider the limit
W_t = Σ_{s=0}^∞ ρ^s ɛ_{t-s} + lim_{s→∞} ρ^s W_{t-s}
⟹ Var(W_t) = ν² Σ_{s=0}^∞ ρ^{2s} + 0 = (1 - ρ²)^{-1} ν².
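The limit variance (1 - ρ²)^{-1} ν² is easy to confirm by simulation. A sketch, with ρ = 0.5 and ν = 1 chosen purely for illustration, and a burn-in so the start-up transient ρ^t W_0 has died out:

```python
import numpy as np

rng = np.random.default_rng(0)
rho, nu = 0.5, 1.0
T, burn = 200_000, 1_000

# Simulate W_t = rho * W_{t-1} + eps_t
eps = rng.normal(0.0, nu, T + burn)
W = np.zeros(T + burn)
for t in range(1, T + burn):
    W[t] = rho * W[t - 1] + eps[t]
W = W[burn:]

theory = nu ** 2 / (1.0 - rho ** 2)   # = 4/3 for these parameters
print(W.var(), theory)                # the two should be close
```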
26 1. Stationarity Special cases IV Call
W_t = Σ_{s=0}^∞ ρ^s ɛ_{t-s} = Σ_{s=0}^∞ γ_s ɛ_{t-s}
the moving average representation (MAR) in current and lagged ɛ. The MAR gives W in terms of the w.n. ɛ, with E[W_{t-k} ɛ_t] = 0 for k = 1, 2, ....
27 1. Stationarity Special cases V 4. An AR(1) with |ρ| < 1 is covariance stationary and satisfies both CLT and LLN.
E[W_t W_{t-1}] = E[(Σ_{s=0}^∞ ρ^s ɛ_{t-s})(Σ_{s=0}^∞ ρ^s ɛ_{t-1-s})]
= E[(ɛ_t + ρ ɛ_{t-1} + ρ² ɛ_{t-2} + ...)(ɛ_{t-1} + ρ ɛ_{t-2} + ...)]
= (ρ + ρ³ + ρ⁵ + ...) ν²
= (1 + ρ² + ρ⁴ + ...) ρ ν²
= (1 - ρ²)^{-1} ρ ν² = ρ E[W_t²].
28 1. Stationarity Special cases VI 5. Repeating this calculation gives
E[W_t W_{t-s}] = ρ^s E[W_t²] = (1 - ρ²)^{-1} ρ^s ν²,
independent of t. 6. If, however, the AR(1) has ρ = 1, then
W_t = [Σ_{s=0}^{t-1} ɛ_{t-s}] + 1^t W_0,
where the bracketed running sum does not converge. This is the so-called unit root situation; W is not covariance stationary. 7. AR(1) with |ρ| > 1?
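The unit-root case can be seen concretely: with ρ = 1, W_t = Σ_{s=1}^t ɛ_s has Var(W_t) = t · Var(ɛ), growing linearly in t rather than settling at a constant, so covariance stationarity fails. A simulation sketch (many independent paths, with the variance taken across paths at two dates; the sample sizes are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
n_paths, T = 20_000, 400

# Each row is one random-walk path W_t = W_{t-1} + eps_t, W_0 = 0
eps = rng.normal(size=(n_paths, T))
W = eps.cumsum(axis=1)

# Var(W_t) = t * Var(eps): compare t = 100 with t = 400
v100 = W[:, 99].var()
v400 = W[:, 399].var()
print(v100, v400, v400 / v100)   # ratio should be near 4
```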
29 1. Stationarity Special cases VII 8. AR(p), or p-th order autoregression. If ɛ is w.n.(0, ν²),
W_t = Σ_{j=1}^p ρ_j W_{t-j} + ɛ_t, t = ..., -1, 0, 1, ...,
and ρ(z) := 1 - Σ_{j=1}^p ρ_j z^j ≠ 0 on |z| ≤ 1,
then W is covariance stationary.
30 1. Stationarity Special cases VIII 9. MA(q), q-th order moving average. For ɛ w.n.(0, ν²),
W_t = Σ_{j=0}^q θ_j ɛ_{t-j}.
With no further restrictions on θ, provided only that q is finite, W is covariance stationary (and has a well-defined covariogram). If q → ∞, W remains covariance stationary provided Σ_{j=0}^∞ θ_j² < ∞ (square-summable coefficients). No conditions on the locations of the zeroes of Σ_j θ_j z^j. When q is finite, E[W_t W_{t-s}] = 0 for |s| > q, i.e., the covariogram vanishes outside a finite interval. E[W_{t-k} ɛ_t] = 0 for k = 1, 2, ....
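For a finite MA(q) the covariogram can be computed directly from the coefficients: G(s) = ν² Σ_j θ_j θ_{j+s} for 0 ≤ s ≤ q, and zero beyond q. A sketch (the MA(1) coefficients and ν² are illustrative):

```python
import numpy as np

def ma_covariogram(theta, nu2):
    # G(s) = nu2 * sum_j theta_j * theta_{j+s}, for s = 0, ..., q; zero beyond q
    theta = np.asarray(theta, dtype=float)
    q = len(theta) - 1
    return np.array([nu2 * np.dot(theta[: q + 1 - s], theta[s:])
                     for s in range(q + 1)])

# MA(1) with theta = (1, 0.5), nu2 = 1:
G = ma_covariogram([1.0, 0.5], 1.0)
print(G)   # G_0 = 1 + 0.25 = 1.25, G_1 = 0.5, and G_s = 0 for s > 1
```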
31 1. Stationarity Special cases IX 10. ARMA(p, q). For ɛ w.n.(0, ν²),
W_t = Σ_{j=1}^p ρ_j W_{t-j} + Σ_{k=0}^q θ_k ɛ_{t-k},
with E[W_{t-k} ɛ_t] = 0 for k = 1, 2, .... Covariance stationary provided (1 - Σ_{j=1}^p ρ_j z^j) vanishes nowhere on |z| ≤ 1.
32 1. Stationarity Special cases X In all cases we can derive the (typically infinite) MAR in current and lagged ɛ:
W_t = Σ_{j=0}^∞ γ_j ɛ_{t-j},
with Σ_j γ_j² < ∞ and E[W_{t-k} ɛ_t] = 0, k = 1, 2, ....
33 2. Some complex numbers. Spectral density Outline
34 2. Some complex numbers. Spectral density For covariance stationarity the earlier p-th order autoregression referred to the condition
ρ(z) = 1 - Σ_{j=1}^p ρ_j z^j ≠ 0 on |z| ≤ 1.
1. The terminology says ρ has no zeroes inside the unit circle, or all the zeroes of ρ are outside the unit circle. 2. (First-order check) When p = 1 the function is ρ(z) = 1 - ρ_1 z. When |ρ_1| < 1, the zero of the function ρ (i.e., the z that sets ρ(z) = 0) is 1/ρ_1, and that is outside the unit circle: |1/ρ_1| = 1/|ρ_1| > 1.
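This zero-location check is easy to automate: form the polynomial ρ(z) = 1 - Σ ρ_j z^j and verify all its roots have modulus strictly greater than 1. A sketch (the AR(2) coefficients are illustrative):

```python
import numpy as np

def ar_is_stationary(rho):
    # rho = (rho_1, ..., rho_p); rho(z) = 1 - rho_1 z - ... - rho_p z^p.
    # np.roots expects coefficients ordered from highest power down.
    coeffs = np.concatenate(([-r for r in rho[::-1]], [1.0]))
    roots = np.roots(coeffs)
    return bool(np.all(np.abs(roots) > 1.0))

print(ar_is_stationary([0.5]))        # True: zero at z = 2, outside |z| <= 1
print(ar_is_stationary([1.0]))        # False: unit root, zero at z = 1
print(ar_is_stationary([0.5, 0.3]))   # True: both zeros outside the unit circle
```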
35 2. Some complex numbers. Spectral density The unit circle. Euler's formula:
z = cos ω + i sin ω = e^{iω}, with |z| = |e^{iω}| = 1.
36 2. Some complex numbers. Spectral density Fourier transform I The formula f(X) = a + bX describes: 1. A straight line relating X to Y = f(X).
37 2. Some complex numbers. Spectral density Fourier transform II 2. A mapping taking the pair of numbers (a, b) to a function that happens to be a straight line: Ψ(a, b) → a straight line with intercept a and slope b. A different operator might map (a, b) to a quadratic aX² + bX, or an exponential function a exp(bX), or a sine wave a sin(bX), and so on. Generalise: If a = (..., a_{-2}, a_{-1}, a_0, a_1, a_2, ...) is a sequence of numbers, we say its Fourier transform is the mapping that takes the sequence a to the function
ã(ω) = Σ_{j=-∞}^∞ a_j e^{-ijω}, ω ∈ (-π, π].
38 2. Some complex numbers. Spectral density Fourier transform III Notice that if we write z = e^{-iω} (as before, except now with a reciprocal) then the Fourier transform is just
ã(ω) = Σ_{j=-∞}^∞ a_j z^j, z = e^{-iω},
and z = e^{-iω} over ω ∈ (-π, π] is just |z| = 1. Finally, notice that on the unit circle the set (-π, π] traces out exactly the same range as (0, 2π]. It is natural then to interpret ω as frequency: any given, fixed value of ω says how quickly z^j circumnavigates the unit circle as j rises through the integers:
z^j = e^{-iωj} = cos(ωj) - i sin(ωj).
39 2. Some complex numbers. Spectral density A compact toolkit I Previously encountered sequences and their now compact manipulation: 1. MAR γ_0, γ_1, ..., typically with Σ_j γ_j² < ∞. We can write γ(z) = Σ_j γ_j z^j. 2. AR(p) coefficients ρ_1, ρ_2, ...: we have already defined and analysed ρ(z) := 1 - Σ_{j=1}^p ρ_j z^j. 3. The moving average coefficients in an ARMA process
W_t = Σ_{j=1}^p ρ_j W_{t-j} + Σ_{k=0}^q θ_k ɛ_{t-k}
can be found as the coefficients on z^m in
γ(z) = Σ_{m=0}^∞ γ_m z^m = (Σ_{k=0}^q θ_k z^k) / (1 - Σ_{j=1}^p ρ_j z^j).
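The division θ(z)/ρ(z) can be carried out coefficient by coefficient: matching powers of z^m in ρ(z) γ(z) = θ(z) gives the recursion γ_m = θ_m + Σ_{j=1}^{min(m,p)} ρ_j γ_{m-j}, with θ_m = 0 for m > q. A sketch:

```python
import numpy as np

def mar_coefficients(rho, theta, n):
    # First n+1 moving-average-representation coefficients gamma_0, ..., gamma_n
    # of the ARMA process (1 - sum_j rho_j z^j) gamma(z) = sum_k theta_k z^k.
    gamma = np.zeros(n + 1)
    for m in range(n + 1):
        g = theta[m] if m < len(theta) else 0.0
        for j in range(1, min(m, len(rho)) + 1):
            g += rho[j - 1] * gamma[m - j]
        gamma[m] = g
    return gamma

# Sanity check on an AR(1): theta = (1,), rho = (0.5,) should give gamma_m = 0.5^m
print(mar_coefficients([0.5], [1.0], 4))
```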
40 2. Some complex numbers. Spectral density A compact toolkit II 4. Covariogram G = { G_j : integer j }. Write G(z) = Σ_{j=-∞}^∞ G_j z^j and call this too the covariogram. 5. It might seem like we are overloading symbols like ρ with different meanings: sometimes ρ is a function, sometimes a sequence, sometimes just a single number ρ_j. So too the name covariogram: a function, a sequence, a component of a VCV matrix?! But it is always easy to tell from context what is meant, and overloading terms like this is much preferred to proliferating terminology and notation excessively. (And since mathematicians and computer scientists do it as well, how awful can it be?)
41 2. Some complex numbers. Spectral density Fourier inversion I The Fourier transform can be inverted to give the original sequence, term by term: Theorem. When ã is the Fourier transform of the sequence a = { ..., a_{-1}, a_0, a_1, ... }, then
a_k = (1/2π) ∫_{-π}^{+π} ã(ω) e^{+iωk} dω, for every integer k.
42 2. Some complex numbers. Spectral density Fourier inversion II This follows from writing out the right side
(1/2π) ∫_{-π}^{π} ã(ω) e^{iωk} dω = (1/2π) Σ_j a_j ∫_{-π}^{π} e^{-iω(j-k)} dω = (1/2π)(2π) a_k = a_k,
after noticing that for all j ≠ k,
∫_{-π}^{π} e^{-iω(j-k)} dω = 0.
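The inversion formula is easy to verify numerically: take a short sequence, form ã(ω) = Σ_j a_j e^{-ijω}, and recover each a_k by integrating ã(ω) e^{+iωk} over (-π, π]. A sketch (the sequence is illustrative, and the integral is approximated by a midpoint rule):

```python
import numpy as np

a = {-1: 0.3, 0: 1.0, 2: -0.7}   # a sparse doubly-infinite sequence

def a_tilde(omega):
    # Fourier transform: sum_j a_j e^{-i j omega}
    return sum(aj * np.exp(-1j * j * omega) for j, aj in a.items())

def recover(k, N=4096):
    # a_k = (1/2pi) * integral over (-pi, pi] of a_tilde(w) e^{+i w k} dw,
    # approximated by a midpoint rule on an N-point grid
    omega = -np.pi + (np.arange(N) + 0.5) * (2 * np.pi / N)
    return (a_tilde(omega) * np.exp(1j * omega * k)).mean().real

for k in (-1, 0, 1, 2):
    print(k, recover(k))   # recovers 0.3, 1.0, 0.0, -0.7
```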
43 2. Some complex numbers. Spectral density Spectral density When a (scalar) stochastic process X has covariogram G, we call G's Fourier transform the spectral density of X:
S(ω) = Σ_{j=-∞}^∞ G_j e^{-iωj}, ω ∈ (-π, π].
1. The spectral density is just the covariogram evaluated on the unit circle. 2. Since G is symmetric about 0, i.e., G_j = G_{-j}, the spectral density is real and also symmetric about 0. Moreover, since
G_0 = (1/2π) ∫_{-π}^{π} S(ω) dω,
we can interpret S as spreading out X's variability across the range of frequencies ω ∈ (0, 2π].
44 2. Some complex numbers. Spectral density Examples I 1. (w.n.) Then G_j = G_0 for j = 0; 0 otherwise. So
S(ω) = Σ_j G_j e^{-iωj} = G_0.
2. (MA) Then G_j = 0 for |j| > 1:
S(ω) = G_0 + G_1 e^{-iω} + G_1 e^{iω} = G_0 + G_1 [e^{-iω} + e^{iω}] = G_0 + 2 G_1 cos(ω).
45 2. Some complex numbers. Spectral density Examples II 3. (AR) Then G_j = (ν²/(1 - ρ²)) ρ^|j| and
S(ω) = (ν²/(1 - ρ²)) Σ_{j=-∞}^∞ ρ^|j| e^{-iωj}
= ν² / [(1 - ρ e^{-iω})(1 - ρ e^{iω})]
= ν² / [(1 + ρ²) - ρ e^{-iω} - ρ e^{iω}]
= [(1 + ρ²) - 2ρ cos ω]^{-1} ν².
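A numerical cross-check of this closed form: truncate the sum (ν²/(1 - ρ²)) Σ ρ^|j| e^{-iωj} at a long lag and compare it with [(1 + ρ²) - 2ρ cos ω]^{-1} ν² on a grid of frequencies. A sketch (ρ = 0.6, ν² = 1 are illustrative):

```python
import numpy as np

rho, nu2 = 0.6, 1.0
omega = np.linspace(-np.pi, np.pi, 101)

# Truncated covariogram sum: S(w) ~ sum over |j| <= J of G_j e^{-i w j}
J = 200
j = np.arange(-J, J + 1)
G = nu2 / (1 - rho ** 2) * rho ** np.abs(j)
S_sum = (G[None, :] * np.exp(-1j * np.outer(omega, j))).sum(axis=1).real

# Closed form derived above
S_closed = nu2 / ((1 + rho ** 2) - 2 * rho * np.cos(omega))

print(np.max(np.abs(S_sum - S_closed)))   # tiny: only the rho^J tail is missing
```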
46 2. Some complex numbers. Spectral density Spectral density: w.n.
47 2. Some complex numbers. Spectral density Spectral density: MA
48 2. Some complex numbers. Spectral density Spectral density: AR
49 2. Some complex numbers. Spectral density Spectral density: MA, negative
50 2. Some complex numbers. Spectral density Spectral density: AR, richer
51 2. Some complex numbers. Spectral density Spectral density: ARMA, almost w.n.
52 2. Some complex numbers. Spectral density Spectral density: US growth, Maddison Project
53 2. Some complex numbers. Spectral density Spectral density: US growth (renormalised variance), Maddison Project
54 3. Wold Representation. Innovations Outline
55 3. Wold Representation. Innovations Wold Representation Theorem Theorem. Suppose X is zero mean and covariance stationary. Then there are 1. a (one-sided) sequence of numbers C and 2. a white noise ɛ ~ w.n.(0, ν²) such that 3. at each time t the random variable ɛ_t is uncorrelated with all linear combinations of lagged X's (i.e., of X_{t-1}, X_{t-2}, ...); and 4. X_t = Σ_{j=0}^∞ C_j ɛ_{t-j} (up to inessential deterministic components).
56 3. Wold Representation. Innovations 1. At each t the random variable ɛ_t is news relative to the history of X, i.e., it is unforecastable by the information in X. We call ɛ an innovations process for X. 2. Defining C(z) = Σ_j C_j z^j, the covariogram of X can be found as the coefficients in
G(z) = C(z) C(z^{-1}) ν² = Σ_{k=-∞}^∞ G_k z^k.
3. The spectral density of X is
S(ω) = G(e^{-iω}) = C(e^{-iω}) C(e^{-iω})* ν² = |C(e^{-iω})|² ν²
(where * denotes complex conjugate).
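For a finite Wold sequence C the factorisation G(z) = C(z) C(z^{-1}) ν² can be checked mechanically: the coefficients of C(z) C(z^{-1}) are the convolution of the sequence C with its reversal. A sketch with C truncated at two terms, i.e., an MA(1) (the values of C and ν² are illustrative):

```python
import numpy as np

C = np.array([1.0, 0.4])   # C_0, C_1: a Wold sequence truncated at two terms
nu2 = 2.0

# Coefficients of G(z) = C(z) C(1/z) nu2, on powers z^{-1}, z^0, z^{+1}:
G = np.convolve(C, C[::-1]) * nu2
print(G)   # G_{-1} = G_1 = C_0*C_1*nu2 = 0.8, G_0 = (C_0^2 + C_1^2)*nu2 = 2.32
```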
57 3. Wold Representation. Innovations 4. The spectral density of X at 0 is the sum of the covariogram:
S(0) = G(e^{-i·0}) = Σ_{j=-∞}^∞ G_j · 1^j = Σ_{j=-∞}^∞ G_j.
5. The variance of X is Var(ɛ) · Σ_j C_j² (confirmed either directly or by taking the inverse Fourier transform of the spectral density).
58 4. Central Limit Theory and Laws of Large Numbers Outline
59 4. Central Limit Theory and Laws of Large Numbers Layering the intuition I Theorem. For a sufficiently well-behaved zero-mean stochastic process u there exists b > 0 such that
b^{-1} T^{-1/2} Σ_{t=1}^T u_t →_L N(0, 1)   (CLT)
T^{-1} Σ_{t=1}^T u_t →_Pr 0   (LLN)
(Trivial) u_t ~ iid N(0, σ²); then σ^{-1} T^{-1/2} Σ_{t=1}^T u_t ~ N(0, 1).
60 4. Central Limit Theory and Laws of Large Numbers Layering the intuition II (Lindeberg-Lévy) u_t iid, E u_t = 0, Var u_t = σ² ≠ 0; then
σ^{-1} T^{-1/2} Σ_{t=1}^T u_t →_L N(0, 1).
Proof: Use characteristic functions. Write that of u_t as
φ_t(z) = ∫ e^{izx} dF_t(x),
where F_t is the df of u_t. By independence, the characteristic function Φ_T of σ^{-1} T^{-1/2} Σ_{t=1}^T u_t
61 4. Central Limit Theory and Laws of Large Numbers Layering the intuition III is a product of individual characteristic functions. Taking Taylor expansions up to second order,
Φ_T(z) = [1 - z²/(2T) + o(z²/T)]^T → e^{-z²/2} as T → ∞.
(Liapunov) u_t independent, not identically distributed (inid), E u_t = 0, E u_t² = σ_t², finite third moments; then
(Σ_{t=1}^T σ_t²)^{-1/2} Σ_{t=1}^T u_t →_L N(0, 1).
62 4. Central Limit Theory and Laws of Large Numbers Layering the intuition IV (Lindeberg-Feller) Same as Liapunov but with the finite third moment condition replaced by
lim_{T→∞} max_{1≤t≤T} σ_t / C_T = 0, where C_T = (Σ_{t=1}^T σ_t²)^{1/2};
then the same CLT conclusion follows, i.e.,
(Σ_{t=1}^T σ_t²)^{-1/2} Σ_{t=1}^T u_t →_L N(0, 1).
63 4. Central Limit Theory and Laws of Large Numbers Layering the intuition V (Thm in White [1983]) Suppose u_t is stationary with E u_t² = σ² < ∞. Assume
E(u_t | u_{t-m}, u_{t-m-1}, u_{t-m-2}, ...) →_{q.m.} 0 as m → ∞
and Σ_{j=0}^∞ (Var R_j)^{1/2} < ∞, where
R_j = E(u_0 | u_{-j}, u_{-j-1}, ...) - E(u_0 | u_{-j-1}, u_{-j-2}, ...).
If λ² = Σ_s E(u_t u_{t-s}) > 0, then
λ^{-1} T^{-1/2} Σ_{t=1}^T u_t →_L N(0, 1).
64 4. Central Limit Theory and Laws of Large Numbers CLT Metatheorem I Theorem. A zero-mean stochastic process u that is covariance stationary and ergodic satisfies
S(0)^{-1/2} T^{-1/2} Σ_{t=1}^T u_t →_L N(0, 1),
with S the spectral density of u. Ergodic? The process sweeps out the entire sample space sufficiently often that sample averages converge to the underlying population averages (which are, therefore, constants).
65 4. Central Limit Theory and Laws of Large Numbers CLT fails I (Fails to be ergodic) For u_0 and ɛ w.n., the process W_t = u_0 + ɛ_t is covariance stationary but fails to be ergodic: it gets stuck around the value u_0. (Fails spectral density positive at zero) Suppose ɛ is w.n., not normally distributed, and W_t = ɛ_t - ɛ_{t-1}. Obviously W has mean 0 and is covariance stationary (moreover, it is ergodic). Yet because
66 4. Central Limit Theory and Laws of Large Numbers CLT fails II
Σ_{t=1}^T W_t = Σ_{t=1}^T (ɛ_t - ɛ_{t-1}) = ɛ_T - ɛ_0,
in actuality W does not satisfy any CLT:
T^{-1/2} Σ_{t=1}^T W_t →_Pr 0 (not →_L N(·, ·))
(no power of T can scale the running sums to produce a nondegenerate normal distribution). Here we see it is because the running sums telescope to just two terms, the beginning and the end ɛ. This, at first, appears very
67 4. Central Limit Theory and Laws of Large Numbers CLT fails III specific to this first-order MA. The condition, however, generalises to the spectral density of W vanishing at frequency 0:
S(ω) = |1 - e^{-iω}|² Var(ɛ) = (1 - cos ω) · 2 Var(ɛ) = 0 at ω = 0.
Any process with spectral density vanishing at frequency 0 will violate CLT.
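Simulating the telescoping example makes the failure visible: T^{-1/2} Σ_{t=1}^T W_t = T^{-1/2}(ɛ_T - ɛ_0) collapses to zero as T grows instead of approaching a nondegenerate normal. A sketch (the replication count is illustrative):

```python
import numpy as np

rng = np.random.default_rng(2)

def scaled_sum(T, n_reps=5_000):
    # T^{-1/2} * sum_{t=1}^T (eps_t - eps_{t-1}) = T^{-1/2} * (eps_T - eps_0)
    eps = rng.normal(size=(n_reps, T + 1))
    W = eps[:, 1:] - eps[:, :-1]
    return W.sum(axis=1) / np.sqrt(T)

# Cross-replication variance of the scaled sum shrinks like 2/T,
# so the limit distribution is degenerate at 0:
for T in (10, 100, 1000):
    print(T, scaled_sum(T).var())
```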
68 5. Examples Outline
69 5. Examples Examples 1. Expanded in EX1 2. Economic growth 3. Public debt 4. Exchange rates 5. Geography 6....
70 6. Conclusion Outline
71 6. Conclusion Concepts to remember and use 1. Stationarity. Covariance stationarity. 2. Covariogram. 3. ARMA processes. Zeroes outside the unit circle. Moving average representation. 4. Fourier transform. Spectral density. 5. Wold representation. Innovations 6. CLT, LLN
Economics Department LSE, EC402, Lent 2015. Danny Quah, TW1.10.01A, x7535. Econometrics: Timeseries. EXERCISE 1: SERIAL CORRELATION (ANALYTICAL) 1. Suppose ɛ is w.n.(0, σ²), |ρ| < 1, and W_t = ρ W_{t-1} + ɛ_t, for t = 1, 2, ....
Bent E. Sørensen February 22, 2007 1 Teaching notes on structural VARs. 1.1 Vector MA models: 1.1.1 Probability theory The simplest (to analyze, estimation is a different matter) time series models are
More informationTime Series Examples Sheet
Lent Term 2001 Richard Weber Time Series Examples Sheet This is the examples sheet for the M. Phil. course in Time Series. A copy can be found at: http://www.statslab.cam.ac.uk/~rrw1/timeseries/ Throughout,
More informationLesson 9: Autoregressive-Moving Average (ARMA) models
Lesson 9: Autoregressive-Moving Average (ARMA) models Dipartimento di Ingegneria e Scienze dell Informazione e Matematica Università dell Aquila, umberto.triacca@ec.univaq.it Introduction We have seen
More informationProbability Space. J. McNames Portland State University ECE 538/638 Stochastic Signals Ver
Stochastic Signals Overview Definitions Second order statistics Stationarity and ergodicity Random signal variability Power spectral density Linear systems with stationary inputs Random signal memory Correlation
More informationγ 0 = Var(X i ) = Var(φ 1 X i 1 +W i ) = φ 2 1γ 0 +σ 2, which implies that we must have φ 1 < 1, and γ 0 = σ2 . 1 φ 2 1 We may also calculate for j 1
4.2 Autoregressive (AR) Moving average models are causal linear processes by definition. There is another class of models, based on a recursive formulation similar to the exponentially weighted moving
More informationPermanent Income Hypothesis (PIH) Instructor: Dmytro Hryshko
Permanent Income Hypothesis (PIH) Instructor: Dmytro Hryshko 1 / 36 The PIH Utility function is quadratic, u(c t ) = 1 2 (c t c) 2 ; borrowing/saving is allowed using only the risk-free bond; β(1 + r)
More informationCovariance Stationary Time Series. Example: Independent White Noise (IWN(0,σ 2 )) Y t = ε t, ε t iid N(0,σ 2 )
Covariance Stationary Time Series Stochastic Process: sequence of rv s ordered by time {Y t } {...,Y 1,Y 0,Y 1,...} Defn: {Y t } is covariance stationary if E[Y t ]μ for all t cov(y t,y t j )E[(Y t μ)(y
More informationECO 513 Fall 2009 C. Sims CONDITIONAL EXPECTATION; STOCHASTIC PROCESSES
ECO 513 Fall 2009 C. Sims CONDIIONAL EXPECAION; SOCHASIC PROCESSES 1. HREE EXAMPLES OF SOCHASIC PROCESSES (I) X t has three possible time paths. With probability.5 X t t, with probability.25 X t t, and
More informationSTA205 Probability: Week 8 R. Wolpert
INFINITE COIN-TOSS AND THE LAWS OF LARGE NUMBERS The traditional interpretation of the probability of an event E is its asymptotic frequency: the limit as n of the fraction of n repeated, similar, and
More informationELEG 3143 Probability & Stochastic Process Ch. 6 Stochastic Process
Department of Electrical Engineering University of Arkansas ELEG 3143 Probability & Stochastic Process Ch. 6 Stochastic Process Dr. Jingxian Wu wuj@uark.edu OUTLINE 2 Definition of stochastic process (random
More informationApplied time-series analysis
Robert M. Kunst robert.kunst@univie.ac.at University of Vienna and Institute for Advanced Studies Vienna October 18, 2011 Outline Introduction and overview Econometric Time-Series Analysis In principle,
More informationCh. 15 Forecasting. 1.1 Forecasts Based on Conditional Expectations
Ch 15 Forecasting Having considered in Chapter 14 some of the properties of ARMA models, we now show how they may be used to forecast future values of an observed time series For the present we proceed
More informationMA Advanced Econometrics: Applying Least Squares to Time Series
MA Advanced Econometrics: Applying Least Squares to Time Series Karl Whelan School of Economics, UCD February 15, 2011 Karl Whelan (UCD) Time Series February 15, 2011 1 / 24 Part I Time Series: Standard
More informationCh 4. Models For Stationary Time Series. Time Series Analysis
This chapter discusses the basic concept of a broad class of stationary parametric time series models the autoregressive moving average (ARMA) models. Let {Y t } denote the observed time series, and {e
More informationAutoregressive Moving Average (ARMA) Models and their Practical Applications
Autoregressive Moving Average (ARMA) Models and their Practical Applications Massimo Guidolin February 2018 1 Essential Concepts in Time Series Analysis 1.1 Time Series and Their Properties Time series:
More informationChapter 2. Some basic tools. 2.1 Time series: Theory Stochastic processes
Chapter 2 Some basic tools 2.1 Time series: Theory 2.1.1 Stochastic processes A stochastic process is a sequence of random variables..., x 0, x 1, x 2,.... In this class, the subscript always means time.
More informationESSE Mid-Term Test 2017 Tuesday 17 October :30-09:45
ESSE 4020 3.0 - Mid-Term Test 207 Tuesday 7 October 207. 08:30-09:45 Symbols have their usual meanings. All questions are worth 0 marks, although some are more difficult than others. Answer as many questions
More information1.1. VARs, Wold representations and their limits
1. Shocks Nr. 1 1.1. VARs, Wold representations and their limits A brief review of VARs. Assume a true model, in MA form: X = A 0 e + A 1 e( 1) + A 2 e( 2) +...; E(ee ) = I = (A 0 + A 1 L + A 2 L 2 +...)
More informationTAKEHOME FINAL EXAM e iω e 2iω e iω e 2iω
ECO 513 Spring 2015 TAKEHOME FINAL EXAM (1) Suppose the univariate stochastic process y is ARMA(2,2) of the following form: y t = 1.6974y t 1.9604y t 2 + ε t 1.6628ε t 1 +.9216ε t 2, (1) where ε is i.i.d.
More informationStochastic Processes. M. Sami Fadali Professor of Electrical Engineering University of Nevada, Reno
Stochastic Processes M. Sami Fadali Professor of Electrical Engineering University of Nevada, Reno 1 Outline Stochastic (random) processes. Autocorrelation. Crosscorrelation. Spectral density function.
More informationCalculation of ACVF for ARMA Process: I consider causal ARMA(p, q) defined by
Calculation of ACVF for ARMA Process: I consider causal ARMA(p, q) defined by φ(b)x t = θ(b)z t, {Z t } WN(0, σ 2 ) want to determine ACVF {γ(h)} for this process, which can be done using four complementary
More informationPure Random process Pure Random Process or White Noise Process: is a random process {X t, t 0} which has: { σ 2 if k = 0 0 if k 0
MODULE 9: STATIONARY PROCESSES 7 Lecture 2 Autoregressive Processes 1 Moving Average Process Pure Random process Pure Random Process or White Noise Process: is a random process X t, t 0} which has: E[X
More informationStochastic Processes
Stochastic Processes Stochastic Process Non Formal Definition: Non formal: A stochastic process (random process) is the opposite of a deterministic process such as one defined by a differential equation.
More information3 Theory of stationary random processes
3 Theory of stationary random processes 3.1 Linear filters and the General linear process A filter is a transformation of one random sequence {U t } into another, {Y t }. A linear filter is a transformation
More informationElements of Multivariate Time Series Analysis
Gregory C. Reinsel Elements of Multivariate Time Series Analysis Second Edition With 14 Figures Springer Contents Preface to the Second Edition Preface to the First Edition vii ix 1. Vector Time Series
More informationGARCH Models. Eduardo Rossi University of Pavia. December Rossi GARCH Financial Econometrics / 50
GARCH Models Eduardo Rossi University of Pavia December 013 Rossi GARCH Financial Econometrics - 013 1 / 50 Outline 1 Stylized Facts ARCH model: definition 3 GARCH model 4 EGARCH 5 Asymmetric Models 6
More informationECON/FIN 250: Forecasting in Finance and Economics: Section 6: Standard Univariate Models
ECON/FIN 250: Forecasting in Finance and Economics: Section 6: Standard Univariate Models Patrick Herb Brandeis University Spring 2016 Patrick Herb (Brandeis University) Standard Univariate Models ECON/FIN
More informationMultivariate Time Series
Multivariate Time Series Notation: I do not use boldface (or anything else) to distinguish vectors from scalars. Tsay (and many other writers) do. I denote a multivariate stochastic process in the form
More informationStrictly Stationary Solutions of Autoregressive Moving Average Equations
Strictly Stationary Solutions of Autoregressive Moving Average Equations Peter J. Brockwell Alexander Lindner Abstract Necessary and sufficient conditions for the existence of a strictly stationary solution
More informationConsider the trend-cycle decomposition of a time series y t
1 Unit Root Tests Consider the trend-cycle decomposition of a time series y t y t = TD t + TS t + C t = TD t + Z t The basic issue in unit root testing is to determine if TS t = 0. Two classes of tests,
More informationat least 50 and preferably 100 observations should be available to build a proper model
III Box-Jenkins Methods 1. Pros and Cons of ARIMA Forecasting a) need for data at least 50 and preferably 100 observations should be available to build a proper model used most frequently for hourly or
More informationChapter 1. Basics. 1.1 Definition. A time series (or stochastic process) is a function Xpt, ωq such that for
Chapter 1 Basics 1.1 Definition A time series (or stochastic process) is a function Xpt, ωq such that for each fixed t, Xpt, ωq is a random variable [denoted by X t pωq]. For a fixed ω, Xpt, ωq is simply
More informationAsymptotic distribution of GMM Estimator
Asymptotic distribution of GMM Estimator Eduardo Rossi University of Pavia Econometria finanziaria 2010 Rossi (2010) GMM 2010 1 / 45 Outline 1 Asymptotic Normality of the GMM Estimator 2 Long Run Covariance
More informationWeek 5 Quantitative Analysis of Financial Markets Characterizing Cycles
Week 5 Quantitative Analysis of Financial Markets Characterizing Cycles Christopher Ting http://www.mysmu.edu/faculty/christophert/ Christopher Ting : christopherting@smu.edu.sg : 6828 0364 : LKCSB 5036
More informationARMA (and ARIMA) models are often expressed in backshift notation.
Backshift Notation ARMA (and ARIMA) models are often expressed in backshift notation. B is the backshift operator (also called the lag operator ). It operates on time series, and means back up by one time
More informationAR, MA and ARMA models
AR, MA and AR by Hedibert Lopes P Based on Tsay s Analysis of Financial Time Series (3rd edition) P 1 Stationarity 2 3 4 5 6 7 P 8 9 10 11 Outline P Linear Time Series Analysis and Its Applications For
More informationChapter 9: Forecasting
Chapter 9: Forecasting One of the critical goals of time series analysis is to forecast (predict) the values of the time series at times in the future. When forecasting, we ideally should evaluate the
More informationForecasting with ARMA
Forecasting with ARMA Eduardo Rossi University of Pavia October 2013 Rossi Forecasting Financial Econometrics - 2013 1 / 32 Mean Squared Error Linear Projection Forecast of Y t+1 based on a set of variables
More informationEconometría 2: Análisis de series de Tiempo
Econometría 2: Análisis de series de Tiempo Karoll GOMEZ kgomezp@unal.edu.co http://karollgomez.wordpress.com Segundo semestre 2016 IX. Vector Time Series Models VARMA Models A. 1. Motivation: The vector
More informationStochastic process for macro
Stochastic process for macro Tianxiao Zheng SAIF 1. Stochastic process The state of a system {X t } evolves probabilistically in time. The joint probability distribution is given by Pr(X t1, t 1 ; X t2,
More informationTopic 4 Unit Roots. Gerald P. Dwyer. February Clemson University
Topic 4 Unit Roots Gerald P. Dwyer Clemson University February 2016 Outline 1 Unit Roots Introduction Trend and Difference Stationary Autocorrelations of Series That Have Deterministic or Stochastic Trends
More informationLecture 2: ARMA(p,q) models (part 2)
Lecture 2: ARMA(p,q) models (part 2) Florian Pelgrin University of Lausanne, École des HEC Department of mathematics (IMEA-Nice) Sept. 2011 - Jan. 2012 Florian Pelgrin (HEC) Univariate time series Sept.
More informationIf we want to analyze experimental or simulated data we might encounter the following tasks:
Chapter 1 Introduction If we want to analyze experimental or simulated data we might encounter the following tasks: Characterization of the source of the signal and diagnosis Studying dependencies Prediction
More informationCh 6. Model Specification. Time Series Analysis
We start to build ARIMA(p,d,q) models. The subjects include: 1 how to determine p, d, q for a given series (Chapter 6); 2 how to estimate the parameters (φ s and θ s) of a specific ARIMA(p,d,q) model (Chapter
More informationBasics: Definitions and Notation. Stationarity. A More Formal Definition
Basics: Definitions and Notation A Univariate is a sequence of measurements of the same variable collected over (usually regular intervals of) time. Usual assumption in many time series techniques is that
More informationChapter 6: Model Specification for Time Series
Chapter 6: Model Specification for Time Series The ARIMA(p, d, q) class of models as a broad class can describe many real time series. Model specification for ARIMA(p, d, q) models involves 1. Choosing
More informationA Primer on Asymptotics
A Primer on Asymptotics Eric Zivot Department of Economics University of Washington September 30, 2003 Revised: October 7, 2009 Introduction The two main concepts in asymptotic theory covered in these
More information