EC402: Serial Correlation. Danny Quah Economics Department, LSE Lent 2015


OUTLINE
1. Stationarity
  1.1 Covariance stationarity
  1.2 Explicit Models. Special cases: ARMA processes
2. Some complex numbers. Spectral density
3. Wold Representation. Innovations
4. Central Limit Theory and Laws of Large Numbers
5. Examples
6. Conclusion


1. Stationarity Four economies Long run growth

1. Stationarity log US GDP Long run growth

1. Stationarity US GDP Long run growth

1. Stationarity G7, EMDE Convergence

1. Stationarity G7, EMDE Not decoupling

1. Stationarity Trend and cycle? Annual growth rates.

Subsample:                        1980-2000   2001-2014
Cross-correlation, G7 and EMDE       0.2         0.8

Table: The cross-correlation between G7 and EMDE growth rates has increased four-fold (data from IMF WEO, Oct 2014).

$$Y_t = X_t + Z_t \implies \Delta Y_t = \Delta X_t + \Delta Z_t$$

1. Stationarity G7, EMDE Long-term decoupling?


1. Stationarity Covariance stationarity
Say a process $W = \{ W_t : t = \dots, -1, 0, 1, \dots \}$ is covariance stationary (or second-order stationary) when
1. $E W_t = E W_0$ for all $t$;
2. $E[W_t W_{t-s}]$, for any $s$, is finite and independent of $t$.
Without loss of generality take $E W_0 = 0$. Therefore:
$$G_W(s) \overset{\text{def}}{=} E[W_t W_{t-s}] = E[W_{t+s} W_t] = E[W_{t-s} W_t] = G_W(-s).$$

1. Stationarity Long Run Economic Growth

1. Stationarity US Economic Growth: Early

1. Stationarity US Economic Growth: Later

1. Stationarity
1. Call $G_W$ the covariogram: symmetric about 0, i.e., $G_W(j) = G_W(-j)$ for all $j$.
2. The vector $(W_t, W_{t-1}, W_{t-2}, \dots, W_{t-n})$ has VCV
$$\begin{pmatrix}
G_W(0) & G_W(1) & \cdots & G_W(n-1) & G_W(n) \\
G_W(1) & G_W(0) & \cdots & G_W(n-2) & G_W(n-1) \\
\vdots & \vdots & \ddots & \vdots & \vdots \\
G_W(n) & G_W(n-1) & \cdots & G_W(1) & G_W(0)
\end{pmatrix}.$$
The matrix just formed must be positive semidefinite (since it is a VCV); we then say the sequence $G_W$ itself is positive semidefinite.
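As a quick numerical illustration of this positive semidefiniteness, here is a minimal Python sketch (not from the lecture; the AR(1) covariogram and parameter values are assumed for concreteness):

```python
import numpy as np
from scipy.linalg import toeplitz

rho, nu2, n = 0.7, 1.0, 10                 # hypothetical AR(1) parameters
j = np.arange(n + 1)
G = (nu2 / (1 - rho**2)) * rho**j          # covariogram G_W(0), ..., G_W(n)

V = toeplitz(G)                            # VCV of (W_t, W_{t-1}, ..., W_{t-n})
print(np.linalg.eigvalsh(V).min() >= -1e-12)   # True: positive semidefinite
```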

1. Stationarity
3. Often we take the process as given and just write $G$ rather than $G_W$. Often we find it convenient to write the lag as a subscript rather than as an argument in parentheses. Thus the covariogram
$$G = \{ G_j : j = \dots, -2, -1, 0, 1, 2, \dots \}$$
is a doubly-infinite positive semidefinite sequence of numbers, symmetric about 0.

1. Stationarity Covariogram US Economic Growth: Early

1. Stationarity Covariogram US Economic Growth: Later

1. Stationarity
Say the process $W = \{ W_t : t = \dots, -1, 0, 1, \dots \}$ is stationary (or strictly stationary) when for all $n$ and all $t_1, t_2, \dots, t_n$ the joint distributions $F(W_{t_1+s}, W_{t_2+s}, \dots, W_{t_n+s})$ are independent of $s$. Neither covariance stationarity nor strict stationarity is implied by the other.

1. Stationarity
1. Suppose $W_t$ iid Cauchy (infinite variance): strictly stationary but not covariance stationary.
2. Covariance stationarity does not specify moments higher than second order, and thus does not specify the entire distribution.
3. If $W$ is strictly stationary and has finite second-order moments then $W$ is also covariance stationary.
4. If $W$ is covariance stationary and normally distributed then $W$ is also strictly stationary.

1. Stationarity Outline 1. Stationarity 1.1 Covariance stationarity 1.2 Explicit Models. Special cases: ARMA processes 2. Some complex numbers. Spectral density 3. Wold Representation. Innovations 4. Central Limit Theory and Laws of Large Numbers 5. Examples 6. Conclusion

1. Stationarity Special cases I
1. [White noise] A process $W = \{ W_t : t = \dots, -1, 0, 1, \dots \}$ such that $E[W_t] = 0$ and
$$G_W(s) = E[W_t W_{t-s}] = \begin{cases} \nu^2 & \text{for } s = 0; \\ 0 & \text{otherwise.} \end{cases}$$
Call $W$ w.n.$(0, \nu^2)$.

1. Stationarity Special cases II
2. [Autoregressive (AR) process] Suppose $\epsilon = \{ \epsilon_t : t = \dots, -1, 0, 1, \dots \}$ is w.n.$(0, \nu^2)$ and
$$W_t = \rho W_{t-1} + \epsilon_t.$$
Such a $W$ is said to be a first-order autoregressive process, a first-order autoregression, or an AR(1).

1. Stationarity Special cases III
3. Iterate the AR(1) to see:
$$W_t = \epsilon_t + \rho W_{t-1} = \epsilon_t + \rho[\epsilon_{t-1} + \rho W_{t-2}] = \epsilon_t + \rho\epsilon_{t-1} + \rho^2 W_{t-2} = \epsilon_t + \rho\epsilon_{t-1} + \rho^2\epsilon_{t-2} + \rho^3 W_{t-3} = \sum_{s=0}^{t-1} \rho^s \epsilon_{t-s} + \rho^t W_0.$$
When $|\rho| < 1$, consider the limit
$$W_t = \sum_{s=0}^{\infty} \rho^s \epsilon_{t-s} + \lim_{s \to \infty} \rho^s W_{t-s} \implies \operatorname{Var}(W_t) = \nu^2 \sum_{s=0}^{\infty} \rho^{2s} + 0 = (1 - \rho^2)^{-1} \nu^2.$$
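A simulation makes the variance formula concrete. This is a minimal Python sketch, with hypothetical values for $\rho$ and $\nu$:

```python
import numpy as np

rng = np.random.default_rng(0)
rho, nu, T = 0.8, 1.0, 200_000             # hypothetical AR(1) parameters
eps = rng.normal(0.0, nu, T)

W = np.empty(T)
W[0] = eps[0]
for t in range(1, T):
    W[t] = rho * W[t - 1] + eps[t]         # W_t = rho W_{t-1} + eps_t

print(W.var())                             # sample variance
print(nu**2 / (1 - rho**2))                # theoretical (1 - rho^2)^{-1} nu^2
```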

1. Stationarity Special cases IV
Call
$$W_t = \sum_{s=0}^{\infty} \rho^s \epsilon_{t-s} = \sum_{s=0}^{\infty} \gamma_s \epsilon_{t-s}$$
the moving average representation (MAR) in current and lagged $\epsilon$. The MAR gives $W$ in terms of the w.n. $\epsilon$, with $E[W_{t-k} \epsilon_t] = 0$ for $k = 1, 2, \dots$.

1. Stationarity Special cases V
4. An AR(1) with $|\rho| < 1$ is covariance stationary and satisfies both CLT and LLN.
$$E[W_t W_{t-1}] = E\left[ \left( \sum_{s=0}^{\infty} \rho^s \epsilon_{t-s} \right) \left( \sum_{s=0}^{\infty} \rho^s \epsilon_{t-1-s} \right) \right] = E\left[ (\epsilon_t + \rho\epsilon_{t-1} + \rho^2\epsilon_{t-2} + \cdots)(\epsilon_{t-1} + \rho\epsilon_{t-2} + \cdots) \right]$$
$$= (\rho + \rho^3 + \rho^5 + \cdots)\,\nu^2 = (1 + \rho^2 + \rho^4 + \cdots)\,\rho\,\nu^2 = (1 - \rho^2)^{-1} \rho \nu^2 = \rho\, E[W_t^2].$$

1. Stationarity Special cases VI
5. Repeating this calculation gives
$$E[W_t W_{t-s}] = \rho^s E[W_t^2] = (1 - \rho^2)^{-1} \rho^s \nu^2,$$
independent of $t$.
6. If, however, the AR(1) has $\rho = 1$ then
$$W_t = \underbrace{\sum_{s=0}^{t-1} \epsilon_{t-s}}_{\text{does not converge}} + \; 1^t\, W_0.$$
This is the so-called unit root situation; $W$ is not covariance stationary.
7. AR(1) with $|\rho| > 1$?
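The geometric decay $E[W_t W_{t-s}] = \rho^s E[W_t^2]$ is equally easy to verify numerically; a sketch continuing the AR(1) simulation above:

```python
# continuing from the AR(1) simulation sketch above
for s in range(5):
    sample_cov = np.mean(W[s:] * W[:T - s])      # sample E[W_t W_{t-s}]
    theory = rho**s * nu**2 / (1 - rho**2)       # rho^s (1 - rho^2)^{-1} nu^2
    print(s, round(sample_cov, 3), round(theory, 3))
```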

1. Stationarity Special cases VII
8. [AR(p), or p-th order autoregression] If $\epsilon \sim$ w.n.$(0, \nu^2)$,
$$W_t = \sum_{j=1}^{p} \rho_j W_{t-j} + \epsilon_t, \quad t = \dots, -1, 0, 1, \dots,$$
and
$$\rho(z) \overset{\text{def}}{=} 1 - \sum_{j=1}^{p} \rho_j z^j \neq 0 \text{ on } |z| \leq 1,$$
then $W$ is covariance stationary.
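In practice the condition $\rho(z) \neq 0$ on $|z| \leq 1$ can be checked by computing the polynomial's zeroes numerically; a minimal sketch with hypothetical AR(2) coefficients:

```python
import numpy as np

# rho(z) = 1 - 0.5 z - 0.3 z^2, hypothetical coefficients;
# np.roots expects coefficients from the highest power down
zeros = np.roots([-0.3, -0.5, 1.0])
print(zeros, np.all(np.abs(zeros) > 1))    # True: all zeroes outside the unit circle
```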

1. Stationarity Special cases VIII
9. [MA(q), q-th order moving average] $\epsilon \sim$ w.n.$(0, \nu^2)$ and
$$W_t = \sum_{j=0}^{q} \theta_j \epsilon_{t-j}.$$
With no further restrictions on $\theta$, provided only that $q$ is finite, $W$ is covariance stationary (and has a well-defined covariogram). If $q \to \infty$, $W$ remains covariance stationary provided $\sum_{j=0}^{\infty} \theta_j^2 < \infty$ (square-summable coefficients). No conditions on the locations of the zeroes of $\sum_j \theta_j z^j$. When $q$ is finite, $E[W_t W_{t-s}] = 0$ for $|s| > q$, i.e., the covariogram vanishes outside a finite interval. $E[W_{t-k} \epsilon_t] = 0$ for $k = 1, 2, \dots$.

1. Stationarity Special cases IX
10. [ARMA(p, q)] $\epsilon \sim$ w.n.$(0, \nu^2)$ and
$$W_t = \sum_{j=1}^{p} \rho_j W_{t-j} + \sum_{k=0}^{q} \theta_k \epsilon_{t-k},$$
with $E[W_{t-k} \epsilon_t] = 0$ for $k = 1, 2, \dots$. Covariance stationary provided $1 - \sum_{j=1}^{p} \rho_j z^j$ vanishes nowhere on $|z| \leq 1$.

1. Stationarity Special cases X
In all cases we can derive the (typically infinite) MAR in current and lagged $\epsilon$:
$$W_t = \sum_{j=0}^{\infty} \gamma_j \epsilon_{t-j}, \quad \text{with } \sum_j \gamma_j^2 < \infty; \quad E[W_{t-k} \epsilon_t] = 0, \ k = 1, 2, \dots.$$


2. Some complex numbers. Spectral density
For covariance stationarity, the earlier p-th order autoregression referred to the condition
$$\rho(z) = 1 - \sum_{j=1}^{p} \rho_j z^j \neq 0 \text{ on } |z| \leq 1.$$
1. The terminology says $\rho$ has no zeroes inside the unit circle, or all the zeroes of $\rho$ are outside the unit circle.
2. (First-order check) When $p = 1$ the function is $\rho(z) = 1 - \rho_1 z$. When $|\rho_1| < 1$, the zero of the function $\rho$ (i.e., the $z$ that sets $\rho(z) = 0$) is $1/\rho_1$, and that is outside the unit circle: $|1/\rho_1| = 1/|\rho_1| > 1$.

2. Some complex numbers. Spectral density The unit circle. Euler's formula I
[Figure: the unit circle in the complex plane, with a point at angle $\omega$]
$$z = \cos\omega + i\sin\omega = e^{i\omega}, \quad \text{with } |z| = |e^{i\omega}| = 1.$$

2. Some complex numbers. Spectral density Fourier transform I
The formula $f(X) = Y = a + bX$ describes:
1. A straight line relating $X$ to $Y$. [Figure: graph of $Y = f(X)$ against $X$]

2. Some complex numbers. Spectral density Fourier transform II
2. A mapping taking the pair of numbers $(a, b)$ to a function that happens to be a straight line: $\Psi(a, b) \mapsto$ a straight line with intercept $a$ and slope $b$. A different operator might map $(a, b)$ to a quadratic $ax + b/x$, or an exponential function $a \exp(bX)$, or a sine wave $a \sin(bX)$, and so on.
Generalise: If $a = (\dots, a_{-2}, a_{-1}, a_0, a_1, a_2, \dots)$ is a sequence of numbers, we say its Fourier transform is the mapping that takes the sequence $a$ to the function
$$\tilde{a}(\omega) = \sum_{j=-\infty}^{\infty} a_j e^{-ij\omega}, \quad \omega \in (-\pi, \pi].$$

2. Some complex numbers. Spectral density Fourier transform III
Notice that if we write $z = e^{-i\omega}$ (as before, except now with a reciprocal) then the Fourier transform is just
$$\tilde{a}(\omega) = \sum_{j=-\infty}^{\infty} a_j z^j, \quad z = e^{-i\omega},$$
and $z = e^{-i\omega}$ over $\omega \in (-\pi, \pi]$ is just the unit circle $|z| = 1$. Finally, notice that on the unit circle the set $(-\pi, \pi]$ traces out exactly the same range as $(0, 2\pi]$. It is natural then to interpret $\omega$ as frequency: any given, fixed value of $\omega$ says how quickly
$$z^j = e^{-i\omega j} = \cos(\omega j) - i\sin(\omega j)$$
circumnavigates the unit circle as $j$ rises through the integers.

2. Some complex numbers. Spectral density A compact toolkit I
Previously encountered sequences and their now compact manipulation:
1. MAR coefficients $\gamma_0, \gamma_1, \dots$, typically with $\sum_j \gamma_j^2 < \infty$. We can write $\gamma(z) = \sum_j \gamma_j z^j$.
2. AR(p) coefficients $\rho_1, \rho_2, \dots$: we have already defined and analysed $\rho(z) \overset{\text{def}}{=} 1 - \sum_{j=1}^{p} \rho_j z^j$.
3. The moving average coefficients in an ARMA process $W_t = \sum_{j=1}^{p} \rho_j W_{t-j} + \sum_{k=0}^{q} \theta_k \epsilon_{t-k}$ can be found as the coefficients on $z^m$ in
$$\gamma(z) = \sum_{m=0}^{\infty} \gamma_m z^m = \frac{\sum_{k=0}^{q} \theta_k z^k}{1 - \sum_{j=1}^{p} \rho_j z^j}$$
(see the sketch below).
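Item 3 translates directly into a recursion: matching coefficients in $\gamma(z)\,(1 - \sum_j \rho_j z^j) = \theta(z)$ gives $\gamma_m = \theta_m + \sum_j \rho_j \gamma_{m-j}$. A minimal Python sketch (the ARMA(1,1) coefficients are hypothetical):

```python
import numpy as np

def mar_coeffs(rho, theta, n):
    # gamma_m = theta_m + sum_j rho_j gamma_{m-j}, with theta_m = 0 for m > q
    gamma = np.zeros(n)
    for m in range(n):
        g = theta[m] if m < len(theta) else 0.0
        for j, r in enumerate(rho, start=1):
            if m - j >= 0:
                g += r * gamma[m - j]
        gamma[m] = g
    return gamma

# W_t = 0.5 W_{t-1} + eps_t + 0.4 eps_{t-1}: hypothetical ARMA(1,1)
print(mar_coeffs([0.5], [1.0, 0.4], 6))
# [1.0, 0.9, 0.45, 0.225, ...]: gamma_0 = 1 and gamma_m = 0.9 * 0.5^{m-1}
```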

2. Some complex numbers. Spectral density A compact toolkit II
4. Covariogram $G = \{ G_j : \text{integer } j \}$. Write $G(z) = \sum_{j=-\infty}^{\infty} G_j z^j$ and call this too the covariogram.
5. It might seem like we are overloading symbols like $\rho$ with different meanings: sometimes $\rho$ is a function, sometimes a sequence, sometimes just a single number ($\rho_j$). So too the name covariogram: a function, a sequence, a component of a VCV matrix?! But it is always easy to tell from context what is meant, and overloading terms like this is much preferred to an excessive proliferation of terminology and notation. (And since mathematicians and computer scientists do it as well, how awful can it be?)

2. Some complex numbers. Spectral density Fourier inversion I
The Fourier transform can be inverted to give the original sequence, term by term:
Theorem. When $\tilde{a}$ is the Fourier transform of the sequence $a = \{\dots, a_{-1}, a_0, a_1, \dots\}$, then
$$a_k = \frac{1}{2\pi} \int_{-\pi}^{+\pi} \tilde{a}(\omega) e^{+i\omega k}\, d\omega, \quad \text{for every integer } k.$$

2. Some complex numbers. Spectral density Fourier inversion II
This follows from writing out the right side
$$\frac{1}{2\pi} \int_{-\pi}^{\pi} \tilde{a}(\omega) e^{i\omega k}\, d\omega = \frac{1}{2\pi} \sum_j a_j \int_{-\pi}^{\pi} e^{-i\omega(j-k)}\, d\omega = \frac{1}{2\pi} (2\pi) a_k = a_k,$$
after noticing that for all $j \neq k$,
$$\int_{-\pi}^{\pi} e^{-i\omega(j-k)}\, d\omega = 0.$$
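The inversion formula can be sanity-checked numerically by approximating the integral on a grid; a minimal Python sketch with an arbitrary short sequence:

```python
import numpy as np

a = {-1: 0.3, 0: 1.0, 1: 0.3}              # an arbitrary short sequence a_j
omega = np.linspace(-np.pi, np.pi, 20001)
a_tilde = sum(aj * np.exp(-1j * j * omega) for j, aj in a.items())

for k in (-1, 0, 1, 2):
    # a_k = (1/2pi) integral of a_tilde(omega) e^{+i omega k} d omega
    ak = np.trapz(a_tilde * np.exp(1j * k * omega), omega).real / (2 * np.pi)
    print(k, round(ak, 4))                  # recovers 0.3, 1.0, 0.3, 0.0
```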

2. Some complex numbers. Spectral density Spectral density
When a (scalar) stochastic process $X$ has covariogram $G$, we call $G$'s Fourier transform the spectral density of $X$:
$$S(\omega) = \sum_{j=-\infty}^{\infty} G_j e^{-i\omega j}, \quad \omega \in (-\pi, \pi].$$
1. The spectral density is just the covariogram evaluated on the unit circle.
2. Since $G$ is symmetric about 0, i.e., $G_j = G_{-j}$, the spectral density is real and also symmetric about 0. Moreover, since
$$G_0 = \frac{1}{2\pi} \int_{-\pi}^{\pi} S(\omega)\, d\omega,$$
we can interpret $S$ as spreading out $X$'s variability across the range of frequencies $\omega \in (0, 2\pi]$.

2. Some complex numbers. Spectral density Examples I
1. (w.n.) Then
$$G_j = \begin{cases} G_0 & \text{for } j = 0; \\ 0 & \text{otherwise,} \end{cases} \qquad \text{so} \quad S(\omega) = \sum_j G_j e^{-i\omega j} = G_0.$$
2. (MA) Then $G_j = 0$ for $|j| > 1$, so
$$S(\omega) = G_0 + G_1 e^{-i\omega} + G_1 e^{i\omega} = G_0 + G_1 [e^{-i\omega} + e^{i\omega}] = G_0 + 2 G_1 \cos(\omega).$$

2. Some complex numbers. Spectral density Examples II
3. (AR) Then $G_j = \dfrac{\nu^2}{1 - \rho^2} \rho^{|j|}$, so
$$S(\omega) = \frac{\nu^2}{1 - \rho^2} \sum_{j=-\infty}^{\infty} \rho^{|j|} e^{-i\omega j} = \frac{\nu^2}{(1 - \rho e^{-i\omega})(1 - \rho e^{i\omega})} = \frac{\nu^2}{(1 + \rho^2) - \rho e^{-i\omega} - \rho e^{i\omega}} = \left[ (1 + \rho^2) - 2\rho \cos\omega \right]^{-1} \nu^2.$$
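These closed forms are easy to cross-check against a truncation of $\sum_j G_j e^{-i\omega j}$; a Python sketch with hypothetical parameter values:

```python
import numpy as np

rho, nu2 = 0.6, 1.0                        # hypothetical AR(1) parameters
omega = np.linspace(0.0, np.pi, 5)
S_formula = nu2 / ((1 + rho**2) - 2 * rho * np.cos(omega))

J = 200                                    # truncation point for the sum over j
j = np.arange(-J, J + 1)
G = (nu2 / (1 - rho**2)) * rho**np.abs(j)  # AR(1) covariogram G_j
S_sum = (G[:, None] * np.exp(-1j * np.outer(j, omega))).sum(axis=0).real

print(np.allclose(S_formula, S_sum))       # True
```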

2. Some complex numbers. Spectral density Spectral density: w.n.

2. Some complex numbers. Spectral density Spectral density: MA

2. Some complex numbers. Spectral density Spectral density: AR

2. Some complex numbers. Spectral density Spectral density: MA, negative

2. Some complex numbers. Spectral density Spectral density: AR, richer

2. Some complex numbers. Spectral density Spectral density: ARMA, almost w.n.

2. Some complex numbers. Spectral density Spectral density: US growth, Maddison Project

2. Some complex numbers. Spectral density Spectral density: US growth (renormalised variance), Maddison Project


3. Wold Representation. Innovations Wold Representation Theorem
Theorem. Suppose $X$ is zero mean and covariance stationary. Then there are
1. a (one-sided) sequence of numbers $C$ and
2. an $\epsilon$ white noise $(0, \nu^2)$ such that
3. at time $t$ the random variable $\epsilon_t$ is uncorrelated with all linear combinations of lagged $X$'s (i.e., of $X_{t-1}, X_{t-2}, \dots$); and
4. $X_t = \sum_{j=0}^{\infty} C_j \epsilon_{t-j}$ (up to inessential deterministic components).

3. Wold Representation. Innovations
1. At each $t$ the random variable $\epsilon_t$ is news relative to the history of $X$, i.e., it is unforecastable by the information in $X$. We call $\epsilon$ an innovations process for $X$.
2. Defining $C(z) = \sum_j C_j z^j$, the covariogram of $X$ can be found as the coefficients in
$$G(z) = C(z) C(z^{-1}) \nu^2 = \sum_{k=-\infty}^{\infty} G_k z^k.$$
3. The spectral density of $X$ is
$$S(\omega) = G(e^{-i\omega}) = C(\omega) \overline{C(\omega)}\, \nu^2 = |C(\omega)|^2 \nu^2$$
(where $\overline{\,\cdot\,}$ denotes the complex conjugate).
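Point 2 has a direct computational counterpart: the coefficients of $C(z) C(z^{-1}) \nu^2$ are $G_k = \nu^2 \sum_j C_j C_{j+|k|}$, a correlation of $C$ with itself. A minimal sketch with hypothetical Wold coefficients:

```python
import numpy as np

C = np.array([1.0, 0.4])                   # hypothetical (finite) Wold coefficients
nu2 = 1.0

G = nu2 * np.correlate(C, C, mode="full")  # coefficients of C(z) C(1/z) nu^2
print(G)                                   # [0.4, 1.16, 0.4] = G_{-1}, G_0, G_1
```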

3. Wold Representation. Innovations
4. The spectral density of $X$ at 0 is the sum of the covariogram:
$$S(0) = G(e^{-i0}) = \sum_{j=-\infty}^{\infty} G_j \cdot 1^j = \sum_{j=-\infty}^{\infty} G_j.$$
5. The variance of $X$ is $\operatorname{Var}(\epsilon) \sum_j C_j^2$ (confirmed either directly or by taking the inverse Fourier transform of the spectral density).


4. Central Limit Theory and Laws of Large Numbers Layering the intuition I
Theorem. For a sufficiently well-behaved zero-mean stochastic process $u$ there exists $b > 0$ such that
$$b^{-1} T^{-1/2} \sum_{t=1}^{T} u_t \overset{L}{\to} N(0, 1) \quad \text{(CLT)}$$
$$T^{-1} \sum_{t=1}^{T} u_t \overset{Pr}{\to} 0 \quad \text{(LLN)}$$
(Trivial) $u_t \sim$ iid $N(0, \sigma^2)$; then $\sigma^{-1} T^{-1/2} \sum_{t=1}^{T} u_t \sim N(0, 1)$.

4. Central Limit Theory and Laws of Large Numbers Layering the intuition II
(Lindeberg-Levy) $u_t$ iid with $E u_t = 0$, $\operatorname{Var} u_t = \sigma^2 \neq 0$; then
$$\sigma^{-1} T^{-1/2} \sum_{t=1}^{T} u_t \overset{L}{\to} N(0, 1).$$
Proof: Use characteristic functions. Write that of $u_t$ as
$$\phi_t(z) = \int e^{izx}\, dF_t(x),$$
where $F_t$ is the df of $u_t$. By independence, the characteristic function $\Phi_T$ of $\sigma^{-1} T^{-1/2} \sum_{t=1}^{T} u_t$

4. Central Limit Theory and Laws of Large Numbers Layering the intuition III
is a product of the individual characteristic functions. Taking Taylor expansions up to second order,
$$\Phi_T(z) = \left[ 1 - \frac{z^2}{2T} + o(z^2/T) \right]^{T} \to e^{-z^2/2} \text{ as } T \to \infty.$$
(Liapunov) $u_t$ independent, not identically distributed (inid), $E u_t = 0$, $E u_t^2 = \sigma_t^2$, finite third moment; then
$$\left( \sum_{t=1}^{T} \sigma_t^2 \right)^{-1/2} \sum_{t=1}^{T} u_t \overset{L}{\to} N(0, 1).$$

4. Central Limit Theory and Laws of Large Numbers Layering the intuition IV
(Lindeberg-Feller) Same as Liapunov but with the finite third moment condition replaced by
$$\lim_{T \to \infty} \max_{1 \leq t \leq T} \frac{\sigma_t}{C_T} = 0, \quad \text{where } C_T = \left( \sum_{t=1}^{T} \sigma_t^2 \right)^{1/2};$$
then the same CLT conclusion follows, i.e.,
$$\left( \sum_{t=1}^{T} \sigma_t^2 \right)^{-1/2} \sum_{t=1}^{T} u_t \overset{L}{\to} N(0, 1).$$

4. Central Limit Theory and Laws of Large Numbers Layering the intuition V
(Thm. 5.15 in White [1983]) Suppose $u_t$ is stationary with $E u_t^2 = \sigma^2 < \infty$. Assume
$$E(u_t \mid u_{t-m}, u_{t-m-1}, u_{t-m-2}, \dots) \overset{q.m.}{\to} 0 \text{ as } m \to \infty$$
and
$$\sum_{j=0}^{\infty} (\operatorname{Var} R_j)^{1/2} < \infty, \quad \text{where } R_j = E(u_0 \mid u_{-j}, u_{-j-1}, \dots) - E(u_0 \mid u_{-j-1}, u_{-j-2}, \dots).$$
If $\lambda^2 = \sum_s E(u_t u_{t-s}) > 0$, then
$$\lambda^{-1} T^{-1/2} \sum_{t=1}^{T} u_t \overset{L}{\to} N(0, 1).$$

4. Central Limit Theory and Laws of Large Numbers CLT Metatheorem I
Theorem. A zero-mean stochastic process $u$ that is covariance stationary and ergodic satisfies
$$S(0)^{-1/2}\, T^{-1/2} \sum_{t=1}^{T} u_t \overset{L}{\to} N(0, 1),$$
with $S$ the spectral density of $u$. Ergodic? The process sweeps out the entire sample space sufficiently often that sample averages converge to the underlying population averages (which are, therefore, constants).
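A Monte Carlo check of this metatheorem for an AR(1) (a Python sketch; the parameter values are hypothetical, and $S(0) = \nu^2/(1-\rho)^2$ follows from the AR(1) spectral density above):

```python
import numpy as np
from scipy.signal import lfilter

rng = np.random.default_rng(1)
rho, nu, T, reps = 0.5, 1.0, 2_000, 5_000  # hypothetical settings
S0 = nu**2 / (1 - rho)**2                  # AR(1) spectral density at frequency 0

eps = rng.normal(0.0, nu, (reps, T))
u = lfilter([1.0], [1.0, -rho], eps, axis=1)   # u_t = rho u_{t-1} + eps_t
stats = u.sum(axis=1) / np.sqrt(T * S0)        # S(0)^{-1/2} T^{-1/2} sum_t u_t

print(stats.mean().round(3), stats.std().round(3))  # approximately 0 and 1
```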

4. Central Limit Theory and Laws of Large Numbers CLT fails I
(Fails to be ergodic) For $u$ and $\epsilon$ w.n., the process $W_t = u_0 + \epsilon_t$ is covariance stationary but fails to be ergodic: it gets stuck around the value $u_0$.
(Fails spectral density positive at zero) Suppose $\epsilon$ is w.n., not normally distributed, and $W_t = \epsilon_t - \epsilon_{t-1}$. Obviously $W$ has mean 0 and is covariance stationary (moreover, it is ergodic). Yet because

4. Central Limit Theory and Laws of Large Numbers CLT fails II
$$\sum_{t=1}^{T} W_t = \sum_{t=1}^{T} (\epsilon_t - \epsilon_{t-1}) = \epsilon_T - \epsilon_0,$$
in actuality $W$ does not satisfy any CLT:
$$T^{-1/2} \sum_{t=1}^{T} W_t \overset{Pr}{\to} 0 \quad \left( \text{not } \overset{L}{\to} N(\cdot, \cdot) \right)$$
(no power of $T$ can scale the running sums to produce a nondegenerate normal distribution). Here we see it is because the running sums telescope to just two terms, the beginning and ending $\epsilon$. This at first appears very

4. Central Limit Theory and Laws of Large Numbers CLT fails III
specific to this first-order MA. The condition, however, generalises to the spectral density of $W$ vanishing at frequency 0:
$$S(\omega) = |1 - e^{-i\omega}|^2 \operatorname{Var}(\epsilon) = 2(1 - \cos\omega) \operatorname{Var}(\epsilon) = 0 \text{ at } \omega = 0.$$
Any process with spectral density vanishing at frequency 0 will violate the CLT.
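The telescoping failure is visible in simulation (a minimal Python sketch; the uniform draws simply supply a non-normal w.n. as assumed above):

```python
import numpy as np

rng = np.random.default_rng(2)
for T in (100, 10_000, 1_000_000):
    eps = rng.uniform(-1.0, 1.0, T + 1)    # non-normal white noise
    W = eps[1:] - eps[:-1]                 # W_t = eps_t - eps_{t-1}
    print(T, W.sum() / np.sqrt(T))         # = (eps_T - eps_0) / sqrt(T) -> 0
```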


5. Examples Examples
1. Expanded in EX1
2. Economic growth
3. Public debt
4. Exchange rates
5. Geography
6. ...


6. Conclusion Concepts to remember and use
1. Stationarity. Covariance stationarity.
2. Covariogram.
3. ARMA processes. Zeroes outside the unit circle. Moving average representation.
4. Fourier transform. Spectral density.
5. Wold representation. Innovations.
6. CLT, LLN.