Lecture 3 Stationary Processes and the Ergodic LLN (Reference Section 2.2, Hayashi)


Our immediate goal is to formulate an LLN and a CLT that can be applied to establish sufficient conditions for the consistency and asymptotic normality of the OLS estimator in time series regressions with temporally dependent, predetermined (but not necessarily strictly exogenous) regressors and serially uncorrelated disturbances. [We will deal with serially correlated disturbances later in the course.] In this lecture we state the Ergodic Theorem, an LLN that applies to stationary and ergodic stochastic processes. We begin by defining and describing stationary and ergodic processes. In the next lecture we will state the Ergodic Stationary Martingale Differences CLT, providing a definition and description of martingales and martingale difference sequences before presenting that theorem. Then, in the following lecture, we apply these theorems to formulate a set of conditions under which the OLS estimator is consistent and asymptotically normal.

Definition (Stochastic Process) A sequence of random variables is also called a stochastic process or a time series (if the index refers to a period or point in time). Note: sometimes it is most convenient to define the stochastic process {z_i} over the positive (or nonnegative) integers, i = 1, 2, … (or i = 0, 1, 2, …), and sometimes it is most convenient to define the process over the entire set of integers, i = …, −2, −1, 0, 1, 2, …

Definition (Realizations of a Stochastic Process) The outcome of a stochastic process forms a sequence of real numbers, which we also write as {z_i}. This sequence of real numbers is called a realization of the stochastic process, or a (realization of the) time series. In econometrics, the time series data we observe, e.g., quarterly U.S. real GDP from 1960–2004, are thought of as part of a realization of a stochastic process. Our goal in applied time series analysis is to draw inferences about the stochastic process based upon the realization we have observed.

The most useful class of stochastic processes is the class of stationary stochastic processes. The basic idea underlying the notion of a stationary process is that, in a probability sense made precise by the following definition, the process behaves the same way over time.

Definition (Stationarity) The stochastic process {z_i}, i = …, −1, 0, 1, …, is strictly stationary if all of its finite-dimensional distributions are time invariant. That is,

Prob(z_{i_1} < α_1, …, z_{i_k} < α_k) = Prob(z_{h+i_1} < α_1, …, z_{h+i_k} < α_k)

for all: positive integers k, integers i_1, …, i_k, integers h, and real numbers α_1, …, α_k. Note that the z's can be random variables or random vectors (provided that they have the same dimension). In the case where the z's are random vectors, we say that the process is jointly stationary. (It can be the case that each element of z is strictly stationary, but the vector process is not jointly stationary. See Example 2.3 in Hayashi.)

Fact If {z_i} is strictly stationary, the α-moment of z_i, E(z_i^α), is the same for all i, if it exists and is finite. (Why? Because the distribution of z_i is the same for all i, by the definition of stationarity.)

Fact If {z_i} is strictly stationary and f(·) is a continuous function, then {f(z_i)} is also strictly stationary. So, for example, {y_i}, where y_i = a_0 + a_1 z_{i+1} + a_2 z_i + a_3 z_{i−1}, is strictly stationary. Also, if the z's are m-dimensional and jointly stationary, then {z_i z_i′} and {z_i′ z_i} are strictly stationary. If z_i z_i′ is nonsingular, then {(z_i z_i′)^{−1}} is stationary.

A couple of (extreme) examples of stationary stochastic processes:

An i.i.d. sequence is a strictly stationary sequence. (This follows almost immediately from the definition: 1) use the independence property to factor the joint distribution into the product of the marginal distributions; 2) then use the identical-distribution property that Prob(z_i < α) = Prob(z_j < α) for all i, j, α.)

A constant sequence is a strictly stationary sequence. Suppose we flip a coin. If H, then z_i = 0 for all i; if T, then z_i = 1 for all i. Then, for example, Prob(z_i < 1/2) = Prob(z_j < 1/2) = Prob(H) for all i, j.

Note that in the first example the process has no memory; in the second example the process has infinite memory: the initial value completely determines the remainder of the sequence.
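These two extreme examples can be contrasted in a short simulation. The following sketch (Python with NumPy; the sample size, seed, and the choice of a standard normal for the i.i.d. case are illustrative assumptions, not part of the notes) draws one path of each process and compares their time averages.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

# Example 1: an i.i.d. sequence (no memory) -- here standard normal draws.
z_iid = rng.standard_normal(n)

# Example 2: the coin-toss sequence (infinite memory). One flip determines
# the entire path: z_i = 0 for all i if heads, z_i = 1 for all i if tails.
heads = rng.random() < 0.5
z_coin = np.zeros(n) if heads else np.ones(n)

# Both sequences are strictly stationary, but their time averages behave
# very differently: the i.i.d. average settles near E(z_i) = 0, while the
# coin-toss average is stuck at the flipped value (0 or 1) rather than at
# the population mean E(z_i) = 1/2.
print(z_iid.mean())   # close to 0
print(z_coin.mean())  # exactly 0.0 or exactly 1.0
```

This previews the ergodicity issue discussed below: no matter how long the coin-toss realization is, its time average never reveals the probability of heads.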

In order for stationary processes to be of use to us, we will need to restrict the class of stationary processes to those with sufficiently weak memory. (In example 2, there will be no way to infer, e.g., the probability of H, from a single realization of the process, regardless of how many observations we get to see.)

Ergodicity Stationarity is a useful concept because it means that there is something fixed across the sequence of random variables for us to learn about from observing outcomes of the process: the fixed finite-dimensional distributions and their moments (provided these exist). However, in order for us to learn about the characteristics of the stationary process as the realization unfolds, there must be new information contained in the new observations. An additional condition that relates to this requirement is the condition of ergodicity. Ergodicity is a condition that restricts the memory of the process. It can be defined in a variety of ways. A loose definition of ergodicity is that the process is asymptotically independent: for sufficiently large n, z_i and z_{i+n} are nearly independent. A more formal definition is provided in the text. All of these definitions essentially say that the effect of the present on the future eventually disappears.

An i.i.d. sequence is ergodic (though ergodic sequences need not be i.i.d.). The stochastic process defined above by the coin-toss example is not ergodic. Bottom line: stationary and ergodic processes allow for processes that are temporally dependent but with sufficiently weak memory for learning to take place as new observations are revealed.

The Ergodic Theorem Let {z_i} be stationary and ergodic with E(z_i) = µ (i.e., the mean of the process exists and is finite). Then

(1/n) Σ_{i=1}^{n} z_i → µ a.s.

That is, if {z_i} is stationary and ergodic with a finite mean, then the sample mean is a (strongly) consistent estimator of that mean.
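As an illustration (not part of the notes), a minimal Python/NumPy simulation of a stationary, ergodic AR(1) process shows the sample mean settling at the population mean. The coefficients, sample size, and seed are arbitrary choices for the sketch.

```python
import numpy as np

rng = np.random.default_rng(1)

# A stationary, ergodic AR(1): z_i = c + phi*z_{i-1} + eps_i with |phi| < 1
# and i.i.d. N(0,1) innovations. Its mean is mu = c/(1 - phi) = 2/0.4 = 5.
c, phi, n = 2.0, 0.6, 200_000
mu = c / (1 - phi)

z = np.empty(n)
z[0] = mu  # start at the mean so the path is (approximately) stationary
for i in range(1, n):
    z[i] = c + phi * z[i - 1] + rng.standard_normal()

# The Ergodic Theorem: the sample mean converges (a.s.) to mu as n grows,
# even though the observations are serially dependent.
print(z.mean())  # close to 5
```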

An important corollary to the Ergodic Theorem Let {z_i} be stationary and ergodic and let f(·) be a continuous function. Assume that E(f(z_i)) = η. Then

(1/n) Σ_{i=1}^{n} f(z_i) → η a.s.

(The corollary follows from the Ergodic Theorem because {f(z_i)} will be stationary and ergodic if {z_i} is stationary and ergodic.)
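The corollary can be checked in the same simulation style, taking f(z) = z². The AR(1) parameters below are illustrative assumptions; for this process, z_i has mean 0 and variance 1/(1 − φ²) = 4/3, so the time average of z_i² should approach E(z_i²) = 4/3.

```python
import numpy as np

rng = np.random.default_rng(5)

# Stationary, ergodic AR(1) with phi = 0.5 and N(0,1) innovations:
# E(z_i) = 0 and Var(z_i) = 1/(1 - phi**2) = 4/3.
phi, n = 0.5, 200_000
z = np.empty(n)
z[0] = 0.0
for i in range(1, n):
    z[i] = phi * z[i - 1] + rng.standard_normal()

# Corollary: with f(z) = z**2 (continuous), the time average of f(z_i)
# converges (a.s.) to E(z_i**2) = 4/3.
print((z ** 2).mean())  # close to 4/3
```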

Digression on Covariance Stationary Processes A second commonly encountered class of stationary processes is the class of covariance stationary processes (also called weakly stationary or wide-sense stationary processes).

Definition (Covariance Stationarity) The stochastic process {z_i}, i = 1, 2, …, is covariance stationary if
i. E(z_i) = µ for i = 1, 2, …
ii. Var(z_i) = σ² < ∞ for i = 1, 2, …
iii. Cov(z_i, z_{i−j}) = γ_j for all i, j

That is, a stochastic process is covariance stationary if it has a constant mean, a constant and finite variance, and the covariance between two elements of the sequence depends only on how far apart they are.

Note that a strictly stationary process will be covariance stationary if it has a finite variance. A covariance stationary process does not require that the z_i's have identical distributions; thus strictly stationary processes are dependent but identically distributed (d.i.d.), while covariance stationary processes can be dependent and not identically distributed (d.ni.d.).

Fact If {z_i} is stationary and ergodic and Var(z_i) = σ² < ∞, then the Ergodic Theorem can be applied to show that

γ̂_{j,n} = (1/n) Σ_{i=j+1}^{n} (z_i − µ̂)(z_{i−j} − µ̂) → γ_j a.s.

and

ρ̂_{j,n} = γ̂_{j,n} / γ̂_{0,n} → ρ_j a.s.

That is, the sample autocovariances and sample autocorrelations are consistent estimators of the population autocovariances and autocorrelations.
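A quick simulation illustrates the consistency of the sample autocovariances and autocorrelations. The sketch below (an illustration, not part of the notes; the AR(1) coefficient and sample size are arbitrary) uses the fact that an AR(1) with coefficient φ has population autocorrelations ρ_j = φ^j.

```python
import numpy as np

rng = np.random.default_rng(2)

# Stationary, ergodic AR(1) with phi = 0.5: population autocorrelations
# are rho_j = phi**j, i.e. 1, 0.5, 0.25, 0.125, ...
phi, n = 0.5, 200_000
z = np.empty(n)
z[0] = 0.0
for i in range(1, n):
    z[i] = phi * z[i - 1] + rng.standard_normal()

mu_hat = z.mean()

def gamma_hat(j):
    # Sample autocovariance gamma_hat_{j,n}: average of
    # (z_i - mu_hat)(z_{i-j} - mu_hat) over i = j+1, ..., n, divided by n.
    return ((z[j:] - mu_hat) * (z[:n - j] - mu_hat)).sum() / n

# Sample autocorrelations rho_hat_{j,n} = gamma_hat_{j,n} / gamma_hat_{0,n}.
rho_hat = [gamma_hat(j) / gamma_hat(0) for j in range(4)]
print(rho_hat)  # approximately [1, 0.5, 0.25, 0.125]
```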

Definition (White Noise) The stochastic process {z_i} is a white noise process if
i. E(z_i) = 0 for i = 1, 2, …
ii. Var(z_i) = σ² < ∞ for i = 1, 2, …
iii. Cov(z_i, z_{i−j}) = 0 for all j ≠ 0

That is, a white noise (w.n.) process is a zero-mean, constant-variance, serially uncorrelated process. A w.n. process is covariance stationary (but not necessarily strictly stationary, since the z_i's are not necessarily identically distributed). An i.i.d. sequence is a white noise sequence if it has a zero mean and finite variance. White noise processes are the fundamental building blocks of covariance stationary processes and play a very important role in time series analysis.
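To see that white noise need not be strictly stationary, one can construct an independent sequence whose marginal distributions differ across i while the first two moments stay fixed. The construction below (an illustrative sketch; the particular mix of normal and uniform draws is my own choice) alternates between N(0,1) draws and uniform draws rescaled to have mean 0 and variance 1.

```python
import numpy as np

rng = np.random.default_rng(4)

# Independent but NOT identically distributed: N(0,1) at even i, and
# Uniform(-sqrt(3), sqrt(3)) at odd i (which also has mean 0, variance 1,
# since Var = (2*sqrt(3))**2 / 12 = 1). The process satisfies all three
# white noise conditions -- so it is covariance stationary -- but it is
# not strictly stationary, because the marginals differ across i.
n = 100_000
z = np.empty(n)
z[0::2] = rng.standard_normal((n + 1) // 2)
z[1::2] = rng.uniform(-np.sqrt(3), np.sqrt(3), n // 2)

print(z.mean())                           # close to 0
print(z.var())                            # close to 1
print(np.corrcoef(z[:-1], z[1:])[0, 1])   # close to 0 (no serial correlation)
```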

The difference between strict stationarity and covariance stationarity is, for the most part, only of interest to the theoretician. That is, if we are willing to treat a particular time series as a covariance stationary process, there is usually little reason to think that it is not also strictly stationary, and vice versa. So why are both definitions useful? In theoretical settings, when we are trying to establish consistency and asymptotic normality, it is often easier to work under the assumption of strict stationarity. However, in applications, when we look at a time series and consider whether it looks like a realization from a stationary process, we usually think in terms of the conditions for covariance stationarity. In applications, we observe part of a single realization of a stochastic process, say the real numbers z_1, …, z_n, and then we have to decide whether it is reasonable to assume that this is a realization of a stationary stochastic process (or not).

Later in this course, if we have time, we will talk about testing this assumption against a particular type of nonstationarity. But often our willingness to make this assumption is based on observing the time series graph of the series and asking the following questions:
1. Does it look like a realization of a process with a constant mean? Or does it look like the realization of a process with an increasing mean? (I.e., does the series display a time trend?)
2. Does it look like a realization of a process with a constant variance? Or does it look like the volatility of the process is varying systematically with time?
Consider, for example, the U.S. unemployment rate and U.S. real GDP. Many economic time series, like real GDP, seem to be nonstationary because their means are increasing with time. This would seem to greatly limit the appeal and usefulness of stationarity. Although these series appear to be nonstationary, there might be simple transformations that can be applied to create stationary series: first differencing, removing a linear trend, and so on.
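Both transformations mentioned above can be sketched in a few lines. The example below (an illustration only; the trend slope, sample size, and seed are arbitrary assumptions) builds a trending series, then applies first differencing and linear detrending, each of which removes the growing mean.

```python
import numpy as np

rng = np.random.default_rng(3)

# A series with a linear time trend is nonstationary: its mean grows with t.
n = 1_000
t = np.arange(n)
y = 0.5 * t + rng.standard_normal(n)  # trend slope 0.5 plus i.i.d. noise

# Transformation 1: first differencing. The differenced series
# y_t - y_{t-1} has a constant mean equal to the trend slope (0.5).
dy = np.diff(y)

# Transformation 2: removing a linear trend fit by least squares.
# np.polyfit(deg=1) returns (slope, intercept); the residuals have
# sample mean zero and no remaining trend.
slope, intercept = np.polyfit(t, y, 1)
detrended = y - (intercept + slope * t)

print(dy.mean())         # close to 0.5
print(detrended.mean())  # essentially 0
```

Which transformation is appropriate depends on the kind of nonstationarity, a question taken up later under the heading of unit roots.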