Econ 329 - Autocorrelation
Sanjaya DeSilva
October 3, 2008

1 Definition

Autocorrelation (or serial correlation) occurs when the error term of one observation is correlated with the error term of any other observation. This violates the classical assumption that E(u_i u_j) = 0 for all i ≠ j.

1.1 Causes of Autocorrelation

1. Spatial dependence of observations (cross-sectional data).
2. Random effects: unobserved variables that are independent between but correlated within cross-sectional units (e.g. panel data, multiple individuals in the same household).
3. Functional form: incorrect functional forms and smoothing.
4. Omitted variables: omitted variables that are serially correlated (e.g. inertia): Y_t = β_0 + β_1 X_t + ν_t, where ν_t = β_2 Z_t + u_t and E(Z_t Z_{t-1}) ≠ 0 (e.g. preference shift, oil shock, war, interest rate).
5. Lagged dependent variable: the error includes the lagged dependent variable: Y_t = β_0 + β_1 X_t + ν_t, where ν_t = β_2 Y_{t-1} + u_t (e.g. consumption, investment, interest rate).
6. First difference equations: ΔY_t = β_0 + β_1 ΔX_t + Δu_t, where the differenced error Δu_t = u_t − u_{t-1} is serially correlated by construction.
7. Nonstationarity: the mean and/or variance of the variables change over time.

2 First Order Autoregressive Scheme: The AR(1) Model

Y_t = β_0 + β_1 X_t + u_t   (1)

u_t = ρ u_{t-1} + ε_t   (2)

ε_t ~ (0, σ_ε²)   (3)

cov(ε_i, ε_j) = 0 for all i ≠ j   (4)

where ρ is the first-order coefficient of autocorrelation. With this information, we can construct the properties of u_t:

E(u_t) = 0   (5)

Var(u_t) = ρ² Var(u_{t-1}) + σ_ε² = ρ² Var(u_t) + σ_ε², so Var(u_t) = σ_ε² / (1 − ρ²)   (6)

Cov(u_t, u_{t-1}) = E(u_t u_{t-1}) = E((ρ u_{t-1} + ε_t) u_{t-1}) = ρ E(u_{t-1}²) = ρ σ_ε² / (1 − ρ²)   (7)

Cor(u_t, u_{t-1}) = ρ   (8)

Cov(u_t, u_{t-s}) = E(u_t u_{t-s}) = ρ^s E(u_{t-s}²) = ρ^s σ_ε² / (1 − ρ²)   (9)

Cor(u_t, u_{t-s}) = ρ^s   (10)
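These properties are easy to check by simulation. Below is a minimal sketch, assuming NumPy is available; the values ρ = 0.7, σ_ε = 1 and the sample size are arbitrary illustrative choices. It generates the AR(1) error process and compares the sample variance and autocorrelations with equations (6), (8) and (10).

```python
import numpy as np

rng = np.random.default_rng(0)
rho, sigma_eps, T = 0.7, 1.0, 100_000   # illustrative values only

# Simulate u_t = rho * u_{t-1} + eps_t with eps_t ~ N(0, sigma_eps^2)
eps = rng.normal(0.0, sigma_eps, T)
u = np.zeros(T)
for t in range(1, T):
    u[t] = rho * u[t - 1] + eps[t]

# Var(u_t) should be close to sigma_eps^2 / (1 - rho^2), as in equation (6)
print("sample Var(u):", u.var(), " theory:", sigma_eps**2 / (1 - rho**2))

# Cor(u_t, u_{t-s}) should be close to rho**s, as in equations (8) and (10)
for s in (1, 2, 3):
    r = np.corrcoef(u[s:], u[:-s])[0, 1]
    print(f"lag {s}: sample {r:.3f}  theory {rho**s:.3f}")
```

With this many draws the sample moments should land close to σ_ε²/(1 − ρ²) and ρ^s.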

OLS Estimation of an AR(1) Model

Since E(u_t) = 0, the OLS estimator of β_1 is unbiased. However, the variance of b_1 is

Var(b_1) = E((Σ k_i u_i)²) = σ²/Σ x_i² + 2 E(k_1 k_2 u_1 u_2 + ... + k_{n-1} k_n u_{n-1} u_n) = σ²/Σ x_i² + 2 [k_1 k_2 E(u_1 u_2) + ... + k_{n-1} k_n E(u_{n-1} u_n)]   (11)

where k_i = x_i / Σ x_j² are the usual OLS weights and x_i denotes X_i in deviation form. If X_t = r X_{t-1}, this expression reduces to

Var(b_1) = (σ² / Σ x_i²) (1 + rρ) / (1 − rρ)   (12)

If both X and u are positively autocorrelated, the true variance of b_1 is greater than what the OLS formula reports. Therefore, we may mistakenly reject null hypotheses and construct artificially narrow confidence intervals.

3 Properties of the OLS Estimator of the AR(1) Model

1. The coefficient estimate is unbiased: E(b_1) = β_1.
2. The OLS variance formula is incorrect. It is hard to tell whether it is an overestimate or an underestimate without additional assumptions; generally, the OLS formula will underestimate the variance.
3. The OLS estimator is no longer efficient.
4. The estimate σ̂² is biased: E(σ̂²) ≠ σ². It can be shown that E(σ̂²) < σ² if both X and u are positively serially correlated. This contributes to an inflated R² and to artificially low standard errors.
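Properties 1, 2 and 4 can be illustrated with a small Monte Carlo experiment. The sketch below is illustrative only, assuming NumPy and statsmodels are available; the true coefficients, ρ = 0.8 for the errors, r = 0.8 for the regressor, the sample size and the number of replications are all arbitrary choices. It holds a positively autocorrelated X fixed, redraws AR(1) errors, and compares the spread of the OLS slope estimates across replications with the standard error that the OLS formula reports.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
rho, r, T, reps = 0.8, 0.8, 100, 2000   # illustrative values only

def ar1(phi, n):
    """Simulate an AR(1) series with standard normal innovations."""
    series = np.zeros(n)
    shocks = rng.standard_normal(n)
    for t in range(1, n):
        series[t] = phi * series[t - 1] + shocks[t]
    return series

x = ar1(r, T)                       # positively autocorrelated regressor, held fixed
slopes, reported_se = [], []
for _ in range(reps):
    u = ar1(rho, T)                 # AR(1) errors
    y = 1.0 + 2.0 * x + u           # true beta_1 = 2
    res = sm.OLS(y, sm.add_constant(x)).fit()
    slopes.append(res.params[1])
    reported_se.append(res.bse[1])

print("true sd of b_1 across replications:", np.std(slopes))
print("average OLS-reported se of b_1    :", np.mean(reported_se))
```

With both r and ρ positive, the reported standard error typically comes out well below the true spread of b_1, consistent with equation (12) and with properties 2 and 4.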

4 Detection of Autocorrelation

Note that, in order to test for serial correlation, it is important to order the data as a series (e.g. by year for time series data). A short code sketch illustrating the runs, Durbin-Watson and Breusch-Godfrey tests follows this list.

1. Graphical Method I: Plot the residuals (or standardized residuals) against time. (Note: residuals are not the same as errors.)

2. Graphical Method II: To test for AR(1), plot e_t against e_{t-1}.

3. The Runs Test: Under the assumption of no serial correlation (i.e. independent observations), find the mean and variance of the asymptotically normal distribution of the number of runs. These depend only on the numbers of observations, positive residuals and negative residuals. Construct a 95 per cent confidence interval for the number of runs and check whether the observed number of runs falls inside it. If the number of runs is smaller, there is positive serial correlation.

4. Durbin-Watson Test:

d = Σ_{t=2}^{n} (e_t − e_{t-1})² / Σ_{t=1}^{n} e_t²   (13)

Assuming that u_t is normally distributed and follows an AR(1) process, the regressors are nonstochastic, there are no lagged dependent variables, and the model includes an intercept, Durbin and Watson derived lower and upper bounds of critical values that help us test whether there is serial correlation. To see how this statistic reflects serial correlation, note that

d = Σ (e_t − e_{t-1})² / Σ e_t² = (Σ e_t² + Σ e_{t-1}² − 2 Σ e_t e_{t-1}) / Σ e_t²   (14)

d = Σ e_t²/Σ e_t² + Σ e_{t-1}²/Σ e_t² − 2 Σ e_t e_{t-1}/Σ e_t²   (15)

Since Σ e_{t-1}² ≈ Σ e_t²,

d ≈ 2 (1 − Σ e_t e_{t-1}/Σ e_t²) = 2 (1 − ρ̂)   (16)

Note that 0 ≤ d ≤ 4, with 0 ≤ d < 2 if there is positive autocorrelation, 2 < d ≤ 4 if there is negative autocorrelation, and d = 2 if there is no autocorrelation. To carry out the test, we need to find the lower (d_L) and upper (d_U) critical values of the D-W statistic for the relevant sample size and number of variables, and then compute the D-W statistic d for the regression using the residuals. Now,

(a) 0 < d < d_L: positive serial correlation
(b) d_L ≤ d ≤ d_U: inconclusive
(c) d_U < d < 4 − d_U: no serial correlation
(d) 4 − d_U ≤ d ≤ 4 − d_L: inconclusive
(e) 4 − d_L < d < 4: negative serial correlation

5. Breusch-Godfrey Test: This test is not limited to AR(1) and works even with lagged dependent variables and stochastic regressors.

(a) Run the regression and save the residuals.
(b) Pick the number of lags p for the AR(p) process.
(c) Run the auxiliary AR(p) regression, i.e. regress e_t on e_{t-1}, ..., e_{t-p} and the original regressors.
(d) Test the hypothesis that all the slope coefficients equal zero using (n − p) R² ~ χ²(p).
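As a rough illustration of how the runs, Durbin-Watson and Breusch-Godfrey tests look in practice, here is a minimal sketch assuming NumPy and statsmodels are available (durbin_watson and acorr_breusch_godfrey are the statsmodels helpers used); the simulated data, ρ = 0.6 and the lag choice p = 2 are arbitrary. It fits OLS, counts runs in the residual signs, and reports the D-W and Breusch-Godfrey statistics.

```python
import numpy as np
import statsmodels.api as sm
from statsmodels.stats.stattools import durbin_watson
from statsmodels.stats.diagnostic import acorr_breusch_godfrey

rng = np.random.default_rng(42)
T = 200

# Simulated data with AR(1) errors (rho = 0.6); all values are illustrative
x = rng.standard_normal(T)
eps = rng.standard_normal(T)
u = np.zeros(T)
for t in range(1, T):
    u[t] = 0.6 * u[t - 1] + eps[t]
y = 1.0 + 2.0 * x + u

res = sm.OLS(y, sm.add_constant(x)).fit()
e = res.resid

# Runs test: compare the observed number of runs of residual signs with its
# mean and a 95% normal-approximation interval under independence
signs = np.sign(e)
n1, n2 = int(np.sum(signs > 0)), int(np.sum(signs < 0))
n = n1 + n2
runs = 1 + int(np.sum(signs[1:] != signs[:-1]))
mean_runs = 2 * n1 * n2 / n + 1
var_runs = 2 * n1 * n2 * (2 * n1 * n2 - n) / (n**2 * (n - 1))
lo, hi = mean_runs - 1.96 * np.sqrt(var_runs), mean_runs + 1.96 * np.sqrt(var_runs)
print(f"runs = {runs}, 95% interval under independence = ({lo:.1f}, {hi:.1f})")

# Durbin-Watson: d near 2 means no serial correlation; d well below 2 suggests rho > 0
print("Durbin-Watson d =", durbin_watson(e))

# Breusch-Godfrey with p = 2 lags: the LM statistic is compared with chi2(p)
lm_stat, lm_pval, f_stat, f_pval = acorr_breusch_godfrey(res, nlags=2)
print(f"Breusch-Godfrey LM = {lm_stat:.2f}, p-value = {lm_pval:.4f}")
```

Too few runs and a D-W statistic well below 2 both point to positive serial correlation, and the Breusch-Godfrey p-value should be small for this data-generating process.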