Generalized Least Squares

Up until now, we have assumed that Euu′ = σ²I. Now we generalize to let Euu′ = Ω, where Ω is restricted to be a positive definite, symmetric matrix.

1 Common Examples

1.1 Autoregressive Models

AR(1): u_t = ρu_{t−1} + e_t, |ρ| < 1, e_t ~ iid(0, σ²_e).

Var u_t = Eu_t² = E(ρu_{t−1} + e_t)²
        = E[ρ²u²_{t−1} + 2ρu_{t−1}e_t + e_t²]
        = ρ²Eu²_{t−1} + 2ρEu_{t−1}e_t + Ee_t²
        = ρ²Eu²_{t−1} + Ee_t²
        = ρ²Eu_t² + σ²_e   (stationarity)

⟹ Var u_t = σ²_e / (1 − ρ²).

Cov(u_t, u_{t−1}) = Eu_t u_{t−1} = E(ρu_{t−1} + e_t)u_{t−1}
                  = ρEu²_{t−1} + Ee_t u_{t−1} = ρEu²_{t−1}
                  = ρσ²_e / (1 − ρ²).

Cov(u_t, u_{t−2}) = Eu_t u_{t−2} = E(ρu_{t−1} + e_t)u_{t−2}
                  = ρEu_{t−1}u_{t−2} + Ee_t u_{t−2} = ρEu_{t−1}u_{t−2}
                  = ρ²σ²_e / (1 − ρ²).
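As a numerical check on the AR(1) algebra above, a short simulation (a numpy sketch; the sample size and parameter values are arbitrary illustrations, not from the notes) compares the sample variance and first autocovariance of a simulated AR(1) series with σ²_e/(1 − ρ²) and ρσ²_e/(1 − ρ²):

```python
import numpy as np

rng = np.random.default_rng(0)
rho, sigma2_e, T = 0.5, 1.0, 200_000  # illustrative values

# simulate u_t = rho*u_{t-1} + e_t, discarding a burn-in so the series is stationary
e = rng.normal(0.0, np.sqrt(sigma2_e), T + 1000)
u = np.empty(T + 1000)
u[0] = e[0]
for t in range(1, T + 1000):
    u[t] = rho * u[t - 1] + e[t]
u = u[1000:]

var_theory = sigma2_e / (1 - rho**2)   # Var u_t
cov1_theory = rho * var_theory         # Cov(u_t, u_{t-1})

var_sample = u.var()
cov1_sample = np.mean((u[1:] - u.mean()) * (u[:-1] - u.mean()))
```

With T this large, the sample moments land close to the stationary values derived above.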

⟹ For an AR(1), Cov(u_t, u_{t−n}) = ρⁿσ²_e / (1 − ρ²), so

Ω = (σ²_e / (1 − ρ²)) ·
    [ 1    ρ    ρ²   ⋯ ]
    [ ρ    1    ρ    ⋯ ]
    [ ρ²   ρ    1    ⋯ ]
    [ ⋮    ⋮    ⋮    ⋱ ]

AR(2): u_t = ρ₁u_{t−1} + ρ₂u_{t−2} + e_t, e_t ~ iid(0, σ²_e).

Let σ_n = Cov(u_t, u_{t−n}). Then

Var u_t = σ₀ = Eu_t² = E[ρ₁u_{t−1} + ρ₂u_{t−2} + e_t]²
        = ρ₁²Eu²_{t−1} + ρ₂²Eu²_{t−2} + Ee_t² + 2ρ₁ρ₂Eu_{t−1}u_{t−2} + 2ρ₁Eu_{t−1}e_t + 2ρ₂Eu_{t−2}e_t
        = ρ₁²σ₀ + ρ₂²σ₀ + σ²_e + 2ρ₁ρ₂σ₁,                                             (1)

σ₁ = Eu_t u_{t−1} = E[ρ₁u_{t−1} + ρ₂u_{t−2} + e_t]u_{t−1}
   = ρ₁Eu²_{t−1} + ρ₂Eu_{t−2}u_{t−1} + Ee_t u_{t−1}
   = ρ₁σ₀ + ρ₂σ₁.                                                                     (2)

We can write equations (1) and (2) in matrix form as

[ 1 − ρ₁² − ρ₂²   −2ρ₁ρ₂ ] [ σ₀ ]   [ σ²_e ]
[     −ρ₁          1 − ρ₂ ] [ σ₁ ] = [  0   ]                                          (3)

and solve for

[ σ₀ ]   [ 1 − ρ₁² − ρ₂²   −2ρ₁ρ₂ ]⁻¹ [ σ²_e ]
[ σ₁ ] = [     −ρ₁          1 − ρ₂ ]   [  0   ].

The condition for stationarity is that the roots of 1 − ρ₁z − ρ₂z² = 0 are greater than one in absolute value. Equations of the form (3) are called the Yule-Walker equations. Students should work out the Yule-Walker equations for an AR(3).
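To illustrate, the 2×2 system (3) can be solved numerically and cross-checked against an independent computation of the AR(2) autocovariances via the companion-form (vectorized Lyapunov) equation. A numpy sketch; the parameter values are arbitrary stationary choices:

```python
import numpy as np

rho1, rho2, sigma2_e = 0.5, 0.3, 1.0  # illustrative stationary AR(2) parameters

# Solve the Yule-Walker system (3) for (sigma_0, sigma_1)
M = np.array([[1 - rho1**2 - rho2**2, -2 * rho1 * rho2],
              [-rho1,                  1 - rho2]])
sigma0, sigma1 = np.linalg.solve(M, np.array([sigma2_e, 0.0]))

# Independent check: write z_t = (u_t, u_{t-1})' = A z_{t-1} + (e_t, 0)'.
# The stationary covariance V solves V = A V A' + B, i.e.
# vec(V) = (I - A kron A)^{-1} vec(B); then V[0,0] = sigma_0, V[0,1] = sigma_1.
A = np.array([[rho1, rho2], [1.0, 0.0]])
B = np.array([[sigma2_e, 0.0], [0.0, 0.0]])
vecV = np.linalg.solve(np.eye(4) - np.kron(A, A), B.reshape(-1))
V = vecV.reshape(2, 2)
```

Both routes give the same σ₀ and σ₁, confirming the algebra behind (3).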

1.2 Moving Average

MA(1): u_t = ρ₀e_t + ρ₁e_{t−1}, e_t ~ iid(0, σ²_e).

Eu_t² = E[ρ₀e_t + ρ₁e_{t−1}]² = ρ₀²Ee_t² + ρ₁²Ee²_{t−1} + 2ρ₀ρ₁Ee_t e_{t−1} = (ρ₀² + ρ₁²)σ²_e;

Eu_t u_{t−1} = E[ρ₀e_t + ρ₁e_{t−1}][ρ₀e_{t−1} + ρ₁e_{t−2}] = ρ₀ρ₁σ²_e;

Eu_t u_{t−n} = E[ρ₀e_t + ρ₁e_{t−1}][ρ₀e_{t−n} + ρ₁e_{t−n−1}] = 0 if n > 1.

Thus

Ω = σ²_e ·
    [ ρ₀² + ρ₁²   ρ₀ρ₁         0           ⋯ ]
    [ ρ₀ρ₁        ρ₀² + ρ₁²    ρ₀ρ₁        ⋯ ]
    [ 0           ρ₀ρ₁         ρ₀² + ρ₁²   ⋯ ]
    [ ⋮           ⋮            ⋮           ⋱ ]

MA(n): u_t = Σ_{i=0}^n ρᵢe_{t−i}, e_t ~ iid(0, σ²_e).

Eu_t² = E[Σ_{i=0}^n ρᵢe_{t−i}]² = σ²_e Σ_{i=0}^n ρᵢ²;

Eu_t u_{t−1} = E[Σ_{i=0}^n ρᵢe_{t−i}][Σ_{i=0}^n ρᵢe_{t−1−i}] = σ²_e Σ_{i=1}^n ρᵢρ_{i−1};

Eu_t u_{t−k} = σ²_e Σ_{i=k}^n ρᵢρ_{i−k} if k ≤ n, and 0 if k > n.
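The MA(n) autocovariance formula above can be checked numerically: the sum Σ_{i=k}^n ρᵢρ_{i−k} is just the lag-k inner product of the coefficient vector with itself, which np.correlate computes in one call. A numpy sketch with arbitrary illustrative coefficients:

```python
import numpy as np

rho = np.array([1.0, 0.6, -0.3, 0.2])  # illustrative MA(3) coefficients rho_0..rho_3
sigma2_e = 2.0
n = len(rho) - 1

def ma_autocov(k):
    """gamma_k = sigma2_e * sum_{i=k}^{n} rho_i * rho_{i-k}; 0 for k > n."""
    if k > n:
        return 0.0
    return sigma2_e * np.dot(rho[k:], rho[:n + 1 - k])

# Cross-check: np.correlate(rho, rho, 'full') holds all lagged inner products;
# the entries from the middle onward are lags 0, 1, ..., n.
acov = sigma2_e * np.correlate(rho, rho, mode='full')[n:]
```

The autocovariances cut off exactly at lag n, which is what makes Ω banded for an MA process.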

1.3 ARMA

u_t = Σ_{i=1}^m φᵢu_{t−i} + Σ_{i=0}^n ρᵢe_{t−i},  e_t ~ iid(0, σ²_e)

is an ARMA(m, n) process. Exercise: work out the Yule-Walker equations.

1.4 Heteroskedasticity

Ω =
    [ σ₁²   0     ⋯   0    ]
    [ 0     σ₂²   ⋯   0    ]
    [ ⋮     ⋮     ⋱   ⋮    ]
    [ 0     0     ⋯   σ_T² ]

1.5 Random Coefficients

Consider the model y_t = x_t′β_t + u_t with β_t ~ iid(β, Γ). Then we can write the model as

y_t = x_t′β + x_t′(β_t − β) + u_t = x_t′β + ε_t, where ε_t = x_t′(β_t − β) + u_t.

Since the β_t are independent across t (and independent of u), the ε_t are uncorrelated across t, with

Eεε′ = D + σ²_u I, where D = diag(x₁′Γx₁, …, x_T′Γx_T).

1.6 Random Effects

Consider the model

y_it = x_it′β + u_i + e_it,  i = 1, …, n;  t = 1, …, T;
u_i ~ iid(0, σ²_u),  e_it ~ iid(0, σ²_e).

(Explain panel data; give examples.) Define

y_i = (y_i1, …, y_iT)′,  X_i = (x_i1, …, x_iT)′,  e_i = (e_i1, …, e_iT)′,  ι = (1, …, 1)′.

Then the model can be written as y_i = X_iβ + ιu_i + e_i. Now define

y = (y₁′, …, y_n′)′,  X = (X₁′, …, X_n′)′,  e = (e₁′, …, e_n′)′,  u = (u₁ι′, …, u_nι′)′.

Then the model can be written as y = Xβ + v with v = u + e, and

Ω_v = Evv′ = E(u + e)(u + e)′ =
    [ A   0   ⋯   0 ]
    [ 0   A   ⋯   0 ]
    [ ⋮   ⋮   ⋱   ⋮ ]
    [ 0   0   ⋯   A ]

where

A = [ σ²_u + σ²_e   σ²_u          ⋯   σ²_u        ]
    [ σ²_u          σ²_u + σ²_e   ⋯   σ²_u        ]
    [ ⋮             ⋮             ⋱   ⋮           ]
    [ σ²_u          σ²_u          ⋯   σ²_u + σ²_e ]
  = σ²_e I_T + σ²_u ιι′.

2 GLS and OLS

Consider the model y = Xβ + u, u ~ (0, Ω).

β̂_OLS = (X′X)⁻¹X′y = β + (X′X)⁻¹X′u.

E β̂_OLS = β + E(X′X)⁻¹X′u = β + (X′X)⁻¹X′Eu = β.

D(β̂_OLS) = E(X′X)⁻¹X′uu′X(X′X)⁻¹ = (X′X)⁻¹X′Euu′X(X′X)⁻¹
          = (X′X)⁻¹(X′ΩX)(X′X)⁻¹ ≠ σ²_u(X′X)⁻¹

(especially because there is no such object as σ²_u). Therefore, β̂_OLS is unbiased but the standard errors are wrong. We could correct the standard errors. Alternatively, let R′R = Ω⁻¹, and consider

Ry = RXβ + Ru.

Note that ERu = REu = 0 and ERuu′R′ = REuu′R′ = RΩR′ = I. Define

β̂_GLS = [(RX)′(RX)]⁻¹(RX)′(Ry) = [X′R′RX]⁻¹[X′R′Ry] = [X′Ω⁻¹X]⁻¹X′Ω⁻¹y.

Note that

β̂_GLS = [X′Ω⁻¹X]⁻¹X′Ω⁻¹(Xβ + u) = β + [X′Ω⁻¹X]⁻¹X′Ω⁻¹u

⟹ E β̂_GLS = β + E[X′Ω⁻¹X]⁻¹X′Ω⁻¹u = β + [X′Ω⁻¹X]⁻¹X′Ω⁻¹Eu = β;
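The equivalence between "transform by R, then run OLS" and the direct formula [X′Ω⁻¹X]⁻¹X′Ω⁻¹y can be verified numerically: a Cholesky factor of Ω⁻¹ serves as R. A numpy sketch on simulated data (dimensions, Ω, and parameter values are arbitrary illustrations):

```python
import numpy as np

rng = np.random.default_rng(1)
T, k = 200, 3

X = rng.normal(size=(T, k))
beta = np.array([1.0, -2.0, 0.5])

# An arbitrary positive definite Omega (AR(1)-style: Omega_ts = 0.5**|t-s|)
Omega = 0.5 ** np.abs(np.subtract.outer(np.arange(T), np.arange(T)))
u = rng.multivariate_normal(np.zeros(T), Omega)
y = X @ beta + u

# Direct GLS: [X' Omega^{-1} X]^{-1} X' Omega^{-1} y
Oi = np.linalg.inv(Omega)
beta_gls = np.linalg.solve(X.T @ Oi @ X, X.T @ Oi @ y)

# Transformation approach: R'R = Omega^{-1}, then OLS on (Ry, RX)
R = np.linalg.cholesky(Oi).T          # upper-triangular R with R'R = Omega^{-1}
beta_trans, *_ = np.linalg.lstsq(R @ X, R @ y, rcond=None)
```

Any R with R′R = Ω⁻¹ works; the Cholesky factor is simply a convenient choice.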

D(β̂_GLS) = E[X′Ω⁻¹X]⁻¹X′Ω⁻¹uu′Ω⁻¹X[X′Ω⁻¹X]⁻¹
          = [X′Ω⁻¹X]⁻¹X′Ω⁻¹ΩΩ⁻¹X[X′Ω⁻¹X]⁻¹
          = [X′Ω⁻¹X]⁻¹.

Note that, if Ω = σ²I, then

β̂_GLS = [X′(σ²I)⁻¹X]⁻¹X′(σ²I)⁻¹y = [X′X]⁻¹[X′y] = β̂_OLS

and

D(β̂_GLS) = [X′(σ²I)⁻¹X]⁻¹ = σ²[X′X]⁻¹ = D(β̂_OLS).

3 Realities of Data

3.1 Testing for Heteroskedasticity and Other Deviations from Ω = σ²I

3.1.1 Durbin-Watson test statistic

DW = Σ_{t=2}^T (û_t − û_{t−1})² / Σ_{t=1}^T û_t²
   = Σ_{t=2}^T (û_t² − 2û_tû_{t−1} + û²_{t−1}) / Σ_{t=1}^T û_t².

Consider the error structure associated with an AR(1) process:

plim DW = plim (1/T)Σ_{t=2}^T (û_t² − 2û_tû_{t−1} + û²_{t−1}) / plim (1/T)Σ_{t=1}^T û_t²
        = (σ_u² − 2ρσ_u² + σ_u²)/σ_u² = 2(1 − ρ),

where σ_u² = Var u_t. If ρ = 0, plim DW = 2. If ρ ≈ 1, plim DW ≈ 0. If ρ ≈ −1, plim DW ≈ 4.

The distribution of the DW test can be found in tables. The DW test is a special case of a Lagrange-Multiplier test statistic (to be learned later).
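A quick simulation (a numpy sketch with arbitrary illustrative parameter values) confirms that DW ≈ 2(1 − ρ) in large samples when the errors follow an AR(1):

```python
import numpy as np

rng = np.random.default_rng(2)
rho, T = 0.5, 100_000

# simulate AR(1) errors with a burn-in
e = rng.normal(size=T + 500)
u = np.empty(T + 500)
u[0] = e[0]
for t in range(1, T + 500):
    u[t] = rho * u[t - 1] + e[t]
u = u[500:]

# Durbin-Watson statistic (here the true errors stand in for OLS residuals,
# which behave the same asymptotically)
dw = np.sum(np.diff(u) ** 2) / np.sum(u ** 2)
dw_theory = 2 * (1 - rho)
```

With ρ = 0.5 the statistic settles near 1, well below the no-autocorrelation value of 2.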

3.1.2 White Heteroskedasticity Test Statistic

Consider

Ω =
    [ σ₁²   0     ⋯   0    ]
    [ 0     σ₂²   ⋯   0    ]
    [ ⋮     ⋮     ⋱   ⋮    ]
    [ 0     0     ⋯   σ_T² ]

and

H₀: σ₁² = σ₂² = ⋯ = σ_T²  vs  H₁: σ_t² ≠ σ_s² for some t ≠ s.

Let û be the OLS residuals. Consider regressing

û_t² = γ₀ + Σ_{j≤l} γ_jl x_tj x_tl + ε_t.

Under H₀, γ_jl = 0 ∀ j, l. One can show that TR² ~ χ²_{k(k+1)/2}.

3.2 Estimation when Ω is not Known

Consider the model y = Xβ + u, u ~ (0, Ω), with Ω unknown. Let û be the OLS residuals. The goal is to use the residuals to construct a consistent estimate of Ω. In general, Ω has T(T+1)/2 free parameters, and there are only T residuals (with only T − (k+1) degrees of freedom). So, to make any progress, we will have to put a lot of structure on Ω. (White heteroskedasticity-corrected estimators are exceptions.)

3.2.1 Examples

1. AR(1). For an AR(1) model, there are only two parameters, ρ and σ²_e. Consider

ρ̂ = Σ_{t=2}^T û_tû_{t−1} / Σ_{t=1}^T û_t²,

plim ρ̂ = plim (1/T)Σ_{t=2}^T û_tû_{t−1} / plim (1/T)Σ_{t=1}^T û_t²
        = [ρσ²_e/(1 − ρ²)] / [σ²_e/(1 − ρ²)] = ρ.

Consider

σ̂²_e = (1 − ρ̂²) · (1/T)Σ_{t=1}^T û_t².

plim σ̂²_e = plim (1 − ρ̂²) · plim (1/T)Σ_{t=1}^T û_t² = (1 − ρ²) · σ²_e/(1 − ρ²) = σ²_e.

We can plug ρ̂ and σ̂²_e into

Ω = (σ²_e / (1 − ρ²)) ·
    [ 1    ρ    ρ²   ⋯ ]
    [ ρ    1    ρ    ⋯ ]
    [ ρ²   ρ    1    ⋯ ]
    [ ⋮    ⋮    ⋮    ⋱ ]

to get a consistent estimate of Ω, let's say Ω̂. Now plug Ω̂ into the GLS estimator to get

β̂_GLS = [X′Ω̂⁻¹X]⁻¹X′Ω̂⁻¹y.

We can no longer talk about E β̂_GLS. But

β̂_GLS = [X′Ω̂⁻¹X]⁻¹X′Ω̂⁻¹(Xβ + u)
       = β + [X′Ω̂⁻¹X]⁻¹X′Ω̂⁻¹u
       = β + [(1/T)X′Ω̂⁻¹X]⁻¹(1/T)X′Ω̂⁻¹u,

plim β̂_GLS = β + plim[(1/T)X′Ω̂⁻¹X]⁻¹ · plim (1/T)X′Ω̂⁻¹u
           = β + plim[(1/T)X′Ω̂⁻¹X]⁻¹ · 0 = β,

and

√T(β̂_GLS − β) →d N(0, V) with V = plim[(1/T)X′Ω̂⁻¹X]⁻¹ = plim[(1/T)X′Ω⁻¹X]⁻¹.

2. MA(1). For an MA(1) model, there are three parameters, ρ₀, ρ₁, and σ²_e. Without loss of generality, we can set ρ₀ = 1 (later we will understand this as an identification issue). What are good consistent estimates for ρ₁ and σ²_e?
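Putting the AR(1) pieces together, here is a numpy sketch of feasible GLS with AR(1) errors (sample size and parameters are arbitrary illustrations): estimate ρ̂ and σ̂²_e from the OLS residuals as above, build Ω̂ with Ω̂_ts = σ̂²_e ρ̂^|t−s|/(1 − ρ̂²), and plug it into the GLS formula.

```python
import numpy as np

rng = np.random.default_rng(4)
T, rho, beta = 500, 0.6, np.array([1.0, 2.0])

# simulate y = X beta + u with AR(1) errors
X = np.column_stack([np.ones(T), rng.normal(size=T)])
e = rng.normal(size=T + 200)
u = np.empty(T + 200)
u[0] = e[0]
for t in range(1, T + 200):
    u[t] = rho * u[t - 1] + e[t]
u = u[200:]
y = X @ beta + u

# Step 1: OLS residuals
b_ols = np.linalg.lstsq(X, y, rcond=None)[0]
uhat = y - X @ b_ols

# Step 2: rho_hat and sigma2_e_hat as in the notes
rho_hat = np.sum(uhat[1:] * uhat[:-1]) / np.sum(uhat**2)
sigma2_e_hat = (1 - rho_hat**2) * np.mean(uhat**2)

# Step 3: build Omega_hat and compute feasible GLS
lags = np.abs(np.subtract.outer(np.arange(T), np.arange(T)))
Omega_hat = sigma2_e_hat / (1 - rho_hat**2) * rho_hat ** lags
Oi = np.linalg.inv(Omega_hat)
b_fgls = np.linalg.solve(X.T @ Oi @ X, X.T @ Oi @ y)
```

In practice one would exploit the known tridiagonal form of Ω⁻¹ for an AR(1) rather than invert a T×T matrix, but the brute-force version makes the plug-in logic transparent.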

3. Random Effects. For a random effects model, there are two parameters, σ²_u and σ²_e. Let v̂_it be the OLS residuals. Consider

s₁² = (1/nT) Σ_{i,t} v̂²_it.

Since Ev²_it = σ²_u + σ²_e,

plim s₁² = σ²_u + σ²_e.                                                                (4)

Consider

s₂² = (1/n) Σ_i v̄̂_i², where v̄̂_i = (1/T) Σ_t v̂_it.

Since Ev̄_i² = σ²_u + σ²_e/T,

plim s₂² = σ²_u + σ²_e/T.                                                              (5)

We can solve equations (4) and (5) together to get consistent estimates of σ²_u and σ²_e.

4. General. Assume we can parameterize Ω in terms of a small number of parameters θ, and let Ω(θ) represent Ω as a function of θ. Let û = (I − P_X)y be the OLS residuals. Define

θ̂ = argmin_θ [D(ûû′) − D(Ω(θ))]′[D(ûû′) − D(Ω(θ))],

where D(·) takes the independent elements of its argument. Later we will show that θ̂ is a consistent estimate of θ. Plug θ̂ into Ω(·) to get Ω̂ = Ω(θ̂), a consistent estimate of Ω. Plug Ω̂ into the GLS estimator.
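The moment conditions (4) and (5) can be solved in closed form: σ̂²_e = (s₁² − s₂²)·T/(T − 1) and σ̂²_u = s₁² − σ̂²_e. A numpy sketch on a simulated panel (sizes, slope, and variances are arbitrary illustrations):

```python
import numpy as np

rng = np.random.default_rng(5)
n, T = 2000, 5
sigma2_u, sigma2_e = 1.0, 1.5

# simulate y_it = x_it * beta + u_i + e_it
x = rng.normal(size=(n, T))
u_i = rng.normal(0, np.sqrt(sigma2_u), size=(n, 1))
e = rng.normal(0, np.sqrt(sigma2_e), size=(n, T))
y = 2.0 * x + u_i + e

# pooled OLS residuals
X = np.column_stack([np.ones(n * T), x.ravel()])
b = np.linalg.lstsq(X, y.ravel(), rcond=None)[0]
vhat = (y.ravel() - X @ b).reshape(n, T)

# moment conditions (4) and (5)
s1 = np.mean(vhat**2)                # -> sigma2_u + sigma2_e
s2 = np.mean(vhat.mean(axis=1)**2)   # -> sigma2_u + sigma2_e / T

sigma2_e_hat = (s1 - s2) * T / (T - 1)
sigma2_u_hat = s1 - sigma2_e_hat
```

With n large both variance components are recovered closely, which is all GLS needs from this step.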

3.3 White Heteroskedasticity Correction

Consider the model y_t = x_t′β + u_t, u_t ~ ind(0, σ_t²). We know that

D(β̂_OLS) = (X′X)⁻¹(X′ΩX)(X′X)⁻¹,

where Ω_tt = σ_t² and Ω_ts = 0 for t ≠ s. Consider Ω̂ where Ω̂_tt = û_t² and Ω̂_ts = 0 for t ≠ s. Then

plim (1/T)X′Ω̂X = plim (1/T)X′ΩX

even though plim Ω̂ doesn't exist. Hence

plim [(1/T)X′X]⁻¹[(1/T)X′Ω̂X][(1/T)X′X]⁻¹ = plim [(1/T)X′X]⁻¹[(1/T)X′ΩX][(1/T)X′X]⁻¹.

Newey-West generalizes this by letting Ω̂_ts = (1 − |t − s|/(L + 1)) û_tû_s for |t − s| ≤ L (and 0 otherwise), where L is a chosen truncation lag.
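A numpy sketch of the White heteroskedasticity-consistent covariance, with the "meat" X′Ω̂X computed both as a vectorized product and as the explicit sum Σ_t û_t² x_t x_t′ to confirm they agree (the data are simulated for illustration):

```python
import numpy as np

rng = np.random.default_rng(6)
T = 1000

x = rng.normal(size=T)
X = np.column_stack([np.ones(T), x])
u = np.sqrt(1.0 + x**2) * rng.normal(size=T)   # heteroskedastic errors
y = X @ np.array([1.0, 2.0]) + u

b = np.linalg.lstsq(X, y, rcond=None)[0]
uhat = y - X @ b

XtX_inv = np.linalg.inv(X.T @ X)

# meat: X' Omega_hat X with Omega_hat = diag(uhat_t^2), vectorized
meat = (X * uhat[:, None]**2).T @ X

# same quantity as an explicit sum over t (sanity check)
meat_loop = sum(uhat[t]**2 * np.outer(X[t], X[t]) for t in range(T))

# White sandwich covariance estimate for beta_hat_OLS
V_white = XtX_inv @ meat @ XtX_inv
```

Note that only the k×k averages (1/T)X′Ω̂X need to converge; the T×T matrix Ω̂ itself never does, exactly as the text says.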