Generalized Least Squares

Up until now, we have assumed that $E[uu'] = \sigma^2 I$. Now we generalize and let $E[uu'] = \Omega$, where $\Omega$ is restricted to be a positive definite, symmetric matrix.

Common Examples

1. Autoregressive Models

AR(1):
$$u_t = \rho u_{t-1} + e_t, \qquad |\rho| < 1, \qquad e_t \sim iid(0, \sigma_e^2).$$

$$\operatorname{Var} u_t = E u_t^2 = E(\rho u_{t-1} + e_t)^2 = E\big(\rho^2 u_{t-1}^2 + 2\rho u_{t-1} e_t + e_t^2\big) = \rho^2 E u_{t-1}^2 + 2\rho E u_{t-1} e_t + E e_t^2 = \rho^2 E u_{t-1}^2 + \sigma_e^2 = \rho^2 E u_t^2 + \sigma_e^2 \quad \text{(stationarity)}$$
$$\Rightarrow \operatorname{Var} u_t = \frac{\sigma_e^2}{1 - \rho^2}.$$

$$\operatorname{Cov}(u_t, u_{t-1}) = E u_t u_{t-1} = E(\rho u_{t-1} + e_t)\, u_{t-1} = \rho E u_{t-1}^2 + E e_t u_{t-1} = \rho E u_{t-1}^2 = \frac{\rho\, \sigma_e^2}{1 - \rho^2}.$$

$$\operatorname{Cov}(u_t, u_{t-2}) = E u_t u_{t-2} = E(\rho u_{t-1} + e_t)\, u_{t-2} = \rho E u_{t-1} u_{t-2} + E e_t u_{t-2} = \rho E u_{t-1} u_{t-2} = \frac{\rho^2 \sigma_e^2}{1 - \rho^2}.$$
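These AR(1) moments are easy to verify numerically. A minimal sketch, assuming NumPy; the parameter values, sample size, and seed are illustrative:

```python
import numpy as np

# Simulate a stationary AR(1): u_t = rho*u_{t-1} + e_t
rng = np.random.default_rng(0)
rho, sigma_e, T = 0.6, 1.0, 200_000
e = rng.normal(0.0, sigma_e, T)
u = np.empty(T)
u[0] = e[0] / np.sqrt(1 - rho**2)   # draw u_0 from the stationary distribution
for t in range(1, T):
    u[t] = rho * u[t - 1] + e[t]

var_theory = sigma_e**2 / (1 - rho**2)   # Var(u_t) = sigma_e^2 / (1 - rho^2)
cov1_theory = rho * var_theory           # Cov(u_t, u_{t-1}) = rho * Var(u_t)
var_hat = u @ u / T
cov1_hat = u[1:] @ u[:-1] / (T - 1)
```

With $T$ this large, the sample moments land within about one percent of their stationary values.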
$$\Rightarrow \text{for an AR(1):} \quad \operatorname{Cov}(u_t, u_{t-n}) = \frac{\rho^n \sigma_e^2}{1 - \rho^2}, \qquad \Omega = \frac{\sigma_e^2}{1 - \rho^2}\begin{pmatrix} 1 & \rho & \rho^2 & \cdots \\ \rho & 1 & \rho & \cdots \\ \rho^2 & \rho & 1 & \cdots \\ \vdots & \vdots & \vdots & \ddots \end{pmatrix}.$$

AR(2):
$$u_t = \rho_1 u_{t-1} + \rho_2 u_{t-2} + e_t, \qquad e_t \sim iid(0, \sigma_e^2).$$

$$\operatorname{Var} u_t = \sigma_0 = E u_t^2 = E[\rho_1 u_{t-1} + \rho_2 u_{t-2} + e_t]^2 \tag{1}$$
$$= \rho_1^2 E u_{t-1}^2 + \rho_2^2 E u_{t-2}^2 + E e_t^2 + 2\rho_1\rho_2 E u_{t-1} u_{t-2} + 2\rho_1 E u_{t-1} e_t + 2\rho_2 E u_{t-2} e_t = \rho_1^2 \sigma_0 + \rho_2^2 \sigma_0 + \sigma_e^2 + 2\rho_1\rho_2 \sigma_1,$$
where $\sigma_n = \operatorname{Cov}(u_t, u_{t-n})$.

$$\sigma_1 = E u_t u_{t-1} = E[\rho_1 u_{t-1} + \rho_2 u_{t-2} + e_t]\, u_{t-1} \tag{2}$$
$$= \rho_1 E u_{t-1}^2 + \rho_2 E u_{t-2} u_{t-1} + E e_t u_{t-1} = \rho_1 \sigma_0 + \rho_2 \sigma_1.$$

We can write equations (1) and (2) in matrix form as
$$\begin{pmatrix} 1 - \rho_1^2 - \rho_2^2 & -2\rho_1\rho_2 \\ -\rho_1 & 1 - \rho_2 \end{pmatrix}\begin{pmatrix} \sigma_0 \\ \sigma_1 \end{pmatrix} = \begin{pmatrix} \sigma_e^2 \\ 0 \end{pmatrix} \tag{3}$$
and solve for
$$\begin{pmatrix} \sigma_0 \\ \sigma_1 \end{pmatrix} = \begin{pmatrix} 1 - \rho_1^2 - \rho_2^2 & -2\rho_1\rho_2 \\ -\rho_1 & 1 - \rho_2 \end{pmatrix}^{-1}\begin{pmatrix} \sigma_e^2 \\ 0 \end{pmatrix}.$$

The condition for stationarity is that the roots of $1 - \rho_1 z - \rho_2 z^2 = 0$ are greater than one in absolute value. Equations (3) are called the Yule-Walker equations. Students should work out the Yule-Walker equations for an AR(3).
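The system in equation (3) can be solved directly. A small sketch, assuming NumPy; the AR(2) parameter values are illustrative (and chosen to be stationary):

```python
import numpy as np

rho1, rho2, sigma_e2 = 0.5, 0.3, 1.0   # illustrative AR(2) parameters

# Yule-Walker system from equations (1) and (2)
M = np.array([[1 - rho1**2 - rho2**2, -2 * rho1 * rho2],
              [-rho1,                  1 - rho2       ]])
sigma0, sigma1 = np.linalg.solve(M, np.array([sigma_e2, 0.0]))

# the solution must satisfy both defining recursions
check1 = rho1**2 * sigma0 + rho2**2 * sigma0 + sigma_e2 + 2 * rho1 * rho2 * sigma1
check2 = rho1 * sigma0 + rho2 * sigma1
```

`check1` should reproduce $\sigma_0$ and `check2` should reproduce $\sigma_1$, confirming the matrix form matches the recursions.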
2. Moving Average

MA(1):
$$u_t = \rho_0 e_t + \rho_1 e_{t-1}, \qquad e_t \sim iid(0, \sigma_e^2).$$

$$E u_t^2 = E[\rho_0 e_t + \rho_1 e_{t-1}]^2 = \rho_0^2 E e_t^2 + \rho_1^2 E e_{t-1}^2 + 2\rho_0\rho_1 E e_t e_{t-1} = (\rho_0^2 + \rho_1^2)\,\sigma_e^2,$$
$$E u_t u_{t-1} = E[\rho_0 e_t + \rho_1 e_{t-1}][\rho_0 e_{t-1} + \rho_1 e_{t-2}] = \rho_0\rho_1 \sigma_e^2,$$
$$E u_t u_{t-n} = E[\rho_0 e_t + \rho_1 e_{t-1}][\rho_0 e_{t-n} + \rho_1 e_{t-n-1}] = 0 \quad \text{if } n > 1.$$

Thus
$$\Omega = \sigma_e^2 \begin{pmatrix} \rho_0^2 + \rho_1^2 & \rho_0\rho_1 & 0 & \cdots \\ \rho_0\rho_1 & \rho_0^2 + \rho_1^2 & \rho_0\rho_1 & \cdots \\ 0 & \rho_0\rho_1 & \rho_0^2 + \rho_1^2 & \cdots \\ \vdots & \vdots & \vdots & \ddots \end{pmatrix}.$$

MA(n):
$$u_t = \sum_{i=0}^{n} \rho_i e_{t-i}, \qquad e_t \sim iid(0, \sigma_e^2).$$

$$E u_t^2 = E\Big[\sum_{i=0}^{n} \rho_i e_{t-i}\Big]^2 = \sigma_e^2 \sum_{i=0}^{n} \rho_i^2,$$
$$E u_t u_{t-k} = E\Big[\sum_{i=0}^{n} \rho_i e_{t-i}\Big]\Big[\sum_{i=0}^{n} \rho_i e_{t-k-i}\Big] = \begin{cases} \sigma_e^2 \sum_{i=k}^{n} \rho_i \rho_{i-k} & \text{if } k \le n, \\ 0 & \text{if } k > n. \end{cases}$$
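The MA(n) autocovariance formula can be checked with a few lines of code. A sketch assuming NumPy; the weights and $\sigma_e^2$ are illustrative:

```python
import numpy as np

def ma_autocov(rho, sigma_e2, k):
    """gamma_k = sigma_e^2 * sum_{i=k}^{n} rho_i * rho_{i-k}; zero for k > n."""
    rho = np.asarray(rho, dtype=float)
    n = len(rho) - 1
    if k > n:
        return 0.0
    return sigma_e2 * np.dot(rho[k:], rho[:len(rho) - k])

rho = [1.0, 0.5, -0.3]           # MA(2) weights rho_0, rho_1, rho_2
g0 = ma_autocov(rho, 2.0, 0)     # 2 * (1 + 0.25 + 0.09) = 2.68
g1 = ma_autocov(rho, 2.0, 1)     # 2 * (1*0.5 + 0.5*(-0.3)) = 0.7
g3 = ma_autocov(rho, 2.0, 3)     # beyond the MA order: exactly 0
```

Note how the autocovariances cut off exactly at the MA order, unlike the geometric decay of the AR(1) case.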
3. ARMA

$$u_t = \sum_{i=1}^{m} \phi_i u_{t-i} + \sum_{i=0}^{n} \rho_i e_{t-i}, \qquad e_t \sim iid(0, \sigma_e^2)$$
is an ARMA(m,n) process. Work out the Yule-Walker equations.

4. Heteroskedasticity

$$\Omega = \begin{pmatrix} \sigma_1^2 & & \\ & \sigma_2^2 & \\ & & \ddots \end{pmatrix}$$

5. Random Coefficients

Consider the model $y_t = x_t \beta_t + u_t$ with $\beta_t \sim iid(\beta, \Gamma)$. Then we can write the model as
$$y_t = x_t \beta + x_t(\beta_t - \beta) + u_t = x_t \beta + e_t, \qquad e_t = x_t(\beta_t - \beta) + u_t.$$
Because $\beta_t$ and $u_t$ are iid over $t$, $E[ee']$ is diagonal with
$$E e_t^2 = x_t \Gamma x_t' + \sigma_u^2.$$

6. Random Effects

Consider the model
$$y_{it} = x_{it}\beta + u_i + e_{it}, \qquad i = 1, \ldots, n, \quad t = 1, \ldots, T,$$
$$u_i \sim iid(0, \sigma_u^2), \qquad e_{it} \sim iid(0, \sigma_e^2).$$
(Explain panel data; give examples.)

Define
$$y_i = \begin{pmatrix} y_{i1} \\ \vdots \\ y_{iT} \end{pmatrix}, \quad X_i = \begin{pmatrix} x_{i1} \\ \vdots \\ x_{iT} \end{pmatrix}, \quad e_i = \begin{pmatrix} e_{i1} \\ \vdots \\ e_{iT} \end{pmatrix}, \quad \iota = \begin{pmatrix} 1 \\ \vdots \\ 1 \end{pmatrix}.$$
Then the model can be written as
$$y_i = X_i \beta + \iota u_i + e_i.$$
Now define
$$y = \begin{pmatrix} y_1 \\ \vdots \\ y_n \end{pmatrix}, \quad X = \begin{pmatrix} X_1 \\ \vdots \\ X_n \end{pmatrix}, \quad e = \begin{pmatrix} e_1 \\ \vdots \\ e_n \end{pmatrix}, \quad u = \begin{pmatrix} \iota u_1 \\ \vdots \\ \iota u_n \end{pmatrix}.$$
Then the model can be written as $y = X\beta + v$ with $v = u + e$, and
$$\Omega_v = E vv' = E(u + e)(u + e)' = \begin{pmatrix} A & & \\ & \ddots & \\ & & A \end{pmatrix},$$
where
$$A = \begin{pmatrix} \sigma_u^2 + \sigma_e^2 & \sigma_u^2 & \cdots & \sigma_u^2 \\ \sigma_u^2 & \sigma_u^2 + \sigma_e^2 & \cdots & \sigma_u^2 \\ \vdots & & \ddots & \vdots \\ \sigma_u^2 & \sigma_u^2 & \cdots & \sigma_u^2 + \sigma_e^2 \end{pmatrix}.$$

GLS and OLS

Consider the model
$$y = X\beta + u, \qquad u \sim (0, \Omega).$$
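The block-diagonal structure of $\Omega_v$ is easy to build and inspect directly. A minimal sketch, assuming NumPy; the dimensions and variances are illustrative:

```python
import numpy as np

def random_effects_omega(n, T, sigma_u2, sigma_e2):
    """Block-diagonal Var(v) for v_it = u_i + e_it: each T-by-T block A has
    sigma_u2 + sigma_e2 on the diagonal and sigma_u2 off the diagonal."""
    A = sigma_u2 * np.ones((T, T)) + sigma_e2 * np.eye(T)
    return np.kron(np.eye(n), A)   # n copies of A down the diagonal

Omega = random_effects_omega(n=3, T=2, sigma_u2=0.5, sigma_e2=1.0)
```

Errors are correlated within an individual (off-diagonal $\sigma_u^2$ inside each block) but uncorrelated across individuals (zeros outside the blocks).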
$$b_{OLS} = (X'X)^{-1}X'y = \beta + (X'X)^{-1}X'u$$
$$E b_{OLS} = \beta + E(X'X)^{-1}X'u = \beta + (X'X)^{-1}X' E u = \beta$$
$$D(b_{OLS}) = E(X'X)^{-1}X'uu'X(X'X)^{-1} = (X'X)^{-1}X' E[uu']\, X(X'X)^{-1} = (X'X)^{-1}X'\Omega X(X'X)^{-1} \ne \sigma_u^2 (X'X)^{-1}$$
(especially because there is no such object as $\sigma_u^2$). Therefore, $b_{OLS}$ is unbiased, but the standard errors are wrong.

We could correct the standard errors. Alternatively, let
$$R'R = \Omega^{-1},$$
and consider
$$Ry = RX\beta + Ru.$$
Note that
$$E Ru = R E u = 0, \qquad E Ruu'R' = R E[uu']\, R' = R\Omega R' = I.$$
Define
$$b_{GLS} = [(RX)'(RX)]^{-1}(RX)'(Ry) = [X'R'RX]^{-1}[X'R'Ry] = [X'\Omega^{-1}X]^{-1}X'\Omega^{-1}y.$$
Note that
$$b_{GLS} = [X'\Omega^{-1}X]^{-1}X'\Omega^{-1}(X\beta + u) = \beta + [X'\Omega^{-1}X]^{-1}X'\Omega^{-1}u$$
$$\Rightarrow E b_{GLS} = \beta + E[X'\Omega^{-1}X]^{-1}X'\Omega^{-1}u = \beta + [X'\Omega^{-1}X]^{-1}X'\Omega^{-1} E u = \beta,$$
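The equivalence between running OLS on the transformed data $(RX, Ry)$ and the closed-form $[X'\Omega^{-1}X]^{-1}X'\Omega^{-1}y$ can be checked numerically. A sketch assuming NumPy, with a known diagonal $\Omega$ (so $R = \Omega^{-1/2}$ is easy to write down); all values are illustrative:

```python
import numpy as np

rng = np.random.default_rng(2)
T, k = 200, 3
X = rng.normal(size=(T, k))
beta = np.array([1.0, -2.0, 0.5])

# known heteroskedastic Omega = diag(sig_t^2)
sig = rng.uniform(0.5, 2.0, T)
Omega = np.diag(sig**2)
y = X @ beta + rng.normal(0.0, sig)

# R with R'R = Omega^{-1}: for diagonal Omega, R = diag(1/sig_t)
R = np.diag(1.0 / sig)
b_gls = np.linalg.lstsq(R @ X, R @ y, rcond=None)[0]

# closed form [X' Omega^{-1} X]^{-1} X' Omega^{-1} y
Oi = np.linalg.inv(Omega)
b_closed = np.linalg.solve(X.T @ Oi @ X, X.T @ Oi @ y)
```

The two estimates agree to machine precision; for non-diagonal $\Omega$ one would take $R$ from a Cholesky factor of $\Omega^{-1}$ instead.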
$$D(b_{GLS}) = E[X'\Omega^{-1}X]^{-1}X'\Omega^{-1}uu'\Omega^{-1}X[X'\Omega^{-1}X]^{-1} = [X'\Omega^{-1}X]^{-1}X'\Omega^{-1}\Omega\,\Omega^{-1}X[X'\Omega^{-1}X]^{-1} = [X'\Omega^{-1}X]^{-1}.$$
Note that, if $\Omega = \sigma^2 I$, then
$$b_{GLS} = \big[X'(\sigma^2 I)^{-1}X\big]^{-1}X'(\sigma^2 I)^{-1}y = [X'X]^{-1}X'y = b_{OLS},$$
and
$$D(b_{GLS}) = \big[X'(\sigma^2 I)^{-1}X\big]^{-1} = \sigma^2 [X'X]^{-1} = D(b_{OLS}).$$

3 Realities of Data

3.1 Testing for Heteroskedasticity and Other Deviations from $\Omega = \sigma^2 I$

3.1.1 Durbin-Watson test statistic

$$DW = \frac{\sum_{t=2}^{T} (\hat u_t - \hat u_{t-1})^2}{\sum_{t=1}^{T} \hat u_t^2} = \frac{\sum_{t=2}^{T} \big(\hat u_t^2 - 2\hat u_t \hat u_{t-1} + \hat u_{t-1}^2\big)}{\sum_{t=1}^{T} \hat u_t^2}.$$
Consider the error structure associated with an AR(1) process:
$$\operatorname{plim} DW = \frac{\operatorname{plim} \frac{1}{T}\sum_{t=2}^{T}\big(\hat u_t^2 - 2\hat u_t \hat u_{t-1} + \hat u_{t-1}^2\big)}{\operatorname{plim} \frac{1}{T}\sum_{t=1}^{T} \hat u_t^2} = \frac{2\sigma_u^2 - 2\rho\sigma_u^2}{\sigma_u^2} = 2(1 - \rho),$$
where $\sigma_u^2 = \sigma_e^2/(1-\rho^2)$. If $\rho = 0$, $\operatorname{plim} DW = 2$. If $\rho \approx 1$, $\operatorname{plim} DW \approx 0$. If $\rho \approx -1$, $\operatorname{plim} DW \approx 4$. The distribution of the DW test can be found in tables. The DW test is a special case of a Lagrange-Multiplier test statistic (to be learned later).
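The limits $\operatorname{plim} DW = 2(1-\rho)$ are easy to see in simulation. A sketch assuming NumPy; sample size, seed, and $\rho$ are illustrative:

```python
import numpy as np

def durbin_watson(resid):
    """DW = sum_{t>=2} (u_t - u_{t-1})^2 / sum_t u_t^2."""
    resid = np.asarray(resid, dtype=float)
    return np.sum(np.diff(resid)**2) / np.sum(resid**2)

rng = np.random.default_rng(3)
T = 50_000
dw_white = durbin_watson(rng.normal(size=T))   # rho = 0: DW near 2

# AR(1) errors with rho = 0.8: DW near 2*(1 - 0.8) = 0.4
e = rng.normal(size=T)
u = np.empty(T)
u[0] = e[0]
for t in range(1, T):
    u[t] = 0.8 * u[t - 1] + e[t]
dw_ar = durbin_watson(u)
```

Here the statistic is applied to the simulated errors themselves; in practice it is computed from OLS residuals.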
3.1.2 White Heteroskedasticity Test Statistic

Consider
$$\Omega = \begin{pmatrix} \sigma_1^2 & & \\ & \sigma_2^2 & \\ & & \ddots \end{pmatrix}$$
and
$$H_0: \sigma_1^2 = \sigma_2^2 = \cdots = \sigma_T^2 \qquad \text{vs.} \qquad H_1: \sigma_t^2 \ne \sigma_s^2 \text{ for some } t \ne s.$$
Let $\hat u$ be the OLS residuals. Consider regressing
$$\hat u_t^2 = \gamma_0 + \sum_{j \le k} x_{tj} x_{tk}\, \gamma_{jk} + e_t.$$
Under $H_0$, $\gamma_{jk} = 0 \; \forall j, k$. One can show that $T R^2 \sim \chi^2_{k(k+1)/2}$.

3.2 Estimation when $\Omega$ is not Known

Consider the model
$$y = X\beta + u, \qquad u \sim (0, \Omega),$$
and $\Omega$ is unknown. Let $\hat u$ be the OLS residuals. The goal is to use the residuals to construct a consistent estimate of $\Omega$. In general, $\Omega$ has $T(T+1)/2$ free parameters, and there are only $T$ residuals (with only $T - (k+1)$ degrees of freedom). So, to make any progress, we will have to put a lot of structure on $\Omega$. (White heteroskedasticity corrected estimators are exceptions.)

Examples

1. AR(1)

For an AR(1) model, there are only two parameters, $\rho$ and $\sigma_e^2$. Consider
$$\hat\rho = \frac{\sum_{t=2}^{T} \hat u_t \hat u_{t-1}}{\sum_{t=2}^{T} \hat u_{t-1}^2}, \qquad \operatorname{plim} \hat\rho = \frac{\operatorname{plim} \frac{1}{T}\sum_{t=2}^{T} \hat u_t \hat u_{t-1}}{\operatorname{plim} \frac{1}{T}\sum_{t=2}^{T} \hat u_{t-1}^2} = \frac{\rho\sigma_e^2/(1-\rho^2)}{\sigma_e^2/(1-\rho^2)} = \rho.$$
Consider
$$\hat\sigma_e^2 = \big(1 - \hat\rho^2\big)\,\frac{1}{T}\sum_{t=1}^{T} \hat u_t^2.$$
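The White test amounts to one auxiliary regression. A sketch assuming NumPy, with two regressors and homoskedastic errors so the statistic should look like a $\chi^2$ draw; all data are simulated and illustrative:

```python
import numpy as np

rng = np.random.default_rng(4)
T = 500
x = rng.normal(size=(T, 2))
X = np.column_stack([np.ones(T), x])
y = X @ np.array([1.0, 0.5, -0.5]) + rng.normal(size=T)   # homoskedastic H0

# OLS residuals, squared
b = np.linalg.lstsq(X, y, rcond=None)[0]
u2 = (y - X @ b)**2

# auxiliary regressors: constant, levels, squares, cross product
Z = np.column_stack([np.ones(T), x, x**2, x[:, 0] * x[:, 1]])
g = np.linalg.lstsq(Z, u2, rcond=None)[0]
fit = Z @ g
R2 = 1 - np.sum((u2 - fit)**2) / np.sum((u2 - u2.mean())**2)
white_stat = T * R2   # compare to a chi-squared critical value
```

Under $H_0$ the statistic stays small relative to the $\chi^2$ critical value; strong heteroskedasticity in $x$ would push it far into the tail.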
$$\operatorname{plim} \hat\sigma_e^2 = \operatorname{plim}\big(1 - \hat\rho^2\big)\,\frac{1}{T}\sum_{t=1}^{T}\hat u_t^2 = \big(1 - \rho^2\big)\,\frac{\sigma_e^2}{1 - \rho^2} = \sigma_e^2.$$
We can plug $\hat\rho$ and $\hat\sigma_e^2$ into
$$\Omega = \frac{\sigma_e^2}{1 - \rho^2}\begin{pmatrix} 1 & \rho & \rho^2 & \cdots \\ \rho & 1 & \rho & \cdots \\ \rho^2 & \rho & 1 & \cdots \\ \vdots & \vdots & \vdots & \ddots \end{pmatrix}$$
to get a consistent estimate of $\Omega$, let's say $\hat\Omega$. Now plug $\hat\Omega$ into the GLS estimator to get
$$b_{GLS} = \big[X'\hat\Omega^{-1}X\big]^{-1}X'\hat\Omega^{-1}y.$$
We can no longer talk about $E b_{GLS}$ (since $\hat\Omega$ depends on $u$). But
$$b_{GLS} = \big[X'\hat\Omega^{-1}X\big]^{-1}X'\hat\Omega^{-1}(X\beta + u) = \beta + \big[X'\hat\Omega^{-1}X\big]^{-1}X'\hat\Omega^{-1}u = \beta + \Big[\tfrac{1}{T}X'\hat\Omega^{-1}X\Big]^{-1}\tfrac{1}{T}X'\hat\Omega^{-1}u,$$
$$\operatorname{plim} b_{GLS} = \beta + \operatorname{plim}\Big[\tfrac{1}{T}X'\hat\Omega^{-1}X\Big]^{-1}\operatorname{plim}\tfrac{1}{T}X'\hat\Omega^{-1}u = \beta,$$
and
$$\sqrt{T}\,(b_{GLS} - \beta) \sim N(0, V), \qquad V = \operatorname{plim}\Big[\tfrac{1}{T}X'\hat\Omega^{-1}X\Big]^{-1}.$$

2. MA(1)

For an MA(1) model, there are three parameters: $\rho_0$, $\rho_1$, and $\sigma_e^2$. Without loss of generality, we can set $\rho_0 = 1$ (later we will understand this as an identification issue). What are good consistent estimates for $\rho_1$ and $\sigma_e^2$?
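The whole AR(1) feasible-GLS recipe fits in a short script: OLS residuals, consistent $\hat\rho$ and $\hat\sigma_e^2$, build $\hat\Omega$, plug into the GLS formula. A sketch assuming NumPy; parameters, seed, and sample size are illustrative:

```python
import numpy as np

rng = np.random.default_rng(5)
T, rho, sigma_e = 400, 0.7, 1.0
X = np.column_stack([np.ones(T), rng.normal(size=T)])
beta = np.array([2.0, 1.0])

# AR(1) errors
e = rng.normal(0.0, sigma_e, T)
u = np.empty(T)
u[0] = e[0] / np.sqrt(1 - rho**2)
for t in range(1, T):
    u[t] = rho * u[t - 1] + e[t]
y = X @ beta + u

# step 1: OLS residuals
b_ols = np.linalg.lstsq(X, y, rcond=None)[0]
uh = y - X @ b_ols

# step 2: consistent estimates of rho and sigma_e^2
rho_hat = (uh[1:] @ uh[:-1]) / (uh[:-1] @ uh[:-1])
sig_e2_hat = (1 - rho_hat**2) * np.mean(uh**2)

# step 3: build Omega_hat and the feasible GLS estimator
idx = np.arange(T)
Omega_hat = sig_e2_hat / (1 - rho_hat**2) * rho_hat**np.abs(idx[:, None] - idx[None, :])
Oi = np.linalg.inv(Omega_hat)
b_fgls = np.linalg.solve(X.T @ Oi @ X, X.T @ Oi @ y)
```

In practice one would exploit the band structure of $\hat\Omega^{-1}$ (the Prais-Winsten transformation) instead of inverting a $T \times T$ matrix, but the dense version makes the formula transparent.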
3. Random Effects

For a random effects model, there are two parameters, $\sigma_u^2$ and $\sigma_e^2$. Let $\hat v_{it}$ be the OLS residuals. Consider
$$s_1^2 = \frac{1}{nT}\sum_{i,t} \hat v_{it}^2.$$
Since $E v_{it}^2 = \sigma_u^2 + \sigma_e^2$,
$$\operatorname{plim} s_1^2 = \sigma_u^2 + \sigma_e^2. \tag{4}$$
Consider
$$s_2^2 = \frac{1}{n}\sum_i \bar v_i^2, \qquad \text{where} \quad \bar v_i = \frac{1}{T}\sum_t \hat v_{it}.$$
Since $E \bar v_i^2 = \sigma_u^2 + \sigma_e^2/T$,
$$\operatorname{plim} s_2^2 = \sigma_u^2 + \frac{\sigma_e^2}{T}. \tag{5}$$
We can solve equations (4) and (5) together to get consistent estimates of $\sigma_u^2$ and $\sigma_e^2$.

4. General

Assume we can parameterize $\Omega$ in terms of a small number of parameters $\theta$, and let $\Omega(\theta)$ represent $\Omega$ as a function of $\theta$. Let
$$\hat u = (I - P_X)\, y$$
be the OLS residuals. Define
$$\hat\theta = \arg\min_{\theta}\; \big[D(\hat u \hat u') - D(\Omega(\theta))\big]'\big[D(\hat u \hat u') - D(\Omega(\theta))\big],$$
where $D(\cdot)$ takes the independent elements of the argument. Later we will show that $\hat\theta$ is a consistent estimate of $\theta$. Plug $\hat\theta$ into $\Omega(\cdot)$ to get $\hat\Omega$, a consistent estimate of $\Omega$. Plug $\hat\Omega$ into the GLS estimator.
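Solving (4) and (5) for the two variance components gives $\sigma_e^2 = (s_1^2 - s_2^2)\,T/(T-1)$ and $\sigma_u^2 = s_1^2 - \sigma_e^2$. A sketch assuming NumPy, using the true composite errors $v_{it} = u_i + e_{it}$ as stand-ins for the OLS residuals; all parameter values are illustrative:

```python
import numpy as np

rng = np.random.default_rng(6)
n, T = 500, 5
sigma_u, sigma_e = 0.8, 1.2
u = rng.normal(0.0, sigma_u, size=(n, 1))
e = rng.normal(0.0, sigma_e, size=(n, T))
v = u + e   # v_it = u_i + e_it, broadcast over t

s1 = np.mean(v**2)                # plim: sigma_u^2 + sigma_e^2      (4)
s2 = np.mean(v.mean(axis=1)**2)   # plim: sigma_u^2 + sigma_e^2 / T  (5)

sigma_e2_hat = (s1 - s2) * T / (T - 1)
sigma_u2_hat = s1 - sigma_e2_hat
```

With $n$ large, both estimates should be close to the true values $\sigma_u^2 = 0.64$ and $\sigma_e^2 = 1.44$.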
3.3 White Heteroskedasticity Correction

Consider the model
$$y_t = x_t\beta + u_t, \qquad u_t \sim ind(0, \sigma_t^2).$$
We know that
$$D(b_{OLS}) = (X'X)^{-1}X'\Omega X(X'X)^{-1},$$
where $\Omega_{tt} = \sigma_t^2$ and $\Omega_{ts} = 0$ for $t \ne s$. Consider $\hat\Omega$, where $\hat\Omega_{tt} = \hat u_t^2$ and $\hat\Omega_{ts} = 0$ for $t \ne s$. Then
$$\operatorname{plim}\Big(\tfrac{1}{T}X'\hat\Omega X\Big) = \operatorname{plim}\Big(\tfrac{1}{T}X'\Omega X\Big)$$
even though $\operatorname{plim} \hat\Omega$ doesn't exist. Newey-West generalizes this by letting
$$\hat\Omega_{ts} = \Big(1 - \frac{|t-s|}{k+1}\Big)\,\hat u_t \hat u_s \qquad \text{for } |t-s| \le k.$$
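Both corrections can be sketched in a few lines. Assuming NumPy; the data are random and illustrative, and the Bartlett weight $1 - |t-s|/(k+1)$ follows the form above:

```python
import numpy as np

def hc0_cov(X, resid):
    """White covariance: (X'X)^{-1} X' diag(uhat_t^2) X (X'X)^{-1}."""
    XtXi = np.linalg.inv(X.T @ X)
    meat = X.T @ (X * resid[:, None]**2)
    return XtXi @ meat @ XtXi

def newey_west_cov(X, resid, k):
    """Newey-West: adds Bartlett-weighted cross terms (1 - lag/(k+1))."""
    Xu = X * resid[:, None]
    S = Xu.T @ Xu                      # lag-0 term: White's meat
    for lag in range(1, k + 1):
        w = 1.0 - lag / (k + 1)
        G = Xu[lag:].T @ Xu[:-lag]
        S += w * (G + G.T)
    XtXi = np.linalg.inv(X.T @ X)
    return XtXi @ S @ XtXi

rng = np.random.default_rng(7)
X = rng.normal(size=(60, 2))
uhat = rng.normal(size=60)
V_hc0 = hc0_cov(X, uhat)
V_nw0 = newey_west_cov(X, uhat, 0)    # k = 0 reduces to White's estimator
V_nw4 = newey_west_cov(X, uhat, 4)
```

Setting $k = 0$ drops every cross term, so Newey-West collapses to the White correction, which makes the "generalizes" claim concrete.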