Chapter 7

Generalized and Weighted Least Squares Estimation

The usual linear regression model assumes that all the random error components are identically and independently distributed with constant variance. When this assumption is violated, the ordinary least squares estimator of the regression coefficients loses its property of minimum variance in the class of linear and unbiased estimators. The violation of this assumption can arise in any one of the following situations:
1. The variance of the random error components is not constant.
2. The random error components are not independent.
3. The random error components do not have constant variance and are also not independent.

In such cases, the covariance matrix of the random error components does not remain in the form of an identity matrix but can be considered to be any positive definite matrix. Under such an assumption, the OLSE does not remain efficient as in the case of an identity covariance matrix. The generalized or weighted least squares method is used in such situations to estimate the parameters of the model.

In this method, the deviation between the observed and expected values of $y_i$ is multiplied by a weight $\omega_i$, where $\omega_i$ is chosen to be inversely proportional to the variance of $y_i$.

For the simple linear regression model, the weighted least squares function is
$$S(\beta_0,\beta_1)=\sum_{i=1}^{n}\omega_i\,(y_i-\beta_0-\beta_1 x_i)^2.$$
The least squares normal equations are obtained by differentiating $S(\beta_0,\beta_1)$ with respect to $\beta_0$ and $\beta_1$ and equating them to zero:
$$\hat\beta_0\sum_{i=1}^{n}\omega_i+\hat\beta_1\sum_{i=1}^{n}\omega_i x_i=\sum_{i=1}^{n}\omega_i y_i$$
$$\hat\beta_0\sum_{i=1}^{n}\omega_i x_i+\hat\beta_1\sum_{i=1}^{n}\omega_i x_i^2=\sum_{i=1}^{n}\omega_i x_i y_i.$$
The solution of these two normal equations gives the weighted least squares estimates of $\beta_0$ and $\beta_1$.
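The two weighted normal equations above can be solved directly as a $2\times 2$ linear system. The following is a minimal numerical sketch with synthetic data; the choice $\mathrm{Var}(y_i)=x_i$ and hence weights $\omega_i=1/x_i$ is an assumption made only for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data (illustrative assumption): Var(y_i) = x_i,
# so the weights are taken as w_i = 1 / x_i.
n = 200
x = rng.uniform(1.0, 10.0, size=n)
beta0_true, beta1_true = 2.0, 0.5
y = beta0_true + beta1_true * x + rng.normal(scale=np.sqrt(x))
w = 1.0 / x  # weights inversely proportional to Var(y_i)

# Weighted normal equations:
#   b0 * sum(w)   + b1 * sum(w*x)   = sum(w*y)
#   b0 * sum(w*x) + b1 * sum(w*x^2) = sum(w*x*y)
A = np.array([[w.sum(),       (w * x).sum()],
              [(w * x).sum(), (w * x**2).sum()]])
rhs = np.array([(w * y).sum(), (w * x * y).sum()])
b0, b1 = np.linalg.solve(A, rhs)
print(b0, b1)
```

With this seed the estimates land close to the true values $\beta_0=2$ and $\beta_1=0.5$; equal weights $\omega_i\equiv 1$ would reduce the system to the ordinary least squares normal equations.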
Generalized least squares estimation

Suppose in the usual multiple regression model
$$y=X\beta+\varepsilon,\qquad E(\varepsilon)=0,\qquad V(\varepsilon)=\sigma^2 I,$$
the assumption $V(\varepsilon)=\sigma^2 I$ is violated and becomes
$$V(\varepsilon)=\sigma^2\Omega$$
where $\Omega$ is a known $n\times n$ nonsingular, positive definite and symmetric matrix. This structure of $\Omega$ incorporates both of the cases
- when $\Omega$ is diagonal but with unequal variances, and
- when $\Omega$ is not necessarily diagonal; depending on the presence of correlated errors, some of the off-diagonal elements are nonzero.

The OLSE of $\beta$ is
$$b=(X'X)^{-1}X'y.$$
In such cases, the OLSE gives an unbiased estimate but has more variability, since
$$E(b)=(X'X)^{-1}X'E(y)=(X'X)^{-1}X'X\beta=\beta$$
$$V(b)=(X'X)^{-1}X'\,V(y)\,X(X'X)^{-1}=\sigma^2(X'X)^{-1}X'\Omega X(X'X)^{-1}.$$
Now we attempt to find a better estimator as follows. Since $\Omega$ is positive definite and symmetric, there exists a nonsingular matrix $K$ such that $KK'=\Omega$. Premultiplying the model $y=X\beta+\varepsilon$ by $K^{-1}$ gives
$$K^{-1}y=K^{-1}X\beta+K^{-1}\varepsilon$$
or
$$z=B\beta+g$$
where $z=K^{-1}y$, $B=K^{-1}X$ and $g=K^{-1}\varepsilon$. Now observe that
$$E(g)=K^{-1}E(\varepsilon)=0$$
$$V(g)=E\{[g-E(g)][g-E(g)]'\}=E(gg')=E(K^{-1}\varepsilon\varepsilon'(K')^{-1})=K^{-1}E(\varepsilon\varepsilon')(K')^{-1}=\sigma^2 K^{-1}\Omega (K')^{-1}=\sigma^2 K^{-1}KK'(K')^{-1}=\sigma^2 I.$$
Thus the elements of $g$ have mean $0$ and they are uncorrelated. So either minimize
$$S(\beta)=g'g=\varepsilon'\Omega^{-1}\varepsilon=(y-X\beta)'\Omega^{-1}(y-X\beta)$$
and get the normal equations
$$(X'\Omega^{-1}X)\hat\beta=X'\Omega^{-1}y$$
$$\hat\beta=(X'\Omega^{-1}X)^{-1}X'\Omega^{-1}y,$$
or, alternatively, apply OLS to the transformed model and obtain the OLSE of $\beta$ as
$$\hat\beta=(B'B)^{-1}B'z=(X'(K')^{-1}K^{-1}X)^{-1}X'(K')^{-1}K^{-1}y=(X'\Omega^{-1}X)^{-1}X'\Omega^{-1}y.$$
This is termed the generalized least squares estimator (GLSE) of $\beta$.

The estimation error of the GLSE is
$$\hat\beta=(B'B)^{-1}B'(B\beta+g)=\beta+(B'B)^{-1}B'g$$
or
$$\hat\beta-\beta=(B'B)^{-1}B'g.$$
Then
$$E(\hat\beta-\beta)=(B'B)^{-1}B'E(g)=0,$$
which shows that the GLSE is an unbiased estimator of $\beta$. The covariance matrix of the GLSE is given by
$$V(\hat\beta)=E\{[\hat\beta-E(\hat\beta)][\hat\beta-E(\hat\beta)]'\}=E[(B'B)^{-1}B'gg'B(B'B)^{-1}]=(B'B)^{-1}B'E(gg')B(B'B)^{-1}.$$
Since
$$E(gg')=K^{-1}E(\varepsilon\varepsilon')(K')^{-1}=\sigma^2 K^{-1}\Omega (K')^{-1}=\sigma^2 K^{-1}KK'(K')^{-1}=\sigma^2 I,$$
it follows that
$$V(\hat\beta)=\sigma^2(B'B)^{-1}B'B(B'B)^{-1}=\sigma^2(B'B)^{-1}=\sigma^2(X'(K')^{-1}K^{-1}X)^{-1}=\sigma^2(X'\Omega^{-1}X)^{-1}.$$
Now we prove that the GLSE is the best linear unbiased estimator of $\beta$.

The Gauss–Markov theorem for the case $V(\varepsilon)=\Omega$

The Gauss–Markov theorem establishes that the generalized least squares (GLS) estimator of $\beta$ given by
$$\hat\beta=(X'\Omega^{-1}X)^{-1}X'\Omega^{-1}y$$
is BLUE (the best linear unbiased estimator). By "best" we mean that $\hat\beta$ minimizes the variance of any linear combination of the estimated coefficients, $\ell'\hat\beta$. We note that
$$E(\hat\beta)=E[(X'\Omega^{-1}X)^{-1}X'\Omega^{-1}y]=(X'\Omega^{-1}X)^{-1}X'\Omega^{-1}E(y)=(X'\Omega^{-1}X)^{-1}X'\Omega^{-1}X\beta=\beta.$$
Thus $\hat\beta$ is an unbiased estimator of $\beta$. The covariance matrix of $\hat\beta$ is given by
$$V(\hat\beta)=[(X'\Omega^{-1}X)^{-1}X'\Omega^{-1}]\,V(y)\,[(X'\Omega^{-1}X)^{-1}X'\Omega^{-1}]'=(X'\Omega^{-1}X)^{-1}X'\Omega^{-1}\,\Omega\,\Omega^{-1}X(X'\Omega^{-1}X)^{-1}=(X'\Omega^{-1}X)^{-1}.$$
Thus
$$\operatorname{Var}(\ell'\hat\beta)=\ell'\,V(\hat\beta)\,\ell=\ell'(X'\Omega^{-1}X)^{-1}\ell.$$
Let $\tilde\beta$ be another unbiased estimator of $\beta$ that is a linear combination of the data. Our goal, then, is to show that
$$\operatorname{Var}(\ell'\tilde\beta)\ge \ell'(X'\Omega^{-1}X)^{-1}\ell$$
with at least one $\ell$ such that
$$\operatorname{Var}(\ell'\tilde\beta)> \ell'(X'\Omega^{-1}X)^{-1}\ell.$$
We first note that we can write any other estimator of $\beta$ that is a linear combination of the data as
$$\tilde\beta=[(X'\Omega^{-1}X)^{-1}X'\Omega^{-1}+B]y+b_0$$
where $B$ is a $p\times n$ matrix and $b_0$ is a $p\times 1$ vector of constants that appropriately adjusts the GLS estimator to form the alternative estimator. Then
$$E(\tilde\beta)=E\{[(X'\Omega^{-1}X)^{-1}X'\Omega^{-1}+B]y+b_0\}=[(X'\Omega^{-1}X)^{-1}X'\Omega^{-1}+B]E(y)+b_0=(X'\Omega^{-1}X)^{-1}X'\Omega^{-1}X\beta+BX\beta+b_0=\beta+BX\beta+b_0.$$
Consequently, $\tilde\beta$ is unbiased if and only if both $b_0=0$ and $BX=0$. The covariance matrix of $\tilde\beta$ is
$$V(\tilde\beta)=V\{[(X'\Omega^{-1}X)^{-1}X'\Omega^{-1}+B]y\}=[(X'\Omega^{-1}X)^{-1}X'\Omega^{-1}+B]\,\Omega\,[(X'\Omega^{-1}X)^{-1}X'\Omega^{-1}+B]'=(X'\Omega^{-1}X)^{-1}+B\Omega B'$$
because $BX=0$, which implies $(BX)'=X'B'=0$. Then
$$\operatorname{Var}(\ell'\tilde\beta)=\ell'\,V(\tilde\beta)\,\ell=\ell'[(X'\Omega^{-1}X)^{-1}+B\Omega B']\ell=\ell'(X'\Omega^{-1}X)^{-1}\ell+\ell'B\Omega B'\ell=\operatorname{Var}(\ell'\hat\beta)+\ell'B\Omega B'\ell.$$
We note that $\Omega$ is a positive definite matrix. Consequently, there exists some nonsingular matrix $K$ such that $\Omega=K'K$. As a result, $B\Omega B'=BK'KB'$ is at least a positive semidefinite matrix; hence $\ell'B\Omega B'\ell\ge 0$. Next note that we can define $\ell^*=KB'\ell$. As a result,
$$\ell'B\Omega B'\ell=\ell^{*\prime}\ell^{*}=\sum_{i=1}^{n}\ell_i^{*2},$$
which must be strictly greater than $0$ for some $\ell\ne 0$ unless $B=0$. Thus, the GLS estimator of $\beta$ is the best linear unbiased estimator.
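The derivation above gives two equivalent routes to the GLSE (minimizing $\varepsilon'\Omega^{-1}\varepsilon$ directly, or running OLS on the whitened model $z=B\beta+g$) and the covariance result $V(\hat\beta)=\sigma^2(X'\Omega^{-1}X)^{-1}$. The sketch below checks both numerically; the AR(1)-style $\Omega$, the sample sizes, and $\sigma^2=1$ are assumptions chosen only for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Assumed setup: n observations, AR(1)-correlated errors with known Omega,
# sigma^2 = 1.
n, p = 30, 2
X = np.column_stack([np.ones(n), np.linspace(0.0, 1.0, n)])
beta = np.array([1.0, 2.0])
Omega = 0.6 ** np.abs(np.subtract.outer(np.arange(n), np.arange(n)))
Omega_inv = np.linalg.inv(Omega)
K = np.linalg.cholesky(Omega)           # K K' = Omega
K_inv = np.linalg.inv(K)

y = X @ beta + K @ rng.normal(size=n)   # one sample with V(eps) = Omega

# Route 1: direct GLSE  (X' Omega^{-1} X)^{-1} X' Omega^{-1} y
beta_gls = np.linalg.solve(X.T @ Omega_inv @ X, X.T @ Omega_inv @ y)

# Route 2: OLS on the whitened model z = B beta + g
z, B = K_inv @ y, K_inv @ X
beta_whitened, *_ = np.linalg.lstsq(B, z, rcond=None)
assert np.allclose(beta_gls, beta_whitened)  # the two routes coincide

# Monte Carlo check that the sampling covariance of the GLSE is
# sigma^2 (X' Omega^{-1} X)^{-1}  (here sigma^2 = 1).
theo_cov = np.linalg.inv(X.T @ Omega_inv @ X)
M = theo_cov @ X.T @ Omega_inv          # linear map from y to beta_hat
reps = 5000
draws = np.array([M @ (X @ beta + K @ rng.normal(size=n))
                  for _ in range(reps)])
emp_cov = np.cov(draws, rowvar=False)
print(np.round(emp_cov, 4))
print(np.round(theo_cov, 4))
```

The empirical covariance of the 5000 replicated estimates agrees with $(X'\Omega^{-1}X)^{-1}$ up to Monte Carlo error, while the two algebraic routes agree to machine precision.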
Weighted least squares estimation

When the $\varepsilon_i$'s are uncorrelated and have unequal variances, the covariance matrix of $\varepsilon$ is
$$V(\varepsilon)=\sigma^2\Omega=\sigma^2\begin{pmatrix}\frac{1}{\omega_1} & 0 & \cdots & 0\\ 0 & \frac{1}{\omega_2} & \cdots & 0\\ \vdots & \vdots & \ddots & \vdots\\ 0 & 0 & \cdots & \frac{1}{\omega_n}\end{pmatrix}.$$
The estimation procedure is usually called weighted least squares. Let $W=\Omega^{-1}$; then the weighted least squares estimator of $\beta$ is obtained by solving the normal equations
$$(X'WX)\hat\beta=X'Wy$$
which gives
$$\hat\beta=(X'WX)^{-1}X'Wy$$
where $\omega_1,\omega_2,\ldots,\omega_n$ are called the weights. The observations with large variances usually have smaller weights than observations with small variances.
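Since $\Omega$ is diagonal here, $W=\Omega^{-1}$ is simply the diagonal matrix of weights $\omega_i$, and $\hat\beta=(X'WX)^{-1}X'Wy$ is a few lines of linear algebra. A minimal sketch, with the particular heteroscedastic variances drawn purely as an illustrative assumption:

```python
import numpy as np

rng = np.random.default_rng(2)

# Heteroscedastic but uncorrelated errors: Omega is diagonal, W = Omega^{-1}.
n = 100
X = np.column_stack([np.ones(n), rng.normal(size=n)])
beta = np.array([3.0, 1.5])
variances = rng.uniform(0.5, 4.0, size=n)   # unequal Var(eps_i), assumed known
eps = rng.normal(scale=np.sqrt(variances))
y = X @ beta + eps

W = np.diag(1.0 / variances)                # weights w_i = 1 / Var(eps_i)
beta_wls = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)
print(beta_wls)
```

Equivalently, one can avoid forming the full $n\times n$ matrix $W$ by scaling row $i$ of $X$ and $y$ by $\sqrt{\omega_i}$ and running ordinary least squares on the scaled data, which is exactly the whitening transformation of the previous section specialized to a diagonal $\Omega$.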