Chapter: Simple Linear Regression (part 6: matrix version)

1 Overview

Simple linear regression model: response variable $Y$, a single independent variable $X$,
\[ Y = \beta_0 + \beta_1 X + \varepsilon. \]
Multiple linear regression model: response $Y$, more than one independent variable $X_1, X_2, \ldots, X_p$,
\[ Y = \beta_0 + \beta_1 X_1 + \beta_2 X_2 + \cdots + \beta_p X_p + \varepsilon. \]
To investigate the multiple regression model, we need matrices.

2 Review of Matrices

2.1 Matrix

A matrix: a rectangular array of elements arranged in rows and columns. An example:
\[
\begin{array}{ccc}
 & \text{Column 1} & \text{Column 2} \\
\text{Row 1} & 100 & 22 \\
\text{Row 2} & 300 & 46 \\
\text{Row 3} & 600 & 82
\end{array}
\]
A matrix with $r$ rows and $c$ columns:
\[ A = \begin{pmatrix}
a_{11} & a_{12} & \cdots & a_{1j} & \cdots & a_{1c} \\
a_{21} & a_{22} & \cdots & a_{2j} & \cdots & a_{2c} \\
\vdots & \vdots & & \vdots & & \vdots \\
a_{i1} & a_{i2} & \cdots & a_{ij} & \cdots & a_{ic} \\
\vdots & \vdots & & \vdots & & \vdots \\
a_{r1} & a_{r2} & \cdots & a_{rj} & \cdots & a_{rc}
\end{pmatrix} \]
Sometimes we denote it as $A = [a_{ij}]$, $i = 1, \ldots, r$; $j = 1, \ldots, c$.
$r$ and $c$ are called the dimension of the matrix.

2.2 Square matrix and vector

Square matrix: equal number of rows and columns, e.g.
\[ \begin{pmatrix} 4 & 7 \\ 3 & 9 \end{pmatrix}, \qquad
\begin{pmatrix} a_{11} & a_{12} & a_{13} \\ a_{21} & a_{22} & a_{23} \\ a_{31} & a_{32} & a_{33} \end{pmatrix}. \]
Vector: a matrix with only one row or one column, e.g.
\[ A = \begin{pmatrix} 4 \\ 7 \\ 10 \end{pmatrix}, \qquad
B = \begin{pmatrix} 15 & 25 & 20 \end{pmatrix}. \]

2.3 Transpose of a matrix and equality of matrices

The transpose of a matrix $A$ is another matrix, denoted by $A'$:
\[ A = \begin{pmatrix} 2 & 5 \\ 7 & 10 \\ 3 & 4 \end{pmatrix}, \qquad
A' = \begin{pmatrix} 2 & 7 & 3 \\ 5 & 10 & 4 \end{pmatrix}. \]
Two matrices are equal if they have the same dimension and all the corresponding elements are equal. Suppose
\[ A = \begin{pmatrix} a_{11} & a_{12} \\ a_{21} & a_{22} \\ a_{31} & a_{32} \end{pmatrix}, \qquad
B = \begin{pmatrix} 7 & 2 \\ 4 & 5 \\ 3 & 9 \end{pmatrix}. \]
If $A = B$, then $a_{11} = 7$, $a_{12} = 2$, $\ldots$

2.4 Matrix addition and subtraction

Adding or subtracting two matrices requires that they have the same dimension.
\[ A = \begin{pmatrix} 1 & 4 \\ 2 & 5 \\ 3 & 6 \end{pmatrix}, \qquad
B = \begin{pmatrix} 1 & 2 \\ 2 & 3 \\ 3 & 4 \end{pmatrix}, \]
\[ A + B = \begin{pmatrix} 1+1 & 4+2 \\ 2+2 & 5+3 \\ 3+3 & 6+4 \end{pmatrix} = \begin{pmatrix} 2 & 6 \\ 4 & 8 \\ 6 & 10 \end{pmatrix}, \qquad
A - B = \begin{pmatrix} 1-1 & 4-2 \\ 2-2 & 5-3 \\ 3-3 & 6-4 \end{pmatrix} = \begin{pmatrix} 0 & 2 \\ 0 & 2 \\ 0 & 2 \end{pmatrix}. \]
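These operations can be checked numerically; a minimal sketch using NumPy (the use of NumPy is an assumption of this note, not part of the original text), with values matching the transpose and addition examples above:

```python
import numpy as np

# Transpose: A is 3x2, so A' is 2x3
A = np.array([[2, 5], [7, 10], [3, 4]])
assert (A.T == np.array([[2, 7, 3], [5, 10, 4]])).all()

# Addition/subtraction require equal dimensions and work elementwise
A2 = np.array([[1, 4], [2, 5], [3, 6]])
B2 = np.array([[1, 2], [2, 3], [3, 4]])
assert (A2 + B2 == np.array([[2, 6], [4, 8], [6, 10]])).all()
assert (A2 - B2 == np.array([[0, 2], [0, 2], [0, 2]])).all()
```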
2.5 Matrix multiplication

Multiplication of a matrix by a scalar:
\[ A = \begin{pmatrix} 2 & 7 \\ 9 & 3 \end{pmatrix}, \qquad
4A = A \cdot 4 = \begin{pmatrix} 8 & 28 \\ 36 & 12 \end{pmatrix}. \]
Multiplication of a matrix by a matrix: if $A$ has dimension $r \times c$ and $B$ has dimension $c \times s$, the product $AB$ is a matrix of dimension $r \times s$ with the element in the $i$th row and $j$th column
\[ \sum_{k=1}^{c} a_{ik} b_{kj}. \]
For example,
\[ \begin{pmatrix} 4 & 2 \\ 5 & 8 \end{pmatrix} \begin{pmatrix} a_1 \\ a_2 \end{pmatrix} = \begin{pmatrix} 4a_1 + 2a_2 \\ 5a_1 + 8a_2 \end{pmatrix}. \]

2.6 Regression examples

It is easy to check
\[ \begin{pmatrix} 1 & X_1 \\ 1 & X_2 \\ \vdots & \vdots \\ 1 & X_n \end{pmatrix} \begin{pmatrix} \beta_0 \\ \beta_1 \end{pmatrix} = \begin{pmatrix} \beta_0 + \beta_1 X_1 \\ \beta_0 + \beta_1 X_2 \\ \vdots \\ \beta_0 + \beta_1 X_n \end{pmatrix}. \]
Let
\[ \mathbf{Y} = \begin{pmatrix} Y_1 \\ \vdots \\ Y_n \end{pmatrix}, \quad
X = \begin{pmatrix} 1 & X_1 \\ 1 & X_2 \\ \vdots & \vdots \\ 1 & X_n \end{pmatrix}, \quad
\boldsymbol\beta = \begin{pmatrix} \beta_0 \\ \beta_1 \end{pmatrix}, \quad
\mathbf{E} = \begin{pmatrix} \varepsilon_1 \\ \varepsilon_2 \\ \vdots \\ \varepsilon_n \end{pmatrix}. \]
Then the regression model
\[ Y_1 = \beta_0 + \beta_1 X_1 + \varepsilon_1, \quad Y_2 = \beta_0 + \beta_1 X_2 + \varepsilon_2, \quad \ldots, \quad Y_n = \beta_0 + \beta_1 X_n + \varepsilon_n \]
can be written as
\[ \mathbf{Y} = X\boldsymbol\beta + \mathbf{E}. \]
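The matrix form of the regression mean can be sketched in code; a NumPy illustration with made-up $X$ values and coefficients (both hypothetical):

```python
import numpy as np

X_vals = np.array([1.0, 2.0, 3.0, 4.0])              # hypothetical X_i
X = np.column_stack([np.ones_like(X_vals), X_vals])  # rows (1, X_i)
beta = np.array([0.5, 2.0])                          # (beta_0, beta_1)'

# X @ beta stacks beta_0 + beta_1 * X_i for every i at once
assert np.allclose(X @ beta, beta[0] + beta[1] * X_vals)
```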
Other calculations:
\[ X'X = \begin{pmatrix} 1 & 1 & \cdots & 1 \\ X_1 & X_2 & \cdots & X_n \end{pmatrix}
\begin{pmatrix} 1 & X_1 \\ 1 & X_2 \\ \vdots & \vdots \\ 1 & X_n \end{pmatrix}
= \begin{pmatrix} n & \sum_i X_i \\ \sum_i X_i & \sum_i X_i^2 \end{pmatrix}, \]
\[ X'\mathbf{Y} = \begin{pmatrix} 1 & 1 & \cdots & 1 \\ X_1 & X_2 & \cdots & X_n \end{pmatrix}
\begin{pmatrix} Y_1 \\ \vdots \\ Y_n \end{pmatrix}
= \begin{pmatrix} \sum_i Y_i \\ \sum_i X_i Y_i \end{pmatrix}, \qquad
\mathbf{Y}'\mathbf{Y} = \sum_i Y_i^2. \]

2.7 Special types of matrices

Symmetric matrix: $A' = A$, e.g.
\[ A = \begin{pmatrix} 1 & 4 & 6 \\ 4 & 2 & 5 \\ 6 & 5 & 3 \end{pmatrix}. \]
Diagonal matrix: a square matrix whose off-diagonal elements are all zeros,
\[ \begin{pmatrix} a_1 & 0 & 0 \\ 0 & a_2 & 0 \\ 0 & 0 & a_3 \end{pmatrix}. \]
Identity matrix:
\[ I = \begin{pmatrix} 1 & 0 & \cdots & 0 \\ 0 & 1 & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & 1 \end{pmatrix}. \]
Facts: for any matrix of appropriate dimension, $AI = A$ and $IB = B$.
Zero vector and unit vector:
\[ \mathbf{0} = \begin{pmatrix} 0 \\ \vdots \\ 0 \end{pmatrix}, \qquad
\mathbf{1} = \begin{pmatrix} 1 \\ \vdots \\ 1 \end{pmatrix}. \]
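The $X'X$, $X'\mathbf{Y}$, and $\mathbf{Y}'\mathbf{Y}$ calculations above, and the fact $AI = A$, can be verified numerically; a NumPy sketch on hypothetical data:

```python
import numpy as np

X_vals = np.array([4.0, 1.0, 2.0, 3.0])   # hypothetical X_i
Y = np.array([2.0, 1.0, 3.0, 5.0])        # hypothetical Y_i
n = len(X_vals)
X = np.column_stack([np.ones(n), X_vals])

XtX = X.T @ X
assert np.allclose(XtX, [[n, X_vals.sum()],
                         [X_vals.sum(), (X_vals ** 2).sum()]])
assert np.allclose(X.T @ Y, [Y.sum(), (X_vals * Y).sum()])
assert np.isclose(Y @ Y, (Y ** 2).sum())
assert np.allclose(XtX @ np.eye(2), XtX)  # AI = A
```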
2.8 Inverse of a square matrix

The inverse of a square matrix $A$ is another square matrix, denoted by $A^{-1}$, such that
\[ AA^{-1} = A^{-1}A = I. \]
Since
\[ \begin{pmatrix} 2 & 4 \\ 3 & 1 \end{pmatrix}\begin{pmatrix} -0.1 & 0.4 \\ 0.3 & -0.2 \end{pmatrix}
= \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix} = I, \]
and likewise in the other order, for
\[ A = \begin{pmatrix} 2 & 4 \\ 3 & 1 \end{pmatrix} \quad \text{we have} \quad
A^{-1} = \begin{pmatrix} -0.1 & 0.4 \\ 0.3 & -0.2 \end{pmatrix}. \]

2.9 Finding the inverse of a matrix

If
\[ A = \begin{pmatrix} a & b \\ c & d \end{pmatrix}, \]
then
\[ A^{-1} = \begin{pmatrix} \frac{d}{D} & \frac{-b}{D} \\ \frac{-c}{D} & \frac{a}{D} \end{pmatrix},
\qquad \text{where } D = ad - bc. \]
For a high-dimensional matrix, the inverse is not easy to calculate by hand.

2.10 Regression example (continued)

The inverse of the matrix
\[ X'X = \begin{pmatrix} n & \sum_i X_i \\ \sum_i X_i & \sum_i X_i^2 \end{pmatrix} \]
has
\[ D = n\sum_i X_i^2 - \Big(\sum_i X_i\Big)^2 = n\sum_i (X_i - \bar X)^2, \]
so
\[ (X'X)^{-1} = \frac{1}{n\sum_i X_i^2 - (\sum_i X_i)^2}
\begin{pmatrix} \sum_i X_i^2 & -\sum_i X_i \\ -\sum_i X_i & n \end{pmatrix}
= \begin{pmatrix}
\frac{1}{n} + \frac{\bar X^2}{\sum_i (X_i - \bar X)^2} & \frac{-\bar X}{\sum_i (X_i - \bar X)^2} \\
\frac{-\bar X}{\sum_i (X_i - \bar X)^2} & \frac{1}{\sum_i (X_i - \bar X)^2}
\end{pmatrix}. \]
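The 2×2 inverse formula in 2.9 is easy to check against a library inverse; a NumPy sketch using the example matrix $A$ above:

```python
import numpy as np

A = np.array([[2.0, 4.0], [3.0, 1.0]])
(a, b), (c, d) = A
D = a * d - b * c                         # determinant; must be nonzero
A_inv = np.array([[d, -b], [-c, a]]) / D  # 2x2 inverse formula

assert np.allclose(A_inv, np.linalg.inv(A))
assert np.allclose(A @ A_inv, np.eye(2))  # A A^{-1} = I
assert np.allclose(A_inv @ A, np.eye(2))  # A^{-1} A = I
```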
2.11 Use of inverse matrix

Suppose we want to solve the two equations
\[ 2y_1 + 4y_2 = 20, \qquad 3y_1 + y_2 = 10. \]
Rewrite the equations in matrix notation:
\[ \begin{pmatrix} 2 & 4 \\ 3 & 1 \end{pmatrix} \begin{pmatrix} y_1 \\ y_2 \end{pmatrix}
= \begin{pmatrix} 20 \\ 10 \end{pmatrix}. \]
So the solution to the equations is
\[ \begin{pmatrix} y_1 \\ y_2 \end{pmatrix}
= \begin{pmatrix} 2 & 4 \\ 3 & 1 \end{pmatrix}^{-1} \begin{pmatrix} 20 \\ 10 \end{pmatrix}
= \begin{pmatrix} -0.1 & 0.4 \\ 0.3 & -0.2 \end{pmatrix} \begin{pmatrix} 20 \\ 10 \end{pmatrix}
= \begin{pmatrix} 2 \\ 4 \end{pmatrix}. \]
Estimating a regression model requires solving linear equations, and the inverse matrix is very useful.

2.12 Other basic facts for matrices

\[ A + B = B + A, \qquad C(A + B) = CA + CB, \]
\[ (A')' = A, \qquad (AB)' = B'A', \]
\[ (A^{-1})^{-1} = A, \qquad (AB)^{-1} = B^{-1}A^{-1}, \qquad (A^{-1})' = (A')^{-1}. \]

3 Random vectors and matrices

A random vector:
\[ \mathbf{Y} = \begin{pmatrix} Y_1 \\ Y_2 \\ Y_3 \end{pmatrix}. \]
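The linear system in 2.11 can be solved the same way in code; a NumPy sketch (in numerical practice `np.linalg.solve` is preferred over forming the inverse explicitly):

```python
import numpy as np

# Solve 2*y1 + 4*y2 = 20, 3*y1 + y2 = 10 as A y = r
A = np.array([[2.0, 4.0], [3.0, 1.0]])
r = np.array([20.0, 10.0])

y = np.linalg.inv(A) @ r   # textbook route: y = A^{-1} r
assert np.allclose(y, [2.0, 4.0])

# Equivalent route without an explicit inverse
assert np.allclose(np.linalg.solve(A, r), y)
```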
Expectation of a random vector:
\[ E\{\mathbf{Y}\} = \begin{pmatrix} E\{Y_1\} \\ E\{Y_2\} \\ E\{Y_3\} \end{pmatrix}. \]
For random vectors
\[ \mathbf{Y} = \begin{pmatrix} Y_1 \\ Y_2 \\ Y_3 \end{pmatrix}, \qquad
\mathbf{Z} = \begin{pmatrix} Z_1 \\ Z_2 \\ Z_3 \end{pmatrix}, \]
we have $E(\mathbf{Y} + \mathbf{Z}) = E\mathbf{Y} + E\mathbf{Z}$.

Variance-covariance matrix of a random vector:
\[ Var\{\mathbf{Y}\} = E\big\{(\mathbf{Y} - E\{\mathbf{Y}\})(\mathbf{Y} - E\{\mathbf{Y}\})'\big\}
= \begin{pmatrix}
Var\{Y_1\} & Cov\{Y_1, Y_2\} & Cov\{Y_1, Y_3\} \\
Cov\{Y_2, Y_1\} & Var\{Y_2\} & Cov\{Y_2, Y_3\} \\
Cov\{Y_3, Y_1\} & Cov\{Y_3, Y_2\} & Var\{Y_3\}
\end{pmatrix}. \]
In the simple linear regression model the errors are uncorrelated, so $Var\{\mathbf{E}\} = \sigma^2 I$. For example, with $n = 3$,
\[ Var\{\mathbf{E}\} = \begin{pmatrix} \sigma^2 & 0 & 0 \\ 0 & \sigma^2 & 0 \\ 0 & 0 & \sigma^2 \end{pmatrix} = \sigma^2 I. \]

3.1 Some basic facts

If a random vector $\mathbf{W}$ equals a random vector $\mathbf{Y}$ multiplied by a constant matrix $A$, $\mathbf{W} = A\mathbf{Y}$, we have
\[ E\{\mathbf{W}\} = A\,E\{\mathbf{Y}\}, \qquad Var\{\mathbf{W}\} = Var\{A\mathbf{Y}\} = A\,Var\{\mathbf{Y}\}\,A'. \]
If $\mathbf{c}$ is a constant vector, then
\[ E(\mathbf{c} + A\mathbf{Y}) = \mathbf{c} + A\,E\mathbf{Y}, \qquad
Var(\mathbf{c} + A\mathbf{Y}) = Var(A\mathbf{Y}) = A\,Var\{\mathbf{Y}\}\,A'. \]
In the simple linear regression model, it follows from the above that $Var\{\mathbf{Y}\} = \sigma^2 I$.
3.2 An illustration

Let $W_1 = Y_1 - Y_2$ and $W_2 = Y_1 + Y_2$, i.e.
\[ \mathbf{W} = \begin{pmatrix} W_1 \\ W_2 \end{pmatrix}
= \begin{pmatrix} 1 & -1 \\ 1 & 1 \end{pmatrix}\begin{pmatrix} Y_1 \\ Y_2 \end{pmatrix}. \]
Then
\[ E\begin{pmatrix} W_1 \\ W_2 \end{pmatrix}
= \begin{pmatrix} E\{Y_1\} - E\{Y_2\} \\ E\{Y_1\} + E\{Y_2\} \end{pmatrix}, \]
\[ Var\begin{pmatrix} W_1 \\ W_2 \end{pmatrix}
= \begin{pmatrix} 1 & -1 \\ 1 & 1 \end{pmatrix}
\begin{pmatrix} Var\{Y_1\} & Cov\{Y_1, Y_2\} \\ Cov\{Y_2, Y_1\} & Var\{Y_2\} \end{pmatrix}
\begin{pmatrix} 1 & 1 \\ -1 & 1 \end{pmatrix}. \]

4 Simple linear regression model (matrix version)

The model with assumptions:
\[ Y_1 = \beta_0 + \beta_1 X_1 + \varepsilon_1, \quad
Y_2 = \beta_0 + \beta_1 X_2 + \varepsilon_2, \quad \ldots, \quad
Y_n = \beta_0 + \beta_1 X_n + \varepsilon_n, \]
1. $E(\varepsilon_i) = 0$,
2. $Var(\varepsilon_i) = \sigma^2$, $Cov(\varepsilon_i, \varepsilon_j) = 0$ for all $i \neq j$,
3. $\varepsilon_i \sim N(0, \sigma^2)$, $i = 1, \ldots, n$, are independent.
Recall the model can be written as $\mathbf{Y} = X\boldsymbol\beta + \mathbf{E}$. Note that $E\{\mathbf{E}\} = \mathbf{0}$ and
\[ Var\{\mathbf{E}\} = \begin{pmatrix}
\sigma^2 & 0 & \cdots & 0 \\
0 & \sigma^2 & \cdots & 0 \\
\vdots & \vdots & \ddots & \vdots \\
0 & 0 & \cdots & \sigma^2
\end{pmatrix} = \sigma^2 I. \]
The assumptions can be rewritten as
1. $E(\mathbf{E}) = \mathbf{0}$,
2. $Var(\mathbf{E}) = \sigma^2 I$,
3. $\mathbf{E} \sim N(\mathbf{0}, \sigma^2 I)$.
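The rule $Var\{A\mathbf{Y}\} = A\,Var\{\mathbf{Y}\}\,A'$ behind the illustration above can be checked by simulation; a NumPy sketch with $W_1 = Y_1 - Y_2$, $W_2 = Y_1 + Y_2$, i.i.d. components, and a made-up $\sigma^2$ (the seed and sample size are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(0)
A = np.array([[1.0, -1.0], [1.0, 1.0]])  # W1 = Y1 - Y2, W2 = Y1 + Y2
sigma2 = 4.0                             # hypothetical variance

# Y has Var{Y} = sigma^2 I, so in theory Var{AY} = sigma^2 A A'
Y = rng.normal(0.0, np.sqrt(sigma2), size=(2, 100_000))
W = A @ Y
emp = np.cov(W)                          # empirical 2x2 covariance of W
assert np.allclose(emp, sigma2 * A @ A.T, atol=0.2)
```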
Thus $E(\mathbf{Y}) = X\boldsymbol\beta$ and $Var(\mathbf{Y}) = \sigma^2 I$. The model (with assumptions 1, 2, and 3) can also be written as
\[ \mathbf{Y} \sim N(X\boldsymbol\beta, \sigma^2 I), \qquad \text{or} \qquad
\mathbf{Y} = X\boldsymbol\beta + \mathbf{E}, \quad \mathbf{E} \sim N(\mathbf{0}, \sigma^2 I). \]

4.1 Least squares estimators $b_0$, $b_1$

The normal equations
\[ n b_0 + b_1 \sum_i X_i = \sum_i Y_i, \qquad
b_0 \sum_i X_i + b_1 \sum_i X_i^2 = \sum_i X_i Y_i \]
can be written as
\[ \begin{pmatrix} n & \sum_i X_i \\ \sum_i X_i & \sum_i X_i^2 \end{pmatrix}
\begin{pmatrix} b_0 \\ b_1 \end{pmatrix}
= \begin{pmatrix} \sum_i Y_i \\ \sum_i X_i Y_i \end{pmatrix}. \]
Recall
\[ X'X = \begin{pmatrix} n & \sum_i X_i \\ \sum_i X_i & \sum_i X_i^2 \end{pmatrix}, \qquad
X'\mathbf{Y} = \begin{pmatrix} \sum_i Y_i \\ \sum_i X_i Y_i \end{pmatrix}. \]
Let $\mathbf{b} = \begin{pmatrix} b_0 \\ b_1 \end{pmatrix}$. Then the normal equations are
\[ X'X\mathbf{b} = X'\mathbf{Y}, \]
and we can find $\mathbf{b}$ by
\[ \mathbf{b} = (X'X)^{-1}X'\mathbf{Y}. \]

4.2 An example

\[ \mathbf{Y} = \begin{pmatrix} 16 \\ 5 \\ 10 \\ 15 \\ 13 \\ 22 \end{pmatrix}; \qquad
X = \begin{pmatrix} 1 & 4 \\ 1 & 1 \\ 1 & 2 \\ 1 & 3 \\ 1 & 3 \\ 1 & 4 \end{pmatrix}. \]
\[ X'X = \begin{pmatrix} 6 & 17 \\ 17 & 55 \end{pmatrix}, \qquad
X'\mathbf{Y} = \begin{pmatrix} 81 \\ 261 \end{pmatrix}, \]
\[ \mathbf{b} = (X'X)^{-1}X'\mathbf{Y}
= \begin{pmatrix} 6 & 17 \\ 17 & 55 \end{pmatrix}^{-1} \begin{pmatrix} 81 \\ 261 \end{pmatrix}
= \frac{1}{41}\begin{pmatrix} 18 \\ 189 \end{pmatrix}
\approx \begin{pmatrix} 0.439 \\ 4.610 \end{pmatrix}. \]

4.3 Fitted values and residuals in matrix form

Denote $H = X(X'X)^{-1}X'$. We have
\[ \hat{\mathbf{Y}} = \begin{pmatrix} \hat Y_1 \\ \hat Y_2 \\ \vdots \\ \hat Y_n \end{pmatrix}
= \begin{pmatrix} b_0 + b_1 X_1 \\ b_0 + b_1 X_2 \\ \vdots \\ b_0 + b_1 X_n \end{pmatrix}
= X\mathbf{b} = X(X'X)^{-1}X'\mathbf{Y}, \]
so
\[ \hat{\mathbf{Y}} = H\mathbf{Y}, \qquad
\mathbf{e} = \mathbf{Y} - \hat{\mathbf{Y}} = \begin{pmatrix} e_1 \\ e_2 \\ \vdots \\ e_n \end{pmatrix}
= (I - H)\mathbf{Y}. \]

4.4 Variance-covariance matrix for residuals e

\[ Var\{\mathbf{e}\} = Var\{(I - H)\mathbf{Y}\} = (I - H)\,Var\{\mathbf{Y}\}\,(I - H)'. \]
Since $Var\{\mathbf{Y}\} = Var\{\boldsymbol\varepsilon\} = \sigma^2 I$ and
\[ H' = \big(X(X'X)^{-1}X'\big)' = X(X'X)^{-1}X' = H, \qquad (I - H)' = I' - H' = I - H, \]
\[ HH = X(X'X)^{-1}X'X(X'X)^{-1}X' = X(X'X)^{-1}X' = H, \]
\[ (I - H)(I - H) = I - 2H + HH = I - H, \]
we get
\[ Var\{\mathbf{e}\} = \sigma^2(I - H), \quad \text{estimated by } \hat\sigma^2(I - H). \]
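The 4.2 numbers and the hat-matrix identities can be reproduced directly; a NumPy sketch with the example data above:

```python
import numpy as np

Y = np.array([16.0, 5.0, 10.0, 15.0, 13.0, 22.0])
X = np.column_stack([np.ones(6), [4.0, 1.0, 2.0, 3.0, 3.0, 4.0]])

XtX = X.T @ X
assert np.allclose(XtX, [[6, 17], [17, 55]])
assert np.allclose(X.T @ Y, [81, 261])

b = np.linalg.solve(XtX, X.T @ Y)  # b = (X'X)^{-1} X'Y
assert np.allclose(b, [18 / 41, 189 / 41])

H = X @ np.linalg.inv(XtX) @ X.T   # hat matrix
assert np.allclose(H @ Y, X @ b)   # fitted values: HY = Xb
assert np.allclose(H @ H, H)       # H is idempotent
assert np.allclose(H, H.T)         # H is symmetric
e = (np.eye(6) - H) @ Y            # residuals e = (I - H)Y
assert np.allclose(e, Y - H @ Y)
```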
4.5 Analysis of variance in matrix form

Let $J$ be the $n \times n$ matrix with all elements equal to 1:
\[ J = \begin{pmatrix} 1 & \cdots & 1 \\ \vdots & & \vdots \\ 1 & \cdots & 1 \end{pmatrix}. \]
Then
\[ SST = \mathbf{Y}'\mathbf{Y} - \frac{1}{n}\mathbf{Y}'J\mathbf{Y}
= \mathbf{Y}'\Big(I - \frac{1}{n}J\Big)\mathbf{Y}. \]
Proof:
\[ SST = \sum_i (Y_i - \bar Y)^2 = \sum_i Y_i^2 - \frac{(\sum_i Y_i)^2}{n}
= \mathbf{Y}'\mathbf{Y} - \frac{1}{n}\mathbf{Y}'J\mathbf{Y}, \]
since $\mathbf{Y}'\mathbf{Y} = \sum_i Y_i^2$ and $\mathbf{Y}'J\mathbf{Y} = (\sum_i Y_i)^2$.
\[ SSE = \sum_i e_i^2 = \mathbf{e}'\mathbf{e} = (\mathbf{Y} - X\mathbf{b})'(\mathbf{Y} - X\mathbf{b})
= \mathbf{Y}'(I - H)\mathbf{Y}. \]
Proof:
\[
\begin{aligned}
SSE = (\mathbf{Y} - X\mathbf{b})'(\mathbf{Y} - X\mathbf{b})
&= \mathbf{Y}'\mathbf{Y} - 2\mathbf{b}'X'\mathbf{Y} + \mathbf{b}'X'X\mathbf{b} \\
&= \mathbf{Y}'\mathbf{Y} - 2\mathbf{b}'X'\mathbf{Y} + \mathbf{b}'X'X(X'X)^{-1}X'\mathbf{Y} \\
&= \mathbf{Y}'\mathbf{Y} - 2\mathbf{b}'X'\mathbf{Y} + \mathbf{b}'X'\mathbf{Y}
= \mathbf{Y}'\mathbf{Y} - \mathbf{b}'X'\mathbf{Y} \\
&= \mathbf{Y}'\mathbf{Y} - \mathbf{Y}'X(X'X)^{-1}X'\mathbf{Y} = \mathbf{Y}'(I - H)\mathbf{Y}.
\end{aligned}
\]
\[ SSR = SST - SSE = \mathbf{Y}'\Big(H - \frac{1}{n}J\Big)\mathbf{Y}. \]
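The ANOVA quadratic forms can be verified numerically; a NumPy sketch reusing the 4.2 example data:

```python
import numpy as np

Y = np.array([16.0, 5.0, 10.0, 15.0, 13.0, 22.0])
X = np.column_stack([np.ones(6), [4.0, 1.0, 2.0, 3.0, 3.0, 4.0]])
n = len(Y)

J = np.ones((n, n))                        # n x n matrix of ones
H = X @ np.linalg.inv(X.T @ X) @ X.T       # hat matrix
I = np.eye(n)

SST = Y @ (I - J / n) @ Y
SSE = Y @ (I - H) @ Y
SSR = Y @ (H - J / n) @ Y

b = np.linalg.solve(X.T @ X, X.T @ Y)
assert np.isclose(SST, ((Y - Y.mean()) ** 2).sum())   # sum of squared deviations
assert np.isclose(SSE, ((Y - X @ b) ** 2).sum())      # sum of squared residuals
assert np.isclose(SST, SSE + SSR)                     # ANOVA decomposition
```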
4.6 Variance-covariance matrix for b

\[ \mathbf{b} = (X'X)^{-1}X'\mathbf{Y}, \]
so
\[ Var\{\mathbf{b}\} = (X'X)^{-1}X'\,Var\{\mathbf{Y}\}\,X(X'X)^{-1}
= \sigma^2 (X'X)^{-1}
= \sigma^2 \begin{pmatrix}
\frac{1}{n} + \frac{\bar X^2}{\sum_i (X_i - \bar X)^2} & \frac{-\bar X}{\sum_i (X_i - \bar X)^2} \\
\frac{-\bar X}{\sum_i (X_i - \bar X)^2} & \frac{1}{\sum_i (X_i - \bar X)^2}
\end{pmatrix}, \]
where $\sigma^2$ can be estimated by $\hat\sigma^2 = MSE$.

4.7 Variance for the predicted value

For a given $X$, write $\mathbf{x} = \begin{pmatrix} 1 \\ X \end{pmatrix}$, so that $\hat Y = b_0 + b_1 X = \mathbf{x}'\mathbf{b}$. Then
\[ Var\{\hat Y\} = \mathbf{x}'\,Var\{\mathbf{b}\}\,\mathbf{x} = \sigma^2\,\mathbf{x}'(X'X)^{-1}\mathbf{x}. \]
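The closed-form entries of $(X'X)^{-1}$ in 4.6 and the predicted-value variance in 4.7 can be checked on the 4.2 example; a NumPy sketch (the prediction point $X = 3$ is an arbitrary choice for illustration):

```python
import numpy as np

Y = np.array([16.0, 5.0, 10.0, 15.0, 13.0, 22.0])
Xv = np.array([4.0, 1.0, 2.0, 3.0, 3.0, 4.0])
n = len(Y)
X = np.column_stack([np.ones(n), Xv])

XtX_inv = np.linalg.inv(X.T @ X)
b = XtX_inv @ X.T @ Y
MSE = ((Y - X @ b) ** 2).sum() / (n - 2)  # estimate of sigma^2

# Entries of (X'X)^{-1} match the closed form in section 4.6
Sxx = ((Xv - Xv.mean()) ** 2).sum()
assert np.isclose(XtX_inv[0, 0], 1 / n + Xv.mean() ** 2 / Sxx)
assert np.isclose(XtX_inv[0, 1], -Xv.mean() / Sxx)
assert np.isclose(XtX_inv[1, 1], 1 / Sxx)

var_b = MSE * XtX_inv                     # estimated Var{b}
x = np.array([1.0, 3.0])                  # x = (1, X)' at X = 3
var_pred = x @ var_b @ x                  # estimated Var{Y-hat}
assert var_pred > 0
```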