Introduction to Matrices and Matrix Approach to Simple Linear Regression


Matrices

Definition: A matrix is a rectangular array of numbers or symbolic elements. In many applications, the rows of a matrix will represent individual cases (people, items, plants, animals, ...) and the columns will represent attributes or characteristics. The dimension of a matrix is its number of rows and columns, often denoted as r x c (r rows by c columns). A matrix can be represented in full form or in abbreviated form:

$$
\mathbf{A} = \begin{bmatrix}
a_{11} & a_{12} & \cdots & a_{1j} & \cdots & a_{1c} \\
a_{21} & a_{22} & \cdots & a_{2j} & \cdots & a_{2c} \\
\vdots & \vdots & & \vdots & & \vdots \\
a_{i1} & a_{i2} & \cdots & a_{ij} & \cdots & a_{ic} \\
\vdots & \vdots & & \vdots & & \vdots \\
a_{r1} & a_{r2} & \cdots & a_{rj} & \cdots & a_{rc}
\end{bmatrix}
= \left[\, a_{ij} \,\right], \qquad i = 1, \ldots, r; \; j = 1, \ldots, c
$$
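A minimal NumPy sketch of building a matrix and reading off its dimension; the particular 3 x 2 array is made up for illustration.

```python
import numpy as np

# A hypothetical 3 x 2 matrix: rows are cases, columns are attributes.
A = np.array([[1.0, 4.0],
              [2.0, 5.0],
              [3.0, 6.0]])

r, c = A.shape                          # dimension r x c
print(A)
print("dimension:", r, "x", c)          # 3 x 2
print("element a_21 (row 2, col 1):", A[1, 0])
```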

Special Types of Matrices

Square Matrix: number of rows = number of columns.

Vector: a matrix with one column (a column vector) or one row (a row vector).

Transpose: the matrix formed by interchanging the rows and columns of a matrix (a "prime" is used to denote the transpose). In general, if $\mathbf{H} = [\,h_{ij}\,]$ is $r \times c$, then its transpose is

$$ \mathbf{H}' = [\,h_{ji}\,], \qquad \text{of dimension } c \times r . $$

Matrix Equality: matrices of the same dimension whose corresponding elements in all cells are equal. For example, if

$$ \mathbf{A} = \begin{bmatrix} a_{11} & a_{12} \\ a_{21} & a_{22} \end{bmatrix} \quad \text{and} \quad \mathbf{B} = \begin{bmatrix} b_{11} & b_{12} \\ b_{21} & b_{22} \end{bmatrix}, $$

then $\mathbf{A} = \mathbf{B}$ means $b_{11} = a_{11}$, $b_{12} = a_{12}$, $b_{21} = a_{21}$, $b_{22} = a_{22}$.
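A short sketch of the transpose and matrix-equality ideas above, with made-up entries.

```python
import numpy as np

G = np.array([[2, 5],
              [3, -1],
              [0, 7]])            # 3 x 2 matrix

G_t = G.T                          # transpose: rows and columns interchanged
print(G_t.shape)                   # (2, 3)

# Matrix equality: same dimension and all corresponding elements equal.
B = np.array([[2, 5],
              [3, -1],
              [0, 7]])
print(np.array_equal(G, B))        # True
```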

Regression Example - Toluca Data

Response vector (work hours) and design matrix (a column of 1s and the lot sizes), n = 25:

$$ \mathbf{Y}' = \begin{bmatrix} 399 & 121 & 221 & 376 & 361 & 224 & 546 & 352 & 353 & 157 & 160 & 252 & 389 & 113 & 435 & 420 & 212 & 268 & 377 & 421 & 273 & 468 & 244 & 342 & 323 \end{bmatrix} $$

$$ \mathbf{X}' = \begin{bmatrix} 1 & 1 & 1 & 1 & 1 & 1 & 1 & 1 & 1 & 1 & 1 & 1 & 1 & 1 & 1 & 1 & 1 & 1 & 1 & 1 & 1 & 1 & 1 & 1 & 1 \\ 80 & 30 & 50 & 90 & 70 & 60 & 120 & 80 & 100 & 50 & 40 & 70 & 90 & 20 & 110 & 100 & 30 & 50 & 90 & 110 & 30 & 90 & 40 & 80 & 70 \end{bmatrix} $$
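A sketch of how the Toluca response vector and design matrix above could be set up with NumPy; the column of ones carries the intercept.

```python
import numpy as np

# Toluca Company data: lot size (X) and work hours (Y), n = 25.
lot_size = np.array([ 80,  30,  50,  90,  70,  60, 120,  80, 100,  50,  40,  70,
                      90,  20, 110, 100,  30,  50,  90, 110,  30,  90,  40,  80,  70])
work_hours = np.array([399, 121, 221, 376, 361, 224, 546, 352, 353, 157, 160, 252,
                       389, 113, 435, 420, 212, 268, 377, 421, 273, 468, 244, 342, 323])

Y = work_hours.reshape(-1, 1)                             # n x 1 response vector
X = np.column_stack([np.ones(len(lot_size)), lot_size])   # n x 2 design matrix [1, X_i]

print(Y.shape, X.shape)    # (25, 1) (25, 2)
print(X[:3])               # first three rows: [1, 80], [1, 30], [1, 50]
```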

Matrix Addition and Subtraction

Addition and subtraction are defined for matrices of common dimension and are carried out element by element. If

$$ \mathbf{A} = [\,a_{ij}\,]_{r \times c} \quad \text{and} \quad \mathbf{B} = [\,b_{ij}\,]_{r \times c}, $$

then

$$ \mathbf{A} + \mathbf{B} = [\,a_{ij} + b_{ij}\,]_{r \times c} \qquad \mathbf{A} - \mathbf{B} = [\,a_{ij} - b_{ij}\,]_{r \times c} . $$

Regression example: writing the model as $\mathbf{Y} = \mathrm{E}\{\mathbf{Y}\} + \boldsymbol{\varepsilon}$, the error vector is the difference $\boldsymbol{\varepsilon} = \mathbf{Y} - \mathrm{E}\{\mathbf{Y}\}$.

Matrix Multiplication

Multiplication of a matrix by a scalar (a single number) multiplies every element by that number: $k\mathbf{A} = [\,k\,a_{ij}\,]$. For example,

$$ k = 3, \quad \mathbf{A} = \begin{bmatrix} 2 & 1 \\ 2 & 7 \end{bmatrix} \quad \Rightarrow \quad k\mathbf{A} = \begin{bmatrix} 3(2) & 3(1) \\ 3(2) & 3(7) \end{bmatrix} = \begin{bmatrix} 6 & 3 \\ 6 & 21 \end{bmatrix} . $$

Multiplication of a matrix by a matrix requires #cols(A) = #rows(B). If $c_A = r_B$, then $\mathbf{A}_{r_A \times c_A}\,\mathbf{B}_{r_B \times c_B} = \mathbf{AB}_{r_A \times c_B}$, where the $(i,j)$-th element of $\mathbf{AB}$ is the sum of the products of the elements of the $i$-th row of $\mathbf{A}$ and the $j$-th column of $\mathbf{B}$:

$$ \mathbf{AB} = [\,ab_{ij}\,], \qquad ab_{ij} = \sum_{k=1}^{c_A} a_{ik}\,b_{kj}, \quad i = 1, \ldots, r_A; \; j = 1, \ldots, c_B . $$

Example:

$$ \mathbf{A} = \begin{bmatrix} 2 & 5 \\ 3 & -1 \\ 0 & 7 \end{bmatrix}_{3 \times 2} \qquad \mathbf{B} = \begin{bmatrix} 3 & -1 \\ 2 & 4 \end{bmatrix}_{2 \times 2} $$

$$ \mathbf{AB} = \begin{bmatrix} 2(3)+5(2) & 2(-1)+5(4) \\ 3(3)+(-1)(2) & 3(-1)+(-1)(4) \\ 0(3)+7(2) & 0(-1)+7(4) \end{bmatrix} = \begin{bmatrix} 16 & 18 \\ 7 & -7 \\ 14 & 28 \end{bmatrix}_{3 \times 2} $$
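The products above can be checked directly; a minimal sketch using the same entries.

```python
import numpy as np

k = 3
A2 = np.array([[2, 1],
               [2, 7]])
print(k * A2)              # [[ 6  3], [ 6 21]]

A = np.array([[2,  5],
              [3, -1],
              [0,  7]])    # 3 x 2
B = np.array([[3, -1],
              [2,  4]])    # 2 x 2

AB = A @ B                 # (3 x 2)(2 x 2) = 3 x 2
print(AB)                  # [[16 18], [ 7 -7], [14 28]]
```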

Matrix Multiplication Examples - I

Simultaneous equations (2 equations, $x_1$ and $x_2$ unknown):

$$ \begin{aligned} a_{11}x_1 + a_{12}x_2 &= y_1 \\ a_{21}x_1 + a_{22}x_2 &= y_2 \end{aligned} \qquad \Longleftrightarrow \qquad \begin{bmatrix} a_{11} & a_{12} \\ a_{21} & a_{22} \end{bmatrix} \begin{bmatrix} x_1 \\ x_2 \end{bmatrix} = \begin{bmatrix} y_1 \\ y_2 \end{bmatrix} \qquad \Longleftrightarrow \qquad \mathbf{A}\mathbf{X} = \mathbf{Y} $$

Sum of squares: for an $n \times 1$ column vector $\mathbf{Y}$,

$$ \mathbf{Y}'\mathbf{Y} = \sum_{i=1}^{n} Y_i^2 . $$

Regression equation (expected values):

$$ \mathrm{E}\{\mathbf{Y}\} = \mathbf{X}\boldsymbol{\beta} = \begin{bmatrix} 1 & X_1 \\ \vdots & \vdots \\ 1 & X_n \end{bmatrix} \begin{bmatrix} \beta_0 \\ \beta_1 \end{bmatrix} = \begin{bmatrix} \beta_0 + \beta_1 X_1 \\ \vdots \\ \beta_0 + \beta_1 X_n \end{bmatrix} $$
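A sketch of the two vector forms above with made-up numbers: $\mathbf{Y}'\mathbf{Y}$ gives the sum of squares, and $\mathbf{X}\boldsymbol{\beta}$ stacks the expected values $\beta_0 + \beta_1 X_i$.

```python
import numpy as np

Y = np.array([[4.0], [2.0], [3.0]])          # hypothetical 3 x 1 response vector
print((Y.T @ Y).item())                      # sum of squares: 16 + 4 + 9 = 29

X = np.array([[1.0, 10.0],
              [1.0, 20.0],
              [1.0, 30.0]])                  # hypothetical design matrix
beta = np.array([[5.0], [0.5]])              # hypothetical [beta0, beta1]'
print(X @ beta)                              # expected values beta0 + beta1 * X_i
```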

Matrix Multiplication Examples - II

Matrices used in simple linear regression (that generalize to multiple regression):

$$ \mathbf{X}'\mathbf{X} = \begin{bmatrix} n & \sum X_i \\ \sum X_i & \sum X_i^2 \end{bmatrix} \qquad \mathbf{X}'\mathbf{Y} = \begin{bmatrix} \sum Y_i \\ \sum X_i Y_i \end{bmatrix} \qquad \mathbf{Y}'\mathbf{Y} = \sum Y_i^2 $$

$$ \mathbf{X}\boldsymbol{\beta} = \begin{bmatrix} \beta_0 + \beta_1 X_1 \\ \vdots \\ \beta_0 + \beta_1 X_n \end{bmatrix} $$
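These regression matrices can be verified numerically; a sketch using the Toluca data from the earlier slide (the entries agree with $n$, $\sum X_i$, $\sum X_i^2$, $\sum Y_i$, $\sum X_i Y_i$).

```python
import numpy as np

lot_size = np.array([ 80,  30,  50,  90,  70,  60, 120,  80, 100,  50,  40,  70,
                      90,  20, 110, 100,  30,  50,  90, 110,  30,  90,  40,  80,  70])
work_hours = np.array([399, 121, 221, 376, 361, 224, 546, 352, 353, 157, 160, 252,
                       389, 113, 435, 420, 212, 268, 377, 421, 273, 468, 244, 342, 323])

Y = work_hours.reshape(-1, 1)
X = np.column_stack([np.ones_like(lot_size, dtype=float), lot_size])

XtX = X.T @ X      # [[n, sum X], [sum X, sum X^2]]
XtY = X.T @ Y      # [[sum Y], [sum X*Y]]
YtY = Y.T @ Y      # [[sum Y^2]]
print(XtX)
print(XtY)
print(YtY)
```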

Special Matrix Types

Symmetric Matrix: a square matrix whose transpose equals itself, $\mathbf{A} = \mathbf{A}'$ (so $a_{ij} = a_{ji}$ for all $i, j$).

Diagonal Matrix: a square matrix with all off-diagonal elements equal to 0. Note: diagonal matrices are symmetric (but not vice versa).

Identity Matrix: a diagonal matrix with all diagonal elements equal to 1 (it acts like multiplying a scalar by 1):

$$ \mathbf{I} = \begin{bmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & 1 \end{bmatrix} \qquad \mathbf{I}\mathbf{A} = \mathbf{A}\mathbf{I} = \mathbf{A} $$

Scalar Matrix: a diagonal matrix with all diagonal elements equal to a single number $k$, that is, $k\mathbf{I}$.

1-vector, J matrix, and zero-vector:

$$ \mathbf{1} = \begin{bmatrix} 1 \\ \vdots \\ 1 \end{bmatrix}_{r \times 1} \qquad \mathbf{J} = \begin{bmatrix} 1 & \cdots & 1 \\ \vdots & & \vdots \\ 1 & \cdots & 1 \end{bmatrix}_{r \times r} \qquad \mathbf{0} = \begin{bmatrix} 0 \\ \vdots \\ 0 \end{bmatrix}_{r \times 1} $$

Note: $\mathbf{1}'_r \mathbf{1}_r = r$ and $\mathbf{1}_r \mathbf{1}'_r = \mathbf{J}_{r \times r}$.
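A short sketch of the identity, scalar, 1-vector, and J matrices above; the value k = 5 and the test matrix are made up.

```python
import numpy as np

n = 4
I = np.eye(n)                    # identity matrix
K = 5 * np.eye(n)                # scalar matrix k*I with hypothetical k = 5
one = np.ones((n, 1))            # 1-vector (column of ones)
J = one @ one.T                  # J = 1 1' : n x n matrix of ones

A = np.arange(1.0, 17.0).reshape(n, n)
print(np.allclose(I @ A, A) and np.allclose(A @ I, A))   # True: IA = AI = A
print((one.T @ one).item())                              # 1'1 = n = 4
print(np.array_equal(J, np.ones((n, n))))                # True
```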

Linear Dependence and Rank of a Matrix

Linear Dependence: when a linear function of the columns (rows) of a matrix produces a zero vector, i.e., one or more columns (rows) can be written as a linear function of the other columns (rows).

Rank of a matrix: the number of linearly independent columns (rows) of the matrix. The rank cannot exceed the minimum of the number of rows and columns of the matrix: $\mathrm{rank}(\mathbf{A}) \le \min(r_A, c_A)$. A matrix is full rank if $\mathrm{rank}(\mathbf{A}) = \min(r_A, c_A)$.

Example: if one column of $\mathbf{A}$ is a multiple of another, the columns of $\mathbf{A}$ are linearly dependent and $\mathrm{rank}(\mathbf{A})$ is less than full; if no column of $\mathbf{B}$ can be written as a linear function of the others, the columns of $\mathbf{B}$ are linearly independent and $\mathbf{B}$ is full rank.
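Rank and linear dependence can be checked numerically; a sketch with made-up matrices (one with a column that is a multiple of another, one full rank).

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 6.0]])       # second column = 2 * first column (dependent)
B = np.array([[1.0, 2.0],
              [3.0, 4.0]])       # columns linearly independent

print(np.linalg.matrix_rank(A))  # 1  -> not full rank
print(np.linalg.matrix_rank(B))  # 2  -> full rank
```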

Matrix Inverse

Note: for scalars (except 0), when we multiply a number by its reciprocal, we get 1: $2(1/2) = 1$, $x(1/x) = x\,x^{-1} = 1$. In matrix form, if $\mathbf{A}$ is a square matrix of full rank (all rows and columns are linearly independent), then $\mathbf{A}$ has an inverse $\mathbf{A}^{-1}$ such that:

$$ \mathbf{A}^{-1}\mathbf{A} = \mathbf{A}\mathbf{A}^{-1} = \mathbf{I} $$

For example,

$$ \mathbf{A} = \begin{bmatrix} 8 & 4 \\ 3 & 6 \end{bmatrix} \qquad \mathbf{A}^{-1} = \begin{bmatrix} \tfrac{6}{36} & -\tfrac{4}{36} \\ -\tfrac{3}{36} & \tfrac{8}{36} \end{bmatrix} \qquad \mathbf{A}\mathbf{A}^{-1} = \begin{bmatrix} \tfrac{48-12}{36} & \tfrac{-32+32}{36} \\ \tfrac{18-18}{36} & \tfrac{-12+48}{36} \end{bmatrix} = \begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix} = \mathbf{I} $$

The inverse of a diagonal matrix (with nonzero diagonal elements) is the diagonal matrix of reciprocals, e.g.

$$ \mathbf{B} = \begin{bmatrix} 4 & 0 & 0 \\ 0 & 2 & 0 \\ 0 & 0 & 6 \end{bmatrix} \qquad \mathbf{B}^{-1} = \begin{bmatrix} 1/4 & 0 & 0 \\ 0 & 1/2 & 0 \\ 0 & 0 & 1/6 \end{bmatrix} \qquad \mathbf{B}\mathbf{B}^{-1} = \begin{bmatrix} 4/4 & 0 & 0 \\ 0 & 2/2 & 0 \\ 0 & 0 & 6/6 \end{bmatrix} = \mathbf{I} $$

Computing the Inverse of a 2x2 Matrix

$$ \mathbf{A} = \begin{bmatrix} a_{11} & a_{12} \\ a_{21} & a_{22} \end{bmatrix} \quad \text{full rank (columns/rows are linearly independent)} $$

Determinant of $\mathbf{A}$: $|\mathbf{A}| = a_{11}a_{22} - a_{12}a_{21}$.

Note: if $\mathbf{A}$ is not full rank, then for some value $k$ one column is $k$ times the other, say $a_{12} = k\,a_{11}$ and $a_{22} = k\,a_{21}$, so

$$ |\mathbf{A}| = a_{11}(k\,a_{21}) - (k\,a_{11})a_{21} = 0 . $$

Thus $\mathbf{A}^{-1}$ does not exist if $\mathbf{A}$ is not full rank. When $\mathbf{A}$ is full rank,

$$ \mathbf{A}^{-1} = \frac{1}{|\mathbf{A}|} \begin{bmatrix} a_{22} & -a_{12} \\ -a_{21} & a_{11} \end{bmatrix} . $$

While there are rules for general r x r matrices, we will use computers to solve them.

Regression example:

$$ \mathbf{X}'\mathbf{X} = \begin{bmatrix} n & \sum X_i \\ \sum X_i & \sum X_i^2 \end{bmatrix} \quad \Rightarrow \quad (\mathbf{X}'\mathbf{X})^{-1} = \frac{1}{n\sum X_i^2 - \left(\sum X_i\right)^2} \begin{bmatrix} \sum X_i^2 & -\sum X_i \\ -\sum X_i & n \end{bmatrix} $$

Note: $n\sum X_i^2 - \left(\sum X_i\right)^2 = n\sum (X_i - \bar{X})^2$.
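A sketch comparing the 2x2 formula above with np.linalg.inv, using the example matrix from the previous slide.

```python
import numpy as np

A = np.array([[8.0, 4.0],
              [3.0, 6.0]])

det = A[0, 0] * A[1, 1] - A[0, 1] * A[1, 0]          # |A| = 8*6 - 4*3 = 36
A_inv = (1.0 / det) * np.array([[ A[1, 1], -A[0, 1]],
                                [-A[1, 0],  A[0, 0]]])

print(np.allclose(A_inv, np.linalg.inv(A)))          # True
print(np.allclose(A @ A_inv, np.eye(2)))             # A A^{-1} = I
```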

Use of the Inverse Matrix - Solving Simultaneous Equations

$$ \mathbf{A}\mathbf{Y} = \mathbf{C} $$

where $\mathbf{A}$ and $\mathbf{C}$ are matrices of constants and $\mathbf{Y}$ is a matrix of unknowns. Pre-multiplying both sides by $\mathbf{A}^{-1}$ (assuming $\mathbf{A}$ is square and full rank):

$$ \mathbf{A}^{-1}\mathbf{A}\mathbf{Y} = \mathbf{A}^{-1}\mathbf{C} \quad \Rightarrow \quad \mathbf{Y} = \mathbf{A}^{-1}\mathbf{C} . $$

For a 2x2 system of two equations in $y_1$ and $y_2$, compute $|\mathbf{A}|$, form $\mathbf{A}^{-1}$ from the 2x2 formula of the previous slide, and multiply $\mathbf{A}^{-1}\mathbf{C}$ (a worked system appears in the sketch below). Note the wisdom of waiting to divide by $|\mathbf{A}|$ until the end of the calculation!
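A sketch of the procedure on a 2x2 system with $|\mathbf{A}| = 84$; the particular coefficients and right-hand side are assumed for illustration. The system is solved both by the explicit inverse and by np.linalg.solve.

```python
import numpy as np

# Hypothetical system, assumed for illustration:
#   12*y1 +  6*y2 = 48
#   10*y1 + 12*y2 = 68
A = np.array([[12.0,  6.0],
              [10.0, 12.0]])
C = np.array([[48.0],
              [68.0]])

det = A[0, 0] * A[1, 1] - A[0, 1] * A[1, 0]          # |A| = 144 - 60 = 84
Y = (1.0 / det) * np.array([[ A[1, 1], -A[0, 1]],
                            [-A[1, 0],  A[0, 0]]]) @ C
print(Y.ravel())                                     # [2. 4.]
print(np.linalg.solve(A, C).ravel())                 # same answer, preferred numerically
```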

Useful Matrix Results

All rules assume that the matrices are conformable to the operations:

Addition rules: $\mathbf{A} + \mathbf{B} = \mathbf{B} + \mathbf{A}$; $(\mathbf{A} + \mathbf{B}) + \mathbf{C} = \mathbf{A} + (\mathbf{B} + \mathbf{C})$.

Multiplication rules: $(\mathbf{AB})\mathbf{C} = \mathbf{A}(\mathbf{BC})$; $\mathbf{C}(\mathbf{A} + \mathbf{B}) = \mathbf{CA} + \mathbf{CB}$; $k(\mathbf{A} + \mathbf{B}) = k\mathbf{A} + k\mathbf{B}$, $k$ a scalar.

Transpose rules: $(\mathbf{A}')' = \mathbf{A}$; $(\mathbf{A} + \mathbf{B})' = \mathbf{A}' + \mathbf{B}'$; $(\mathbf{AB})' = \mathbf{B}'\mathbf{A}'$; $(\mathbf{ABC})' = \mathbf{C}'\mathbf{B}'\mathbf{A}'$.

Inverse rules (full rank, square matrices): $(\mathbf{AB})^{-1} = \mathbf{B}^{-1}\mathbf{A}^{-1}$; $(\mathbf{ABC})^{-1} = \mathbf{C}^{-1}\mathbf{B}^{-1}\mathbf{A}^{-1}$; $(\mathbf{A}^{-1})^{-1} = \mathbf{A}$; $(\mathbf{A}')^{-1} = (\mathbf{A}^{-1})'$.

Random Vectors and Matrices

Shown for the case n = 3; this generalizes to any n. Random variables: $Y_1, Y_2, Y_3$.

$$ \mathbf{Y} = \begin{bmatrix} Y_1 \\ Y_2 \\ Y_3 \end{bmatrix} \qquad \text{Expectation:} \quad \mathrm{E}\{\mathbf{Y}\} = \begin{bmatrix} \mathrm{E}\{Y_1\} \\ \mathrm{E}\{Y_2\} \\ \mathrm{E}\{Y_3\} \end{bmatrix} $$

In general, for an $n \times p$ random matrix: $\mathrm{E}\{\mathbf{Y}\} = [\,\mathrm{E}\{Y_{ij}\}\,]$, $i = 1, \ldots, n$; $j = 1, \ldots, p$.

Variance-covariance matrix for a random vector:

$$ \boldsymbol{\sigma}^2\{\mathbf{Y}\} = \mathrm{E}\left\{ \left(\mathbf{Y} - \mathrm{E}\{\mathbf{Y}\}\right)\left(\mathbf{Y} - \mathrm{E}\{\mathbf{Y}\}\right)' \right\} = \begin{bmatrix} \sigma^2\{Y_1\} & \sigma\{Y_1, Y_2\} & \sigma\{Y_1, Y_3\} \\ \sigma\{Y_2, Y_1\} & \sigma^2\{Y_2\} & \sigma\{Y_2, Y_3\} \\ \sigma\{Y_3, Y_1\} & \sigma\{Y_3, Y_2\} & \sigma^2\{Y_3\} \end{bmatrix} $$
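A sketch of the expectation vector and variance-covariance matrix estimated from simulated draws of a random vector; the mean and covariance used to simulate are made up.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate many draws of a 3 x 1 random vector with a chosen (hypothetical) covariance.
mu = np.array([1.0, 2.0, 3.0])
Sigma = np.array([[4.0, 1.0, 0.0],
                  [1.0, 2.0, 0.5],
                  [0.0, 0.5, 1.0]])
draws = rng.multivariate_normal(mu, Sigma, size=200_000)   # rows are realizations of Y'

print(draws.mean(axis=0))           # approximates E{Y} = mu
print(np.cov(draws, rowvar=False))  # approximates sigma^2{Y} = Sigma
```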

Linear Regression Example (n = 3)

The error terms are assumed to be independent, with mean 0 and constant variance $\sigma^2$: $\mathrm{E}\{\varepsilon_i\} = 0$, $\sigma^2\{\varepsilon_i\} = \sigma^2$, $\sigma\{\varepsilon_i, \varepsilon_j\} = 0$ for $i \ne j$. Then

$$ \mathrm{E}\{\boldsymbol{\varepsilon}\} = \begin{bmatrix} 0 \\ 0 \\ 0 \end{bmatrix} \qquad \boldsymbol{\sigma}^2\{\boldsymbol{\varepsilon}\} = \begin{bmatrix} \sigma^2 & 0 & 0 \\ 0 & \sigma^2 & 0 \\ 0 & 0 & \sigma^2 \end{bmatrix} = \sigma^2 \mathbf{I}_3 $$

With $\mathbf{Y} = \mathbf{X}\boldsymbol{\beta} + \boldsymbol{\varepsilon}$ and $\mathbf{X}\boldsymbol{\beta}$ a vector of constants:

$$ \mathrm{E}\{\mathbf{Y}\} = \mathrm{E}\{\mathbf{X}\boldsymbol{\beta} + \boldsymbol{\varepsilon}\} = \mathbf{X}\boldsymbol{\beta} + \mathrm{E}\{\boldsymbol{\varepsilon}\} = \mathbf{X}\boldsymbol{\beta} \qquad \boldsymbol{\sigma}^2\{\mathbf{Y}\} = \boldsymbol{\sigma}^2\{\mathbf{X}\boldsymbol{\beta} + \boldsymbol{\varepsilon}\} = \boldsymbol{\sigma}^2\{\boldsymbol{\varepsilon}\} = \sigma^2 \mathbf{I} $$

Mean and Variance of Linear Functions of Y

Let $\mathbf{A}$ be a $k \times n$ matrix of fixed constants and $\mathbf{Y}$ an $n \times 1$ random vector. Then $\mathbf{W} = \mathbf{A}\mathbf{Y}$ is a $k \times 1$ random vector with elements $W_i = a_{i1}Y_1 + \cdots + a_{in}Y_n$.

$$ \mathrm{E}\{\mathbf{W}\} = \mathrm{E}\{\mathbf{A}\mathbf{Y}\} = \mathbf{A}\,\mathrm{E}\{\mathbf{Y}\} $$

since $\mathrm{E}\{W_i\} = a_{i1}\mathrm{E}\{Y_1\} + \cdots + a_{in}\mathrm{E}\{Y_n\}$ for each element.

$$ \boldsymbol{\sigma}^2\{\mathbf{W}\} = \mathrm{E}\left\{ \left(\mathbf{A}\mathbf{Y} - \mathbf{A}\mathrm{E}\{\mathbf{Y}\}\right)\left(\mathbf{A}\mathbf{Y} - \mathbf{A}\mathrm{E}\{\mathbf{Y}\}\right)' \right\} = \mathbf{A}\,\mathrm{E}\left\{ \left(\mathbf{Y} - \mathrm{E}\{\mathbf{Y}\}\right)\left(\mathbf{Y} - \mathrm{E}\{\mathbf{Y}\}\right)' \right\}\mathbf{A}' = \mathbf{A}\,\boldsymbol{\sigma}^2\{\mathbf{Y}\}\,\mathbf{A}' $$
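The rules $\mathrm{E}\{\mathbf{AY}\} = \mathbf{A}\mathrm{E}\{\mathbf{Y}\}$ and $\boldsymbol{\sigma}^2\{\mathbf{AY}\} = \mathbf{A}\boldsymbol{\sigma}^2\{\mathbf{Y}\}\mathbf{A}'$ can be evaluated directly; a minimal sketch with an assumed $\mathbf{A}$, mean vector, and covariance matrix.

```python
import numpy as np

A = np.array([[1.0, -1.0, 0.0],
              [2.0,  1.0, 3.0]])              # 2 x 3 matrix of fixed constants (assumed)
mu = np.array([[1.0], [2.0], [3.0]])          # E{Y}
Sigma = np.array([[4.0, 1.0, 0.0],
                  [1.0, 2.0, 0.5],
                  [0.0, 0.5, 1.0]])           # sigma^2{Y}

E_W = A @ mu                                  # E{W} = A E{Y}
Var_W = A @ Sigma @ A.T                       # sigma^2{W} = A sigma^2{Y} A'
print(E_W.ravel())                            # [-1. 13.]
print(Var_W)
```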

Multivariate Normal Distribution

With $\boldsymbol{\mu} = \mathrm{E}\{\mathbf{Y}\}$ and $\boldsymbol{\Sigma} = \boldsymbol{\sigma}^2\{\mathbf{Y}\}$, the multivariate normal density function is

$$ f(\mathbf{Y}) = \frac{1}{(2\pi)^{n/2}\,|\boldsymbol{\Sigma}|^{1/2}} \exp\left[ -\tfrac{1}{2} (\mathbf{Y} - \boldsymbol{\mu})'\,\boldsymbol{\Sigma}^{-1}\,(\mathbf{Y} - \boldsymbol{\mu}) \right] $$

$\mathbf{Y} \sim N(\boldsymbol{\mu}, \boldsymbol{\Sigma})$ implies $Y_i \sim N(\mu_i, \sigma_i^2)$, $i = 1, \ldots, n$, with $\sigma\{Y_i, Y_j\} = \sigma_{ij}$.

Note: if $\mathbf{A}$ is a (full rank) matrix of fixed constants, then

$$ \mathbf{W} = \mathbf{A}\mathbf{Y} \sim N(\mathbf{A}\boldsymbol{\mu}, \; \mathbf{A}\boldsymbol{\Sigma}\mathbf{A}') . $$

Simple Linear Regression in Matrix Form

Simple linear regression model: $Y_i = \beta_0 + \beta_1 X_i + \varepsilon_i$, $i = 1, \ldots, n$. Defining

$$ \mathbf{Y} = \begin{bmatrix} Y_1 \\ \vdots \\ Y_n \end{bmatrix} \quad \mathbf{X} = \begin{bmatrix} 1 & X_1 \\ \vdots & \vdots \\ 1 & X_n \end{bmatrix} \quad \boldsymbol{\beta} = \begin{bmatrix} \beta_0 \\ \beta_1 \end{bmatrix} \quad \boldsymbol{\varepsilon} = \begin{bmatrix} \varepsilon_1 \\ \vdots \\ \varepsilon_n \end{bmatrix} $$

the model is $\mathbf{Y} = \mathbf{X}\boldsymbol{\beta} + \boldsymbol{\varepsilon}$, and $\mathrm{E}\{\mathbf{Y}\} = \mathbf{X}\boldsymbol{\beta}$ since $\mathrm{E}\{\boldsymbol{\varepsilon}\} = \mathbf{0}$.

Assuming constant variance and independence of the error terms:

$$ \boldsymbol{\sigma}^2\{\boldsymbol{\varepsilon}\} = \begin{bmatrix} \sigma^2 & 0 & \cdots & 0 \\ 0 & \sigma^2 & \cdots & 0 \\ \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & \cdots & \sigma^2 \end{bmatrix} = \sigma^2 \mathbf{I} $$

Further, assuming a normal distribution for the error terms: $\mathbf{Y} \sim N(\mathbf{X}\boldsymbol{\beta}, \sigma^2\mathbf{I})$.

Estimating Parameters by Least Squares

The least squares criterion is $Q = \sum_{i=1}^{n} (Y_i - \beta_0 - \beta_1 X_i)^2$. The normal equations are obtained from $\partial Q / \partial \beta_0$ and $\partial Q / \partial \beta_1$, setting each equal to 0:

$$ \sum (Y_i - b_0 - b_1 X_i) = 0 \qquad \sum X_i (Y_i - b_0 - b_1 X_i) = 0 $$

In matrix form:

$$ \mathbf{X}'\mathbf{X}\,\mathbf{b} = \mathbf{X}'\mathbf{Y} \quad \Rightarrow \quad \mathbf{b} = (\mathbf{X}'\mathbf{X})^{-1}\mathbf{X}'\mathbf{Y}, \qquad \text{defining } \mathbf{b} = \begin{bmatrix} b_0 \\ b_1 \end{bmatrix} . $$

Based on the matrix form:

$$ Q = (\mathbf{Y} - \mathbf{X}\boldsymbol{\beta})'(\mathbf{Y} - \mathbf{X}\boldsymbol{\beta}) = \mathbf{Y}'\mathbf{Y} - \mathbf{Y}'\mathbf{X}\boldsymbol{\beta} - \boldsymbol{\beta}'\mathbf{X}'\mathbf{Y} + \boldsymbol{\beta}'\mathbf{X}'\mathbf{X}\boldsymbol{\beta} = \mathbf{Y}'\mathbf{Y} - 2\boldsymbol{\beta}'\mathbf{X}'\mathbf{Y} + \boldsymbol{\beta}'\mathbf{X}'\mathbf{X}\boldsymbol{\beta} $$

$$ \frac{\partial Q}{\partial \boldsymbol{\beta}} = -2\mathbf{X}'\mathbf{Y} + 2\mathbf{X}'\mathbf{X}\boldsymbol{\beta} \;\stackrel{\text{set}}{=}\; \mathbf{0} \quad \Rightarrow \quad \mathbf{X}'\mathbf{X}\,\mathbf{b} = \mathbf{X}'\mathbf{Y} \quad \Rightarrow \quad \mathbf{b} = (\mathbf{X}'\mathbf{X})^{-1}\mathbf{X}'\mathbf{Y} $$

General result: for a fixed vector $\mathbf{a}$, a fixed symmetric matrix $\mathbf{A}$, and a variable vector $\mathbf{w}$:

$$ \frac{\partial\, \mathbf{a}'\mathbf{w}}{\partial \mathbf{w}} = \mathbf{a} \qquad \frac{\partial\, \mathbf{w}'\mathbf{A}\mathbf{w}}{\partial \mathbf{w}} = 2\mathbf{A}\mathbf{w} $$
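A sketch of $\mathbf{b} = (\mathbf{X}'\mathbf{X})^{-1}\mathbf{X}'\mathbf{Y}$ on the Toluca data from the earlier slide; np.linalg.solve is used on the normal equations rather than forming an explicit inverse.

```python
import numpy as np

lot_size = np.array([ 80,  30,  50,  90,  70,  60, 120,  80, 100,  50,  40,  70,
                      90,  20, 110, 100,  30,  50,  90, 110,  30,  90,  40,  80,  70])
work_hours = np.array([399, 121, 221, 376, 361, 224, 546, 352, 353, 157, 160, 252,
                       389, 113, 435, 420, 212, 268, 377, 421, 273, 468, 244, 342, 323])

Y = work_hours.reshape(-1, 1).astype(float)
X = np.column_stack([np.ones(len(lot_size)), lot_size.astype(float)])

# Normal equations X'X b = X'Y, solved for b = [b0, b1]'.
b = np.linalg.solve(X.T @ X, X.T @ Y)
print(b.ravel())     # [b0, b1]; same result as np.linalg.lstsq(X, Y, rcond=None)[0]
```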

Fitted Values and Residuals

In matrix form, the fitted values are

$$ \hat{\mathbf{Y}} = \mathbf{X}\mathbf{b} = \mathbf{X}(\mathbf{X}'\mathbf{X})^{-1}\mathbf{X}'\mathbf{Y} = \mathbf{P}\mathbf{Y}, \qquad \mathbf{P} = \mathbf{X}(\mathbf{X}'\mathbf{X})^{-1}\mathbf{X}' $$

$\mathbf{P}$ is called the "projection" or "hat" matrix; note that $\mathbf{P}$ is idempotent ($\mathbf{PP} = \mathbf{P}$) and symmetric ($\mathbf{P} = \mathbf{P}'$):

$$ \mathbf{PP} = \mathbf{X}(\mathbf{X}'\mathbf{X})^{-1}\mathbf{X}'\,\mathbf{X}(\mathbf{X}'\mathbf{X})^{-1}\mathbf{X}' = \mathbf{X}(\mathbf{X}'\mathbf{X})^{-1}\,\mathbf{I}\,\mathbf{X}' = \mathbf{P} \qquad \mathbf{P}' = \left[\mathbf{X}(\mathbf{X}'\mathbf{X})^{-1}\mathbf{X}'\right]' = \mathbf{X}(\mathbf{X}'\mathbf{X})^{-1}\mathbf{X}' = \mathbf{P} $$

The residuals are

$$ \mathbf{e} = \mathbf{Y} - \hat{\mathbf{Y}} = \mathbf{Y} - \mathbf{X}\mathbf{b} = \mathbf{Y} - \mathbf{P}\mathbf{Y} = (\mathbf{I} - \mathbf{P})\mathbf{Y} . $$

Note:

$$ \mathrm{E}\{\hat{\mathbf{Y}}\} = \mathrm{E}\{\mathbf{P}\mathbf{Y}\} = \mathbf{P}\,\mathrm{E}\{\mathbf{Y}\} = \mathbf{P}\mathbf{X}\boldsymbol{\beta} = \mathbf{X}(\mathbf{X}'\mathbf{X})^{-1}\mathbf{X}'\mathbf{X}\boldsymbol{\beta} = \mathbf{X}\boldsymbol{\beta} \qquad \boldsymbol{\sigma}^2\{\hat{\mathbf{Y}}\} = \mathbf{P}\,\sigma^2\mathbf{I}\,\mathbf{P}' = \sigma^2\mathbf{P} \qquad \mathbf{s}^2\{\hat{\mathbf{Y}}\} = MSE\,\mathbf{P} $$

$$ \mathrm{E}\{\mathbf{e}\} = \mathrm{E}\{(\mathbf{I} - \mathbf{P})\mathbf{Y}\} = (\mathbf{I} - \mathbf{P})\mathrm{E}\{\mathbf{Y}\} = (\mathbf{I} - \mathbf{P})\mathbf{X}\boldsymbol{\beta} = \mathbf{X}\boldsymbol{\beta} - \mathbf{X}\boldsymbol{\beta} = \mathbf{0} \qquad \boldsymbol{\sigma}^2\{\mathbf{e}\} = (\mathbf{I} - \mathbf{P})\,\sigma^2\mathbf{I}\,(\mathbf{I} - \mathbf{P})' = \sigma^2(\mathbf{I} - \mathbf{P}) \qquad \mathbf{s}^2\{\mathbf{e}\} = MSE\,(\mathbf{I} - \mathbf{P}) $$
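A sketch of the hat matrix, fitted values, and residuals on a small hypothetical data set, checking idempotency and symmetry of $\mathbf{P}$.

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])            # hypothetical predictor
Y = np.array([[2.1], [3.9], [6.2], [8.1], [9.8]])  # hypothetical responses
X = np.column_stack([np.ones_like(x), x])

P = X @ np.linalg.inv(X.T @ X) @ X.T               # hat (projection) matrix
fitted = P @ Y                                     # Y-hat = P Y
resid = (np.eye(len(x)) - P) @ Y                   # e = (I - P) Y

print(np.allclose(P @ P, P), np.allclose(P, P.T))  # True True: idempotent, symmetric
print(fitted.ravel())
print(resid.ravel())
```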

Analysis of Variance

Total (corrected) sum of squares:

$$ SSTO = \sum_{i=1}^{n} (Y_i - \bar{Y})^2 = \mathbf{Y}'\mathbf{Y} - \frac{1}{n}\mathbf{Y}'\mathbf{J}\mathbf{Y} = \mathbf{Y}'\left[\mathbf{I} - \frac{1}{n}\mathbf{J}\right]\mathbf{Y} $$

Defining SSE as the residual (error) sum of squares:

$$ SSE = \mathbf{e}'\mathbf{e} = (\mathbf{Y} - \mathbf{X}\mathbf{b})'(\mathbf{Y} - \mathbf{X}\mathbf{b}) = \mathbf{Y}'\mathbf{Y} - 2\mathbf{b}'\mathbf{X}'\mathbf{Y} + \mathbf{b}'\mathbf{X}'\mathbf{X}\mathbf{b} = \mathbf{Y}'\mathbf{Y} - \mathbf{b}'\mathbf{X}'\mathbf{Y} = \mathbf{Y}'\left[\mathbf{I} - \mathbf{P}\right]\mathbf{Y} $$

since $\mathbf{b}'\mathbf{X}'\mathbf{X}\mathbf{b} = \mathbf{b}'\mathbf{X}'\mathbf{Y}$ from the normal equations.

Defining SSR as the regression sum of squares:

$$ SSR = SSTO - SSE = \mathbf{b}'\mathbf{X}'\mathbf{Y} - \frac{1}{n}\mathbf{Y}'\mathbf{J}\mathbf{Y} = \mathbf{Y}'\left[\mathbf{P} - \frac{1}{n}\mathbf{J}\right]\mathbf{Y} $$

Note that SSTO, SSR, and SSE are all QUADRATIC FORMS $\mathbf{Y}'\mathbf{A}\mathbf{Y}$ for symmetric and idempotent matrices $\mathbf{A}$.
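A sketch computing SSTO, SSE, and SSR as the quadratic forms above (same hypothetical data as the previous sketch) and checking that SSTO = SSR + SSE.

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])            # hypothetical predictor
Y = np.array([[2.1], [3.9], [6.2], [8.1], [9.8]])  # hypothetical responses
n = len(x)
X = np.column_stack([np.ones_like(x), x])

P = X @ np.linalg.inv(X.T @ X) @ X.T
J = np.ones((n, n))
I = np.eye(n)

SSTO = (Y.T @ (I - J / n) @ Y).item()
SSE  = (Y.T @ (I - P) @ Y).item()
SSR  = (Y.T @ (P - J / n) @ Y).item()
print(SSTO, SSE, SSR)
print(np.isclose(SSTO, SSR + SSE))   # True
```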

Inferences in Linear Regression

$$ \mathbf{b} = (\mathbf{X}'\mathbf{X})^{-1}\mathbf{X}'\mathbf{Y} \qquad \mathrm{E}\{\mathbf{b}\} = (\mathbf{X}'\mathbf{X})^{-1}\mathbf{X}'\mathrm{E}\{\mathbf{Y}\} = (\mathbf{X}'\mathbf{X})^{-1}\mathbf{X}'\mathbf{X}\boldsymbol{\beta} = \boldsymbol{\beta} $$

$$ \boldsymbol{\sigma}^2\{\mathbf{b}\} = (\mathbf{X}'\mathbf{X})^{-1}\mathbf{X}'\,\sigma^2\mathbf{I}\,\mathbf{X}(\mathbf{X}'\mathbf{X})^{-1} = \sigma^2(\mathbf{X}'\mathbf{X})^{-1} \qquad \mathbf{s}^2\{\mathbf{b}\} = MSE\,(\mathbf{X}'\mathbf{X})^{-1}, \quad MSE = \frac{SSE}{n-2} $$

Recalling the form of $(\mathbf{X}'\mathbf{X})^{-1}$, the diagonal elements of $\mathbf{s}^2\{\mathbf{b}\}$ are

$$ s^2\{b_0\} = MSE\left[\frac{1}{n} + \frac{\bar{X}^2}{\sum(X_i - \bar{X})^2}\right] \qquad s^2\{b_1\} = \frac{MSE}{\sum(X_i - \bar{X})^2} $$

Estimated mean response at $X = X_h$, with $\mathbf{X}_h' = [\,1 \;\; X_h\,]$:

$$ \hat{Y}_h = \mathbf{X}_h'\mathbf{b} = b_0 + b_1 X_h \qquad s^2\{\hat{Y}_h\} = \mathbf{X}_h'\,\mathbf{s}^2\{\mathbf{b}\}\,\mathbf{X}_h = MSE\,\mathbf{X}_h'(\mathbf{X}'\mathbf{X})^{-1}\mathbf{X}_h $$

Predicted new response at $X = X_h$:

$$ \hat{Y}_h = \mathbf{X}_h'\mathbf{b} \qquad s^2\{\mathrm{pred}\} = MSE\left[1 + \mathbf{X}_h'(\mathbf{X}'\mathbf{X})^{-1}\mathbf{X}_h\right] $$
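A sketch of $\mathbf{s}^2\{\mathbf{b}\} = MSE\,(\mathbf{X}'\mathbf{X})^{-1}$ and the standard errors at a new level $X_h$, continuing the hypothetical data of the earlier sketches; $X_h = 3.5$ is an assumed value.

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])            # hypothetical predictor
Y = np.array([[2.1], [3.9], [6.2], [8.1], [9.8]])  # hypothetical responses
n = len(x)
X = np.column_stack([np.ones_like(x), x])

XtX_inv = np.linalg.inv(X.T @ X)
b = XtX_inv @ X.T @ Y
e = Y - X @ b
MSE = (e.T @ e).item() / (n - 2)

s2_b = MSE * XtX_inv                               # s^2{b}
print(np.sqrt(np.diag(s2_b)))                      # s{b0}, s{b1}

Xh = np.array([[1.0], [3.5]])                      # X_h' = [1, 3.5] (assumed)
Yh_hat = (Xh.T @ b).item()                         # estimated mean response
quad = (Xh.T @ XtX_inv @ Xh).item()
s2_Yh = MSE * quad                                 # s^2{Y-hat_h}
s2_pred = MSE * (1 + quad)                         # s^2{pred} for a new response
print(Yh_hat, np.sqrt(s2_Yh), np.sqrt(s2_pred))
```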