Section 5: Vectors of Random Variables

When working with several random variables $X_1, X_2, \ldots, X_n$, it is often convenient to arrange them in vector form $\mathbf{x} = (X_1, X_2, \ldots, X_n)'$. We can then make use of matrix algebra to help us organize and manipulate large numbers of random variables simultaneously.

We define the expectation of a random vector as element-by-element expectation:

$$E[\mathbf{x}] = \begin{pmatrix} E[X_1] \\ E[X_2] \\ \vdots \\ E[X_n] \end{pmatrix}.$$

If $\mathbf{X}$ is an $(n \times m)$ matrix of random variables, then $E[\mathbf{X}]$ is the $(n \times m)$ matrix whose $(i,j)$-th element is the mean of the $(i,j)$-th element of $\mathbf{X}$, i.e., if

$$\mathbf{X} = \begin{pmatrix} X_{11} & X_{12} & \cdots & X_{1m} \\ X_{21} & X_{22} & \cdots & X_{2m} \\ \vdots & & & \vdots \\ X_{n1} & X_{n2} & \cdots & X_{nm} \end{pmatrix}, \quad \text{then} \quad E[\mathbf{X}] = \begin{pmatrix} E[X_{11}] & E[X_{12}] & \cdots & E[X_{1m}] \\ E[X_{21}] & E[X_{22}] & \cdots & E[X_{2m}] \\ \vdots & & & \vdots \\ E[X_{n1}] & E[X_{n2}] & \cdots & E[X_{nm}] \end{pmatrix}.$$

These definitions provide a neat way of computing the variances and covariances of the variables all at once. The $(i,j)$-th element of the outer product $(\mathbf{x} - E[\mathbf{x}])(\mathbf{x} - E[\mathbf{x}])'$ is $(X_i - E[X_i])(X_j - E[X_j])$, so taking expectations element by element gives

$$\operatorname{var}[\mathbf{x}] = E[(\mathbf{x} - E[\mathbf{x}])(\mathbf{x} - E[\mathbf{x}])'] = \begin{pmatrix} \operatorname{var}[X_1] & \operatorname{cov}[X_1, X_2] & \cdots & \operatorname{cov}[X_1, X_n] \\ \operatorname{cov}[X_2, X_1] & \operatorname{var}[X_2] & \cdots & \operatorname{cov}[X_2, X_n] \\ \vdots & & \ddots & \vdots \\ \operatorname{cov}[X_n, X_1] & \operatorname{cov}[X_n, X_2] & \cdots & \operatorname{var}[X_n] \end{pmatrix}.$$

We call $\operatorname{var}[\mathbf{x}]$ the variance-covariance matrix of $\mathbf{x}$.
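As a quick numerical illustration of these definitions, the following sketch (Python with NumPy; the dimension and parameter values are illustrative choices, not from the notes) estimates $E[\mathbf{x}]$ and $\operatorname{var}[\mathbf{x}]$ from simulated draws of a random vector.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate draws of a 3-dimensional random vector x (illustrative parameters).
n_draws = 200_000
mu_true = np.array([1.0, -2.0, 0.5])
Sigma_true = np.array([[2.0, 0.6, 0.0],
                       [0.6, 1.0, 0.3],
                       [0.0, 0.3, 1.5]])
x = rng.multivariate_normal(mu_true, Sigma_true, size=n_draws)  # shape (n_draws, 3)

# E[x]: element-by-element expectation, estimated by sample means
E_x = x.mean(axis=0)

# var[x] = E[(x - E[x])(x - E[x])']: the variance-covariance matrix
d = x - E_x
var_x = d.T @ d / n_draws

print("E[x]   approx:", E_x)        # close to mu_true
print("var[x] approx:\n", var_x)    # close to Sigma_true; variances on the diagonal
```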

The formula $\operatorname{var}[\mathbf{x}] = E[(\mathbf{x} - E[\mathbf{x}])(\mathbf{x} - E[\mathbf{x}])']$ can be viewed as the matrix version of the variance formula $\operatorname{var}[X] = E[(X - E[X])^2]$ for a single variable.

Sometimes we want to compute a covariance matrix between two vectors of random variables $\mathbf{x}$ and $\mathbf{y}$. We can compute

$$\operatorname{cov}[\mathbf{x}, \mathbf{y}] = E[(\mathbf{x} - E[\mathbf{x}])(\mathbf{y} - E[\mathbf{y}])'] = \begin{pmatrix} \operatorname{cov}[X_1, Y_1] & \operatorname{cov}[X_1, Y_2] & \cdots & \operatorname{cov}[X_1, Y_m] \\ \operatorname{cov}[X_2, Y_1] & \operatorname{cov}[X_2, Y_2] & \cdots & \operatorname{cov}[X_2, Y_m] \\ \vdots & & & \vdots \\ \operatorname{cov}[X_n, Y_1] & \operatorname{cov}[X_n, Y_2] & \cdots & \operatorname{cov}[X_n, Y_m] \end{pmatrix}.$$

Rules for dealing with the mean vector and the variance-covariance matrix

If $\mathbf{x}$ is an $(n \times 1)$ vector of random variables, $\mathbf{X}$ is an $(n \times n)$ matrix of random variables, $\mathbf{b}$ is an $(m \times 1)$ vector of constants, and $\mathbf{A}$ is an $(m \times n)$ matrix of constants, then

1. $E[\mathbf{A}\mathbf{x} + \mathbf{b}] = \mathbf{A}E[\mathbf{x}] + \mathbf{b}$.

2. $\operatorname{var}[\mathbf{A}\mathbf{x} + \mathbf{b}] = \mathbf{A}\operatorname{var}[\mathbf{x}]\mathbf{A}'$. In particular, for a vector of constants $\mathbf{c}$, $\mathbf{c}'\mathbf{x} = \sum_j c_j X_j$ and
$$\operatorname{var}[c_1 X_1 + c_2 X_2 + \cdots + c_n X_n] = \sum_{i=1}^{n} c_i^2 \operatorname{var}[X_i] + \sum_{i \neq j} c_i c_j \operatorname{cov}[X_i, X_j].$$

3. A useful result is $E[\operatorname{tr}[\mathbf{X}]] = E[X_{11} + X_{22} + \cdots + X_{nn}] = E[X_{11}] + E[X_{22}] + \cdots + E[X_{nn}] = \operatorname{tr}[E[\mathbf{X}]]$.

The first of these is straightforward to show by simply writing out the expression $\mathbf{A}\mathbf{x} + \mathbf{b}$ in full and taking expectations. This formula is the matrix version of the usual single-variable result $E[aX + b] = aE[X] + b$.
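Rules 1 and 2 are easy to check by simulation. This is a minimal sketch, assuming arbitrary illustrative choices of $\boldsymbol{\mu}$, $\boldsymbol{\Sigma}$, $\mathbf{A}$ and $\mathbf{b}$.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative dimensions: x is (3 x 1), A is (2 x 3), b is (2 x 1).
mu = np.array([1.0, 0.0, -1.0])
Sigma = np.array([[1.0, 0.4, 0.1],
                  [0.4, 2.0, 0.0],
                  [0.1, 0.0, 0.5]])
A = np.array([[1.0, -1.0, 0.0],
              [2.0,  0.5, 1.0]])
b = np.array([0.5, -0.3])

x = rng.multivariate_normal(mu, Sigma, size=500_000)  # draws in rows
y = x @ A.T + b                                       # y = Ax + b for each draw

print("E[Ax+b]  approx:", y.mean(axis=0))
print("A E[x]+b       :", A @ mu + b)

print("var[Ax+b] approx:\n", np.cov(y, rowvar=False))
print("A var[x] A'     :\n", A @ Sigma @ A.T)
```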

To show (2), plug $\mathbf{A}\mathbf{x} + \mathbf{b}$ into the variance formula:

$$\begin{aligned}
\operatorname{var}[\mathbf{A}\mathbf{x} + \mathbf{b}]
&= E[(\mathbf{A}\mathbf{x} + \mathbf{b} - E[\mathbf{A}\mathbf{x} + \mathbf{b}])(\mathbf{A}\mathbf{x} + \mathbf{b} - E[\mathbf{A}\mathbf{x} + \mathbf{b}])'] \\
&= E[(\mathbf{A}\mathbf{x} + \mathbf{b} - \mathbf{A}E[\mathbf{x}] - \mathbf{b})(\mathbf{A}\mathbf{x} + \mathbf{b} - \mathbf{A}E[\mathbf{x}] - \mathbf{b})'] \\
&= E[(\mathbf{A}\mathbf{x} - \mathbf{A}E[\mathbf{x}])(\mathbf{A}\mathbf{x} - \mathbf{A}E[\mathbf{x}])'] \\
&= E[\mathbf{A}(\mathbf{x} - E[\mathbf{x}])\,(\mathbf{A}(\mathbf{x} - E[\mathbf{x}]))'] \\
&= E[\mathbf{A}(\mathbf{x} - E[\mathbf{x}])(\mathbf{x} - E[\mathbf{x}])'\mathbf{A}'] \\
&= \mathbf{A}\,E[(\mathbf{x} - E[\mathbf{x}])(\mathbf{x} - E[\mathbf{x}])']\,\mathbf{A}' \\
&= \mathbf{A}\operatorname{var}[\mathbf{x}]\mathbf{A}'.
\end{aligned}$$

This is the matrix version of the single-variable result $\operatorname{var}[aX + b] = a^2\operatorname{var}[X]$.

Note that a variance-covariance matrix must be positive definite:

$$\operatorname{var}[c_1 X_1 + c_2 X_2 + \cdots + c_n X_n] = \operatorname{var}[\mathbf{c}'\mathbf{x}] = \mathbf{c}'\operatorname{var}[\mathbf{x}]\mathbf{c}$$

has to be positive for all $\mathbf{c} \neq \mathbf{0}$, since variances must be positive.

Exercise: Let $\mathbf{x} = (X_1, X_2)'$ be a vector of random variables, and let
$$\mathbf{A} = \begin{pmatrix} a_{11} & a_{12} \\ a_{21} & a_{22} \end{pmatrix} \quad \text{and} \quad \mathbf{b} = \begin{pmatrix} b_1 \\ b_2 \end{pmatrix}$$
be constants. Write out $\mathbf{A}\mathbf{x} + \mathbf{b}$ in full, and take expectations to show that $E[\mathbf{A}\mathbf{x} + \mathbf{b}] = \mathbf{A}E[\mathbf{x}] + \mathbf{b}$.

The Multivariate Normal Distribution

The random vector $\mathbf{x}$ follows the multivariate normal distribution with mean $E[\mathbf{x}] = \boldsymbol{\mu}$ and variance-covariance matrix $\boldsymbol{\Sigma}$ if its density has the form

$$f(\mathbf{x}) = (2\pi)^{-n/2}\,|\boldsymbol{\Sigma}|^{-1/2} \exp\left\{ -\tfrac{1}{2}(\mathbf{x} - \boldsymbol{\mu})'\boldsymbol{\Sigma}^{-1}(\mathbf{x} - \boldsymbol{\mu}) \right\}.$$

We denote this by $\mathbf{x} \sim N(\boldsymbol{\mu}, \boldsymbol{\Sigma})$.
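To see the density formula in action, the sketch below evaluates $f(\mathbf{x})$ directly from the expression above and compares it with SciPy's multivariate normal pdf; the mean, covariance matrix and evaluation point are made-up values.

```python
import numpy as np
from scipy.stats import multivariate_normal

# Evaluate f(x) = (2*pi)^(-n/2) |Sigma|^(-1/2) exp{-(1/2)(x-mu)' Sigma^{-1} (x-mu)}
# directly, and compare with scipy (illustrative parameter values).
mu = np.array([0.0, 1.0])
Sigma = np.array([[1.0, 0.8],
                  [0.8, 2.0]])
x = np.array([0.5, 0.5])
n = len(mu)

d = x - mu
quad = d @ np.linalg.solve(Sigma, d)   # (x-mu)' Sigma^{-1} (x-mu)
f = (2 * np.pi) ** (-n / 2) * np.linalg.det(Sigma) ** (-0.5) * np.exp(-0.5 * quad)

print(f)                                               # by the formula above
print(multivariate_normal(mean=mu, cov=Sigma).pdf(x))  # same value
```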

This is analogous to the univariate normal pdf:

$$f(x) = \frac{1}{\sqrt{2\pi\sigma^2}} \exp\left\{ -\frac{(x - \mu)^2}{2\sigma^2} \right\}.$$

Exercise: Write the joint pdf out without matrix notation for the bivariate case $\mathbf{x} = (X_1, X_2)'$. Then show that $f(x_1, x_2) = f(x_1)f(x_2)$ when $\sigma_{12} = 0$. What does this say?

Exercise: Use computer software (say Matlab) to plot the distribution of the bivariate normal for various parameter values.

The following are important properties of the multivariate normal:

4. The marginal and conditional distributions are also normal. In particular, if

$$\mathbf{x} = \begin{pmatrix} \mathbf{x}_1 \\ \mathbf{x}_2 \end{pmatrix} \sim N\left( \begin{pmatrix} \boldsymbol{\mu}_1 \\ \boldsymbol{\mu}_2 \end{pmatrix}, \begin{pmatrix} \boldsymbol{\Sigma}_{11} & \boldsymbol{\Sigma}_{12} \\ \boldsymbol{\Sigma}_{21} & \boldsymbol{\Sigma}_{22} \end{pmatrix} \right),$$

where $\mathbf{x}_1$ and $\boldsymbol{\mu}_1$ are $(n_1 \times 1)$, $\mathbf{x}_2$ and $\boldsymbol{\mu}_2$ are $(n_2 \times 1)$, $\boldsymbol{\Sigma}_{11}$ is $(n_1 \times n_1)$, $\boldsymbol{\Sigma}_{12}$ is $(n_1 \times n_2)$, $\boldsymbol{\Sigma}_{21}$ is $(n_2 \times n_1)$, and $\boldsymbol{\Sigma}_{22}$ is $(n_2 \times n_2)$, then

a. the marginal distribution for $\mathbf{x}_1$ is $N(\boldsymbol{\mu}_1, \boldsymbol{\Sigma}_{11})$;

b. the conditional distribution for $\mathbf{x}_1$ given $\mathbf{x}_2$ is $N(\boldsymbol{\mu}_{1|2}, \boldsymbol{\Sigma}_{1|2})$, where
$$\boldsymbol{\mu}_{1|2} = \boldsymbol{\mu}_1 + \boldsymbol{\Sigma}_{12}\boldsymbol{\Sigma}_{22}^{-1}(\mathbf{x}_2 - \boldsymbol{\mu}_2) \quad \text{and} \quad \boldsymbol{\Sigma}_{1|2} = \boldsymbol{\Sigma}_{11} - \boldsymbol{\Sigma}_{12}\boldsymbol{\Sigma}_{22}^{-1}\boldsymbol{\Sigma}_{21}.$$

Exercise: In the bivariate case,
$$\begin{pmatrix} X_1 \\ X_2 \end{pmatrix} \sim N\left( \begin{pmatrix} \mu_1 \\ \mu_2 \end{pmatrix}, \begin{pmatrix} \sigma_1^2 & \sigma_{12} \\ \sigma_{12} & \sigma_2^2 \end{pmatrix} \right),$$
(4a) says that $X_1 \sim N(\mu_1, \sigma_1^2)$ and $X_2 \sim N(\mu_2, \sigma_2^2)$. Write out the expressions in (4b). Note in particular that the conditional mean of $X_1$ given $X_2$ is a linear function of $X_2$.
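Result (4b) translates directly into a short computation: partition $\boldsymbol{\mu}$ and $\boldsymbol{\Sigma}$, then apply the two formulas. The sketch below does this for an illustrative three-dimensional example (the partition sizes and parameter values are assumptions made for the demonstration).

```python
import numpy as np

# Conditional mean and variance of x1 given x2 for a partitioned multivariate
# normal, result (4b). All parameter values are illustrative.
mu = np.array([1.0, 2.0, 0.0])             # (mu1', mu2')', with x1 = first 2 elements
Sigma = np.array([[2.0, 0.5, 0.8],
                  [0.5, 1.0, 0.3],
                  [0.8, 0.3, 1.5]])

k = 2                                       # dimension of x1
S11, S12 = Sigma[:k, :k], Sigma[:k, k:]
S21, S22 = Sigma[k:, :k], Sigma[k:, k:]
mu1, mu2 = mu[:k], mu[k:]

x2 = np.array([0.7])                        # observed value of x2

mu_cond = mu1 + S12 @ np.linalg.solve(S22, x2 - mu2)   # mu1 + S12 S22^{-1} (x2 - mu2)
Sigma_cond = S11 - S12 @ np.linalg.solve(S22, S21)     # S11 - S12 S22^{-1} S21

print("conditional mean    :", mu_cond)        # a linear function of x2
print("conditional variance:\n", Sigma_cond)   # does not depend on x2
```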

Exercise: Using the expressions for the conditional mean and conditional variance of $X_1$ given $X_2$, show that $f(x_1, x_2) = f(x_1 \mid x_2)\,f(x_2)$. (You can make a similar argument for the general $\mathbf{x}_1$ and $\mathbf{x}_2$ case.)

5. If $\mathbf{x} \sim N(\boldsymbol{\mu}, \boldsymbol{\Sigma})$, then $\mathbf{A}\mathbf{x} + \mathbf{b} \sim N(\mathbf{A}\boldsymbol{\mu} + \mathbf{b}, \mathbf{A}\boldsymbol{\Sigma}\mathbf{A}')$. The expression $\mathbf{A}\mathbf{x} + \mathbf{b}$ is normal because linear combinations of normal random variables remain normal. The formulae for the mean and variance-covariance matrix are the usual ones.

6. If $\mathbf{x} \sim N(\boldsymbol{\mu}, \boldsymbol{\Sigma})$ and $\boldsymbol{\Sigma} = \operatorname{diag}(\sigma_1^2, \sigma_2^2, \ldots, \sigma_n^2)$, then the random variables $X_1, X_2, \ldots, X_n$ are independent.

The following make use of the facts that

- the square of a standard normal variable has a $\chi^2(1)$ distribution;
- the sum of $n$ independent $\chi^2(1)$ variables is a $\chi^2(n)$;
- if $X \sim N(0,1)$, $Y \sim \chi^2(n)$, and $X$ and $Y$ are independent, then $\dfrac{X}{\sqrt{Y/n}} \sim t(n)$;
- if $Y_1 \sim \chi^2(n)$, $Y_2 \sim \chi^2(m)$, and $Y_1$ and $Y_2$ are independent, then $\dfrac{Y_1/n}{Y_2/m} \sim F(n, m)$.

We have

7. If $\mathbf{x} \sim N(\mathbf{0}, \mathbf{I}_n)$ and $\mathbf{A}$ is symmetric and idempotent with rank $J$, then the scalar random variable $\mathbf{x}'\mathbf{A}\mathbf{x} \sim \chi^2(J)$. In particular, $\mathbf{x}'\mathbf{x} \sim \chi^2(n)$.

Proof: Because $\mathbf{A}$ is symmetric, we can write $\mathbf{A} = \mathbf{C}\boldsymbol{\Lambda}\mathbf{C}'$, with $\mathbf{C}'\mathbf{C} = \mathbf{I}$. Note that $\mathbf{C}'\mathbf{x} \sim N(\mathbf{0}, \mathbf{I})$ because $\operatorname{var}(\mathbf{C}'\mathbf{x}) = \mathbf{C}'\mathbf{I}\mathbf{C} = \mathbf{C}'\mathbf{C} = \mathbf{I}$. That is, $\mathbf{C}'\mathbf{x}$ is a vector of independently distributed standard normal variables. Write
$$\mathbf{x}'\mathbf{A}\mathbf{x} = \mathbf{x}'\mathbf{C}\boldsymbol{\Lambda}\mathbf{C}'\mathbf{x} = \mathbf{y}'\boldsymbol{\Lambda}\mathbf{y} = \sum_{i=1}^{n} \lambda_i y_i^2, \quad \text{where } \mathbf{y} = \mathbf{C}'\mathbf{x}.$$
Each $y_i^2$ is an independent chi-squared with one degree of freedom, since the $y_i$'s are independent standard normal variables. Because $\mathbf{A}$ is idempotent, there are $J$ $\lambda_i$'s equal to one, and $(n - J)$ $\lambda_i$'s that are zero. Relabeling the $y_i$'s so that the first $J$ $\lambda_i$'s are equal to one, we have
$$\mathbf{x}'\mathbf{A}\mathbf{x} = \sum_{i=1}^{J} y_i^2,$$
which is a sum of $J$ independent $\chi^2(1)$ variables. Therefore $\mathbf{x}'\mathbf{A}\mathbf{x} \sim \chi^2(J)$.
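Result (7) can be checked by simulation. In the sketch below, $\mathbf{A}$ is taken to be the projection matrix $\mathbf{Z}(\mathbf{Z}'\mathbf{Z})^{-1}\mathbf{Z}'$ for an arbitrary $n \times J$ matrix $\mathbf{Z}$, which is symmetric and idempotent with rank $J$; this is an illustrative choice, not one used in the notes.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)

# Result (7): x ~ N(0, I_n), A symmetric idempotent with rank J  =>  x'Ax ~ chi2(J).
n, J = 10, 3
Z = rng.standard_normal((n, J))
A = Z @ np.linalg.solve(Z.T @ Z, Z.T)     # Z (Z'Z)^{-1} Z': symmetric, idempotent, rank J

reps = 100_000
x = rng.standard_normal((reps, n))
q = np.einsum('ij,jk,ik->i', x, A, x)     # x'Ax for each draw

print("empirical mean/var:", q.mean(), q.var())   # chi2(J) has mean J, variance 2J
print("KS p-value vs chi2(%d): %.3f" % (J, stats.kstest(q, 'chi2', args=(J,)).pvalue))
```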

8. If $\mathbf{x} \sim N(\boldsymbol{\mu}, \boldsymbol{\Sigma})$, then $(\mathbf{x} - \boldsymbol{\mu})'\boldsymbol{\Sigma}^{-1}(\mathbf{x} - \boldsymbol{\mu}) \sim \chi^2(n)$.

Proof: $\boldsymbol{\Sigma}$ is positive definite, symmetric, and full rank. Therefore we can write $\boldsymbol{\Sigma}^{-1} = \boldsymbol{\Sigma}^{-1/2}\boldsymbol{\Sigma}^{-1/2}$. Note that $\mathbf{z} = \boldsymbol{\Sigma}^{-1/2}(\mathbf{x} - \boldsymbol{\mu}) \sim N(\mathbf{0}, \mathbf{I})$, therefore
$$(\mathbf{x} - \boldsymbol{\mu})'\boldsymbol{\Sigma}^{-1}(\mathbf{x} - \boldsymbol{\mu}) = \left(\boldsymbol{\Sigma}^{-1/2}(\mathbf{x} - \boldsymbol{\mu})\right)'\left(\boldsymbol{\Sigma}^{-1/2}(\mathbf{x} - \boldsymbol{\mu})\right) = \mathbf{z}'\mathbf{z} \sim \chi^2(n).$$

9. If $\mathbf{x} \sim N(\mathbf{0}, \mathbf{I})$, and $\mathbf{A}$ and $\mathbf{B}$ are symmetric and idempotent, then $\mathbf{x}'\mathbf{A}\mathbf{x}$ and $\mathbf{x}'\mathbf{B}\mathbf{x}$ are independent if $\mathbf{A}\mathbf{B} = \mathbf{0}$.

Proof: Because $\mathbf{A}$ and $\mathbf{B}$ are symmetric and idempotent, we have $\mathbf{A}\mathbf{A} = \mathbf{A}$ and $\mathbf{B}\mathbf{B} = \mathbf{B}$. Therefore we can write the quadratic forms as $\mathbf{x}'\mathbf{A}\mathbf{x} = \mathbf{x}'\mathbf{A}'\mathbf{A}\mathbf{x} = (\mathbf{A}\mathbf{x})'(\mathbf{A}\mathbf{x})$, and similarly for $\mathbf{x}'\mathbf{B}\mathbf{x}$. Because $\mathbf{x}$ is normal with mean $\mathbf{0}$, $\mathbf{A}\mathbf{x}$ is also normal with mean $\mathbf{0}$. For vectors of zero-mean random variables, $\operatorname{cov}[\mathbf{x}, \mathbf{y}] = E[\mathbf{x}\mathbf{y}']$ (why?). We have
$$\operatorname{cov}[\mathbf{A}\mathbf{x}, \mathbf{B}\mathbf{x}] = E[\mathbf{A}\mathbf{x}\mathbf{x}'\mathbf{B}'] = \mathbf{A}\,E[\mathbf{x}\mathbf{x}']\,\mathbf{B}' = \mathbf{A}\mathbf{B}' = \mathbf{A}\mathbf{B}.$$
Therefore, $\mathbf{A}\mathbf{B} = \mathbf{0}$ implies that $\mathbf{A}\mathbf{x}$ and $\mathbf{B}\mathbf{x}$ are normally distributed with covariance $\mathbf{0}$. This implies that $\mathbf{A}\mathbf{x}$ and $\mathbf{B}\mathbf{x}$ are independent (why?), and therefore the quadratic forms $\mathbf{x}'\mathbf{A}\mathbf{x}$ and $\mathbf{x}'\mathbf{B}\mathbf{x}$ are also independent. It follows from (7) that
$$\frac{\mathbf{x}'\mathbf{A}\mathbf{x} / \operatorname{rank}(\mathbf{A})}{\mathbf{x}'\mathbf{B}\mathbf{x} / \operatorname{rank}(\mathbf{B})}$$
is distributed $F(\operatorname{rank}(\mathbf{A}), \operatorname{rank}(\mathbf{B}))$.

10. If $\mathbf{x} \sim N(\mathbf{0}, \mathbf{I})$, and $\mathbf{A}$ is symmetric and idempotent, then $\mathbf{L}\mathbf{x}$ and $\mathbf{x}'\mathbf{A}\mathbf{x}$ are independent if $\mathbf{L}\mathbf{A} = \mathbf{0}$.

Proof: Same idea as (9).
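Result (8) can be checked the same way as result (7); the parameter values below are again illustrative.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)

# Result (8): x ~ N(mu, Sigma)  =>  (x-mu)' Sigma^{-1} (x-mu) ~ chi2(n).
mu = np.array([1.0, -1.0, 2.0])
Sigma = np.array([[2.0, 0.3, 0.0],
                  [0.3, 1.0, 0.4],
                  [0.0, 0.4, 1.5]])
n = len(mu)

reps = 100_000
x = rng.multivariate_normal(mu, Sigma, size=reps)
d = x - mu
q = np.einsum('ij,ij->i', d, np.linalg.solve(Sigma, d.T).T)   # (x-mu)' Sigma^{-1} (x-mu)

print("empirical mean/var:", q.mean(), q.var())   # chi2(n): mean n, variance 2n
print("KS p-value vs chi2(%d): %.3f" % (n, stats.kstest(q, 'chi2', args=(n,)).pvalue))
```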

We use these results to prove some standard results in statistics. Suppose $X_1, X_2, \ldots, X_n$ are independent draws from a $N(\mu, \sigma^2)$ distribution, i.e. $\mathbf{x} \sim N(\boldsymbol{\mu}, \sigma^2\mathbf{I})$, where $\boldsymbol{\mu}$ is the $(n \times 1)$ vector $(\mu, \mu, \ldots, \mu)'$.

We know that the sample mean $\bar{X} = \frac{1}{n}\sum_{i=1}^{n} X_i$ is normally distributed (because a linear combination of normal variables is normal) with mean and variance

$$E[\bar{X}] = E\left[\frac{1}{n}\sum_{i=1}^{n} X_i\right] = \frac{1}{n}\sum_{i=1}^{n} E[X_i] = \mu, \qquad \operatorname{var}[\bar{X}] = \operatorname{var}\left[\frac{1}{n}\sum_{i=1}^{n} X_i\right] = \frac{1}{n^2}\sum_{i=1}^{n} \operatorname{var}[X_i] = \frac{\sigma^2}{n}.$$

Furthermore, $\bar{X} \sim N(\mu, \sigma^2/n)$ implies that

$$\frac{\bar{X} - \mu}{\sigma/\sqrt{n}} \sim N(0, 1).$$

Unfortunately, this result is not very helpful if, for instance, you want to test a hypothesis on $\mu$, since $\sigma^2$ is in general unknown and must be estimated. An unbiased estimator is

$$\hat{\sigma}^2 = \frac{1}{n-1}\sum_{i=1}^{n} (X_i - \bar{X})^2.$$

To show this, note that

$$\sum_{i=1}^{n} (X_i - \bar{X})^2 = \sum_{i=1}^{n} (X_i - \mu)^2 - n(\bar{X} - \mu)^2.$$

Each $X_i - \mu$ has variance $\sigma^2$, so

$$E\left[\sum_{i=1}^{n} (X_i - \mu)^2\right] = \sum_{i=1}^{n} E[(X_i - \mu)^2] = n\sigma^2.$$

Also, $\operatorname{var}[\bar{X}] = E[(\bar{X} - \mu)^2] = \sigma^2/n$, so $E[n(\bar{X} - \mu)^2] = n\operatorname{var}[\bar{X}] = \sigma^2$.
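A quick simulation check of the decomposition and of the expectations used above ($\mu$, $\sigma$ and $n$ are arbitrary illustrative values):

```python
import numpy as np

rng = np.random.default_rng(4)

# Check sum (X_i - Xbar)^2 = sum (X_i - mu)^2 - n*(Xbar - mu)^2 and the
# expectations used in the unbiasedness argument (illustrative mu, sigma, n).
mu, sigma, n = 2.0, 3.0, 10
reps = 200_000

x = rng.normal(mu, sigma, size=(reps, n))
xbar = x.mean(axis=1)

ss_dev  = ((x - xbar[:, None]) ** 2).sum(axis=1)   # sum (X_i - Xbar)^2
ss_mu   = ((x - mu) ** 2).sum(axis=1)              # sum (X_i - mu)^2
penalty = n * (xbar - mu) ** 2                     # n (Xbar - mu)^2

print("identity holds      :", np.allclose(ss_dev, ss_mu - penalty))
print("E[sum (X_i-mu)^2]   :", ss_mu.mean(),  " (n*sigma^2 =", n * sigma**2, ")")
print("E[n (Xbar-mu)^2]    :", penalty.mean(), " (sigma^2 =", sigma**2, ")")
print("E[sigma_hat^2]      :", (ss_dev / (n - 1)).mean(), " (sigma^2 =", sigma**2, ")")
```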

Putting all this together, we have

$$E[\hat{\sigma}^2] = \frac{1}{n-1}\,E\left[\sum_{i=1}^{n} (X_i - \bar{X})^2\right] = \frac{1}{n-1}\left(n\sigma^2 - \sigma^2\right) = \sigma^2.$$

One idea, then, is to substitute $\sigma$ with $\hat{\sigma} = \sqrt{\hat{\sigma}^2}$ to get the $t$-statistic

$$t = \frac{\bar{X} - \mu}{\hat{\sigma}/\sqrt{n}}.$$

Unfortunately, the $t$-statistic does not have a standard normal distribution. We use the results discussed earlier to derive the distribution of the $t$-statistic. We begin by deriving a matrix expression for $\hat{\sigma}^2$. Observe first that

$$\begin{pmatrix} X_1 - \bar{X} \\ X_2 - \bar{X} \\ \vdots \\ X_n - \bar{X} \end{pmatrix} = \mathbf{x} - \boldsymbol{\iota}\bar{X} = \mathbf{x} - \boldsymbol{\iota}\tfrac{1}{n}\boldsymbol{\iota}'\mathbf{x} = \left(\mathbf{I} - \tfrac{1}{n}\boldsymbol{\iota}\boldsymbol{\iota}'\right)\mathbf{x},$$

where $\boldsymbol{\iota}$ denotes the $(n \times 1)$ vector of ones, so that $\bar{X} = \tfrac{1}{n}\boldsymbol{\iota}'\mathbf{x}$.
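The matrix expression can be verified numerically: with $\mathbf{M} = \mathbf{I} - \tfrac{1}{n}\boldsymbol{\iota}\boldsymbol{\iota}'$, the vector $\mathbf{M}\mathbf{x}$ reproduces the deviations $X_i - \bar{X}$, and $\mathbf{x}'\mathbf{M}\mathbf{x}/(n-1)$ reproduces the usual sample variance (the data below are arbitrary).

```python
import numpy as np

rng = np.random.default_rng(5)

# M = I - (1/n) * iota iota' applied to an arbitrary data vector x.
n = 8
x = rng.normal(1.0, 2.0, size=n)

iota = np.ones(n)
M = np.eye(n) - np.outer(iota, iota) / n

print("Mx               :", M @ x)                    # the deviations X_i - Xbar
print("x'Mx/(n-1)       :", x @ M @ x / (n - 1))
print("np.var(x, ddof=1):", np.var(x, ddof=1))        # same number
```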

An interesting property of the matrix $\mathbf{M} = \mathbf{I} - \tfrac{1}{n}\boldsymbol{\iota}\boldsymbol{\iota}'$ is that it is symmetric and idempotent, with rank $n-1$.

Symmetric:
$$\mathbf{M}' = \left(\mathbf{I} - \tfrac{1}{n}\boldsymbol{\iota}\boldsymbol{\iota}'\right)' = \mathbf{I} - \tfrac{1}{n}\boldsymbol{\iota}\boldsymbol{\iota}' = \mathbf{M}.$$

Idempotent:
$$\mathbf{M}\mathbf{M} = \left(\mathbf{I} - \tfrac{1}{n}\boldsymbol{\iota}\boldsymbol{\iota}'\right)\left(\mathbf{I} - \tfrac{1}{n}\boldsymbol{\iota}\boldsymbol{\iota}'\right) = \mathbf{I} - \tfrac{1}{n}\boldsymbol{\iota}\boldsymbol{\iota}' - \tfrac{1}{n}\boldsymbol{\iota}\boldsymbol{\iota}' + \tfrac{1}{n^2}\boldsymbol{\iota}(\boldsymbol{\iota}'\boldsymbol{\iota})\boldsymbol{\iota}' = \mathbf{I} - \tfrac{1}{n}\boldsymbol{\iota}\boldsymbol{\iota}' = \mathbf{M},$$
since $\boldsymbol{\iota}'\boldsymbol{\iota} = n$ cancels one factor of $1/n$ in the last term.

Because $\mathbf{M}$ is symmetric and idempotent, $\operatorname{rank}(\mathbf{M}) = \operatorname{trace}(\mathbf{M})$, and
$$\operatorname{trace}(\mathbf{M}) = \operatorname{trace}(\mathbf{I}) - \tfrac{1}{n}\operatorname{trace}(\boldsymbol{\iota}\boldsymbol{\iota}') = n - \tfrac{1}{n}\operatorname{trace}(\boldsymbol{\iota}'\boldsymbol{\iota}) = n - 1.$$

Therefore
$$\hat{\sigma}^2 = \frac{1}{n-1}\sum_{i=1}^{n} (X_i - \bar{X})^2 = \frac{1}{n-1}\,\mathbf{x}'\mathbf{M}'\mathbf{M}\mathbf{x} = \frac{1}{n-1}\,\mathbf{x}'\mathbf{M}\mathbf{x}.$$

Furthermore, note that $(\mathbf{x} - \boldsymbol{\mu})/\sigma \sim N(\mathbf{0}, \mathbf{I})$ and $\mathbf{M}\mathbf{x} = \mathbf{M}(\mathbf{x} - \boldsymbol{\mu})$ since $\mathbf{M}\boldsymbol{\mu} = \mathbf{0}$ (why?). Together with the fact that $\mathbf{M}$ is symmetric and idempotent with rank $n-1$, result (7) implies that
$$\frac{\mathbf{x}'\mathbf{M}\mathbf{x}}{\sigma^2} = \frac{(\mathbf{x} - \boldsymbol{\mu})'\mathbf{M}(\mathbf{x} - \boldsymbol{\mu})}{\sigma^2} \sim \chi^2(n-1).$$
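A simulation check of this $\chi^2(n-1)$ result, together with the symmetry, idempotency and trace properties of $\mathbf{M}$ ($\mu$, $\sigma$ and $n$ are illustrative):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)

# For x ~ N(mu*iota, sigma^2 I), x'Mx / sigma^2 should be chi2(n-1).
mu, sigma, n = 1.0, 2.0, 6
M = np.eye(n) - np.ones((n, n)) / n

print("M symmetric:", np.allclose(M, M.T),
      " idempotent:", np.allclose(M @ M, M),
      " trace:", np.trace(M))                       # trace = rank = n-1

reps = 100_000
x = rng.normal(mu, sigma, size=(reps, n))
q = np.einsum('ij,jk,ik->i', x, M, x) / sigma**2    # x'Mx / sigma^2 for each draw

print("empirical mean/var:", q.mean(), q.var())     # chi2(n-1): mean n-1, var 2(n-1)
print("KS p-value vs chi2(%d): %.3f" % (n - 1, stats.kstest(q, 'chi2', args=(n - 1,)).pvalue))
```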

This result is consistent with the fact that $E[\hat{\sigma}^2] = \sigma^2$: the mean of a $\chi^2(n-1)$ is $n-1$, i.e. $E[\mathbf{x}'\mathbf{M}\mathbf{x}/\sigma^2] = n-1$, and since $\hat{\sigma}^2 = \mathbf{x}'\mathbf{M}\mathbf{x}/(n-1)$, the result follows. The result that $E[\hat{\sigma}^2] = \sigma^2$ does not depend on the normality of $\mathbf{x}$. If we have normality of $\mathbf{x}$, then we have the $\chi^2$ result just shown, which is of course a much stronger one.

Finally, note that

$$\frac{\bar{X} - \mu}{\sigma/\sqrt{n}} = \frac{\tfrac{1}{n}\boldsymbol{\iota}'(\mathbf{x} - \boldsymbol{\mu})}{\sigma/\sqrt{n}} \sim N(0, 1),$$

and that

$$\tfrac{1}{n}\boldsymbol{\iota}'\mathbf{M} = \tfrac{1}{n}\boldsymbol{\iota}'\left(\mathbf{I} - \tfrac{1}{n}\boldsymbol{\iota}\boldsymbol{\iota}'\right) = \tfrac{1}{n}\boldsymbol{\iota}' - \tfrac{1}{n^2}(\boldsymbol{\iota}'\boldsymbol{\iota})\boldsymbol{\iota}' = \tfrac{1}{n}\boldsymbol{\iota}' - \tfrac{1}{n}\boldsymbol{\iota}' = \mathbf{0}',$$

since $\boldsymbol{\iota}'\boldsymbol{\iota} = n$ cancels one factor of $1/n$. By result (10), this says that $(\bar{X} - \mu)/(\sigma/\sqrt{n})$ and $\hat{\sigma}^2 = \mathbf{x}'\mathbf{M}\mathbf{x}/(n-1)$ are independent. Therefore

$$t = \frac{\bar{X} - \mu}{\hat{\sigma}/\sqrt{n}} = \frac{(\bar{X} - \mu)/(\sigma/\sqrt{n})}{\sqrt{\dfrac{(n-1)\hat{\sigma}^2/\sigma^2}{n-1}}} \sim \frac{N(0,1)}{\sqrt{\chi^2(n-1)/(n-1)}} \sim t(n-1).$$

Of course, if the $X_i$'s are not random draws from $N(\mu, \sigma^2)$, then these results no longer hold (except $E[\bar{X}] = \mu$, $\operatorname{var}[\bar{X}] = \sigma^2/n$ and $E[\hat{\sigma}^2] = \sigma^2$, which do not require the normality assumption). Under reasonable conditions, the $t$-statistic will converge to the normal as the sample size grows.
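Finally, a simulation sketch of the $t(n-1)$ result: under normal sampling the $t$-statistic matches the $t(n-1)$ distribution but not the standard normal (parameter values are illustrative).

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)

# t = (Xbar - mu) / (sigma_hat / sqrt(n)) should follow t(n-1) exactly under
# normal sampling (illustrative mu, sigma, n).
mu, sigma, n = 0.5, 2.0, 5
reps = 200_000

x = rng.normal(mu, sigma, size=(reps, n))
xbar = x.mean(axis=1)
s = x.std(axis=1, ddof=1)          # sigma_hat for each sample
t = (xbar - mu) / (s / np.sqrt(n))

print("KS p-value vs t(%d):  %.3f" % (n - 1, stats.kstest(t, 't', args=(n - 1,)).pvalue))
print("KS p-value vs N(0,1): %.3g" % stats.kstest(t, 'norm').pvalue)   # rejected: not standard normal
```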