Lecture 11. Multivariate Normal theory


11.1. Properties of means and covariances of vectors

A random (column) vector $X = (X_1, \ldots, X_n)^T$ has mean
$$\mu = E(X) = (E(X_1), \ldots, E(X_n))^T = (\mu_1, \ldots, \mu_n)^T$$
and covariance matrix
$$\operatorname{cov}(X) = E\!\left[(X - \mu)(X - \mu)^T\right] = \big(\operatorname{cov}(X_i, X_j)\big)_{i,j},$$
provided the relevant expectations exist.

For an $m \times n$ matrix $A$,
$$E[AX] = A\mu, \qquad \operatorname{cov}(AX) = A \operatorname{cov}(X)\, A^T, \tag{1}$$
since
$$\operatorname{cov}(AX) = E\!\left[(AX - E(AX))(AX - E(AX))^T\right] = E\!\left[A(X - E(X))(X - E(X))^T A^T\right].$$

Define $\operatorname{cov}(V, W)$ to be the matrix with $(i, j)$ element $\operatorname{cov}(V_i, W_j)$. Then $\operatorname{cov}(AX, BX) = A \operatorname{cov}(X)\, B^T$. (Check this; it is important later.)
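As a quick aside (not part of the lecture), the two covariance identities above are easy to check by simulation. The sketch below assumes NumPy is available; the matrices and shapes are chosen arbitrarily.

```python
# Simulation check of cov(AX) = A cov(X) A^T and cov(AX, BX) = A cov(X) B^T.
# Illustrative sketch only; all matrices here are arbitrary.
import numpy as np

rng = np.random.default_rng(0)
n, m, N = 4, 2, 200_000

C = rng.standard_normal((n, n))
Sigma = C @ C.T                                   # a valid covariance matrix
mu = rng.standard_normal(n)

X = rng.multivariate_normal(mu, Sigma, size=N)    # rows are draws of X
A = rng.standard_normal((m, n))
B = rng.standard_normal((m, n))
AX, BX = X @ A.T, X @ B.T

# Both differences should be close to zero (up to Monte Carlo error).
print(np.abs(np.cov(AX, rowvar=False) - A @ Sigma @ A.T).max())
cross = np.cov(np.hstack([AX, BX]), rowvar=False)[:m, m:]   # sample cov(AX, BX)
print(np.abs(cross - A @ Sigma @ B.T).max())
```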

11.2. Multivariate normal distribution

Recall that a univariate normal $X \sim N(\mu, \sigma^2)$ has density
$$f_X(x; \mu, \sigma^2) = \frac{1}{\sqrt{2\pi}\,\sigma} \exp\!\left(-\frac{(x-\mu)^2}{2\sigma^2}\right), \quad x \in \mathbb{R},$$
and mgf $M_X(s) = E[e^{sX}] = \exp\!\left(\mu s + \tfrac{1}{2}\sigma^2 s^2\right)$.

$X$ has a multivariate normal distribution if, for every $t \in \mathbb{R}^n$, the rv $t^T X$ has a normal distribution. If $E(X) = \mu$ and $\operatorname{cov}(X) = \Sigma$, we write $X \sim N_n(\mu, \Sigma)$.

Note that $\Sigma$ is symmetric, and it is non-negative definite because, by (1), $t^T \Sigma t = \operatorname{var}(t^T X) \ge 0$.

By (1), $t^T X \sim N(t^T \mu,\, t^T \Sigma t)$ and so has mgf
$$M_{t^T X}(s) = E[e^{s\, t^T X}] = \exp\!\left(t^T \mu\, s + \tfrac{1}{2}\, t^T \Sigma t\, s^2\right).$$
Hence $X$ has mgf
$$M_X(t) = E[e^{t^T X}] = M_{t^T X}(1) = \exp\!\left(t^T \mu + \tfrac{1}{2}\, t^T \Sigma t\right). \tag{2}$$
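The defining property can be illustrated numerically (my own sketch, not from the notes, assuming NumPy and SciPy; the vectors $\mu$, $t$ and the matrix $\Sigma$ are arbitrary choices): every linear combination $t^T X$ should be univariate normal with mean $t^T \mu$ and variance $t^T \Sigma t$.

```python
# Simulation check that t^T X ~ N(t^T mu, t^T Sigma t) when X ~ N_n(mu, Sigma).
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
mu = np.array([1.0, -2.0, 0.5])
L = np.array([[2.0, 0.0, 0.0],
              [1.0, 1.5, 0.0],
              [0.3, -0.4, 0.8]])
Sigma = L @ L.T                                   # positive definite by construction

X = rng.multivariate_normal(mu, Sigma, size=100_000)
t = np.array([0.7, -1.2, 2.0])
proj = X @ t                                      # samples of t^T X

print(proj.mean(), t @ mu)                        # sample mean vs t^T mu
print(proj.var(ddof=1), t @ Sigma @ t)            # sample variance vs t^T Sigma t
# Kolmogorov-Smirnov comparison with the claimed N(t^T mu, t^T Sigma t)
print(stats.kstest(proj, 'norm', args=(t @ mu, np.sqrt(t @ Sigma @ t))))
```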

Proposition 11.1

(i) If $X \sim N_n(\mu, \Sigma)$ and $A$ is $m \times n$, then $AX \sim N_m(A\mu, A\Sigma A^T)$.

(ii) If $X \sim N_n(0, \sigma^2 I)$ then
$$\frac{|X|^2}{\sigma^2} = \frac{X^T X}{\sigma^2} = \frac{\sum_i X_i^2}{\sigma^2} \sim \chi^2_n.$$

Proof: (i) From Example Sheet 3. (ii) Immediate from the definition of $\chi^2_n$.

Note that we often write $|X|^2 \sim \sigma^2 \chi^2_n$.
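A small simulation sketch of part (ii), my own illustration assuming NumPy and SciPy (the values of $n$ and $\sigma$ are arbitrary):

```python
# If X ~ N_n(0, sigma^2 I), then |X|^2 / sigma^2 should be chi-squared with n df.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
n, sigma, N = 5, 2.0, 100_000
X = rng.normal(0.0, sigma, size=(N, n))           # N independent draws of X ~ N_n(0, sigma^2 I)
q = (X ** 2).sum(axis=1) / sigma ** 2             # |X|^2 / sigma^2 for each draw

print(q.mean(), n)                                # chi^2_n has mean n
print(stats.kstest(q, 'chi2', args=(n,)))         # compare with chi^2_n
```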

Proposition 11.2

Let $X \sim N_n(\mu, \Sigma)$ and partition $X = \begin{pmatrix} X_1 \\ X_2 \end{pmatrix}$, where $X_i$ is an $n_i \times 1$ column vector and $n_1 + n_2 = n$. Write similarly $\mu = \begin{pmatrix} \mu_1 \\ \mu_2 \end{pmatrix}$ and $\Sigma = \begin{pmatrix} \Sigma_{11} & \Sigma_{12} \\ \Sigma_{21} & \Sigma_{22} \end{pmatrix}$, where $\Sigma_{ij}$ is $n_i \times n_j$. Then

(i) $X_i \sim N_{n_i}(\mu_i, \Sigma_{ii})$;

(ii) $X_1$ and $X_2$ are independent iff $\Sigma_{12} = 0$.

Proof: (i) See Example Sheet 3.

(ii) From (2), $M_X(t) = \exp\!\left(t^T \mu + \tfrac{1}{2} t^T \Sigma t\right)$, $t \in \mathbb{R}^n$. Writing $t = \begin{pmatrix} t_1 \\ t_2 \end{pmatrix}$,
$$M_X(t) = \exp\!\left(t_1^T \mu_1 + t_2^T \mu_2 + \tfrac{1}{2} t_1^T \Sigma_{11} t_1 + \tfrac{1}{2} t_2^T \Sigma_{22} t_2 + \tfrac{1}{2} t_1^T \Sigma_{12} t_2 + \tfrac{1}{2} t_2^T \Sigma_{21} t_1\right).$$
From (i), $M_{X_i}(t_i) = \exp\!\left(t_i^T \mu_i + \tfrac{1}{2} t_i^T \Sigma_{ii} t_i\right)$, so $M_X(t) = M_{X_1}(t_1) M_{X_2}(t_2)$ for all $t$ iff $\Sigma_{12} = 0$.
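The "iff" in (ii) can also be seen informally by simulation (my own sketch, assuming NumPy; the block covariance matrix is an arbitrary choice with $\Sigma_{12} = 0$): events depending only on $X_1$ and only on $X_2$ should have factorizing probabilities.

```python
# With block-diagonal covariance, probabilities of events depending only on X_1
# and only on X_2 should (approximately) factorize.  Illustrative sketch only.
import numpy as np

rng = np.random.default_rng(3)
Sigma = np.block([[np.array([[2.0, 0.5], [0.5, 1.0]]), np.zeros((2, 2))],
                  [np.zeros((2, 2)), np.array([[1.0, -0.3], [-0.3, 0.5]])]])
X = rng.multivariate_normal(np.zeros(4), Sigma, size=500_000)

A = X[:, 0] > 1.0            # an event depending only on X_1
B = X[:, 3] > 0.5            # an event depending only on X_2
print((A & B).mean(), A.mean() * B.mean())   # should be approximately equal
```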

11.3. Density for a multivariate normal

When $\Sigma$ is positive definite, $X$ has pdf
$$f_X(x; \mu, \Sigma) = \frac{1}{|\Sigma|^{1/2}} \left(\frac{1}{\sqrt{2\pi}}\right)^{n} \exp\!\left[-\tfrac{1}{2}(x-\mu)^T \Sigma^{-1} (x-\mu)\right], \quad x \in \mathbb{R}^n.$$
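As a sanity check (my own sketch, assuming NumPy and SciPy; the function name `mvn_pdf` and the particular $\mu$, $\Sigma$, $x$ are hypothetical choices), the formula above can be evaluated directly and compared with `scipy.stats.multivariate_normal`.

```python
# Evaluate the multivariate normal density formula directly and compare with SciPy.
import numpy as np
from scipy.stats import multivariate_normal

mu = np.array([1.0, -1.0])
Sigma = np.array([[2.0, 0.8],
                  [0.8, 1.0]])

def mvn_pdf(x, mu, Sigma):
    n = len(mu)
    d = x - mu
    quad = d @ np.linalg.solve(Sigma, d)          # (x - mu)^T Sigma^{-1} (x - mu)
    return np.exp(-0.5 * quad) / (np.sqrt(np.linalg.det(Sigma)) * (2 * np.pi) ** (n / 2))

x = np.array([0.3, 0.2])
print(mvn_pdf(x, mu, Sigma))
print(multivariate_normal(mean=mu, cov=Sigma).pdf(x))   # should agree
```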

11.4. Normal random samples

We now consider $\bar{X} = \frac{1}{n}\sum_i X_i$ and $S_{XX} = \sum_i (X_i - \bar{X})^2$ for univariate normal data.

Theorem 11.3 (Joint distribution of $\bar{X}$ and $S_{XX}$)

Suppose $X_1, \ldots, X_n$ are iid $N(\mu, \sigma^2)$, $\bar{X} = \frac{1}{n}\sum_i X_i$, and $S_{XX} = \sum_i (X_i - \bar{X})^2$. Then

(i) $\bar{X} \sim N(\mu, \sigma^2/n)$;

(ii) $S_{XX}/\sigma^2 \sim \chi^2_{n-1}$;

(iii) $\bar{X}$ and $S_{XX}$ are independent.
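Before the proof, here is a simulation sketch of the three claims (my own illustration, not from the notes, assuming NumPy and SciPy; the parameter values are arbitrary, and a near-zero sample correlation is only consistent with, not a proof of, independence).

```python
# Simulation check of Theorem 11.3 over many repeated normal samples.
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
mu, sigma, n, reps = 2.0, 1.5, 8, 200_000
X = rng.normal(mu, sigma, size=(reps, n))

xbar = X.mean(axis=1)
Sxx = ((X - xbar[:, None]) ** 2).sum(axis=1)

print(xbar.mean(), mu)                               # (i) E(Xbar) = mu
print(xbar.var(ddof=1), sigma ** 2 / n)              # (i) var(Xbar) = sigma^2 / n
print(stats.kstest(Sxx / sigma ** 2, 'chi2', args=(n - 1,)))   # (ii) S_XX/sigma^2 ~ chi^2_{n-1}
print(np.corrcoef(xbar, Sxx)[0, 1])                  # (iii) near 0, consistent with independence
```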

Proof. We can write $X \sim N_n(\mu, \sigma^2 I)$, where $\mu = \mu \mathbf{1}$ ($\mathbf{1}$ is an $n \times 1$ column vector of 1's). Let $A$ be the $n \times n$ orthogonal matrix
$$A = \begin{pmatrix}
\frac{1}{\sqrt{n}} & \frac{1}{\sqrt{n}} & \frac{1}{\sqrt{n}} & \cdots & \frac{1}{\sqrt{n}} \\[2pt]
\frac{1}{\sqrt{2}} & -\frac{1}{\sqrt{2}} & 0 & \cdots & 0 \\[2pt]
\frac{1}{\sqrt{6}} & \frac{1}{\sqrt{6}} & -\frac{2}{\sqrt{6}} & \cdots & 0 \\
\vdots & & & \ddots & \vdots \\
\frac{1}{\sqrt{n(n-1)}} & \frac{1}{\sqrt{n(n-1)}} & \cdots & \frac{1}{\sqrt{n(n-1)}} & -\frac{n-1}{\sqrt{n(n-1)}}
\end{pmatrix},$$
so that $A^T A = A A^T = I$. (Check.) (Note that the rows form an orthonormal basis of $\mathbb{R}^n$.) (Strictly, we just need an orthogonal matrix whose first row matches that of $A$ above.)
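A sketch of this construction in code (my own, assuming NumPy; the helper name `helmert` is hypothetical): build the matrix displayed above and confirm that $A A^T = I$.

```python
# Build the Helmert-type matrix A from the proof and check orthogonality.
import numpy as np

def helmert(n):
    A = np.zeros((n, n))
    A[0, :] = 1.0 / np.sqrt(n)                     # first row: 1/sqrt(n) everywhere
    for k in range(1, n):
        A[k, :k] = 1.0 / np.sqrt(k * (k + 1))      # first k entries of row k+1
        A[k, k] = -k / np.sqrt(k * (k + 1))        # then -k/sqrt(k(k+1)), rest zero
    return A

A = helmert(5)
print(np.allclose(A @ A.T, np.eye(5)))             # orthogonality check: True
```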

By Proposition 11.1(i), $Y = AX \sim N_n(A\mu, \sigma^2 A A^T) = N_n(A\mu, \sigma^2 I)$, since $A A^T = I$.

We have $A\mu = (\sqrt{n}\,\mu, 0, \ldots, 0)^T$, so $Y_1 = \frac{1}{\sqrt{n}} \sum_{i=1}^n X_i = \sqrt{n}\,\bar{X} \sim N(\sqrt{n}\,\mu, \sigma^2)$, $Y_i \sim N(0, \sigma^2)$ for $i = 2, \ldots, n$, and $Y_1, \ldots, Y_n$ are independent.

Note also that
$$Y_2^2 + \cdots + Y_n^2 = Y^T Y - Y_1^2 = X^T A^T A X - Y_1^2 = X^T X - n\bar{X}^2 = \sum_{i=1}^n X_i^2 - n\bar{X}^2 = \sum_{i=1}^n (X_i - \bar{X})^2 = S_{XX}.$$

Part (i) follows since $\bar{X} = Y_1/\sqrt{n} \sim N(\mu, \sigma^2/n)$. To prove (ii), note that $S_{XX} = Y_2^2 + \cdots + Y_n^2 \sim \sigma^2 \chi^2_{n-1}$ (from the definition of $\chi^2_{n-1}$). Finally, for (iii): since $Y_1$ and $(Y_2, \ldots, Y_n)$ are independent (Proposition 11.2(ii)), so are $\bar{X}$ and $S_{XX}$.
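The two algebraic identities used here, $Y_1 = \sqrt{n}\,\bar{X}$ and $Y_2^2 + \cdots + Y_n^2 = S_{XX}$, can be checked numerically for a single sample (my own sketch, assuming NumPy; `helmert` is the same hypothetical helper as in the previous sketch, repeated here so the snippet is self-contained).

```python
# For one sample x and the matrix A above, Y = A x satisfies
# Y_1 = sqrt(n) * xbar and Y_2^2 + ... + Y_n^2 = S_XX.
import numpy as np

def helmert(n):
    A = np.zeros((n, n))
    A[0, :] = 1.0 / np.sqrt(n)
    for k in range(1, n):
        A[k, :k] = 1.0 / np.sqrt(k * (k + 1))
        A[k, k] = -k / np.sqrt(k * (k + 1))
    return A

rng = np.random.default_rng(6)
n = 6
x = rng.normal(1.0, 2.0, size=n)
y = helmert(n) @ x

print(y[0], np.sqrt(n) * x.mean())                        # Y_1 = sqrt(n) * xbar
print((y[1:] ** 2).sum(), ((x - x.mean()) ** 2).sum())    # Y_2^2 + ... + Y_n^2 = S_XX
```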

11.5. Student's t-distribution

Suppose that $Z$ and $Y$ are independent, $Z \sim N(0, 1)$ and $Y \sim \chi^2_k$. Then
$$T = \frac{Z}{\sqrt{Y/k}}$$
is said to have a t-distribution on $k$ degrees of freedom, and we write $T \sim t_k$.

The density of $t_k$ turns out to be
$$f_T(t) = \frac{\Gamma((k+1)/2)}{\Gamma(k/2)\sqrt{\pi k}} \left(1 + \frac{t^2}{k}\right)^{-(k+1)/2}, \quad t \in \mathbb{R}.$$

This density is symmetric, bell-shaped, and has a maximum at $t = 0$, rather like the standard normal density. However, it can be shown that $P(T > t) > P(Z > t)$ for all $t > 0$, and that the $t_k$ distribution approaches a normal distribution as $k \to \infty$.

$E(T) = 0$ for $k > 1$, otherwise undefined. $\operatorname{var}(T) = \frac{k}{k-2}$ for $k > 2$, $= \infty$ for $k = 2$, otherwise undefined.

The case $k = 1$ is known as the Cauchy distribution, which has undefined mean and variance.
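A small check of the density formula and the heavier-tail claim (my own sketch, assuming SciPy; the function name `t_pdf` and the chosen $k$, $t$ are arbitrary):

```python
# Compare the t_k density formula with scipy.stats.t, and illustrate
# P(T > t) > P(Z > t) for t > 0.
import numpy as np
from scipy import stats
from scipy.special import gammaln

def t_pdf(t, k):
    # log-Gamma for numerical stability; this is the displayed formula
    logc = gammaln((k + 1) / 2) - gammaln(k / 2) - 0.5 * np.log(np.pi * k)
    return np.exp(logc) * (1 + t ** 2 / k) ** (-(k + 1) / 2)

k, t = 5, 1.7
print(t_pdf(t, k), stats.t.pdf(t, df=k))          # should agree
print(stats.t.sf(t, df=k), stats.norm.sf(t))      # t tail probability exceeds normal tail
```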

Let $t_k(\alpha)$ be the upper $100\alpha\%$ point of the $t_k$-distribution, so that $P(T > t_k(\alpha)) = \alpha$. There are tables of these percentage points.
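In practice the same percentage points can be computed directly (a sketch assuming SciPy; the values of $k$ and $\alpha$ are arbitrary): $t_k(\alpha)$ is the value exceeded with probability $\alpha$, i.e. the $(1-\alpha)$ quantile.

```python
# Percentage points of t_k without tables.
from scipy import stats

k, alpha = 9, 0.025
print(stats.t.isf(alpha, df=k))          # upper 100*alpha% point t_k(alpha)
print(stats.t.ppf(1 - alpha, df=k))      # same value via the quantile function
```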

11.6. Application of Student's t-distribution to normal random samples

Let $X_1, \ldots, X_n$ be iid $N(\mu, \sigma^2)$. From Theorem 11.3, $\bar{X} \sim N(\mu, \sigma^2/n)$, so $Z = \sqrt{n}(\bar{X} - \mu)/\sigma \sim N(0, 1)$. Also $S_{XX}/\sigma^2 \sim \chi^2_{n-1}$, independently of $\bar{X}$ and hence of $Z$. Hence
$$\frac{\sqrt{n}(\bar{X} - \mu)/\sigma}{\sqrt{S_{XX}/((n-1)\sigma^2)}} \sim t_{n-1}, \quad \text{i.e.} \quad \frac{\sqrt{n}(\bar{X} - \mu)}{\sqrt{S_{XX}/(n-1)}} \sim t_{n-1}. \tag{3}$$

Let $\tilde{\sigma}^2 = \frac{S_{XX}}{n-1}$. Note that this is an unbiased estimator of $\sigma^2$, since $E(S_{XX}) = (n-1)\sigma^2$. Then a $100(1-\alpha)\%$ CI for $\mu$ is found from
$$1 - \alpha = P\!\left(-t_{n-1}\!\left(\tfrac{\alpha}{2}\right) \le \frac{\sqrt{n}(\bar{X} - \mu)}{\tilde{\sigma}} \le t_{n-1}\!\left(\tfrac{\alpha}{2}\right)\right)$$
and has endpoints
$$\bar{X} \pm \frac{\tilde{\sigma}}{\sqrt{n}}\, t_{n-1}\!\left(\tfrac{\alpha}{2}\right).$$

See Example Sheet 3 for the use of t-distributions in hypothesis tests.
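A worked sketch of this interval in code (my own illustration, assuming NumPy and SciPy; the simulated data and parameter values are arbitrary):

```python
# Compute the 100(1 - alpha)% CI for mu with endpoints
# xbar +/- (sigma_tilde / sqrt(n)) * t_{n-1}(alpha/2).
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
mu_true, sigma_true, n, alpha = 10.0, 3.0, 12, 0.05
x = rng.normal(mu_true, sigma_true, size=n)        # a normal random sample

xbar = x.mean()
sigma_tilde = np.sqrt(((x - xbar) ** 2).sum() / (n - 1))   # sqrt(S_XX / (n - 1))
half_width = sigma_tilde / np.sqrt(n) * stats.t.isf(alpha / 2, df=n - 1)

print((xbar - half_width, xbar + half_width))      # 95% CI for mu
```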