Lecture: The Sample Mean and the Sample Variance Under the Assumption of Normality

Math 408 - Mathematical Statistics
Lecture 13-14: The Sample Mean and the Sample Variance Under Assumption of Normality
Konstantin Zuev (USC), February 20, 2013

Framework

Let X_1, ..., X_n be a sample drawn from a population. Suppose that the population is Gaussian:

    X_1, ..., X_n ~ N(µ, σ²)

We want to estimate the population parameters µ and σ².

Definition
The sample mean is X̄_n = (1/n) Σ_{i=1}^n X_i.
The sample variance is S_n² = (1/(n−1)) Σ_{i=1}^n (X_i − X̄_n)².

Theorem
X̄_n and S_n² are unbiased estimators of µ and σ², respectively:

    E[X̄_n] = µ,    E[S_n²] = σ²

Our goal: to describe the distributions of X̄_n and S_n².

Distribution of X̄_n

Theorem
If X_1, ..., X_n are independent N(µ, σ²) random variables, then

    X̄_n ~ N(µ, σ²/n)
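This theorem is easy to check by simulation. The sketch below (pure Python; the parameter values µ = 10, σ = 2, n = 25 are illustrative, not from the lecture) draws many samples of size n and verifies that the sample means scatter around µ with standard deviation σ/√n:

```python
import math
import random

random.seed(0)

mu, sigma, n = 10.0, 2.0, 25
trials = 20_000

# Draw many samples of size n and record each sample mean.
means = []
for _ in range(trials):
    sample = [random.gauss(mu, sigma) for _ in range(n)]
    means.append(sum(sample) / n)

# The theorem predicts mean mu and standard deviation sigma / sqrt(n) = 0.4.
emp_mean = sum(means) / trials
emp_sd = math.sqrt(sum((m - emp_mean) ** 2 for m in means) / (trials - 1))

print(round(emp_mean, 2))  # close to mu = 10.0
print(round(emp_sd, 3))    # close to sigma / sqrt(n) = 0.4
```

Note that the variance of X̄_n shrinks like 1/n: quadrupling the sample size halves the standard deviation of the sample mean.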

Distribution of S_n²

Theorem
If X_1, ..., X_n are independent N(µ, σ²) random variables, then

    (n − 1) S_n² / σ² ~ χ²_{n−1}

The χ²-distribution

Definition
Let Z_1, ..., Z_n be independent standard normal variables, Z_1, ..., Z_n ~ N(0, 1). Then the distribution of

    Q = Z_1² + Z_2² + ... + Z_n²

is called the χ²-distribution with n degrees of freedom, Q ~ χ²_n.

Probability density function:

    π(x) = x^{n/2 − 1} e^{−x/2} / (2^{n/2} Γ(n/2)),   x ≥ 0

where Γ is the gamma function, Γ(z) = ∫_0^∞ t^{z−1} e^{−t} dt.
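The definition itself gives a way to sample from χ²_n: just sum n squared standard normals. A short simulation sketch (pure Python; n = 4 is an illustrative choice) confirms the standard facts that χ²_n has mean n and variance 2n:

```python
import random

random.seed(1)

n = 4          # degrees of freedom
trials = 50_000

# Q = Z_1^2 + ... + Z_n^2 with Z_i ~ N(0, 1) has the chi^2_n distribution.
qs = [sum(random.gauss(0.0, 1.0) ** 2 for _ in range(n)) for _ in range(trials)]

mean_q = sum(qs) / trials
var_q = sum((q - mean_q) ** 2 for q in qs) / (trials - 1)

print(round(mean_q, 2))  # chi^2_n has mean n, so about 4
print(round(var_q, 2))   # and variance 2n, so about 8
```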

The χ²-distribution

The χ²-distribution is especially important in hypothesis testing.

Nice properties:
- If X ~ N(µ, σ²), then (X − µ)/σ ~ N(0, 1) and ((X − µ)/σ)² ~ χ²_1.
- If U ~ χ²_n and V ~ χ²_m, and U and V are independent, then U + V ~ χ²_{n+m}.

Graph of the χ²_n PDF: small n

[Figure: PDF π(x) of the χ²_n distribution for n = 1, 2, 3, 5, plotted on 0 ≤ x ≤ 5.]

Graph of the χ²_n PDF: large n

[Figure: PDF π(x) of the χ²_n distribution for n = 10, 15, 20, 25, plotted on 0 ≤ x ≤ 50.]

CLT: χ²_n converges to a normal distribution as n → ∞:

    χ²_n ≈ N(n, 2n) as n → ∞

When n > 50, for many practical purposes, χ²_n ≈ N(n, 2n).
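The normal approximation can be checked empirically. The sketch below (pure Python; n = 100 is an illustrative large-n choice) standardizes simulated χ²_n draws by the approximating N(n, 2n); if the approximation is good, about 68.3% of the standardized values should fall within one unit of zero:

```python
import math
import random

random.seed(2)

n = 100        # large degrees of freedom
trials = 10_000

# Standardize Q ~ chi^2_n using the approximating N(n, 2n).
zs = []
for _ in range(trials):
    q = sum(random.gauss(0.0, 1.0) ** 2 for _ in range(n))
    zs.append((q - n) / math.sqrt(2 * n))

# Under N(0, 1), about 68.3% of draws fall in [-1, 1].
frac = sum(1 for z in zs if -1.0 <= z <= 1.0) / trials
print(round(frac, 3))  # close to 0.683
```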

Distribution of S_n²

Theorem
If X_1, ..., X_n are independent N(µ, σ²) random variables, then

    (n − 1) S_n² / σ² ~ χ²_{n−1}

The proof is based on moment-generating functions...

Moment-generating functions

Definition
The moment-generating function (MGF) of a random variable X ~ f(x) is

    M(t) = E[e^{tX}] = ∫ e^{tx} f(x) dx

(if the expectation is defined).

Important properties of MGFs:
- If there exists ε > 0 such that M(t) exists for all t ∈ (−ε, ε), then M(t) uniquely determines the probability distribution: M(t) ↔ f(x).
- If M(t) exists in an open interval containing zero, then M^{(r)}(0) = E[X^r] (hence the name).

To find the moments E[X^r] directly, we must integrate. Knowing the MGF allows us to replace integration by differentiation.

Moment-generating functions

Important properties of MGFs (continued):
- If X has the MGF M_X(t) and Y = a + bX, then M_Y(t) = e^{at} M_X(bt).
- If X and Y are independent, then M_{X+Y}(t) = M_X(t) M_Y(t).
- If X and Y have a joint distribution, then their joint MGF is defined as M_{X,Y}(s, t) = E[e^{sX + tY}], and X and Y are independent if and only if M_{X,Y}(s, t) = M_X(s) M_Y(t).

Moment-generating functions: Limitations and Examples

The major limitation of the moment-generating function is that it may not exist. In that case, the characteristic function may be used instead:

    φ(t) = E[e^{itX}]

Examples:
- N(µ, σ²):  M(t) = e^{µt} e^{σ²t²/2}
- χ²_n:  M(t) = (1 − 2t)^{−n/2},  t < 1/2
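The property M^{(r)}(0) = E[X^r] can be illustrated numerically. The sketch below (an illustration not from the lecture; the step size h is an arbitrary choice) differentiates the standard normal MGF M(t) = e^{t²/2} at zero by central finite differences and recovers the moments E[X²] = 1 and E[X⁴] = 3:

```python
import math

# MGF of the standard normal: M(t) = exp(t^2 / 2).
def M(t: float) -> float:
    return math.exp(t * t / 2.0)

h = 0.05  # finite-difference step

# Central differences approximate the derivatives M^(r)(0) = E[X^r].
m2 = (M(h) - 2 * M(0.0) + M(-h)) / h**2
m4 = (M(2 * h) - 4 * M(h) + 6 * M(0.0) - 4 * M(-h) + M(-2 * h)) / h**4

print(round(m2, 2))  # E[X^2] = 1
print(round(m4, 2))  # E[X^4] = 3
```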


Bringing the t-distribution into the Game

Theorem
If X_1, ..., X_n are independent N(µ, σ²) random variables, then

    (X̄_n − µ) / (S_n / √n) ~ t_{n−1}
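The practical point of this theorem is that replacing the unknown σ by S_n changes the distribution of the standardized mean from N(0, 1) to the heavier-tailed t_{n−1}. The simulation sketch below (pure Python; the values µ = 5, σ = 3, n = 5 are illustrative) shows the effect: the t-statistic is symmetric about zero, but for small n far more than 5% of its mass lies beyond ±1.96:

```python
import math
import random

random.seed(3)

mu, sigma, n = 5.0, 3.0, 5   # small n makes the heavy tails visible
trials = 40_000

ts = []
for _ in range(trials):
    x = [random.gauss(mu, sigma) for _ in range(n)]
    xbar = sum(x) / n
    s2 = sum((xi - xbar) ** 2 for xi in x) / (n - 1)  # sample variance
    ts.append((xbar - mu) / math.sqrt(s2 / n))        # t-statistic

# t_{n-1} with n - 1 = 4 df: symmetric about 0, heavier tails than N(0, 1).
mean_t = sum(ts) / trials
tail = sum(1 for t in ts if abs(t) > 1.96) / trials

print(round(mean_t, 2))  # about 0
print(round(tail, 3))    # well above 0.05, the N(0, 1) tail mass beyond 1.96
```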

The t-distribution

Definition
Let Z ~ N(0, 1) and U ~ χ²_n, with Z and U independent. Then the distribution of

    T = Z / √(U/n)

is called the t-distribution with n degrees of freedom.

Probability density function:

    π(x) = Γ((n + 1)/2) / (√(nπ) Γ(n/2)) · (1 + x²/n)^{−(n+1)/2}

- The t-distribution is symmetric about zero: π(x) = π(−x).
- As n → ∞, the t-distribution tends to the standard normal distribution. In fact, when n > 30, the two distributions are very close.

Graph of the t-distribution PDF: small n

[Figure: PDF π(x) of the t-distribution for n = 1, 2, 3 together with N(0, 1), plotted on −5 ≤ x ≤ 5.]

Graph of the t-distribution PDF: large n

[Figure: PDF π(x) of the t-distribution for n = 30 together with N(0, 1), plotted on −5 ≤ x ≤ 5; the two curves nearly coincide.]


Summary

Under the assumption of normality, X_1, ..., X_n ~ N(µ, σ²), the sample mean

    X̄_n = (1/n) Σ_{i=1}^n X_i

and the sample variance

    S_n² = (1/(n−1)) Σ_{i=1}^n (X_i − X̄_n)²

have the following properties:

    X̄_n ~ N(µ, σ²/n)

    (n − 1) S_n² / σ² ~ χ²_{n−1},   where χ²_n = N(0, 1)² + ... + N(0, 1)²  (n terms)

    (X̄_n − µ) / (S_n / √n) ~ t_{n−1},   where t_n = N(0, 1) / √(χ²_n / n)