Probability and Statistics Notes


Probability and Statistics Notes, Chapter Five
Jesse Crawford, Department of Mathematics, Tarleton State University
Spring 2011

Outline
1. Section 5.2: Transformations of Two Random Variables
2. Section 5.3: Several Independent Random Variables
3. Section 5.4: The Moment-Generating Function Technique
4. Section 5.5: Random Functions Associated with Normal Distributions
5. Section 5.6: The Central Limit Theorem
6. Section 5.7: Approximations for Discrete Distributions
7. Review for Exam 1

Change of Variables in the Bivariate Case

Theorem. Suppose X_1 and X_2 are random variables with joint p.d.f. f(x_1, x_2). Let Y_1 = u_1(X_1, X_2) and Y_2 = u_2(X_1, X_2), and assume the transformation is one-to-one and satisfies regularity conditions analogous to those in Section 5.1. Let X_1 = v_1(Y_1, Y_2) and X_2 = v_2(Y_1, Y_2) be the inverse mappings. Then the joint p.d.f. of Y_1 and Y_2 is

g(y_1, y_2) = f(v_1(y_1, y_2), v_2(y_1, y_2)) |J|,

where J is the Jacobian determinant of the inverse mapping:

J = det [ ∂x_1/∂y_1  ∂x_1/∂y_2 ; ∂x_2/∂y_1  ∂x_2/∂y_2 ].
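The theorem can be carried out symbolically as a sanity check. The sketch below uses sympy; the density (two independent Exp(1) variables) and the transformation Y_1 = X_1 + X_2, Y_2 = X_1 − X_2 are illustrative choices of mine, not from the notes:

```python
import sympy as sp

x1, x2, y1, y2 = sp.symbols('x1 x2 y1 y2')

# Illustrative choice: X1, X2 independent Exp(1), so f(x1, x2) = e^(-x1 - x2)
# on the support x1, x2 > 0.
f = sp.exp(-x1 - x2)

# Transformation Y1 = X1 + X2, Y2 = X1 - X2, with inverse mappings
# X1 = v1(Y1, Y2) = (Y1 + Y2)/2 and X2 = v2(Y1, Y2) = (Y1 - Y2)/2.
v1 = (y1 + y2) / 2
v2 = (y1 - y2) / 2

# Jacobian determinant of the inverse mapping
J = sp.Matrix([[sp.diff(v1, y1), sp.diff(v1, y2)],
               [sp.diff(v2, y1), sp.diff(v2, y2)]]).det()

# g(y1, y2) = f(v1, v2) |J|, supported on y1 > 0, |y2| < y1
g = f.subs({x1: v1, x2: v2}) * sp.Abs(J)
print(sp.simplify(g))  # exp(-y1)/2
```

Here J = −1/2, so the factor |J| = 1/2 is exactly what keeps g integrating to 1 over its support.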

The F Distribution

Definition. Suppose U and V are independent, with U ∼ χ²(r_1) and V ∼ χ²(r_2). Then the random variable

W = (U/r_1) / (V/r_2)

is said to have an F distribution with r_1 and r_2 degrees of freedom, denoted F(r_1, r_2). If 0 < α < 1, then F_α(r_1, r_2) is the critical value such that P[W ≥ F_α(r_1, r_2)] = α.

Proposition. If W ∼ F(r_1, r_2), then the p.d.f. of W is

f(w) = (r_1/r_2)^{r_1/2} Γ[(r_1 + r_2)/2] w^{r_1/2 − 1} / { Γ(r_1/2) Γ(r_2/2) [1 + (r_1 w/r_2)]^{(r_1+r_2)/2} },  for w > 0.
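The reconstructed p.d.f. can be checked numerically against scipy's F distribution. This is my sketch, not part of the notes; the degrees of freedom (3, 7) and evaluation points are arbitrary:

```python
from math import gamma
from scipy.stats import f as f_dist

def f_pdf(w, r1, r2):
    """P.d.f. of F(r1, r2), transcribed from the proposition above."""
    num = (r1 / r2) ** (r1 / 2) * gamma((r1 + r2) / 2) * w ** (r1 / 2 - 1)
    den = gamma(r1 / 2) * gamma(r2 / 2) * (1 + r1 * w / r2) ** ((r1 + r2) / 2)
    return num / den

# Cross-check against scipy's F p.d.f. at a few points
for w in (0.5, 1.0, 2.5):
    assert abs(f_pdf(w, 3, 7) - f_dist.pdf(w, 3, 7)) < 1e-12
print("matches scipy.stats.f.pdf")
```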


Independence and Random Samples

Definition. Suppose X_1, ..., X_n are random variables with joint p.d.f. f(x_1, ..., x_n), and let f_i(x_i) be the p.d.f. of X_i, for i = 1, ..., n. Then X_1, ..., X_n are independent if

f(x_1, ..., x_n) = f_1(x_1) ··· f_n(x_n).

If these random variables all have the same distribution, they are said to be identically distributed. If X_1, ..., X_n are independent and identically distributed (IID), they are referred to as a random sample of size n from their common distribution.

Example. A certain population of women has heights that are normally distributed, with mean 64 inches and standard deviation 2 inches. Let (X_1, X_2, X_3) be a random sample of size 3 from this population. Find the joint p.d.f. of (X_1, X_2, X_3), and find the probability that everyone's height in the sample exceeds 67 inches.
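The second part of the example reduces, by independence, to cubing a single normal tail probability. A quick check with scipy (my sketch, not from the notes):

```python
from scipy.stats import norm

# Heights ~ N(64, 2^2); (X1, X2, X3) a random sample of size 3.
# By independence, P(all three exceed 67) = [1 - Φ((67 - 64)/2)]^3.
p_one = 1 - norm.cdf(67, loc=64, scale=2)  # 1 - Φ(1.5)
p_all = p_one ** 3
print(round(p_all, 6))  # 0.000298
```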

Proposition. If X_1, ..., X_n are independent, then for any sets A_1, ..., A_n,

P(X_1 ∈ A_1, ..., X_n ∈ A_n) = P(X_1 ∈ A_1) ··· P(X_n ∈ A_n).

Also, for any functions u_1, ..., u_n,

E[u_1(X_1) ··· u_n(X_n)] = E[u_1(X_1)] ··· E[u_n(X_n)].

Mean and Variance of a Linear Combination of R.V.'s

Theorem. Suppose X_1, ..., X_n are independent R.V.'s with means µ_1, ..., µ_n and variances σ_1², ..., σ_n². If a_1, ..., a_n ∈ R, then

E[a_1 X_1 + ··· + a_n X_n] = a_1 µ_1 + ··· + a_n µ_n, and

Var[a_1 X_1 + ··· + a_n X_n] = a_1² σ_1² + ··· + a_n² σ_n².
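Both formulas are easy to verify by simulation. The coefficients, means, and standard deviations below are arbitrary illustrative values of mine, not from the notes:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative constants (not from the notes)
a = np.array([2.0, -1.0, 0.5])
mu = np.array([1.0, 3.0, -2.0])
sigma = np.array([1.0, 2.0, 0.5])

# Theoretical mean and variance of a1 X1 + a2 X2 + a3 X3
mean_theory = a @ mu               # -2.0
var_theory = (a**2) @ (sigma**2)   # 8.0625

# Monte Carlo check with independent normal R.V.'s
Y = rng.normal(mu, sigma, size=(200_000, 3)) @ a
print(mean_theory, var_theory)
print(Y.mean(), Y.var())  # close to the theoretical values
```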

Mean and Variance of the Sample Mean

Definition. Let X_1, ..., X_n be a random sample. The sample mean is

X̄ = (1/n) Σ_{i=1}^n X_i.

Proposition. Let X_1, ..., X_n be a random sample from a population with population mean µ and population variance σ². Then

E(X̄) = µ, and Var(X̄) = σ²/n.


m.g.f. of a Linear Combination

Theorem (5.4-1). Suppose X_1, ..., X_n are independent R.V.'s with moment-generating functions M_{X_i}(t), for i = 1, ..., n. Then the moment-generating function of Y = Σ_{i=1}^n a_i X_i is

M_Y(t) = Π_{i=1}^n M_{X_i}(a_i t).

Example (5.4-2). Suppose 0 ≤ p ≤ 1, and X_1, ..., X_n all have a Bernoulli(p) distribution. Find the distribution of Y = Σ_{i=1}^n X_i.
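Theorem 5.4-1 can be checked symbolically for Example 5.4-2: multiplying n Bernoulli m.g.f.'s should reproduce the b(n, p) m.g.f. The sympy sketch below (with n = 5 as an illustrative size, my choice) does exactly that:

```python
import sympy as sp

t, p = sp.symbols('t p')
k = sp.symbols('k', integer=True, nonnegative=True)
n = 5  # illustrative sample size

# m.g.f. of a single Bernoulli(p) variable
M_bern = 1 - p + p * sp.exp(t)

# Theorem 5.4-1 with each a_i = 1: m.g.f. of Y = X1 + ... + Xn
M_Y = M_bern ** n

# m.g.f. of b(n, p), written out from its p.m.f.
M_binom = sp.Sum(sp.binomial(n, k) * p**k * (1 - p)**(n - k) * sp.exp(k * t),
                 (k, 0, n)).doit()

print(sp.expand(M_Y - M_binom))  # 0, so Y ~ b(n, p)
```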

Corollaries for Random Samples

Corollary (5.4-1). Suppose X_1, ..., X_n is a random sample from a distribution with m.g.f. M(t). Then the m.g.f. of Y = Σ_{i=1}^n X_i is

M_Y(t) = [M(t)]^n,

and the m.g.f. of X̄ is

M_X̄(t) = [M(t/n)]^n.

Example (5.4-3). Suppose (X_1, X_2, X_3) is a random sample from an exponential distribution with mean θ. Find the distributions of Y = X_1 + X_2 + X_3 and X̄.
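For Example 5.4-3, Corollary 5.4-1 gives M_Y(t) = (1 − θt)^{−3}, the m.g.f. of a Gamma distribution with shape 3 and scale θ. A Monte Carlo check with numpy/scipy (θ = 2 is my illustrative choice):

```python
import numpy as np
from scipy.stats import gamma as gamma_dist

rng = np.random.default_rng(1)
theta = 2.0  # illustrative exponential mean

# Y = X1 + X2 + X3 for a random sample from Exp(mean θ); the m.g.f.
# argument gives M_Y(t) = (1 - θt)^(-3), i.e. Y ~ Gamma(shape 3, scale θ).
Y = rng.exponential(theta, size=(100_000, 3)).sum(axis=1)

# Compare empirical and theoretical c.d.f.'s at a few points
for y in (2.0, 6.0, 12.0):
    assert abs((Y <= y).mean() - gamma_dist.cdf(y, a=3, scale=theta)) < 0.01
print("sum of 3 exponentials behaves like Gamma(3, scale=theta)")
```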

Theorem (5.4-2). If X_1, ..., X_n are independent, and X_i ∼ χ²(r_i) for each i, then

X_1 + ··· + X_n ∼ χ²(r_1 + ··· + r_n).

Corollary (5.4-2). If Z_1, ..., Z_n are independent standard normal R.V.'s, then

W = Z_1² + ··· + Z_n² ∼ χ²(n).

Corollary (5.4-3). If X_1, ..., X_n are independent, and each X_i ∼ N(µ_i, σ_i²), then

W = Σ_{i=1}^n (X_i − µ_i)²/σ_i² ∼ χ²(n).


Linear Combinations of Independent Normal R.V.'s

Theorem (5.5-1). Suppose X_1, ..., X_n are independent, and X_i ∼ N(µ_i, σ_i²) for i = 1, ..., n. Then

Y = Σ_{i=1}^n c_i X_i ∼ N( Σ_{i=1}^n c_i µ_i, Σ_{i=1}^n c_i² σ_i² ).

Example (5.5-1). Suppose X_1 and X_2 are independent normal random variables, X_1 ∼ N(693.2, 22820) and X_2 ∼ N(631.7, 19205). Find P(X_1 > X_2).
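Example 5.5-1 is an application of the theorem with c_1 = 1, c_2 = −1: the difference D = X_1 − X_2 is normal, and P(X_1 > X_2) = P(D > 0). A quick computation with scipy (my sketch):

```python
from math import sqrt
from scipy.stats import norm

# D = X1 - X2 is normal with mean 693.2 - 631.7 and variance 22820 + 19205.
mu_d = 693.2 - 631.7        # 61.5
sd_d = sqrt(22820 + 19205)  # sqrt(42025) = 205.0

p = 1 - norm.cdf(0, loc=mu_d, scale=sd_d)  # P(D > 0) = Φ(61.5/205) = Φ(0.3)
print(round(p, 4))  # 0.6179
```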

Sample Mean and Variance for a Normal Population

Theorem (5.5-2). Let (X_1, ..., X_n) be a random sample from N(µ, σ²). Then the sample mean

X̄ = (1/n) Σ_{i=1}^n X_i

and the sample variance

S² = [1/(n − 1)] Σ_{i=1}^n (X_i − X̄)²

are independent. Their distributions are

X̄ ∼ N(µ, σ²/n), and S² ∼ [σ²/(n − 1)] χ²(n − 1).

Example. Consider a population of women whose heights are normally distributed with mean 64 inches and standard deviation 2 inches. For a sample of size n = 10, find P(63 < X̄ < 65), and find constants a and b such that P(a < S² < b) = 0.95. Repeat the problem when n = 81.
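The n = 10 case can be worked with scipy: the first part uses X̄ ∼ N(µ, σ²/n), and the second converts chi-square quantiles through (n − 1)S²/σ² ∼ χ²(n − 1). This sketch (mine, not from the notes) uses the equal-tailed choice of a and b:

```python
from math import sqrt
from scipy.stats import chi2, norm

mu, sigma, n = 64, 2, 10

# P(63 < Xbar < 65), with Xbar ~ N(mu, sigma^2/n)
se = sigma / sqrt(n)
p_mean = norm.cdf(65, mu, se) - norm.cdf(63, mu, se)

# (n-1) S^2 / sigma^2 ~ chi2(n-1), so an equal-tailed 95% interval for S^2 is
a = sigma**2 * chi2.ppf(0.025, n - 1) / (n - 1)
b = sigma**2 * chi2.ppf(0.975, n - 1) / (n - 1)

print(round(p_mean, 3), round(a, 2), round(b, 2))  # ≈ 0.886, 1.2, 8.45
```

Rerunning with n = 81 tightens both answers, since the variances of X̄ and S² shrink with n.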

Unbiasedness of X̄ and S²

From Theorem 5.5-2, we have E[X̄] = µ and E[S²] = σ². The sample mean X̄ is used to estimate the population mean µ, and the sample variance S² is used to estimate the population variance σ². On average, each of these estimators equals the parameter it is intended to estimate; that is, X̄ and S² are unbiased.

Remarks about Degrees of Freedom

In the proof of Theorem 5.5-2, we noted that

Σ_{i=1}^n (X_i − µ)²/σ² ∼ χ²(n), and Σ_{i=1}^n (X_i − X̄)²/σ² ∼ χ²(n − 1).

Replacing the parameter µ with its estimator X̄ resulted in the loss of one degree of freedom. There are many examples where the degrees of freedom are reduced by one for each parameter being estimated.

Student's t Distribution

Theorem (5.5-3). Suppose Z and U are independent r.v.'s, with Z ∼ N(0, 1) and U ∼ χ²(r). Then

T = Z / √(U/r)

has a t distribution with r degrees of freedom, denoted t(r). The p.d.f. of a t distribution is

f(t) = Γ[(r + 1)/2] / { √(πr) Γ(r/2) (1 + t²/r)^{(r+1)/2} },  for t ∈ R.
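As with the F p.d.f. earlier, the reconstructed t p.d.f. can be verified against scipy. My sketch, with r = 5 and a few evaluation points chosen arbitrarily:

```python
from math import gamma, pi, sqrt
from scipy.stats import t as t_dist

def t_pdf(x, r):
    """P.d.f. of t(r), transcribed from Theorem 5.5-3."""
    return (gamma((r + 1) / 2) / (sqrt(pi * r) * gamma(r / 2))
            / (1 + x**2 / r) ** ((r + 1) / 2))

for x in (-1.5, 0.0, 2.0):
    assert abs(t_pdf(x, 5) - t_dist.pdf(x, 5)) < 1e-12
print("matches scipy.stats.t.pdf")
```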

Relevance to Samples from Normal Distributions

Corollary. Suppose X_1, ..., X_n is a random sample from N(µ, σ²). Then

T = (X̄ − µ) / (S/√n)

has a t distribution with n − 1 degrees of freedom.


The Central Limit Theorem

Theorem (5.6-1). Suppose X_1, X_2, ... is a sequence of IID random variables from a distribution with finite mean µ and finite positive variance σ². Let X̄ = (1/n) Σ_{i=1}^n X_i, for n = 1, 2, .... Then, as n → ∞,

(X̄ − µ) / (σ/√n) = (Σ_{i=1}^n X_i − nµ) / (√n σ) → N(0, 1) in distribution.

Advanced texts:
Introduction to Mathematical Statistics, 6th ed., by Hogg, McKean, and Craig.
Probability and Measure, 3rd ed., by Billingsley.

Informal Statement of the CLT

Suppose X_1, ..., X_n is a random sample from a distribution with finite mean µ and finite positive variance σ². Then, if n is sufficiently large,

X̄ ≈ N(µ, σ²/n), and Σ_{i=1}^n X_i ≈ N(nµ, nσ²).

Conventionally, values of n ≥ 30 are considered sufficiently large, although this text applies the approximation for smaller values of n, such as n ≥ 20.

Examples

Example. Consider a random sample of size 3000 from a uniform distribution on the interval [0, 1000]. Find (approximately) P(490 < X̄ < 510), and find P(1,470,000 < Σ_{i=1}^{3000} X_i < 1,530,000).
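Both parts are the same event (dividing the sum's bounds by n = 3000 gives 490 and 510), so one CLT computation answers both. A scipy sketch of mine:

```python
from math import sqrt
from scipy.stats import norm

n, mu = 3000, 500.0
sigma = 1000 / sqrt(12)  # s.d. of U[0, 1000]

# Xbar ≈ N(mu, sigma^2/n); the event on the sum is the same event scaled by n,
# since 1,470,000/3000 = 490 and 1,530,000/3000 = 510.
se = sigma / sqrt(n)
p = norm.cdf(510, mu, se) - norm.cdf(490, mu, se)
print(round(p, 4))  # 0.9422
```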

Lottery Tickets

Example. Consider a $1 scratch-off lottery ticket with the following prize structure:

Prize ($)   Probability
0           0.80
2           0.15
10          0.05

Find the expected profit/loss from buying a single ticket, and also find the standard deviation. What is the chance of breaking even if you buy one ticket? If you buy 100 tickets? If you buy 500 tickets?
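For one ticket the break-even probability is exact: P(prize ≥ $1) = 0.15 + 0.05 = 0.20. For 100 or 500 tickets the CLT applies to the total profit. A scipy sketch of mine (no continuity correction, since that refinement is introduced in Section 5.7):

```python
from math import sqrt
from scipy.stats import norm

profits = [-1, 1, 9]          # prize minus the $1 ticket price
probs = [0.80, 0.15, 0.05]

mean = sum(x * p for x, p in zip(profits, probs))              # -0.20
var = sum(x**2 * p for x, p in zip(profits, probs)) - mean**2  # 4.96
print(mean, sqrt(var))

# For a single ticket, P(profit >= 0) = 0.15 + 0.05 = 0.20 exactly.
# CLT approximation to P(total profit >= 0) for n tickets:
for n in (100, 500):
    z = (0 - n * mean) / sqrt(n * var)
    print(n, round(norm.sf(z), 4))
```

The break-even probability drops as n grows, since the total profit concentrates around its (negative) mean.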

Insurance

Example. An auto insurance company has one million (statistically independent) customers. The annual costs incurred by an individual customer due to auto accidents are summarized below:

Cost ($)    Probability
0           0.80
500         0.10
5,000       0.08
15,000      0.02

Also, assume that each customer has at most one accident per year and has a $500 deductible. Find the expected value and variance of a single customer's claims. How much money must the company have to cover all of its customers' claims with 99% probability?
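With the deductible, a customer's claim is max(cost − 500, 0); the reserve then comes from a CLT approximation to the total claims. A scipy sketch of mine:

```python
from math import sqrt
from scipy.stats import norm

costs = [0, 500, 5_000, 15_000]
probs = [0.80, 0.10, 0.08, 0.02]

# With the $500 deductible, the company pays max(cost - 500, 0) per customer.
claims = [max(c - 500, 0) for c in costs]
mean = sum(x * p for x, p in zip(claims, probs))              # 650
var = sum(x**2 * p for x, p in zip(claims, probs)) - mean**2  # 5,402,500

# Reserve needed to cover 1,000,000 independent customers with prob. 0.99:
# total claims ≈ N(n*mean, n*var), so reserve = n*mean + z_{0.01} * sqrt(n*var)
n = 1_000_000
reserve = n * mean + norm.ppf(0.99) * sqrt(n * var)
print(mean, var, round(reserve))  # reserve ≈ $655.4 million
```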


Normal Approximation to the Binomial Distribution

Proposition. If np ≥ 5 and n(1 − p) ≥ 5, then b(n, p) ≈ N(np, np(1 − p)).

Example. In a city with a population of 10 million people, 55% of the population approves of the mayor. In a random sample of size 2000, find the probability that the number of people who approve of the mayor is between 1060 and 1150 inclusive.

Continuity Correction. When using this approximation to calculate probabilities, increase the width of the interval by 0.5 at each end.

Example. Suppose X ∼ b(20, 0.3). Approximate the following probabilities: P(2 ≤ X ≤ 8), P(2 < X < 8), P(2 < X ≤ 8).
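The first probability can be checked against the exact binomial c.d.f. with scipy; the strict-inequality cases follow by first rewriting them with integer endpoints. A sketch of mine:

```python
from math import sqrt
from scipy.stats import binom, norm

n, p = 20, 0.3
mu, sd = n * p, sqrt(n * p * (1 - p))  # 6.0, ≈ 2.049

def approx(lo, hi):
    """Normal approximation to P(lo <= X <= hi), with continuity correction."""
    return norm.cdf(hi + 0.5, mu, sd) - norm.cdf(lo - 0.5, mu, sd)

# P(2 <= X <= 8), exact vs. approximate. Strict inequalities shift the integer
# endpoints first: P(2 < X < 8) = P(3 <= X <= 7), P(2 < X <= 8) = P(3 <= X <= 8).
exact = binom.cdf(8, n, p) - binom.cdf(1, n, p)
print(round(exact, 4), round(approx(2, 8), 4))
```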

Normal Approximation to the Poisson Distribution

Proposition. If n is sufficiently large, then Poisson(n) ≈ N(n, n).

Example. A radioactive sample emits β-particles according to a Poisson process at an average rate of 35 per minute. Find the probability that the number of particles emitted in a 20-minute period exceeds 720.
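Here the count over 20 minutes is Poisson with mean 35 · 20 = 700, and the approximation (with continuity correction) can be compared to the exact Poisson tail. A scipy sketch of mine:

```python
from math import sqrt
from scipy.stats import norm, poisson

lam = 35 * 20  # expected particles in 20 minutes: 700

# P(X > 720) = P(X >= 721); with continuity correction, use 720.5.
approx = 1 - norm.cdf((720.5 - lam) / sqrt(lam))
exact = 1 - poisson.cdf(720, lam)
print(round(approx, 4), round(exact, 4))
```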


Review for Exam 1

Know how to do all homework and quiz problems. Given the joint p.d.f. of two random variables, be able to determine:

probabilities/expected values involving both random variables:
P[(X, Y) ∈ A] = ∬_A f(x, y) dy dx, for any A ⊆ R², and
E[u(X, Y)] = ∬ u(x, y) f(x, y) dy dx;

marginal p.d.f.'s and probabilities/expected values involving only one of the variables:
f_1(x) = ∫ f(x, y) dy,
P(X ∈ A) = ∫_A f_1(x) dx, for any A ⊆ R, and
E[u(X)] = ∫ u(x) f_1(x) dx.

conditional p.d.f.'s, conditional mean/variance, and conditional probabilities:
g(x | y) = f(x, y) / f_2(y),
E[u(X) | Y = y] = ∫ u(x) g(x | y) dx, and
Var(X | Y = y) = E(X² | Y = y) − [E(X | Y = y)]²;

the covariance and correlation coefficient:
σ_XY = Cov(X, Y) = E(XY) − E(X)E(Y), and
ρ = Cov(X, Y) / (σ_X σ_Y);

the least squares regression line relating the variables:
y = µ_Y + ρ (σ_Y/σ_X)(x − µ_X).

Be able to find the expected value, variance, and probabilities involving Y, conditioning on X = x, when X and Y are jointly normal:

µ_{Y|x} = E(Y | X = x) = µ_Y + ρ (σ_Y/σ_X)(x − µ_X), and
σ²_{Y|x} = Var(Y | X = x) = σ_Y² (1 − ρ²);

the conditional distribution of Y given X = x is N(µ_{Y|x}, σ²_{Y|x}).

Be able to find the distribution of Y = u(X) using the distribution function technique or the change of variables formula:
G(y) = P(Y ≤ y) = P(u(X) ≤ y), and g(y) = G′(y);
g(y) = f[v(y)] |v′(y)|, where v = u⁻¹;
g(y_1, y_2) = f[v_1(y_1, y_2), v_2(y_1, y_2)] |J|, where J = det[∂x_1/∂y_1 ∂x_1/∂y_2; ∂x_2/∂y_1 ∂x_2/∂y_2].

Know the material in Sections 5.3–5.6 and be able to solve related problems.