Chapter 5  Expectation

5.1 Introduction

Def 1  The expectation (mean), E[X] or µ_X, of a random variable X is defined by

    E[X] = µ_X = Σ_i x_i p_X(x_i)      if X is discrete
    E[X] = µ_X = ∫ x f(x) dx           if X is continuous

provided that the relevant sum or integral is absolutely convergent, i.e., Σ_i |x_i| p_X(x_i) < ∞ and ∫ |x| f(x) dx < ∞. Otherwise, E[X] does not exist.

Example 1  A lot contains 4 good components and 3 defective components. A sample of 3 is taken by an inspector. Find the expected value of the number of good components in this sample.

Let X denote the number of good components in the sample. Then

    p_X(x) = C(4, x) C(3, 3 - x) / C(7, 3),   x = 0, 1, 2, 3.

That is,

    x        0      1       2       3
    p_X(x)   1/35   12/35   18/35   4/35

Hence,

    E[X] = 0·(1/35) + 1·(12/35) + 2·(18/35) + 3·(4/35) = 12/7 ≈ 1.7

Example 2  Let X be the random variable that denotes the life in hours of a certain electronic device. The pdf is given by

    f(x) = 20,000/x^3   if x > 100
    f(x) = 0            otherwise

Find the expected life of this device.

    E[X] = ∫_100^∞ x · (20,000/x^3) dx = ∫_100^∞ 20,000/x^2 dx = 200

Example 3  Let X be a discrete random variable with pmf

    p_X(x) = 1/(x(x+1))   if x = 1, 2, ...
    p_X(x) = 0            otherwise

Show that E[X] is undefined.

pf)

    Σ_{x=1}^∞ x · 1/(x(x+1)) = Σ_{x=1}^∞ 1/(x+1) = ∞

Hence, E[X] is undefined.

5.2 Expectation of a Function of a Random Variable

Let X be a random variable, and let Y = g(X) be a real-valued function of X. Then

    E[Y] = E[g(X)] = Σ_i g(x_i) p_X(x_i)     if X is discrete
    E[Y] = E[g(X)] = ∫ g(x) f_X(x) dx        if X is continuous
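Before moving on, Examples 1 and 2 are easy to double-check on a computer. The following short Python sketch (Python with NumPy is assumed; the random seed, sample size, and integration grid are arbitrary choices, not part of the notes) recomputes the pmf and mean of Example 1 and approximates the integral of Example 2:

    from math import comb
    import numpy as np

    # Example 1: pmf of the number of good components among 3 drawn from 4 good + 3 defective.
    pmf = {x: comb(4, x) * comb(3, 3 - x) / comb(7, 3) for x in range(4)}
    print(pmf)                                    # probabilities 1/35, 12/35, 18/35, 4/35 (as decimals)
    print(sum(x * p for x, p in pmf.items()))     # E[X] = 12/7 = 1.714...

    # Monte Carlo cross-check by sampling 3 components without replacement.
    rng = np.random.default_rng(0)
    draws = rng.hypergeometric(ngood=4, nbad=3, nsample=3, size=200_000)
    print(draws.mean())                           # close to 1.714

    # Example 2: E[X] = integral over (100, inf) of x * 20000/x^3 dx = 200 hours.
    x = 100 + (np.arange(2_000_000) + 0.5) * 0.5  # midpoints, step 0.5, upper limit truncated near 10^6
    print(np.sum(20000 / x**2) * 0.5)             # approx 200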

Example 4  Suppose that the number of cars X that pass through a car wash between 4:00 P.M. and 5:00 P.M. on any sunny Friday has the following probability distribution:

    x          4     5     6     7     8     9
    P(X = x)   1/12  1/12  1/4   1/4   1/6   1/6

Let g(X) = 2X - 1 represent the amount of money, in dollars, paid to the attendant by the manager. Find the attendant's expected earnings for this particular time period.

    E[g(X)] = E[2X - 1] = Σ_{x=4}^9 (2x - 1) p_X(x)
            = 7·(1/12) + 9·(1/12) + 11·(1/4) + 13·(1/4) + 15·(1/6) + 17·(1/6)
            = 12.67

Example 5  Let X be a random variable with pdf

    f(x) = x^2/3   if -1 < x < 2
    f(x) = 0       otherwise

Find the expected value of g(X) = 4X + 3.

    E[g(X)] = E[4X + 3] = (1/3) ∫_{-1}^{2} (4x + 3) x^2 dx = (1/3) ∫_{-1}^{2} (4x^3 + 3x^2) dx = 8

Moments

Def 2
1. E[X^k] is defined to be the kth moment of a random variable X.
2. E[(X - E[X])^k] (= µ_k) is defined to be the kth central moment of a random variable X.

Def 3  The variance of a random variable X is

    Var[X] = σ_X^2 = E[(X - E[X])^2] = E[(X - µ_X)^2]
           = Σ_i (x_i - µ_X)^2 p_X(x_i)      if X is discrete
           = ∫ (x - µ_X)^2 f_X(x) dx         if X is continuous

The square root σ_X is called the standard deviation of the random variable X.
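As a quick sanity check of Example 4 and of Def 3, the sums can be evaluated directly; a minimal Python/NumPy sketch (illustrative only, not part of the original notes):

    import numpy as np

    x = np.array([4, 5, 6, 7, 8, 9])
    p = np.array([1/12, 1/12, 1/4, 1/4, 1/6, 1/6])   # P(X = x) from Example 4

    print(np.sum((2 * x - 1) * p))                   # E[2X - 1] = 12.67 dollars

    mu = np.sum(x * p)                               # E[X]
    print(mu, np.sum((x - mu)**2 * p))               # mean and Var[X] per Def 3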

Example 6  Let X ~ B(n, p). Find E[X] and Var[X].

    E[X] = Σ_{x=0}^n x C(n, x) p^x q^{n-x}
         = np Σ_{x=1}^n C(n-1, x-1) p^{x-1} q^{(n-1)-(x-1)}
         = np (p + q)^{n-1}
         = np

    Var[X] = Σ_{x=0}^n (x - np)^2 C(n, x) p^x q^{n-x}
           = Σ_{x=0}^n (x^2 - 2npx + n^2 p^2) C(n, x) p^x q^{n-x}
           = Σ_{x=0}^n x^2 C(n, x) p^x q^{n-x} - 2np Σ_{x=0}^n x C(n, x) p^x q^{n-x} + n^2 p^2
           = (npq + n^2 p^2) - 2n^2 p^2 + n^2 p^2
           = npq

Example 7  Let X ~ Exp(λ). Find E[X] and Var[X].

    E[X] = ∫_0^∞ x λ e^{-λx} dx = 1/λ

V ar[x] λ ( 0 0 Γ(3) λ 2 x λ) 2 λe λx dx x 2 e λx dx 2 xe λx dx + e λx dx 0 λ 0 2Γ(2) λ 2 + Γ() λ 2 λ 2 5.3 Expectation of Functions of Multiple Random Variables Let X,...,X n be n random variables defined on the same probability space, and let Y g(x,...,x n ) be a real-valued function. Then E[Y ] E[g(X,...,X n )] g(x,...,x n )p(x,...,x n ) x x n g(x,...,x n )f(x,...,x n )dx dx n discrete case continuous case Example 8 A box contains 3 balls numbered,2 and 3. Now, select two ball without replacement from the box. Let X, Y denote the number of the first and the second balls, respectively. Find E[X + Y ] and E[XY ]. The jpmf of X,Y is as follows: Y 2 3 0 6 6 X 2 6 0 6 3 6 0 6 E[X + Y ] [( + 2) + ( + 3) + (2 + ) + (2 + 3) + (3 + ) + (3 + 2)] 6 4 5

    E[XY] = (1/6)[(1·2) + (1·3) + (2·1) + (2·3) + (3·1) + (3·2)] = 11/3

Example 9  Find E[Y/X] for the pdf

    f(x, y) = x(1 + 3y^2)/4   if 0 < x < 2, 0 < y < 1
    f(x, y) = 0               otherwise

    E[Y/X] = ∫_0^1 ∫_0^2 (y/x) · x(1 + 3y^2)/4 dx dy = ∫_0^1 (y + 3y^3)/2 dy = 5/8

5.4 Important Properties of Expectation

Linearity

E1. E[k] = k for any constant k.

E2. E[a_1 X_1 + a_2 X_2 + ... + a_n X_n] = a_1 E[X_1] + a_2 E[X_2] + ... + a_n E[X_n], where the a_i's are any constants and the X_i's are any random variables.

Example 10  Let X and Y be two continuous random variables. Show that E[X + Y] = E[X] + E[Y].

pf)

    E[X + Y] = ∫∫ (x + y) f(x, y) dx dy
             = ∫ x ( ∫ f(x, y) dy ) dx + ∫ y ( ∫ f(x, y) dx ) dy
             = ∫ x f_X(x) dx + ∫ y f_Y(y) dy
             = E[X] + E[Y]
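Examples 8 and 9 can also be verified numerically. A small Python sketch, assuming NumPy is available (the grid sizes below are arbitrary choices):

    from itertools import permutations
    import numpy as np

    pairs = list(permutations([1, 2, 3], 2))     # the 6 equally likely ordered draws without replacement
    print(np.mean([x + y for x, y in pairs]))    # E[X + Y] = 4
    print(np.mean([x * y for x, y in pairs]))    # E[XY]   = 11/3 = 3.666...

    # Example 9: E[Y/X] under f(x, y) = x(1 + 3y^2)/4 on 0 < x < 2, 0 < y < 1.
    nx, ny = 2000, 1000
    x = (np.arange(nx) + 0.5) * (2 / nx)         # cell midpoints on (0, 2)
    y = (np.arange(ny) + 0.5) * (1 / ny)         # cell midpoints on (0, 1)
    X, Y = np.meshgrid(x, y)
    vals = (Y / X) * X * (1 + 3 * Y**2) / 4      # (y/x) * f(x, y)
    print(vals.sum() * (2 / nx) * (1 / ny))      # approx 5/8 = 0.625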

Independence

E3. If random variables X_1, ..., X_n are independent, then E[ Π_{i=1}^n X_i ] = Π_{i=1}^n E[X_i].

Example 11  Let X and Y be two independent continuous random variables. Show that E[XY] = E[X]E[Y].

pf)

    E[XY] = ∫∫ xy f(x, y) dy dx
          = ∫ x f_X(x) ( ∫ y f_Y(y) dy ) dx
          = E[X]E[Y]

Remark  E[XY] = E[X]E[Y] does not imply that X and Y are independent.

Example 12  Suppose that the possible outcomes of (X, Y) are (1, 0), (-1, 0), (0, 1), and (0, -1), which are assumed to occur with the same probability. Show that E[XY] = E[X]E[Y], but X and Y are not independent.

pf)

    E[XY] = (1/4)(0 + 0 + 0 + 0) = 0
    E[X]  = (1/4)(1 + (-1) + 0 + 0) = 0
    E[Y]  = (1/4)(0 + 0 + 1 + (-1)) = 0
    E[XY] = E[X]E[Y]

    P(X = 0, Y = 0) = 0 ≠ 1/4 = P(X = 0)P(Y = 0)

Hence, X and Y are not independent.

Covariance

We are now interested in finding Var[X + Y].

    Var[X + Y] = E[((X + Y) - E[X + Y])^2]
               = E[((X + Y) - E[X] - E[Y])^2]
               = E[(X - E[X])^2 + (Y - E[Y])^2 + 2(X - E[X])(Y - E[Y])]
               = E[(X - E[X])^2] + E[(Y - E[Y])^2] + 2E[(X - E[X])(Y - E[Y])]
               = Var[X] + Var[Y] + 2E[(X - E[X])(Y - E[Y])]

Define

    Cov(X, Y) = E[(X - E[X])(Y - E[Y])].

We have

    Var[X + Y] = Var[X] + Var[Y] + 2Cov(X, Y).

Example 13  Let X and Y be two independent random variables. Show that Cov(X, Y) = 0.

pf)

    Cov(X, Y) = E[(X - E[X])(Y - E[Y])]
              = E[XY - X E[Y] - Y E[X] + E[X]E[Y]]
              = E[XY] - E[X]E[Y] - E[Y]E[X] + E[X]E[Y]
              = E[XY] - E[X]E[Y]

Since X and Y are independent, we have E[XY] = E[X]E[Y]. Hence Cov(X, Y) = 0.

Remark  Cov(X, Y) = 0 does not imply that X and Y are independent.

E4. Cov(X, X) = Var[X].

E5. Var[X] = E[X^2] - (E[X])^2.

pf)

    Var[X] = Cov(X, X) = E[XX] - E[X]E[X] = E[X^2] - (E[X])^2

E6. Cov(X, Y) = Cov(Y, X).

E7. If X and Y are independent, Var[X + Y] = Var[X] + Var[Y]. More generally, if X_1, ..., X_n are independent,

    Var[X_1 + ... + X_n] = Var[X_1] + ... + Var[X_n].
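The identity Var[X + Y] = Var[X] + Var[Y] + 2Cov(X, Y) is easy to illustrate by simulation. A rough Python/NumPy sketch (the particular construction of dependent X and Y below, the seed, and the sample size are arbitrary choices, not part of the notes):

    import numpy as np

    rng = np.random.default_rng(1)
    z = rng.standard_normal(500_000)
    x = z + rng.standard_normal(500_000)             # X and Y share the component z, so they are dependent
    y = 2 * z + rng.standard_normal(500_000)

    lhs = np.var(x + y)
    rhs = np.var(x) + np.var(y) + 2 * np.cov(x, y)[0, 1]
    print(lhs, rhs)                                  # both close to 2 + 5 + 2*2 = 11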

E8.

    Cov( Σ_{i=1}^m α_i X_i , Σ_{j=1}^n β_j Y_j ) = Σ_{i=1}^m Σ_{j=1}^n α_i β_j Cov(X_i, Y_j)

pf)

    Cov( Σ_i α_i X_i , Σ_j β_j Y_j )
        = E[ (Σ_i α_i X_i)(Σ_j β_j Y_j) ] - E[ Σ_i α_i X_i ] E[ Σ_j β_j Y_j ]
        = Σ_i Σ_j α_i β_j E[X_i Y_j] - Σ_i Σ_j α_i β_j E[X_i]E[Y_j]
        = Σ_i Σ_j α_i β_j ( E[X_i Y_j] - E[X_i]E[Y_j] )
        = Σ_i Σ_j α_i β_j Cov(X_i, Y_j)

E9.

    Var[ Σ_{i=1}^n α_i X_i ] = Σ_{i=1}^n α_i^2 Var[X_i] + Σ_{i ≠ j} α_i α_j Cov(X_i, X_j)

Example 14  Let X ~ Γ(α, λ). Find E[X], E[X^m], and Var[X].

    E[X^m] = ∫_0^∞ x^m (λ^α / Γ(α)) x^{α-1} e^{-λx} dx
           = (λ^α / Γ(α)) ∫_0^∞ x^{m+α-1} e^{-λx} dx
           = (λ^α / Γ(α)) · Γ(m + α) / λ^{m+α}
           = α(α + 1) ··· (α + m - 1) / λ^m

    E[X] = α/λ
    E[X^2] = α(α + 1)/λ^2
    Var[X] = E[X^2] - (E[X])^2 = α/λ^2
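The closed form E[X^m] = α(α+1)···(α+m-1)/λ^m can be checked against simulated Gamma draws. A Python/NumPy sketch (the parameters, seed, and sample size are arbitrary):

    import numpy as np

    alpha, lam = 2.5, 1.5
    rng = np.random.default_rng(2)
    x = rng.gamma(shape=alpha, scale=1 / lam, size=1_000_000)    # NumPy uses shape and scale = 1/lambda

    for m in (1, 2, 3):
        exact = np.prod([alpha + j for j in range(m)]) / lam**m  # alpha(alpha+1)...(alpha+m-1)/lambda^m
        print(m, exact, (x**m).mean())                           # the two values should be close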

Example 15  Let U_i ~ U(0, 1), i = 1, 2, ..., n, be independent random variables. Let

    X = min(U_1, ..., U_n),    Y = max(U_1, ..., U_n).

Find E[X^m], E[Y^m], E[X], E[Y], Var[X], Var[Y], Cov(X, Y).

From Chapter 4, we know

    f(x, y) = n(n-1)(y - x)^{n-2}   if 0 < x < y < 1,   0 otherwise
    f_X(x)  = n(1 - x)^{n-1}        if 0 < x < 1,        0 otherwise
    f_Y(y)  = n y^{n-1}             if 0 < y < 1,        0 otherwise

    E[X^m] = n ∫_0^1 x^m (1 - x)^{n-1} dx = n Γ(m + 1)Γ(n) / Γ(m + n + 1) = m! n! / (m + n)!

    E[X] = 1/(n + 1),    E[X^2] = 2/((n + 1)(n + 2))

    Var[X] = E[X^2] - (E[X])^2 = n / ((n + 1)^2 (n + 2))

    E[Y^m] = n ∫_0^1 y^m y^{n-1} dy = n ∫_0^1 y^{m+n-1} dy = n/(m + n)

    E[Y] = n/(n + 1),    E[Y^2] = n/(n + 2)

    Var[Y] = E[Y^2] - (E[Y])^2 = n / ((n + 1)^2 (n + 2))

    E[XY] = n(n-1) ∫_0^1 ∫_0^y xy (y - x)^{n-2} dx dy = 1/(n + 2)

    Cov(X, Y) = E[XY] - E[X]E[Y] = 1/(n + 2) - n/(n + 1)^2 = 1/((n + 1)^2 (n + 2))

Example 16  Suppose that E[X] = 2, E[Y] = 1, E[Z] = 3, Var[X] = 4, Var[Y] = 1, Var[Z] = 5, Cov(X, Y) = -2, Cov(X, Z) = 0, Cov(Y, Z) = 2. Let U = 3X - 2Y + Z and V = X + Y - 2Z. Find

1. E[U];
2. Var[U];
3. Cov(U, V).

1.
    E[U] = E[3X - 2Y + Z] = 3E[X] - 2E[Y] + E[Z] = 3·2 - 2·1 + 3 = 7

2.
    Var[U] = 3^2 Var[X] + 2^2 Var[Y] + Var[Z] - 12 Cov(X, Y) - 4 Cov(Y, Z) + 6 Cov(Z, X)
           = 9·4 + 4·1 + 5 - 12·(-2) - 4·2 + 6·0
           = 61

3.
    Cov(U, V) = Cov(3X - 2Y + Z, X + Y - 2Z)
              = 3Cov(X, X) + 3Cov(X, Y) - 6Cov(X, Z) - 2Cov(Y, X) - 2Cov(Y, Y) + 4Cov(Y, Z)
                + Cov(Z, X) + Cov(Z, Y) - 2Cov(Z, Z)

Correlation Coefficient

Theorem 1 (Schwarz Inequality)  Let X and Y be two random variables with finite second moments. Then

    (E[XY])^2 ≤ E[X^2] E[Y^2]

pf)

    0 ≤ E[(X - λY)^2] = λ^2 E[Y^2] - 2λ E[XY] + E[X^2]

We now find the λ that minimizes E[(X - λY)^2] by setting the corresponding derivative to zero:

    d E[(X - λY)^2] / dλ = 2λ E[Y^2] - 2E[XY] = 0   ⟹   λ = E[XY]/E[Y^2] =: a

Hence,

    0 ≤ E[(X - aY)^2] = E[X^2] - (E[XY])^2 / E[Y^2]

Equivalently,

    (E[XY])^2 ≤ E[X^2] E[Y^2]

E10. Cov(X, Y)^2 ≤ Var[X] Var[Y].

Def 4  The correlation coefficient ρ_{X,Y} of two random variables X and Y is defined by

    ρ_{X,Y} = Cov(X, Y) / sqrt( Var[X] Var[Y] )

E11. |ρ_{X,Y}| ≤ 1.

E12. Let a be a positive constant, and let b be any constant. Then

    ρ_{X,Y} =  1   if Y = aX + b
    ρ_{X,Y} = -1   if Y = -aX + b

pf) Consider Y = -aX + b.

    Cov(X, Y) = Cov(X, -aX + b) = -a Cov(X, X) + Cov(X, b) = -a Var[X]

    Var[Y] = Var[-aX + b] = a^2 Var[X]

    ρ_{X,Y} = Cov(X, Y) / sqrt( Var[X] Var[Y] ) = -a Var[X] / sqrt( Var[X] · a^2 Var[X] ) = -1

Example 17  Find ρ_{X,Y} for Example 15.

    Var[X] = Var[Y] = n / ((n + 1)^2 (n + 2)),    Cov(X, Y) = 1 / ((n + 1)^2 (n + 2))

    ρ_{X,Y} = Cov(X, Y) / sqrt( Var[X] Var[Y] ) = 1/n

Example 18  Let X and Y be two independent random variables. Suppose that E[X] = 1, E[Y] = 2, Var[X] = 3, and Var[Y] = 4. Find

1. Var[3X - 2Y];
2. Var[XY];
3. ρ_{X+Y, X-Y}.

1.
    Var[3X - 2Y] = 3^2 Var[X] + 2^2 Var[Y] = 43

2.
    Var[XY] = E[(XY)^2] - (E[XY])^2
            = E[X^2]E[Y^2] - (E[X]E[Y])^2
            = (Var[X] + (E[X])^2)(Var[Y] + (E[Y])^2) - (E[X]E[Y])^2
            = (3 + 1^2)(4 + 2^2) - (1·2)^2
            = 28

3.
    Cov(X + Y, X - Y) = Cov(X, X) - Cov(X, Y) + Cov(Y, X) - Cov(Y, Y) = Var[X] - Var[Y] = -1

    Var[X + Y] = Var[X] + Var[Y] = 7
    Var[X - Y] = Var[X] + Var[Y] = 7

    ρ_{X+Y, X-Y} = -1/7

Example 19  A box contains 3 red balls and 2 black balls. Now, select two balls without replacement from the box. Let X and Y denote the numbers of red balls and black balls, respectively. Find ρ_{X,Y}.

Method 1. The jpmf of X and Y is as follows:

            Y = 0   Y = 1   Y = 2
    X = 0     0       0      1/10
    X = 1     0      3/5      0
    X = 2    3/10     0       0

    E[X] = 0·(1/10) + 1·(3/5) + 2·(3/10) = 6/5
    E[Y] = 2·(1/10) + 1·(3/5) + 0·(3/10) = 4/5
    E[XY] = 1·1·(3/5) = 3/5

    Cov(X, Y) = E[XY] - E[X]E[Y] = -9/25
    Var[X] = E[X^2] - (E[X])^2 = 9/25
    Var[Y] = E[Y^2] - (E[Y])^2 = 9/25

    ρ_{X,Y} = (-9/25) / sqrt( (9/25)(9/25) ) = -1

Method 2. Since X + Y = 2, i.e., Y = -X + 2, we immediately have ρ_{X,Y} = -1.
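Example 19 can also be checked by simulation. A short Python/NumPy sketch (NumPy assumed; the seed and number of replications are arbitrary):

    import numpy as np

    rng = np.random.default_rng(3)
    box = np.array([1, 1, 1, 0, 0])                  # 1 = red, 0 = black
    draws = np.array([rng.choice(box, size=2, replace=False) for _ in range(50_000)])
    x = draws.sum(axis=1)                            # number of red balls drawn
    y = 2 - x                                        # number of black balls (X + Y = 2)
    print(np.corrcoef(x, y)[0, 1])                   # -1 (up to rounding), as in both methods above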

5.5 Conditional Expectation

Def 5  Let X and Y be two random variables. The conditional expectation of Y given X = x, denoted by E[Y | X = x] or E[Y | x] (= µ_{Y|x}), is defined by

    E[Y | x] = Σ_i y_i p_{Y|X}(y_i | x)     if Y is discrete
    E[Y | x] = ∫ y f_{Y|X}(y | x) dy        if Y is continuous.

Remark
1. E[Y | x] is a function of x. Hence, E[E[Y | X]] is a real number.
2. The conditional variance of the random variable Y given that X = x is defined by

    Var[Y | x] = E[(Y - µ_{Y|x})^2 | x] = E[Y^2 | x] - (E[Y | x])^2

Example 20  Let the random variables X and Y have jpdf

    f(x, y) = (2/3)(2x + y)   if 0 < x < 1, 0 < y < 1
    f(x, y) = 0               otherwise

Find E[Y | X = 0.5] and Var[Y | X = 0.5].

    f_X(x) = ∫_0^1 (2/3)(2x + y) dy = (1 + 4x)/3,   0 < x < 1

    f_{Y|X}(y | x) = f(x, y)/f_X(x) = 2(2x + y)/(1 + 4x),   0 < x < 1, 0 < y < 1

    f_{Y|X}(y | 0.5) = (2/3)(y + 1),   0 < y < 1

    E[Y | 0.5] = ∫_0^1 y f_{Y|X}(y | 0.5) dy = (2/3) ∫_0^1 y(y + 1) dy = 5/9

    E[Y^2 | 0.5] = (2/3) ∫_0^1 y^2 (y + 1) dy = 7/18

    Var[Y | 0.5] = E[Y^2 | 0.5] - (E[Y | 0.5])^2 = 13/162
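The conditional mean and variance of Example 20 are easy to recompute numerically from f_{Y|X}(y | 0.5). A minimal Python/NumPy sketch (the grid size is an arbitrary choice):

    import numpy as np

    n = 200_000
    y = (np.arange(n) + 0.5) / n                     # cell midpoints on (0, 1)
    f = (2 / 3) * (y + 1)                            # f_{Y|X}(y | 0.5) derived above

    m1 = np.sum(y * f) / n                           # E[Y | X = 0.5]   -> 5/9 = 0.5555...
    m2 = np.sum(y**2 * f) / n                        # E[Y^2 | X = 0.5] -> 7/18
    print(m1, m2 - m1**2)                            # conditional mean and variance (13/162 = 0.0802...)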

E13. E[E[Y | X]] = E[Y].

pf)

    E[E[Y | X]] = ∫ µ_{Y|x} f_X(x) dx
                = ∫ ( ∫ y f_{Y|X}(y | x) dy ) f_X(x) dx
                = ∫ ∫ y (f(x, y)/f_X(x)) f_X(x) dy dx
                = ∫ y ( ∫ f(x, y) dx ) dy
                = ∫ y f_Y(y) dy
                = E[Y]

5.6 Moment Generating Functions

Def 6  The moment generating function M_X(t) of a random variable X is defined by

    M_X(t) = E[e^{Xt}]

The domain of M_X(t) is the set of all real numbers t such that e^{Xt} has finite expectation.

Example 21  Let X ~ B(n, p). Find M_X(t).

    M_X(t) = E[e^{Xt}] = Σ_{x=0}^n e^{xt} C(n, x) p^x q^{n-x}
           = Σ_{x=0}^n C(n, x) (pe^t)^x q^{n-x}
           = (pe^t + q)^n

Example 22  Let X ~ Γ(α, λ). Find M_X(t).

    M_X(t) = E[e^{Xt}] = ∫_0^∞ e^{xt} (λ^α / Γ(α)) x^{α-1} e^{-λx} dx
           = (λ^α / Γ(α)) ∫_0^∞ x^{α-1} e^{-(λ-t)x} dx      (t < λ)

           = (λ^α / Γ(α)) · Γ(α)/(λ - t)^α
           = ( λ/(λ - t) )^α
           = (1 - t/λ)^{-α}

Facts

1.
    M_X(t) = E[e^{Xt}] = E[ 1 + Xt/1! + X^2 t^2/2! + X^3 t^3/3! + ... ]
           = 1 + E[X] t/1! + E[X^2] t^2/2! + E[X^3] t^3/3! + ...
           = Σ_{k=0}^∞ E[X^k] t^k / k!

2.
    M'_X(t) = E[X] + Σ_{k=2}^∞ E[X^k] t^{k-1}/(k-1)!    ⟹    E[X] = M'_X(0)

3.
    M''_X(t) = E[X^2] + Σ_{k=3}^∞ E[X^k] t^{k-2}/(k-2)!   ⟹    E[X^2] = M''_X(0)

    Var[X] = M''_X(0) - (M'_X(0))^2

4.
    E[X^k] = M_X^{(k)}(0)

Example 23  Use moment generating functions to find the means and variances of the following random variables:

1. X ~ B(n, p);
2. Y ~ Γ(α, λ).

1.
    M_X(t) = (pe^t + q)^n
    M'_X(t) = n(pe^t + q)^{n-1} pe^t
    M''_X(t) = n(n-1)(pe^t + q)^{n-2} (pe^t)^2 + n(pe^t + q)^{n-1} pe^t

    E[X] = M'_X(0) = np
    E[X^2] = M''_X(0) = n(n-1)p^2 + np
    Var[X] = E[X^2] - (E[X])^2 = np(1 - p) = npq

2.
    M_Y(t) = (1 - t/λ)^{-α}
    M'_Y(t) = (α/λ)(1 - t/λ)^{-α-1}
    M''_Y(t) = (α(α + 1)/λ^2)(1 - t/λ)^{-α-2}

    E[Y] = M'_Y(0) = α/λ
    E[Y^2] = M''_Y(0) = α(α + 1)/λ^2
    Var[Y] = E[Y^2] - (E[Y])^2 = α/λ^2
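The differentiation in Example 23 can be reproduced symbolically. A sketch using SymPy (assumed to be available; not part of the original notes):

    import sympy as sp

    t, p, n = sp.symbols('t p n', positive=True)
    q = 1 - p
    M = (p * sp.exp(t) + q) ** n                  # MGF of B(n, p)

    EX  = sp.diff(M, t, 1).subs(t, 0)             # M'(0)  = E[X]
    EX2 = sp.diff(M, t, 2).subs(t, 0)             # M''(0) = E[X^2]
    print(sp.simplify(EX))                        # n*p
    print(sp.simplify(EX2 - EX**2))               # np(1 - p) = npq (SymPy may print an equivalent expanded form)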

Summary of Important Moments of Random Variables

Bernoulli, X ~ B(1, p):
    density p^x q^{1-x}, x = 0, 1;    mean p;    variance pq;    MGF pe^t + q

Binomial, X ~ B(n, p):
    density C(n, x) p^x q^{n-x}, x = 0, 1, ..., n;    mean np;    variance npq;    MGF (pe^t + q)^n

Geometric, X ~ G(p):
    density p q^{x-1}, x = 1, 2, ...;    mean 1/p;    variance (1-p)/p^2;    MGF pe^t / (1 - qe^t)

Negative binomial, X ~ NB(r, p):
    density C(x-1, r-1) p^r q^{x-r}, x = r, r+1, ...;    mean r/p;    variance r(1-p)/p^2;    MGF ( pe^t / (1 - qe^t) )^r

Poisson, X ~ P(α):
    density α^x e^{-α} / x!, x = 0, 1, ...;    mean α;    variance α;    MGF e^{α(e^t - 1)}

Discrete uniform, X ~ I(1, n):
    density 1/n, x = 1, 2, ..., n;    mean (n+1)/2;    variance (n^2 - 1)/12;    MGF (e^t - e^{(n+1)t}) / ( n(1 - e^t) )

Exponential, X ~ Exp(λ):
    density λ e^{-λx}, x > 0;    mean 1/λ;    variance 1/λ^2;    MGF (1 - t/λ)^{-1}

Gamma, X ~ Γ(α, λ):
    density (λ^α / Γ(α)) x^{α-1} e^{-λx}, x > 0;    mean α/λ;    variance α/λ^2;    MGF (1 - t/λ)^{-α}

Chi-square, X ~ χ²_n:
    density x^{n/2 - 1} e^{-x/2} / ( Γ(n/2) 2^{n/2} ), x > 0;    mean n;    variance 2n;    MGF (1 - 2t)^{-n/2}

Normal, X ~ N(µ, σ²):
    density (1/(sqrt(2π) σ)) e^{-(x-µ)^2/(2σ^2)}, x ∈ R;    mean µ;    variance σ²;    MGF e^{µt + σ^2 t^2/2}

Uniform, X ~ U(α, β):
    density 1/(β - α), α < x < β;    mean (α + β)/2;    variance (β - α)^2/12;    MGF (e^{βt} - e^{αt}) / ( (β - α)t )
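A few rows of the table can be spot-checked by Monte Carlo. A Python/NumPy sketch (the chosen distributions, parameters, seed, and sample size are arbitrary):

    import numpy as np

    rng = np.random.default_rng(4)
    N = 1_000_000

    checks = {
        "Poisson(alpha=3):  mean 3,   var 3":      rng.poisson(3, N),
        "Gamma(a=2, lam=5): mean 0.4, var 0.08":   rng.gamma(2, 1/5, N),
        "Uniform(2, 6):     mean 4,   var 16/12":  rng.uniform(2, 6, N),
    }
    for label, sample in checks.items():
        print(label, "->", round(sample.mean(), 3), round(sample.var(), 3))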

Theorem 2 (Correspondence Theorem or Uniqueness Theorem)  Let X_1 and X_2 be two random variables. Then M_{X_1}(t) = M_{X_2}(t) for all t if and only if F_{X_1}(x) = F_{X_2}(x) for all x.

Example 24  For each of the following moment generating functions, find (a) the corresponding distribution; (b) the mean and variance.

1. M_X(t) = (0.3 + 0.7e^t)^5;
2. M_Y(t) = e^{4(e^t - 1)};
3. M_Z(t) = 0.3e^t / (1 - 0.7e^t),   t < -ln 0.7;
4. M_W(t) = e^{2t + 2t^2}.

1. (a) X ~ B(5, 0.7).    (b) µ_X = 3.5, σ²_X = 1.05.
2. (a) Y ~ P(4).         (b) µ_Y = 4, σ²_Y = 4.
3. (a) Z ~ G(0.3).       (b) µ_Z = 10/3, σ²_Z = 70/9.
4. (a) W ~ N(2, 2²).     (b) µ_W = 2, σ²_W = 4.

Theorem 3

1. [Linear Translation]

    Y = aX + b   ⟹   M_Y(t) = e^{bt} M_X(at),

where a and b are constants.

2. [Convolution]  Let X_1, X_2, ..., X_n be n independent random variables, and let Y = X_1 + X_2 + ... + X_n. Then

    M_Y(t) = M_{X_1}(t) M_{X_2}(t) ··· M_{X_n}(t)

pf)

1.
    M_Y(t) = E[e^{Yt}] = E[e^{(aX+b)t}] = E[e^{bt} e^{X(at)}] = e^{bt} E[e^{X(at)}] = e^{bt} M_X(at)

2.
    M_Y(t) = E[e^{Yt}] = E[e^{(X_1 + X_2 + ... + X_n)t}]
           = E[e^{X_1 t} e^{X_2 t} ··· e^{X_n t}]
           = E[e^{X_1 t}] E[e^{X_2 t}] ··· E[e^{X_n t}]
           = M_{X_1}(t) M_{X_2}(t) ··· M_{X_n}(t)

Example 25  Let X_i ~ Exp(1/(iβ)), i = 1, ..., n, be n independent random variables. Let

    Y = Σ_{i=1}^n X_i/(ni).

1. Find E[Y].
2. Let Z = 2nY/β. Find the distribution of Z.

1.
    E[X_i] = iβ
    E[Y] = E[ Σ_{i=1}^n X_i/(ni) ] = Σ_{i=1}^n E[X_i]/(ni) = β

2.
    M_{X_i}(t) = (1 - iβt)^{-1}
    M_{X_i/(ni)}(t) = M_{X_i}(t/(ni)) = (1 - βt/n)^{-1}
    M_Y(t) = (1 - βt/n)^{-n}
    M_Z(t) = M_{2nY/β}(t) = M_Y(2nt/β) = (1 - 2t)^{-n}

Hence, Z ~ χ²_{2n}.

Example 26  Let the random variable X satisfy

    E[X^{2k}] = (2k)! / (2^k k!)   and   E[X^{2k+1}] = 0,   k = 0, 1, 2, ...

Find the pdf of X.

    M_X(t) = Σ_{k=0}^∞ E[X^k] t^k / k!
           = Σ_{k=0}^∞ E[X^{2k}] t^{2k}/(2k)! + Σ_{k=0}^∞ E[X^{2k+1}] t^{2k+1}/(2k+1)!
           = Σ_{k=0}^∞ E[X^{2k}] t^{2k}/(2k)!                       (since E[X^{2k+1}] = 0)
           = Σ_{k=0}^∞ ( (2k)! / (2^k k!) ) t^{2k}/(2k)!
           = Σ_{k=0}^∞ (t^2/2)^k / k!
           = e^{t^2/2}

Hence, X ~ N(0, 1).

Theorem 4  Let X_1, ..., X_n be independent random variables.

1. X_i ~ B(1, p)       ⟹  X_1 + ... + X_n ~ B(n, p).
2. X_i ~ B(m_i, p)     ⟹  X_1 + ... + X_n ~ B(m_1 + ... + m_n, p).
3. X_i ~ G(p)          ⟹  X_1 + ... + X_n ~ NB(n, p).
4. X_i ~ NB(r_i, p)    ⟹  X_1 + ... + X_n ~ NB(r_1 + ... + r_n, p).
5. X_i ~ P(α_i)        ⟹  X_1 + ... + X_n ~ P(α_1 + ... + α_n).
6. X_i ~ Exp(λ)        ⟹  X_1 + ... + X_n ~ Γ(n, λ).
7. X_i ~ Γ(α_i, λ)     ⟹  X_1 + ... + X_n ~ Γ(α_1 + ... + α_n, λ).
8. X_i ~ χ²_1          ⟹  X_1 + ... + X_n ~ χ²_n.
9. X_i ~ χ²_{m_i}      ⟹  X_1 + ... + X_n ~ χ²_{m_1 + ... + m_n}.
10. X_i ~ N(µ_i, σ²_i) ⟹  X_1 + ... + X_n ~ N(µ_1 + ... + µ_n, σ²_1 + ... + σ²_n).

pf) (of item 10)

    X_i ~ N(µ_i, σ²_i)  ⟹  M_{X_i}(t) = e^{µ_i t} e^{σ²_i t²/2}

    M_{X_1 + ... + X_n}(t) = e^{(µ_1 + ... + µ_n)t} e^{(σ²_1 + ... + σ²_n)t²/2}

Hence, X_1 + ... + X_n ~ N(µ_1 + ... + µ_n, σ²_1 + ... + σ²_n).
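Item 6 of Theorem 4, for instance, is easy to see in simulation: a sum of n iid Exp(λ) variables should behave like direct Γ(n, λ) draws. A Python/NumPy sketch (parameters, seed, and sample size are arbitrary choices):

    import numpy as np

    rng = np.random.default_rng(5)
    n, lam, N = 4, 2.0, 500_000

    s = rng.exponential(scale=1 / lam, size=(N, n)).sum(axis=1)    # X_1 + ... + X_n
    g = rng.gamma(shape=n, scale=1 / lam, size=N)                  # direct Gamma(n, lambda) draws

    print(s.mean(), g.mean())                        # both approx n/lam = 2
    print(s.var(),  g.var())                         # both approx n/lam^2 = 1
    print(np.quantile(s, 0.9), np.quantile(g, 0.9))  # matching upper quantiles as a rough distributional check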

5.7 Inequalities

Theorem 5 (Markov Inequality)  Let X be a nonnegative random variable with E[X] = µ. Then, for any t > 0,

    P(X ≥ t) ≤ µ/t

pf) Define the discrete r.v. Y as follows:

    Y = 0   if X < t
    Y = t   if X ≥ t

Clearly, the pmf of Y is

    p_Y(y) = P(X < t)   if y = 0
    p_Y(y) = P(X ≥ t)   if y = t

    E[Y] = 0·P(X < t) + t·P(X ≥ t) = t P(X ≥ t) ≤ E[X]

    ⟹  P(X ≥ t) ≤ µ/t

Example 27  Consider a system with MTTF = 100 hours.

1. Use the Markov inequality to approximate the reliability R(t) = P(X > t) for t = 90, 100, 110 and 200.
2. Suppose that the lifetime of the system is exponentially distributed. Find the reliability R(t) for t = 90, 100, 110 and 200.
3. What is your conclusion from the above answers?

1. By the Markov inequality, P(X > t) ≤ 100/t:

    P(X > 90)  ≤ 100/90  ≈ 1.11
    P(X > 100) ≤ 100/100 = 1
    P(X > 110) ≤ 100/110 ≈ 0.91
    P(X > 200) ≤ 100/200 = 0.5

2. X ~ Exp(1/100):

    P(X > 90)  = e^{-0.9} ≈ 0.40657
    P(X > 100) = e^{-1}   ≈ 0.367879
    P(X > 110) = e^{-1.1} ≈ 0.33287
    P(X > 200) = e^{-2}   ≈ 0.135335

3. The performance of the Markov inequality bound is not very good.

Theorem 6 (Chebyshev's Inequality)

    P(|X - µ_X| ≥ t) ≤ σ²_X / t²

pf) Substituting (X - µ_X)² for X in the Markov inequality, we have

    P((X - µ_X)² ≥ t²) ≤ E[(X - µ_X)²] / t²
    ⟹  P(|X - µ_X| ≥ t) ≤ σ²_X / t²

Facts

    P(|X - µ_X| ≥ t) ≤ σ²_X / t²
    P(|X - µ_X| < t) ≥ 1 - σ²_X / t²
    P(|X - µ_X| ≥ kσ_X) ≤ 1/k²
    P(|X - µ_X| < kσ_X) ≥ 1 - 1/k²

Example 28  Seventy new jobs are opening up at an automobile manufacturing plant, but 1,000 applicants show up for the 70 positions. To select the best 70 from among the applicants, the company gives a test to them. The mean and standard deviation of the scores turn out to be 60 and 6, respectively. Can a person who has a score of 84 count on getting one of the jobs?

Let X denote the score. Then µ_X = 60, σ_X = 6, and 84 - µ_X = 24 = 4σ_X. By Chebyshev's inequality,

    P(X - µ_X ≥ 4σ_X) ≤ P(|X - µ_X| ≥ 4σ_X) ≤ 1/4²

This implies that at most 1000/16 ≈ 63 applicants have scores exceeding 84. Hence, this person can count on getting a job.

5.8 The Weak Law of Large Numbers and Central Limit Theorems

Theorem 7 (Weak Law of Large Numbers)  Let X_1, ..., X_n be independent, identically distributed random variables having finite mean µ. Define

    X̄ = (1/n)(X_1 + ... + X_n).

Then for any ε > 0,

    lim_{n→∞} P(|X̄ - µ| > ε) = 0

pf) Let the variance of the population be σ². Then

    Var[X̄] = Var[ (1/n)(X_1 + ... + X_n) ] = (1/n²)(Var[X_1] + ... + Var[X_n]) = nσ²/n² = σ²/n

By Chebyshev's inequality,

    P(|X̄ - µ| > ε) ≤ σ²_{X̄} / ε² = σ² / (nε²)

Clearly,

    lim_{n→∞} P(|X̄ - µ| > ε) = 0

Theorem 8 (Central Limit Theorem)  Let X_1, ..., X_n be independent, identically distributed random variables having mean µ and finite nonzero variance σ². Define

    S_n = X_1 + ... + X_n,    X̄ = (1/n)(X_1 + ... + X_n).

Then,

1. lim_{n→∞} S_n ~ N(nµ, nσ²); or
2. lim_{n→∞} (S_n - nµ)/(σ√n) ~ N(0, 1); or
3. lim_{n→∞} X̄ ~ N(µ, σ²/n); or
4. lim_{n→∞} (X̄ - µ)/(σ/√n) ~ N(0, 1).

pf) We show the 4th part of the theorem. Define

    Z_n = (X̄ - µ)/(σ/√n) = (1/√n) Σ_{i=1}^n (X_i - µ)/σ = (1/√n) Σ_{i=1}^n Y_i,   where Y_i = (X_i - µ)/σ.

Clearly,

    E[Y_i] = 0,    E[Y_i²] = Var[Y_i] + (E[Y_i])² = 1

Hence,

    M_{Y_i}(t) = 1 + E[Y_i] t/1! + E[Y_i²] t²/2! + E[Y_i³] t³/3! + ...
               = 1 + t²/2 + E[Y_i³] t³/6 + ...

    M_{Y_i/√n}(t) = M_{Y_i}(t/√n) = 1 + t²/(2n) + E[Y_i³] t³/(6 n^{3/2}) + ...

    M_{Z_n}(t) = M_{Y_1/√n}(t) M_{Y_2/√n}(t) ··· M_{Y_n/√n}(t)
               = [ 1 + t²/(2n) + E[Y_i³] t³/(6 n^{3/2}) + ... ]^n

By L'Hôpital's rule, one can show (exercise)

    lim_{n→∞} M_{Z_n}(t) = e^{t²/2}

This concludes that

    lim_{n→∞} Z_n ~ N(0, 1)

Normal Approximation

By the central limit theorem, when a sample size is sufficiently large, we can use the normal distribution to approximate certain probabilities regarding the sample or the parameters of its corresponding population.

Example 29  Suppose that the lifetime of a bulb is exponentially distributed with a mean length of 10 days. As soon as one light bulb burns out, a similar one is installed in its place. Find the probability that more than 50 bulbs will be required during a one-year period.

Let X_i denote the lifetime of the ith bulb. Clearly, X_i ~ Exp(1/10). Hence, µ_{X_i} = 10 and σ²_{X_i} = 100. Define S_50 = X_1 + ... + X_50 to be the total lifetime of 50 bulbs. Then the probability that we want to find is P(S_50 < 365). Since n = 50 > 30, we can approximate the distribution of S_50 by N(50·10, 50·100) = N(500, 5000). Hence,

    P(S_50 < 365) ≈ Φ( (365 - 500)/√5000 ) = Φ(-1.91) = 0.028

Example 30  Let X ~ B(50, 0.3). Use the normal approximation to compute P(X ≤ 20).

Let X_i ~ B(1, 0.3), i = 1, 2, ..., 50, be independent random variables. Then X = X_1 + ... + X_50. Clearly, µ_{X_i} = p = 0.3 and σ²_{X_i} = pq = 0.21. According to the central limit theorem, we can approximate the distribution of X by N(15, 10.5). Hence,

    P(X ≤ 20) ≈ Φ( (20 - 15)/√10.5 ) = Φ(1.543) = 0.9386
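Example 30 can be compared against simulated binomial draws; a version with a continuity correction (not used in the notes above) is usually somewhat closer. A Python sketch assuming NumPy (seed and sample size are arbitrary):

    import numpy as np
    from math import erf, sqrt

    def Phi(z):                                   # standard normal cdf via the error function
        return 0.5 * (1 + erf(z / sqrt(2)))

    rng = np.random.default_rng(6)
    x = rng.binomial(50, 0.3, size=1_000_000)
    print((x <= 20).mean())                       # simulated P(X <= 20)
    print(Phi((20 - 15) / sqrt(10.5)))            # approximation used above, approx 0.9386
    print(Phi((20.5 - 15) / sqrt(10.5)))          # with a continuity correction, typically closer to the simulation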

5.9 Exercises

1. Find E[X] and Var[X] for the random variable X, where
   (a) X ~ B(n, p);  (b) X ~ G(p);  (c) X ~ NB(r, p);  (d) X ~ P(α);  (e) X ~ Exp(λ);  (f) X ~ Γ(α, λ);  (g) X ~ N(µ, σ²);  (h) X ~ U(a, b).

2. Let X be a nonnegative integer-valued random variable having a finite expectation. Show that

       E[X] = Σ_{x=1}^∞ P(X ≥ x).

3. Let X be a geometrically distributed random variable and let M > 0 be an integer. Set Y = max(X, M) and Z = min(X, M). Find E[Y] and E[Z]. (Hint: use the fact stated in Problem 2.)

4. Let the pmf of the random variable X be

       p(x) = 2x/(N(N+1))   if x = 1, 2, ..., N
       p(x) = 0             otherwise

   Find E[X] and Var[X].

5. Let X ~ P(α). Find E[1/(1 + X)].

6. Let X be a discrete random variable with pmf

       x      -1    0    1    2
       p(x)   1/8  1/8  1/4  1/2

   (a) Find E[X] and Var[X].
   (b) Let Y = X². Find the pmf p_Y, E[Y] and Var[Y].

7. Suppose that we have two decks of n cards, each numbered 1, 2, ..., n. The two decks are shuffled and the cards are matched against each other. We say a match occurs at position i if the ith card from each deck has the same number. Let S_n denote the number of matches. Compute E[S_n] and Var[S_n].

8. A box has 3 balls labelled 1, 2 and 3. Two balls are selected without replacement from the box. Let X and Y denote the numbers of the 1st and 2nd balls, respectively. Compute Cov(X, Y).

9. The length of time, in minutes, for an airplane to wait for clearance to take off at a certain airport is a random variable Y = 3X - 2, where X ~ Exp(1/4). Find the mean and the variance of Y.

10. The total time, measured in units of 100 hours, that a teenager runs her stereo set over a period of one year is a continuous random variable X that has pdf

        f(x) = x       if 0 < x < 1
        f(x) = 2 - x   if 1 ≤ x < 2
        f(x) = 0       otherwise

    Let Y = 60X² + 39X denote the number of kilowatt hours expended annually. Compute E[Y].

11. Let X_1, ..., X_n be independent random variables having a common density with mean µ and variance σ². Let X̄ = (X_1 + ... + X_n)/n.

    (a) Show that

        Σ_{k=1}^n (X_k - X̄)² = Σ_{k=1}^n (X_k - µ)² - n(X̄ - µ)².

    (b) Conclude from (a) that

        E[ Σ_{k=1}^n (X_k - X̄)² ] = (n - 1)σ².

12. If a random variable X is defined such that E[(X - 1)²] = 10 and E[(X - 2)²] = 6, find µ and σ².

13. If the jpdf of the random variables X and Y is given by

        f(x, y) = (2/7)(x + 2y)   if 0 < x < 1, 1 < y < 2
        f(x, y) = 0               otherwise

    compute E[(X/Y³) + X²Y].

14. Toss 3 fair dice 10 times. Let X denote the number of times with the same three faces landing up, and let Y denote the number of times with exactly the same two faces landing up. Find E[XY].

15. Let X and Y be two independent random variables. Suppose that E[X⁴] = 2, E[Y²] = 1, E[X²] = 1, and E[Y] = 0. Find Var[X²Y].

16. Let X ~ χ²_n. Compute E[√X].

17. Let X ~ N(0, σ²). Compute E[|X|] and E[X²].

18. Let E[X^n] = n!. Find the pdf of X.

19. Let the pdf of X be given by

        f(x) = (1/2) e^{-|x|},   x ∈ R.

    Find M_X(t), E[X], and Var[X].

20. Let M_X(t) = e^{166t + 200t²}. Find (a) µ_X and σ²_X; (b) P(170 < X < 200).

21. Suppose that the moment generating function of the random variable X is

        M_X(t) = (1/3)(2^t + 3^t + 5^t),   t ∈ R.

    Find the cdf of X.

22. Let X ~ Γ(α_1, λ) and Y ~ Γ(α_2, λ) be independent random variables, and set Z = X + Y. Find the conditional expectation of X given Z.

23. Use moment generating functions to prove Theorem 4.

24. Suppose that the random variable X satisfies E[X^n] = n!. Find the pdf of X.

25. Let the jpdf of X and Y be defined by

        f(x, y) = (1/6) e^{-(x/2 + y/3)}   if x > 0, y > 0
        f(x, y) = 0                        otherwise

    Find (a) f_X and f_Y; (b) Cov(X, Y); (c) ρ_{X,Y}; (d) ρ_{2X-5, 3Y+2}.

26. Let X_1, X_2, X_3 have the jpdf

        f(x_1, x_2, x_3) = (x_1 + x_2) e^{-x_3}   if 0 < x_1, x_2 < 1, x_3 > 0
        f(x_1, x_2, x_3) = 0                      otherwise

    Compute Cov(X_1, X_3).

27. A bolt manufacturer knows that 5% of his production is defective. He gives a guarantee on his shipment of 10,000 parts by promising to refund the money if more than a bolts are defective. How small can the manufacturer choose a and still be assured that he need not give a refund more than 1% of the time?

28. Suppose that we are given a random variable X with E[X] = 120 and E[X²] = 14,464. Use Chebyshev's inequality to find bounds for the probabilities
    (a) P(96 ≤ X ≤ 144);
    (b) P(108 ≤ X ≤ 132).

29. Let X ~ P(λ). Use Chebyshev's inequality to verify

    (a) P(|X - λ| ≥ λ/2) ≤ 4/λ;
    (b) P(X ≥ 2λ) ≤ 1/λ.

30. A local company manufactures telephone wire. The average length of the wire is 52 inches with a standard deviation of 6.5 inches. At most, what percentage of the telephone wire from this company exceeds 71.5 inches?

31. Let the random variables X and Y have the jpdf

        f(x, y) = 1/20   if x ≤ y ≤ x + 2, 0 ≤ x ≤ 10
        f(x, y) = 0      otherwise

    Find f_{Y|X}, E[Y | x], and Var[Y | x].

32. Let the random variables X and Y have the jpmf

        p(x, y) = (x + y)/21   if x = 1, 2 and y = 1, 2, 3
        p(x, y) = 0            otherwise

    Find (a) f_X; (b) f_{Y|X}; (c) E[Y | x]; (d) E[E[Y | X]]; (e) E[Y² | x]; (f) Var[Y | x]; (g) E[Var[Y | X]].

33. Let X and Y have the joint probability density function

        f(x, y) = 1/2,   0 ≤ x ≤ y ≤ 2.

    Compute (a) f_X(x); (b) E[Y | x]; (c) Var[Y | x]; (d) ρ_{X,Y}.

34. Suppose that the random variables X and Y have jpdf

        f(x, y) = 2   if 0 < x < y < 1
        f(x, y) = 0   otherwise

    Find (a) E[X]; (b) E[X | y]; (c) Var[E[X | Y]].

35. Let the random variables X and Y be independent. Show that E[Y | x] = E[Y].

36. Show that E[g(X)Y | x] = g(x)E[Y | x], where g is a real-valued function.

37. Show that Var[Y] = E[Var[Y | X]] + Var[E[Y | X]].

38. Let X ~ G(p) and Y ~ G(p) be independent. Find E[Y | X + Y = z].

39. Let X ~ P(λ_1) and Y ~ P(λ_2) be independent. Find E[Y | X + Y = z].

40. Let X_i ~ U(0, 1), i = 1, 2, ..., n, be independent, and let

        Y_1 = min(X_1, ..., X_n),    Y_n = max(X_1, ..., X_n).

    (a) Are Y_1 and Y_n independent?

    (b) Find E[Y_n | Y_1 = y_1].

41. The amount of time that a bank teller spends on a customer is a random variable with a mean µ = 3.2 minutes and a standard deviation σ = 1.6 minutes. If a random sample of 64 customers is observed, find the probability that their mean time at the teller's counter is
    (a) at most 2.7 minutes;
    (b) more than 3.5 minutes;
    (c) at least 3.2 minutes but less than 3.4 minutes.

42. A random sample of size 25 is taken from a normal population having a mean of 80 and a standard deviation of 5. A second random sample of size 36 is taken from a different normal population having a mean of 75 and a standard deviation of 3. Find the probability that the sample mean computed from the 25 measurements will exceed the sample mean computed from the 36 measurements by at least 3.4 but less than 5.9.