Chapter 4: Multiple Random Variables

4.1 Joint and Marginal Distributions

Definition 4.1.1: An n-dimensional random vector is a function from a sample space S into n-dimensional Euclidean space R^n.

Example 4.1.2 (Sample space for dice): Roll two fair dice. Recall that there are 36 equally likely outcomes in this experiment. Define X = sum of the dice and Y = |difference of the dice|; then (X, Y) is a two-dimensional random vector. To calculate probabilities in terms of (X, Y), we need to go back to the original sample space. For any A ⊂ R², we have

P((X, Y) ∈ A) = Σ_{s ∈ S : (X(s), Y(s)) ∈ A} P(s).

If A = I₁ × I₂ = {(x, y) : x ∈ I₁, y ∈ I₂} (which is called a cross-product), we have

P((X, Y) ∈ A) = P(X ∈ I₁, Y ∈ I₂) = Σ_{s ∈ S : X(s) ∈ I₁, Y(s) ∈ I₂} P(s).

Specifically, if A = {x} × {y}, we have

P((X, Y) ∈ A) = P(X = x, Y = y) = Σ_{s ∈ S : X(s) = x and Y(s) = y} P(s).

For example, we can calculate:

P((X, Y) ∈ A) = P((1,1)) + P((1,2)) + P((2,1)) + P((2,2)) = 4/36 if A = {(x, y) : x + y ≤ 4};
P(X = 5, Y = 3) = P((4,1)) + P((1,4)) = 2/36.
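Since the dice example is pure counting, the two probabilities above can be cross-checked by enumerating the 36 outcomes. A minimal Python sketch (ours, not part of the notes):

```python
from fractions import Fraction

# Build the joint pmf of X = sum and Y = |difference| for two fair dice
# by enumerating all 36 equally likely outcomes (Example 4.1.2).
pmf = {}
for d1 in range(1, 7):
    for d2 in range(1, 7):
        x, y = d1 + d2, abs(d1 - d2)
        pmf[(x, y)] = pmf.get((x, y), Fraction(0)) + Fraction(1, 36)

# P((X, Y) in A) for A = {(x, y) : x + y <= 4}
p_a = sum(p for (x, y), p in pmf.items() if x + y <= 4)

# P(X = 5, Y = 3), coming from the outcomes (4, 1) and (1, 4)
p_53 = pmf[(5, 3)]

print(p_a, p_53)  # 1/9 (= 4/36) and 1/18 (= 2/36)
```

Exact `Fraction` arithmetic avoids any floating-point doubt about equalities like 4/36 = 1/9.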
Definition 4.1.3: Let (X, Y) be a discrete bivariate random vector. Then the function f(x, y) from R² into R defined by f(x, y) = f_{X,Y}(x, y) = P(X = x, Y = y) is called the joint probability mass function, or joint pmf, of (X, Y). In this case f(x, y) must satisfy (1) 0 ≤ f(x, y) ≤ 1 and (2) Σ_{(x,y) ∈ R²} f(x, y) = 1.

Example: Find f_{X,Y}(x, y) for (X, Y) defined in Example 4.1.2 and calculate P((X, Y) ∈ A).

 y\x |  2    3    4    5    6    7    8    9    10   11   12
  0  | 1/36      1/36      1/36      1/36      1/36      1/36
  1  |      1/18      1/18      1/18      1/18      1/18
  2  |           1/18      1/18      1/18      1/18
  3  |                1/18      1/18      1/18
  4  |                     1/18      1/18
  5  |                          1/18

(All other entries are 0.)

Similarly to the single random variable case, P(X ∈ A) = Σ_{x ∈ A} f_X(x), we have:

P((X, Y) ∈ A) = Σ_{s : (X(s), Y(s)) ∈ A} P(s) = Σ_{(x,y) ∈ A} f(x, y).

For example, we can calculate P((X, Y) ∈ A) for A = {(x, y) : x = 7 and y ≤ 4} = {(7,0), (7,1), (7,2), (7,3), (7,4)}. We have

P((X, Y) ∈ A) = f_{X,Y}(7,0) + f_{X,Y}(7,1) + f_{X,Y}(7,2) + f_{X,Y}(7,3) + f_{X,Y}(7,4) = f_{X,Y}(7,1) + f_{X,Y}(7,3) = 1/9.
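This last probability can also be verified without the table, straight from the sample space. A short sketch (ours, for checking only):

```python
from fractions import Fraction

# Verify P((X, Y) in A) = 1/9 for A = {(x, y) : x = 7 and y <= 4}
# directly from the 36 equally likely dice outcomes.
p = Fraction(0)
f71 = Fraction(0)  # table entry f(7, 1)
for d1 in range(1, 7):
    for d2 in range(1, 7):
        x, y = d1 + d2, abs(d1 - d2)
        if x == 7 and y <= 4:
            p += Fraction(1, 36)
        if (x, y) == (7, 1):
            f71 += Fraction(1, 36)
```

Only y = 1 and y = 3 contribute for x = 7, each with mass 1/18, which is why the five-term sum collapses to two terms.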
Similarly, we can calculate P((X, Y) ∈ A) for A = {(x, y) : x + y ≤ 4} = {(2,0), (2,1), (3,0), (3,1), (4,0)}. We have

P((X, Y) ∈ A) = f_{X,Y}(2,0) + f_{X,Y}(2,1) + f_{X,Y}(3,0) + f_{X,Y}(3,1) + f_{X,Y}(4,0) = f_{X,Y}(2,0) + f_{X,Y}(3,1) + f_{X,Y}(4,0) = 4/36.

Example (Selecting a committee): An ad hoc committee of three is selected at random from a pool of 10 students consisting of three seniors, three juniors, two sophomores, and two freshmen. Let X be the number of seniors and Y the number of juniors selected. What is the joint pmf of (X, Y)?

The joint pmf of (X, Y) is

f_{X,Y}(x, y) = n(x, y) / C(10, 3) = C(3, x) C(3, y) C(4, 3 − x − y) / C(10, 3), for x, y = 0, 1, 2, 3 and x + y ≤ 3,

where C(n, k) denotes the binomial coefficient "n choose k".

How do we compute the expected value of a function g(X, Y)?

E g(X, Y) = Σ_{(x,y) ∈ R²} g(x, y) f(x, y).

Example 4.1.4: Find E(XY) for (X, Y) defined in Example 4.1.2. Let g(x, y) = xy; then

E(XY) = Σ_{(x,y) ∈ R²} g(x, y) f(x, y) = Σ_{(x,y) ∈ R²} xy f(x, y) = 13 11/18.

Theorem: A function f(x, y) is a pmf of a discrete bivariate random vector (X, Y) if and only if

(a) f(x, y) ≥ 0 for all x, y;
(b) Σ_{(x,y)} f(x, y) = 1.

Example 4.1.5 (Joint pmf for dice): Define f(x, y) by f(0,0) = f(0,1) = 1/6, f(1,0) = f(1,1) = 1/3, and f(x, y) = 0 for any other (x, y). For example, we can calculate (without referring to the original sample space)

P(X = Y) = P(X = 0, Y = 0) + P(X = 1, Y = 1) = 1/6 + 1/3 = 1/2.

If we toss two fair dice and define X = 0 if the first die shows at most 2 and X = 1 otherwise, and define Y = 0 if the second die shows an odd number and Y = 1 otherwise, then (X, Y) has the above pmf.

Theorem 4.1.6: Let (X, Y) be a discrete bivariate random vector with joint pmf f_{X,Y}(x, y). Then the marginal pmfs of X and Y, f_X(x) = P(X = x) and f_Y(y) = P(Y = y), are given by

f_X(x) = Σ_{y ∈ R} f_{X,Y}(x, y) and f_Y(y) = Σ_{x ∈ R} f_{X,Y}(x, y).

Example 4.1.7 (Marginal pmf for dice example):

 y\x |  2    3    4    5    6    7    8    9    10   11   12  | f_Y(y)
  0  | 1/36      1/36      1/36      1/36      1/36      1/36 |  6/36
  1  |      1/18      1/18      1/18      1/18      1/18      | 10/36
  2  |           1/18      1/18      1/18      1/18           |  8/36
  3  |                1/18      1/18      1/18                |  6/36
  4  |                     1/18      1/18                     |  4/36
  5  |                          1/18                          |  2/36
f_X(x)| 1/36 2/36 3/36 4/36 5/36 6/36 5/36 4/36 3/36 2/36 1/36 |  1

Example 4.1.8 (Dice probabilities): Quantities involving only X or only Y can be calculated using either the joint or the marginal pmf:

P(Y < 3) = P(Y = 0) + P(Y = 1) + P(Y = 2) = (6 + 10 + 8)/36 = 2/3;
P(Y < 3) = Σ_x Σ_{y=0}^{2} P(X = x, Y = y) = 2/3.

Note: It is possible for two random vectors to have the same marginals but not the same joint pmf. Hence, the joint pmf cannot be determined from the marginals alone.

Example 4.1.9 (Same marginals, different joint pmf): Define two joint pmfs by

(1) f(0,0) = f(0,1) = 1/6, f(1,0) = f(1,1) = 1/3; and
(2) f(0,0) = 1/12, f(1,0) = 5/12, f(0,1) = f(1,1) = 3/12.

Then they have the same marginal distributions for X and Y:
(1) f_X(0) = f(0,0) + f(0,1) = 1/6 + 1/6 = 1/3, f_Y(0) = f(0,0) + f(1,0) = 1/6 + 1/3 = 1/2;
(2) f_X(0) = f(0,0) + f(0,1) = 1/12 + 3/12 = 1/3, f_Y(0) = f(0,0) + f(1,0) = 1/12 + 5/12 = 1/2.

Definition 4.1.10: A function f(x, y) from R² into R is called a joint density function, or joint pdf, of the continuous bivariate random vector (X, Y) if for every A ⊂ R²,

P((X, Y) ∈ A) = ∫∫_A f(x, y) dx dy.

Notes:
A valid joint pdf f(x, y) must satisfy (1) f(x, y) ≥ 0 and (2) ∫_{−∞}^{∞} ∫_{−∞}^{∞} f(x, y) dx dy = 1.
The expected value of a real-valued function g(X, Y) is E g(X, Y) = ∫∫_{R²} g(x, y) f(x, y) dx dy.
The marginal pdfs of X and Y are given by f_X(x) = ∫_{−∞}^{∞} f(x, y) dy and f_Y(y) = ∫_{−∞}^{∞} f(x, y) dx.

Definition: The joint cdf of (X, Y) is defined by F(x, y) = P(X ≤ x, Y ≤ y) for all (x, y) ∈ R². For the continuous case,

F(x, y) = ∫_{−∞}^{y} ∫_{−∞}^{x} f(s, t) ds dt,

which by the Fundamental Theorem of Calculus implies f(x, y) = ∂²F(x, y)/∂x∂y.
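Example 4.1.9 above is easy to check mechanically. The sketch below (our encoding of the two pmfs as dictionaries, not part of the notes) computes both pairs of marginals and confirms that the joints differ even though the marginals agree:

```python
from fractions import Fraction

# The two joint pmfs of Example 4.1.9, stored as {(x, y): probability}.
joint1 = {(0, 0): Fraction(1, 6), (0, 1): Fraction(1, 6),
          (1, 0): Fraction(1, 3), (1, 1): Fraction(1, 3)}
joint2 = {(0, 0): Fraction(1, 12), (1, 0): Fraction(5, 12),
          (0, 1): Fraction(3, 12), (1, 1): Fraction(3, 12)}

def marginals(joint):
    # Theorem 4.1.6: sum the joint pmf over the other coordinate.
    fX, fY = {}, {}
    for (x, y), p in joint.items():
        fX[x] = fX.get(x, Fraction(0)) + p
        fY[y] = fY.get(y, Fraction(0)) + p
    return fX, fY

same_marginals = marginals(joint1) == marginals(joint2)
same_joint = joint1 == joint2
```

This is the computational face of the note above: marginals do not determine the joint.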
Example 4.1.11 (Calculating joint probabilities): Define the joint pdf by

f(x, y) = 6xy², 0 < x < 1 and 0 < y < 1; f(x, y) = 0 otherwise.

(1) Show that this is a valid joint pdf. (2) Find f_X(x) and f_Y(y). (3) Find P(X + Y ≥ 1).

(1) f(x, y) ≥ 0, and ∫₀¹ ∫₀¹ 6xy² dx dy = 6 · (1/2) · (1/3) = 1.

(2) f_X(x) = ∫₀¹ 6xy² dy = 2x for 0 < x < 1; f_Y(y) = ∫₀¹ 6xy² dx = 3y² for 0 < y < 1.

(3) P(X + Y ≥ 1) = ∫₀¹ ∫_{1−y}^{1} 6xy² dx dy = ∫₀¹ ∫_{1−x}^{1} 6xy² dy dx = 9/10.

Example 4.1.12: Let the continuous random vector (X, Y) have joint pdf

f(x, y) = e^{−y}, 0 < x < y < ∞; f(x, y) = 0 otherwise.

Equivalently, f(x, y) = e^{−y} I_{{(u,v) : 0 < u < v < ∞}}(x, y). Find P(X + Y ≥ 1) and the marginal pdfs of X and Y.
P(X + Y ≥ 1) = 1 − P(X + Y < 1) = 1 − ∫₀^{1/2} ∫_x^{1−x} e^{−y} dy dx
= 1 − ∫₀^{1/2} (e^{−x} − e^{−(1−x)}) dx = 2e^{−1/2} − e^{−1};

f_X(x) = ∫_{−∞}^{∞} f(x, y) dy = ∫_x^{∞} exp(−y) dy = exp(−x), for x > 0;

f_Y(y) = ∫_{−∞}^{∞} f(x, y) dx = ∫₀^{y} exp(−y) dx = y exp(−y), for y > 0.

Example (Tensile strength): The tensile strengths X and Y of two kinds of nylon fiber have joint density function proportional to exp(−(x + y)/λ) for x > 0, y > 0. Then the joint pdf of (X, Y) is given by f_{X,Y}(x, y) = c exp(−(x + y)/λ) for x > 0, y > 0.

(a) Find c. (b) Find the cdf of (X, Y). (c) Find P(X + Y > λ).

(a) We have

1 = ∫∫ f_{X,Y}(x, y) dx dy = ∫₀^∞ ∫₀^∞ c exp(−(x + y)/λ) dx dy = c ∫₀^∞ exp(−x/λ) dx ∫₀^∞ exp(−y/λ) dy = cλ²,

so c = 1/λ².
(b) We have

F_{X,Y}(x, y) = ∫₀^y ∫₀^x f_{X,Y}(s, t) ds dt = ∫₀^y ∫₀^x c exp(−(s + t)/λ) ds dt
= (1/λ) ∫₀^x exp(−s/λ) ds · (1/λ) ∫₀^y exp(−t/λ) dt
= [1 − exp(−x/λ)][1 − exp(−y/λ)], for x > 0, y > 0.

(c) We have

P(X + Y > λ) = ∫∫_{x+y>λ} f(x, y) dx dy = 1 − ∫∫_{x+y≤λ} f(x, y) dx dy
= 1 − ∫₀^λ (∫₀^{λ−y} c exp(−(x + y)/λ) dx) dy
= 1 − ∫₀^λ (1/λ) exp(−y/λ)[1 − exp(−(λ − y)/λ)] dy
= 1 − ∫₀^λ (1/λ)(exp(−y/λ) − exp(−1)) dy
= 1 − (1 − exp(−1) − exp(−1)) = 2 exp(−1).
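A quick numerical cross-check of P(X + Y > λ) = 2e⁻¹: X and Y here are independent exponentials with mean λ, so we can simply simulate. A seeded Monte Carlo sketch (ours, with λ = 2 chosen arbitrarily; not part of the original solution):

```python
import random
from math import exp

# Monte Carlo check that P(X + Y > lam) = 2/e when X, Y are independent
# exponential with mean lam. The seed makes the run reproducible.
random.seed(0)
lam = 2.0
n = 200000
hits = sum(1 for _ in range(n)
           if random.expovariate(1 / lam) + random.expovariate(1 / lam) > lam)
estimate = hits / n
exact = 2 * exp(-1)
```

Note that `random.expovariate` takes the *rate* 1/λ, not the mean; the answer 2e⁻¹ ≈ 0.736 does not depend on λ.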
Alternatively, integrating directly over the region {x + y > λ}:

P(X + Y > λ) = ∫∫_{x+y>λ} f(x, y) dx dy
= ∫₀^λ (∫_{λ−x}^∞ c exp(−(x + y)/λ) dy) dx + ∫_λ^∞ (∫₀^∞ c exp(−(x + y)/λ) dy) dx
= ∫₀^λ (1/λ) exp(−x/λ) exp(−(λ − x)/λ) dx + ∫_λ^∞ (1/λ) exp(−x/λ) dx
= ∫₀^λ (1/λ) exp(−1) dx + exp(−1) = exp(−1) + exp(−1) = 2 exp(−1).

Example (Obtaining the pdf from the cdf): Consider the cdf F_{X,Y}(x, y) = [1 − exp(−x)][1 − exp(−y)] for x > 0 and y > 0. Then the pdf is

f_{X,Y}(x, y) = ∂²F_{X,Y}(x, y)/∂x∂y = exp(−x) exp(−y) for x > 0 and y > 0.

Example (Obtaining the cdf from the pdf): Consider the following pdf: f(x, y) = 1 for 0 < x < 1, 0 < y < 1. Then the corresponding cdf is:
F(x, y) =
  0,   if x < 0 or y < 0;
  xy,  if 0 ≤ x < 1, 0 ≤ y < 1;
  y,   if 1 ≤ x, 0 ≤ y < 1;
  x,   if 0 ≤ x < 1, 1 ≤ y;
  1,   if 1 ≤ x, 1 ≤ y.

4.2 Conditional Distributions and Independence

Definition 4.2.1: Let (X, Y) be a discrete bivariate random vector with joint pmf f(x, y) and marginal pmfs f_X(x) and f_Y(y). For any x such that P(X = x) = f_X(x) > 0, the conditional pmf of Y given that X = x is the function of y denoted by f(y | x):

f(y | x) = P(Y = y | X = x) = f(x, y)/f_X(x).

For any y such that P(Y = y) = f_Y(y) > 0, the conditional pmf of X given that Y = y is the function of x denoted by f(x | y):

f(x | y) = P(X = x | Y = y) = f(x, y)/f_Y(y).

Note: f(y | x) is a valid pmf, since f(y | x) ≥ 0 (because f(x, y) and f_X(x) are joint and marginal pmfs), and

Σ_y f(y | x) = Σ_y f(x, y)/f_X(x) = f_X(x)/f_X(x) = 1.

Example 4.2.2: Define the joint pmf of (X, Y) by f(0,1) = f(0,2) = 2/18, f(1,1) = f(1,3) = 3/18, f(1,2) = 4/18, and f(2,3) = 4/18; that is,

 y\x |   0     1     2   | f_Y(y)
  1  | 2/18  3/18        |  5/18
  2  | 2/18  4/18        |  6/18
  3  |       3/18  4/18  |  7/18
f_X(x)| 4/18 10/18  4/18 |   1

(a) Obtain the conditional distribution of Y given X = 1. (b) Find P(Y > 1 | X = 1).

(a) f_X(1) = 10/18, so

f(Y = 1 | X = 1) = 3/10, f(Y = 2 | X = 1) = 4/10, f(Y = 3 | X = 1) = 3/10.

(b) P(Y > 1 | X = 1) = f(Y = 2 | X = 1) + f(Y = 3 | X = 1) = 7/10; equivalently,
P(Y > 1 | X = 1) = P(Y > 1, X = 1)/P(X = 1) = 7/10.

Example (Coin tossing): Consider tossing a fair coin three times and let X be the number of heads and Y the number of tails before the first head. The joint pmf of (X, Y) is given in the following table:

 y\x |  0    1    2    3  | P(Y = y)
  0  |      1/8  2/8  1/8 |   4/8
  1  |      1/8  1/8      |   2/8
  2  |      1/8           |   1/8
  3  | 1/8                |   1/8
P(X=x)| 1/8  3/8  3/8  1/8 |   1

The conditional pmf f(x | y) is:
f(x | 0) = P(X = x | Y = 0) = P(X = x, Y = 0)/P(Y = 0) =
  1/4, if x = 1 or 3;
  1/2, if x = 2;
  0, otherwise.

f(x | 1) = P(X = x | Y = 1) = P(X = x, Y = 1)/P(Y = 1) =
  1/2, if x = 1 or 2;
  0, otherwise.

The conditional expected value and moments:

E(X | Y = 0) = (1/4)·1 + (1/2)·2 + (1/4)·3 = 2,
E(X² | Y = 0) = (1/4)·1² + (1/2)·2² + (1/4)·3² = 4.5.

The conditional probability:

P(X ≤ 2 | Y = 0) = P(X = 0 | Y = 0) + P(X = 1 | Y = 0) + P(X = 2 | Y = 0) = 3/4.

Definition 4.2.3: Let (X, Y) be a continuous bivariate random vector with joint pdf f(x, y) and marginal pdfs f_X(x) and f_Y(y). For any x such that f_X(x) > 0, the conditional pdf of Y given that X = x is the function of y denoted by f(y | x): f(y | x) = f(x, y)/f_X(x). For any y such that f_Y(y) > 0, the conditional pdf of X given that Y = y is the function of x denoted by f(x | y): f(x | y) = f(x, y)/f_Y(y).

Example 4.2.4: Let the continuous random vector (X, Y) have joint pdf

f(x, y) = e^{−y} I_{{(u,v) : 0 < u < v < ∞}}(x, y).

(a) Find the marginal pdf of X. (b) For any x such that f_X(x) > 0, find f(y | x).
(a) f_X(x) = ∫_x^∞ e^{−y} dy = e^{−x} I_{(0,∞)}(x).

(b) f(y | x) = f(x, y)/f_X(x) = e^{−(y−x)} I_{(x,∞)}(y), for x > 0.

Calculating expected values using conditional pmfs or pdfs: Let (X, Y) be a discrete (continuous) bivariate random vector with joint pmf (pdf) f(x, y) and marginal pmfs (pdfs) f_X(x) and f_Y(y), and let g(Y) be a function of Y. Then the conditional expected value of g(Y) given that X = x is denoted by E(g(Y) | x) and is given by

E(g(Y) | x) = Σ_y g(y) f(y | x)   [ E(g(Y) | x) = ∫_{−∞}^{∞} g(y) f(y | x) dy ].

Example 4.2.4 (continued): (c) Find E(Y | X = x). (d) Find Var(Y | X = x).

(c) E(Y | X = x) = ∫_x^∞ y e^{−(y−x)} dy = 1 + x.

(d) E(Y² | X = x) = ∫_x^∞ y² e^{−(y−x)} dy = ∫₀^∞ (z + x)² e^{−z} dz = 2 + 2x + x²; therefore

Var(Y | X = x) = E(Y² | X = x) − (E(Y | X = x))² = 2 + 2x + x² − (1 + x)² = 1.

Example: A soft drink machine has a random amount Y₂ in supply at the beginning of a given day and dispenses a random amount Y₁ during the day. It has been observed that they have the following joint pdf:
f(y₁, y₂) = 1/2 for 0 ≤ y₁ ≤ y₂ ≤ 2. Since f_{Y₂}(y₂) = ∫₀^{y₂} (1/2) dy₁ = y₂/2, the conditional pdf of Y₁ given Y₂ = y₂ is

f(y₁ | Y₂ = y₂) = f(y₁, y₂)/f_{Y₂}(y₂) = 1/y₂ for 0 ≤ y₁ ≤ y₂.

Hence P(Y₁ ≤ 1/2 | Y₂ = 1.5) = (1/2)/1.5 = 1/3.

Definition 4.2.5: Let (X, Y) be a bivariate random vector with joint pdf or pmf f(x, y) and marginal pdfs or pmfs f_X(x) and f_Y(y). Then X and Y are called independent random variables if, for every x ∈ R and y ∈ R,

f(x, y) = f_X(x) f_Y(y).

Consequently, if X and Y are independent, f(y | x) = f_Y(y) and f(x | y) = f_X(x).

Technical note: If f(x, y) is the joint pdf for the continuous random vector (X, Y) and f(x, y) ≠ f_X(x) f_Y(y) only on a set A such that ∫∫_A f(x, y) dx dy = 0, then X and Y are still called independent random variables.

Example 4.2.6: Consider the discrete bivariate random vector (X, Y) with joint pmf given by

f(1,1) = f(2,1) = f(2,2) = 1/10, f(1,2) = f(1,3) = 1/5, and f(2,3) = 3/10.

Find the marginals of X and Y. Are X and Y independent?

The marginals are f_X(1) = f_X(2) = 1/2 and f_Y(1) = 1/5, f_Y(2) = 3/10, f_Y(3) = 1/2. X and Y are not independent, because f(1,3) = 1/5 ≠ f_X(1) f_Y(3) = 1/4.

Question: Can we check for independence without knowing the marginals?
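Before answering that question, Example 4.2.6 can be checked exhaustively in code. A minimal sketch (our dictionary encoding, not part of the notes) compares the joint pmf against the product of its marginals at every support point:

```python
from fractions import Fraction

# Example 4.2.6: joint pmf of (X, Y) as {(x, y): probability}.
joint = {(1, 1): Fraction(1, 10), (2, 1): Fraction(1, 10), (2, 2): Fraction(1, 10),
         (1, 2): Fraction(1, 5), (1, 3): Fraction(1, 5), (2, 3): Fraction(3, 10)}

# Marginals by summing over the other coordinate (Theorem 4.1.6).
fX = {x: sum(p for (a, y), p in joint.items() if a == x) for x in (1, 2)}
fY = {y: sum(p for (x, b), p in joint.items() if b == y) for y in (1, 2, 3)}

# Independence fails as soon as one cell disagrees with the product.
independent = all(joint[(x, y)] == fX[x] * fY[y] for (x, y) in joint)
```

The single offending cell is exactly the one in the notes: f(1,3) = 1/5 while f_X(1) f_Y(3) = 1/4.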
Lemma 4.2.7: Let (X, Y) be a bivariate random vector with joint pdf or pmf f(x, y). Then X and Y are independent random variables if and only if there exist functions g(x) and h(y) such that, for every x ∈ R and y ∈ R,

f(x, y) = g(x) h(y).

In this case f_X(x) = c·g(x) and f_Y(y) = d·h(y), where c and d are constants chosen so that f_X(x) and f_Y(y) are valid pdfs or pmfs.

Example 4.2.8: Consider the joint pdf f(x, y) = (1/384) x² y⁴ e^{−y−x/2}, for x > 0 and y > 0. By Lemma 4.2.7, X and Y are independent random variables, since f(x, y) factors as g(x) h(y) with, for instance, g(x) = x² e^{−x/2} and h(y) = (1/384) y⁴ e^{−y}.

Notes:
1. Consider the set {(x, y) : x ∈ A and y ∈ B}, where A = {x : f_X(x) > 0} and B = {y : f_Y(y) > 0}. This set is called a cross-product and is denoted by A × B.
2. If f(x, y) is a joint pdf or pmf such that the set {(x, y) : f(x, y) > 0} is not a cross-product, then the random variables X and Y with joint pdf or pmf f(x, y) are not independent.
3. If it is known that X and Y are independent random variables with marginal pdfs (or pmfs) f_X(x) and f_Y(y), then the joint pdf (or pmf) of X and Y is given by f(x, y) = f_X(x) f_Y(y). (See Example 4.2.9 for the discrete case.)

Theorem 4.2.10: Let X and Y be independent random variables.
(a) For any A ⊂ R and B ⊂ R, P(X ∈ A, Y ∈ B) = P(X ∈ A) P(Y ∈ B).
(b) Let g(x) be a function only of x and h(y) a function only of y. Then E(g(X) h(Y)) = E(g(X)) E(h(Y)).
Proof:

Example 4.2.11: Let X and Y be independent exponential(1) random variables.
(a) Find the joint pdf of X and Y.
(b) Find P(X ≥ 4, Y < 3).
(c) Find E(XY).

Note (checking independence): Let (X, Y) be a random vector. X and Y are called independent random variables if for every C = {(x, y) : x ∈ A, y ∈ B} we have

P((X, Y) ∈ C) = P(X ∈ A, Y ∈ B) = P(X ∈ A) P(Y ∈ B).

Under certain conditions, this definition can be simplified in terms of cdfs: X and Y are independent random variables if F_{X,Y}(x, y) = F_X(x) F_Y(y) for all x and y.

Example (Checking independence): Suppose (X, Y) has the joint pdf

f_{X,Y}(x, y) = (1 + xy)/4 for |x| < 1 and |y| < 1.

Then X and Y are not independent, but X² and Y² are independent. The marginal pdf of X is

f_X(x) = ∫_{−1}^{1} f_{X,Y}(x, y) dy = ∫_{−1}^{1} [(1 + xy)/4] dy = 1/2 for |x| < 1.

By symmetry, the marginal pdf of Y is f_Y(y) = 1/2 for |y| < 1.
So f_{X,Y}(x, y) ≠ f_X(x) f_Y(y), and X and Y are not independent. To get the joint pdf of (X², Y²), we first get the joint cdf of (X², Y²): for 0 < x < 1 and 0 < y < 1,

F_{X²,Y²}(x, y) = P(X² ≤ x, Y² ≤ y)
= P(−√x ≤ X ≤ √x, −√y ≤ Y ≤ √y)
= ∫_{−√x}^{√x} { ∫_{−√y}^{√y} [(1 + st)/4] dt } ds
= ∫_{−√x}^{√x} (√y/2) ds = √x √y = P(X² ≤ x) P(Y² ≤ y).

So the joint pdf of (X², Y²) is

f_{X²,Y²}(x, y) = ∂²F_{X²,Y²}(x, y)/∂x∂y = 1/(4√x √y) for 0 < x < 1, 0 < y < 1.

Theorem 4.2.12: Let X and Y be independent random variables with moment generating functions M_X(t) and M_Y(t). Then the moment generating function of the random variable Z = X + Y is given by

M_Z(t) = M_X(t) M_Y(t).

Proof:

Example 4.2.13: Let X ~ N(μ, σ²) and Y ~ N(γ, τ²) be independent. Find the pdf of Z = X + Y. (This is a very important result!)
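The classical answer to Example 4.2.13 is that Z = X + Y ~ N(μ + γ, σ² + τ²). A seeded Monte Carlo sketch (illustration only; the parameter values below are arbitrary choices of ours) checks the mean and variance of the sum:

```python
import random

# Check that for independent X ~ N(mu, sigma^2) and Y ~ N(gamma, tau^2),
# Z = X + Y has mean mu + gamma and variance sigma^2 + tau^2.
random.seed(2)
mu, sigma, gamma, tau = 1.0, 2.0, -0.5, 1.5   # arbitrary illustration values
n = 400000
zs = [random.gauss(mu, sigma) + random.gauss(gamma, tau) for _ in range(n)]
mean = sum(zs) / n                      # should be near mu + gamma = 0.5
var = sum((z - mean) ** 2 for z in zs) / n  # should be near 4 + 2.25 = 6.25
```

Simulation of course only checks the first two moments; the full normality of Z is what Theorem 4.2.12 delivers, since the product of the two normal mgfs is again a normal mgf.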