STAT 430/510 Probability Hui Nie Lecture 16 June 24th, 2009
Review
- Sum of Independent Normal Random Variables
- Sum of Independent Poisson Random Variables
- Sum of Independent Binomial Random Variables
- Conditional Distributions: Discrete Case
Conditional Distributions: Continuous Case
Let X and Y be jointly continuous random variables. Then for any value x for which f_X(x) > 0, the conditional pdf of Y given X = x is

f_{Y|X}(y|x) = \frac{f(x,y)}{f_X(x)}, \quad -\infty < y < \infty

For any set A,

P(Y \in A \mid X = x) = \int_A f_{Y|X}(y|x)\,dy = \int_A \frac{f(x,y)}{f_X(x)}\,dy

If X and Y are independent, then

f_{Y|X}(y|x) = \frac{f(x,y)}{f_X(x)} = \frac{f_X(x)\,f_Y(y)}{f_X(x)} = f_Y(y)
Example
The joint density of X and Y is given by

f(x,y) = \begin{cases} \frac{12}{5}\, x(2 - x - y), & 0 < x < 1,\ 0 < y < 1 \\ 0, & \text{otherwise} \end{cases}

Compute the conditional density of X given that Y = y, where 0 < y < 1.

f_{X|Y}(x|y) = \frac{f(x,y)}{f_Y(y)} = \frac{x(2 - x - y)}{\int_0^1 x(2 - x - y)\,dx} = \frac{6x(2 - x - y)}{4 - 3y}
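A quick sanity check of the example above: a conditional density must integrate to 1 in x for every fixed y. The sketch below verifies this numerically with a plain-Python midpoint rule (the grid size and the test values of y are arbitrary choices for illustration).

```python
# Check that f_{X|Y}(x|y) = 6x(2 - x - y) / (4 - 3y) integrates to 1
# over 0 < x < 1 for several fixed values of y in (0, 1).

def conditional_density(x, y):
    return 6 * x * (2 - x - y) / (4 - 3 * y)

def integrate(f, a, b, n=100_000):
    """Composite midpoint rule on [a, b] with n subintervals."""
    h = (b - a) / n
    return sum(f(a + (i + 0.5) * h) for i in range(n)) * h

for y in (0.1, 0.5, 0.9):
    total = integrate(lambda x: conditional_density(x, y), 0.0, 1.0)
    print(f"y = {y}: integral = {total:.6f}")  # each close to 1
```

The midpoint rule is essentially exact here because the integrand is a quadratic in x.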
One Discrete R.V. and One Continuous R.V.
Suppose that X is a continuous random variable having probability density function f and N is a discrete random variable. Then the conditional density of X given that N = n is

f_{X|N}(x|n) = \frac{P(N = n \mid X = x)\, f(x)}{P(N = n)}
Example Consider n + m trials having a common probability of success. Suppose, however, that this success probability is not fixed in advance but is chosen from a uniform (0,1) population. What is the conditional distribution of the success probability given that the n + m trials result in n successes?
Example: Solution
Let X denote the probability that any given trial is a success, and let N be the number of successes.

f_{X|N}(x|n) = \frac{P(N = n \mid X = x)\, f_X(x)}{P(N = n)} = \frac{\binom{n+m}{n} x^n (1-x)^m}{P(N = n)} \propto x^n (1-x)^m

The conditional density is a beta distribution with parameters (n + 1, m + 1).
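The beta conclusion can be checked by simulation: draw p uniformly, run the trials, and keep p only when exactly n successes occur. The retained values should behave like Beta(n+1, m+1). Below is a minimal sketch for the concrete case n = 7, m = 3 (the sample size and seed are arbitrary choices), comparing the empirical mean with the Beta(8, 4) mean 8/12.

```python
# Rejection-style simulation of the example: condition on N = n
# successes out of n + m Uniform(0,1)-parameterized Bernoulli trials.
import random

def posterior_samples(n=7, m=3, draws=200_000, seed=0):
    rng = random.Random(seed)
    kept = []
    for _ in range(draws):
        p = rng.random()                                   # p ~ Uniform(0, 1)
        successes = sum(rng.random() < p for _ in range(n + m))
        if successes == n:                                 # condition on N = n
            kept.append(p)
    return kept

samples = posterior_samples()
mean = sum(samples) / len(samples)
print(f"conditional mean ≈ {mean:.3f} (Beta(8, 4) mean = {8/12:.3f})")
```

Roughly 1/(n+m+1) of the draws are kept, which is itself the marginal probability P(N = n) = 1/11 from the uniform prior.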
Joint Probability Distribution of Functions of R.V.s
Let X_1 and X_2 be jointly continuous random variables with joint probability density function f_{X_1,X_2}. We want to find the joint distribution of Y_1 = g_1(X_1, X_2) and Y_2 = g_2(X_1, X_2).
Two Conditions
Condition 1: The equations y_1 = g_1(x_1, x_2) and y_2 = g_2(x_1, x_2) can be uniquely solved for x_1 and x_2 in terms of y_1 and y_2, with solutions given by x_1 = h_1(y_1, y_2) and x_2 = h_2(y_1, y_2).
Condition 2: The functions g_1 and g_2 have continuous partial derivatives at all points (x_1, x_2) and are such that the 2 × 2 determinant

J(x_1, x_2) = \begin{vmatrix} \frac{\partial g_1}{\partial x_1} & \frac{\partial g_1}{\partial x_2} \\ \frac{\partial g_2}{\partial x_1} & \frac{\partial g_2}{\partial x_2} \end{vmatrix} \neq 0

at all points (x_1, x_2).
Change of R.V.s
Under the two conditions above, the random variables Y_1 and Y_2 are jointly continuous with joint density function given by

f_{Y_1,Y_2}(y_1, y_2) = f_{X_1,X_2}(x_1, x_2)\, |J(x_1, x_2)|^{-1}

where x_1 = h_1(y_1, y_2) and x_2 = h_2(y_1, y_2).
Example
Let X_1 and X_2 be jointly continuous random variables with probability density function f_{X_1,X_2}. Let Y_1 = X_1 + X_2 and Y_2 = X_1 - X_2. Find the joint density function of Y_1 and Y_2 in terms of f_{X_1,X_2}.
Example: Solution
Let g_1(x_1, x_2) = x_1 + x_2 and g_2(x_1, x_2) = x_1 - x_2. Then

J(x_1, x_2) = \begin{vmatrix} 1 & 1 \\ 1 & -1 \end{vmatrix} = -2, \quad \text{so } |J(x_1, x_2)| = 2

The equations y_1 = x_1 + x_2 and y_2 = x_1 - x_2 have solutions x_1 = (y_1 + y_2)/2 and x_2 = (y_1 - y_2)/2, so

f_{Y_1,Y_2}(y_1, y_2) = \frac{1}{2}\, f_{X_1,X_2}\!\left( \frac{y_1 + y_2}{2},\ \frac{y_1 - y_2}{2} \right)
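For a concrete check of this formula, take X_1 and X_2 to be iid standard normals: the density above then factors, making Y_1 = X_1 + X_2 and Y_2 = X_1 - X_2 independent N(0, 2) variables. The Monte Carlo sketch below (sample size and seed are arbitrary choices) checks the variances and the covariance empirically.

```python
# Monte Carlo check: for iid N(0,1) inputs, the sum and difference
# should each have variance 2 and be uncorrelated.
import random

rng = random.Random(1)
n = 200_000
y1, y2 = [], []
for _ in range(n):
    x1, x2 = rng.gauss(0, 1), rng.gauss(0, 1)
    y1.append(x1 + x2)   # Y1 = X1 + X2
    y2.append(x1 - x2)   # Y2 = X1 - X2

var1 = sum(v * v for v in y1) / n          # both means are 0, so
var2 = sum(v * v for v in y2) / n          # E[Y^2] estimates Var(Y)
cov = sum(a * b for a, b in zip(y1, y2)) / n
print(f"Var(Y1) ≈ {var1:.3f}, Var(Y2) ≈ {var2:.3f}, Cov ≈ {cov:.3f}")
```

For jointly normal variables zero covariance implies independence, so this is consistent with the factored density.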
Generalization to n Dimensions
Two Conditions:
- The equations y_1 = g_1(x_1, \ldots, x_n), \ldots, y_n = g_n(x_1, \ldots, x_n) have a unique solution x_1 = h_1(y_1, \ldots, y_n), \ldots, x_n = h_n(y_1, \ldots, y_n).
- J(x_1, \ldots, x_n) \neq 0

Conclusion:

f_{Y_1,\ldots,Y_n}(y_1, \ldots, y_n) = f_{X_1,\ldots,X_n}(x_1, \ldots, x_n)\, |J(x_1, \ldots, x_n)|^{-1}

where x_i = h_i(y_1, \ldots, y_n), i = 1, \ldots, n.
Example
Let X_1, X_2 and X_3 be independent standard normal random variables. If Y_1 = X_1 + X_2 + X_3, Y_2 = X_1 - X_2 and Y_3 = X_1 - X_3, compute the joint density function of Y_1, Y_2, Y_3.
Example: Solution

J = \begin{vmatrix} 1 & 1 & 1 \\ 1 & -1 & 0 \\ 1 & 0 & -1 \end{vmatrix} = 3

Solving for the X's gives

X_1 = \frac{Y_1 + Y_2 + Y_3}{3}, \quad X_2 = \frac{Y_1 - 2Y_2 + Y_3}{3}, \quad X_3 = \frac{Y_1 + Y_2 - 2Y_3}{3}

so

f_{Y_1,Y_2,Y_3}(y_1, y_2, y_3) = \frac{1}{3(2\pi)^{3/2}} \exp\left\{ -\frac{1}{2}\left[ \left(\frac{y_1 + y_2 + y_3}{3}\right)^2 + \left(\frac{y_1 - 2y_2 + y_3}{3}\right)^2 + \left(\frac{y_1 + y_2 - 2y_3}{3}\right)^2 \right] \right\}
= \frac{1}{3(2\pi)^{3/2}} \exp\left\{ -\frac{1}{2}\left( \frac{y_1^2}{3} + \frac{2y_2^2}{3} + \frac{2y_3^2}{3} - \frac{2y_2 y_3}{3} \right) \right\}
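Both the Jacobian value and the algebraic simplification of the exponent can be verified mechanically. The sketch below computes the 3 × 3 determinant directly and compares the two forms of the quadratic at a few random points (the test points and seed are arbitrary choices).

```python
# Verify the 3-D example: det J = 3, and the sum of squares in the
# exponent equals y1^2/3 + 2*y2^2/3 + 2*y3^2/3 - 2*y2*y3/3 everywhere.
import random

def det3(m):
    """Determinant of a 3x3 matrix by cofactor expansion along row 0."""
    (a, b, c), (d, e, f), (g, h, i) = m
    return a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)

J = [[1, 1, 1], [1, -1, 0], [1, 0, -1]]
print("det J =", det3(J))  # 3

rng = random.Random(42)
for _ in range(5):
    y1, y2, y3 = (rng.uniform(-5, 5) for _ in range(3))
    lhs = (((y1 + y2 + y3) / 3) ** 2
           + ((y1 - 2 * y2 + y3) / 3) ** 2
           + ((y1 + y2 - 2 * y3) / 3) ** 2)
    rhs = y1**2 / 3 + 2 * y2**2 / 3 + 2 * y3**2 / 3 - 2 * y2 * y3 / 3
    print(f"lhs = {lhs:.6f}, rhs = {rhs:.6f}")  # agree at every point
```

Agreement at random points is not a proof, but it catches sign and coefficient errors in the hand simplification.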