STAT 430/510: Lecture 15 James Piette June 23, 2010
Updates

HW4 is up on my website. It is due next Mon. (June 28th). Starting today back at Section 6.4...
Conditional Distribution: Discrete

Def: The conditional pmf of X given Y = y is defined as
$$p_{X\mid Y}(x\mid y) = P(X = x \mid Y = y) = \frac{p(x, y)}{p_Y(y)}$$
for all values of y such that $p_Y(y) > 0$. The conditional cdf of X given that Y = y is defined similarly:
$$F_{X\mid Y}(x\mid y) = P(X \le x \mid Y = y) = \sum_{a \le x} p_{X\mid Y}(a\mid y)$$
Also, if X and Y are independent, then the conditional pmf of X given Y = y reduces to the unconditional pmf:
$$p_{X\mid Y}(x\mid y) = \frac{P(X = x)\,P(Y = y)}{P(Y = y)} = P(X = x)$$
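As a quick sanity check on the definition, here is a small sketch that computes a conditional pmf from a joint pmf table; the joint probabilities are made up purely for illustration:

```python
# Hypothetical joint pmf p(x, y) on a 2x2 grid (values chosen for illustration).
joint = {
    (0, 0): 0.10, (0, 1): 0.20,
    (1, 0): 0.30, (1, 1): 0.40,
}

def marginal_Y(y):
    # p_Y(y) = sum over all x of p(x, y)
    return sum(p for (x, yy), p in joint.items() if yy == y)

def cond_pmf_X_given_Y(x, y):
    # p_{X|Y}(x|y) = p(x, y) / p_Y(y), defined only when p_Y(y) > 0
    return joint.get((x, y), 0.0) / marginal_Y(y)

# Conditional pmf of X given Y = 1: 0.20/0.60 and 0.40/0.60.
print(cond_pmf_X_given_Y(0, 1), cond_pmf_X_given_Y(1, 1))
```

Note that the conditional probabilities over x sum to 1 for each fixed y, as a pmf must.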
Example 1

Question: If X and Y are independent Poisson r.v.'s with respective parameters $\lambda_1$ and $\lambda_2$, what is the conditional distribution of X given that $X + Y = n$?

Solution: Go through the same motions as when calculating conditional probabilities of events:
$$P(X = k \mid X + Y = n) = \frac{P(X = k,\; X + Y = n)}{P(X + Y = n)} = \frac{P(X = k,\; Y = n - k)}{P(X + Y = n)} = \frac{P(X = k)\,P(Y = n - k)}{P(X + Y = n)}$$
Recalling from the previous lecture, we know that X + Y is distributed... Poisson with parameter $\lambda_1 + \lambda_2$.
Example 1 (cont.)

Using that fact, we can proceed...
$$\frac{P(X = k)\,P(Y = n - k)}{P(X + Y = n)} = \frac{e^{-\lambda_1}\lambda_1^k}{k!}\,\frac{e^{-\lambda_2}\lambda_2^{n-k}}{(n-k)!}\left[\frac{e^{-(\lambda_1+\lambda_2)}(\lambda_1+\lambda_2)^n}{n!}\right]^{-1}$$
$$= \frac{n!}{(n-k)!\,k!}\,\frac{\lambda_1^k\,\lambda_2^{n-k}}{(\lambda_1+\lambda_2)^n} = \binom{n}{k}\left(\frac{\lambda_1}{\lambda_1+\lambda_2}\right)^k\left(\frac{\lambda_2}{\lambda_1+\lambda_2}\right)^{n-k}$$
Recognize this as anything? Binomial with n trials and success prob. $\frac{\lambda_1}{\lambda_1+\lambda_2}$.
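The algebra above can be confirmed numerically: for any illustrative choice of $\lambda_1$, $\lambda_2$, and $n$ (the values below are arbitrary), the conditional probabilities computed directly from Poisson pmfs should match the Binomial$(n, \lambda_1/(\lambda_1+\lambda_2))$ pmf term by term:

```python
from math import comb, exp, factorial

def poisson_pmf(k, lam):
    # P(X = k) for X ~ Poisson(lam)
    return exp(-lam) * lam**k / factorial(k)

lam1, lam2, n = 2.0, 3.0, 10   # illustrative parameter choices
p = lam1 / (lam1 + lam2)       # binomial success probability

for k in range(n + 1):
    # Direct computation: P(X = k, Y = n - k) / P(X + Y = n)
    direct = (poisson_pmf(k, lam1) * poisson_pmf(n - k, lam2)
              / poisson_pmf(n, lam1 + lam2))
    # Binomial(n, p) pmf at k
    binom = comb(n, k) * p**k * (1 - p)**(n - k)
    assert abs(direct - binom) < 1e-12
```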
Formalization

Def: Let X and Y be jointly continuous r.v.'s. Then, for any value of y for which $f_Y(y) > 0$, the conditional pdf of X given Y = y is
$$f_{X\mid Y}(x\mid y) = \frac{f(x, y)}{f_Y(y)}$$
For any set A,
$$P(X \in A \mid Y = y) = \int_A f_{X\mid Y}(x\mid y)\,dx$$
By letting $A = (-\infty, a]$, we define the conditional cdf of X given Y = y by
$$F_{X\mid Y}(a\mid y) = P(X \le a \mid Y = y) = \int_{-\infty}^{a} f_{X\mid Y}(x\mid y)\,dx$$
Example 2

The joint density of X and Y is given by
$$f(x, y) = \begin{cases} \frac{12}{5}x(2 - x - y) & 0 < x < 1,\; 0 < y < 1 \\ 0 & \text{otherwise} \end{cases}$$
Question: What is the conditional density of X given that Y = y, where 0 < y < 1?
Example 2 (cont.)

For 0 < x < 1, 0 < y < 1, we have
$$f_{X\mid Y}(x\mid y) = \frac{f(x, y)}{f_Y(y)} = \frac{f(x, y)}{\int_{-\infty}^{\infty} f(x, y)\,dx} = \frac{x(2 - x - y)}{\int_0^1 x(2 - x - y)\,dx} = \frac{x(2 - x - y)}{\frac{2}{3} - \frac{y}{2}} = \frac{6x(2 - x - y)}{4 - 3y}$$
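A quick numerical check that the result is a valid conditional density: for a fixed y (0.5 below, chosen arbitrarily), $6x(2 - x - y)/(4 - 3y)$ should integrate to 1 over $0 < x < 1$. A simple midpoint rule suffices:

```python
# Conditional density from Example 2: f_{X|Y}(x|y) = 6x(2 - x - y)/(4 - 3y)
def cond_density(x, y):
    return 6 * x * (2 - x - y) / (4 - 3 * y)

y = 0.5          # arbitrary fixed value of y in (0, 1)
N = 100_000      # number of midpoint-rule subintervals
h = 1.0 / N
total = sum(cond_density((i + 0.5) * h, y) for i in range(N)) * h
assert abs(total - 1.0) < 1e-6   # integrates to 1, as a density must
```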
Example 3

Suppose that the joint density of X and Y is given by
$$f(x, y) = \begin{cases} \frac{e^{-x/y}\,e^{-y}}{y} & 0 < x < \infty,\; 0 < y < \infty \\ 0 & \text{otherwise} \end{cases}$$
Question: What is $P(X > 1 \mid Y = y)$?

Solution: First, we need to obtain the conditional density of X given that Y = y.
$$f_{X\mid Y}(x\mid y) = \frac{f(x, y)}{f_Y(y)} = \frac{e^{-x/y}\,e^{-y}/y}{e^{-y}\int_0^\infty (1/y)e^{-x/y}\,dx} = \frac{1}{y}e^{-x/y}$$
Example 3 (cont.)

With this, we can find $P(X > 1 \mid Y = y)$...
$$P(X > 1 \mid Y = y) = \int_1^\infty \frac{1}{y}e^{-x/y}\,dx = \left[-e^{-x/y}\right]_1^\infty = e^{-1/y}$$
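The integral can be checked numerically by a midpoint rule on a truncated interval (the value y = 2 and the truncation point are arbitrary illustrative choices; the tail beyond the truncation is negligible):

```python
from math import exp

# Check that the integral of (1/y) e^(-x/y) over x in (1, infinity) equals e^(-1/y).
y = 2.0
upper = 60.0          # truncate the infinite upper limit; e^(-upper/y) is negligible
N = 200_000
h = (upper - 1.0) / N
integral = sum((1 / y) * exp(-(1.0 + (i + 0.5) * h) / y) for i in range(N)) * h
assert abs(integral - exp(-1 / y)) < 1e-6
```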
One Discrete and One Continuous

Suppose that X is a continuous r.v. with pdf f and N is a discrete r.v. Then the conditional distribution of X given that N = n is
$$f_{X\mid N}(x\mid n) = \frac{P(N = n \mid X = x)\,f(x)}{P(N = n)}$$
Example 4

Consider n + m trials having a common prob. of success. Suppose, however, that this success prob. is not fixed in advance, but is chosen from a uniform (0,1) distribution.

Question: What is the conditional distribution of the success prob. given that the n + m trials result in n successes?

Solution: Let X denote the prob. that a given trial is a success. Then X is distributed uniform (0,1). Given that X = x, the n + m trials are independent with common prob. of success x, so N, the number of successes, is a binomial r.v. with parameters (n + m, x).
Example 4 (cont.)

The conditional density of X given that N = n is
$$f_{X\mid N}(x\mid n) = \frac{P(N = n \mid X = x)\,f_X(x)}{P(N = n)} = \frac{\binom{n+m}{n}x^n(1 - x)^m}{P(N = n)} = c\,x^n(1 - x)^m, \quad 0 < x < 1$$
The conditional density is that of a beta r.v. with parameters (n + 1, m + 1).
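The normalizing constant c can be checked numerically: since the conditional density is Beta(n + 1, m + 1), we must have $c = 1/B(n+1, m+1) = (n+m+1)\binom{n+m}{n}$. The trial counts below are arbitrary illustrative choices:

```python
from math import comb

n, m = 3, 5        # illustrative numbers of successes and failures
N = 100_000
h = 1.0 / N

# Midpoint-rule estimate of the integral of x^n (1 - x)^m over (0, 1),
# i.e. the beta function B(n+1, m+1).
beta_integral = sum(((i + 0.5) * h)**n * (1 - (i + 0.5) * h)**m for i in range(N)) * h

# Normalizing constant of the Beta(n+1, m+1) density: 1/B(n+1, m+1).
c = (n + m + 1) * comb(n + m, n)
assert abs(1.0 / beta_integral - c) < 1e-3 * c
```

For n = 3, m = 5 this gives c = 9 · 56 = 504, matching B(4, 6) = 3!·5!/9! = 1/504.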
Motivation

Let $X_1$ and $X_2$ be jointly cont. r.v.'s with joint pdf $f_{X_1,X_2}$. Suppose there are two r.v.'s, $Y_1$ and $Y_2$, such that $Y_1 = g_1(X_1, X_2)$ and $Y_2 = g_2(X_1, X_2)$ for some functions $g_1$ and $g_2$. There are two assumptions:

1. The equations $y_1 = g_1(x_1, x_2)$ and $y_2 = g_2(x_1, x_2)$ can be uniquely solved for $x_1$ and $x_2$ in terms of $y_1$ and $y_2$, with solutions given by $x_1 = h_1(y_1, y_2)$ and $x_2 = h_2(y_1, y_2)$.

2. The functions $g_1$ and $g_2$ have cont. partial derivatives at all points $(x_1, x_2)$ and are s.t. the 2 × 2 determinant
$$J(x_1, x_2) = \begin{vmatrix} \frac{\partial g_1}{\partial x_1} & \frac{\partial g_1}{\partial x_2} \\ \frac{\partial g_2}{\partial x_1} & \frac{\partial g_2}{\partial x_2} \end{vmatrix} \neq 0$$
Formalization

Under the previous two assumptions, the r.v.'s $Y_1$ and $Y_2$ are jointly cont. with joint density given by
$$f_{Y_1,Y_2}(y_1, y_2) = f_{X_1,X_2}(x_1, x_2)\,\left|J(x_1, x_2)\right|^{-1}$$
where $x_1 = h_1(y_1, y_2)$ and $x_2 = h_2(y_1, y_2)$.
Example 5

Let $X_1$ and $X_2$ be jointly cont. r.v.'s with pdf $f_{X_1,X_2}$. Let $Y_1 = X_1 + X_2$, $Y_2 = X_1 - X_2$.

Question: What is the joint pdf of $Y_1$ and $Y_2$ in terms of $f_{X_1,X_2}$?

Solution: Let $g_1(x_1, x_2) = x_1 + x_2$ and $g_2(x_1, x_2) = x_1 - x_2$. Then, we need to verify assumption 2...
$$J(x_1, x_2) = \begin{vmatrix} 1 & 1 \\ 1 & -1 \end{vmatrix} = -2 \neq 0$$
Example 5 (cont.)

And verify assumption 1...
$$x_1 = \frac{y_1 + y_2}{2}, \qquad x_2 = \frac{y_1 - y_2}{2}$$
Thus, the desired joint density is:
$$f_{Y_1,Y_2}(y_1, y_2) = \frac{1}{2}\,f_{X_1,X_2}\!\left(\frac{y_1 + y_2}{2}, \frac{y_1 - y_2}{2}\right)$$
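One concrete instance of this formula: if $X_1$ and $X_2$ are independent standard normals, then $Y_1 = X_1 + X_2$ and $Y_2 = X_1 - X_2$ are independent N(0, 2) r.v.'s, so the change-of-variables result should reproduce the product of two N(0, 2) densities. A quick check at a few arbitrary points:

```python
from math import exp, pi, sqrt

def std_normal_joint(x1, x2):
    # Joint pdf of two independent standard normals
    return exp(-(x1**2 + x2**2) / 2) / (2 * pi)

def normal_pdf(x, var):
    # Pdf of a N(0, var) r.v.
    return exp(-x**2 / (2 * var)) / sqrt(2 * pi * var)

for y1, y2 in [(0.0, 0.0), (1.0, -0.5), (2.3, 1.7)]:
    # Change-of-variables formula: (1/2) f_{X1,X2}((y1+y2)/2, (y1-y2)/2)
    lhs = 0.5 * std_normal_joint((y1 + y2) / 2, (y1 - y2) / 2)
    # Known answer: product of two independent N(0, 2) densities
    rhs = normal_pdf(y1, 2.0) * normal_pdf(y2, 2.0)
    assert abs(lhs - rhs) < 1e-12
```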
Let's cover self-test problems 6.13 and 6.14. We've now covered up through 6.5 and started 6.7 (we skipped 6.6). Remember, HW4 is posted on my website and is due Mon., June 28th.