sheng@mail.ncyu.edu.tw
Content
Joint distribution functions
Independent random variables
Sums of independent random variables
Conditional distributions: discrete case
Conditional distributions: continuous case
Order statistics
Joint probability distribution of functions of random variables
Exchangeable random variables
6. Joint distribution functions
The joint cumulative probability distribution function of X and Y is defined by
F(a, b) = P{X ≤ a, Y ≤ b},  −∞ < a, b < ∞
The distribution of X can be obtained from the joint distribution of X and Y as
F_X(a) = P{X ≤ a} = F(a, ∞)
The distribution functions F_X and F_Y are sometimes referred to as the marginal distributions of X and Y.
For discrete random variables, the joint probability mass function of X and Y is defined by
p(x, y) = P{X = x, Y = y}
Example a
Suppose that 3 balls are randomly selected from an urn containing 3 red, 4 white, and 5 blue balls. If we let X and Y denote, respectively, the number of red and white balls chosen, then the joint probability mass function of X and Y, p(i, j) = P{X = i, Y = j}, is given as follows.
Example a
p(0, 0) = C(5,3)/C(12,3) = 10/220
p(0, 1) = C(4,1)C(5,2)/C(12,3) = 40/220
p(0, 2) = C(4,2)C(5,1)/C(12,3) = 30/220
p(0, 3) = C(4,3)/C(12,3) = 4/220
p(1, 0) = C(3,1)C(5,2)/C(12,3) = 30/220
p(1, 1) = C(3,1)C(4,1)C(5,1)/C(12,3) = 60/220
p(1, 2) = C(3,1)C(4,2)/C(12,3) = 18/220
p(2, 0) = C(3,2)C(5,1)/C(12,3) = 15/220
p(2, 1) = C(3,2)C(4,1)/C(12,3) = 12/220
p(3, 0) = C(3,3)/C(12,3) = 1/220
Table 6.1 displays these values of p(i, j) in tabular form, with the row and column sums in the margins. Because the individual probability mass functions of X and Y thus appear in the margins of such a table, they are often referred to as the marginal probability mass functions of X and Y, respectively.
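As a quick numerical check of Example a (a sketch, not part of the original slides), the joint pmf and its marginals can be computed with Python's `math.comb`:

```python
from math import comb

# Joint pmf of Example a: 3 balls drawn from 3 red, 4 white, 5 blue.
# p(i, j) = C(3,i) * C(4,j) * C(5, 3-i-j) / C(12,3), for i + j <= 3.
TOTAL = comb(12, 3)  # 220 equally likely samples

def p(i, j):
    k = 3 - i - j
    if k < 0:
        return 0.0
    return comb(3, i) * comb(4, j) * comb(5, k) / TOTAL

# The marginal pmfs are the row/column sums of the table.
p_X = [sum(p(i, j) for j in range(4)) for i in range(4)]
p_Y = [sum(p(i, j) for i in range(4)) for j in range(4)]

print(p(0, 0) * TOTAL)  # 10, i.e. p(0,0) = 10/220
print(sum(p_X))         # the marginal probabilities sum to 1
```

The marginal p_X(0) = C(9,3)/C(12,3) = 84/220 falls out of the row sum automatically.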
Joint probability density function
We say that X and Y are jointly continuous if there exists a function f(x, y), defined for all real x and y, having the property that, for every set C of pairs of real numbers,
P{(X, Y) ∈ C} = ∫∫_{(x,y)∈C} f(x, y) dx dy   (1.1)
The function f(x, y) is called the joint probability density function of X and Y.
If A and B are any sets of real numbers, then, by defining C = {(x, y) : x ∈ A, y ∈ B}, we obtain
P{X ∈ A, Y ∈ B} = ∫_B ∫_A f(x, y) dx dy   (1.2)
Individual probability density functions
The probability density function of X:
f_X(x) = ∫_{−∞}^{∞} f(x, y) dy
The probability density function of Y:
f_Y(y) = ∫_{−∞}^{∞} f(x, y) dx
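As a numerical illustration of marginalization (a sketch, using the density f(x, y) = 2e^{−x}e^{−2y} from Example c), integrating out y recovers the marginal f_X(x) = e^{−x}:

```python
import math

# Joint density from Example c: f(x, y) = 2 e^{-x} e^{-2y} for x, y > 0.
def f(x, y):
    return 2 * math.exp(-x) * math.exp(-2 * y)

# f_X(x) = integral of f(x, y) over y in (0, inf),
# approximated by a midpoint sum truncated at y = 20.
def f_X(x, n=50_000, ymax=20.0):
    h = ymax / n
    return sum(f(x, (j + 0.5) * h) * h for j in range(n))

print(f_X(1.0), math.exp(-1.0))  # the marginal of X is e^{-x}: both ~ 0.3679
```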
Example c
The joint density function of X and Y is given by
f(x, y) = 2 e^{−x} e^{−2y} for 0 < x < ∞, 0 < y < ∞, and 0 otherwise.
Compute (a) P{X > 1, Y < 1}, (b) P{X < Y}, and (c) P{X < a}.
Solution. c
(a) P{X > 1, Y < 1} = ∫_0^1 ∫_1^∞ 2 e^{−x} e^{−2y} dx dy
= ∫_0^1 2 e^{−2y} e^{−1} dy
= e^{−1}(1 − e^{−2})
(b) P{X < Y} = ∫∫_{(x,y): x<y} 2 e^{−x} e^{−2y} dx dy
= ∫_0^∞ ∫_0^y 2 e^{−x} e^{−2y} dx dy
= ∫_0^∞ 2 e^{−2y}(1 − e^{−y}) dy
= ∫_0^∞ 2 e^{−2y} dy − ∫_0^∞ 2 e^{−3y} dy
= 1 − 2/3 = 1/3
(c) P{X < a} = ∫_0^a ∫_0^∞ 2 e^{−2y} e^{−x} dy dx
= ∫_0^a e^{−x} dx
= 1 − e^{−a}
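The three closed-form answers can be sanity-checked by Monte Carlo (a sketch: since the density factors, X and Y can be sampled independently with `random.expovariate`):

```python
import math
import random

random.seed(0)
N = 200_000
# f(x, y) = 2 e^{-x} e^{-2y} factors, so X ~ Exponential(rate 1), Y ~ Exponential(rate 2).
xs = [random.expovariate(1.0) for _ in range(N)]
ys = [random.expovariate(2.0) for _ in range(N)]

p_a = sum(1 for x, y in zip(xs, ys) if x > 1 and y < 1) / N
p_b = sum(1 for x, y in zip(xs, ys) if x < y) / N

print(p_a, math.exp(-1) * (1 - math.exp(-2)))  # both ~ 0.318
print(p_b, 1 / 3)                              # both ~ 0.333
```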
N random variables
The joint cumulative probability distribution function F(a_1, a_2, ..., a_n) of the n random variables X_1, X_2, ..., X_n is defined by
F(a_1, a_2, ..., a_n) = P{X_1 ≤ a_1, X_2 ≤ a_2, ..., X_n ≤ a_n}
Further, the n random variables are said to be jointly continuous if there exists a function f(x_1, x_2, ..., x_n), called the joint probability density function, such that, for any set C in n-space,
P{(X_1, X_2, ..., X_n) ∈ C} = ∫∫...∫_{(x_1,...,x_n)∈C} f(x_1, ..., x_n) dx_1 dx_2 ... dx_n
N random variables
In particular, for any n sets of real numbers A_1, A_2, ..., A_n,
P{X_1 ∈ A_1, X_2 ∈ A_2, ..., X_n ∈ A_n} = ∫_{A_n} ... ∫_{A_1} f(x_1, ..., x_n) dx_1 ... dx_n
6.2 Independent random variables
The random variables X and Y are said to be independent if, for any two sets of real numbers A and B,
P{X ∈ A, Y ∈ B} = P{X ∈ A} P{Y ∈ B}   (2.1)
Hence, in terms of the joint distribution function F of X and Y, X and Y are independent if
F(a, b) = F_X(a) F_Y(b) for all a, b
When X and Y are discrete random variables, the condition of independence (2.1) is equivalent to
p(x, y) = p_X(x) p_Y(y) for all x, y
In the jointly continuous case, the condition of independence is equivalent to
f(x, y) = f_X(x) f_Y(y) for all x, y
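The discrete factorization condition can be illustrated with a small example not in the slides (a sketch, assuming two fair dice rolled independently):

```python
from itertools import product

# Two fair dice rolled independently: joint pmf p(x, y) = 1/36 for every pair.
faces = range(1, 7)
p = {(x, y): 1 / 36 for x, y in product(faces, faces)}

# Marginal pmfs obtained by summing the joint pmf.
p_X = {x: sum(p[(x, y)] for y in faces) for x in faces}
p_Y = {y: sum(p[(x, y)] for x in faces) for y in faces}

# Independence: p(x, y) = p_X(x) * p_Y(y) must hold for every pair (x, y).
ok = all(abs(p[(x, y)] - p_X[x] * p_Y[y]) < 1e-12
         for x, y in product(faces, faces))
print(ok)  # True
```

By contrast, the urn variables of Example a fail this test: p(3, 1) = 0 even though p_X(3) and p_Y(1) are both positive.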
Example 2c
A man and a woman decide to meet at a certain location. If each of them independently arrives at a time uniformly distributed between 12 noon and 1 P.M., find the probability that the first to arrive has to wait longer than 10 minutes.
Solution. 2c
If we let X and Y denote, respectively, the number of minutes past 12 that the man and the woman arrive, then X and Y are independent random variables, each uniformly distributed over (0, 60). The desired probability, P{X + 10 < Y} + P{Y + 10 < X}, which by symmetry equals 2P{X + 10 < Y}, is obtained as follows:
2P{X + 10 < Y} = 2 ∫∫_{x+10<y} f(x, y) dx dy
= 2 ∫∫_{x+10<y} f_X(x) f_Y(y) dx dy
= 2 ∫_{10}^{60} ∫_0^{y−10} (1/60)^2 dx dy
= (2/60^2) ∫_{10}^{60} (y − 10) dy
= 25/36
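A Monte Carlo check of the answer 25/36 ≈ 0.694 (a sketch, sampling the two arrival times independently):

```python
import random

random.seed(1)
N = 200_000
# X, Y: minutes past noon, each uniform on (0, 60), independent.
hits = 0
for _ in range(N):
    x = random.uniform(0, 60)
    y = random.uniform(0, 60)
    if abs(x - y) > 10:  # the first to arrive waits more than 10 minutes
        hits += 1
print(hits / N, 25 / 36)  # both ~ 0.694
```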
Example 2d: Buffon's needle problem
A table is ruled with equidistant parallel lines a distance D apart. A needle of length L, where L ≤ D, is randomly thrown on the table. What is the probability that the needle will intersect one of the lines (the other possibility being that the needle will be completely contained in the strip between two lines)?
Solution. 2d
Let us determine the position of the needle by specifying (1) the distance X from the middle point of the needle to the nearest parallel line and (2) the angle θ between the needle and the projected line of length X. The needle will intersect a line if the hypotenuse of the right triangle is less than L/2, that is, if
X / cos θ < L/2, or X < (L/2) cos θ
Solution. 2d
As X varies between 0 and D/2 and θ between 0 and π/2, it is reasonable to assume that they are independent, uniformly distributed random variables over these respective ranges. Hence,
P{X < (L/2) cos θ} = ∫∫_{x < (L/2) cos y} f_X(x) f_θ(y) dx dy
= (4/(πD)) ∫_0^{π/2} ∫_0^{(L/2) cos y} dx dy
= (4/(πD)) ∫_0^{π/2} (L/2) cos y dy
= 2L/(πD)
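The answer 2L/(πD) can be checked by simulating the two uniform variables directly (a sketch; D and L below are arbitrary demo values with L ≤ D):

```python
import math
import random

random.seed(2)
D, L = 2.0, 1.0  # line spacing and needle length (L <= D), chosen for the demo
N = 200_000
hits = 0
for _ in range(N):
    x = random.uniform(0, D / 2)            # midpoint distance to the nearest line
    theta = random.uniform(0, math.pi / 2)  # angle between needle and the lines
    if x < (L / 2) * math.cos(theta):       # intersection condition from the solution
        hits += 1
print(hits / N, 2 * L / (math.pi * D))  # both ~ 0.318
```

Historically, this simulation run in reverse (counting intersections of a physically thrown needle) gives an estimate of π.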
Proposition 2.1
The continuous (discrete) random variables X and Y are independent if and only if their joint probability density (mass) function can be expressed as
f_{X,Y}(x, y) = h(x) g(y),  −∞ < x < ∞, −∞ < y < ∞
Example 2f
If the joint density function of X and Y is
f(x, y) = 6 e^{−2x} e^{−3y},  0 < x < ∞, 0 < y < ∞
and is equal to 0 outside this region, are the random variables independent? What if the joint density function is
f(x, y) = 24xy,  0 < x < 1, 0 < y < 1, 0 < x + y < 1
and is equal to 0 otherwise?
Solution. 2f
In the first instance, the joint density function factors, and thus the random variables are independent (with one being exponential with rate 2 and the other exponential with rate 3). In the second instance, because the region in which the joint density is nonzero cannot be expressed in the form x ∈ A, y ∈ B, the joint density does not factor, so the random variables are not independent. This can be seen clearly by letting
I(x, y) = 1 if 0 < x < 1, 0 < y < 1, 0 < x + y < 1, and 0 otherwise
Solution. 2f And writing f x, y = 24xyI x, y which clearly does not factor into a part depending only on x and another depending only on y. 26
Example 2h
Let X, Y, Z be independent and uniformly distributed over (0, 1). Compute P{X ≥ YZ}.
Solution. 2h
Since
f_{X,Y,Z}(x, y, z) = f_X(x) f_Y(y) f_Z(z) = 1,  0 ≤ x ≤ 1, 0 ≤ y ≤ 1, 0 ≤ z ≤ 1
we have
P{X ≥ YZ} = ∫∫∫_{x ≥ yz} f_{X,Y,Z}(x, y, z) dx dy dz
= ∫_0^1 ∫_0^1 ∫_{yz}^1 dx dy dz
= ∫_0^1 ∫_0^1 (1 − yz) dy dz
= ∫_0^1 (1 − z/2) dz
= 3/4
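A quick Monte Carlo check of the answer 3/4 (a sketch, drawing the three uniforms independently):

```python
import random

random.seed(3)
N = 200_000
# X, Y, Z independent uniform on (0, 1); count trials with X >= YZ.
hits = sum(1 for _ in range(N)
           if random.random() >= random.random() * random.random())
print(hits / N, 3 / 4)  # both ~ 0.75
```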
Independence is a symmetric relation
The random variables X and Y are independent if their joint density function (or mass function in the discrete case) is the product of their individual density (or mass) functions. Therefore, to say that X is independent of Y is equivalent to saying that Y is independent of X, or simply that X and Y are independent. As a result, in considering whether X is independent of Y in situations where it is not at all intuitive that knowing the value of Y will not change the probabilities concerning X, it can be beneficial to interchange the roles of X and Y and ask instead whether Y is independent of X.
Integrate e^{−4x}
Since (e^x)' = e^x, we have ∫ e^x dx = e^x + C; that is, ∫ e^u du = e^u + C.
Let u = −4x, so du = −4 dx. Then
∫ e^{−4x} dx = −(1/4) ∫ e^{−4x} (−4) dx = −(1/4) ∫ e^u du = −(1/4) e^u + C = −(1/4) e^{−4x} + C
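The substitution can be verified numerically (a sketch: compare F(1) − F(0) for the antiderivative F against a midpoint Riemann sum of e^{−4x} over [0, 1]):

```python
import math

# Antiderivative from the substitution u = -4x: F(x) = -(1/4) e^{-4x}.
def F(x):
    return -0.25 * math.exp(-4 * x)

# Midpoint Riemann sum of e^{-4x} over [0, 1].
n = 100_000
h = 1.0 / n
riemann = sum(math.exp(-4 * (i + 0.5) * h) * h for i in range(n))

print(F(1) - F(0), riemann)  # both ~ 0.24542, i.e. (1 - e^{-4})/4
```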