Bivariate Distributions. EGR 260, R. Van Til, Industrial & Systems Engineering Dept. Copyright 2013 Robert P. Van Til. All rights reserved.
What's It All About? Many random processes produce two or more random outcomes. Examples: the length and diameter of a part made by a CNC lathe; your SYS 317 final exam score and the amount of time spent studying. Note that the two outcomes of such a process may be related to each other.
What's It All About? In this presentation, we will extend our probability results to random processes with two RVs, including the development of probability functions, pdfs, expected value formulas, etc. Some new concepts will also be presented. Note that these results can be readily extended to random processes with three or more outcomes.
Definition. Let X and Y be discrete RVs. The joint probability function of X and Y is given by f_XY(x,y) = P[(X = x) ∩ (Y = y)]. The marginal probability functions of X and Y, respectively, are given by f_X(x) = Σ_y f_XY(x,y) and f_Y(y) = Σ_x f_XY(x,y).
Properties. Note the joint probability function computes probability! Hence, it satisfies all the properties of a probability function, such as 0 ≤ f_XY(x,y) ≤ 1 for any x, y ∈ S, and Σ_x Σ_y f_XY(x,y) = 1.
Calculating f_XY(x,y). Consider a discrete bivariate distribution with RVs X and Y. Values for the joint probability function can be calculated using f_XY(x,y) = P[(X = x) ∩ (Y = y)] = f_X(x) f_{Y|x}(y|x), (1) where f_X(x) is the marginal probability function of X and f_{Y|x}(y|x) is the conditional probability function of Y given X = x. Note it's OK to exchange X with Y in (1).
Example. Suppose a bin contains 56 parts, of which 7 are defective. Select 2 parts at random without replacement and let the discrete RVs be X = 0 if the 1st part is good, 1 if the 1st part is bad; Y = 0 if the 2nd part is good, 1 if the 2nd part is bad. Determine the joint probability function and the marginal probability functions.
Aside. Note that for many discrete bivariate distributions, the RVs take on only a small number of values. Hence, the joint probability function is placed in a table. The table also contains the marginal probability functions.
Example. For this example, the structure for the joint probability function table is

          Y = 0         Y = 1         f_X(x)
X = 0     f_XY(0,0)     f_XY(0,1)     f_X(0)
X = 1     f_XY(1,0)     f_XY(1,1)     f_X(1)
f_Y(y)    f_Y(0)        f_Y(1)        1.0
Example. Each marginal probability of Y is obtained by summing the joint probability function over all values of X, i.e., down a column of the table: f_Y(y) = Σ_{i=1}^n f_XY(x_i, y).
Example. Similarly, each marginal probability of X is obtained by summing over all values of Y, i.e., across a row of the table: f_X(x) = Σ_{i=1}^n f_XY(x, y_i).
Back to the Example Problem. Suppose a bin contains 56 parts, of which 7 are defective. Select 2 parts at random without replacement and let the discrete RVs be X = 0 if the 1st part is good, 1 if the 1st part is bad; Y = 0 if the 2nd part is good, 1 if the 2nd part is bad. Determine the joint probability function and the marginal probability functions.
Example. Using the multiplication rule with draws made without replacement: f_XY(0,0) = (49/56)(48/55) ≈ 0.764, f_XY(0,1) = (49/56)(7/55) ≈ 0.111, f_XY(1,0) = (7/56)(49/55) ≈ 0.111, and f_XY(1,1) = (7/56)(6/55) ≈ 0.014.
Example. Hence,

          Y = 0     Y = 1     f_X(x)
X = 0     0.764     0.111     0.875
X = 1     0.111     0.014     0.125
f_Y(y)    0.875     0.125     1.0
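As a cross-check, the joint probability table for the good-bad parts example can be built with the multiplication rule f_XY(x,y) = f_X(x) f_{Y|x}(y|x). A minimal Python sketch (variable names are illustrative, not from the slides):

```python
from fractions import Fraction

GOOD, BAD = 49, 7          # 56 parts total, 7 defective
TOTAL = GOOD + BAD

def joint(x, y):
    # Multiplication rule: P(1st draw) * P(2nd draw | 1st draw),
    # with the second draw made without replacement.
    first = Fraction(GOOD, TOTAL) if x == 0 else Fraction(BAD, TOTAL)
    remaining_good = GOOD - (1 if x == 0 else 0)
    remaining_bad = BAD - (1 if x == 1 else 0)
    second = (Fraction(remaining_good, TOTAL - 1) if y == 0
              else Fraction(remaining_bad, TOTAL - 1))
    return first * second

table = {(x, y): joint(x, y) for x in (0, 1) for y in (0, 1)}

# Marginals are row/column sums of the joint table.
fX = {x: table[(x, 0)] + table[(x, 1)] for x in (0, 1)}
fY = {y: table[(0, y)] + table[(1, y)] for y in (0, 1)}

for (x, y), p in table.items():
    print(f"f_XY({x},{y}) = {float(p):.3f}")
print(f"f_X(0) = {float(fX[0]):.3f}, f_Y(0) = {float(fY[0]):.3f}")
```

Using exact fractions avoids rounding error; the four joint probabilities sum to exactly 1, and the marginals match the table above.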
Definition. Let X and Y be continuous RVs. The joint probability density function, f_XY(x,y), is such that P[(a ≤ X ≤ b) ∩ (c ≤ Y ≤ d)] = ∫_a^b ∫_c^d f_XY(x,y) dy dx. The marginal probability density functions are given by f_X(x) = ∫_{-∞}^{∞} f_XY(x,y) dy and f_Y(y) = ∫_{-∞}^{∞} f_XY(x,y) dx.
Properties. The joint pdf is a pdf! Hence, f_XY(x,y) ≥ 0 for all x, y ∈ S, and ∫_{-∞}^{∞} ∫_{-∞}^{∞} f_XY(x,y) dy dx = 1.
Example. Any process for producing an industrial chemical will yield a product containing impurities. For a randomly selected sample of a particular chemical, let the RVs be X = {proportion of all impurities in the sample} and Y = {proportion of type 1 impurities among all impurities found in the sample}. After investigating several samples, it's determined that f_XY(x,y) = 2(1 - x) for 0 ≤ x ≤ 1, 0 ≤ y ≤ 1, and 0 elsewhere. Determine the probability that X < 0.5 and 0.4 ≤ Y ≤ 0.7.
Example. P[(X < 0.5) ∩ (0.4 ≤ Y ≤ 0.7)] = ∫_0^0.5 ∫_0.4^0.7 2(1 - x) dy dx = 0.3 ∫_0^0.5 2(1 - x) dx = 0.3 [2x - x²]_0^0.5 = 0.3(0.75) = 0.225.
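The double integral for the impurities example can be sanity-checked numerically. A minimal sketch using a midpoint-rule Riemann sum (the grid size n is an arbitrary choice):

```python
def joint_pdf(x, y):
    # Joint pdf from the example: 2(1 - x) on the unit square, 0 elsewhere.
    return 2 * (1 - x) if 0 <= x <= 1 and 0 <= y <= 1 else 0.0

def prob(ax, bx, ay, by, n=200):
    # Midpoint-rule double integral of the joint pdf over [ax,bx] x [ay,by].
    dx, dy = (bx - ax) / n, (by - ay) / n
    total = 0.0
    for i in range(n):
        for j in range(n):
            total += joint_pdf(ax + (i + 0.5) * dx, ay + (j + 0.5) * dy)
    return total * dx * dy

print(prob(0.0, 0.5, 0.4, 0.7))   # ≈ 0.225
print(prob(0.0, 1.0, 0.0, 1.0))   # total probability ≈ 1
```

Because the integrand is linear in x, the midpoint rule reproduces the exact answer 0.225 here up to floating-point error; it also confirms the pdf integrates to 1 over the unit square.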
Recall. As noted earlier, the conditional probability function for discrete RVs X and Y is f_{X|y}(x|y) = f_XY(x,y) / f_Y(y) for all f_Y(y) ≠ 0.
Definition. The conditional probability density function for continuous RVs X and Y is given by f_{X|y}(x|y) = f_XY(x,y) / f_Y(y) for all f_Y(y) ≠ 0. The conditional pdf is used to calculate conditional probabilities. For example, conditioning on an event c < Y < d, P(a < X < b | c < Y < d) = P[(a < X < b) ∩ (c < Y < d)] / P(c < Y < d).
Example. A soft drink machine starts the day with a supply of Y gallons and dispenses X gallons during the day without being resupplied. It is observed that X and Y have a joint pdf of f_XY(x,y) = 0.5 for 0 ≤ x ≤ y, 0 ≤ y ≤ 2, and 0 elsewhere. Determine the probability that less than 0.4 gallons of pop is sold during the day given that the machine started the day with more than 1 gallon of pop.
Example. Hence, want to determine P(X < 0.4 | Y > 1) = P[(X < 0.4) ∩ (Y > 1)] / P(Y > 1).
Example. So, the marginal pdf of Y is f_Y(y) = ∫_0^y f_XY(x,y) dx = 0.5y for 0 ≤ y ≤ 2. Thus, P(Y > 1) = ∫_1^2 0.5y dy = 0.75 and P[(X < 0.4) ∩ (Y > 1)] = ∫_0^0.4 ∫_1^2 0.5 dy dx = 0.2, so P(X < 0.4 | Y > 1) = 0.2/0.75 ≈ 0.267.
Example. Note that if the machine had started the day with Y > 1.5 gallons, then P(Y > 1.5) = ∫_1.5^2 0.5y dy = 0.4375 and P[(X < 0.4) ∩ (Y > 1.5)] = ∫_0^0.4 ∫_1.5^2 0.5 dy dx = 0.1, so P(X < 0.4 | Y > 1.5) = 0.1/0.4375 ≈ 0.229. But P(X < 0.4 | Y > 1) ≈ 0.267. This implies that the amount sold depends on the starting supply, i.e., X and Y are not statistically independent.
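The conditional probability in the soft drink example can be checked numerically from the definition P(A | B) = P(A ∩ B) / P(B), integrating the joint pdf with a midpoint rule. A sketch (the grid size and helper names are arbitrary choices):

```python
def f(x, y):
    # Joint pdf from the example: 0.5 on the triangle 0 <= x <= y <= 2.
    return 0.5 if 0 <= x <= y <= 2 else 0.0

def event_prob(x_ok, y_ok, n=400):
    # Midpoint-rule integral of f over [0,2] x [0,2], restricted to the
    # event described by the indicator functions x_ok(x) and y_ok(y).
    h = 2.0 / n
    total = 0.0
    for i in range(n):
        x = (i + 0.5) * h
        if not x_ok(x):
            continue
        for j in range(n):
            y = (j + 0.5) * h
            if y_ok(y):
                total += f(x, y)
    return total * h * h

num = event_prob(lambda x: x < 0.4, lambda y: y > 1)   # P(X<0.4 and Y>1)
den = event_prob(lambda x: True,   lambda y: y > 1)    # P(Y>1)
print(num / den)   # ≈ 0.2/0.75 ≈ 0.267
```

The numerator and denominator converge to 0.2 and 0.75 as the grid is refined, giving the conditional probability 4/15 ≈ 0.267.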
Definition. The RVs X and Y are statistically independent if f_XY(x,y) = f_X(x) f_Y(y) for all x, y ∈ S. The RVs can be discrete or continuous. This is our existing definition of independence written in terms of RVs.
Example. The joint probability function and marginal probability functions from the earlier good-bad parts problem are given below. Are the RVs X and Y statistically independent?

          Y = 0     Y = 1     f_X(x)
X = 0     0.764     0.111     0.875
X = 1     0.111     0.014     0.125
f_Y(y)    0.875     0.125     1.0
Example. Check the definition at one point: f_X(0) f_Y(0) = (0.875)(0.875) ≈ 0.766, but f_XY(0,0) ≈ 0.764. Since f_XY(0,0) ≠ f_X(0) f_Y(0), the RVs X and Y are not statistically independent.
Example. Does this make sense? Consider the random process. Suppose a bin contains 56 parts of which 7 are defective. Select 2 parts at random without replacement and let the discrete RVs be X = 0 if the 1st part is good, 1 if the 1st part is bad; Y = 0 if the 2nd part is good, 1 if the 2nd part is bad. Since the parts are selected without replacement, the outcome of the 2nd draw depends on the 1st draw, so some dependence is expected.
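The independence check can be carried out exactly at every point of the table. A sketch using Python's fractions module to avoid rounding (names are illustrative):

```python
from fractions import Fraction

# Exact joint probabilities for two draws without replacement
# from 49 good and 7 bad parts (56 total).
f_xy = {
    (0, 0): Fraction(49, 56) * Fraction(48, 55),
    (0, 1): Fraction(49, 56) * Fraction(7, 55),
    (1, 0): Fraction(7, 56) * Fraction(49, 55),
    (1, 1): Fraction(7, 56) * Fraction(6, 55),
}
f_x = {x: f_xy[(x, 0)] + f_xy[(x, 1)] for x in (0, 1)}
f_y = {y: f_xy[(0, y)] + f_xy[(1, y)] for y in (0, 1)}

# Independence requires f_XY(x, y) == f_X(x) * f_Y(y) at every point.
independent = all(f_xy[(x, y)] == f_x[x] * f_y[y]
                  for x in (0, 1) for y in (0, 1))
print(independent)             # False: sampling is without replacement
print(float(f_xy[(0, 0)]))     # 0.7636..., versus
print(float(f_x[0] * f_y[0]))  # 0.765625 = 0.875^2
```

With exact arithmetic the joint and product-of-marginals values differ at every cell, confirming the dependence (though the gap is small).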
Definition. Let g(X,Y) be any real-valued function of RVs X and Y. If RVs X and Y are discrete, then the expected value of g(X,Y) is E[g(X,Y)] = Σ_x Σ_y g(x,y) f_XY(x,y). If RVs X and Y are continuous, then the expected value of g(X,Y) is E[g(X,Y)] = ∫_{-∞}^{∞} ∫_{-∞}^{∞} g(x,y) f_XY(x,y) dy dx.
Properties of Expected Value. If RVs X and Y are statistically independent, then E(XY) = E(X)E(Y). For any RVs X_1, X_2, ..., X_n (they do not need to be statistically independent), E(X_1 + X_2 + ... + X_n) = E(X_1) + E(X_2) + ... + E(X_n). Also, E(X_1 - X_2) = E(X_1) - E(X_2).
Definition and Property. Let g(X,Y) be any real-valued function of RVs X and Y; then VAR[g(X,Y)] = E[(g(X,Y) - E[g(X,Y)])²] = E[g²(X,Y)] - E²[g(X,Y)]. Valid if X and Y are discrete RVs or continuous RVs.
Definition and Property. If X_1, X_2, ..., X_n are statistically independent, then VAR(X_1 + ... + X_n) = VAR(X_1) + ... + VAR(X_n). Also, if X_1 and X_2 are statistically independent, then VAR(X_1 - X_2) = VAR(X_1) + VAR(X_2).
Special Property. Suppose the normal RVs X ~ N(µ_X, σ_X) and Y ~ N(µ_Y, σ_Y) are statistically independent; then the RV given by Q = X + Y is also a normal RV with Q ~ N(µ_X + µ_Y, √(σ_X² + σ_Y²)). Note this result holds for n statistically independent normal RVs: Q = X_1 + X_2 + ... + X_n ~ N(µ_1 + µ_2 + ... + µ_n, √(σ_1² + σ_2² + ... + σ_n²)).
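This property can be illustrated by simulation. A sketch with hypothetical parameter values (the means and standard deviations below are made up for illustration, not from the slides):

```python
import random
import statistics

random.seed(1)
N = 100_000
mu_x, sd_x = 3.0, 0.4   # hypothetical X ~ N(3, 0.4)
mu_y, sd_y = 5.0, 0.3   # hypothetical Y ~ N(5, 0.3)

# Draw independent X and Y samples and form Q = X + Y.
q = [random.gauss(mu_x, sd_x) + random.gauss(mu_y, sd_y) for _ in range(N)]

print(statistics.mean(q))    # ≈ mu_x + mu_y = 8
print(statistics.stdev(q))   # ≈ sqrt(0.4^2 + 0.3^2) = 0.5
```

The sample mean of Q lands near µ_X + µ_Y and the sample standard deviation near √(σ_X² + σ_Y²), consistent with the stated property. (A histogram of q would also show the bell shape, though normality itself is not tested here.)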
Note. Everything covered in bivariate distributions up to now has applied existing concepts (e.g., pdfs or expected values) to a bivariate random process. Next, we introduce a new concept!
Definition. The covariance of bivariate RVs X and Y is a measure of how the two RVs vary together. 1. If X tends to be large when Y tends to be large, then X and Y will have a positive covariance. 2. If X tends to be large when Y tends to be small, then X and Y will have a negative covariance. 3. If X and Y are unrelated (i.e., statistically independent), then the covariance of X and Y is zero.
Illustration of Covariance. Consider a random process that produces RVs X and Y. Collect several samples from this process and make a scatter plot of its data. [Three scatter plots of Y versus X, illustrating positive covariance, negative covariance, and near-zero covariance.]
Definition. The covariance between RVs X and Y is given by COV(X,Y) = E[(X - E(X))(Y - E(Y))]. The formula is valid for discrete RVs or continuous RVs. Note an alternative formula for covariance is COV(X,Y) = E(XY) - E(X)E(Y).
Definition and Property. Recall if X and Y are statistically independent, then E(XY) = E(X)E(Y), and so COV(X,Y) = 0. The converse is not necessarily true. Note that for any RVs X and Y, VAR(X ± Y) = VAR(X) + VAR(Y) ± 2 COV(X,Y).
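The identity VAR(X + Y) = VAR(X) + VAR(Y) + 2 COV(X,Y) can be verified exactly on the good-bad parts table. A sketch using exact fractions (the helper E is illustrative):

```python
from fractions import Fraction

# Joint table for the good/bad parts example (exact fractions).
f_xy = {
    (0, 0): Fraction(49, 56) * Fraction(48, 55),
    (0, 1): Fraction(49, 56) * Fraction(7, 55),
    (1, 0): Fraction(7, 56) * Fraction(49, 55),
    (1, 1): Fraction(7, 56) * Fraction(6, 55),
}

def E(g):
    # Expected value of g(x, y) over the joint table.
    return sum(p * g(x, y) for (x, y), p in f_xy.items())

var_x = E(lambda x, y: x * x) - E(lambda x, y: x) ** 2
var_y = E(lambda x, y: y * y) - E(lambda x, y: y) ** 2
cov = E(lambda x, y: x * y) - E(lambda x, y: x) * E(lambda x, y: y)
var_sum = E(lambda x, y: (x + y) ** 2) - E(lambda x, y: x + y) ** 2

# VAR(X + Y) = VAR(X) + VAR(Y) + 2 COV(X, Y), exactly.
print(var_sum == var_x + var_y + 2 * cov)   # True
print(float(cov))                           # small negative value
```

With exact arithmetic, the two sides of the identity agree exactly, and the covariance comes out slightly negative, as expected for draws without replacement.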
A Problem with Covariance. What is considered a large positive, or negative, value for COV(X,Y)? It depends on the units of the RVs X and Y. Can eliminate this problem by considering the correlation coefficient, which is defined by ρ = COV(X,Y) / (σ_X σ_Y). Note that -1 ≤ ρ ≤ 1. Also, ρ has the same sign as COV(X,Y).
Example. Determine the covariance and ρ for RVs X and Y for the previous good-bad part example, where the joint probability function and marginal probability functions are given by

          Y = 0     Y = 1     f_X(x)
X = 0     0.764     0.111     0.875
X = 1     0.111     0.014     0.125
f_Y(y)    0.875     0.125     1.0
Example. E(X) = (0)(0.875) + (1)(0.125) = 0.125 and, by symmetry, E(Y) = 0.125. Also, E(XY) = (1)(1) f_XY(1,1) = (7/56)(6/55) ≈ 0.014. Hence, COV(X,Y) = E(XY) - E(X)E(Y) ≈ 0.0136 - 0.0156 ≈ -0.002.
Example. σ_X = √(E(X²) - E²(X)) = √(0.125 - 0.125²) ≈ 0.331 and, by symmetry, σ_Y ≈ 0.331. Hence, ρ = COV(X,Y) / (σ_X σ_Y) ≈ -0.002/0.109 ≈ -0.018.
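The covariance and correlation can also be computed directly from the rounded table values. A minimal sketch (note the three-decimal rounding in the table shifts ρ slightly; exact fractions give ρ = -1/55 ≈ -0.018):

```python
import math

# Joint probabilities rounded to three decimals, as in the table above.
f_xy = {(0, 0): 0.764, (0, 1): 0.111, (1, 0): 0.111, (1, 1): 0.014}

ex = sum(p * x for (x, y), p in f_xy.items())       # E(X)
ey = sum(p * y for (x, y), p in f_xy.items())       # E(Y)
exy = sum(p * x * y for (x, y), p in f_xy.items())  # E(XY)
cov = exy - ex * ey

sd_x = math.sqrt(sum(p * x * x for (x, y), p in f_xy.items()) - ex ** 2)
sd_y = math.sqrt(sum(p * y * y for (x, y), p in f_xy.items()) - ey ** 2)
rho = cov / (sd_x * sd_y)

print(round(cov, 4))   # ≈ -0.0016 with the rounded table values
print(round(rho, 3))   # ≈ -0.015 (exact fractions give -1/55 ≈ -0.018)
```

Either way, both COV(X,Y) and ρ are small and negative: the two draws are dependent, but only weakly.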
Example. Recall the random process. Suppose a bin contains 56 parts of which 7 are defective. Select 2 parts at random without replacement and let the discrete RVs be X = 0 if the 1st part is good, 1 if the 1st part is bad; Y = 0 if the 2nd part is good, 1 if the 2nd part is bad. Earlier we concluded that X and Y are statistically dependent, but not by too much! Hence, COV(X,Y) and ρ should be near zero, as found above.