Notes for Math 324, Part 19



Chapter 9: Multivariate distributions, covariance

Often we need to consider several random variables at the same time. We have a sample space S and r.v.'s X, Y, ..., which are functions from the sample space into R. Given two r.v.'s defined on the same sample space and a set A ⊆ R², we can define

P[(X, Y) ∈ A] = P[{s ∈ S : (X(s), Y(s)) ∈ A}].

Alternatively, we can consider the function (X, Y) : S → R². In this chapter, we consider r.v.'s with values in R² or another multidimensional space.

9.1 Multivariate discrete distributions

Definition 9.1.1. Given two discrete r.v.'s X and Y defined on the same probability space, the joint probability mass function of X and Y is the function p(x, y) = P[X = x, Y = y].

Theorem 9.1. Let p be the joint probability mass function of the r.v.'s X and Y. Then, (i) for each x, y, 0 ≤ p(x, y) ≤ 1; (ii) Σ_{x,y} p(x, y) = 1.

Definition 9.1.2. Let X and Y be two discrete r.v.'s defined on the same probability space. The marginal probability mass function of X is the function p_X(x) = P[X = x] = Σ_y p(x, y). The marginal probability mass function of Y is the function p_Y(y) = P[Y = y] = Σ_x p(x, y).

Example 9.1. Suppose that 3 balls are randomly selected from an urn containing 3 white, 4 red, and 5 blue balls. Let X and Y denote, respectively, the number of white and red balls that are chosen. Find the joint probability mass function of X and Y.

The joint probability mass function of X and Y is the function

p(x, y) = C(3, x) C(4, y) C(5, 3 − x − y) / C(12, 3), for integers x, y ≥ 0 with x + y ≤ 3, and p(x, y) = 0 otherwise.

Explicitly, with C(12, 3) = 220,

p(0, 0) = C(5, 3)/220 = 10/220,      p(0, 1) = C(4, 1)C(5, 2)/220 = 40/220,
p(0, 2) = C(4, 2)C(5, 1)/220 = 30/220,  p(0, 3) = C(4, 3)/220 = 4/220,
p(1, 0) = C(3, 1)C(5, 2)/220 = 30/220,  p(1, 1) = C(3, 1)C(4, 1)C(5, 1)/220 = 60/220,
p(1, 2) = C(3, 1)C(4, 2)/220 = 18/220,  p(2, 0) = C(3, 2)C(5, 1)/220 = 15/220,
p(2, 1) = C(3, 2)C(4, 1)/220 = 12/220,  p(3, 0) = C(3, 3)/220 = 1/220.

We can set up the probabilities in a table:

p(x, y)   y = 0     y = 1      y = 2     y = 3    p_X(x)
x = 0     10/220    40/220     30/220    4/220    84/220
x = 1     30/220    60/220     18/220    0        108/220
x = 2     15/220    12/220     0         0        27/220
x = 3     1/220     0          0         0        1/220
p_Y(y)    56/220    112/220    48/220    4/220    1

The marginal probability mass functions are

x        0         1          2         3
p_X(x)   84/220    108/220    27/220    1/220

y        0         1          2         3
p_Y(y)   56/220    112/220    48/220    4/220

Problem 9.1. (#3, Sample Test) Let X and Y be discrete random variables with joint probability function

p(x, y) = (2x + y)/12 for (x, y) = (0, 1), (0, 2), (1, 2), (1, 3), and 0 otherwise.

Determine the marginal probability function of X.

(A) p(x) = 1/6 for x = 0, 5/6 for x = 1, 0 otherwise
(B) p(x) = 1/4 for x = 0, 3/4 for x = 1, 0 otherwise
(C) p(x) = 1/3 for x = 0, 2/3 for x = 1, 0 otherwise
(D) p(x) = 2/9 for x = 0, 3/9 for x = 1, 4/9 for x = 2, 0 otherwise
(E) p(x) = y/12 for x = 0, (2 + y)/12 for x = 1, 0 otherwise

Answer: (B)

Solution: We have that p(0, 1) = 1/12, p(0, 2) = 2/12, p(1, 2) = 4/12, p(1, 3) = 5/12. So,

p_X(0) = p(0, 1) + p(0, 2) = 3/12 = 1/4 and p_X(1) = p(1, 2) + p(1, 3) = 9/12 = 3/4.

Problem 9.2. (#27, May 2000) A car dealership sells 0, 1, or 2 luxury cars on any day. When selling a car, the dealer also tries to persuade the customer to buy an extended warranty for the car. Let X denote the number of luxury cars sold in a given day, and let Y denote the number of extended warranties sold, where

P[X = 0, Y = 0] = 1/6, P[X = 1, Y = 0] = 1/12, P[X = 1, Y = 1] = 1/6,
P[X = 2, Y = 0] = 1/12, P[X = 2, Y = 1] = 1/3, P[X = 2, Y = 2] = 1/6.

What is the variance of X? Answer: 0.58

Solution: We find the marginal probability mass function of X and then the variance of X. We have that P[X = 0] = p(0, 0) = 1/6, P[X = 1] = p(1, 0) + p(1, 1) = 1/12 + 1/6 = 1/4, and P[X = 2] = p(2, 0) + p(2, 1) + p(2, 2) = 7/12. So,

E[X] = (0)(1/6) + (1)(1/4) + (2)(7/12) = 17/12,
E[X²] = (0)²(1/6) + (1)²(1/4) + (2)²(7/12) = 31/12, and
Var(X) = E[X²] − (E[X])² = 31/12 − (17/12)² = 83/144 ≈ 0.58.

9.2 Jointly continuous distributions

Definition 9.2.1. Two r.v.'s X and Y defined on the same probability space are said to be jointly continuous if there exists a function f : R² → [0, ∞) such that for each A ⊆ R², P[(X, Y) ∈ A] = ∫∫_A f(x, y) dx dy.

The function f above is called the joint probability density function of X and Y.

Theorem 9.2. Let f be the joint probability density function of the r.v.'s X and Y. Then, (i) for each x, y, f(x, y) ≥ 0; (ii) ∫_{−∞}^{∞} ∫_{−∞}^{∞} f(x, y) dx dy = 1.

Definition 9.2.2. Let X and Y be two jointly continuous r.v.'s defined on the same probability space with joint density function f_{X,Y}. The marginal density function of X is the function f_X(x) = ∫_{−∞}^{∞} f(x, y) dy. The marginal density function of Y is the function f_Y(y) = ∫_{−∞}^{∞} f(x, y) dx.

The marginal pdf f_X of X is defined so that for each A ⊆ R,

∫_A ∫_{−∞}^{∞} f_{X,Y}(x, y) dy dx = ∫_A f_X(x) dx.

Similarly, the marginal pdf f_Y of Y is defined so that for each A ⊆ R,

∫_A ∫_{−∞}^{∞} f_{X,Y}(x, y) dx dy = ∫_A f_Y(y) dy.

Problem 9.3. (#4, Sample Test) Let X and Y be random losses with joint density function f(x, y) = e^{−(x+y)} for x > 0 and y > 0. An insurance policy is written to reimburse X + Y. Calculate the probability that the reimbursement is less than 1. Answer: 1 − 2e^{−1}.

Figure 9.1: #4, Sample Test

Solution: We have to find

P[X + Y ≤ 1] = ∫_0^1 ∫_0^{1−x} e^{−(x+y)} dy dx = ∫_0^1 (e^{−x} − e^{−1}) dx = 1 − e^{−1} − e^{−1} = 1 − 2e^{−1}.

Problem 9.4. (#36, November 2000) An insurance company insures a large number of drivers. Let X be the random variable representing the company's losses under collision insurance, and let Y represent the company's losses under liability insurance. X and Y have joint density function

f(x, y) = (2x + 2 − y)/4 for 0 < x < 1 and 0 < y < 2, and 0 otherwise.

What is the probability that the total loss is at least 1? Answer: 0.71

Figure 9.2: #36, November 2000

Solution: We need to find

P[X + Y ≥ 1] = ∫_0^1 ∫_{1−x}^{2} ((2x + 2 − y)/4) dy dx
= ∫_0^1 [−(2x + 2 − y)²/8]_{y=1−x}^{y=2} dx
= ∫_0^1 ((3x + 1)²/8 − (2x)²/8) dx
= [(3x + 1)³/72 − x³/6]_0^1 = 63/72 − 12/72 = 51/72 ≈ 0.708.

Problem 9.5. (#5, May 2000) A company is reviewing tornado damage claims under a farm insurance policy. Let X be the portion of a claim representing damage to the house and

let Y be the portion of the same claim representing damage to the rest of the property. The joint density function of X and Y is

f(x, y) = 6[1 − (x + y)] for x > 0, y > 0, x + y < 1, and 0 otherwise.

Determine the probability that the portion of a claim representing damage to the house is less than 0.2. Answer: 0.488

Figure 9.3: #5, May 2000

Solution: We have to find

P[X < 0.2] = ∫_0^{0.2} ∫_0^{1−x} (6 − 6x − 6y) dy dx = ∫_0^{0.2} [(−3)(1 − x − y)²]_{y=0}^{y=1−x} dx = ∫_0^{0.2} 3(1 − x)² dx = 1 − (0.8)³ = 0.488.

Problem 9.6. (#24, May 2000) A device contains two components. The device fails if either component fails. The joint density function of the lifetimes of the components, measured in hours, is f(s, t), where 0 < s < 1 and 0 < t < 1. What is the probability that the device fails during the first half hour of operation? The answer is choice (E),

∫_0^{0.5} ∫_0^{1} f(s, t) ds dt + ∫_{0.5}^{1} ∫_0^{0.5} f(s, t) ds dt,

which integrates f over the region where min(s, t) < 1/2.
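The tornado-claim computation above reduces to the closed form 1 − (0.8)³. A small numeric sketch (a midpoint-rule double integral; the grid size is an arbitrary choice) confirms it:

```python
# Numeric check of the tornado-claim problem: P[X < 0.2] when
# f(x, y) = 6(1 - x - y) on the triangle x > 0, y > 0, x + y < 1.
def prob_x_below(cap, n=500):
    total = 0.0
    dx = cap / n
    for i in range(n):
        x = (i + 0.5) * dx
        dy = (1.0 - x) / n
        # inner midpoint sum over 0 < y < 1 - x
        inner = sum(6.0 * (1.0 - x - (j + 0.5) * dy) for j in range(n)) * dy
        total += inner * dx
    return total

print(round(prob_x_below(0.2), 3))  # 0.488, matching 1 - 0.8**3
```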

Figure 9.4: #24, May 2000

Problem 9.7. (#2, November 2001) The future lifetimes (in months) of two components of a machine have the following joint density function:

f(x, y) = (6/125000)(50 − x − y) for 0 < x < 50 − y < 50, and 0 otherwise.

What is the probability that both components are still functioning 20 months from now? The answer is choice (B),

(6/125000) ∫_{20}^{30} ∫_{20}^{50−x} (50 − x − y) dy dx.

9.3 Independent random variables

Definition 9.3.1. Two r.v.'s X and Y are said to be independent if for each A, B ⊆ R, P[X ∈ A, Y ∈ B] = P[X ∈ A] P[Y ∈ B].

Example 9.2. Two fair dice are rolled. Let X be the biggest value of the two dice and Y the smallest value. Are X and Y independent random variables? Why?

Solution: The r.v.'s X and Y are not independent because the relation p_{X,Y}(x, y) = p_X(x) p_Y(y) does not hold for each x, y. For example, for x = 1 and y = 2, p_{X,Y}(1, 2) = 0, while p_X(1) = 1/36 and p_Y(2) = 9/36.
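The dependence in Example 9.2 can be checked by brute force over the 36 equally likely outcomes; a sketch using exact rational arithmetic:

```python
from fractions import Fraction
from itertools import product

# Example 9.2: X = max, Y = min of two fair dice.
p = Fraction(1, 36)
outcomes = list(product(range(1, 7), repeat=2))

def pmf(x, y):
    # joint pmf p_{X,Y}(x, y)
    return sum((p for a, b in outcomes if max(a, b) == x and min(a, b) == y),
               Fraction(0))

p_x1 = sum(pmf(1, y) for y in range(1, 7))  # P[X = 1]
p_y2 = sum(pmf(x, 2) for x in range(1, 7))  # P[Y = 2]
print(pmf(1, 2), p_x1, p_y2)  # 0 1/36 1/4, so pmf(1,2) != p_x1 * p_y2
```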

Figure 9.5: #2, November 2001

Problem 9.8. (#21, November 2001) An insurance company determines that N, the number of claims received in a week, is a random variable with P[N = n] = 1/2^{n+1}, where n ≥ 0. The company also determines that the number of claims received in a given week is independent of the number of claims received in any other week. Determine the probability that exactly seven claims will be received during a given two-week period. Answer: 1/64

Solution: Let N₁ be the number of claims in week 1 and N₂ the number of claims in week 2. We have that

P[N₁ + N₂ = 7] = P[(N₁, N₂) ∈ {(0, 7), (1, 6), (2, 5), (3, 4), (4, 3), (5, 2), (6, 1), (7, 0)}]
= Σ_{k=0}^{7} P[N₁ = k] P[N₂ = 7 − k] = Σ_{k=0}^{7} (1/2^{k+1})(1/2^{8−k}) = 8/2⁹ = 1/64.

Theorem 9.3. Let X and Y be two independent r.v.'s. Then, for (integrable) functions f and g, E[f(X) g(Y)] = E[f(X)] E[g(Y)].

Theorem 9.4. Let X and Y be two discrete r.v.'s. Let p_{X,Y} be the joint probability mass function of X and Y, let p_X be the marginal probability mass function of X and let p_Y be the marginal probability mass function of Y. Then, X and Y are independent if and only if for each

x, y ∈ R, p_{X,Y}(x, y) = p_X(x) p_Y(y).

Theorem 9.5. Let X and Y be two jointly continuous r.v.'s. Let f_{X,Y} be the joint probability density function of X and Y, let f_X be the marginal density function of X and let f_Y be the marginal density function of Y. Then, X and Y are independent if and only if for each x, y ∈ R, f_{X,Y}(x, y) = f_X(x) f_Y(y).

Example 9.3. Suppose that X and Y are independent random variables with uniform distribution on (0, 2). Find P{X + Y ≤ 1}.

Solution: The density of X is f_X(x) = 1/2, for 0 ≤ x ≤ 2. The density of Y is f_Y(y) = 1/2, for 0 ≤ y ≤ 2. Since X and Y are independent r.v.'s, the joint density of X and Y is f_{X,Y}(x, y) = 1/4, for 0 ≤ x, y ≤ 2. Since the density is a constant, the probability of a region in [0, 2] × [0, 2] is its area over 4. The region determined by {(x, y) : x + y ≤ 1} and [0, 2] × [0, 2] is a right triangle with perpendicular sides of length 1 each. So, P[X + Y ≤ 1] = (1/2)/4 = 1/8.

Example 9.4. Suppose that X and Y are independent random variables with exponential distribution of parameter λ = 1. Find P{X + Y ≤ 1}.

Solution: The density of X is f_X(x) = e^{−x}, for x ≥ 0. The density of Y is f_Y(y) = e^{−y}, for y ≥ 0. Since X and Y are independent r.v.'s, the joint density of X and Y is f_{X,Y}(x, y) = e^{−x−y}, for x, y ≥ 0. We have that

P[X + Y ≤ 1] = ∫_0^1 ∫_0^{1−x} e^{−x−y} dy dx = ∫_0^1 (e^{−x} − e^{−1}) dx = 1 − e^{−1} − e^{−1} = 1 − 2e^{−1}.

Problem 9.9. The joint probability density function of X and Y is given by f_{X,Y}(x, y) = k(x² + y²) if −1 < x < 1, −1 < y < 1, and 0 otherwise. Find k. Find the marginal densities. Are X and Y independent r.v.'s? Why?

Solution: Since the total probability is one,

1 = ∫_{−1}^{1} ∫_{−1}^{1} k(x² + y²) dy dx = 8k/3.

So, k = 3/8. The marginal density of X is

f_X(x) = ∫_{−1}^{1} (3/8)(x² + y²) dy = 3x²/4 + 1/4, for −1 ≤ x ≤ 1.

Similarly, the marginal density of Y is

f_Y(y) = ∫_{−1}^{1} (3/8)(x² + y²) dx = 3y²/4 + 1/4, for −1 ≤ y ≤ 1.

Since f_{X,Y}(x, y) = f_X(x) f_Y(y) does not hold for each x, y, X and Y are not independent r.v.'s.

Problem 9.10. The random variables X and Y have joint density function f_{X,Y}(x, y) = x if 0 < x < 1, 0 < y < 2, and 0 otherwise. Find the marginal densities. Are the random variables X and Y independent r.v.'s?

Solution: The marginal density of X is f_X(x) = ∫_0^2 x dy = 2x, for 0 < x < 1. Similarly, the marginal density of Y is f_Y(y) = ∫_0^1 x dx = 1/2, for 0 < y < 2. For each x, y,

f_{X,Y}(x, y) = f_X(x) f_Y(y).

So, X and Y are independent r.v.'s. It is possible to prove that if the joint density of X and Y has the form f_{X,Y}(x, y) = g(x) h(y) for a < x < b and c < y < d, where g and h are two functions, then X and Y are independent r.v.'s.

Problem 9.11. (#28, November 2001) Two insurers provide bids on an insurance policy to a large company. The bids must be between 2000 and 2200. The company decides to accept the lower bid if the two bids differ by 20 or more. Otherwise, the company will consider the two bids further. Assume that the two bids are independent and are both uniformly distributed on the interval from 2000 to 2200. Determine the probability that the company considers the two bids further. Answer: 0.19

Solution: Let X be the first bid and let Y be the second bid. The density of X is f_X(x) = 1/200, for 2000 ≤ x ≤ 2200. The density of Y is f_Y(y) = 1/200, for 2000 ≤ y ≤ 2200. Since X and Y are independent r.v.'s, the joint density of X and Y is f_{X,Y}(x, y) = 1/200², for 2000 ≤ x, y ≤ 2200. Since the density is a constant, the probability of a set in the region [2000, 2200] × [2000, 2200] is its area times 1/200². We are finding the probability P[|X − Y| < 20] = P[X − 20 < Y <

Figure 9.6: #28, November 2001

X + 20]. The complement of this region consists of two right triangles whose perpendicular sides are 180 each. The area of each triangle is 180²/2. So, the probability we are looking for is

1 − 2(180²/2)/200² = 1 − (180/200)² = 0.19.

Problem 9.12. (#3, May 2000) A study is being conducted in which the health of two independent groups of ten policyholders is being monitored over a one-year period of time. Individual participants in the study drop out before the end of the study with probability 0.2 (independently of the other participants). What is the probability that at least 9 participants complete the study in one of the two groups, but not in both groups? Answer: 0.469

Solution: Let X be the number of participants who complete the study in the first group and let Y be the number who complete the study in the second group. We have to find the probability that {X ≥ 9} or {Y ≥ 9} occurs, but not both. Both X and Y have a binomial distribution with parameters n = 10 and p = 0.8. So,

P[X ≥ 9] = P[X = 9] + P[X = 10] = C(10, 9)(0.8)⁹(0.2) + C(10, 10)(0.8)¹⁰ = 0.3758.

Since X and Y are independent r.v.'s, the answer is

P[X ≥ 9] + P[Y ≥ 9] − 2 P[X ≥ 9] P[Y ≥ 9] = 2(0.3758) − 2(0.3758)² = 0.469.

Problem 9.13. (#22, May 2000) The waiting time for the first claim from a good driver and the waiting time for the first claim from a bad driver are independent and follow exponential distributions with means 6 years and 3 years, respectively. What is the probability that the first claim from a good driver will be filed within 3 years and the first claim from a bad driver will be filed within 2 years? Answer: 1 − e^{−2/3} − e^{−1/2} + e^{−7/6}

Solution: Let X be the waiting time for the first claim from a good driver. The density of X is f_X(x) = (1/6)e^{−x/6}, for x ≥ 0. Let Y be the waiting time for the first claim from a bad driver. The density of Y is f_Y(y) = (1/3)e^{−y/3}, for y ≥ 0. We need to find

P[X ≤ 3, Y ≤ 2] = P[X ≤ 3] P[Y ≤ 2] = (∫_0^3 (1/6)e^{−x/6} dx)(∫_0^2 (1/3)e^{−y/3} dy)
= (1 − e^{−1/2})(1 − e^{−2/3}) = 1 − e^{−2/3} − e^{−1/2} + e^{−7/6}.

Problem 9.14. (May 2000) An insurance company sells two types of auto insurance policies: Basic and Deluxe. The time until the next Basic Policy claim is an exponential random variable with mean two days. The time until the next Deluxe Policy claim is an independent exponential random variable with mean three days. What is the probability that the next claim will be a Deluxe Policy claim? Answer: 0.4

Figure 9.7: May 2000

Solution: Let X be the time of the next Basic Policy claim. The density of X is f_X(x) = (1/2)e^{−x/2}, for x ≥ 0. Let Y be the time until the next Deluxe Policy claim. The density of Y is f_Y(y) = (1/3)e^{−y/3}, for y ≥ 0. Since X and Y are independent r.v.'s, the joint density of X and Y is

f_{X,Y}(x, y) = (1/6)e^{−x/2 − y/3}, for x, y ≥ 0.

We need to find

P[Y ≤ X] = ∫_0^∞ ∫_0^x (1/6)e^{−x/2 − y/3} dy dx = ∫_0^∞ (1/2)e^{−x/2}(1 − e^{−x/3}) dx
= ∫_0^∞ ((1/2)e^{−x/2} − (1/2)e^{−5x/6}) dx = 1 − 3/5 = 2/5 = 0.4.
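The Basic/Deluxe problem above is the standard "exponential race": with rates λ_X = 1/2 and λ_Y = 1/3, P[Y < X] = (1/3)/(1/2 + 1/3) = 0.4. A Monte Carlo sketch (the seed and sample size are arbitrary choices):

```python
import random

# X ~ exponential with mean 2 (rate 1/2), Y ~ exponential with mean 3
# (rate 1/3), independent.  Estimate P[Y < X].
random.seed(0)
n = 200_000
hits = sum(random.expovariate(1/3) < random.expovariate(1/2) for _ in range(n))
print(round(hits / n, 2))  # close to 0.4
```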

9.4 Expectation of a function of several random variables

Given discrete r.v.'s X and Y with joint pmf p, and a function h : R² → R,

E[h(X, Y)] = Σ_{x,y} h(x, y) p(x, y).

Given two jointly continuous r.v.'s X and Y with joint pdf f, and a function h : R² → R,

E[h(X, Y)] = ∫∫ h(x, y) f(x, y) dy dx.

Problem 9.15. (#4, November 2000) A device contains two circuits. The second circuit is a backup for the first, so the second is used only when the first has failed. The device fails when and only when the second circuit fails. Let X and Y be the times at which the first and second circuits fail, respectively. X and Y have joint probability density function

f(x, y) = 6 e^{−x} e^{−2y} for 0 < x < y < ∞, and 0 otherwise.

What is the expected time at which the device fails? Answer: 0.83

Figure 9.8: #4, November 2000

Solution: The device fails at time Y. Integrating first in y,

E[Y] = ∫_0^∞ ∫_x^∞ y · 6 e^{−x} e^{−2y} dy dx = ∫_0^∞ 6 e^{−x} (e^{−2x}/4)(1 + 2x) dx
= (3/2) ∫_0^∞ e^{−3x}(1 + 2x) dx = (3/2)(1/3 + 2/9) = 5/6 ≈ 0.83.
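After the inner integration, the expected failure time above reduces to (3/2)∫_0^∞ e^{−3x}(1 + 2x) dx = 5/6. A numeric sketch (the truncation point and step count are arbitrary choices):

```python
import math

# E[Y] for f(x, y) = 6 e^{-x} e^{-2y} on 0 < x < y, after reducing the
# double integral to a single one; midpoint rule on a truncated range.
def expected_failure_time(upper=30.0, n=100_000):
    dx = upper / n
    return sum(1.5 * math.exp(-3 * x) * (1 + 2 * x) * dx
               for x in ((i + 0.5) * dx for i in range(n)))

print(round(expected_failure_time(), 3))  # 0.833 = 5/6
```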

Problem 9.16. (#3, November 2001) Let T₁ be the time between a car accident and reporting a claim to the insurance company. Let T₂ be the time between the report of the claim and payment of the claim. The joint density function of T₁ and T₂, f(t₁, t₂), is constant over the region 0 < t₁ < 6, 0 < t₂ < 6, t₁ + t₂ < 10, and zero otherwise. Determine E[T₁ + T₂], the expected time between a car accident and payment of the claim. Answer: 5.7

Figure 9.9: #3, November 2001

Solution: The region consists of a square of side 6 with a corner triangle removed. The area of the region is 36 − 2²/2 = 34. So, the density is f(t₁, t₂) = 1/34 on the region 0 < t₁ < 6, 0 < t₂ < 6, t₁ + t₂ < 10. By symmetry,

E[T₁ + T₂] = 2 E[T₁] = (2/34) [∫_0^4 6 t₁ dt₁ + ∫_4^6 t₁ (10 − t₁) dt₁] = (2/34)(48 + 148/3) ≈ 5.7.

Problem 9.17. (#3, Sample Test) Let X and Y be random losses with joint density function f(x, y) = 2x for 0 < x < 1 and 0 < y < 1. An insurance policy is written to cover the loss X + Y. The policy has a deductible of 1. Calculate the expected payment under the policy. Answer: 1/4.

Solution: Let Z = g(X, Y) be the payment, where

g(X, Y) = 0 if X + Y < 1, and g(X, Y) = X + Y − 1 if X + Y ≥ 1.

So, the expected payment is

E[g(X, Y)] = ∫_0^1 ∫_{1−x}^{1} (x + y − 1) 2x dy dx = ∫_0^1 x (x + y − 1)² |_{y=1−x}^{y=1} dx = ∫_0^1 x³ dx = 1/4.
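The deductible computation above can be cross-checked by simulation. Since the joint density 2x factors as (2x)·1, X can be sampled by inverse transform (X = √U) and Y uniformly; the seed and sample size are arbitrary choices:

```python
import random

# Payment is max(X + Y - 1, 0); its expected value should be 1/4.
random.seed(1)
n = 400_000
total = 0.0
for _ in range(n):
    x = random.random() ** 0.5  # density 2x on (0, 1): F(x) = x^2
    y = random.random()
    total += max(x + y - 1.0, 0.0)
print(round(total / n, 2))  # close to 0.25
```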

Figure 9.10: #3, Sample Test

Problem 9.18. (#5, May 2000) Let T₁ and T₂ represent the lifetimes in hours of two linked components in an electronic device. The joint density function for T₁ and T₂ is uniform over the region defined by 0 ≤ t₁ ≤ t₂ ≤ L, where L is a positive constant. Determine the expected value of the sum of the squares of T₁ and T₂. Answer: (2/3)L²

Figure 9.11: #5, May 2000

Solution: Since the density is uniform on a triangle with area L²/2, the density is f(t₁, t₂) = 2/L², for 0 ≤ t₁ ≤ t₂ ≤ L. Hence,

E[T₁² + T₂²] = ∫_0^L ∫_0^{t₂} (t₁² + t₂²)(2/L²) dt₁ dt₂ = (2/L²) ∫_0^L (t₂³/3 + t₂³) dt₂ = (2/L²)(L⁴/12 + L⁴/4) = (2/3)L².
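A uniform point on the triangle 0 ≤ t₁ ≤ t₂ ≤ L can be sampled by sorting two independent uniforms on (0, L), which gives a quick Monte Carlo check of the result above (L, the seed and the sample size are arbitrary choices):

```python
import random

# E[T1^2 + T2^2] over the uniform triangle should be (2/3) L^2.
random.seed(2)
L = 5.0
n = 300_000
acc = 0.0
for _ in range(n):
    t1, t2 = sorted((random.uniform(0, L), random.uniform(0, L)))
    acc += t1 * t1 + t2 * t2
print(round(acc / (n * L * L), 2))  # close to 2/3
```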

9.5 Covariance

Definition 9.5.1. Given two r.v.'s X and Y, the covariance of X and Y is

Cov(X, Y) = E[(X − E[X])(Y − E[Y])].

As a rule of thumb, the covariance is positive if, whenever one variable increases, so does the other. For example, your score on a test X and the number of hours you study for the test Y have positive covariance. The covariance of two r.v.'s X and Y measures the linear dependence between the two variables.

Definition 9.5.2. Given two r.v.'s X and Y, the coefficient of correlation between X and Y is

ρ(X, Y) = Cov(X, Y) / sqrt(Var(X) Var(Y)).

Always −1 ≤ ρ ≤ 1. ρ = 1 if and only if there are constants a > 0 and b ∈ R such that Y = aX + b. ρ = −1 if and only if there are constants a < 0 and b ∈ R such that Y = aX + b.

Theorem 9.6. Given two r.v.'s X and Y, the covariance of X and Y is Cov(X, Y) = E[XY] − E[X]E[Y].

Proof. Let μ_X = E[X] and let μ_Y = E[Y]. Then,

Cov(X, Y) = E[(X − μ_X)(Y − μ_Y)] = E[XY − Xμ_Y − μ_X Y + μ_X μ_Y]
= E[XY] − μ_X μ_Y − μ_X μ_Y + μ_X μ_Y = E[XY] − E[X]E[Y]. Q.E.D.

Theorem 9.7. Given r.v.'s X, Y, Z, X₁, ..., X_n, Y₁, ..., Y_m and constants a, b:

Cov(X, Y) = Cov(Y, X)
Cov(X, X) = Var(X)
Cov(aX, bY) = ab Cov(X, Y)
Cov(X, Y + Z) = Cov(X, Y) + Cov(X, Z)
Cov(Σ_{i=1}^n X_i, Y) = Σ_{i=1}^n Cov(X_i, Y)
Cov(Σ_{i=1}^n X_i, Σ_{j=1}^m Y_j) = Σ_{i=1}^n Σ_{j=1}^m Cov(X_i, Y_j)

Theorem 9.8. Given r.v.'s X, Y, X₁, ..., X_n and a, b, c, a₁, ..., a_n ∈ R:

Var(X + Y) = Var(X) + Var(Y) + 2 Cov(X, Y)
Var(aX + bY + c) = a² Var(X) + b² Var(Y) + 2ab Cov(X, Y)
Var(Σ_{i=1}^n X_i) = Σ_{i=1}^n Var(X_i) + Σ_{i≠j} Cov(X_i, X_j)
Var(Σ_{i=1}^n a_i X_i) = Σ_{i=1}^n a_i² Var(X_i) + Σ_{i≠j} a_i a_j Cov(X_i, X_j)

Theorem 9.9. If X and Y are independent r.v.'s, then Cov(X, Y) = 0. There are r.v.'s which are not independent, but with Cov(X, Y) = 0.
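The last remark in Theorem 9.9 deserves an example: take X uniform on {−1, 0, 1} and Y = X². A sketch using exact rational arithmetic:

```python
from fractions import Fraction as F

# X uniform on {-1, 0, 1}, Y = X^2: dependent but uncorrelated.
pmf = {(-1, 1): F(1, 3), (0, 0): F(1, 3), (1, 1): F(1, 3)}
E = lambda g: sum(g(x, y) * p for (x, y), p in pmf.items())

cov = E(lambda x, y: x * y) - E(lambda x, y: x) * E(lambda x, y: y)
print(cov)  # 0

# Dependence: P[X = 1, Y = 0] = 0, but P[X = 1] P[Y = 0] = (1/3)(1/3) > 0.
```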

Theorem 9.10. If X₁, ..., X_n are independent r.v.'s, then

Var(Σ_{i=1}^n X_i) = Σ_{i=1}^n Var(X_i) and Var(Σ_{i=1}^n a_i X_i) = Σ_{i=1}^n a_i² Var(X_i).

Problem 9.19. (#38, November 2000) The profit for a new product is given by Z = 3X − Y − 5. X and Y are independent random variables with Var(X) = 1 and Var(Y) = 2. What is the variance of Z? Answer: 11.

Solution: We have that Var(Z) = Var(3X − Y − 5) = 9 Var(X) + Var(Y) = 9 + 2 = 11.

Problem 9.20. (#32, May 2000) A company has two electric generators. The time until failure for each generator follows an exponential distribution with mean 10. The company will begin using the second generator immediately after the first one fails. What is the variance of the total time that the generators produce electricity? Answer: 200

Solution: Let X and Y be the times until failure of the two generators. X and Y are independent r.v.'s with E[X] = E[Y] = 10 and Var(X) = Var(Y) = 10² = 100. So, Var(X + Y) = Var(X) + Var(Y) = 100 + 100 = 200.

Problem 9.21. (#7, May 2000) A joint density function is given by f(x, y) = kx for 0 < x < 1, 0 < y < 1, and 0 otherwise, where k is a constant. What is Cov(X, Y)? Answer: 0

Solution: Since 1 = ∫_0^1 ∫_0^1 kx dy dx = k/2, k = 2. The density of X is f_X(x) = 2x, for 0 ≤ x ≤ 1. The density of Y is f_Y(y) = 1, for 0 ≤ y ≤ 1. Since the joint density f_{X,Y} satisfies f_{X,Y}(x, y) = f_X(x) f_Y(y) for each x, y, X and Y are independent. So, Cov(X, Y) = 0.

Problem 9.22. (#35, Sample Test) Suppose the remaining lifetimes of a husband and wife are independent and uniformly distributed on the interval [0, 40]. An insurance company offers two products to married couples: one which pays when the husband dies, and one which pays when both the husband and wife have died. Calculate the covariance of the two payment times. Answer: 200/3

Solution: Let X be the lifetime of the husband and let Y be the lifetime of the wife. We need to find

Cov(X, max(X, Y)) = E[X max(X, Y)] − E[X] E[max(X, Y)].

We have that E[X] = 20, and

E[max(X, Y)] = (1/1600) ∫_0^{40} ∫_0^{40} max(x, y) dy dx
= (1/1600) [∫_0^{40} ∫_0^{x} x dy dx + ∫_0^{40} ∫_x^{40} y dy dx]
= (1/1600) [∫_0^{40} x² dx + ∫_0^{40} ((1600 − x²)/2) dx] = 80/3.

E[X max(X, Y)] = (1/1600) ∫_0^{40} ∫_0^{40} x max(x, y) dy dx
= (1/1600) [∫_0^{40} x³ dx + ∫_0^{40} x(1600 − x²)/2 dx]
= (1/1600) [40⁴/4 + (1/2)(1600 · 800 − 40⁴/4)] = 600.

So,

Cov(X, max(X, Y)) = 600 − 20 (80/3) = 600 − 1600/3 = 200/3 ≈ 66.7.

Problem 9.23. (#7, November 2001) A stock market analyst has recorded the daily sales revenue for two companies over the last year and displayed them in the histograms below.

Figure 9.12: #7, November 2001

The analyst noticed that a daily sales revenue above a certain level for Company A was always accompanied by a daily sales revenue below that level for Company B, and vice versa. Let X denote the daily sales revenue for Company A and let Y denote the daily sales revenue for Company B, on some future day. Assuming that for each company the daily sales revenues are independent and identically distributed, which of the following is true?

(A) Var(X) > Var(Y) and Var(X + Y) > Var(X) + Var(Y).
(B) Var(X) > Var(Y) and Var(X + Y) < Var(X) + Var(Y).
(C) Var(X) > Var(Y) and Var(X + Y) = Var(X) + Var(Y).
(D) Var(X) < Var(Y) and Var(X + Y) > Var(X) + Var(Y).

(E) Var(X) < Var(Y) and Var(X + Y) < Var(X) + Var(Y).

Answer: (E)

Solution: Looking at the graphs, we see that Var(Y) > Var(X). The observation that a daily sales revenue above a certain level for Company A was always accompanied by a daily sales revenue below that level for Company B means that Cov(X, Y) < 0, which is equivalent to Var(X + Y) < Var(X) + Var(Y). So, the answer is (E).

Problem 9.24. (#7, November 2001) Let X denote the size of a surgical claim and let Y denote the size of the associated hospital claim. An actuary is using a model in which E[X] = 5, E[X²] = 27.4, E[Y] = 7, E[Y²] = 51.4, and Var(X + Y) = 8. Let C₁ = X + Y denote the size of the combined claims before the application of a 20% surcharge on the hospital portion of the claim, and let C₂ denote the size of the combined claims after the application of that surcharge. Calculate Cov(C₁, C₂). Answer: 8.8

Solution: First, we find the variances of X and Y:

Var(X) = E[X²] − (E[X])² = 27.4 − 25 = 2.4 and Var(Y) = E[Y²] − (E[Y])² = 51.4 − 49 = 2.4.

We find the covariance of X and Y using that Var(X + Y) = 8:

8 = Var(X + Y) = Var(X) + Var(Y) + 2 Cov(X, Y) = 2.4 + 2.4 + 2 Cov(X, Y).

So, Cov(X, Y) = 1.6. We have that C₁ = X + Y and C₂ = X + Y + Y/5 = X + 6Y/5. So,

Cov(C₁, C₂) = Cov(X + Y, X + 6Y/5) = Var(X) + (6/5) Var(Y) + (11/5) Cov(X, Y)
= 2.4 + (6/5)(2.4) + (11/5)(1.6) = 8.8.

9.6 Multinomial distribution

Consider a series of n independent trials with possible outcomes 1, ..., k. Suppose that the probability that a trial results in the i-th outcome is p_i, for 1 ≤ i ≤ k and each trial. Let X_i be the number of trials resulting in the i-th outcome. Then, (X₁, ..., X_k) has a multinomial distribution. We have that

P[(X₁, ..., X_k) = (j₁, ..., j_k)] = (n! / (j₁! ··· j_k!)) p₁^{j₁} ··· p_k^{j_k},

for j₁ + ··· + j_k = n. In the case k = 2, the distribution is called the binomial distribution. In the case k = 3, the distribution is called the trinomial distribution.

Example 9.5. A fair die is thrown n times.
The possible outcomes are 1, 2, 3, 4, 5, 6. Let p₁ be the probability of obtaining a 1 in a throw; p₂, p₃, p₄, p₅ and p₆ are defined similarly.

We have that p₁ = p₂ = p₃ = p₄ = p₅ = p₆ = 1/6. Let X₁ be the number of 1's in the n throws; X₂, X₃, X₄, X₅ and X₆ are defined similarly. Then, if j₁ + j₂ + j₃ + j₄ + j₅ + j₆ = n,

P[X₁ = j₁, X₂ = j₂, X₃ = j₃, X₄ = j₄, X₅ = j₅, X₆ = j₆] = (n! / (j₁! j₂! j₃! j₄! j₅! j₆!)) (1/6)ⁿ.

Problem 9.25. (#29, May 2000) A large pool of adults earning their first driver's license includes 50% low-risk drivers, 30% moderate-risk drivers, and 20% high-risk drivers. Because these drivers have no prior driving record, an insurance company considers each driver to be randomly selected from the pool. This month, the insurance company writes 4 new policies for adults earning their first driver's license. What is the probability that these 4 will contain at least two more high-risk drivers than low-risk drivers? Answer: 0.049

Solution: Let X be the number of low-risk drivers, Y the number of moderate-risk drivers and Z the number of high-risk drivers. Then, (X, Y, Z) has a trinomial distribution with parameters n = 4, p₁ = 0.5, p₂ = 0.3 and p₃ = 0.2. The probability that the 4 policies contain at least two more high-risk drivers than low-risk drivers is

P[Z ≥ X + 2] = P[X = 0, Y = 0, Z = 4] + P[X = 0, Y = 1, Z = 3] + P[X = 0, Y = 2, Z = 2] + P[X = 1, Y = 0, Z = 3]
= (4!/(0! 0! 4!))(0.2)⁴ + (4!/(0! 1! 3!))(0.3)(0.2)³ + (4!/(0! 2! 2!))(0.3)²(0.2)² + (4!/(1! 0! 3!))(0.5)(0.2)³
= 0.0016 + 0.0096 + 0.0216 + 0.016 = 0.0488.

9.7 Problems

1. Suppose that X and Y are independent random variables, X is uniformly distributed on (0, 1) and Y is uniformly distributed on (0, 1). Find P(X ≤ Y).

2. Let f_{X,Y}(x, y) = 6x for 0 < x < y < 1, and zero otherwise. Find the marginal densities of X and Y. Are X and Y independent random variables?

3. Let Y₁ and Y₂ be two jointly continuous random variables with joint density function f(y₁, y₂) = c y₁ if y₁ ≥ 0, y₂ ≥ 0, y₁ + y₂ ≤ 1, and 0 else. (a) Find the value of c. (b) Find P(Y₁ ≤ 3/4, Y₂ ≥ 1/2). (c) Find P(Y₁ ≤ Y₂).

4.
Let X and Y be two jointly continuous random variables with density f(x, y) = x² + xy/3 if 0 < x < 1, 0 < y < 2, and 0 else. Find P{2X > Y}.

5. The joint probability density function of X and Y is given by f(x, y) = (6/7)(x² + xy/2) if 0 < x < 1, 0 < y < 2, and 0 else. Are X and Y independent? Why?

6. Let Y₁ and Y₂ be two jointly continuous random variables with joint density function f(y₁, y₂) = y₁ e^{−(y₁+y₂)} if y₁ > 0, y₂ > 0, and 0 else. (a) Find the marginal probability density functions of Y₁ and Y₂. (b) Are Y₁ and Y₂ independent random variables? Justify your answer. (c) Find P(Y₁ + Y₂ ≤ 1). (d) Find P(Y₁ ≤ Y₂ ≤ 2).

7. The random variables Y₁ and Y₂ have joint density function f(y₁, y₂) = c y₁ y₂ if 0 < y₁ < 1, 0 < y₂ < 2, and 0 elsewhere. Find c and P[Y₂ ≥ 2Y₁].

8. The joint probability density function of X and Y is given by f(x, y) = 24xy if 0 < x < 1, 0 < y < 1 − x, and 0 else. Find the marginal densities of X and Y. Are X and Y independent random variables?

9. Let X and Y be two jointly continuous random variables with joint density f_{X,Y}(x, y) = 6(1 − x − y) if x ≥ 0, y ≥ 0, x + y ≤ 1, and 0 else. Find the marginal densities. Are X and Y independent random variables?

10. Let Y₁ and Y₂ be two jointly continuous random variables with joint density function f(y₁, y₂) = 24 y₁ y₂ if y₁ ≥ 0, y₂ ≥ 0, y₁ + y₂ ≤ 1, and 0 else. (a) Find the mean and the variance of Y₁ and Y₂. (b) Find the covariance and the correlation coefficient of Y₁ and Y₂. (c) Find the mean and the variance of U = 2 + 3Y₁ − 4Y₂. (d) Find Cov(2 + 3Y₁ − 4Y₂, 4 − 2Y₁ + 3Y₂).

11. The random variables X and Y have joint density function f(x, y) = 3x/8 + 3y²/8 if 0 < x < 2, 0 < y < 1, and 0 otherwise. Find P(X + Y ≤ 2).

12. The random variables X and Y have joint density function f(x, y) = c(2x² + xy) if 0 < x < 1, 0 < y < 2, and 0 otherwise. Find c and the covariance of X and Y.

13. The random variables X and Y have joint density function f(x, y) = c x² y if 0 < x < 1, 0 < y < 2, and 0 otherwise. Find P[2X < Y].

14. The random variables X and Y have joint density function f(x, y) = x if 0 < x < 1, 0 < y < 2, and 0 otherwise. Find P[X + Y ≥ 1].

15. Let X and Y be continuous random variables with joint density function f(x, y) = 1/x for 0 < y < x < 1, and 0 else. What is Var(X)?

16. Let X and Y have the pdf f(x, y) = c x if −x ≤ y ≤ x, 0 ≤ x ≤ 1, and 0 else. Find c, the mean and the variance of X and Y, and the covariance of X and Y.

17. Let X and Y be two independent r.v.'s. Suppose that X is uniformly distributed on the interval (0, 2), and Y is an exponential r.v. with parameter λ = 1. Find Cov(X, Y).

18. If X and Y are independent random variables with E[X] = 1, Var(X) = 2, E[Y] = 3, Var(Y) = 4, find the variance of 3X − 2Y + 1.

19. Let X₁ and X₂ be two independent random variables, both with mean 1 and variance 4. Find E[(X₁ + 2)(X₂ − 1)].

20. Let X and Y be two independent random variables each having mean 1 and variance 2. Find the covariance of the random variables X and 2X − Y.

21. Let X and Y be two independent random variables each having mean 1 and variance 5. Find the variance of the random variable 2X − Y + 2.

22. Let Y₁ and Y₂ be two random variables satisfying Var(Y₁) = 1, Var(Y₂) = 2 and Var(2Y₁ − Y₂) = 4. Find the covariance of Y₁ and Y₂.

23. Let X₁ and X₂ be two random variables. Suppose that E[X₁] = 1, E[X₂] = 2, Var(X₁) = 2, Var(X₂) = 8 and the correlation coefficient of X₁ and X₂ is 1/2. Find the mean and the variance of Y = 3 + X₁ − 2X₂.

24. The joint probability mass function of X and Y is given by p(1, 1) = 1/8, p(1, 2) = 1/4, p(1, 3) = 1/8, p(2, 1) = 1/8, p(2, 2) = 3/8. Find the cumulative distribution function of the random variable X.

25. A jar contains 25 pieces of candy, of which 10 are chocolate, 8 are mint and 7 are latte. A group of 5 pieces of candy is chosen at random. Let X equal the number of chocolate candies in the random sample of 5 and let Y equal the number of latte candies in the random sample of 5. Find the joint probability mass function of X and Y.
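The last exercise has the same multivariate-hypergeometric shape as Example 9.1 (the chocolate count follows from 25 − 8 − 7 = 10). A sketch that builds the joint pmf and checks that it sums to 1:

```python
from math import comb

# 5 candies drawn from 10 chocolate, 8 mint, 7 latte (25 total);
# X = chocolates drawn, Y = lattes drawn.
def p(x, y, n=5):
    m = n - x - y  # mint candies drawn
    if x < 0 or y < 0 or m < 0:
        return 0.0
    return comb(10, x) * comb(7, y) * comb(8, m) / comb(25, n)

total = sum(p(x, y) for x in range(6) for y in range(6))
print(round(total, 10))  # 1.0
```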


More information

CHAPTER 4 MATHEMATICAL EXPECTATION. 4.1 Mean of a Random Variable

CHAPTER 4 MATHEMATICAL EXPECTATION. 4.1 Mean of a Random Variable CHAPTER 4 MATHEMATICAL EXPECTATION 4.1 Mean of a Random Variable The expected value, or mathematical expectation E(X) of a random variable X is the long-run average value of X that would emerge after a

More information

Notes for Math 324, Part 17

Notes for Math 324, Part 17 126 Notes for Math 324, Part 17 Chapter 17 Common discrete distributions 17.1 Binomial Consider an experiment consisting by a series of trials. The only possible outcomes of the trials are success and

More information

Jointly Distributed Random Variables

Jointly Distributed Random Variables Jointly Distributed Random Variables CE 311S What if there is more than one random variable we are interested in? How should you invest the extra money from your summer internship? To simplify matters,

More information

Math 151. Rumbos Fall Solutions to Review Problems for Final Exam

Math 151. Rumbos Fall Solutions to Review Problems for Final Exam Math 5. Rumbos Fall 23 Solutions to Review Problems for Final Exam. Three cards are in a bag. One card is red on both sides. Another card is white on both sides. The third card in red on one side and white

More information

Homework 10 (due December 2, 2009)

Homework 10 (due December 2, 2009) Homework (due December, 9) Problem. Let X and Y be independent binomial random variables with parameters (n, p) and (n, p) respectively. Prove that X + Y is a binomial random variable with parameters (n

More information

Math438 Actuarial Probability

Math438 Actuarial Probability Math438 Actuarial Probability Jinguo Lian Department of Math and Stats Jan. 22, 2016 Continuous Random Variables-Part I: Definition A random variable X is continuous if its set of possible values is an

More information

Exam P Review Sheet. for a > 0. ln(a) i=0 ari = a. (1 r) 2. (Note that the A i s form a partition)

Exam P Review Sheet. for a > 0. ln(a) i=0 ari = a. (1 r) 2. (Note that the A i s form a partition) Exam P Review Sheet log b (b x ) = x log b (y k ) = k log b (y) log b (y) = ln(y) ln(b) log b (yz) = log b (y) + log b (z) log b (y/z) = log b (y) log b (z) ln(e x ) = x e ln(y) = y for y > 0. d dx ax

More information

Random Variables. Random variables. A numerically valued map X of an outcome ω from a sample space Ω to the real line R

Random Variables. Random variables. A numerically valued map X of an outcome ω from a sample space Ω to the real line R In probabilistic models, a random variable is a variable whose possible values are numerical outcomes of a random phenomenon. As a function or a map, it maps from an element (or an outcome) of a sample

More information

MATH/STAT 3360, Probability Sample Final Examination Model Solutions

MATH/STAT 3360, Probability Sample Final Examination Model Solutions MATH/STAT 3360, Probability Sample Final Examination Model Solutions This Sample examination has more questions than the actual final, in order to cover a wider range of questions. Estimated times are

More information

1 Basic continuous random variable problems

1 Basic continuous random variable problems Name M362K Final Here are problems concerning material from Chapters 5 and 6. To review the other chapters, look over previous practice sheets for the two exams, previous quizzes, previous homeworks and

More information

Random Variables and Their Distributions

Random Variables and Their Distributions Chapter 3 Random Variables and Their Distributions A random variable (r.v.) is a function that assigns one and only one numerical value to each simple event in an experiment. We will denote r.vs by capital

More information

STA 256: Statistics and Probability I

STA 256: Statistics and Probability I Al Nosedal. University of Toronto. Fall 2017 My momma always said: Life was like a box of chocolates. You never know what you re gonna get. Forrest Gump. There are situations where one might be interested

More information

Bivariate Distributions

Bivariate Distributions STAT/MATH 395 A - PROBABILITY II UW Winter Quarter 17 Néhémy Lim Bivariate Distributions 1 Distributions of Two Random Variables Definition 1.1. Let X and Y be two rrvs on probability space (Ω, A, P).

More information

MULTIVARIATE PROBABILITY DISTRIBUTIONS

MULTIVARIATE PROBABILITY DISTRIBUTIONS MULTIVARIATE PROBABILITY DISTRIBUTIONS. PRELIMINARIES.. Example. Consider an experiment that consists of tossing a die and a coin at the same time. We can consider a number of random variables defined

More information

SOCIETY OF ACTUARIES EXAM P PROBABILITY EXAM P SAMPLE SOLUTIONS

SOCIETY OF ACTUARIES EXAM P PROBABILITY EXAM P SAMPLE SOLUTIONS SOCIETY OF ACTUARIES EXAM P PROBABILITY EXAM P SAMPLE SOLUTIONS Copyright 8 by the Society of Actuaries Some of the questions in this study note are taken from past examinations. Some of the questions

More information

STAT/MA 416 Answers Homework 6 November 15, 2007 Solutions by Mark Daniel Ward PROBLEMS

STAT/MA 416 Answers Homework 6 November 15, 2007 Solutions by Mark Daniel Ward PROBLEMS STAT/MA 4 Answers Homework November 5, 27 Solutions by Mark Daniel Ward PROBLEMS Chapter Problems 2a. The mass p, corresponds to neither of the first two balls being white, so p, 8 7 4/39. The mass p,

More information

Multivariate distributions

Multivariate distributions CHAPTER Multivariate distributions.. Introduction We want to discuss collections of random variables (X, X,..., X n ), which are known as random vectors. In the discrete case, we can define the density

More information

Continuous Random Variables

Continuous Random Variables 1 / 24 Continuous Random Variables Saravanan Vijayakumaran sarva@ee.iitb.ac.in Department of Electrical Engineering Indian Institute of Technology Bombay February 27, 2013 2 / 24 Continuous Random Variables

More information

STAT Chapter 5 Continuous Distributions

STAT Chapter 5 Continuous Distributions STAT 270 - Chapter 5 Continuous Distributions June 27, 2012 Shirin Golchi () STAT270 June 27, 2012 1 / 59 Continuous rv s Definition: X is a continuous rv if it takes values in an interval, i.e., range

More information

Mathematics 426 Robert Gross Homework 9 Answers

Mathematics 426 Robert Gross Homework 9 Answers Mathematics 4 Robert Gross Homework 9 Answers. Suppose that X is a normal random variable with mean µ and standard deviation σ. Suppose that PX > 9 PX

More information

Homework 4 Solution, due July 23

Homework 4 Solution, due July 23 Homework 4 Solution, due July 23 Random Variables Problem 1. Let X be the random number on a die: from 1 to. (i) What is the distribution of X? (ii) Calculate EX. (iii) Calculate EX 2. (iv) Calculate Var

More information

STAT 430/510: Lecture 16

STAT 430/510: Lecture 16 STAT 430/510: Lecture 16 James Piette June 24, 2010 Updates HW4 is up on my website. It is due next Mon. (June 28th). Starting today back at section 6.7 and will begin Ch. 7. Joint Distribution of Functions

More information

Errata for the ASM Study Manual for Exam P, Fourth Edition By Dr. Krzysztof M. Ostaszewski, FSA, CFA, MAAA

Errata for the ASM Study Manual for Exam P, Fourth Edition By Dr. Krzysztof M. Ostaszewski, FSA, CFA, MAAA Errata for the ASM Study Manual for Exam P, Fourth Edition By Dr. Krzysztof M. Ostaszewski, FSA, CFA, MAAA (krzysio@krzysio.net) Effective July 5, 3, only the latest edition of this manual will have its

More information

Analysis of Engineering and Scientific Data. Semester

Analysis of Engineering and Scientific Data. Semester Analysis of Engineering and Scientific Data Semester 1 2019 Sabrina Streipert s.streipert@uq.edu.au Example: Draw a random number from the interval of real numbers [1, 3]. Let X represent the number. Each

More information

ENGG2430A-Homework 2

ENGG2430A-Homework 2 ENGG3A-Homework Due on Feb 9th,. Independence vs correlation a For each of the following cases, compute the marginal pmfs from the joint pmfs. Explain whether the random variables X and Y are independent,

More information

Ch. 5 Joint Probability Distributions and Random Samples

Ch. 5 Joint Probability Distributions and Random Samples Ch. 5 Joint Probability Distributions and Random Samples 5. 1 Jointly Distributed Random Variables In chapters 3 and 4, we learned about probability distributions for a single random variable. However,

More information

STAT 430/510 Probability Lecture 7: Random Variable and Expectation

STAT 430/510 Probability Lecture 7: Random Variable and Expectation STAT 430/510 Probability Lecture 7: Random Variable and Expectation Pengyuan (Penelope) Wang June 2, 2011 Review Properties of Probability Conditional Probability The Law of Total Probability Bayes Formula

More information

ACM 116: Lectures 3 4

ACM 116: Lectures 3 4 1 ACM 116: Lectures 3 4 Joint distributions The multivariate normal distribution Conditional distributions Independent random variables Conditional distributions and Monte Carlo: Rejection sampling Variance

More information

2. Suppose (X, Y ) is a pair of random variables uniformly distributed over the triangle with vertices (0, 0), (2, 0), (2, 1).

2. Suppose (X, Y ) is a pair of random variables uniformly distributed over the triangle with vertices (0, 0), (2, 0), (2, 1). Name M362K Final Exam Instructions: Show all of your work. You do not have to simplify your answers. No calculators allowed. There is a table of formulae on the last page. 1. Suppose X 1,..., X 1 are independent

More information

Week 10 Worksheet. Math 4653, Section 001 Elementary Probability Fall Ice Breaker Question: Do you prefer waffles or pancakes?

Week 10 Worksheet. Math 4653, Section 001 Elementary Probability Fall Ice Breaker Question: Do you prefer waffles or pancakes? Week 10 Worksheet Ice Breaker Question: Do you prefer waffles or pancakes? 1. Suppose X, Y have joint density f(x, y) = 12 7 (xy + y2 ) on 0 < x < 1, 0 < y < 1. (a) What are the marginal densities of X

More information

EXAMINATIONS OF THE HONG KONG STATISTICAL SOCIETY GRADUATE DIPLOMA, Statistical Theory and Methods I. Time Allowed: Three Hours

EXAMINATIONS OF THE HONG KONG STATISTICAL SOCIETY GRADUATE DIPLOMA, Statistical Theory and Methods I. Time Allowed: Three Hours EXAMINATIONS OF THE HONG KONG STATISTICAL SOCIETY GRADUATE DIPLOMA, 008 Statistical Theory and Methods I Time Allowed: Three Hours Candidates should answer FIVE questions. All questions carry equal marks.

More information

SOCIETY OF ACTUARIES/CASUALTY ACTUARIAL SOCIETY EXAM P PROBABILITY P SAMPLE EXAM SOLUTIONS

SOCIETY OF ACTUARIES/CASUALTY ACTUARIAL SOCIETY EXAM P PROBABILITY P SAMPLE EXAM SOLUTIONS SOCIETY OF ACTUARIES/CASUALTY ACTUARIAL SOCIETY EXAM P PROBABILITY P SAMPLE EXAM SOLUTIONS Copyright 9 by the Society of Actuaries and the Casualty Actuarial Society Some of the questions in this study

More information

More than one variable

More than one variable Chapter More than one variable.1 Bivariate discrete distributions Suppose that the r.v. s X and Y are discrete and take on the values x j and y j, j 1, respectively. Then the joint p.d.f. of X and Y, to

More information

STAT/MATH 395 PROBABILITY II

STAT/MATH 395 PROBABILITY II STAT/MATH 395 PROBABILITY II Bivariate Distributions Néhémy Lim University of Washington Winter 2017 Outline Distributions of Two Random Variables Distributions of Two Discrete Random Variables Distributions

More information

18.440: Lecture 28 Lectures Review

18.440: Lecture 28 Lectures Review 18.440: Lecture 28 Lectures 18-27 Review Scott Sheffield MIT Outline Outline It s the coins, stupid Much of what we have done in this course can be motivated by the i.i.d. sequence X i where each X i is

More information

ACTEX Seminar Exam P

ACTEX Seminar Exam P ACTEX Seminar Exam P Written & Presented by Matt Hassett, ASA, PhD 1 Remember: 1 This is a review seminar. It assumes that you have already studied probability. This is an actuarial exam seminar. We will

More information

Random Variables. Saravanan Vijayakumaran Department of Electrical Engineering Indian Institute of Technology Bombay

Random Variables. Saravanan Vijayakumaran Department of Electrical Engineering Indian Institute of Technology Bombay 1 / 13 Random Variables Saravanan Vijayakumaran sarva@ee.iitb.ac.in Department of Electrical Engineering Indian Institute of Technology Bombay August 8, 2013 2 / 13 Random Variable Definition A real-valued

More information

Actuarial Science Exam 1/P

Actuarial Science Exam 1/P Actuarial Science Exam /P Ville A. Satopää December 5, 2009 Contents Review of Algebra and Calculus 2 2 Basic Probability Concepts 3 3 Conditional Probability and Independence 4 4 Combinatorial Principles,

More information

ASM Study Manual for Exam P, First Edition By Dr. Krzysztof M. Ostaszewski, FSA, CFA, MAAA Errata

ASM Study Manual for Exam P, First Edition By Dr. Krzysztof M. Ostaszewski, FSA, CFA, MAAA Errata ASM Study Manual for Exam P, First Edition By Dr. Krzysztof M. Ostaszewski, FSA, CFA, MAAA (krzysio@krzysio.net) Errata Effective July 5, 3, only the latest edition of this manual will have its errata

More information

P (x). all other X j =x j. If X is a continuous random vector (see p.172), then the marginal distributions of X i are: f(x)dx 1 dx n

P (x). all other X j =x j. If X is a continuous random vector (see p.172), then the marginal distributions of X i are: f(x)dx 1 dx n JOINT DENSITIES - RANDOM VECTORS - REVIEW Joint densities describe probability distributions of a random vector X: an n-dimensional vector of random variables, ie, X = (X 1,, X n ), where all X is are

More information

ASM Study Manual for Exam P, Second Edition By Dr. Krzysztof M. Ostaszewski, FSA, CFA, MAAA Errata

ASM Study Manual for Exam P, Second Edition By Dr. Krzysztof M. Ostaszewski, FSA, CFA, MAAA Errata ASM Study Manual for Exam P, Second Edition By Dr. Krzysztof M. Ostaszewski, FSA, CFA, MAAA (krzysio@krzysio.net) Errata Effective July 5, 3, only the latest edition of this manual will have its errata

More information

Multivariate Random Variable

Multivariate Random Variable Multivariate Random Variable Author: Author: Andrés Hincapié and Linyi Cao This Version: August 7, 2016 Multivariate Random Variable 3 Now we consider models with more than one r.v. These are called multivariate

More information

IAM 530 ELEMENTS OF PROBABILITY AND STATISTICS LECTURE 3-RANDOM VARIABLES

IAM 530 ELEMENTS OF PROBABILITY AND STATISTICS LECTURE 3-RANDOM VARIABLES IAM 530 ELEMENTS OF PROBABILITY AND STATISTICS LECTURE 3-RANDOM VARIABLES VARIABLE Studying the behavior of random variables, and more importantly functions of random variables is essential for both the

More information

Perhaps the simplest way of modeling two (discrete) random variables is by means of a joint PMF, defined as follows.

Perhaps the simplest way of modeling two (discrete) random variables is by means of a joint PMF, defined as follows. Chapter 5 Two Random Variables In a practical engineering problem, there is almost always causal relationship between different events. Some relationships are determined by physical laws, e.g., voltage

More information

3 Continuous Random Variables

3 Continuous Random Variables Jinguo Lian Math437 Notes January 15, 016 3 Continuous Random Variables Remember that discrete random variables can take only a countable number of possible values. On the other hand, a continuous random

More information

Covariance. Lecture 20: Covariance / Correlation & General Bivariate Normal. Covariance, cont. Properties of Covariance

Covariance. Lecture 20: Covariance / Correlation & General Bivariate Normal. Covariance, cont. Properties of Covariance Covariance Lecture 0: Covariance / Correlation & General Bivariate Normal Sta30 / Mth 30 We have previously discussed Covariance in relation to the variance of the sum of two random variables Review Lecture

More information

MATH/STAT 3360, Probability

MATH/STAT 3360, Probability MATH/STAT 3360, Probability Sample Final Examination This Sample examination has more questions than the actual final, in order to cover a wider range of questions. Estimated times are provided after each

More information

Review of Probability. CS1538: Introduction to Simulations

Review of Probability. CS1538: Introduction to Simulations Review of Probability CS1538: Introduction to Simulations Probability and Statistics in Simulation Why do we need probability and statistics in simulation? Needed to validate the simulation model Needed

More information

MAS113 Introduction to Probability and Statistics. Proofs of theorems

MAS113 Introduction to Probability and Statistics. Proofs of theorems MAS113 Introduction to Probability and Statistics Proofs of theorems Theorem 1 De Morgan s Laws) See MAS110 Theorem 2 M1 By definition, B and A \ B are disjoint, and their union is A So, because m is a

More information

Problem #1 #2 #3 #4 Total Points /5 /7 /8 /4 /24

Problem #1 #2 #3 #4 Total Points /5 /7 /8 /4 /24 STAT/MATH 395 A - Winter Quarter 17 - Midterm - February 17, 17 Name: Student ID Number: Problem #1 # #3 #4 Total Points /5 /7 /8 /4 /4 Directions. Read directions carefully and show all your work. Define

More information

Course: ESO-209 Home Work: 1 Instructor: Debasis Kundu

Course: ESO-209 Home Work: 1 Instructor: Debasis Kundu Home Work: 1 1. Describe the sample space when a coin is tossed (a) once, (b) three times, (c) n times, (d) an infinite number of times. 2. A coin is tossed until for the first time the same result appear

More information

Chapter 4 continued. Chapter 4 sections

Chapter 4 continued. Chapter 4 sections Chapter 4 sections Chapter 4 continued 4.1 Expectation 4.2 Properties of Expectations 4.3 Variance 4.4 Moments 4.5 The Mean and the Median 4.6 Covariance and Correlation 4.7 Conditional Expectation SKIP:

More information

Multivariate probability distributions and linear regression

Multivariate probability distributions and linear regression Multivariate probability distributions and linear regression Patrik Hoyer 1 Contents: Random variable, probability distribution Joint distribution Marginal distribution Conditional distribution Independence,

More information

Math 510 midterm 3 answers

Math 510 midterm 3 answers Math 51 midterm 3 answers Problem 1 (1 pts) Suppose X and Y are independent exponential random variables both with parameter λ 1. Find the probability that Y < 7X. P (Y < 7X) 7x 7x f(x, y) dy dx e x e

More information

STAT 515 MIDTERM 2 EXAM November 14, 2018

STAT 515 MIDTERM 2 EXAM November 14, 2018 STAT 55 MIDTERM 2 EXAM November 4, 28 NAME: Section Number: Instructor: In problems that require reasoning, algebraic calculation, or the use of your graphing calculator, it is not sufficient just to write

More information

2 (Statistics) Random variables

2 (Statistics) Random variables 2 (Statistics) Random variables References: DeGroot and Schervish, chapters 3, 4 and 5; Stirzaker, chapters 4, 5 and 6 We will now study the main tools use for modeling experiments with unknown outcomes

More information

SOLUTION FOR HOMEWORK 11, ACTS 4306

SOLUTION FOR HOMEWORK 11, ACTS 4306 SOLUTION FOR HOMEWORK, ACTS 36 Welcome to your th homework. This is a collection of transformation, Central Limit Theorem (CLT), and other topics.. Solution: By definition of Z, Var(Z) = Var(3X Y.5). We

More information

Name of the Student: Problems on Discrete & Continuous R.Vs

Name of the Student: Problems on Discrete & Continuous R.Vs Engineering Mathematics 08 SUBJECT NAME : Probability & Random Processes SUBJECT CODE : MA645 MATERIAL NAME : University Questions REGULATION : R03 UPDATED ON : November 07 (Upto N/D 07 Q.P) (Scan the

More information

Homework 5 Solutions

Homework 5 Solutions 126/DCP126 Probability, Fall 214 Instructor: Prof. Wen-Guey Tzeng Homework 5 Solutions 5-Jan-215 1. Let the joint probability mass function of discrete random variables X and Y be given by { c(x + y) ifx

More information

18.440: Lecture 28 Lectures Review

18.440: Lecture 28 Lectures Review 18.440: Lecture 28 Lectures 17-27 Review Scott Sheffield MIT 1 Outline Continuous random variables Problems motivated by coin tossing Random variable properties 2 Outline Continuous random variables Problems

More information

Discrete Random Variables

Discrete Random Variables CPSC 53 Systems Modeling and Simulation Discrete Random Variables Dr. Anirban Mahanti Department of Computer Science University of Calgary mahanti@cpsc.ucalgary.ca Random Variables A random variable is

More information

Probability and Statistics Notes

Probability and Statistics Notes Probability and Statistics Notes Chapter Five Jesse Crawford Department of Mathematics Tarleton State University Spring 2011 (Tarleton State University) Chapter Five Notes Spring 2011 1 / 37 Outline 1

More information

Introduction to Machine Learning

Introduction to Machine Learning What does this mean? Outline Contents Introduction to Machine Learning Introduction to Probabilistic Methods Varun Chandola December 26, 2017 1 Introduction to Probability 1 2 Random Variables 3 3 Bayes

More information

Exercises in Probability Theory Paul Jung MA 485/585-1C Fall 2015 based on material of Nikolai Chernov

Exercises in Probability Theory Paul Jung MA 485/585-1C Fall 2015 based on material of Nikolai Chernov Exercises in Probability Theory Paul Jung MA 485/585-1C Fall 2015 based on material of Nikolai Chernov Many of the exercises are taken from two books: R. Durrett, The Essentials of Probability, Duxbury

More information

BASICS OF PROBABILITY

BASICS OF PROBABILITY October 10, 2018 BASICS OF PROBABILITY Randomness, sample space and probability Probability is concerned with random experiments. That is, an experiment, the outcome of which cannot be predicted with certainty,

More information

1 Presessional Probability

1 Presessional Probability 1 Presessional Probability Probability theory is essential for the development of mathematical models in finance, because of the randomness nature of price fluctuations in the markets. This presessional

More information

3 Conditional Expectation

3 Conditional Expectation 3 Conditional Expectation 3.1 The Discrete case Recall that for any two events E and F, the conditional probability of E given F is defined, whenever P (F ) > 0, by P (E F ) P (E)P (F ). P (F ) Example.

More information

Closed book and notes. 60 minutes. Cover page and four pages of exam. No calculators.

Closed book and notes. 60 minutes. Cover page and four pages of exam. No calculators. IE 230 Seat # Closed book and notes. 60 minutes. Cover page and four pages of exam. No calculators. Score Exam #3a, Spring 2002 Schmeiser Closed book and notes. 60 minutes. 1. True or false. (for each,

More information

MAS223 Statistical Inference and Modelling Exercises

MAS223 Statistical Inference and Modelling Exercises MAS223 Statistical Inference and Modelling Exercises The exercises are grouped into sections, corresponding to chapters of the lecture notes Within each section exercises are divided into warm-up questions,

More information

Definition: A random variable X is a real valued function that maps a sample space S into the space of real numbers R. X : S R

Definition: A random variable X is a real valued function that maps a sample space S into the space of real numbers R. X : S R Random Variables Definition: A random variable X is a real valued function that maps a sample space S into the space of real numbers R. X : S R As such, a random variable summarizes the outcome of an experiment

More information

STAT 414: Introduction to Probability Theory

STAT 414: Introduction to Probability Theory STAT 414: Introduction to Probability Theory Spring 2016; Homework Assignments Latest updated on April 29, 2016 HW1 (Due on Jan. 21) Chapter 1 Problems 1, 8, 9, 10, 11, 18, 19, 26, 28, 30 Theoretical Exercises

More information

CS37300 Class Notes. Jennifer Neville, Sebastian Moreno, Bruno Ribeiro

CS37300 Class Notes. Jennifer Neville, Sebastian Moreno, Bruno Ribeiro CS37300 Class Notes Jennifer Neville, Sebastian Moreno, Bruno Ribeiro 2 Background on Probability and Statistics These are basic definitions, concepts, and equations that should have been covered in your

More information

Review of Probability Theory

Review of Probability Theory Review of Probability Theory Arian Maleki and Tom Do Stanford University Probability theory is the study of uncertainty Through this class, we will be relying on concepts from probability theory for deriving

More information

Chapter 2. Probability

Chapter 2. Probability 2-1 Chapter 2 Probability 2-2 Section 2.1: Basic Ideas Definition: An experiment is a process that results in an outcome that cannot be predicted in advance with certainty. Examples: rolling a die tossing

More information

Multiple Random Variables

Multiple Random Variables Multiple Random Variables This Version: July 30, 2015 Multiple Random Variables 2 Now we consider models with more than one r.v. These are called multivariate models For instance: height and weight An

More information

MATH 151, FINAL EXAM Winter Quarter, 21 March, 2014

MATH 151, FINAL EXAM Winter Quarter, 21 March, 2014 Time: 3 hours, 8:3-11:3 Instructions: MATH 151, FINAL EXAM Winter Quarter, 21 March, 214 (1) Write your name in blue-book provided and sign that you agree to abide by the honor code. (2) The exam consists

More information

Math 3215 Intro. Probability & Statistics Summer 14. Homework 5: Due 7/3/14

Math 3215 Intro. Probability & Statistics Summer 14. Homework 5: Due 7/3/14 Math 325 Intro. Probability & Statistics Summer Homework 5: Due 7/3/. Let X and Y be continuous random variables with joint/marginal p.d.f. s f(x, y) 2, x y, f (x) 2( x), x, f 2 (y) 2y, y. Find the conditional

More information

Math st Homework. First part of Chapter 2. Due Friday, September 17, 1999.

Math st Homework. First part of Chapter 2. Due Friday, September 17, 1999. Math 447. 1st Homework. First part of Chapter 2. Due Friday, September 17, 1999. 1. How many different seven place license plates are possible if the first 3 places are to be occupied by letters and the

More information

Notes on Random Variables, Expectations, Probability Densities, and Martingales

Notes on Random Variables, Expectations, Probability Densities, and Martingales Eco 315.2 Spring 2006 C.Sims Notes on Random Variables, Expectations, Probability Densities, and Martingales Includes Exercise Due Tuesday, April 4. For many or most of you, parts of these notes will be

More information

CHAPTER 5. Jointly Probability Mass Function for Two Discrete Distributed Random Variables:

CHAPTER 5. Jointly Probability Mass Function for Two Discrete Distributed Random Variables: CHAPTER 5 Jointl Distributed Random Variable There are some situations that experiment contains more than one variable and researcher interested in to stud joint behavior of several variables at the same

More information

STOR Lecture 16. Properties of Expectation - I

STOR Lecture 16. Properties of Expectation - I STOR 435.001 Lecture 16 Properties of Expectation - I Jan Hannig UNC Chapel Hill 1 / 22 Motivation Recall we found joint distributions to be pretty complicated objects. Need various tools from combinatorics

More information

STAT2201. Analysis of Engineering & Scientific Data. Unit 3

STAT2201. Analysis of Engineering & Scientific Data. Unit 3 STAT2201 Analysis of Engineering & Scientific Data Unit 3 Slava Vaisman The University of Queensland School of Mathematics and Physics What we learned in Unit 2 (1) We defined a sample space of a random

More information

Statistics for Economists Lectures 6 & 7. Asrat Temesgen Stockholm University

Statistics for Economists Lectures 6 & 7. Asrat Temesgen Stockholm University Statistics for Economists Lectures 6 & 7 Asrat Temesgen Stockholm University 1 Chapter 4- Bivariate Distributions 41 Distributions of two random variables Definition 41-1: Let X and Y be two random variables

More information

Outline Properties of Covariance Quantifying Dependence Models for Joint Distributions Lab 4. Week 8 Jointly Distributed Random Variables Part II

Outline Properties of Covariance Quantifying Dependence Models for Joint Distributions Lab 4. Week 8 Jointly Distributed Random Variables Part II Week 8 Jointly Distributed Random Variables Part II Week 8 Objectives 1 The connection between the covariance of two variables and the nature of their dependence is given. 2 Pearson s correlation coefficient

More information

UC Berkeley Department of Electrical Engineering and Computer Science. EE 126: Probablity and Random Processes. Problem Set 4: Solutions Fall 2007

UC Berkeley Department of Electrical Engineering and Computer Science. EE 126: Probablity and Random Processes. Problem Set 4: Solutions Fall 2007 UC Berkeley Department of Electrical Engineering and Computer Science EE 6: Probablity and Random Processes Problem Set 4: Solutions Fall 007 Issued: Thursday, September 0, 007 Due: Friday, September 8,

More information

Arkansas Tech University MATH 3513: Applied Statistics I Dr. Marcel B. Finan

Arkansas Tech University MATH 3513: Applied Statistics I Dr. Marcel B. Finan 2.4 Random Variables Arkansas Tech University MATH 3513: Applied Statistics I Dr. Marcel B. Finan By definition, a random variable X is a function with domain the sample space and range a subset of the

More information

Lecture 1: August 28

Lecture 1: August 28 36-705: Intermediate Statistics Fall 2017 Lecturer: Siva Balakrishnan Lecture 1: August 28 Our broad goal for the first few lectures is to try to understand the behaviour of sums of independent random

More information

PhysicsAndMathsTutor.com

PhysicsAndMathsTutor.com PhysicsAndMathsTutor.com June 2005 6. A continuous random variable X has probability density function f(x) where 3 k(4 x x ), 0 x 2, f( x) = 0, otherwise, where k is a positive integer. 1 (a) Show that

More information

Summary of basic probability theory Math 218, Mathematical Statistics D Joyce, Spring 2016

Summary of basic probability theory Math 218, Mathematical Statistics D Joyce, Spring 2016 8. For any two events E and F, P (E) = P (E F ) + P (E F c ). Summary of basic probability theory Math 218, Mathematical Statistics D Joyce, Spring 2016 Sample space. A sample space consists of a underlying

More information

Problem 1. Problem 2. Problem 3. Problem 4

Problem 1. Problem 2. Problem 3. Problem 4 Problem Let A be the event that the fungus is present, and B the event that the staph-bacteria is present. We have P A = 4, P B = 9, P B A =. We wish to find P AB, to do this we use the multiplication

More information

Probability Distributions

Probability Distributions Probability Distributions Series of events Previously we have been discussing the probabilities associated with a single event: Observing a 1 on a single roll of a die Observing a K with a single card

More information