Notes for Math 324, Part 19


Chapter 9. Multivariate distributions, covariance

Often, we need to consider several random variables at the same time. We have a sample space S and r.v.'s X, Y, ..., which are functions from the sample space into R. Given two r.v.'s defined on the same sample space and given a set A ⊆ R^2, we can define

P[(X, Y) ∈ A] = P[{s ∈ S : (X(s), Y(s)) ∈ A}].

Alternatively, we can consider the function (X, Y) : S → R^2. In this chapter, we consider r.v.'s with values in R^2 or another multidimensional space.

9.1 Multivariate discrete distributions

Definition 9.1.1. Given two discrete r.v.'s X and Y defined on the same probability space, the joint probability mass function of X and Y is the function p(x, y) = P[X = x, Y = y].

Theorem 9.1. Let p be the joint probability mass function of the r.v.'s X and Y. Then,
(i) for each x, y, 0 ≤ p(x, y) ≤ 1;
(ii) ∑_{x,y} p(x, y) = 1.

Definition 9.1.2. Let X and Y be two discrete r.v.'s defined on the same probability space. The marginal probability mass function of X is the function p_X(x) = P[X = x] = ∑_y p(x, y). The marginal probability mass function of Y is the function p_Y(y) = P[Y = y] = ∑_x p(x, y).

Example 9.1. Suppose that 3 balls are randomly selected from an urn containing 3 white, 4 red, and 5 blue balls. Let X and Y denote, respectively, the number of white and the number of red balls that are chosen. Find the joint probability mass function of X and Y.

Solution: The joint probability mass function of X and Y is

p(x, y) = C(3, x) C(4, y) C(5, 3 - x - y) / C(12, 3), for integers x, y ≥ 0 with x + y ≤ 3,

and p(x, y) = 0 otherwise, where C(n, k) denotes the binomial coefficient. Since C(12, 3) = 220, explicitly,

p(0, 0) = C(5, 3)/220 = 10/220,
p(0, 1) = C(4, 1)C(5, 2)/220 = 40/220,
p(0, 2) = C(4, 2)C(5, 1)/220 = 30/220,
p(0, 3) = C(4, 3)/220 = 4/220,
p(1, 0) = C(3, 1)C(5, 2)/220 = 30/220,
p(1, 1) = C(3, 1)C(4, 1)C(5, 1)/220 = 60/220,
p(1, 2) = C(3, 1)C(4, 2)/220 = 18/220,
p(2, 0) = C(3, 2)C(5, 1)/220 = 15/220,
p(2, 1) = C(3, 2)C(4, 1)/220 = 12/220,
p(3, 0) = C(3, 3)/220 = 1/220.

We can set up the probabilities in a table:

p(x, y)   y = 0     y = 1      y = 2     y = 3    p_X(x)
x = 0     10/220    40/220     30/220    4/220    84/220
x = 1     30/220    60/220     18/220    0        108/220
x = 2     15/220    12/220     0         0        27/220
x = 3     1/220     0          0         0        1/220
p_Y(y)    56/220    112/220    48/220    4/220    1

The marginal probability mass functions are

x         0         1          2         3
p_X(x)    84/220    108/220    27/220    1/220

y         0         1          2         3
p_Y(y)    56/220    112/220    48/220    4/220
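The table is easy to check by machine. Here is a minimal Python sketch (the helper name p is mine, introduced only for this check):

    from math import comb

    # Joint pmf of (X, Y) = (number of white, number of red) when 3 balls
    # are drawn from an urn with 3 white, 4 red and 5 blue balls (12 total).
    def p(x, y):
        if x < 0 or y < 0 or x + y > 3:
            return 0
        return comb(3, x) * comb(4, y) * comb(5, 3 - x - y) / comb(12, 3)

    # Total probability is 1, and p(1, 1) = 60/220.
    print(sum(p(x, y) for x in range(4) for y in range(4)))   # 1.0
    print(p(1, 1), 60 / 220)                                  # both 0.2727...
    # Marginal of X: p_X(x) = sum over y.
    print([sum(p(x, y) for y in range(4)) for x in range(4)])
    # [0.3818..., 0.4909..., 0.1227..., 0.00454...] = [84, 108, 27, 1]/220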

Problem 9.1. (# 3, Sample Test) Let X and Y be discrete random variables with joint probability function

p(x, y) = (2x + y)/12 for (x, y) = (0, 1), (0, 2), (1, 2), (1, 3); 0 otherwise.

Determine the marginal probability function of X.

(A) p(x) = 1/6 for x = 0, 5/6 for x = 1, 0 otherwise
(B) p(x) = 1/4 for x = 0, 3/4 for x = 1, 0 otherwise
(C) p(x) = 1/3 for x = 0, 2/3 for x = 1, 0 otherwise
(D) p(x) = 2/9 for x = 0, 3/9 for x = 1, 4/9 for x = 2, 0 otherwise
(E) p(x) = y/12 for x = 0, (2 + y)/12 for x = 1, 0 otherwise

Answer: (B)

Solution: We have that p(0, 1) = 1/12, p(0, 2) = 2/12, p(1, 2) = 4/12, p(1, 3) = 5/12. So,

p_X(0) = p(0, 1) + p(0, 2) = 3/12 = 1/4 and p_X(1) = p(1, 2) + p(1, 3) = 9/12 = 3/4.

Problem 9.2. (# 27, May 2000) A car dealership sells 0, 1, or 2 luxury cars on any day. When selling a car, the dealer also tries to persuade the customer to buy an extended warranty for the car. Let X denote the number of luxury cars sold in a given day, and let Y denote the number of extended warranties sold, where

P[X = 0, Y = 0] = 1/6,
P[X = 1, Y = 0] = 1/12,
P[X = 1, Y = 1] = 1/6,
P[X = 2, Y = 0] = 1/12,
P[X = 2, Y = 1] = 1/3,
P[X = 2, Y = 2] = 1/6.

What is the variance of X? Answer: 0.58

Solution: We find the marginal probability mass function of X and then the variance of X. We have that P[X = 0] = p(0, 0) = 1/6, P[X = 1] = p(1, 0) + p(1, 1) = 1/12 + 1/6 = 3/12 and P[X = 2] = p(2, 0) + p(2, 1) + p(2, 2) = 7/12. So,

E[X] = (0)(1/6) + (1)(3/12) + (2)(7/12) = 17/12,
E[X^2] = (0)^2(1/6) + (1)^2(3/12) + (2)^2(7/12) = 31/12,

and

Var(X) = E[X^2] - (E[X])^2 = 31/12 - (17/12)^2 = 83/144 = 0.5764.
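The same computation, done mechanically in Python with the joint pmf entered as a dictionary (a sketch; the variable names are illustrative only):

    # Var(X) for the car-dealership pmf of Problem 9.2.
    pmf = {(0, 0): 1/6, (1, 0): 1/12, (1, 1): 1/6,
           (2, 0): 1/12, (2, 1): 1/3, (2, 2): 1/6}

    EX  = sum(x * p for (x, y), p in pmf.items())
    EX2 = sum(x**2 * p for (x, y), p in pmf.items())
    print(EX, EX2, EX2 - EX**2)   # 17/12, 31/12, 83/144 = 0.5764...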

9.2 Jointly continuous distributions

Definition 9.2.1. Two r.v.'s X and Y defined on the same probability space are said to be jointly continuous if there exists a function f : R^2 → R such that for each A ⊆ R^2,

P[(X, Y) ∈ A] = ∫∫_A f(x, y) dx dy.

The function f above is called the joint probability density function of X and Y.

Theorem 9.2. Let f be the joint probability density function of the r.v.'s X and Y. Then,
(i) for each x, y, f(x, y) ≥ 0;
(ii) ∫∫ f(x, y) dx dy = 1.

Definition 9.2.2. Let X and Y be two jointly continuous r.v.'s defined on the same probability space with joint density function f_{X,Y}. The marginal density function of X is the function

f_X(x) = ∫_{-∞}^{∞} f(x, y) dy.

The marginal density function of Y is the function

f_Y(y) = ∫_{-∞}^{∞} f(x, y) dx.

The marginal pdf f_X of X is defined so that for each A ⊆ R,

∫_A ∫_{-∞}^{∞} f_{X,Y}(x, y) dy dx = ∫_A f_X(x) dx.

Similarly, the marginal pdf f_Y of Y is defined so that for each A ⊆ R,

∫_A ∫_{-∞}^{∞} f_{X,Y}(x, y) dx dy = ∫_A f_Y(y) dy.

Problem 9.3. (# 4, Sample Test) Let X and Y be random losses with joint density function f(x, y) = e^{-(x+y)} for x > 0 and y > 0. An insurance policy is written to reimburse X + Y. Calculate the probability that the reimbursement is less than 1. Answer: 1 - 2e^{-1}.

Figure 9.1: # 4, Sample Test

Solution: We have to find

P[X + Y ≤ 1] = ∫_0^1 ∫_0^{1-x} e^{-(x+y)} dy dx = ∫_0^1 (-e^{-x-y})|_{y=0}^{1-x} dx
= ∫_0^1 (e^{-x} - e^{-1}) dx = 1 - e^{-1} - e^{-1} = 1 - 2e^{-1}.

Problem 9.4. (# 36, November 2000) An insurance company insures a large number of drivers. Let X be the random variable representing the company's losses under collision insurance, and let Y represent the company's losses under liability insurance. X and Y have joint density function

f(x, y) = (2x + 2 - y)/4 for 0 < x < 1 and 0 < y < 2; 0 otherwise.

What is the probability that the total loss is at least 1? Answer: 0.71

Figure 9.2: # 36, November 2000

Solution: We need to find

P[X + Y ≥ 1] = ∫_0^1 ∫_{1-x}^2 ((2x + 2 - y)/4) dy dx = ∫_0^1 (-(2x + 2 - y)^2/8)|_{y=1-x}^{2} dx
= ∫_0^1 ((3x + 1)^2/8 - (2x)^2/8) dx = ((3x + 1)^3/72 - x^3/6)|_0^1
= 64/72 - 1/72 - 12/72 = 51/72 = 0.7083.
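The double integral can be confirmed symbolically, for instance with sympy (my choice of tool; nothing in the notes depends on it):

    import sympy as sp

    # Problem 9.4: P[X + Y >= 1] under f(x, y) = (2x + 2 - y)/4.
    x, y = sp.symbols('x y')
    P = sp.integrate((2*x + 2 - y)/4, (y, 1 - x, 2), (x, 0, 1))
    print(P, float(P))   # 17/24  0.7083...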

Problem 9.5. (# 5, May 2000) A company is reviewing tornado damage claims under a farm insurance policy. Let X be the portion of a claim representing damage to the house and let Y be the portion of the same claim representing damage to the rest of the property. The joint density function of X and Y is

f(x, y) = 6[1 - (x + y)] for x > 0, y > 0, x + y < 1; 0 otherwise.

Determine the probability that the portion of a claim representing damage to the house is less than 0.2. Answer: 0.488

Figure 9.3: # 5, May 2000

Solution: We have to find

P[X ≤ 0.2] = ∫_0^{0.2} ∫_0^{1-x} (6 - 6x - 6y) dy dx = ∫_0^{0.2} (-3)(1 - x - y)^2|_{y=0}^{1-x} dx
= ∫_0^{0.2} 3(1 - x)^2 dx = -(1 - x)^3|_0^{0.2} = 1 - (0.8)^3 = 0.488.

Problem 9.6. (# 24, May 2000) A device contains two components. The device fails if either component fails. The joint density function of the lifetimes of the components, measured in hours, is f(s, t), where 0 < s < 1 and 0 < t < 1. What is the probability that the device fails during the first half hour of operation?

(A) ∫_0^{0.5} ∫_0^{0.5} f(s, t) ds dt
(B) ∫_0^{1} ∫_0^{0.5} f(s, t) ds dt
(C) ∫_{0.5}^{1} ∫_{0.5}^{1} f(s, t) ds dt
(D) ∫_0^{0.5} ∫_0^{1} f(s, t) ds dt + ∫_0^{1} ∫_0^{0.5} f(s, t) ds dt
(E) ∫_0^{0.5} ∫_0^{1} f(s, t) ds dt + ∫_{0.5}^{1} ∫_0^{0.5} f(s, t) ds dt

Answer: (E)

Figure 9.4: # 24, May 2000

Problem 9.7. (# 2, November 2000) The future lifetimes (in months) of two components of a machine have the following joint density function:

f(x, y) = (6/125000)(50 - x - y) for 0 < x < 50 - y < 50; 0 otherwise.

What is the probability that both components are still functioning 20 months from now?

(A) (6/125000) ∫_0^{20} ∫_0^{20} (50 - x - y) dy dx
(B) (6/125000) ∫_{20}^{30} ∫_{20}^{50-x} (50 - x - y) dy dx
(C) (6/125000) ∫_{20}^{30} ∫_{20}^{50-x-y} (50 - x - y) dy dx
(D) (6/125000) ∫_{20}^{50} ∫_{20}^{50-x} (50 - x - y) dy dx
(E) (6/125000) ∫_{20}^{50} ∫_{20}^{50-x-y} (50 - x - y) dy dx

Answer: (B)

9.3 Independent random variables

Definition 9.3.1. Two r.v.'s X and Y are said to be independent if for each A, B ⊆ R,

P[X ∈ A, Y ∈ B] = P[X ∈ A] P[Y ∈ B].

Example 9.2. Two fair dice are rolled. Let X be the biggest value of the two dice and let Y be the smallest value. Are X and Y independent random variables? Why?

Solution: The r.v.'s X and Y are not independent because the relation p_{X,Y}(x, y) = p_X(x) p_Y(y) does not hold for each x, y. For example, for x = 1 and y = 2, p_{X,Y}(1, 2) = 0, while p_X(1) = 1/36 and p_Y(2) = 9/36.
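Since the sample space has only 36 outcomes, the failure of independence can be verified by brute force; a short Python sketch (exact arithmetic via fractions):

    from itertools import product
    from fractions import Fraction

    # Joint pmf of X = max(d1, d2), Y = min(d1, d2) for two fair dice.
    pmf = {}
    for d1, d2 in product(range(1, 7), repeat=2):
        key = (max(d1, d2), min(d1, d2))
        pmf[key] = pmf.get(key, Fraction(0)) + Fraction(1, 36)

    pX = lambda x: sum(p for (a, b), p in pmf.items() if a == x)
    pY = lambda y: sum(p for (a, b), p in pmf.items() if b == y)
    # Independence fails: p(1, 2) = 0 but p_X(1) * p_Y(2) = (1/36)(9/36) > 0.
    print(pmf.get((1, 2), Fraction(0)), pX(1) * pY(2))   # 0  1/144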

Figure 9.5: # 2, November 2000

Problem 9.8. (# 2, November 2000) An insurance company determines that N, the number of claims received in a week, is a random variable with P[N = n] = 1/2^{n+1}, where n ≥ 0. The company also determines that the number of claims received in a given week is independent of the number of claims received in any other week. Determine the probability that exactly seven claims will be received during a given two-week period. Answer: 1/64

Solution: Let N_1 be the number of claims in week 1. Let N_2 be the number of claims in week 2. We have that

P[N_1 + N_2 = 7] = P[(N_1, N_2) ∈ {(0, 7), (1, 6), (2, 5), (3, 4), (4, 3), (5, 2), (6, 1), (7, 0)}]
= P[N_1 = 0]P[N_2 = 7] + P[N_1 = 1]P[N_2 = 6] + P[N_1 = 2]P[N_2 = 5]
+ P[N_1 = 3]P[N_2 = 4] + P[N_1 = 4]P[N_2 = 3] + P[N_1 = 5]P[N_2 = 2]
+ P[N_1 = 6]P[N_2 = 1] + P[N_1 = 7]P[N_2 = 0]
= 8 (1/2^9) = 1/64,

since each term equals (1/2^{k+1})(1/2^{8-k}) = 1/2^9.
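A quick exact check that the eight terms sum to 1/64 (a Python sketch):

    from fractions import Fraction

    # P[N = n] = 1/2^(n+1); probability that two independent weeks total 7.
    p = lambda n: Fraction(1, 2 ** (n + 1))
    print(sum(p(k) * p(7 - k) for k in range(8)))   # 1/64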

Theorem 9.3. Let X and Y be two independent r.v.'s. Then, for each pair of (integrable) functions f and g,

E[f(X) g(Y)] = E[f(X)] E[g(Y)].

Theorem 9.4. Let X and Y be two discrete r.v.'s. Let p_{X,Y} be the joint probability mass function of X and Y, let p_X be the marginal probability mass function of X and let p_Y be the marginal probability mass function of Y. Then, X and Y are independent if and only if for each x, y ∈ R,

p_{X,Y}(x, y) = p_X(x) p_Y(y).

Theorem 9.5. Let X and Y be two jointly continuous r.v.'s. Let f_{X,Y} be the joint density function of X and Y, let f_X be the marginal density function of X and let f_Y be the marginal density function of Y. Then, X and Y are independent if and only if for each x, y ∈ R,

f_{X,Y}(x, y) = f_X(x) f_Y(y).

Example 9.3. Suppose that X and Y are independent random variables with uniform distribution in (0, 2). Find P{X + Y ≤ 1}.

Solution: The density of X is f_X(x) = 1/2, for 0 ≤ x ≤ 2. The density of Y is f_Y(y) = 1/2, for 0 ≤ y ≤ 2. Since X and Y are independent r.v.'s, the joint density of X and Y is f_{X,Y}(x, y) = 1/4, for 0 ≤ x, y ≤ 2. Since the density is a constant, the probability of a region in [0, 2] × [0, 2] is its area over 4. The region determined by {(x, y) : x + y ≤ 1} inside [0, 2] × [0, 2] is a right triangle with perpendicular sides of length 1 each. So,

P[X + Y ≤ 1] = (1/2)/4 = 1/8.

Example 9.4. Suppose that X and Y are independent random variables with exponential distribution of parameter λ = 1. Find P{X + Y ≤ 1}.

Solution: The density of X is f_X(x) = e^{-x}, for x ≥ 0. The density of Y is f_Y(y) = e^{-y}, for y ≥ 0. Since X and Y are independent r.v.'s, the joint density of X and Y is

f_{X,Y}(x, y) = e^{-x-y}, for x, y ≥ 0.

We have that

P[X + Y ≤ 1] = ∫_0^1 ∫_0^{1-x} e^{-x-y} dy dx = ∫_0^1 (e^{-x} - e^{-1}) dx = 1 - e^{-1} - e^{-1} = 1 - 2e^{-1}.
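Both answers are easy to approximate by simulation; a minimal Monte Carlo sketch (10^6 samples, a sample size chosen arbitrarily):

    import random

    # Example 9.3: X, Y uniform on (0, 2); Example 9.4: X, Y exponential(1).
    N = 10**6
    unif = sum(random.uniform(0, 2) + random.uniform(0, 2) <= 1 for _ in range(N)) / N
    expo = sum(random.expovariate(1) + random.expovariate(1) <= 1 for _ in range(N)) / N
    print(unif)   # about 1/8 = 0.125
    print(expo)   # about 1 - 2/e = 0.2642...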

Problem 9.9. The joint probability density function of X and Y is given by

f_{X,Y}(x, y) = k(x^2 + y^2) if -1 < x < 1, -1 < y < 1; 0 else.

Find k. Find the marginal densities. Are X and Y independent r.v.'s? Why?

Solution: Since the total probability is one,

1 = ∫_{-1}^1 ∫_{-1}^1 k(x^2 + y^2) dy dx = 8k/3.

So, k = 3/8. The marginal density of X is

f_X(x) = ∫_{-1}^1 (3/8)(x^2 + y^2) dy = (3/4)x^2 + 1/4, for -1 ≤ x ≤ 1.

Similarly, the marginal density of Y is

f_Y(y) = ∫_{-1}^1 (3/8)(x^2 + y^2) dx = (3/4)y^2 + 1/4, for -1 ≤ y ≤ 1.

Since f_{X,Y}(x, y) = f_X(x) f_Y(y) fails for some x, y, X and Y are not independent r.v.'s.

Problem 9.10. The random variables X and Y have joint density function

f_{X,Y}(x, y) = x if 0 < x < 1, 0 < y < 2; 0 otherwise.

Find the marginal densities. Are the random variables X and Y independent r.v.'s?

Solution: The marginal density of X is

f_X(x) = ∫_0^2 x dy = 2x, for 0 < x < 1.

Similarly, the marginal density of Y is

f_Y(y) = ∫_0^1 x dx = 1/2, for 0 < y < 2.

For each x, y,

f_{X,Y}(x, y) = f_X(x) f_Y(y).

So, X and Y are independent r.v.'s.

It is possible to prove that if the joint density of X and Y has the form f_{X,Y}(x, y) = g(x)h(y) for a < x < b and c < y < d, where g and h are two functions, then X and Y are independent r.v.'s.

Problem 9.11. (# 28, November 2000) Two insurers provide bids on an insurance policy to a large company. The bids must be between 2000 and 2200. The company decides to accept the lower bid if the two bids differ by 20 or more. Otherwise, the company will consider the two bids further. Assume that the two bids are independent and are both uniformly distributed on the interval from 2000 to 2200. Determine the probability that the company considers the two bids further. Answer: 0.19

Figure 9.6: # 28, November 2000

Solution: Let X be the first bid. Let Y be the second bid. The density of X is f_X(x) = 1/200, for 2000 ≤ x ≤ 2200. The density of Y is f_Y(y) = 1/200, for 2000 ≤ y ≤ 2200. Since X and Y are independent r.v.'s, the joint density of X and Y is f_{X,Y}(x, y) = 1/(200)^2, for 2000 ≤ x, y ≤ 2200. Since the density is a constant, the probability of a set in the region [2000, 2200] × [2000, 2200] is its area times 1/(200)^2. We are finding the probability P[|X - Y| < 20] = P[X - 20 < Y < X + 20]. The complement of this region consists of two right triangles whose perpendicular sides are 180 each, so the area of each triangle is (180)^2/2. Hence, the probability we are looking for is

1 - 2 ((180)^2/2)(1/(200)^2) = 1 - 0.81 = 0.19.

Problem 9.12. (# 3, May 2000) A study is being conducted in which the health of two independent groups of ten policyholders is being monitored over a one-year period of time. Individual participants in the study drop out before the end of the study with probability 0.2 (independently of the other participants). What is the probability that at least 9 participants complete the study in one of the two groups, but not in both groups? Answer: 0.469

Solution: Let X be the number of participants who complete the study in the first group. Let Y be the number of participants who complete the study in the second group. We have to find the probability that exactly one of the events {X ≥ 9} and {Y ≥ 9} happens. Both X and Y have a binomial distribution with parameters n = 10 and p = 0.8. So,

P[X ≥ 9] = P[X = 9] + P[X = 10] = C(10, 9)(0.8)^9(0.2) + (0.8)^{10} = 0.3758.

Since X and Y are independent r.v.'s,

P[exactly one of {X ≥ 9}, {Y ≥ 9}] = P[X ≥ 9] + P[Y ≥ 9] - 2 P[X ≥ 9] P[Y ≥ 9]
= 0.3758 + 0.3758 - 2(0.3758)(0.3758) = 0.4692.
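A Python check of both the binomial tail and the final answer:

    from math import comb

    # Problem 9.12: X ~ Binomial(10, 0.8); probability exactly one of two
    # independent groups reaches at least 9 completions.
    p = comb(10, 9) * 0.8**9 * 0.2 + 0.8**10
    print(p)                 # 0.3758...
    print(2 * p * (1 - p))   # 0.4692..., i.e. the answer 0.469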

Problem 9.13. (# 22, May 2000) The waiting time for the first claim from a good driver and the waiting time for the first claim from a bad driver are independent and follow exponential distributions with means 6 years and 3 years, respectively. What is the probability that the first claim from a good driver will be filed within 3 years and the first claim from a bad driver will be filed within 2 years? Answer: 1 - e^{-2/3} - e^{-1/2} + e^{-7/6}

Solution: Let X be the waiting time for the first claim from a good driver. The density of X is f_X(x) = (1/6)e^{-x/6}, for x ≥ 0. Let Y be the waiting time for the first claim from a bad driver. The density of Y is f_Y(y) = (1/3)e^{-y/3}, for y ≥ 0. We need to find

P[X ≤ 3, Y ≤ 2] = P[X ≤ 3] P[Y ≤ 2] = ∫_0^3 (1/6)e^{-x/6} dx ∫_0^2 (1/3)e^{-y/3} dy
= (1 - e^{-1/2})(1 - e^{-2/3}) = 1 - e^{-2/3} - e^{-1/2} + e^{-7/6}.

Problem 9.14. (May 2000) An insurance company sells two types of auto insurance policies: Basic and Deluxe. The time until the next Basic Policy claim is an exponential random variable with mean two days. The time until the next Deluxe Policy claim is an independent exponential random variable with mean three days. What is the probability that the next claim will be a Deluxe Policy claim? Answer: 0.4

Figure 9.7: May 2000

Solution: Let X be the time of the next Basic Policy claim. The density of X is f_X(x) = (1/2)e^{-x/2}, for x ≥ 0. Let Y be the time until the next Deluxe Policy claim. The density of Y is f_Y(y) = (1/3)e^{-y/3}, for y ≥ 0. Since X and Y are independent r.v.'s, the joint density of X and Y is

f_{X,Y}(x, y) = (1/6)e^{-x/2 - y/3}, for x, y ≥ 0.

We need to find

P[Y ≤ X] = ∫_0^∞ ∫_0^x (1/6)e^{-x/2 - y/3} dy dx = ∫_0^∞ (1/2)e^{-x/2}(1 - e^{-x/3}) dx
= ∫_0^∞ ((1/2)e^{-x/2} - (1/2)e^{-5x/6}) dx = (-e^{-x/2} + (3/5)e^{-5x/6})|_0^∞ = 1 - 3/5 = 2/5.
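A simulation of this "race" between the two claim times agrees with 2/5 (a sketch; note that Python's random.expovariate takes the rate, i.e. 1/mean):

    import random

    # Problem 9.14: X ~ exponential(mean 2), Y ~ exponential(mean 3).
    random.seed(0)
    N = 10**6
    count = sum(random.expovariate(1/3) < random.expovariate(1/2) for _ in range(N))
    print(count / N)   # about 2/5 = 0.4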

9.4 Expectation of a function of several random variables

Given discrete r.v.'s X and Y with joint pmf p, and a function h : R^2 → R,

E[h(X, Y)] = ∑_{x,y} h(x, y) p(x, y).

Given two jointly continuous r.v.'s X and Y with joint pdf f, and a function h : R^2 → R,

E[h(X, Y)] = ∫∫ h(x, y) f(x, y) dy dx.

Problem 9.15. (# 4, November 2000) A device contains two circuits. The second circuit is a backup for the first, so the second is used only when the first has failed. The device fails when and only when the second circuit fails. Let X and Y be the times at which the first and second circuits fail, respectively. X and Y have joint probability density function

f(x, y) = 6 e^{-x} e^{-2y} for 0 < x < y < ∞; 0 otherwise.

What is the expected time at which the device fails? Answer: 0.83

Figure 9.8: # 4, November 2000

Solution: By the change of variable u = 2y,

E[Y] = ∫_0^∞ ∫_x^∞ y 6 e^{-x} e^{-2y} dy dx = ∫_0^∞ (3/2) e^{-x} ∫_{2x}^∞ u e^{-u} du dx
= ∫_0^∞ (3/2) e^{-x} e^{-2x}(1 + 2x) dx = ∫_0^∞ (3/2) e^{-3x}(1 + 2x) dx
= (3/2)(1/3 + 2/9) = 5/6 = 0.83.
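The change of variable can be bypassed with a symbolic check (again using sympy as an illustrative tool):

    import sympy as sp

    # Problem 9.15: E[Y] when f(x, y) = 6 e^{-x} e^{-2y} on 0 < x < y.
    x, y = sp.symbols('x y', positive=True)
    EY = sp.integrate(y * 6 * sp.exp(-x) * sp.exp(-2*y), (y, x, sp.oo), (x, 0, sp.oo))
    print(EY)   # 5/6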

Problem 9.16. (# 3, November 2000) Let T_1 be the time between a car accident and reporting a claim to the insurance company. Let T_2 be the time between the report of the claim and payment of the claim. The joint density function of T_1 and T_2, f(t_1, t_2), is constant over the region 0 < t_1 < 6, 0 < t_2 < 6, t_1 + t_2 < 10, and zero otherwise. Determine E[T_1 + T_2], the expected time between a car accident and payment of the claim. Answer: 5.7

Figure 9.9: # 3, November 2000

Solution: The region consists of a square of side 6 with a triangle removed. The area of the region is 6^2 - 2^2/2 = 34. So, the density is f(t_1, t_2) = 1/34 in the region 0 < t_1 < 6, 0 < t_2 < 6, t_1 + t_2 < 10. By symmetry,

E[T_1 + T_2] = 2 E[T_1] = 2 (∫_0^4 t_1 (6/34) dt_1 + ∫_4^6 t_1 ((10 - t_1)/34) dt_1)
= 2 (24/17 + 74/51) = 292/51 = 5.73.

Problem 9.17. (# 3, Sample Test) Let X and Y be random losses with joint density function f(x, y) = 2x for 0 < x < 1 and 0 < y < 1. An insurance policy is written to cover the loss X + Y. The policy has a deductible of 1. Calculate the expected payment under the policy. Answer: 1/4.

Solution: Let Z = g(X, Y) be the payment, where

g(X, Y) = 0 if X + Y < 1; g(X, Y) = X + Y - 1 if X + Y ≥ 1.

So, the expected payment is

E[g(X, Y)] = ∫_0^1 ∫_{1-x}^1 (x + y - 1) 2x dy dx = ∫_0^1 2x ((x + y - 1)^2/2)|_{y=1-x}^{1} dx = ∫_0^1 x^3 dx = 1/4.
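A Monte Carlo version of this payment calculation, which also illustrates sampling from the density 2x by the inverse transform X = sqrt(U) (a sketch; seed and sample size are arbitrary):

    import random

    # Problem 9.17: E[max(X + Y - 1, 0)] with X having density 2x on (0, 1)
    # (CDF x^2, so X = sqrt(U)) and Y uniform on (0, 1).
    random.seed(1)
    N = 10**6
    total = 0.0
    for _ in range(N):
        x = random.random() ** 0.5
        y = random.random()
        total += max(x + y - 1, 0)
    print(total / N)   # about 1/4 = 0.25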

Figure 9.10: # 3, Sample Test

Problem 9.18. (# 5, May 2000) Let T_1 and T_2 represent the lifetimes in hours of two linked components in an electronic device. The joint density function for T_1 and T_2 is uniform over the region defined by 0 ≤ t_1 ≤ t_2 ≤ L, where L is a positive constant. Determine the expected value of the sum of the squares of T_1 and T_2. Answer: (2/3)L^2

Figure 9.11: # 5, May 2000

Solution: Since the density is uniform in a triangle with area L^2/2, the density is

f(t_1, t_2) = 2/L^2 for 0 ≤ t_1 ≤ t_2 ≤ L.

Hence,

E[T_1^2 + T_2^2] = ∫_0^L ∫_{t_1}^L (t_1^2 + t_2^2)(2/L^2) dt_2 dt_1
= ∫_0^L (2/L^2)(t_1^2 t_2 + t_2^3/3)|_{t_2=t_1}^{L} dt_1
= ∫_0^L (2t_1^2/L + 2L/3 - 8t_1^3/(3L^2)) dt_1
= 2L^2/3 + 2L^2/3 - 2L^2/3 = (2/3)L^2.
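Because the answer depends on the symbolic constant L, this is a nice case for a fully symbolic check:

    import sympy as sp

    # Problem 9.18: E[T1^2 + T2^2], density 2/L^2 on 0 <= t1 <= t2 <= L.
    t1, t2, L = sp.symbols('t1 t2 L', positive=True)
    E = sp.integrate((t1**2 + t2**2) * 2/L**2, (t2, t1, L), (t1, 0, L))
    print(sp.simplify(E))   # 2*L**2/3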

9.5 Covariance

Definition 9.5.1. Given two r.v.'s X and Y, the covariance of X and Y is

Cov(X, Y) = E[(X - E[X])(Y - E[Y])].

As a rule of thumb, the covariance is positive if, whenever one variable increases, so does the other. For example, your score in a test X and the number of hours you study for this test Y have positive covariance. The covariance of two r.v.'s X and Y measures the linear dependence between the two variables.

Definition 9.5.2. Given two r.v.'s X and Y, the coefficient of correlation between X and Y is

ρ(X, Y) = Cov(X, Y)/√(Var(X) Var(Y)).

ρ = 1 if and only if there are constants a > 0 and b ∈ R such that Y = aX + b. ρ = -1 if and only if there are constants a < 0 and b ∈ R such that Y = aX + b.

Theorem 9.6. Given two r.v.'s X and Y, the covariance of X and Y is

Cov(X, Y) = E[XY] - E[X]E[Y].

Proof. Let µ_X = E[X] and let µ_Y = E[Y]. Then,

Cov(X, Y) = E[(X - µ_X)(Y - µ_Y)] = E[XY - X µ_Y - µ_X Y + µ_X µ_Y]
= E[XY] - µ_X µ_Y - µ_X µ_Y + µ_X µ_Y = E[XY] - E[X]E[Y]. Q.E.D.

Theorem 9.7. Given r.v.'s X, Y, Z, X_1, ..., X_n, Y_1, ..., Y_m and constants a, b ∈ R,

Cov(X, Y) = Cov(Y, X),
Cov(X, X) = Var(X),
Cov(aX, bY) = ab Cov(X, Y),
Cov(X, Y + Z) = Cov(X, Y) + Cov(X, Z),
Cov(∑_{i=1}^n X_i, Y) = ∑_{i=1}^n Cov(X_i, Y),
Cov(∑_{i=1}^n X_i, ∑_{j=1}^m Y_j) = ∑_{i=1}^n ∑_{j=1}^m Cov(X_i, Y_j).

Theorem 9.8. Given r.v.'s X, Y, X_1, ..., X_n and constants a, b, c, a_1, ..., a_n ∈ R,

Var(X + Y) = Var(X) + Var(Y) + 2 Cov(X, Y),
Var(aX + bY + c) = a^2 Var(X) + b^2 Var(Y) + 2ab Cov(X, Y),
Var(∑_{i=1}^n X_i) = ∑_{i=1}^n Var(X_i) + ∑_{i ≠ j} Cov(X_i, X_j),
Var(∑_{i=1}^n a_i X_i) = ∑_{i=1}^n a_i^2 Var(X_i) + ∑_{i ≠ j} a_i a_j Cov(X_i, X_j).

Theorem 9.9. If X and Y are independent r.v.'s, then Cov(X, Y) = 0. There are r.v.'s which are not independent, but with Cov(X, Y) = 0.
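The identity Var(X + Y) = Var(X) + Var(Y) + 2 Cov(X, Y) can be illustrated empirically on simulated, deliberately dependent data (a sketch; the construction of Y from X is arbitrary):

    import random

    random.seed(2)
    xs = [random.gauss(0, 1) for _ in range(10**5)]
    ys = [x + random.gauss(0, 1) for x in xs]   # Y depends on X by design

    def mean(v):
        return sum(v) / len(v)

    def var(v):
        m = mean(v)
        return mean([(a - m)**2 for a in v])

    def cov(u, v):
        mu, mv = mean(u), mean(v)
        return mean([(a - mu) * (b - mv) for a, b in zip(u, v)])

    lhs = var([a + b for a, b in zip(xs, ys)])
    rhs = var(xs) + var(ys) + 2 * cov(xs, ys)
    print(lhs, rhs)   # both about 5, equal up to sampling error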

Theorem 9.10. If X_1, ..., X_n are independent r.v.'s, then

Var(∑_{i=1}^n X_i) = ∑_{i=1}^n Var(X_i) and Var(∑_{i=1}^n a_i X_i) = ∑_{i=1}^n a_i^2 Var(X_i).

Problem 9.19. (# 38, November 2000) The profit for a new product is given by Z = 3X - Y - 5. X and Y are independent random variables with Var(X) = 1 and Var(Y) = 2. What is the variance of Z? Answer: 11.

Solution: We have that

Var(Z) = Var(3X - Y - 5) = 9 Var(X) + Var(Y) = 11.

Problem 9.20. (# 32, May 2000) A company has two electric generators. The time until failure for each generator follows an exponential distribution with mean 10. The company will begin using the second generator immediately after the first one fails. What is the variance of the total time that the generators produce electricity? Answer: 200

Solution: Let X and Y be the times until failure of the two generators. X and Y are independent r.v.'s with E[X] = E[Y] = 10 and Var(X) = Var(Y) = 10^2 = 100. So,

Var(X + Y) = Var(X) + Var(Y) = 100 + 100 = 200.

Problem 9.21. (# 7, May 2000) A joint density function is given by

f(x, y) = kx for 0 < x < 1, 0 < y < 1; 0 otherwise,

where k is a constant. What is Cov(X, Y)? Answer: 0

Solution: Since

1 = ∫_0^1 ∫_0^1 kx dy dx = k/2,

we get k = 2. The density of X is f_X(x) = 2x, for 0 ≤ x ≤ 1. The density of Y is f_Y(y) = 1, for 0 ≤ y ≤ 1. Since the joint density f_{X,Y} satisfies f_{X,Y}(x, y) = f_X(x) f_Y(y) for each x, y, X and Y are independent. So, Cov(X, Y) = 0.

Problem 9.22. (# 35, Sample test) Suppose the remaining lifetimes of a husband and wife are independent and uniformly distributed on the interval [0, 40]. An insurance company offers two products to married couples: one which pays when the husband dies, and one which pays when both the husband and wife have died. Calculate the covariance of the two payment times. Answer: 66.7.

Solution: Let X be the lifetime of the husband and let Y be the lifetime of the wife. We need to find

Cov(X, max(X, Y)) = E[X max(X, Y)] - E[X] E[max(X, Y)].

We have that E[X] = 20, and

E[max(X, Y)] = ∫_0^{40} ∫_0^{40} max(x, y)(1/1600) dy dx
= (1/1600) ∫_0^{40} ∫_0^x x dy dx + (1/1600) ∫_0^{40} ∫_x^{40} y dy dx
= (1/1600) ∫_0^{40} x^2 dx + (1/1600) ∫_0^{40} ((1600 - x^2)/2) dx = 40/3 + 40/3 = 80/3.

Similarly,

E[X max(X, Y)] = ∫_0^{40} ∫_0^{40} x max(x, y)(1/1600) dy dx
= (1/1600) ∫_0^{40} ∫_0^x x^2 dy dx + (1/1600) ∫_0^{40} ∫_x^{40} x y dy dx
= (1/1600) ∫_0^{40} x^3 dx + (1/1600) ∫_0^{40} x((1600 - x^2)/2) dx = 400 + 200 = 600.

So,

Cov(X, max(X, Y)) = 600 - (20)(80/3) = 200/3 = 66.67.
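A Monte Carlo estimate of the same covariance (a sketch; seed and sample size are arbitrary):

    import random

    # Problem 9.22: Cov(X, max(X, Y)) for X, Y independent uniform on [0, 40].
    random.seed(3)
    N = 10**6
    xs = [random.uniform(0, 40) for _ in range(N)]
    ys = [random.uniform(0, 40) for _ in range(N)]
    ms = [max(x, y) for x, y in zip(xs, ys)]

    mean = lambda v: sum(v) / len(v)
    cov = mean([x * m for x, m in zip(xs, ms)]) - mean(xs) * mean(ms)
    print(cov)   # about 200/3 = 66.67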

Problem 9.23. (# 7, November 2000) A stock market analyst has recorded the daily sales revenue for two companies over the last year and displayed them in the histograms below.

Figure 9.12: # 7, November 2000

The analyst noticed that a daily sales revenue above 100 for Company A was always accompanied by a daily sales revenue below 100 for Company B, and vice versa. Let X denote the daily sales revenue for Company A and let Y denote the daily sales revenue for Company B, on some future day. Assuming that for each company the daily sales revenues are independent and identically distributed, which of the following is true?

(A) Var(X) > Var(Y) and Var(X + Y) > Var(X) + Var(Y).
(B) Var(X) > Var(Y) and Var(X + Y) < Var(X) + Var(Y).
(C) Var(X) > Var(Y) and Var(X + Y) = Var(X) + Var(Y).
(D) Var(X) < Var(Y) and Var(X + Y) > Var(X) + Var(Y).
(E) Var(X) < Var(Y) and Var(X + Y) < Var(X) + Var(Y).

Answer: (E)

Solution: Looking at the graphs, we see that Var(Y) > Var(X). The information that a daily sales revenue above 100 for Company A was always accompanied by a daily sales revenue below 100 for Company B means that the covariance Cov(X, Y) < 0. This is equivalent to Var(X) + Var(Y) > Var(X + Y). So, the answer is (E).

Problem 9.24. (# 7, November 2000) Let X denote the size of a surgical claim and let Y denote the size of the associated hospital claim. An actuary is using a model in which E(X) = 5, E(X^2) = 27.4, E(Y) = 7, E(Y^2) = 51.4, and Var(X + Y) = 8. Let C_1 = X + Y denote the size of the combined claims before the application of a 20% surcharge on the hospital portion of the claim, and let C_2 denote the size of the combined claims after the application of that surcharge. Calculate Cov(C_1, C_2). Answer: 8.8

Solution: First, we find the variances of X and Y:

Var(X) = E[X^2] - (E[X])^2 = 27.4 - (5)^2 = 2.4 and Var(Y) = E[Y^2] - (E[Y])^2 = 51.4 - (7)^2 = 2.4.

We find the covariance of X and Y using that Var(X + Y) = 8:

8 = Var(X + Y) = Var(X) + Var(Y) + 2 Cov(X, Y) = 2.4 + 2.4 + 2 Cov(X, Y).

So, Cov(X, Y) = (8 - 2.4 - 2.4)/2 = 1.6. We have that C_1 = X + Y and C_2 = X + Y + Y/5 = X + 6Y/5. So,

Cov(C_1, C_2) = Cov(X + Y, X + 6Y/5) = Var(X) + (6/5) Var(Y) + (11/5) Cov(X, Y)
= 2.4 + (6/5)(2.4) + (11/5)(1.6) = 8.8.

9.6 Multinomial distribution

Consider a series of n independent trials, each with possible outcomes 1, ..., k. Suppose that the probability that a trial results in the i-th outcome is p_i, for 1 ≤ i ≤ k, the same for each trial. Let X_i be the number of trials resulting in the i-th outcome. Then, (X_1, ..., X_k) has a multinomial distribution. We have that

P[(X_1, ..., X_k) = (j_1, ..., j_k)] = (n!/(j_1! ··· j_k!)) p_1^{j_1} ··· p_k^{j_k},

for j_1 + ··· + j_k = n. In the case k = 2, the distribution is called the binomial distribution. In the case k = 3, the distribution is called the trinomial distribution.
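Written as code, the pmf is just the normalizing coefficient times a product of powers; a Python sketch (the function multinomial_pmf is hypothetical, written out from the formula above, not a library routine):

    from math import factorial

    def multinomial_pmf(js, ps):
        n = sum(js)
        coef = factorial(n)
        for j in js:
            coef //= factorial(j)       # n!/(j_1! ... j_k!), exact integer
        prob = 1.0
        for j, p in zip(js, ps):
            prob *= p ** j
        return coef * prob

    # Twelve throws of a fair die, two of each face:
    print(multinomial_pmf([2]*6, [1/6]*6))   # 12!/(2!)^6 * (1/6)^12 = 0.00343...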

Example 9.5. A fair die is thrown n times. The possible outcomes of a throw are 1, 2, 3, 4, 5, 6. Let p_1 be the probability of obtaining 1 in a throw; p_2, p_3, p_4, p_5 and p_6 are defined similarly. We have that p_1 = p_2 = p_3 = p_4 = p_5 = p_6 = 1/6. Let X_1 be the number of 1's in the n throws; X_2, X_3, X_4, X_5 and X_6 are defined similarly. Then, if j_1 + j_2 + j_3 + j_4 + j_5 + j_6 = n,

P[X_1 = j_1, X_2 = j_2, X_3 = j_3, X_4 = j_4, X_5 = j_5, X_6 = j_6] = (n!/(j_1! j_2! j_3! j_4! j_5! j_6!))(1/6)^n.

Problem 9.25. (# 29, May 2000) A large pool of adults earning their first driver's license includes 50% low-risk drivers, 30% moderate-risk drivers, and 20% high-risk drivers. Because these drivers have no prior driving record, an insurance company considers each driver to be randomly selected from the pool. This month, the insurance company writes 4 new policies for adults earning their first driver's license. What is the probability that these 4 will contain at least two more high-risk drivers than low-risk drivers? Answer: 0.049

Solution: Let X be the number of low-risk drivers. Let Y be the number of moderate-risk drivers. Let Z be the number of high-risk drivers. Then, (X, Y, Z) has a trinomial distribution with parameters n = 4, p_1 = 0.5, p_2 = 0.3 and p_3 = 0.2. The probability that the 4 policies contain at least two more high-risk drivers than low-risk drivers is

P[Z ≥ X + 2] = P[X = 0, Y = 0, Z = 4] + P[X = 0, Y = 1, Z = 3] + P[X = 0, Y = 2, Z = 2] + P[X = 1, Y = 0, Z = 3]
= (4!/(0! 0! 4!))(0.2)^4 + (4!/(0! 1! 3!))(0.3)(0.2)^3 + (4!/(0! 2! 2!))(0.3)^2(0.2)^2 + (4!/(1! 0! 3!))(0.5)(0.2)^3
= 0.0016 + 0.0096 + 0.0216 + 0.016 = 0.0488.
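The same answer is obtained by brute-force enumeration of all trinomial outcomes, which also guards against missing a case (a Python sketch):

    from math import factorial

    # Problem 9.25: sum the trinomial pmf over outcomes with z >= x + 2.
    def pmf(x, y, z):
        coef = factorial(4) // (factorial(x) * factorial(y) * factorial(z))
        return coef * 0.5**x * 0.3**y * 0.2**z

    total = sum(pmf(x, y, 4 - x - y)
                for x in range(5) for y in range(5 - x)
                if 4 - x - y >= x + 2)
    print(total)   # 0.0488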

9.7 Problems

1. Suppose that X and Y are independent random variables, X is uniformly distributed on (0, 1) and Y is uniformly distributed on (0, 1). Find P(X ≤ Y).

2. Let

f_{X,Y}(x, y) = 6x, 0 < x < y < 1; 0 otherwise.

Find the marginal densities of X and Y. Are X and Y independent random variables?

3. Let Y_1 and Y_2 be two jointly continuous random variables with joint density function

f(y_1, y_2) = c y_1 if y_1 ≥ 0, y_2 ≥ 0, y_1 + y_2 ≤ 1; 0 else.

(a) Find the value of c. (b) Find P(Y_1 ≤ 3/4, Y_2 ≤ 1/2). (c) Find P(Y_1 ≤ Y_2).

4. Let X and Y be two jointly continuous random variables with density

f(x, y) = x^2 + xy/3 if 0 < x < 1, 0 < y < 2; 0 else.

Find P{2X > Y}.

5. The joint probability density function of X and Y is given by

f(x, y) = (6/7)(x^2 + xy/2) if 0 < x < 1, 0 < y < 2; 0 else.

Are X and Y independent? Why?

6. Let Y_1 and Y_2 be two jointly continuous random variables with joint density function

f(y_1, y_2) = y_1 e^{-(y_1 + y_2)} if y_1 > 0, y_2 > 0; 0 else.

(a) Find the marginal probability density functions of Y_1 and Y_2. (b) Are Y_1 and Y_2 independent random variables? Justify your answer. (c) Find P(Y_1 + Y_2 ≤ 1). (d) Find P(Y_1 ≤ Y_2 ≤ 2).

7. The random variables Y_1 and Y_2 have joint density function

f(y_1, y_2) = c y_1 y_2, if 0 < y_1 < 1, 0 < y_2 < 2; 0 elsewhere.

Find c and P[Y_2 ≤ 2Y_1].

8. The joint probability density function of X and Y is given by

f(x, y) = 24xy if 0 < x < 1, 0 < y < 1 - x; 0 else.

Find the marginal densities of X and Y. Are X and Y independent random variables?

9. Let X and Y be two jointly continuous random variables with joint density

f_{X,Y}(x, y) = 6(1 - x - y) if x ≥ 0, y ≥ 0, x + y ≤ 1; 0 else.

Find the marginal densities. Are X and Y independent random variables?

10. Let Y_1 and Y_2 be two jointly continuous random variables with joint density function

f(y_1, y_2) = 24 y_1 y_2 if y_1 ≥ 0, y_2 ≥ 0, y_1 + y_2 ≤ 1; 0 else.

(a) Find the mean and the variance of Y_1 and Y_2. (b) Find the covariance and the correlation coefficient of Y_1 and Y_2. (c) Find the mean and the variance of U = 2 + 3Y_1 - 4Y_2. (d) Find Cov(2 + 3Y_1 - 4Y_2, 4 - 2Y_1 + 3Y_2).

11. The random variables X and Y have joint density function

f(x, y) = 3x/8 + 3y^2/8 if 0 < x < 2, 0 < y < 1; 0 otherwise.

Find P(X + Y ≤ 2).

12. The random variables X and Y have joint density function

f(x, y) = c(2x^2 + xy) if 0 < x < 1, 0 < y < 2; 0 otherwise.

Find c and the covariance of X and Y.

13. The random variables X and Y have joint density function

f(x, y) = c x^2 y if 0 < x < 1, 0 < y < 2; 0 otherwise.

Find P[2X < Y].

14. The random variables X and Y have joint density function

f(x, y) = x if 0 < x < 1, 0 < y < 2; 0 otherwise.

Find P[X + Y ≤ 1].

15. Let X and Y be continuous random variables with joint density function

f(x, y) = 1/x for 0 < y < x and 0 < x < 1; 0 else.

What is Var(X)?

16. Let X and Y have the pdf

f(x, y) = cx if -x ≤ y ≤ x, 0 ≤ x ≤ 1; 0 else.

Find c, the mean and the variance of X and Y, and the covariance of X and Y.

17. Let X and Y be two independent r.v.'s. Suppose that X is uniformly distributed in the interval (0, 2), and Y is an exponential r.v. with parameter λ = 1. Find Cov(X, Y).

18. If X and Y are independent random variables with

E[X] = 1, Var(X) = 2, E[Y] = 3, Var(Y) = 4,

find the variance of 3X - 2Y + 1.

19. Let X_1 and X_2 be two independent random variables, both with mean 1 and variance 4. Find E[(X_1 + 2)(X_2 - 1)].

20. Let X and Y be two independent random variables each having mean 0 and variance 2. Find the covariance of the random variables X and 2X - Y.

21. Let X and Y be two independent random variables each having mean 0 and variance 5. Find the variance of the random variable 2X - Y + 3.

22. Let Y_1 and Y_2 be two random variables satisfying

Var(Y_1) = 1 and Var(Y_2) = Var(2Y_1 - Y_2) = 4.

Find the covariance of Y_1 and Y_2.

23. Let X_1 and X_2 be two random variables. Suppose that E[X_1] = 1, E[X_2] = 2, Var(X_1) = 2, Var(X_2) = 8 and the correlation coefficient of X_1 and X_2 is 1/2. Find the mean and the variance of Y = 3 + X_1 - 2X_2.

24. The joint probability mass function of X and Y is given by p(1, 1) = 1/8, p(1, 2) = 1/4, p(2, 1) = 1/8, p(2, 2) = 1/2. Find the cumulative distribution function of the random variable X.

25. A jar contains 25 pieces of candy, of which 10 are chocolate, 8 are mint and 7 are latte. A group of 5 pieces of candy is chosen at random. Let X equal the number of chocolate candies in the random sample of 5. Let Y equal the number of latte candies in the random sample of 5. Find the joint probability mass function of X and Y.

9.7. PROBLEMS. 7 9. Let X and X 2 be two independent random variables, both with mean and variance 4. Find E[(X 2 + 2)(X 2 )]. 2. Let X and Y be two independent random variables each having mean and variance 2. Find the covariance of the random variables X and 2X Y. 2. Let X and Y be two independent random variables each having mean and variance 5. Find the variance of the random variable 2X Y + 3. 22. Let Y and Y 2 be two random variables satisfying that Find the covariance of Y and Y 2. Var(Y ) = and Var(Y 2 ) = Var(2Y Y 2 ) = 4. 23. Let X and X 2 be 2 random variables. Suppose that each E[X ] =, E[X 2 ] = 2, Var(X ) = 2, Var(X 2 ) = 8 and the correlation coefficient of X and X 2 is /2. Find the mean and the variance of Y = 3 + X 2X 2. 24. The joint probability mass function of X and Y is given by p(, ) = /8, p(, 2) = /4, p(, ) = /8, p(2, ) = /8, p(2, 2) = /2. Find the cumulative distribution function of the random variable X. 25. A jar contains 25 pieces of candy, of which are chocolate, 8 are mint and 7 are latte. A group of 5 pieces of candy are chosen at random. Let X equal the number of chocolate candies in the random sample of 5. Let Y equal the number of latte candies in the random sample of 5. Find the joint probability mass function of X and Y.