Notes for Math 324, Part 20

Notes for Math 324, Part 20

Chapter 20

Conditional expectations, variances, etc.

20.1 Conditional probability

Given two events A and B, the conditional probability of A given B is defined by

P[A | B] = P[A ∩ B] / P[B].

P[A | B] is the probability of A assuming that B has happened. To find a conditional probability, we restrict the sample space to B. Given a r.v. X and A, B ⊆ R,

P[X ∈ A | X ∈ B] = P[X ∈ A ∩ B] / P[X ∈ B].

Problem 20.1. (# 7, November) The loss due to a fire in a commercial building is modeled by a random variable X with density function

f(x) = 0.005(20 − x) for 0 < x < 20, and 0 otherwise.

Given that a fire loss exceeds 8, what is the probability that it exceeds 16? Answer: 1/9.

Solution: We have to find

P[X > 16 | X > 8] = P[X > 16] / P[X > 8].

Now,

P[X > 16] = ∫_16^20 0.005(20 − x) dx = 0.0025(20 − 16)^2 = 0.04,

and

P[X > 8] = ∫_8^20 0.005(20 − x) dx = 0.0025(20 − 8)^2 = 0.36.

So, P[X > 16 | X > 8] = 0.04/0.36 = 1/9.
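The fire-loss computation above can be verified with a short script (a sketch, not part of the original notes): integrating the density gives the tail P[X > t] = 0.0025(20 − t)^2, and the conditional probability is a ratio of tails.

```python
# Check of the fire-loss problem: f(x) = 0.005*(20 - x) on (0, 20).
# Integrating f from t to 20 gives the tail P[X > t] = 0.0025*(20 - t)**2.
def tail(t):
    return 0.0025 * (20 - t) ** 2

cond = tail(16) / tail(8)   # P[X > 16 | X > 8]
print(round(cond, 4))       # 0.1111, i.e. 1/9
```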

Problem 20.2. (# 39, May) An insurance company insures a large number of homes. The insured value, X, of a randomly selected home is assumed to follow a distribution with density function

f(x) = 3x^(−4) for x > 1, and 0 otherwise.

Given that a randomly selected home is insured for at least 1.5, what is the probability that it is insured for less than 2? Answer: 0.578.

Solution: We have to find

P[X < 2 | X ≥ 1.5] = P[1.5 ≤ X < 2] / P[X ≥ 1.5].

Now,

P[1.5 ≤ X < 2] = ∫_1.5^2 3x^(−4) dx = (1.5)^(−3) − (2)^(−3),

and

P[X ≥ 1.5] = ∫_1.5^∞ 3x^(−4) dx = (1.5)^(−3).

So,

P[X < 2 | X ≥ 1.5] = 1 − (1.5/2)^3 = 1 − 0.421875 = 0.578.

Problem 20.3. (# 7, November) A group insurance policy covers the medical claims of the employees of a small company. The value, V, of the claims made in one year is described by V = 100,000 Y, where Y is a random variable with density function

f(y) = k(1 − y)^4 for 0 < y < 1, and 0 otherwise,

where k is a constant. What is the conditional probability that V exceeds 40,000, given that V exceeds 10,000? Answer: 0.132.

Solution: To make f a density, k = 5. We have to find

P[V ≥ 40,000 | V ≥ 10,000] = P[100,000 Y ≥ 40,000 | 100,000 Y ≥ 10,000] = P[Y ≥ 0.4 | Y ≥ 0.1] = P[Y ≥ 0.4] / P[Y ≥ 0.1].

Now,

P[Y ≥ 0.4] = ∫_0.4^1 5(1 − y)^4 dy = (1 − 0.4)^5 = (0.6)^5,

and

P[Y ≥ 0.1] = ∫_0.1^1 5(1 − y)^4 dy = (0.9)^5.

So,

P[V ≥ 40,000 | V ≥ 10,000] = (0.6)^5/(0.9)^5 = 32/243 = 0.132.
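The group-policy computation above can also be checked numerically (a sketch, not from the original notes): with k = 5 the tail of Y is P[Y > t] = (1 − t)^5, so the answer is a ratio of two tails.

```python
# Check of the group-policy problem: f(y) = 5*(1 - y)**4 on (0, 1),
# so the tail is P[Y > t] = (1 - t)**5.
def tail(t):
    return (1 - t) ** 5

cond = tail(0.4) / tail(0.1)   # P[V > 40,000 | V > 10,000], with V = 100,000 Y
print(round(cond, 4))          # 0.1317, i.e. 32/243
```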

Problem 20.4. (# 3, November) The number of injury claims per month is modeled by a random variable N with

P[N = n] = 1/((n + 1)(n + 2)), where n ≥ 0.

Determine the probability of at least one claim during a particular month, given that there have been at most four claims during that month. Answer: 2/5.

Solution: We have to find

P[N ≥ 1 | N ≤ 4] = P[1 ≤ N ≤ 4] / P[N ≤ 4].

Since 1/((n + 1)(n + 2)) = 1/(n + 1) − 1/(n + 2), the sums telescope. Now,

P[1 ≤ N ≤ 4] = P[N = 1] + P[N = 2] + P[N = 3] + P[N = 4] = 1/2 − 1/6 = 1/3,

and

P[N ≤ 4] = P[N = 0] + P[1 ≤ N ≤ 4] = 1 − 1/6 = 5/6.

So, P[N ≥ 1 | N ≤ 4] = (1/3)/(5/6) = 2/5.

We also can define the conditional expectation and the conditional variance given that X ∈ B. For a discrete r.v. X, the conditional pmf of X given X ∈ B is

p_{X | X ∈ B}(x) = p(x)/P[X ∈ B] if x ∈ B, and 0 otherwise.

Hence, the conditional expectation is defined by

E[X | X ∈ B] = Σ_x x p_{X | X ∈ B}(x) = Σ_{x ∈ B} x p(x) / P[X ∈ B].

For a continuous r.v. X, the conditional density of X given X ∈ B is

f_{X | X ∈ B}(x) = f(x)/P[X ∈ B] if x ∈ B, and 0 otherwise.

The conditional expectation of X given X ∈ B is

E[X | X ∈ B] = ∫_B x f(x)/P[X ∈ B] dx = ∫_B x f(x) dx / ∫_B f(x) dx.

E[X | X ∈ B] is the average value of X assuming that B happened. We restrict the sample space to B. To find the conditional expectation given X ∈ B, we do the usual expectation over B and we divide by P[X ∈ B].

Problem 20.5. (# 9, Sample Test) The distribution of loss due to fire damage to a warehouse is:

Amount of Loss   Probability
0                0.900
500              0.060
1,000            0.030
10,000           0.008
50,000           0.001
100,000          0.001

Given that a loss is greater than zero, calculate the expected amount of the loss. Answer: 2,900.

Solution: We need to find E[X | X > 0]. We condition on X > 0. Since P[X > 0] = 1 − P[X = 0] = 0.1, the conditional pmf p_{X | X > 0}(x) = P[X = x]/P[X > 0] takes the values 0.6, 0.3, 0.08, 0.01 and 0.01 at x = 500, 1,000, 10,000, 50,000 and 100,000, respectively. Finally,

E[X | X > 0] = Σ_{x > 0} x p_{X | X > 0}(x) = (500)(0.6) + (1,000)(0.3) + (10,000)(0.08) + (50,000)(0.01) + (100,000)(0.01) = 2,900.

20.2 Conditional probability mass functions and conditional densities

For two r.v.'s we can have conditional probabilities in the usual way. Given two r.v.'s X and Y and A, B ⊆ R,

P[X ∈ A | Y ∈ B] = P[X ∈ A, Y ∈ B] / P[Y ∈ B].

Example 20.1. The joint probability density function of X and Y is given by

f(x, y) = 2x for 0 < x < 1, 0 < y < 1.

Find P[Y > 1/2 | X ≤ 1/2].

Solution: By definition of conditional probability,

P[Y > 1/2 | X ≤ 1/2] = P[Y > 1/2, X ≤ 1/2] / P[X ≤ 1/2] = (∫_0^{1/2} ∫_{1/2}^1 2x dy dx) / (∫_0^{1/2} ∫_0^1 2x dy dx) = (1/8)/(1/4) = 1/2.

Given two discrete r.v.'s X and Y, the conditional mass function of X given Y = y is defined by

p_{X|Y}(x | y) = P[X = x | Y = y] = p_{X,Y}(x, y)/p_Y(y), if p_Y(y) > 0.
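The warehouse computation above can be checked with a short script (a sketch, not part of the original notes): put the loss table in a dictionary, condition on a positive loss, and average.

```python
# Check of the warehouse-loss problem: condition the table on X > 0 and average.
dist = {0: 0.900, 500: 0.060, 1000: 0.030, 10000: 0.008,
        50000: 0.001, 100000: 0.001}

p_pos = sum(p for x, p in dist.items() if x > 0)                # P[X > 0] = 0.1
e_cond = sum(x * p for x, p in dist.items() if x > 0) / p_pos   # E[X | X > 0]
print(round(e_cond))                                            # 2900
```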

Similarly, the conditional mass function of Y given X = x is defined by

p_{Y|X}(y | x) = P[Y = y | X = x] = p_{X,Y}(x, y)/p_X(x), if p_X(x) > 0.

From these formulas, it follows that

p_{X,Y}(x, y) = p_X(x) p_{Y|X}(y | x) = p_Y(y) p_{X|Y}(x | y).

If p_Y(y) = 0, p_{X|Y}(x | y) is not defined. Since p_Y(y) = 0, Y = y cannot happen. We can find probabilities and expectations assuming that something happens only if the probability of that something is positive.

Given two jointly continuous r.v.'s X and Y, the conditional density of X given Y = y is defined by

f_{X|Y}(x | y) = f_{X,Y}(x, y)/f_Y(y), if f_Y(y) > 0.

If f_Y(y) = 0, f_{X|Y}(x | y) is not defined. Similarly, the conditional density of Y given X = x is defined by

f_{Y|X}(y | x) = f_{X,Y}(x, y)/f_X(x), if f_X(x) > 0.

From these formulas, it follows that

f_{X,Y}(x, y) = f_X(x) f_{Y|X}(y | x) = f_Y(y) f_{X|Y}(x | y).

Example 20.2. Let X and Y be discrete random variables with joint probability function

p(x, y) = (x + 1)(y + 2)/54 for x = 0, 1, 2; y = 0, 1, 2, and 0 elsewhere.

Find the conditional pmf of Y given X = 1.

Solution: We have that

p_X(1) = P[X = 1] = p(1, 0) + p(1, 1) + p(1, 2) = (4 + 6 + 8)/54 = 18/54 = 1/3.

The conditional pmf of Y given X = 1 is

p_{Y|X}(0 | 1) = p_{X,Y}(1, 0)/p_X(1) = (4/54)/(18/54) = 4/18,
p_{Y|X}(1 | 1) = p_{X,Y}(1, 1)/p_X(1) = (6/54)/(18/54) = 6/18,
p_{Y|X}(2 | 1) = p_{X,Y}(1, 2)/p_X(1) = (8/54)/(18/54) = 8/18.

Example 20.3. Let X and Y have the pdf

f(x, y) = 6(y − x) if 0 ≤ x ≤ y ≤ 1, and 0 elsewhere.

Find the marginal pdf's, and find the conditional pdf's f_{Y|X}(y | x) and f_{X|Y}(x | y).

Solution: The marginal density of X is

f_X(x) = ∫_x^1 6(y − x) dy = 3(y − x)^2 |_x^1 = 3(1 − x)^2 for 0 < x < 1.

The conditional density of Y given X = x is

f_{Y|X}(y | x) = f_{X,Y}(x, y)/f_X(x) = 6(y − x)/(3(1 − x)^2) = 2(y − x)/(1 − x)^2 for x ≤ y ≤ 1.

The marginal density of Y is

f_Y(y) = ∫_0^y 6(y − x) dx = (6xy − 3x^2) |_0^y = 6y^2 − 3y^2 = 3y^2 for 0 < y < 1.

The conditional density of X given Y = y is

f_{X|Y}(x | y) = f_{X,Y}(x, y)/f_Y(y) = 6(y − x)/(3y^2) = 2(y − x)/y^2 for 0 ≤ x ≤ y.

Figure 20.1: # 3, May

Problem 20.6. (# 3, May) An insurance policy is written to cover a loss X where X has density function

f(x) = (3/8)x^2 for 0 ≤ x ≤ 2, and 0 otherwise.

The time (in hours) to process a claim of size x, where 0 ≤ x ≤ 2, is uniformly distributed on the interval from x to 2x. Calculate the probability that a randomly chosen claim on this policy is processed in three hours or more. Answer: 0.17.

Solution: Let Y be the time to process a claim. The conditional distribution of Y given X = x is

f_{Y|X}(y | x) = 1/x if x < y < 2x.

The joint density of X and Y is

f_{X,Y}(x, y) = f_{Y|X}(y | x) f_X(x) = (1/x)(3/8)x^2 = (3/8)x if x < y < 2x, 0 < x < 2.

Hence, since y < 2x ≤ 4,

P[Y ≥ 3] = ∫_3^4 ∫_{y/2}^2 (3/8)x dx dy = ∫_3^4 (3/4 − (3/64)y^2) dy = 3/4 − 37/64 = 11/64 = 0.17.

Figure 20.2: # 34, November

Problem 20.7. (# 34, November) An auto insurance policy will pay for damage to both the policyholder's car and the other driver's car in the event that the policyholder is responsible for an accident. The size of the payment for damage to the policyholder's car, X, has a marginal density function of 1 for 0 < x < 1. Given X = x, the size of the payment for damage to the other driver's car, Y, has conditional density of 1 for x < y < x + 1. If the policyholder is responsible for an accident, what is the probability that the payment for damage to the other driver's car will be greater than 0.5? Answer: 7/8.

Solution: The marginal density of X is f_X(x) = 1 for 0 ≤ x ≤ 1. The conditional density of Y given X = x is f_{Y|X}(y | x) = 1 if 0 < x < 1 and x < y < x + 1. So, the joint density of X and Y is

f_{X,Y}(x, y) = f_X(x) f_{Y|X}(y | x) = 1 if 0 < x < 1 and x < y < x + 1.

So,

P[Y ≥ 1/2] = 1 − P[Y < 1/2] = 1 − ∫_0^{1/2} ∫_x^{1/2} dy dx = 1 − ∫_0^{1/2} (1/2 − x) dx = 1 − 1/8 = 7/8.

20.3 Conditional expectations

Given two discrete r.v.'s X and Y and A ⊆ R, the conditional probability that X ∈ A given Y = y is defined by

P[X ∈ A | Y = y] = Σ_{x ∈ A} p_{X|Y}(x | y).

Similarly, the conditional probability that Y ∈ A given X = x is defined by

P[Y ∈ A | X = x] = Σ_{y ∈ A} p_{Y|X}(y | x).

Given two jointly continuous r.v.'s X and Y, and A ⊆ R, the conditional probability that X ∈ A given Y = y is defined by

P[X ∈ A | Y = y] = ∫_A f_{X|Y}(x | y) dx.

Similarly, the conditional probability that Y ∈ A given X = x is defined by

P[Y ∈ A | X = x] = ∫_A f_{Y|X}(y | x) dy.

Problem 20.8. (#, May) A company offers a basic life insurance policy to its employees, as well as a supplemental life insurance policy. To purchase the supplemental policy, an employee must first purchase the basic policy. Let X denote the proportion of employees who purchase the basic policy, and Y the proportion of employees who purchase the supplemental policy. Let X and Y have the joint density function f(x, y) = 2(x + y) on the region where the density is positive. Given that 10% of the employees buy the basic policy, what is the probability that fewer than 5% buy the supplemental policy? Answer: 0.4167.

Solution: The joint density of X and Y is f_{X,Y}(x, y) = 2(x + y), for 0 ≤ y ≤ x ≤ 1. The marginal density of X is

f_X(x) = ∫_0^x 2(x + y) dy = 2x^2 + x^2 = 3x^2.

So, the conditional density of Y given X = x is

f_{Y|X}(y | x) = f_{X,Y}(x, y)/f_X(x) = 2(x + y)/(3x^2) for 0 ≤ y ≤ x.

For x = 0.10, we have

f_{Y|X}(y | 0.10) = 2(0.10 + y)/0.03 = 20/3 + (200/3)y for 0 ≤ y ≤ 0.10.

So, the conditional probability we are looking for is

P[Y < 0.05 | X = 0.10] = ∫_0^{0.05} (20/3 + (200/3)y) dy = 1/3 + 1/12 = 5/12 = 0.4167.

Problem 20.9. (# 5, November) Once a fire is reported to a fire insurance company, the company makes an initial estimate, X, of the amount it will pay to the claimant for the fire loss. When the claim is finally settled, the company pays an amount, Y, to the claimant. The company has determined that X and Y have the joint density function

f(x, y) = (2/(x^2(x − 1))) y^(−(2x−1)/(x−1)), x > 1, y > 1.

Given that the initial claim estimated by the company is 2, determine the probability that the final settlement amount is between 1 and 3. Answer: 8/9.

Solution: We need to find P[1 ≤ Y ≤ 3 | X = 2]. The marginal density of X is

f_X(x) = ∫_1^∞ f_{X,Y}(x, y) dy = (2/(x^2(x − 1))) ((x − 1)/x) = 2/x^3 for x > 1,

since ∫_1^∞ y^(−(2x−1)/(x−1)) dy = (x − 1)/x. The conditional density of Y given X = x is

f_{Y|X}(y | x) = f_{X,Y}(x, y)/f_X(x) = (x/(x − 1)) y^(−(2x−1)/(x−1)) for y > 1.

For x = 2, we have

f_{Y|X}(y | 2) = 2y^(−3) for y > 1.

The conditional probability we are looking for is

P[1 ≤ Y ≤ 3 | X = 2] = ∫_1^3 f_{Y|X}(y | 2) dy = ∫_1^3 2y^(−3) dy = 1 − 1/9 = 8/9.

Given two discrete r.v.'s X and Y, the conditional expectation of X given Y = y is defined by

E[X | Y = y] = Σ_x x p_{X|Y}(x | y).

The conditional expectation of Y given X = x is defined by

E[Y | X = x] = Σ_y y p_{Y|X}(y | x).

Given two jointly continuous r.v.'s X and Y, the conditional expectation of X given Y = y is defined by

E[X | Y = y] = ∫ x f_{X|Y}(x | y) dx.

The conditional expectation of Y given X = x is defined by

E[Y | X = x] = ∫ y f_{Y|X}(y | x) dy.
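The fire-settlement problem above can be checked numerically (a sketch, not from the original notes): given X = 2 the conditional density 2y^(−3) has cdf F(t) = 1 − t^(−2) on t ≥ 1.

```python
# Check of the fire-settlement problem: f(y | 2) = 2*y**(-3) on y > 1,
# so the conditional cdf is F(t) = 1 - t**(-2) for t >= 1.
def cdf(t):
    return 1 - t ** (-2)

prob = cdf(3) - cdf(1)   # P[1 <= Y <= 3 | X = 2]
print(prob)              # 0.888..., i.e. 8/9
```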

Example 20.4. Let X and Y be discrete random variables with joint probability function

p(x, y) = (x + 1)(y + 2)/54 for x = 0, 1, 2; y = 0, 1, 2, and 0 elsewhere.

Find E[Y | X = 1].

Solution: By a previous example, the conditional pmf of Y given X = 1 is p_{Y|X}(0 | 1) = 4/18, p_{Y|X}(1 | 1) = 6/18, p_{Y|X}(2 | 1) = 8/18. So,

E[Y | X = 1] = (0)(4/18) + (1)(6/18) + (2)(8/18) = 22/18 = 11/9.

Example 20.5. Let X and Y have the pdf

f(x, y) = 6(y − x) if 0 ≤ x ≤ y ≤ 1, and 0 elsewhere.

Find E[X | Y = y] and E[Y | X = x].

Solution: By a previous example,

f_{Y|X}(y | x) = 6(y − x)/(3(1 − x)^2) = 2(y − x)/(1 − x)^2 for x ≤ y ≤ 1.

So,

E[Y | X = x] = ∫_x^1 y (2(y − x)/(1 − x)^2) dy = (2/3 − x + x^3/3)/(1 − x)^2 = (x^3 − 3x + 2)/(3(1 − x)^2) = (x + 2)/3,

since x^3 − 3x + 2 = (1 − x)^2 (x + 2). The conditional density of X given Y = y is

f_{X|Y}(x | y) = 2(y − x)/y^2 for 0 ≤ x ≤ y.

So,

E[X | Y = y] = ∫_0^y x (2(y − x)/y^2) dx = (2/y^2)(y^3/2 − y^3/3) = y/3.

Problem 20.10. (# 37, Sample Test) An insurance contract reimburses a family's automobile accident losses up to a maximum of two accidents per year. The joint probability distribution for the number of accidents of a three person family (X, Y, Z) is

p(x, y, z) = k(x + y + z), for x = 0, 1; y = 0, 1, 2; z = 0, 1, 2,

where x, y, z are the numbers of accidents incurred by X, Y and Z, respectively. Determine the expected number of unreimbursed accident losses given that X is not involved in any accidents. Answer: 7/9.

Solution: First, we find

P[X = 0] = Σ_{y=0}^{2} Σ_{z=0}^{2} p(0, y, z) = Σ_{y,z} k(y + z) = 18k.

Conditioning on X = 0, the conditional probabilities P[Y = y, Z = z | X = 0] = p(0, y, z)/P[X = 0] = (y + z)/18 are:

(y, z)                     (0,0) (0,1) (0,2) (1,0) (1,1) (1,2) (2,0) (2,1) (2,2)
P[Y = y, Z = z | X = 0]    0/18  1/18  2/18  1/18  2/18  3/18  2/18  3/18  4/18

The number of unreimbursed accident losses is max(X + Y + Z − 2, 0). Assuming that X = 0, the number of unreimbursed accident losses is max(Y + Z − 2, 0). So,

E[max(Y + Z − 2, 0) | X = 0] = (1) P[Y + Z = 3 | X = 0] + (2) P[Y + Z = 4 | X = 0] = (1)(6/18) + (2)(4/18) = 14/18 = 7/9.

20.4 Conditional variances

Given two discrete r.v.'s X and Y, the conditional variance of X given Y = y is defined by

Var(X | Y = y) = E[(X − E[X | Y = y])^2 | Y = y] = Σ_x (x − E[X | Y = y])^2 p_{X|Y}(x | y).

It can be shown that

Var(X | Y = y) = E[X^2 | Y = y] − (E[X | Y = y])^2 = Σ_x x^2 p_{X|Y}(x | y) − (Σ_x x p_{X|Y}(x | y))^2.

Similarly, the conditional variance of Y given X = x is defined by

Var(Y | X = x) = E[(Y − E[Y | X = x])^2 | X = x] = Σ_y (y − E[Y | X = x])^2 p_{Y|X}(y | x).

It can be shown that

Var(Y | X = x) = E[Y^2 | X = x] − (E[Y | X = x])^2 = Σ_y y^2 p_{Y|X}(y | x) − (Σ_y y p_{Y|X}(y | x))^2.

Given two jointly continuous r.v.'s X and Y, the conditional variance of X given Y = y is defined by

Var(X | Y = y) = E[(X − E[X | Y = y])^2 | Y = y] = ∫ (x − E[X | Y = y])^2 f_{X|Y}(x | y) dx.

Usually, it is easier to use

Var(X | Y = y) = E[X^2 | Y = y] − (E[X | Y = y])^2 = ∫ x^2 f_{X|Y}(x | y) dx − (∫ x f_{X|Y}(x | y) dx)^2.

Similarly, the conditional variance of Y given X = x is defined by

Var(Y | X = x) = E[(Y − E[Y | X = x])^2 | X = x] = ∫ (y − E[Y | X = x])^2 f_{Y|X}(y | x) dy.

Usually, it is easier to use

Var(Y | X = x) = E[Y^2 | X = x] − (E[Y | X = x])^2 = ∫ y^2 f_{Y|X}(y | x) dy − (∫ y f_{Y|X}(y | x) dy)^2.
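The shortcut Var(Y | X = x) = E[Y^2 | X = x] − (E[Y | X = x])^2 can be illustrated with exact arithmetic (a sketch, not from the original notes), using the conditional pmf 4/18, 6/18, 8/18 found in Example 20.2:

```python
from fractions import Fraction

# Conditional pmf of Y given X = 1 from Example 20.2: p(y) = 4/18, 6/18, 8/18.
pmf = {0: Fraction(4, 18), 1: Fraction(6, 18), 2: Fraction(8, 18)}

m1 = sum(y * p for y, p in pmf.items())        # E[Y | X = 1]   = 11/9
m2 = sum(y * y * p for y, p in pmf.items())    # E[Y^2 | X = 1] = 19/9
print(m2 - m1 ** 2)                            # 50/81
```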

Problem 20.11. (# 4, May) The stock prices of two companies at the end of any given year are modeled with random variables X and Y that follow a distribution with joint density function

f(x, y) = 2x for 0 < x < 1, x < y < x + 1, and 0 otherwise.

What is the conditional variance of Y given that X = x? Answer: 1/12.

Solution:

f_X(x) = ∫_x^{x+1} 2x dy = 2x, for 0 < x < 1.

The conditional density of Y given X = x is f_{Y|X}(y | x) = 1 for x < y < x + 1. This is a uniform distribution on the interval (x, x + 1). So, its variance is 1/12.

Problem 20.12. (# 4, November) A diagnostic test for the presence of a disease has two possible outcomes: 1 for disease present and 0 for disease not present. Let X denote the disease state of a patient, and let Y denote the outcome of the diagnostic test. The joint probability function of X and Y is given by:

P[X = 0, Y = 0] = 0.800, P[X = 1, Y = 0] = 0.050,
P[X = 0, Y = 1] = 0.025, P[X = 1, Y = 1] = 0.125.

Calculate Var(Y | X = 1). Answer: 0.2041.

Solution: First, we find the conditional pmf of Y given X = 1. We have that P[X = 1] = 0.050 + 0.125 = 0.175. So,

P[Y = 0 | X = 1] = 0.050/0.175 = 2/7 and P[Y = 1 | X = 1] = 0.125/0.175 = 5/7.

Now,

E[Y | X = 1] = (0)(2/7) + (1)(5/7) = 5/7, E[Y^2 | X = 1] = (0)(2/7) + (1)(5/7) = 5/7,

and

Var(Y | X = 1) = 5/7 − (5/7)^2 = 10/49 = 0.2041.

Problem 20.13. (#, May) An actuary determines that the annual numbers of tornadoes in counties P and Q are jointly distributed, with

P[Q = q, P = 0] = 0.12, 0.06, 0.05, 0.02 for q = 0, 1, 2, 3 tornadoes in county Q.

Calculate the conditional variance of the annual number of tornadoes in county Q, given that there are no tornadoes in county P. Answer: 0.9856.

Solution: Since P[P = 0] = 0.12 + 0.06 + 0.05 + 0.02 = 0.25, the conditional pmf of Q given P = 0 is

q                   0     1     2     3
P[Q = q | P = 0]    0.48  0.24  0.20  0.08

So,

E[Q | P = 0] = (0)(0.48) + (1)(0.24) + (2)(0.20) + (3)(0.08) = 0.88,
E[Q^2 | P = 0] = (0)(0.48) + (1)(0.24) + (4)(0.20) + (9)(0.08) = 1.76,

and

Var(Q | P = 0) = E[Q^2 | P = 0] − (E[Q | P = 0])^2 = 1.76 − (0.88)^2 = 0.9856.

20.5 Further properties of the conditional expectations

Given two r.v.'s X and Y and a function g : R^2 → R, we may define E[g(X, Y) | X = x]. We have the following properties of the conditional expectation:

Theorem 20.1. Let X and Y be two r.v.'s, let g_1, g_2 : R^2 → R and let h : R → R. Then,

E[a g_1(X, Y) + b g_2(X, Y) | X = x] = a E[g_1(X, Y) | X = x] + b E[g_2(X, Y) | X = x],

and

E[a g_1(X, Y) + b g_2(X, Y) | Y = y] = a E[g_1(X, Y) | Y = y] + b E[g_2(X, Y) | Y = y].

Also, E[h(X) | X = x] = h(x) and E[h(Y) | Y = y] = h(y). If X and Y are independent, then E[h(Y) | X = x] = E[h(Y)] and E[h(X) | Y = y] = E[h(X)].

Given two r.v.'s X and Y, E[X | Y = y] is a function of y. So, we can find the expectation and the variance of E[X | Y = y].

Theorem 20.2. (Double expectation theorem for expectations) Given two r.v.'s X and Y,

E[E[X | Y = y]] = E[X] and E[E[Y | X = x]] = E[Y].

Proof. We only do the continuous case. E[X | Y = y] is a function of y. So,

E[E[X | Y = y]] = ∫ E[X | Y = y] f_Y(y) dy.

By definition, E[X | Y = y] = ∫ x f_{X|Y}(x | y) dx. Hence,

E[E[X | Y = y]] = ∫ ∫ x f_{X|Y}(x | y) f_Y(y) dx dy.

By definition of conditional density, f_Y(y) f_{X|Y}(x | y) = f_{X,Y}(x, y). So,

E[E[X | Y = y]] = ∫ ∫ x f_{X,Y}(x, y) dx dy = E[X]. Q.E.D.
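The double expectation theorem can be illustrated with exact arithmetic on a small, hypothetical joint pmf (the table below is an assumption for illustration, not from the notes): averaging E[X | Y = y] over the distribution of Y recovers E[X].

```python
from fractions import Fraction as F

# A small made-up joint pmf on {0,1} x {0,1}; probabilities sum to 1.
joint = {(0, 0): F(1, 8), (0, 1): F(1, 4), (1, 0): F(3, 8), (1, 1): F(1, 4)}

p_y = {}
for (x, y), p in joint.items():
    p_y[y] = p_y.get(y, F(0)) + p

# E[X | Y = y] for each y, then its average over the distribution of Y.
e_x_given_y = {y: sum(x * p for (x, yy), p in joint.items() if yy == y) / p_y[y]
               for y in p_y}
lhs = sum(e_x_given_y[y] * p_y[y] for y in p_y)   # E[E[X | Y = y]]
rhs = sum(x * p for (x, _), p in joint.items())   # E[X]
print(lhs == rhs, lhs)                            # True 5/8
```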

Theorem 20.3. (Double expectation theorem for variances) Given two r.v.'s X and Y,

Var(X) = Var(E[X | Y = y]) + E[Var(X | Y = y)] and Var(Y) = Var(E[Y | X = x]) + E[Var(Y | X = x)].

Problem 20.14. (# 4, Sample Test) An automobile insurance company divides its policyholders into two groups: good drivers and bad drivers. For the good drivers, the amount of an average claim is 1,400, with a variance of 40,000. For the bad drivers, the amount of an average claim is 2,000, with a variance of 250,000. Sixty percent of the policyholders are classified as good drivers. Calculate the variance of the amount of a claim for a policyholder. Answer: 210,400.

Solution: Let X be the amount of a claim. Let Y = 1 if the driver is good and Y = 0 if the driver is bad. We know that P[Y = 1] = 0.6, P[Y = 0] = 0.4,

E[X | Y = 1] = 1,400, Var(X | Y = 1) = 40,000, E[X | Y = 0] = 2,000, Var(X | Y = 0) = 250,000.

We use the formula of the double expectation theorem for the variance, Var(X) = E[Var(X | Y)] + Var(E[X | Y]). So,

E[Var(X | Y)] = P[Y = 1] Var(X | Y = 1) + P[Y = 0] Var(X | Y = 0) = (0.6)(40,000) + (0.4)(250,000) = 124,000.

Next,

E[E[X | Y]] = P[Y = 1] E[X | Y = 1] + P[Y = 0] E[X | Y = 0] = (1,400)(0.6) + (2,000)(0.4) = 1,640,

and

E[(E[X | Y])^2] = P[Y = 1] (E[X | Y = 1])^2 + P[Y = 0] (E[X | Y = 0])^2 = (1,400)^2(0.6) + (2,000)^2(0.4) = 2,776,000.

So,

Var(E[X | Y]) = E[(E[X | Y])^2] − (E[E[X | Y]])^2 = 2,776,000 − (1,640)^2 = 86,400.

Finally,

Var(X) = E[Var(X | Y)] + Var(E[X | Y]) = 124,000 + 86,400 = 210,400.

Problem 20.15. (#, May) Let X and Y denote the values of two stocks at the end of a five year period. X is uniformly distributed on the interval (0, 12). Given X = x, Y is uniformly distributed on the interval (0, x). Determine Cov(X, Y) according to this model. Answer: 6.

Solution 1: Since X is uniformly distributed on the interval (0, 12), E[X] = 6. Y given X = x has a uniform distribution on the interval (0, x), so E[Y | X = x] = x/2. So,

E[Y] = E[E[Y | X = x]] = E[X/2] = 3,

and

E[XY] = E[E[XY | X = x]] = E[X E[Y | X = x]] = E[X^2/2] = ∫_0^{12} (x^2/2)(1/12) dx = 1728/72 = 24.

Finally, Cov(X, Y) = E[XY] − E[X]E[Y] = 24 − (6)(3) = 6.

Solution 2: Since X is uniformly distributed on the interval (0, 12), f_X(x) = 1/12 for 0 ≤ x ≤ 12. The conditional density of Y given X = x is f_{Y|X}(y | x) = 1/x for y ∈ (0, x). So, the joint density is

f_{X,Y}(x, y) = f_X(x) f_{Y|X}(y | x) = 1/(12x), for 0 ≤ y ≤ x ≤ 12.

So,

E[X] = ∫_0^{12} ∫_0^x (x/(12x)) dy dx = ∫_0^{12} (x/12) dx = 6,
E[Y] = ∫_0^{12} ∫_0^x (y/(12x)) dy dx = ∫_0^{12} (x/24) dx = 3,
E[XY] = ∫_0^{12} ∫_0^x (xy/(12x)) dy dx = ∫_0^{12} (x^2/24) dx = 24.

Finally, Cov(X, Y) = E[XY] − E[X]E[Y] = 24 − (6)(3) = 6.

Problem 20.16. (#, May) Two life insurance policies, each with a death benefit of 10,000 and a one-time premium of 500, are sold to a couple, one for each person. The policies will expire at the end of the tenth year. The probability that only the wife will survive at least ten years is 0.025, the probability that only the husband will survive at least ten years is 0.01, and the probability that both of them will survive at least ten years is 0.96. What is the expected excess of premiums over claims, given that the husband survives at least ten years? Answer: 897.

Solution: Let Y_1 = 1 if the husband survives at least ten years and Y_1 = 0 if he dies before ten years, and let Y_2 = 1 if the wife survives at least ten years and Y_2 = 0 if she dies before ten years. We know that

P[Y_1 = 1, Y_2 = 0] = 0.01, P[Y_1 = 0, Y_2 = 1] = 0.025, P[Y_1 = 1, Y_2 = 1] = 0.96.

Given that the husband survives, the total premium collected is 1,000 and the claims are 10,000 if the wife dies. So the expected excess of premiums over claims, given that the husband survives at least ten years, is

E[1,000 − 10,000(1 − Y_2) | Y_1 = 1] = 1,000 − 10,000 P[Y_2 = 0 | Y_1 = 1] = 1,000 − 10,000 (0.01/0.97) = 896.9,

because

P[Y_2 = 0 | Y_1 = 1] = P[Y_1 = 1, Y_2 = 0] / P[Y_1 = 1] = 0.01/0.97.

Problem 20.17. (# 9, Sample Test) An insurance company designates 10% of its customers as high risk and 90% as low risk. The number of claims made by a customer in a calendar year is Poisson distributed with mean θ and is independent of the number of claims made by the customer in the previous calendar year. For high risk customers θ = 0.6, while for low risk customers θ = 0.1. Calculate the expected number of claims made in calendar year 1998 by a customer who made one claim in calendar year 1997. Answer: 0.24.

Solution: Let Y = 1 if the customer is high risk and Y = 0 if the customer is low risk. Let X_1 be the number of claims in 1997 and let X_2 be the number of claims in 1998. We have to find

E[X_2 | X_1 = 1] = E[X_2 I(X_1 = 1)] / P[X_1 = 1].

Conditioning on Y,

E[X_2 I(X_1 = 1)] = (0.1) E[X_2 I(X_1 = 1) | Y = 1] + (0.9) E[X_2 I(X_1 = 1) | Y = 0] = (0.1) e^(−0.6)(0.6)(0.6) + (0.9) e^(−0.1)(0.1)(0.1) = 0.0279,

and

P[X_1 = 1] = (0.1) P[X_1 = 1 | Y = 1] + (0.9) P[X_1 = 1 | Y = 0] = (0.1) e^(−0.6)(0.6) + (0.9) e^(−0.1)(0.1) = 0.1144.

So,

E[X_2 | X_1 = 1] = 0.0279/0.1144 = 0.244.

Example 20.6. The number of claims received in a year by an insurance company is a Poisson random variable with λ = 5. The claim amounts are independent and uniformly distributed over [0, 5000]. If the company has $40,000 available to pay claims, what is the probability that it will have enough to pay all the claims that come in?
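The risk-class computation above can be checked with a short script (a sketch, not from the original notes), using the Poisson pmf P[X = 1] = θ e^(−θ):

```python
import math

# Check of the high-risk/low-risk problem: condition on the risk class.
# Numerator: E[X2 * I(X1 = 1)]; denominator: P[X1 = 1].
num = 0.1 * 0.6 * math.exp(-0.6) * 0.6 + 0.9 * 0.1 * math.exp(-0.1) * 0.1
den = 0.1 * math.exp(-0.6) * 0.6 + 0.9 * math.exp(-0.1) * 0.1
print(round(num / den, 3))   # 0.244
```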

20.6 Mixture of densities

Given densities f_1, ..., f_n and real numbers α_1, ..., α_n with Σ_{j=1}^n α_j = 1, then f(x) = Σ_{j=1}^n α_j f_j(x) is a density. f is a mixture of the densities f_1, ..., f_n. Alternatively, a r.v. with the density f can be obtained as follows. Let Y be a discrete r.v. with values 1, ..., n and P[Y = j] = α_j for 1 ≤ j ≤ n. Let X be a r.v. such that the distribution of X given Y = j is continuous with density f_j. Then, for each A ⊆ R,

P[X ∈ A] = Σ_{j=1}^n P[Y = j] P[X ∈ A | Y = j] = Σ_{j=1}^n α_j ∫_A f_j(x) dx = ∫_A f(x) dx.

Problem 20.18. (# 7, May) An auto insurance company insures an automobile worth 15,000 for one year under a policy with a 1,000 deductible. During the policy year there is a 0.04 chance of partial damage to the car and a 0.02 chance of a total loss of the car. If there is partial damage to the car, the amount X of damage (in thousands) follows a distribution with density function

f(x) = 0.5003 e^(−x/2) if 0 < x < 15, and 0 otherwise.

What is the expected claim payment? Answer: 328.

Solution: Let Y = 0 if there is no damage to the car, Y = 1 if there is partial damage to the car, and Y = 2 if there is a total loss of the car. Let X be the amount of damage to the car (in thousands). Then, X given Y = 0 is zero, X given Y = 1 has the density above, and X given Y = 2 is 15. With a deductible of 1, the claim payment is max(X − 1, 0), and the expected payment (in thousands) is

E[max(X − 1, 0)] = (0.04) ∫_1^{15} (x − 1)(0.5003) e^(−x/2) dx + (0.02)(15 − 1).

By the change of variables u = x/2,

∫_1^{15} x e^(−x/2) dx = 4 ∫_{0.5}^{7.5} u e^(−u) du = 4 [−e^(−u)(u + 1)]_{0.5}^{7.5} = 6e^(−0.5) − 34e^(−7.5),

and

∫_1^{15} e^(−x/2) dx = 2(e^(−0.5) − e^(−7.5)).

The expected payment is

(0.04)(0.5003)(6e^(−0.5) − 34e^(−7.5) − 2(e^(−0.5) − e^(−7.5))) + 0.28 = 0.048 + 0.28 = 0.328 (thousands),

that is, 328.

Problem 20.19. (# 33, May) For Company A there is a 60% chance that no claim is made during the coming year. If one or more claims are made, the total claim amount is normally distributed with mean 10,000 and standard deviation 2,000. For Company B there is a 70% chance that no claim is made during the coming year. If one or more claims are made, the total claim amount is normally distributed with mean 9,000 and standard deviation 2,000.

Assume that the total claim amounts of the two companies are independent. What is the probability that, in the coming year, Company B's total claim amount will exceed Company A's total claim amount? Answer: 0.223.

Solution: Let X be Company A's total claim amount and let Y be Company B's. Company B's total claim amount exceeds Company A's if either Company A has no claim and B does, or both companies have claims and the claim for B is bigger than the claim for A. We get

(0.6)(0.3) + (0.4)(0.3) P[X < Y] = 0.18 + (0.12) P[N(9,000, 2,000^2) − N(10,000, 2,000^2) > 0]
= 0.18 + (0.12) P[Z > 1,000/√(2,000^2 + 2,000^2)] = 0.18 + (0.12)(1 − Φ(0.354)) = 0.18 + (0.12)(0.362) = 0.223.

Problem 20.20. (# 6, November) You are given the following information about N, the annual number of claims for a randomly selected insured:

P[N = 0] = 1/2, P[N = 1] = 1/3, P[N > 1] = 1/6.

Let S denote the total annual claim amount for an insured. When N = 1, S is exponentially distributed with mean 5. When N > 1, S is exponentially distributed with mean 8. Determine P[4 < S < 8]. Answer: 0.122.

Solution: We have that

P[4 < S < 8] = P[N = 1] P[4 < S < 8 | N = 1] + P[N > 1] P[4 < S < 8 | N > 1]
= (1/3) ∫_4^8 (1/5) e^(−x/5) dx + (1/6) ∫_4^8 (1/8) e^(−x/8) dx
= (1/3)(e^(−4/5) − e^(−8/5)) + (1/6)(e^(−4/8) − e^(−8/8)) = 0.122.

Problem 20.21. (#, Sample Test) An insurance policy covers the two employees of ABC Company. The policy will reimburse ABC for no more than one loss per employee in a year. It reimburses the full amount of the loss up to an annual company-wide maximum of 8,000. The probability of an employee incurring a loss in a year is 40%. The probability that an employee incurs a loss is independent of the other employee's losses. The amount of each loss is uniformly distributed on [1,000, 5,000]. Given that one of the employees has incurred a loss in excess of 2,000, determine the probability that losses will exceed reimbursements. Answer: 1/15.

Solution: Let X_1 be the amount of the loss of employee 1 (in thousands), and let Y_1 = 1 if employee 1 has a loss and Y_1 = 0 if not. If Y_1 = 0, then X_1 = 0. If Y_1 = 1, then X_1 has the density f_{X_1}(x_1) = 1/4, for 1 ≤ x_1 ≤ 5.

Figure 20.3: Sample Test

Let X_2 be the amount of the loss of employee 2, and let Y_2 = 1 if employee 2 has a loss and Y_2 = 0 if not. If Y_2 = 0, then X_2 = 0. If Y_2 = 1, then X_2 has the density f_{X_2}(x_2) = 1/4, for 1 ≤ x_2 ≤ 5. Losses exceed reimbursements exactly when both employees have losses and X_1 + X_2 > 8. Then,

P[X_1 + X_2 > 8 | X_1 ≥ 2] = P[X_1 + X_2 > 8, X_1 ≥ 2] / P[X_1 ≥ 2],

and, since X_1 + X_2 > 8 forces X_1 ≥ 3,

P[X_1 + X_2 > 8, X_1 ≥ 2] = (0.4) ∫_3^5 ∫_{8−x_1}^5 (1/4)(1/4) dx_2 dx_1 = (0.4) ∫_3^5 ((x_1 − 3)/16) dx_1 = (0.4)(1/8) = 1/20,

and

P[X_1 ≥ 2] = ∫_2^5 (1/4) dx_1 = 3/4.

So,

P[X_1 + X_2 > 8 | X_1 ≥ 2] = (1/20)/(3/4) = 1/15.

20.7 Problems

1. A player throws a fair die and simultaneously flips a fair coin. If the coin lands heads, then she wins twice the value that appears on the die, and if tails, then she wins one half of the value that appears on the die. Determine her expected winnings.

2. Let X be a random variable with uniform distribution on the interval [0, 10]. Find P{X ≤ 5 | X ≥ 3}.

3. Let X and Y have the pdf

f(x, y) = 6(y − x) if 0 ≤ x ≤ y ≤ 1, and 0 elsewhere.

Find the marginal pdf's, the conditional pdf's, and E[X | Y = y] and E[Y | X = x].

4. Let X and Y be discrete random variables with joint probability function

p(x, y) = (x + 1)(y + 2)/54 for x = 0, 1, 2; y = 0, 1, 2, and 0 elsewhere.

What is E[Y | X = 2]?

5. Let X and Y be continuous random variables with joint density function

f(x, y) = 2y for 0 ≤ x ≤ 1 and 0 ≤ y ≤ 1, and 0 elsewhere.

What is P(X ≤ Y ≤ 2X)?

6. Let X and Y be discrete random variables with joint probability function

p(x, y) = (x + y)/9 for x = 0, 1, 2 and y = 0, 1, and 0 elsewhere.

Calculate E[XY].

7. Let X and Y have the pdf

f(x, y) = 6(y − x) if 0 ≤ x ≤ y ≤ 1, and 0 elsewhere.

Find the marginal pdf's, f_{Y|X}(y | x), E[X | Y = y] and Var(Y | X = x). Check that E[E[Y | X]] = E[Y] and that Var(Y) = E[Var(Y | X)] + Var(E[Y | X]).

8. Let X and Y have a jointly continuous distribution with

f_{Y|X}(y | x) = λ e^(−λ(y − x)) if y ≥ x, and 0 elsewhere,

and

f_X(x) = λ e^(−λx) if x ≥ 0, and 0 elsewhere.

Find the correlation between X and Y.

9. Let Y_1 and Y_2 be two jointly continuous random variables with joint density function

f(y_1, y_2) = 3y_1 if 0 ≤ y_2 ≤ y_1 ≤ 1, and 0 elsewhere.

Find P(Y_1 ≤ 3/4 | Y_2 ≤ 1/2).

10. Let f(x, y) = 6x, 0 < x < y < 1, and zero otherwise. Find f_{X|Y}(x | y).

11. Let X and Y be two random variables satisfying Var(X) = 2 and Var(Y) = Var(X − Y) = 4. Find the covariance of the random variables X and Y.

Notes for Math 324, Part 19

Notes for Math 324, Part 19 48 Notes for Math 324, Part 9 Chapter 9 Multivariate distributions, covariance Often, we need to consider several random variables at the same time. We have a sample space S and r.v. s X, Y,..., which

More information

Math438 Actuarial Probability

Math438 Actuarial Probability Math438 Actuarial Probability Jinguo Lian Department of Math and Stats Jan. 22, 2016 Continuous Random Variables-Part I: Definition A random variable X is continuous if its set of possible values is an

More information

Notes for Math 324, Part 17

Notes for Math 324, Part 17 126 Notes for Math 324, Part 17 Chapter 17 Common discrete distributions 17.1 Binomial Consider an experiment consisting by a series of trials. The only possible outcomes of the trials are success and

More information

MATH/STAT 3360, Probability Sample Final Examination Model Solutions

MATH/STAT 3360, Probability Sample Final Examination Model Solutions MATH/STAT 3360, Probability Sample Final Examination Model Solutions This Sample examination has more questions than the actual final, in order to cover a wider range of questions. Estimated times are

More information

November 2000 Course 1. Society of Actuaries/Casualty Actuarial Society

November 2000 Course 1. Society of Actuaries/Casualty Actuarial Society November 2000 Course 1 Society of Actuaries/Casualty Actuarial Society 1. A recent study indicates that the annual cost of maintaining and repairing a car in a town in Ontario averages 200 with a variance

More information

Probability and Statistics Notes

Probability and Statistics Notes Probability and Statistics Notes Chapter Five Jesse Crawford Department of Mathematics Tarleton State University Spring 2011 (Tarleton State University) Chapter Five Notes Spring 2011 1 / 37 Outline 1

More information

3 Conditional Expectation

3 Conditional Expectation 3 Conditional Expectation 3.1 The Discrete case Recall that for any two events E and F, the conditional probability of E given F is defined, whenever P (F ) > 0, by P (E F ) P (E)P (F ). P (F ) Example.

More information

Review of Probability. CS1538: Introduction to Simulations

Review of Probability. CS1538: Introduction to Simulations Review of Probability CS1538: Introduction to Simulations Probability and Statistics in Simulation Why do we need probability and statistics in simulation? Needed to validate the simulation model Needed

More information

Joint probability distributions: Discrete Variables. Two Discrete Random Variables. Example 1. Example 1

Joint probability distributions: Discrete Variables. Two Discrete Random Variables. Example 1. Example 1 Joint probability distributions: Discrete Variables Two Discrete Random Variables Probability mass function (pmf) of a single discrete random variable X specifies how much probability mass is placed on

More information

SOCIETY OF ACTUARIES/CASUALTY ACTUARIAL SOCIETY EXAM P PROBABILITY P SAMPLE EXAM SOLUTIONS

SOCIETY OF ACTUARIES/CASUALTY ACTUARIAL SOCIETY EXAM P PROBABILITY P SAMPLE EXAM SOLUTIONS SOCIETY OF ACTUARIES/CASUALTY ACTUARIAL SOCIETY EXAM P PROBABILITY P SAMPLE EXAM SOLUTIONS Copyright 9 by the Society of Actuaries and the Casualty Actuarial Society Some of the questions in this study

More information

Course 1 Solutions November 2001 Exams

Course 1 Solutions November 2001 Exams Course Solutions November Exams . A For i =,, let R = event that a red ball is drawn form urn i i B = event that a blue ball is drawn from urn i. i Then if x is the number of blue balls in urn, ( R R)

More information

CHAPTER 4 MATHEMATICAL EXPECTATION. 4.1 Mean of a Random Variable

CHAPTER 4 MATHEMATICAL EXPECTATION. 4.1 Mean of a Random Variable CHAPTER 4 MATHEMATICAL EXPECTATION 4.1 Mean of a Random Variable The expected value, or mathematical expectation E(X) of a random variable X is the long-run average value of X that would emerge after a

More information

1. If X has density. cx 3 e x ), 0 x < 0, otherwise. Find the value of c that makes f a probability density. f(x) =

1. If X has density. cx 3 e x ), 0 x < 0, otherwise. Find the value of c that makes f a probability density. f(x) = 1. If X has density f(x) = { cx 3 e x ), 0 x < 0, otherwise. Find the value of c that makes f a probability density. 2. Let X have density f(x) = { xe x, 0 < x < 0, otherwise. (a) Find P (X > 2). (b) Find

More information

Dept. of Linguistics, Indiana University Fall 2015

Dept. of Linguistics, Indiana University Fall 2015 L645 Dept. of Linguistics, Indiana University Fall 2015 1 / 34 To start out the course, we need to know something about statistics and This is only an introduction; for a fuller understanding, you would

More information

Notes 12 Autumn 2005

Notes 12 Autumn 2005 MAS 08 Probability I Notes Autumn 005 Conditional random variables Remember that the conditional probability of event A given event B is P(A B) P(A B)/P(B). Suppose that X is a discrete random variable.

More information

STA 256: Statistics and Probability I

STA 256: Statistics and Probability I Al Nosedal. University of Toronto. Fall 2017 My momma always said: Life was like a box of chocolates. You never know what you re gonna get. Forrest Gump. There are situations where one might be interested

More information

Homework 4 Solution, due July 23

Homework 4 Solution, due July 23 Homework 4 Solution, due July 23 Random Variables Problem 1. Let X be the random number on a die: from 1 to. (i) What is the distribution of X? (ii) Calculate EX. (iii) Calculate EX 2. (iv) Calculate Var

More information

MATH/STAT 3360, Probability

MATH/STAT 3360, Probability MATH/STAT 3360, Probability Sample Final Examination This Sample examination has more questions than the actual final, in order to cover a wider range of questions. Estimated times are provided after each

More information

Ch. 5 Joint Probability Distributions and Random Samples

Ch. 5 Joint Probability Distributions and Random Samples Ch. 5 Joint Probability Distributions and Random Samples 5. 1 Jointly Distributed Random Variables In chapters 3 and 4, we learned about probability distributions for a single random variable. However,

More information

Problem #1 #2 #3 #4 Total Points /5 /7 /8 /4 /24

Problem #1 #2 #3 #4 Total Points /5 /7 /8 /4 /24 STAT/MATH 395 A - Winter Quarter 17 - Midterm - February 17, 17 Name: Student ID Number: Problem #1 # #3 #4 Total Points /5 /7 /8 /4 /4 Directions. Read directions carefully and show all your work. Define

More information

ACTEX Seminar Exam P

ACTEX Seminar Exam P ACTEX Seminar Exam P Written & Presented by Matt Hassett, ASA, PhD 1 Remember: 1 This is a review seminar. It assumes that you have already studied probability. This is an actuarial exam seminar. We will

More information

MATH 151, FINAL EXAM Winter Quarter, 21 March, 2014

MATH 151, FINAL EXAM Winter Quarter, 21 March, 2014 Time: 3 hours, 8:3-11:3 Instructions: MATH 151, FINAL EXAM Winter Quarter, 21 March, 214 (1) Write your name in blue-book provided and sign that you agree to abide by the honor code. (2) The exam consists

More information

Math 151. Rumbos Fall Solutions to Review Problems for Exam 2. Pr(X = 1) = ) = Pr(X = 2) = Pr(X = 3) = p X. (k) =

Math 151. Rumbos Fall Solutions to Review Problems for Exam 2. Pr(X = 1) = ) = Pr(X = 2) = Pr(X = 3) = p X. (k) = Math 5. Rumbos Fall 07 Solutions to Review Problems for Exam. A bowl contains 5 chips of the same size and shape. Two chips are red and the other three are blue. Draw three chips from the bowl at random,

More information

MATHEMATICS 154, SPRING 2009 PROBABILITY THEORY Outline #11 (Tail-Sum Theorem, Conditional distribution and expectation)

MATHEMATICS 154, SPRING 2009 PROBABILITY THEORY Outline #11 (Tail-Sum Theorem, Conditional distribution and expectation) MATHEMATICS 154, SPRING 2009 PROBABILITY THEORY Outline #11 (Tail-Sum Theorem, Conditional distribution and expectation) Last modified: March 7, 2009 Reference: PRP, Sections 3.6 and 3.7. 1. Tail-Sum Theorem

More information

Lecture 2: Repetition of probability theory and statistics

Lecture 2: Repetition of probability theory and statistics Algorithms for Uncertainty Quantification SS8, IN2345 Tobias Neckel Scientific Computing in Computer Science TUM Lecture 2: Repetition of probability theory and statistics Concept of Building Block: Prerequisites:

More information

Exam 1 - Math Solutions

Exam 1 - Math Solutions Exam 1 - Math 3200 - Solutions Spring 2013 1. Without actually expanding, find the coefficient of x y 2 z 3 in the expansion of (2x y z) 6. (A) 120 (B) 60 (C) 30 (D) 20 (E) 10 (F) 10 (G) 20 (H) 30 (I)

More information

SOCIETY OF ACTUARIES EXAM P PROBABILITY EXAM P SAMPLE SOLUTIONS

SOCIETY OF ACTUARIES EXAM P PROBABILITY EXAM P SAMPLE SOLUTIONS SOCIETY OF ACTUARIES EXAM P PROBABILITY EXAM P SAMPLE SOLUTIONS Copyright 8 by the Society of Actuaries Some of the questions in this study note are taken from past examinations. Some of the questions

More information

18.440: Lecture 26 Conditional expectation

18.440: Lecture 26 Conditional expectation 18.440: Lecture 26 Conditional expectation Scott Sheffield MIT 1 Outline Conditional probability distributions Conditional expectation Interpretation and examples 2 Outline Conditional probability distributions

More information

Lecture Notes 2 Random Variables. Discrete Random Variables: Probability mass function (pmf)

Lecture Notes 2 Random Variables. Discrete Random Variables: Probability mass function (pmf) Lecture Notes 2 Random Variables Definition Discrete Random Variables: Probability mass function (pmf) Continuous Random Variables: Probability density function (pdf) Mean and Variance Cumulative Distribution

More information

Random Variables. Cumulative Distribution Function (CDF) Amappingthattransformstheeventstotherealline.

Random Variables. Cumulative Distribution Function (CDF) Amappingthattransformstheeventstotherealline. Random Variables Amappingthattransformstheeventstotherealline. Example 1. Toss a fair coin. Define a random variable X where X is 1 if head appears and X is if tail appears. P (X =)=1/2 P (X =1)=1/2 Example

More information

Math 151. Rumbos Fall Solutions to Review Problems for Final Exam

Math 151. Rumbos Fall Solutions to Review Problems for Final Exam Math 5. Rumbos Fall 23 Solutions to Review Problems for Final Exam. Three cards are in a bag. One card is red on both sides. Another card is white on both sides. The third card in red on one side and white

More information

ARCONES MANUAL FOR THE SOA EXAM P/CAS EXAM 1, PROBABILITY, SPRING 2010 EDITION.

ARCONES MANUAL FOR THE SOA EXAM P/CAS EXAM 1, PROBABILITY, SPRING 2010 EDITION. A self published manuscript ARCONES MANUAL FOR THE SOA EXAM P/CAS EXAM 1, PROBABILITY, SPRING 21 EDITION. M I G U E L A R C O N E S Miguel A. Arcones, Ph. D. c 28. All rights reserved. Author Miguel A.

More information

STAT 430/510 Probability Lecture 7: Random Variable and Expectation

STAT 430/510 Probability Lecture 7: Random Variable and Expectation STAT 430/510 Probability Lecture 7: Random Variable and Expectation Pengyuan (Penelope) Wang June 2, 2011 Review Properties of Probability Conditional Probability The Law of Total Probability Bayes Formula

More information

EXAMINATIONS OF THE HONG KONG STATISTICAL SOCIETY GRADUATE DIPLOMA, Statistical Theory and Methods I. Time Allowed: Three Hours

EXAMINATIONS OF THE HONG KONG STATISTICAL SOCIETY GRADUATE DIPLOMA, Statistical Theory and Methods I. Time Allowed: Three Hours EXAMINATIONS OF THE HONG KONG STATISTICAL SOCIETY GRADUATE DIPLOMA, 008 Statistical Theory and Methods I Time Allowed: Three Hours Candidates should answer FIVE questions. All questions carry equal marks.

More information

More than one variable

More than one variable Chapter More than one variable.1 Bivariate discrete distributions Suppose that the r.v. s X and Y are discrete and take on the values x j and y j, j 1, respectively. Then the joint p.d.f. of X and Y, to

More information

Bivariate Distributions

Bivariate Distributions STAT/MATH 395 A - PROBABILITY II UW Winter Quarter 17 Néhémy Lim Bivariate Distributions 1 Distributions of Two Random Variables Definition 1.1. Let X and Y be two rrvs on probability space (Ω, A, P).

More information

Joint Distribution of Two or More Random Variables

Joint Distribution of Two or More Random Variables Joint Distribution of Two or More Random Variables Sometimes more than one measurement in the form of random variable is taken on each member of the sample space. In cases like this there will be a few

More information

STT 441 Final Exam Fall 2013

STT 441 Final Exam Fall 2013 STT 441 Final Exam Fall 2013 (12:45-2:45pm, Thursday, Dec. 12, 2013) NAME: ID: 1. No textbooks or class notes are allowed in this exam. 2. Be sure to show all of your work to receive credit. Credits are

More information

EECS 126 Probability and Random Processes University of California, Berkeley: Spring 2015 Abhay Parekh February 17, 2015.

EECS 126 Probability and Random Processes University of California, Berkeley: Spring 2015 Abhay Parekh February 17, 2015. EECS 126 Probability and Random Processes University of California, Berkeley: Spring 2015 Abhay Parekh February 17, 2015 Midterm Exam Last name First name SID Rules. You have 80 mins (5:10pm - 6:30pm)

More information

STAT 414: Introduction to Probability Theory

STAT 414: Introduction to Probability Theory STAT 414: Introduction to Probability Theory Spring 2016; Homework Assignments Latest updated on April 29, 2016 HW1 (Due on Jan. 21) Chapter 1 Problems 1, 8, 9, 10, 11, 18, 19, 26, 28, 30 Theoretical Exercises

More information

The mean, variance and covariance. (Chs 3.4.1, 3.4.2)

The mean, variance and covariance. (Chs 3.4.1, 3.4.2) 4 The mean, variance and covariance (Chs 3.4.1, 3.4.2) Mean (Expected Value) of X Consider a university having 15,000 students and let X equal the number of courses for which a randomly selected student

More information

1 Basic continuous random variable problems

1 Basic continuous random variable problems Name M362K Final Here are problems concerning material from Chapters 5 and 6. To review the other chapters, look over previous practice sheets for the two exams, previous quizzes, previous homeworks and

More information

ENGG2430A-Homework 2

ENGG2430A-Homework 2 ENGG3A-Homework Due on Feb 9th,. Independence vs correlation a For each of the following cases, compute the marginal pmfs from the joint pmfs. Explain whether the random variables X and Y are independent,

More information

STAT 418: Probability and Stochastic Processes

STAT 418: Probability and Stochastic Processes STAT 418: Probability and Stochastic Processes Spring 2016; Homework Assignments Latest updated on April 29, 2016 HW1 (Due on Jan. 21) Chapter 1 Problems 1, 8, 9, 10, 11, 18, 19, 26, 28, 30 Theoretical

More information

M378K In-Class Assignment #1

M378K In-Class Assignment #1 The following problems are a review of M6K. M7K In-Class Assignment # Problem.. Complete the definition of mutual exclusivity of events below: Events A, B Ω are said to be mutually exclusive if A B =.

More information

Statistics 427: Sample Final Exam

Statistics 427: Sample Final Exam Statistics 427: Sample Final Exam Instructions: The following sample exam was given several quarters ago in Stat 427. The same topics were covered in the class that year. This sample exam is meant to be

More information

Algorithms for Uncertainty Quantification

Algorithms for Uncertainty Quantification Algorithms for Uncertainty Quantification Tobias Neckel, Ionuț-Gabriel Farcaș Lehrstuhl Informatik V Summer Semester 2017 Lecture 2: Repetition of probability theory and statistics Example: coin flip Example

More information

3 Multiple Discrete Random Variables

3 Multiple Discrete Random Variables 3 Multiple Discrete Random Variables 3.1 Joint densities Suppose we have a probability space (Ω, F,P) and now we have two discrete random variables X and Y on it. They have probability mass functions f

More information

Lecture Notes 2 Random Variables. Random Variable

Lecture Notes 2 Random Variables. Random Variable Lecture Notes 2 Random Variables Definition Discrete Random Variables: Probability mass function (pmf) Continuous Random Variables: Probability density function (pdf) Mean and Variance Cumulative Distribution

More information

Week 04 Discussion. a) What is the probability that of those selected for the in-depth interview 4 liked the new flavor and 1 did not?

Week 04 Discussion. a) What is the probability that of those selected for the in-depth interview 4 liked the new flavor and 1 did not? STAT Wee Discussion Fall 7. A new flavor of toothpaste has been developed. It was tested by a group of people. Nine of the group said they lied the new flavor, and the remaining 6 indicated they did not.

More information

1 Basic continuous random variable problems

1 Basic continuous random variable problems Name M362K Final Here are problems concerning material from Chapters 5 and 6. To review the other chapters, look over previous practice sheets for the two exams, previous quizzes, previous homeworks and

More information

Chapter 5 Class Notes

Chapter 5 Class Notes Chapter 5 Class Notes Sections 5.1 and 5.2 It is quite common to measure several variables (some of which may be correlated) and to examine the corresponding joint probability distribution One example

More information

ECE 450 Homework #3. 1. Given the joint density function f XY (x,y) = 0.5 1<x<2, 2<y< <x<4, 2<y<3 0 else

ECE 450 Homework #3. 1. Given the joint density function f XY (x,y) = 0.5 1<x<2, 2<y< <x<4, 2<y<3 0 else ECE 450 Homework #3 0. Consider the random variables X and Y, whose values are a function of the number showing when a single die is tossed, as show below: Exp. Outcome 1 3 4 5 6 X 3 3 4 4 Y 0 1 3 4 5

More information

Probability Theory. Introduction to Probability Theory. Principles of Counting Examples. Principles of Counting. Probability spaces.

Probability Theory. Introduction to Probability Theory. Principles of Counting Examples. Principles of Counting. Probability spaces. Probability Theory To start out the course, we need to know something about statistics and probability Introduction to Probability Theory L645 Advanced NLP Autumn 2009 This is only an introduction; for

More information

CME 106: Review Probability theory

CME 106: Review Probability theory : Probability theory Sven Schmit April 3, 2015 1 Overview In the first half of the course, we covered topics from probability theory. The difference between statistics and probability theory is the following:

More information

Notes for Math 324, Part 12

Notes for Math 324, Part 12 72 Notes for Math 324, Part 12 Chapter 12 Definition and main properties of probability 12.1 Sample space, events We run an experiment which can have several outcomes. The set consisting by all possible

More information

Part (A): Review of Probability [Statistics I revision]

Part (A): Review of Probability [Statistics I revision] Part (A): Review of Probability [Statistics I revision] 1 Definition of Probability 1.1 Experiment An experiment is any procedure whose outcome is uncertain ffl toss a coin ffl throw a die ffl buy a lottery

More information

STAT 430/510: Lecture 16

STAT 430/510: Lecture 16 STAT 430/510: Lecture 16 James Piette June 24, 2010 Updates HW4 is up on my website. It is due next Mon. (June 28th). Starting today back at section 6.7 and will begin Ch. 7. Joint Distribution of Functions

More information

STAT/MATH 395 PROBABILITY II

STAT/MATH 395 PROBABILITY II STAT/MATH 395 PROBABILITY II Bivariate Distributions Néhémy Lim University of Washington Winter 2017 Outline Distributions of Two Random Variables Distributions of Two Discrete Random Variables Distributions

More information

Jointly Distributed Random Variables

Jointly Distributed Random Variables Jointly Distributed Random Variables CE 311S What if there is more than one random variable we are interested in? How should you invest the extra money from your summer internship? To simplify matters,

More information

Tutorial for Lecture Course on Modelling and System Identification (MSI) Albert-Ludwigs-Universität Freiburg Winter Term

Tutorial for Lecture Course on Modelling and System Identification (MSI) Albert-Ludwigs-Universität Freiburg Winter Term Tutorial for Lecture Course on Modelling and System Identification (MSI) Albert-Ludwigs-Universität Freiburg Winter Term 2016-2017 Tutorial 3: Emergency Guide to Statistics Prof. Dr. Moritz Diehl, Robin

More information

2. Suppose (X, Y ) is a pair of random variables uniformly distributed over the triangle with vertices (0, 0), (2, 0), (2, 1).

2. Suppose (X, Y ) is a pair of random variables uniformly distributed over the triangle with vertices (0, 0), (2, 0), (2, 1). Name M362K Final Exam Instructions: Show all of your work. You do not have to simplify your answers. No calculators allowed. There is a table of formulae on the last page. 1. Suppose X 1,..., X 1 are independent

More information

Discussion 03 Solutions

Discussion 03 Solutions STAT Discussion Solutions Spring 8. A new flavor of toothpaste has been developed. It was tested by a group of people. Nine of the group said they liked the new flavor, and the remaining indicated they

More information

STAT 430/510 Probability Lecture 12: Central Limit Theorem and Exponential Distribution

STAT 430/510 Probability Lecture 12: Central Limit Theorem and Exponential Distribution STAT 430/510 Probability Lecture 12: Central Limit Theorem and Exponential Distribution Pengyuan (Penelope) Wang June 15, 2011 Review Discussed Uniform Distribution and Normal Distribution Normal Approximation

More information

Chapter 4 Multiple Random Variables

Chapter 4 Multiple Random Variables Review for the previous lecture Theorems and Examples: How to obtain the pmf (pdf) of U = g ( X Y 1 ) and V = g ( X Y) Chapter 4 Multiple Random Variables Chapter 43 Bivariate Transformations Continuous

More information

Statistics for Managers Using Microsoft Excel/SPSS Chapter 4 Basic Probability And Discrete Probability Distributions

Statistics for Managers Using Microsoft Excel/SPSS Chapter 4 Basic Probability And Discrete Probability Distributions Statistics for Managers Using Microsoft Excel/SPSS Chapter 4 Basic Probability And Discrete Probability Distributions 1999 Prentice-Hall, Inc. Chap. 4-1 Chapter Topics Basic Probability Concepts: Sample

More information

Exam 1 Review With Solutions Instructor: Brian Powers

Exam 1 Review With Solutions Instructor: Brian Powers Exam Review With Solutions Instructor: Brian Powers STAT 8, Spr5 Chapter. In how many ways can 5 different trees be planted in a row? 5P 5 = 5! =. ( How many subsets of S = {,,,..., } contain elements?

More information

Random Variables. Random variables. A numerically valued map X of an outcome ω from a sample space Ω to the real line R

Random Variables. Random variables. A numerically valued map X of an outcome ω from a sample space Ω to the real line R In probabilistic models, a random variable is a variable whose possible values are numerical outcomes of a random phenomenon. As a function or a map, it maps from an element (or an outcome) of a sample

More information

n px p x (1 p) n x. p x n(n 1)... (n x + 1) x!

n px p x (1 p) n x. p x n(n 1)... (n x + 1) x! Lectures 3-4 jacques@ucsd.edu 7. Classical discrete distributions D. The Poisson Distribution. If a coin with heads probability p is flipped independently n times, then the number of heads is Bin(n, p)

More information

Chapter 2. Random Variable. Define single random variables in terms of their PDF and CDF, and calculate moments such as the mean and variance.

Chapter 2. Random Variable. Define single random variables in terms of their PDF and CDF, and calculate moments such as the mean and variance. Chapter 2 Random Variable CLO2 Define single random variables in terms of their PDF and CDF, and calculate moments such as the mean and variance. 1 1. Introduction In Chapter 1, we introduced the concept

More information

STA 584 Supplementary Examples (not to be graded) Fall, 2003

STA 584 Supplementary Examples (not to be graded) Fall, 2003 Page 1 of 8 Central Michigan University Department of Mathematics STA 584 Supplementary Examples (not to be graded) Fall, 003 1. (a) If A and B are independent events, P(A) =.40 and P(B) =.70, find (i)

More information

Math 426: Probability MWF 1pm, Gasson 310 Exam 3 SOLUTIONS

Math 426: Probability MWF 1pm, Gasson 310 Exam 3 SOLUTIONS Name: ANSWE KEY Math 46: Probability MWF pm, Gasson Exam SOLUTIONS Problem Points Score 4 5 6 BONUS Total 6 Please write neatly. You may leave answers below unsimplified. Have fun and write your name above!

More information

ASM Study Manual for Exam P, Second Edition By Dr. Krzysztof M. Ostaszewski, FSA, CFA, MAAA Errata

ASM Study Manual for Exam P, Second Edition By Dr. Krzysztof M. Ostaszewski, FSA, CFA, MAAA Errata ASM Study Manual for Exam P, Second Edition By Dr. Krzysztof M. Ostaszewski, FSA, CFA, MAAA (krzysio@krzysio.net) Errata Effective July 5, 3, only the latest edition of this manual will have its errata

More information

Bivariate distributions

Bivariate distributions Bivariate distributions 3 th October 017 lecture based on Hogg Tanis Zimmerman: Probability and Statistical Inference (9th ed.) Bivariate Distributions of the Discrete Type The Correlation Coefficient

More information

Creating New Distributions

Creating New Distributions Creating New Distributions Section 5.2 Stat 477 - Loss Models Section 5.2 (Stat 477) Creating New Distributions Brian Hartman - BYU 1 / 18 Generating new distributions Some methods to generate new distributions

More information

ACM 116: Lectures 3 4

ACM 116: Lectures 3 4 1 ACM 116: Lectures 3 4 Joint distributions The multivariate normal distribution Conditional distributions Independent random variables Conditional distributions and Monte Carlo: Rejection sampling Variance

More information

n x u x v n x = (u+v) n. p x (1 p) n x x = ( e t p+(1 p) ) n., x = 0,1,2,... λe = e λ t ) x = e λ e λet = e λ(et 1).

n x u x v n x = (u+v) n. p x (1 p) n x x = ( e t p+(1 p) ) n., x = 0,1,2,... λe = e λ t ) x = e λ e λet = e λ(et 1). Eercise 8 We can write E(X b) 2 = E ( X EX +EX b ) 2 = E ( (X EX)+(EX b) ) 2 = E(X EX) 2 +(EX b) 2 +2E ( (X EX)(EX b) ) = E(X EX) 2 +(EX b) 2 +2(EX b)e(x EX) = E(X EX) 2 +(EX b) 2 as E(X EX) = This is

More information

Exam P Review Sheet. for a > 0. ln(a) i=0 ari = a. (1 r) 2. (Note that the A i s form a partition)

Exam P Review Sheet. for a > 0. ln(a) i=0 ari = a. (1 r) 2. (Note that the A i s form a partition) Exam P Review Sheet log b (b x ) = x log b (y k ) = k log b (y) log b (y) = ln(y) ln(b) log b (yz) = log b (y) + log b (z) log b (y/z) = log b (y) log b (z) ln(e x ) = x e ln(y) = y for y > 0. d dx ax

More information

Final Exam # 3. Sta 230: Probability. December 16, 2012

Final Exam # 3. Sta 230: Probability. December 16, 2012 Final Exam # 3 Sta 230: Probability December 16, 2012 This is a closed-book exam so do not refer to your notes, the text, or any other books (please put them on the floor). You may use the extra sheets

More information

Appendix A : Introduction to Probability and stochastic processes

Appendix A : Introduction to Probability and stochastic processes A-1 Mathematical methods in communication July 5th, 2009 Appendix A : Introduction to Probability and stochastic processes Lecturer: Haim Permuter Scribe: Shai Shapira and Uri Livnat The probability of

More information

Math 365 Final Exam Review Sheet. The final exam is Wednesday March 18 from 10am - 12 noon in MNB 110.

Math 365 Final Exam Review Sheet. The final exam is Wednesday March 18 from 10am - 12 noon in MNB 110. Math 365 Final Exam Review Sheet The final exam is Wednesday March 18 from 10am - 12 noon in MNB 110. The final is comprehensive and will cover Chapters 1, 2, 3, 4.1, 4.2, 5.2, and 5.3. You may use your

More information

Math Fall 2010 Some Old Math 302 Exams There is always a danger when distributing old exams for a class that students will rely on them

Math Fall 2010 Some Old Math 302 Exams There is always a danger when distributing old exams for a class that students will rely on them Math 302.102 Fall 2010 Some Old Math 302 Exams There is always a danger when distributing old exams for a class that students will rely on them solely for their final exam preparations. The final exam

More information

3. Probability and Statistics

3. Probability and Statistics FE661 - Statistical Methods for Financial Engineering 3. Probability and Statistics Jitkomut Songsiri definitions, probability measures conditional expectations correlation and covariance some important

More information

Lecture 1. ABC of Probability

Lecture 1. ABC of Probability Math 408 - Mathematical Statistics Lecture 1. ABC of Probability January 16, 2013 Konstantin Zuev (USC) Math 408, Lecture 1 January 16, 2013 1 / 9 Agenda Sample Spaces Realizations, Events Axioms of Probability

More information

2. Variance and Covariance: We will now derive some classic properties of variance and covariance. Assume real-valued random variables X and Y.

2. Variance and Covariance: We will now derive some classic properties of variance and covariance. Assume real-valued random variables X and Y. CS450 Final Review Problems Fall 08 Solutions or worked answers provided Problems -6 are based on the midterm review Identical problems are marked recap] Please consult previous recitations and textbook

More information

f X, Y (x, y)dx (x), where f(x,y) is the joint pdf of X and Y. (x) dx

f X, Y (x, y)dx (x), where f(x,y) is the joint pdf of X and Y. (x) dx INDEPENDENCE, COVARIANCE AND CORRELATION Independence: Intuitive idea of "Y is independent of X": The distribution of Y doesn't depend on the value of X. In terms of the conditional pdf's: "f(y x doesn't

More information

ASM Study Manual for Exam P, First Edition By Dr. Krzysztof M. Ostaszewski, FSA, CFA, MAAA Errata

ASM Study Manual for Exam P, First Edition By Dr. Krzysztof M. Ostaszewski, FSA, CFA, MAAA Errata ASM Study Manual for Exam P, First Edition By Dr. Krzysztof M. Ostaszewski, FSA, CFA, MAAA (krzysio@krzysio.net) Errata Effective July 5, 3, only the latest edition of this manual will have its errata

More information

Expectation of Random Variables

Expectation of Random Variables 1 / 19 Expectation of Random Variables Saravanan Vijayakumaran sarva@ee.iitb.ac.in Department of Electrical Engineering Indian Institute of Technology Bombay February 13, 2015 2 / 19 Expectation of Discrete

More information

MATH/STAT 3360, Probability FALL 2018 Hong Gu

MATH/STAT 3360, Probability FALL 2018 Hong Gu MATH/STAT 3360, Probability FALL 2018 Hong Gu Lecture Notes and In Class Examples August 19, 2018 1 / 142 Basic Principal of Counting Question 1 A statistics textbook has 8 chapters. Each chapter has 50

More information

MATH 3200 PROBABILITY AND STATISTICS M3200SP081.1

MATH 3200 PROBABILITY AND STATISTICS M3200SP081.1 MATH 3200 PROBABILITY AND STATISTICS M3200SP081.1 This examination has twenty problems, of which most are straightforward modifications of the recommended homework problems. The remaining problems are

More information

Functions of two random variables. Conditional pairs

Functions of two random variables. Conditional pairs Handout 10 Functions of two random variables. Conditional pairs "Science is a wonderful thing if one does not have to earn a living at it. One should earn one's living by work of which one is sure one

More information

Lecture 10. Variance and standard deviation

Lecture 10. Variance and standard deviation 18.440: Lecture 10 Variance and standard deviation Scott Sheffield MIT 1 Outline Defining variance Examples Properties Decomposition trick 2 Outline Defining variance Examples Properties Decomposition

More information

Discrete random variables and probability distributions

Discrete random variables and probability distributions Discrete random variables and probability distributions random variable is a mapping from the sample space to real numbers. notation: X, Y, Z,... Example: Ask a student whether she/he works part time or

More information

Lecture 11. Probability Theory: an Overveiw

Lecture 11. Probability Theory: an Overveiw Math 408 - Mathematical Statistics Lecture 11. Probability Theory: an Overveiw February 11, 2013 Konstantin Zuev (USC) Math 408, Lecture 11 February 11, 2013 1 / 24 The starting point in developing the

More information

Discrete Distributions

Discrete Distributions A simplest example of random experiment is a coin-tossing, formally called Bernoulli trial. It happens to be the case that many useful distributions are built upon this simplest form of experiment, whose

More information

Stochastic processes Lecture 1: Multiple Random Variables Ch. 5

Stochastic processes Lecture 1: Multiple Random Variables Ch. 5 Stochastic processes Lecture : Multiple Random Variables Ch. 5 Dr. Ir. Richard C. Hendriks 26/04/8 Delft University of Technology Challenge the future Organization Plenary Lectures Book: R.D. Yates and

More information

Midterm Exam 1 Solution

Midterm Exam 1 Solution EECS 126 Probability and Random Processes University of California, Berkeley: Fall 2015 Kannan Ramchandran September 22, 2015 Midterm Exam 1 Solution Last name First name SID Name of student on your left:

More information

Class 26: review for final exam 18.05, Spring 2014

Class 26: review for final exam 18.05, Spring 2014 Probability Class 26: review for final eam 8.05, Spring 204 Counting Sets Inclusion-eclusion principle Rule of product (multiplication rule) Permutation and combinations Basics Outcome, sample space, event

More information

Math 218 Supplemental Instruction Spring 2008 Final Review Part A

Math 218 Supplemental Instruction Spring 2008 Final Review Part A Spring 2008 Final Review Part A SI leaders: Mario Panak, Jackie Hu, Christina Tasooji Chapters 3, 4, and 5 Topics Covered: General probability (probability laws, conditional, joint probabilities, independence)

More information

Math st Homework. First part of Chapter 2. Due Friday, September 17, 1999.

Math st Homework. First part of Chapter 2. Due Friday, September 17, 1999. Math 447. 1st Homework. First part of Chapter 2. Due Friday, September 17, 1999. 1. How many different seven place license plates are possible if the first 3 places are to be occupied by letters and the

More information