Introduction to Probability
Solutions of Final Exam
Due: August 8th, 2011

Solve all the problems.

1. (15 points) You have three coins, showing Heads with probabilities p_1, p_2 and p_3. You perform two different experiments:

1. You choose one coin at random and toss it repeatedly.
2. You repeatedly choose a coin at random and toss it.

In both cases calculate the average number of Heads among the first n tosses, and the average time you have to wait for the first Head.

Solution. Let N be the number of Heads among the first n tosses and T the waiting time for the first Head.

1. (a) Conditioning on the chosen coin,

   E N = Σ_{i=1}^{3} E[N | p = p_i] P(p = p_i) = (1/3) Σ_{i=1}^{3} n p_i = (n/3)(p_1 + p_2 + p_3).

(b) Given p = p_i, T is geometric with parameter p_i, so

   E T = Σ_{i=1}^{3} E[T | p = p_i] P(p = p_i) = (1/3) Σ_{i=1}^{3} Σ_{n=1}^{∞} n (1 − p_i)^{n−1} p_i = (1/3) Σ_{i=1}^{3} 1/p_i = (1/3)(1/p_1 + 1/p_2 + 1/p_3).

2. (a) Let X_i = 1 if the coin shows a Head on the i-th toss, and zero otherwise, for i = 1, …, n. Then N = X_1 + ⋯ + X_n, so E N = E X_1 + ⋯ + E X_n. Since

   E X_k = Σ_{i=1}^{3} E[X_k | p = p_i] P(p = p_i) = (1/3) Σ_{i=1}^{3} p_i,

then E N = (n/3)(p_1 + p_2 + p_3).

(b) Since P(X_k = 1) = Σ_{i=1}^{3} P(X_k = 1 | p = p_i) P(p = p_i) = (p_1 + p_2 + p_3)/3 does not depend on k, and the X_k are independent, T is a geometric random variable with parameter p̄ = (p_1 + p_2 + p_3)/3. Therefore

   E T = 1/p̄ = 3/(p_1 + p_2 + p_3).
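As a sanity check (not part of the original solutions), a short Monte Carlo simulation of the two waiting times; the coin probabilities (0.2, 0.5, 0.8) are arbitrary illustrative values.

```python
import random

random.seed(0)
ps = [0.2, 0.5, 0.8]
trials = 200_000

def wait_fixed_coin():
    """Experiment 1: pick one coin at random, then toss it until the first Head."""
    p = random.choice(ps)
    t = 1
    while random.random() >= p:
        t += 1
    return t

def wait_fresh_coin():
    """Experiment 2: pick a fresh coin at random before every toss."""
    t = 1
    while random.random() >= random.choice(ps):
        t += 1
    return t

et1 = sum(wait_fixed_coin() for _ in range(trials)) / trials
et2 = sum(wait_fresh_coin() for _ in range(trials)) / trials

# Analytic answers: (1/3)(1/p1 + 1/p2 + 1/p3) = 2.75 and 3/(p1 + p2 + p3) = 2.0.
pred1 = sum(1 / p for p in ps) / 3
pred2 = 3 / sum(ps)
```

Note that the two experiments give genuinely different answers for E T even though E N is the same in both.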
2. (15 points) Let X_1, …, X_4 be four independent random variables, and let g_i : R² → R be functions for i = 1, 2. Show that Y_1 = g_1(X_1, X_2) and Y_2 = g_2(X_3, X_4) are independent.

Proof. We want to show that P(Y_1 ∈ F, Y_2 ∈ G) = P(Y_1 ∈ F) P(Y_2 ∈ G) for any events F and G. Writing B_1 = g_1^{−1}(F) and B_2 = g_2^{−1}(G) for the preimage sets, we have

   {Y_1 ∈ F} := {g_1(X_1, X_2) ∈ F} = {(X_1, X_2) ∈ B_1},

and similarly

   {Y_2 ∈ G} := {g_2(X_3, X_4) ∈ G} = {(X_3, X_4) ∈ B_2}.

Since X_1, …, X_4 are independent, the random vectors (X_1, X_2) and (X_3, X_4) are independent. Therefore,

   P(Y_1 ∈ F, Y_2 ∈ G) = P((X_1, X_2) ∈ B_1, (X_3, X_4) ∈ B_2)
   = P((X_1, X_2) ∈ B_1) P((X_3, X_4) ∈ B_2)
   = P(Y_1 ∈ F) P(Y_2 ∈ G).
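The factorization can be checked exactly in a small discrete example (my own illustration, not part of the exam): four fair Bernoulli variables with the hypothetical choices Y_1 = X_1 + X_2 and Y_2 = X_3 · X_4, enumerating all 16 equally likely outcomes with exact fractions.

```python
from fractions import Fraction
from itertools import product

# All outcomes of four independent fair Bernoulli variables, each with prob 1/16.
joint = {}            # (y1, y2) -> probability
marg1, marg2 = {}, {}
for x1, x2, x3, x4 in product((0, 1), repeat=4):
    y1, y2 = x1 + x2, x3 * x4      # Y1 = g1(X1, X2), Y2 = g2(X3, X4)
    p = Fraction(1, 16)
    joint[(y1, y2)] = joint.get((y1, y2), Fraction(0)) + p
    marg1[y1] = marg1.get(y1, Fraction(0)) + p
    marg2[y2] = marg2.get(y2, Fraction(0)) + p

# Independence: the joint PMF factorizes into the product of the marginals.
factorizes = all(joint.get((a, b), Fraction(0)) == marg1[a] * marg2[b]
                 for a in marg1 for b in marg2)
```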
3. (20 points) Let Z_1, Z_2, … be a sequence of independent Bernoulli(p) random variables. Let X_1, X_2, … denote the times of the first, second, … success. (E.g., for the outcome 0, 1, 1, 0, 0, 1, 0, 1, … we have X_1 = 2, X_2 = 3, X_3 = 6, …)

1. Find the joint PMF of X_1, …, X_n, i.e., P(X_1 = k_1, …, X_n = k_n).
2. Use (a) to calculate the PMF of X_n, the time of the n-th success.
3. Use (a) to calculate the PMF of the RVs Y_1 = X_1, Y_2 = X_2 − X_1, …, Y_n = X_n − X_{n−1}, and show that they are independent geometric RVs.

Solution.

1. Since X_1 < ⋯ < X_n, we have P(X_1 = k_1, …, X_n = k_n) = 0 whenever k_1 < ⋯ < k_n doesn't hold. If k_1 < ⋯ < k_n, then

   P(X_1 = k_1, …, X_n = k_n)
   = P(Z_1 = 0, …, Z_{k_1 − 1} = 0, Z_{k_1} = 1, …, Z_{k_{n−1} + 1} = 0, …, Z_{k_n − 1} = 0, Z_{k_n} = 1)
   = P(Z_1 = 0) ⋯ P(Z_{k_1 − 1} = 0) P(Z_{k_1} = 1) ⋯ P(Z_{k_{n−1} + 1} = 0) ⋯ P(Z_{k_n − 1} = 0) P(Z_{k_n} = 1)
   = (1 − p)^{k_n − n} p^n,

since among the first k_n trials exactly n are successes.

2. Let Λ = {(k_1, …, k_{n−1}) ∈ N^{n−1} : k_1 < ⋯ < k_{n−1} < k}, where N^{n−1} is the set of (n − 1)-tuples with coordinates in N, and let #Λ denote the number of elements of Λ. Then

   P(X_n = k) = Σ_{(k_1, …, k_{n−1}) ∈ Λ} P(X_1 = k_1, …, X_{n−1} = k_{n−1}, X_n = k)
   = Σ_{(k_1, …, k_{n−1}) ∈ Λ} (1 − p)^{k − n} p^n
   = #Λ · (1 − p)^{k − n} p^n
   = C(k − 1, n − 1) (1 − p)^{k − n} p^n,

since #Λ = C(k − 1, n − 1) counts the ways to place the first n − 1 success times in {1, …, k − 1}.

3. For l_1, …, l_n ≥ 1,

   P(Y_1 = l_1, Y_2 = l_2, …, Y_n = l_n) = P(X_1 = l_1, X_2 = l_1 + l_2, …, X_n = l_1 + ⋯ + l_n)
   = (1 − p)^{l_1 + ⋯ + l_n − n} p^n
   = p(1 − p)^{l_1 − 1} · p(1 − p)^{l_2 − 1} ⋯ p(1 − p)^{l_n − 1}.

Next we compute P(Y_n = l_n), as follows.

   P(Y_n = l_n) = Σ_{l_1, …, l_{n−1}} P(Y_1 = l_1, …, Y_n = l_n)
   = Σ_{l_1 = 1}^{∞} ⋯ Σ_{l_{n−1} = 1}^{∞} (1 − p)^{l_1 + ⋯ + l_n − n} p^n
   = p(1 − p)^{l_n − 1} p^{n−1} Σ_{l_1 = 1}^{∞} (1 − p)^{l_1 − 1} ⋯ Σ_{l_{n−1} = 1}^{∞} (1 − p)^{l_{n−1} − 1}
   = p(1 − p)^{l_n − 1},

since each geometric series sums to 1/p.
This shows that Y_n is a geometric random variable with parameter p (the same computation works for each Y_i). Furthermore, we can write

   P(Y_1 = l_1, Y_2 = l_2, …, Y_n = l_n) = P(Y_1 = l_1) ⋯ P(Y_n = l_n).

Therefore Y_1, …, Y_n are independent.
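The negative-binomial formula from part 2 can be verified exactly by brute force (my own check, not part of the exam): for n = 2 and an arbitrary illustrative p = 1/3, enumerate every length-k outcome whose second success falls exactly at trial k, and compare with C(k − 1, n − 1)(1 − p)^{k − n} p^n.

```python
from fractions import Fraction
from itertools import product
from math import comb

p = Fraction(1, 3)   # illustrative success probability
n = 2                # time of the 2nd success

matches = []
for k in range(2, 9):
    # P(X_n = k): sum over all length-k outcomes with the n-th success at trial k,
    # i.e. exactly n-1 successes among the first k-1 trials and a success at k.
    total = Fraction(0)
    for seq in product((0, 1), repeat=k):
        if seq[k - 1] == 1 and sum(seq[:k - 1]) == n - 1:
            prob = Fraction(1)
            for z in seq:
                prob *= p if z == 1 else 1 - p
            total += prob
    formula = comb(k - 1, n - 1) * (1 - p) ** (k - n) * p ** n
    matches.append(total == formula)

all_match = all(matches)
```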
4. (20 points) Suppose that every day can be described by either S (sunny) or R (rainy). The probability for a day to be sunny is 4/8 if the preceding day was rainy, 6/8 if the preceding day was sunny, but 7/8 if both of the preceding days were sunny.

1. Let S = {S, R}, and let X_n = the weather on the n-th day. Is X_n a Markov chain?
2. Define a Markov chain on the state space S = {SS, SR, RS, RR} that describes the above weather model, and determine the corresponding transition matrix.
3. Given that the weekend (Saturday and Sunday) was sunny, calculate the probability for the next five days to be SSRSS.

Solution.

1. No, because knowing the weather status of today, tomorrow's weather is not independent of the weather of yesterday (the chance of sun depends on the last two days).

2. Take the state to be the weather on the last two days (yesterday, today). Ordering the states SS, SR, RS, RR, the transition matrix is

          SS    SR    RS    RR
   SS    7/8   1/8    0     0
   SR     0     0    4/8   4/8
   RS    6/8   2/8    0     0
   RR     0     0    4/8   4/8

(From SS both preceding days are sunny, so tomorrow is sunny with probability 7/8, moving to SS, and rainy with probability 1/8, moving to SR. From SR and RR the preceding day is rainy, so tomorrow is sunny with probability 4/8. From RS only the preceding day is sunny, so tomorrow is sunny with probability 6/8.)

3. Starting from X_0 = SS (the sunny weekend), the days S, S, R, S, S correspond to the states SS, SS, SR, RS, SS:

   P(SSRSS | SS) = P(X_1 = SS, X_2 = SS, X_3 = SR, X_4 = RS, X_5 = SS | X_0 = SS)
   = (7/8)(7/8)(1/8)(4/8)(6/8) = 1176/32768 ≈ 0.0359.
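The chain computation in part 3 can be checked with exact arithmetic (a verification of my own, not part of the exam):

```python
from fractions import Fraction

F = Fraction
# Transition matrix of the two-day weather chain; states are (yesterday, today).
P = {
    "SS": {"SS": F(7, 8), "SR": F(1, 8)},
    "SR": {"RS": F(4, 8), "RR": F(4, 8)},
    "RS": {"SS": F(6, 8), "SR": F(2, 8)},
    "RR": {"RS": F(4, 8), "RR": F(4, 8)},
}

# Sunny weekend -> start in SS; the five days S, S, R, S, S visit these states.
path = ["SS", "SS", "SS", "SR", "RS", "SS"]
prob = F(1)
for a, b in zip(path, path[1:]):
    prob *= P[a].get(b, F(0))   # multiply the one-step transition probabilities
```

Here prob = 1176/32768 = 147/4096 ≈ 0.0359, matching the hand computation.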
5. (10 points) How often do you have to roll a fair die on the average until you have seen all possible values? (Hint: Let X_n denote the number of different values you have seen at time n.)

Solution. X_n is a Markov chain on {0, 1, …, 6}: from state i the next roll shows an already-seen value with probability i/6 (stay at i) and a new value with probability (6 − i)/6 (move to i + 1). Let T = inf{n : X_n = 6} be the hitting time of the state 6, and let t_i = E_i T, where E_i denotes the expectation with respect to the conditional probability law P_i(A) := P(A | X_0 = i). Conditioning on the first roll (and remembering that E_i(T | X_1 = i) = 1 + t_i), we get

   t_0 = 1 + t_1
   t_1 = 1 + (1/6) t_1 + (5/6) t_2
   t_2 = 1 + (2/6) t_2 + (4/6) t_3
   t_3 = 1 + (3/6) t_3 + (3/6) t_4
   t_4 = 1 + (4/6) t_4 + (2/6) t_5
   t_5 = 1 + (5/6) t_5 + (1/6) t_6,   t_6 = 0.

Solving each equation for t_i gives t_i = 6/(6 − i) + t_{i+1}, so

   t_0 = 6/6 + 6/5 + 6/4 + 6/3 + 6/2 + 6/1 = 1 + 1.2 + 1.5 + 2 + 3 + 6 = 14.7.

On average you need 14.7 rolls.
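The backward recursion can be sketched in a few lines (my own check, not part of the exam); it reproduces the coupon-collector sum 6(1/1 + 1/2 + ⋯ + 1/6):

```python
# Solve t_i = 1 + (i/6) t_i + ((6-i)/6) t_{i+1} backwards from t_6 = 0.
# Rearranging gives t_i = 6/(6-i) + t_{i+1}, which also covers t_0 = 1 + t_1.
t = 0.0                      # t_6
for i in range(5, -1, -1):   # i = 5, 4, ..., 0
    t = 6 / (6 - i) + t

expected_rolls = t           # expected number of rolls to see all six faces
```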
6. (20 points) Consider the following dice game. A pair of dice is rolled. If the sum is 7 then the game ends and you win. If the sum is not 7, then you have the option of either stopping the game and receiving an amount equal to that sum, or starting over again. For each value of i, i = 1, …, 12, find your expected return if you employ the strategy of stopping the first time that a value at least as large as i appears. What value of i leads to the largest expected return? (Hint: Let X_i denote the return when you use the critical value i. To compute E X_i, condition on the first sum.)

Solution. Let E_i G denote the expected gain when our strategy is to stop as soon as the sum S is at least i, where the relevant values of i are 2, …, 12. Let's calculate E_i G. Conditioning on the first sum,

   E_i G = Σ_{k=2}^{i−1} E_i[G | S = k] P(S = k) + Σ_{k=i}^{12} E_i[G | S = k] P(S = k)
   = E_i G · Σ_{k=2, k≠7}^{i−1} P(S = k) + Σ_{k=i, k≠7}^{12} k P(S = k),

where we exclude k = 7 because E_i[G | S = 7] = 0 (the game ends); if k < i (and k ≠ 7) the game restarts, so E_i[G | S = k] = E_i G, and if k ≥ i (and k ≠ 7) we stop and receive k. Solving this equation for E_i G gives

   E_i G = (Σ_{k=i, k≠7}^{12} k P(S = k)) / (1 − Σ_{k=2, k≠7}^{i−1} P(S = k)).

The numerator and the denominator are both finite sums, and P(S = k) is easy to calculate, so the numerical value of E_i G is computable. The result is that E_i G attains its largest value, 20/3 ≈ 6.67, at i = 7 and i = 8.
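The closed form above is easy to evaluate exactly for every critical value (my own computation, not part of the exam):

```python
from fractions import Fraction

F = Fraction
# PMF of the sum S of two fair dice: P(S = k) = counts[k] / 36.
counts = {2: 1, 3: 2, 4: 3, 5: 4, 6: 5, 7: 6, 8: 5, 9: 4, 10: 3, 11: 2, 12: 1}
pmf = {k: F(c, 36) for k, c in counts.items()}

def expected_gain(i):
    """E_i G for the strategy: stop at the first sum >= i (a sum of 7 ends with gain 0)."""
    num = sum(k * pmf[k] for k in range(i, 13) if k != 7)
    den = 1 - sum(pmf[k] for k in range(2, i) if k != 7)
    return num / den

gains = {i: expected_gain(i) for i in range(2, 13)}
best = max(gains.values())
best_is = [i for i, g in gains.items() if g == best]
```

The maximum expected return is 20/3, attained at i = 7 and i = 8.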
7. (20 points) The number of accidents that a person has in a given year is a Poisson random variable with mean λ. However, suppose that the value of λ varies from person to person. Assume the proportion of the population having a value of λ less than x is equal to 1 − e^{−x}. If a person is chosen at random, what is the probability that he will have

1. 0 accidents in a year;
2. 0 accidents in a year, given that he had no accidents the preceding year?

(Hint: instead of P(λ < x) = 1 − e^{−x}, first solve the problem for an easier case where P(λ = 2) = 0.3 and P(λ = 2.5) = 0.7.)

Solution.

1. Let L denote the number of accidents of this person. We don't know the mean λ associated with this person, but given λ = x, L is a Poisson random variable with mean x. Therefore,

   P(L = 0 | λ = x) = e^{−x}.   (0.1)

By the total probability formula for the continuous random variable λ,

   P(L = 0) = ∫_0^∞ P(L = 0 | λ = x) p_λ(x) dx,

where p_λ(x) is the density of λ. Since P(λ < x) = 1 − e^{−x}, λ is an exponential random variable with mean 1, so p_λ(x) = e^{−x} for x > 0. We also know that given λ = x, L is a Poisson random variable with mean x, therefore

   P(L = 0) = ∫_0^∞ e^{−x} e^{−x} dx = 1/2.

2. Define two events A and B as follows:

   A := {this driver will not have an accident this year},
   B := {this driver didn't have any accident last year}.

Notice that P(A | λ = x) = P(B | λ = x) = e^{−x}, as in (0.1). Then P(A | B) = P(A ∩ B)/P(B). Observe that A and B are not independent (if somebody had 10 accidents last year, he probably has a larger λ, and this means he is likely to have a few accidents this year too); however, conditionally on λ they are independent (compare this with Example 1.21 on page 37 of your text). Therefore

   P(A ∩ B) = ∫_0^∞ P(A ∩ B | λ = x) p_λ(x) dx
   = ∫_0^∞ P(A | λ = x) P(B | λ = x) p_λ(x) dx
   = ∫_0^∞ e^{−3x} dx = 1/3.

By part 1, P(B) = 1/2, so the final answer is P(A | B) = (1/3)/(1/2) = 2/3.
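Both integrals, and the final conditional probability, can be verified numerically (a check of my own, not part of the exam); the integrals are truncated at x = 50, beyond which the exponential tails are negligible.

```python
import math

def integrate(f, a, b, steps=100_000):
    """Composite midpoint rule on [a, b] -- accurate enough for smooth integrands."""
    h = (b - a) / steps
    return sum(f(a + (j + 0.5) * h) for j in range(steps)) * h

p_no_accident = integrate(lambda x: math.exp(-2 * x), 0.0, 50.0)  # P(L = 0) = P(B)
p_a_and_b     = integrate(lambda x: math.exp(-3 * x), 0.0, 50.0)  # P(A and B)
answer        = p_a_and_b / p_no_accident                         # P(A | B)
```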