MCS 341 Probability Theory                                        Name:
Final Exam: Probability Theory                                    17 December 2010

This is a closed-book test. You may, however, use one new 3-by-5 note card, your previous note cards, the tables provided, and a calculator (for calculation/graphing only). As usual, show support as appropriate. Do FIVE problems. Cross out the number of the one you do not want graded.

Honor pledge: On my honor, I pledge that I have not given, received, or tolerated others' use of unauthorized aid in completing this work. Please sign when you are finished: (signed)

1. (20%) BASIC CONCEPTS
(a) What are the components of the basic probability space model $(S, \mathcal{F}, P)$?
(b) What are the postulates (of Kolmogorov) for the probability measure?
(c) What is a random variable?
(d) What is the distribution of a random variable?
2. (20%) In the Poisson process, the random number $N_t$ of arrivals during the time interval $[0, t]$ (or any interval of duration $t$) has a Poisson distribution with parameter $\lambda t$, where $\lambda$ is a positive parameter (the average arrival rate). Also, the numbers of arrivals in nonoverlapping time intervals are independent random variables.
(a) Let $T_1$ be the time (after $t = 0$) until the first arrival. Let $t > 0$. $P(T_1 > t) = {}$?
(b) Let $f(t)$ be the density function of the random time $T_1$. Using the previous part, deduce the simple formula for $f(t)$ for $t > 0$, showing that $T_1$ has an Exponential distribution.
(c) Let $T_2$ be the length of time after $T_1$ until the next arrival. Then $T_1$ and $T_2$ are independent random variables, and $T_1 + T_2$ is the time of the second arrival. What is the distribution of $T_1 + T_2$? Explain, or calculate the density.
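[Not part of the exam: a minimal simulation sketch of parts (a)-(c), assuming numpy is available; the rate $\lambda = 2.0$, the check point $t = 0.5$, and the sample size are arbitrary choices. It checks that $T_1$ behaves like an Exponential($\lambda$) variable and that $T_1 + T_2$ has the Gamma(2, $\lambda$) mean and variance.]

```python
import numpy as np

rng = np.random.default_rng(0)
lam = 2.0          # arrival rate lambda (arbitrary choice for the check)
n = 100_000        # number of simulated realizations

# Interarrival times of a Poisson process are Exponential(lam).
T1 = rng.exponential(scale=1 / lam, size=n)
T2 = rng.exponential(scale=1 / lam, size=n)
S = T1 + T2        # time of the second arrival

# Part (a): P(T1 > t) should equal exp(-lam * t); check at t = 0.5.
t = 0.5
print((T1 > t).mean(), np.exp(-lam * t))

# Part (c): T1 + T2 should match the Gamma(2, lam) mean and variance.
print(S.mean(), 2 / lam)        # Gamma(2, lam) mean = 2/lam
print(S.var(), 2 / lam**2)      # Gamma(2, lam) variance = 2/lam^2
```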
3. (20%) (Jointly distributed discrete r.v.s) Assume that $X$ and $Y$ are discrete random variables whose joint probability function's nonzero values are given in the following table.

           y = -1   y = 0   y = 1
  x = -1     .30     .18     .12
  x =  1     .20     .02     .18

(a) What is the (marginal) probability function of $X$?
(b) What is the (marginal) probability function of $Y$?
(c) What is the conditional probability function of $Y$ given $X$? (Give defined non-zero values only.)
(d) Are $X$ and $Y$ independent random variables? Explain.
Continue on the next page.
The problem continues. Here, again, is the joint probability table.

           y = -1   y = 0   y = 1
  x = -1     .30     .18     .12
  x =  1     .20     .02     .18

(e) Calculate the expected values of $X$ and $Y$.
(f) Calculate the variances of $X$ and $Y$.
(g) Calculate the variance of $X - Y$.
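[Not part of the exam: a numerical check of parts (a)-(g), assuming numpy, the sign reconstruction of the table above ($x \in \{-1, 1\}$, $y \in \{-1, 0, 1\}$), and that part (g) asks for $\mathrm{Var}(X - Y)$.]

```python
import numpy as np

# Joint probability table: rows indexed by x in {-1, 1},
# columns indexed by y in {-1, 0, 1} (signs as reconstructed above).
p = np.array([[0.30, 0.18, 0.12],
              [0.20, 0.02, 0.18]])
x = np.array([-1, 1])
y = np.array([-1, 0, 1])

px = p.sum(axis=1)             # (a) marginal of X
py = p.sum(axis=0)             # (b) marginal of Y
p_y_given_x = p / px[:, None]  # (c) conditional of Y given X; rows sum to 1

EX, EY = px @ x, py @ y                    # (e) expected values
VarX = px @ x**2 - EX**2                   # (f) variances
VarY = py @ y**2 - EY**2

# (g) E[XY] from the joint table, then Cov(X, Y) and Var(X - Y).
EXY = x @ p @ y
Cov = EXY - EX * EY
VarXminusY = VarX + VarY - 2 * Cov

print(px, py, p_y_given_x)
print(EX, EY, VarX, VarY, VarXminusY)
# (d) Independence would require p == np.outer(px, py), which fails here.
```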
4. (20%) Let $X_1, X_2, X_3, \ldots$ be independent random variables all having the Gamma($\alpha$, $\lambda$) distribution for some fixed $\alpha > 0$ and $\lambda > 0$.
(a) Show the calculation that leads to this formula for the moment generating function: $m_{X_j}(s) = \left(\dfrac{\lambda}{\lambda - s}\right)^{\alpha}$ for $s < \lambda$.
(b) For $n = 1, 2, 3, \ldots$, let $S_n = X_1 + X_2 + \cdots + X_n$. Deduce the formula for the moment generating function of $S_n$, and thence deduce the distribution of $S_n$.
(c) Now assume $\alpha = 4$, $\lambda = 0.1$, and $n = 49$. Calculate a good approximation of the value of $P(37 \le S_n/n \le 44)$.
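[Not part of the exam: a sketch for part (c), assuming scipy is available and that the event is $37 \le S_n/n \le 44$ as reconstructed above. It compares the Central Limit Theorem approximation with the exact Gamma probability implied by part (b).]

```python
from scipy.stats import gamma, norm

alpha, lam, n = 4, 0.1, 49

# By part (b), S_n is Gamma(n*alpha, lam); scipy parameterizes by
# shape a and scale = 1/rate.
Sn = gamma(a=n * alpha, scale=1 / lam)
exact = Sn.cdf(44 * n) - Sn.cdf(37 * n)

# CLT: S_n/n is approximately Normal(alpha/lam, alpha/(lam**2 * n)).
mu = alpha / lam                  # = 40
sd = (alpha / lam**2 / n) ** 0.5  # = 20/7
approx = norm.cdf(44, mu, sd) - norm.cdf(37, mu, sd)

print(exact, approx)  # both near 0.77
```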
5. (20%) (Probability calculations) The dice game craps is played with two presumably fair dice as follows: The player rolls the two dice. If the result is 7 or 11, the player wins, while if the result is 2, 3, or 12, the player loses. If the result is 4, 5, 6, 8, 9, or 10, then the player rolls the dice again, repeatedly if necessary, until the first result is matched (a win for the player) or a 7 is obtained (a loss for the player).
(a) Let $X$ be the result of one roll of the two dice. Complete the following table giving the probability function of $X$, $p_X(x) = P(X = x)$.

  x         2    3    4    5    6    7    8    9    10   11   12
  p_X(x)

(b) Give a formula for the conditional probability that the player wins on the $k$th roll after the initial roll if 4 was the result of the initial roll.
(c) Calculate the probability that the player first rolls a 4 and then ultimately wins.
(d) (Extra credit) Show that the probability that the player wins is 244/495. (There's a relatively easy way to do it, and there's an arduous way.)
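[Not part of the exam: a Monte Carlo sketch, assuming numpy, that estimates the player's winning probability for comparison with the value $244/495 \approx 0.4929$ in part (d).]

```python
import numpy as np

rng = np.random.default_rng(1)

def play_craps() -> bool:
    """Simulate one game of craps; return True if the player wins."""
    roll = rng.integers(1, 7) + rng.integers(1, 7)
    if roll in (7, 11):
        return True
    if roll in (2, 3, 12):
        return False
    point = roll  # 4, 5, 6, 8, 9, or 10: roll until point or 7
    while True:
        roll = rng.integers(1, 7) + rng.integers(1, 7)
        if roll == point:
            return True
        if roll == 7:
            return False

n = 200_000
wins = sum(play_craps() for _ in range(n))
print(wins / n, 244 / 495)  # estimate vs. exact value
```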
6. (20%) (Stochastic processes)
(a) Suppose that $X_1, X_2, X_3, \ldots$ are independent identically distributed random variables, and suppose that $P(X_1 = -1) = p$, $P(X_1 = 0) = q$, $P(X_1 = 1) = r$, where $p, q, r$ are nonnegative numbers whose sum is 1. Let $S_0 = X_0$ and, for $n > 0$, let $S_n = X_0 + X_1 + X_2 + \cdots + X_n$. Let $T_0$ denote the first time that $S_n = 0$, i.e., $T_0 = \min\{n : S_n = 0\}$. Likewise, given a positive integer $c$, let $T_c$ denote the first time that $S_n = c$, i.e., $T_c = \min\{n : S_n = c\}$. (If $S_n$ is never $c$, we allow $T_c = \infty$; likewise for $T_0$.) Let $R_k = P(T_0 < T_c \mid S_0 = k)$, which may be interpreted as the probability, starting at $k$, of getting to 0 before getting to $c$. Deduce a recurrence equation for $R_k$ valid for integers $k$ satisfying $0 < k < c$.
(b) Suppose that an electronic component has two states, zero (0) and one (1). Furthermore, at each tick of a clock, it may change state, depending on its current state. Let $X_n$ denote the state at time $n$, and assume that for $n = 0, 1, 2, \ldots$,
$$\begin{bmatrix} P(X_{n+1} = 0 \mid X_n = 0) & P(X_{n+1} = 1 \mid X_n = 0) \\ P(X_{n+1} = 0 \mid X_n = 1) & P(X_{n+1} = 1 \mid X_n = 1) \end{bmatrix} = \begin{bmatrix} 0.8 & 0.2 \\ 0.1 & 0.9 \end{bmatrix}.$$
Calculate
$$\begin{bmatrix} P(X_4 = 0 \mid X_0 = 0) & P(X_4 = 1 \mid X_0 = 0) \\ P(X_4 = 0 \mid X_0 = 1) & P(X_4 = 1 \mid X_0 = 1) \end{bmatrix}.$$
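[Not part of the exam: a short check for part (b), assuming numpy. By the Chapman-Kolmogorov equations, the four-step transition matrix is the fourth power of the one-step matrix.]

```python
import numpy as np

# One-step transition matrix from part (b); rows are current state 0, 1.
P = np.array([[0.8, 0.2],
              [0.1, 0.9]])

# Chapman-Kolmogorov: the 4-step transition matrix is P^4.
P4 = np.linalg.matrix_power(P, 4)
print(P4)  # entry (i, j) is P(X_4 = j | X_0 = i)
```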
(Extra page) Merry Christmas! Happy Hanukkah! Habari Gani! Have a great break!