Probability Models

The list of questions below is provided to help you prepare for the test and exam. It reflects only the theoretical part of the course. You should expect the questions in the test/exam to range over the entire set of notes and exercises for the course. If a proof or any other part of the material we have discussed is not examinable, this is clearly stated in the notes. Most of questions 1-9 are included here to help you revise the Intro to Probability material which is used in this course. Answers to some of these questions should be looked up in your Intro to Probability notes (IPN for short).

Introduction

1. What is a probability space?

2. What is the definition of a random variable?

3. What is the definition of a discrete random variable? List the main properties of the probabilities P(X = x_i) = p_i. Examples of discrete probability mass functions: Bernoulli, binomial, Poisson, geometric, negative binomial. (See your IPN.)

4. What is the definition of the expectation of a discrete random variable?

5. How and under what conditions can we compute E(g(X)), where g : R → R is a function?

6. What is the definition of the conditional probability P(A|B)?

7. Define independence of two events; of any finite number of events.

8. State and prove the Theorem of Total Probability (the total probability formula). Be able to use it as in examples.

The Voting Problem. This section in the Notes is for those who want to know more about applications of the Theorem of Total Probability. It is not examinable.

9. Explain what the voting problem is. Prove the main result concerned with this problem and be able to use it.

Random Walks

10. Give the definition of a random walk on a line.

11. What is the gambler's ruin problem?
Give the re-formulation of it in terms of the behaviour of a random walk on a finite interval. You are supposed to be able to solve problems stated in terms of the gambler's ruin by reducing them to problems about random walks. Examples include finding the probability that one of the players will win the game (and the other will be ruined) or that, when playing in a casino, the gambler wins an infinite amount of money.

12. Let X_t be a simple random walk on [M, N]. Let X_0 = n and let r_n be the probability that the random walk starting from n, M ≤ n ≤ N, will reach N before reaching M. Prove that

r_n = p r_{n+1} + q r_{n-1}   if M < n < N,
r_M = 0,   r_N = 1.
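The boundary-value problem above can be checked empirically. The following is a minimal Monte Carlo sketch (not part of the course notes; the function name and parameters are illustrative): it estimates r_n by running many independent walks until they hit M or N.

```python
import random

def hit_prob(n, M, N, p, trials=100_000, seed=1):
    """Monte Carlo estimate of r_n: the probability that a simple random
    walk started at n (step +1 w.p. p, step -1 w.p. q = 1 - p) reaches
    N before reaching M."""
    rng = random.Random(seed)
    wins = 0
    for _ in range(trials):
        x = n
        while M < x < N:
            x += 1 if rng.random() < p else -1
        if x == N:
            wins += 1
    return wins / trials

# Symmetric case p = q = 0.5 on [0, 10], started at 3: the exact answer
# given by formula (1) in question 14 is (3 - 0)/(10 - 0) = 0.3.
print(hit_prob(3, 0, 10, 0.5))
```

Comparing the estimate against the closed form from question 14 is a quick way to test that you have remembered the formula correctly.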
13. Know the statements of results about second order difference equations. Proofs are not examinable.

14. Solve the equations stated in question 12 and prove that

r_n = (λ^n − λ^M) / (λ^N − λ^M)   if p ≠ q,
r_n = (n − M) / (N − M)           if p = q = 0.5,     (1)

where λ = q/p. You are supposed to remember this formula.

15. Suppose the random walk starts from n and M ≤ n. What is the probability that it will reach +∞ before visiting M? Know the derivation of the corresponding formula.

16. Suppose that X_0 = 0. Prove that the probability for a random walk to return to 0 is 2p if p < q and 2q if q < p.

17. Be able to state and prove the Theorem of Total Probability for Expectations.

18. Suppose that a random walk starts from n, 0 ≤ n ≤ N. The walk stops once it reaches 0 or N. Let E_n be the expected duration of the walk. Prove that

E_0 = E_N = 0,     (2)
E_n = p E_{n+1} + q E_{n-1} + 1   for 0 < n < N.     (3)

19. Know how to solve equations (2)-(3) and remember the result in the case p = q = 1/2.

20. Prove the following statement: suppose that p = q = 1/2 and a random walk starts from position 1. Then the expected time until it reaches zero is infinite!

Conditional Expectations as Random Variables

21. Define E(X|Y), Var(X|Y).

22. Prove the Tower Law for expectations.

23. Prove that Var(X) = E(Var(X|Y)) + Var(E(X|Y)).

24. Define what a random sum of random variables is. Prove the following

Theorem 1. Suppose X_1, X_2, X_3, ... are independent identically distributed random variables with mean μ and variance σ², and that N is another independent non-negative integer-valued random variable. Let Y = Σ_{i=1}^{N} X_i. Then

E(Y) = E(N) μ,
Var(Y) = σ² E(N) + μ² Var(N).

25. Generating functions. Know all definitions and theorems about probability generating functions.

Branching processes.

26. What is the definition of a branching process?

27. Prove the following theorem.

Theorem. Suppose that X is a random variable with mean μ and Y_0, Y_1, Y_2, ... is the branching process generated by X. Then E(Y_n) = μ^n, n ≥ 1.
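The identity E(Y_n) = μ^n from question 27 lends itself to a quick simulation. The sketch below (illustrative, not from the notes) uses the two-point offspring law P(X = 0) = 0.3, P(X = 2) = 0.7, for which μ = 1.4, and estimates the mean generation size by direct simulation.

```python
import random

def mean_generation_size(p0, n_gen, trials=200_000, seed=0):
    """Estimate E(Y_n) for a branching process with Y_0 = 1 whose
    offspring law is P(X = 0) = p0, P(X = 2) = 1 - p0 (an
    illustrative choice)."""
    rng = random.Random(seed)
    total = 0
    for _ in range(trials):
        y = 1
        for _ in range(n_gen):
            # each of the y current individuals independently has
            # 2 children with probability 1 - p0, else 0 children
            y = sum(2 for _ in range(y) if rng.random() >= p0)
            if y == 0:
                break  # extinct: all later generations are empty
        total += y
    return total / trials

# Here mu = E(X) = 2 * 0.7 = 1.4, so the theorem gives
# E(Y_3) = 1.4**3 = 2.744.
print(mean_generation_size(0.3, 3))
```

The same simulation loop, with the condition `y == 0` tallied instead, would estimate the extinction probabilities θ_n studied below.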
28. Prove the following statement: suppose Y_0, Y_1, Y_2, ... is a branching process generated by a random variable X with mean μ < 1. Then lim_{n→∞} P(Y_n = 0) = 1.

29. Let G_n(t) = E(t^{Y_n}) be the probability generating function of Y_n. Prove the following theorem.

Theorem. Suppose Y_0, Y_1, Y_2, ... is a branching process generated by a random variable X with probability generating function G. Then

G_n(t) = G_{n-1}(G(t)).     (4)

30. Prove that equation (4) implies

G_{n+1}(t) = G(G_n(t)).     (5)

31. Denote by θ_n = P(Y_n = 0) the probability of extinction of the branching process by time n. Prove that θ_n = G(θ_{n-1}) with θ_0 = 0.

32. How do you find the probability of ultimate extinction of a branching process? State and prove the related theorem. Example. Suppose that P(X = 0) = 0.3, P(X = 2) = 0.7. Find θ_1, θ_2, θ_3. Find the probability of ultimate extinction in this case.

33. State, in terms of the mean value of the generating random variable, the necessary and sufficient condition for the probability of ultimate extinction of a branching process to be equal to 1.

Continuous random variables.

34. What is the definition of a continuous random variable?

35. What is the definition of the probability density function?

36. State the main properties of probability density functions.

37. Know the following examples of probability density functions: uniform, exponential, Gamma, Normal.

38. Define what the cumulative distribution function (c.d.f.) of a random variable X is. How do you find the c.d.f. of X if its p.d.f. f_X(x) is given? And how do you find the p.d.f. of X if the c.d.f. F_X is given? How do you find E(g(X)) in terms of f_X(x)?

39. Let X ~ N(μ, σ²) be a normal random variable. Prove that E(X) = μ, Var(X) = σ².

40. Suppose that X and Y are random variables. Define what it means to say that X and Y are jointly continuous.

41. Define the joint probability density function of two random variables.

42.
What are the main properties of the joint probability density function of two random variables?
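As a concrete companion to these properties, here is a small numeric sketch (illustrative; the density and all names are my own choice, not from the notes). It takes the joint density f(x, y) = 2 e^{−x−y} on 0 < x < y (zero elsewhere) and recovers a marginal by numerically integrating out y, as in the marginal-density formula of question 46.

```python
import math

def f_joint(x, y):
    """Illustrative joint density: f(x, y) = 2 e^{-x-y} for 0 < x < y,
    and 0 otherwise. It is non-negative and integrates to 1."""
    return 2.0 * math.exp(-x - y) if 0 < x < y else 0.0

def marginal_x(x, y_max=25.0, steps=100_000):
    """f_X(x) = integral of f(x, y) dy, approximated by a midpoint
    Riemann sum on [0, y_max] (the tail beyond y_max is negligible)."""
    h = y_max / steps
    return h * sum(f_joint(x, (i + 0.5) * h) for i in range(steps))

# Integrating out y analytically gives f_X(x) = 2 e^{-2x};
# at x = 0.5 that is 2/e, approximately 0.7358.
print(marginal_x(0.5))
```

Note that this f does not factor as g(x)h(y) on the whole plane (its support is the wedge x < y), so by the criterion of question 49 the two variables are not independent.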
43. Define the joint distribution function F_{X,Y} of two random variables X, Y. Express F_{X,Y}(x, y) in terms of the joint p.d.f. f_{X,Y} of X, Y.

44. Prove that F_X(x) = F_{X,Y}(x, ∞), F_Y(y) = F_{X,Y}(∞, y).

45. Prove that f_{X,Y}(x, y) = ∂²/∂x∂y F_{X,Y}(x, y).

46. Prove that the marginal densities can be found as follows:

f_X(x) = ∫_{-∞}^{∞} f_{X,Y}(x, y) dy   and   f_Y(y) = ∫_{-∞}^{∞} f_{X,Y}(x, y) dx.

47. Give the definition of independence of two random variables.

48. What is the necessary and sufficient condition for two continuous random variables to be independent, expressed in terms of the probability density functions f_{X,Y}(x, y), f_X(x), f_Y(y)?

49. Prove that two continuous random variables X and Y are independent if and only if there are functions g and h such that f_{X,Y}(x, y) = g(x)h(y) for all x, y.

Conditional distributions (continuous case).

50. Let X and Y be jointly continuous random variables with joint density function f_{X,Y}. What is the conditional density function of X given Y = y, f_{X|Y=y}(x)? What is f_{Y|X=x}(y)?

51. What is the definition of E(X|Y = y) and of E(X|Y)?

52. Prove that E(X) = E(E(X|Y)).

53. Let g(x, y) be a function of two real variables x, y. How do you find E(g(X, Y)) in terms of f_{X,Y}(x, y)? Express E((X − μ_1)^k (Y − μ_2)^m) in terms of f_{X,Y}(x, y).

54. Exercise. Prove that if random variables X, Y are independent then E(X^k Y^m) = E(X^k) E(Y^m).

55. What is the definition of the covariance and correlation of two random variables?

56. The bivariate normal distribution. You are not asked to remember the formula for the joint p.d.f. of X, Y. Rather, you will be told what f_{X,Y}(x, y) is. But, given f_{X,Y}(x, y), you are supposed to be able to prove all statements concerning the bivariate normal distribution. Exercise. Prove that two normal random variables are independent if and only if the parameter ρ of the bivariate normal distribution is zero.

Poisson Processes

57. Give the definition of the Poisson process N(t) with rate λ > 0.

58.
What is the joint distribution of the values of the Poisson process at times t_1, t_2, ..., t_n, where t_1 < t_2 < ... < t_n?
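The defining property N(t) ~ Poisson(λt) can be checked empirically by building the process from independent exponential inter-arrival times, the construction behind questions 59-63. A minimal sketch (names and parameter values are illustrative):

```python
import random

def count_by(t, lam, rng):
    """N(t): the number of arrivals in [0, t] of a rate-lam Poisson
    process, built from i.i.d. Exp(lam) inter-arrival times."""
    s, n = rng.expovariate(lam), 0
    while s <= t:
        n += 1
        s += rng.expovariate(lam)
    return n

rng = random.Random(42)
lam, t = 2.0, 3.0
samples = [count_by(t, lam, rng) for _ in range(100_000)]
mean = sum(samples) / len(samples)
var = sum((s - mean) ** 2 for s in samples) / len(samples)
# For a Poisson(lam * t) count, both the mean and the variance
# should be close to lam * t = 6.
print(mean, var)
```

That the sample mean and sample variance agree is itself a fingerprint of the Poisson distribution, for which E(N(t)) = Var(N(t)) = λt.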
59. What is the definition of the arrival time T_n of a Poisson process? Prove that

f_{T_1}(x) = λ e^{-λx} if x > 0, and 0 if x ≤ 0.

Prove that

f_{T_2}(x) = λ² x e^{-λx} if x > 0, and 0 if x ≤ 0.

Be able to state f_{T_n}(x).

60. Prove that

f_{T_1,T_2}(x, y) = λ² e^{-λy} if y ≥ x ≥ 0, and 0 otherwise.

61. Prove that, conditional on T_2 = y, the random variable T_1 is uniformly distributed on [0, y].

62. What is the definition of the inter-arrival time W_n, n ≥ 1? Know the statement of the theorem describing the joint distribution of the n inter-arrival times W_1, W_2, ..., W_n.

63. Prove that W_1, W_2 are independent random variables each having the exponential distribution with parameter λ.

Inequalities, Law of Large Numbers, Central Limit Theorem.

64. State and prove Markov's inequality.

65. State and prove Chebyshev's inequality.

66. State and prove the Law of Large Numbers.

67. State and prove the Bernoulli Law of Large Numbers.

68. State the Central Limit Theorem (CLT). Be able to apply the Central Limit Theorem. Be able to give answers in terms of an integral or the Φ function (examples in the Notes and exercise sheet 11).
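To see the CLT in action, the sketch below (illustrative, not from the notes) standardises a sum S_n of n i.i.d. Uniform(0,1) variables, for which E(S_n) = n/2 and Var(S_n) = n/12, and compares the empirical probability P(Z ≤ 1) with Φ(1) ≈ 0.8413.

```python
import math
import random

def clt_prob(n=12, trials=200_000, seed=7):
    """Empirical P(Z <= 1), where Z = (S_n - n/2) / sqrt(n/12)
    standardises a sum S_n of n i.i.d. Uniform(0,1) variables."""
    rng = random.Random(seed)
    sd = math.sqrt(n / 12.0)
    hits = 0
    for _ in range(trials):
        z = (sum(rng.random() for _ in range(n)) - n / 2.0) / sd
        if z <= 1.0:
            hits += 1
    return hits / trials

# Phi(1), computed exactly from the error function:
# Phi(x) = (1/2) * (1 + erf(x / sqrt(2))).
phi_1 = 0.5 * (1.0 + math.erf(1.0 / math.sqrt(2.0)))
print(clt_prob(), phi_1)
```

Even at n = 12 the agreement is already close, which is why exam answers are typically left in terms of Φ evaluated at the standardised boundary.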