Probability Models

The list of questions below is provided to help you prepare for the test and exam. It reflects only the theoretical part of the course; you should expect the questions in the test/exam to range over the entire set of notes and exercises for the course. If a proof or any other part of the material we have discussed is not examinable, this is clearly stated in the notes. Most of questions 1-9 are included here to help you revise the Intro to Probability material which is used in this course. Answers to some of these questions should be looked up in your Intro to Probability notes (IPN for short).

Introduction

1. What is a probability space?

2. What is the definition of a random variable?

3. What is the definition of a discrete random variable? List the main properties of the probabilities P(X = x_i) = p_i. Examples of discrete probability mass functions: Bernoulli, binomial, Poisson, geometric, negative binomial. (See your IPN.)

4. What is the definition of the expectation of a discrete random variable?

5. How and under what conditions can we compute E(g(X)), where g : R → R is a function?

6. What is the definition of the conditional probability P(A | B)?

7. Define independence of two events; of any finite number of events.

8. State and prove the Theorem of Total Probability (the total probability formula). Be able to use it as in the examples.

The Voting Problem. This section in the Notes is for those who want to know more about applications of the Theorem of Total Probability. It is not examinable.

9. Explain what the voting problem is. Prove the main result concerning this problem and be able to use it.

Random Walks

10. Give the definition of a random walk on a line.

11. What is the gambler's ruin problem? Give its re-formulation in terms of the behaviour of a random walk on a finite interval. You are supposed to be able to solve problems stated in terms of the gambler's ruin by reducing them to problems about random walks. Examples include finding the probability that one of the players will win the game (and the other will be ruined) or that, when playing in a casino, the gambler wins an infinite amount of money.

12. Let X_t be a simple random walk on [M, N]. Let X_0 = n and let r_n be the probability that the random walk, starting from n, M ≤ n ≤ N, reaches N before reaching M. Prove that

    r_n = p r_{n+1} + q r_{n-1}   if M < n < N,
    r_M = 0,   r_N = 1.
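As a supplementary numerical check (not part of the original question list), the sketch below estimates r_n from question 12 by Monte Carlo simulation and compares it with the closed-form solution that question 14 below asks you to derive. The parameter values are arbitrary illustrative choices.

```python
import random

def hit_prob_mc(n, M, N, p, trials=100_000):
    """Monte Carlo estimate of r_n: the probability that a simple random
    walk started at n (up-step probability p) reaches N before M."""
    wins = 0
    for _ in range(trials):
        x = n
        while M < x < N:
            x += 1 if random.random() < p else -1
        wins += (x == N)
    return wins / trials

def hit_prob_exact(n, M, N, p):
    """Closed form from question 14, with lam = q/p."""
    q = 1 - p
    if p == q:
        return (n - M) / (N - M)
    lam = q / p
    return (lam**n - lam**M) / (lam**N - lam**M)

# Arbitrary illustrative parameters:
print(hit_prob_mc(3, 0, 10, 0.6))     # ~0.716
print(hit_prob_exact(3, 0, 10, 0.6))  # 0.7161...
```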

13. Know the statements of the results about second order difference equations. Proofs are not examinable.

14. Solve the equations stated in question 12 and prove that

    r_n = (λ^n − λ^M) / (λ^N − λ^M)   if p ≠ q,
    r_n = (n − M) / (N − M)           if p = q = 0.5,    (1)

    where λ = q/p. You are supposed to remember this formula.

15. Suppose the random walk starts from n, with M ≤ n. What is the probability that it reaches +∞ before visiting M? Know the derivation of the corresponding formula.

16. Suppose that X_0 = 0. Prove that the probability that the random walk returns to 0 is 2p if p < q and 2q if q ≤ p.

17. Be able to state and prove the Theorem of Total Probability for Expectations.

18. Suppose that a random walk starts from n, 0 ≤ n ≤ N, and stops once it reaches 0 or N. Let E_n be the expected duration of the walk. Prove that

    E_0 = E_N = 0,    (2)
    E_n = p E_{n+1} + q E_{n-1} + 1   for 0 < n < N.    (3)

19. Know how to solve equations (2)-(3) and remember the result in the case p = q = 1/2.

20. Prove the following statement: suppose that p = q = 1/2 and a random walk starts from position 1. Then the expected time until it reaches zero is infinite!

Conditional Expectations as Random Variables

21. Define E(X | Y) and Var(X | Y).

22. Prove the tower law for expectations.

23. Prove that Var(X) = E(Var(X | Y)) + Var(E(X | Y)).

24. Define what a random sum of random variables is. Prove the following theorem.

    Theorem 1. Suppose X_1, X_2, X_3, ... are independent identically distributed random variables with mean µ and variance σ², and that N is another, independent, non-negative integer-valued random variable. Let Y = Σ_{i=1}^{N} X_i. Then

        E(Y) = E(N) µ,
        Var(Y) = σ² E(N) + µ² Var(N).

25. Generating functions. Know all definitions and theorems about probability generating functions.

Branching processes.

26. What is the definition of a branching process?

27. Prove the following theorem.

    Theorem. Suppose that X is a random variable with mean µ and Y_0, Y_1, Y_2, ... is the branching process generated by X. Then E(Y_n) = µ^n, n ≥ 1.
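A minimal simulation sketch (supplementary, not from the notes) illustrating the theorem in question 27, that the generation means of a branching process satisfy E(Y_n) = µ^n. The offspring law used is the one from the example in question 32 below, so µ = 0.7 · 2 = 1.4.

```python
import random

def offspring():
    """Offspring law from the example in question 32:
    P(X=0) = 0.3, P(X=2) = 0.7, so the mean is mu = 1.4."""
    return 2 if random.random() < 0.7 else 0

def simulate_Yn(n_gen):
    """One run of the branching process for n_gen generations, Y_0 = 1."""
    y = 1
    for _ in range(n_gen):
        y = sum(offspring() for _ in range(y))
    return y

# Empirical check of E(Y_n) = mu^n for n = 5:
n_gen, trials = 5, 20_000
avg = sum(simulate_Yn(n_gen) for _ in range(trials)) / trials
print(avg, 1.4**n_gen)   # both should be close to 5.378
```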

28. Prove the following statement: suppose Y_0, Y_1, Y_2, ... is a branching process generated by a random variable X with mean µ < 1. Then lim_{n→∞} P(Y_n = 0) = 1.

29. Let G_n(t) = E(t^{Y_n}) be the probability generating function of Y_n. Prove the following theorem.

    Theorem. Suppose Y_0, Y_1, Y_2, ... is a branching process generated by a random variable X with probability generating function G. Then

        G_n(t) = G_{n-1}(G(t)).    (4)

30. Prove that equation (4) implies

        G_{n+1}(t) = G(G_n(t)).    (5)

31. Denote by θ_n = P(Y_n = 0) the probability of extinction of the branching process by time n. Prove that θ_n = G(θ_{n-1}), with θ_0 = 0.

32. How do you find the probability of ultimate extinction of a branching process? State and prove the related theorem. Example: suppose that P(X = 0) = 0.3, P(X = 2) = 0.7. Find θ_1, θ_2, θ_3. Find the probability of ultimate extinction in this case.

33. State, in terms of the mean value of the generating random variable, the necessary and sufficient condition for the probability of ultimate extinction of a branching process to be equal to 1.

Continuous random variables.

34. What is the definition of a continuous random variable?

35. What is the definition of the probability density function?

36. State the main properties of probability density functions.

37. Know the following examples of probability density functions: uniform, exponential, Gamma, Normal.

38. Define what the cumulative distribution function (c.d.f.) of a random variable X is. How do you find the c.d.f. of X if its p.d.f. f_X(x) is given? How do you find the p.d.f. of X if the c.d.f. F_X is given? How do you find E(g(X)) in terms of f_X(x)?

39. Let X ~ N(µ, σ²) be a normal random variable. Prove that E(X) = µ and Var(X) = σ².

40. Suppose that X and Y are random variables. Define what it means to say that X and Y are jointly continuous.

41. Define the joint probability density function of two random variables.

42. What are the main properties of the joint probability density function of two random variables?
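For the example in question 32, the iteration of question 31 can be carried out directly. A short supplementary sketch: with G(t) = 0.3 + 0.7t², iterating θ_n = G(θ_{n−1}) from θ_0 = 0 produces θ_1, θ_2, θ_3, and the sequence converges to the smallest root of t = G(t) in [0, 1], which is the ultimate extinction probability.

```python
def G(t):
    """PGF of the offspring law in question 32: P(X=0)=0.3, P(X=2)=0.7."""
    return 0.3 + 0.7 * t**2

theta = 0.0                     # theta_0 = 0 (question 31)
for n in range(1, 31):
    theta = G(theta)
    if n <= 3:
        print(f"theta_{n} = {theta:.6f}")   # 0.300000, 0.363000, 0.392238

# Ultimate extinction probability: smallest root of t = 0.3 + 0.7 t^2
# in [0, 1]; the quadratic 0.7 t^2 - t + 0.3 = 0 has roots 3/7 and 1.
print(theta, 3/7)               # the iteration converges to 3/7 = 0.42857...
```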

43. Define the joint distribution function F_{X,Y} of two random variables X, Y. Express F_{X,Y}(x, y) in terms of the joint p.d.f. f_{X,Y} of X, Y.

44. Prove that F_X(x) = F_{X,Y}(x, ∞) and F_Y(y) = F_{X,Y}(∞, y).

45. Prove that f_{X,Y}(x, y) = ∂²F_{X,Y}(x, y) / ∂x ∂y.

46. Prove that the marginal densities can be found as follows:

    f_X(x) = ∫_{−∞}^{∞} f_{X,Y}(x, y) dy   and   f_Y(y) = ∫_{−∞}^{∞} f_{X,Y}(x, y) dx.

47. Give the definition of independence of two random variables.

48. What is the necessary and sufficient condition for two continuous random variables to be independent, expressed in terms of the probability density functions f_{X,Y}(x, y), f_X(x), f_Y(y)?

49. Prove that two continuous random variables X and Y are independent if and only if there are functions g and h such that f_{X,Y}(x, y) = g(x)h(y) for all x, y.

Conditional distributions (continuous case).

50. Let X and Y be jointly continuous random variables with joint density function f_{X,Y}. What is the conditional density function of X given Y = y, f_{X|Y=y}(x)? What is f_{Y|X=x}(y)?

51. What is the definition of E(X | Y = y) and of E(X | Y)?

52. Prove that E(X) = E(E(X | Y)).

53. Let g(x, y) be a function of two real variables x, y. How do you find E(g(X, Y)) in terms of f_{X,Y}(x, y)? Express E((X − µ_1)^k (Y − µ_2)^m) in terms of f_{X,Y}(x, y).

54. Exercise: prove that if the random variables X, Y are independent then E(X^k Y^m) = E(X^k) E(Y^m).

55. What is the definition of the covariance and correlation of two random variables?

56. The bivariate normal distribution. You are not asked to remember the formula for the joint p.d.f. of X, Y; rather, you will be told what f_{X,Y}(x, y) is. But, given f_{X,Y}(x, y), you are supposed to be able to prove all statements concerning the bivariate normal distribution. Exercise: prove that two jointly normal random variables are independent if and only if the parameter ρ of the bivariate normal distribution is zero.

Poisson Processes

57. Give the definition of the Poisson process N(t) with rate λ > 0.

58. What is the joint distribution of the values of the Poisson process at times t_1, t_2, ..., t_n, where t_1 < t_2 < ... < t_n?
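As a supplementary illustration of questions 57 and 62-63 (not part of the sheet), the sketch below builds a Poisson process from i.i.d. exponential inter-arrival times and checks the mean count E(N(t)) = λt; the values of λ and t are arbitrary illustrative choices.

```python
import random

def poisson_count(t, lam):
    """N(t): number of arrivals in [0, t], built from i.i.d. Exp(lam)
    inter-arrival times W_1, W_2, ... (questions 62-63)."""
    s, n = 0.0, 0
    while True:
        s += random.expovariate(lam)   # next inter-arrival time
        if s > t:
            return n
        n += 1

# Empirical check of E(N(t)) = lam * t (arbitrary choices lam=2, t=3):
lam, t, trials = 2.0, 3.0, 20_000
avg = sum(poisson_count(t, lam) for _ in range(trials)) / trials
print(avg, lam * t)   # both close to 6
```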

59. What is the definition of the arrival time T_n of a Poisson process? Prove that

    f_{T_1}(x) = λ e^{−λx} if x > 0, and 0 if x ≤ 0,

    and that

    f_{T_2}(x) = λ² x e^{−λx} if x > 0, and 0 if x ≤ 0.

    Be able to state f_{T_n}(x).

60. Prove that

    f_{T_1,T_2}(x, y) = λ² e^{−λy} if y ≥ x ≥ 0, and 0 otherwise.

61. Prove that, conditionally on T_2 = y, the random variable T_1 is uniformly distributed on [0, y].

62. What is the definition of the inter-arrival time W_n, n ≥ 1? Know the statement of the theorem describing the joint distribution of the n inter-arrival times W_1, W_2, ..., W_n.

63. Prove that W_1, W_2 are independent random variables, each having the exponential distribution with parameter λ.

Inequalities, Law of Large Numbers, Central Limit Theorem.

64. State and prove Markov's inequality.

65. State and prove Chebyshev's inequality.

66. State and prove the Law of Large Numbers.

67. State and prove the Bernoulli Law of Large Numbers.

68. State the Central Limit Theorem (CLT). Be able to apply the Central Limit Theorem and to give answers in terms of an integral or the Φ function (examples in the Notes and exercise sheet 11).
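Finally, a worked illustration of the kind of CLT application question 68 has in mind; the distribution and the numbers here are arbitrary choices, not taken from exercise sheet 11. For S_100, a sum of 100 i.i.d. Uniform[0, 1] variables, E(S_100) = 50 and Var(S_100) = 100/12, so the CLT gives P(S_100 > 55) ≈ 1 − Φ(5 / √(100/12)).

```python
import math, random

def Phi(z):
    """Standard normal c.d.f., expressed via the error function."""
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

# CLT approximation: P(S_100 > 55) ~= 1 - Phi((55 - 50) / sqrt(100/12))
approx = 1 - Phi(5 / math.sqrt(100 / 12))

# Monte Carlo check of the same probability:
trials = 100_000
mc = sum(sum(random.random() for _ in range(100)) > 55
         for _ in range(trials)) / trials

print(approx, mc)   # both close to 0.042
```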