
PSTAT 160A Final Exam Solutions
December 10, 2015

Name: _______ Student ID #: _______

12 problems, 10 points each. Total: 120 points.

1. (10 points) Take a Markov chain with the state space {1, 2, 3, 4} and transition matrix A. Find P(X_2 = 4) if the initial distribution is P(X_0 = 1) = 0.3, P(X_0 = 2) = 0.4, P(X_0 = 3) = 0, P(X_0 = 4) = 0.3.

Solution: The distribution of X_1 is given by the vector p(1) = p(0)A, where p(0) = [0.3 0.4 0 0.3]. Similarly, the distribution of X_2 is given by the vector p(2) = p(1)A. The fourth component of this vector is equal to P(X_2 = 4). Calculating this, we get: P(X_2 = 4) = 0.132.
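The matrix A itself did not survive in this transcription, so the value 0.132 cannot be re-derived here; the computation, however, is just two vector–matrix multiplications. A minimal Python sketch, with a made-up 4×4 stochastic matrix standing in for the missing A:

```python
# Distribution propagation for a Markov chain: p(n+1) = p(n) A.
# The matrix A below is a made-up example (the exam's matrix is missing
# from this transcription); the method, not the numbers, is the point.
A = [
    [0.1, 0.2, 0.3, 0.4],
    [0.25, 0.25, 0.25, 0.25],
    [0.0, 0.5, 0.5, 0.0],
    [0.4, 0.3, 0.2, 0.1],
]
p0 = [0.3, 0.4, 0.0, 0.3]  # initial distribution P(X_0 = i)

def step(p, A):
    """One step of the chain: returns the row vector p A."""
    return [sum(p[i] * A[i][j] for i in range(len(p))) for j in range(len(A[0]))]

p1 = step(p0, A)
p2 = step(p1, A)
print(p2[3])  # P(X_2 = 4): the fourth component of p(2)
```

Substituting the exam's matrix for A would reproduce P(X_2 = 4) = 0.132.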

2. (10 points) Fix the parameters N_1 = 10000, N_2 = 5000, N_3 = 2000, λ_1 = 2, λ_2 = 3, λ_3 = 5. Assume that an insurance company has N_1 clients with claims distributed as Poisson with parameter λ_1: Poi(λ_1), N_2 clients with claims Poi(λ_2), and N_3 clients with claims Poi(λ_3). Suppose that this company wants to assign to each client a premium proportional to the mean amount (expected value) of his claim. The company wants to collect enough money from the premiums so that it can pay all the claims with probability greater than or equal to 99%. Find the premium for each client.

Solution: Use the normal approximation (Central Limit Theorem). Let X_1, ..., X_{N_1} ~ Poi(λ_1) i.i.d., Y_1, ..., Y_{N_2} ~ Poi(λ_2) i.i.d., Z_1, ..., Z_{N_3} ~ Poi(λ_3) i.i.d. The total amount of claims is equal to

S := X_1 + ... + X_{N_1} + Y_1 + ... + Y_{N_2} + Z_1 + ... + Z_{N_3}.

By the Central Limit Theorem, (S − ES)/√(Var S) ≈ N(0, 1). Because for ξ ~ Poi(λ) we have Eξ = Var ξ = λ, we get:

ES = EX_1 + ... + EX_{N_1} + EY_1 + ... + EY_{N_2} + EZ_1 + ... + EZ_{N_3} = N_1λ_1 + N_2λ_2 + N_3λ_3 = 45000,
Var S = Var X_1 + ... + Var X_{N_1} + Var Y_1 + ... + Var Y_{N_2} + Var Z_1 + ... + Var Z_{N_3} = N_1λ_1 + N_2λ_2 + N_3λ_3 = 45000.

Therefore (S − 45000)/√45000 ≈ N(0, 1), and since the 99% quantile of the standard normal distribution is z_{0.99} = 2.326,

P(S ≤ 45000 + 2.326·√45000) ≈ 0.99.

In other words, the company needs to collect 45000 + 2.326·√45000 ≈ 45493 from premiums. If the premium for a client with Poi(λ_i) claim is λ_i·u, for i = 1, 2, 3, then the total amount of premiums is λ_1uN_1 + λ_2uN_2 + λ_3uN_3 = 45000u. Therefore, we need 45000u = 45493, and u ≈ 1.011. The clients with Poi(2) claims are charged 2u ≈ 2.022, the clients with Poi(3) claims are charged 3u ≈ 3.033, and the clients with Poi(5) claims are charged 5u ≈ 5.055.
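The arithmetic can be checked in a few lines of Python (z = 2.326 is the 99% standard-normal quantile from the table at the end of the exam):

```python
import math

# Normal-approximation premium calculation with Problem 2's numbers.
N = [10000, 5000, 2000]
lam = [2, 3, 5]
z99 = 2.326  # 99% quantile of N(0, 1)

mean_S = sum(n * l for n, l in zip(N, lam))  # E S = 45000
var_S = mean_S                               # Poisson: variance = mean
needed = mean_S + z99 * math.sqrt(var_S)     # money to collect, ≈ 45493
u = needed / mean_S                          # premium per unit of mean claim
print(round(needed), [round(l * u, 3) for l in lam])
```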

3. (10 points) Find all stationary distributions for the Markov chain with the transition matrix

A = [0.4 0 0.6; 0 1 0; 0.1 0 0.9].

Solution: A stationary distribution p = [p_1 p_2 p_3] must satisfy p = pA. Write this as a system of equations:

p_1 = 0.4p_1 + 0.1p_3
p_2 = p_2
p_3 = 0.6p_1 + 0.9p_3

The first and third equations both reduce to 0.6p_1 = 0.1p_3, that is, 6p_1 = p_3, while the second carries no information. Combining this with p_1 + p_2 + p_3 = 1, denote p_2 = u. Then

p_1 + p_3 = 1 − u and 6p_1 = p_3, so p_1 = (1 − u)/7, p_3 = 6(1 − u)/7.

The final answer is (where 0 ≤ u ≤ 1):

[ (1 − u)/7   u   6(1 − u)/7 ].
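A quick numerical check that every member of this family is stationary. The matrix below is the one implied by the equation system in the solution (the displayed matrix was garbled in transcription), so it is an inference rather than a quotation:

```python
# Check the one-parameter family of stationary distributions p = pA.
# A is reconstructed from the equations p1 = 0.4 p1 + 0.1 p3, p2 = p2,
# p3 = 0.6 p1 + 0.9 p3 in the solution.
A = [
    [0.4, 0.0, 0.6],
    [0.0, 1.0, 0.0],
    [0.1, 0.0, 0.9],
]

def is_stationary(p, A, tol=1e-9):
    """True if the row vector p satisfies p = p A up to rounding."""
    pA = [sum(p[i] * A[i][j] for i in range(3)) for j in range(3)]
    return all(abs(pA[j] - p[j]) < tol for j in range(3))

for u in (0.0, 0.25, 0.5, 1.0):
    p = [(1 - u) / 7, u, 6 * (1 - u) / 7]
    assert is_stationary(p, A) and abs(sum(p) - 1) < 1e-9
print("all stationary")
```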

4. (10 points) Toss a fair coin repeatedly. Let F_n be the σ-subalgebra generated by the first n results, F_0 := {∅, Ω}. Let X_n be the number of Heads during the first n tosses for n = 1, 2, ..., and X_0 := 0. Find a constant c such that the process (Y_n)_{n≥0} is an (F_n)_{n≥0}-martingale:

Y_n := 3X_n − cn, n = 0, 1, 2, ...

Solution: We can represent X_n as follows. For n = 1, 2, ..., let

Z_n = 1 if Heads on the nth toss, and Z_n = 0 otherwise.

Then X_n = Z_1 + ... + Z_n. Now, Z_1, Z_2, ... are i.i.d. Bernoulli: P(Z_i = 0) = P(Z_i = 1) = 1/2. Also, F_n is generated by Z_1, ..., Z_n. Therefore, Y_n = 3X_n − cn = 3(Z_1 + ... + Z_n) − cn, and Y_{n+1} = Y_n + 3Z_{n+1} − c. Therefore, because Y_n is F_n-measurable, and Z_{n+1} is independent of F_n, we have:

E(Y_{n+1} | F_n) = E(Y_n + 3Z_{n+1} − c | F_n) = Y_n + E(3Z_{n+1} − c).

To make (Y_n)_{n≥0} a martingale, we need E(3Z_{n+1} − c) = 0. This can be rewritten as 3 · (1/2) − c = 0, so c = 3/2.
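A Monte Carlo sanity check of c = 3/2: a martingale has constant expectation, so E Y_n should stay near Y_0 = 0 for every n. A short simulation (the horizon n = 20 and the sample size are arbitrary choices):

```python
import random

# With c = 3/2, Y_n = 3 X_n - c n should have mean Y_0 = 0,
# since a martingale has constant expectation.
random.seed(0)
c = 1.5
n, trials = 20, 100_000
total = 0.0
for _ in range(trials):
    heads = sum(random.randint(0, 1) for _ in range(n))  # X_n: Heads count
    total += 3 * heads - c * n                           # Y_n
print(abs(total / trials))  # sample mean of Y_n, close to 0
```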

5. (10 points) Consider the probability space Ω = {0, 1, 2, ..., 11} with three random variables:

X(ω) = 1 if ω ≤ 5, and X(ω) = 2 if ω ≥ 6;
Y(ω) = 1 if ω is even, and Y(ω) = 10 if ω is odd;
Z(ω) = 12 if ω ∈ {6, a, b}, and Z(ω) = −12 otherwise.

Here, a, b ∈ Ω are some elements, a < b. Find the values of a and b such that Z is measurable with respect to the σ-subalgebra F := σ(X, Y) generated by X and Y.

Solution: The σ-subalgebra F contains the events

{X = 1, Y = 1} = {0, 2, 4}, {X = 2, Y = 1} = {6, 8, 10},
{X = 1, Y = 10} = {1, 3, 5}, {X = 2, Y = 10} = {7, 9, 11}.

Other events in F (except the empty set ∅) are unions of these events. For Z to be F-measurable, the event {Z = 12} has to belong to this σ-subalgebra, that is, to be a union of some of these events. But {Z = 12} = {6, a, b}, so this is possible only if {6, a, b} = {6, 8, 10}, that is, a = 8, b = 10.
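The atom argument can be checked mechanically: Z is σ(X, Y)-measurable exactly when {Z = 12} is a union of atoms of the partition generated by the pair (X, Y). A small Python check:

```python
# Compute the atoms of σ(X, Y) on Ω = {0, ..., 11} and verify that
# a = 8, b = 10 is the only choice making {6, a, b} measurable.
Omega = range(12)
X = lambda w: 1 if w <= 5 else 2
Y = lambda w: 1 if w % 2 == 0 else 10

# Atoms of σ(X, Y): the level sets of the pair (X, Y).
atoms = {}
for w in Omega:
    atoms.setdefault((X(w), Y(w)), set()).add(w)
print(sorted(map(sorted, atoms.values())))

def measurable(event):
    """An event is in σ(X, Y) iff it is a union of atoms."""
    return all(a <= event or a.isdisjoint(event) for a in atoms.values())

assert measurable({6, 8, 10})     # a = 8, b = 10 works
assert not measurable({6, 7, 8})  # any other choice splits an atom
```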

6. (10 points) Consider the following independent random variables: X_n ~ N(0, 3^{−n}), n = 1, 2, ... Show that for all N = 1, 2, ..., we have:

P( max(0, X_1, X_1 + X_2, ..., X_1 + ... + X_N) ≥ 6 ) ≤ 1/72.

Solution: The process (S_n)_{n≥0}, defined by S_0 := 0, S_n := X_1 + ... + X_n, n = 1, 2, ..., is an (F_n)_{n≥0}-martingale, where F_0 := {∅, Ω}, and F_n is generated by X_1, ..., X_n. Apply the martingale inequality for λ = 6 and p = 2:

P( max_{n=0,...,N} S_n ≥ 6 ) ≤ E|S_N|²/λ².

Now, ES_N = EX_1 + ... + EX_N = 0, and so

E|S_N|² = ES_N² = ES_N² − (ES_N)² = Var S_N = Var X_1 + ... + Var X_N = Σ_{n=1}^N 3^{−n} ≤ Σ_{n=1}^∞ 3^{−n} = 1/2.

Therefore,

E|S_N|²/λ² ≤ (1/2)/6² = 1/72.

This completes the proof.
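A simulation illustrates how conservative the bound is: S_N has standard deviation at most 1/√2, so level 6 lies more than eight standard deviations out, and the empirical frequency of the event is far below 1/72. (The horizon N and the sample size below are arbitrary choices.)

```python
import math, random

# Simulate the running maximum of S_n = X_1 + ... + X_n with
# X_n ~ N(0, 3^-n) and compare its tail frequency at level 6
# with the martingale bound 1/72.
random.seed(2)
N, trials, hits = 10, 50_000, 0
for _ in range(trials):
    s, m = 0.0, 0.0
    for n in range(1, N + 1):
        s += random.gauss(0.0, math.sqrt(3.0 ** -n))  # X_n ~ N(0, 3^-n)
        m = max(m, s)
    hits += m >= 6
print(hits / trials, 1 / 72)  # empirical frequency vs the bound
```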

7. (10 points) Take i.i.d. random variables X_1, X_2, ..., as well as a Poisson random variable τ ~ Poi(λ), independent of X_1, X_2, ... Assume EX_i = µ, Var X_i = σ², Ee^{tX_i} = φ(t) for t ∈ R. Consider the following sum:

S = Σ_{k=1}^τ X_k.

Show that Ee^{tS} = e^{λ(φ(t)−1)} for t ∈ R. Using this, express ES and Var S as combinations of µ, σ², and λ.

Solution: Because τ ~ Poi(λ), we have for k = 0, 1, 2, ...

P(τ = k) = λ^k e^{−λ}/k!.

Therefore (because X_1, X_2, ... are i.i.d. and independent of τ),

ψ(t) := Ee^{tS} = Σ_{k=0}^∞ Ee^{t(X_1 + ... + X_k)} · λ^k e^{−λ}/k! = Σ_{k=0}^∞ (Ee^{tX_i})^k · λ^k e^{−λ}/k! = Σ_{k=0}^∞ φ^k(t) λ^k e^{−λ}/k! = e^{−λ} Σ_{k=0}^∞ (φ(t)λ)^k/k! = e^{−λ} e^{λφ(t)} = e^{λ(φ(t)−1)}.

Next,

ψ′(t) = λφ′(t)ψ(t), ψ″(t) = λφ″(t)ψ(t) + λ²(φ′(t))²ψ(t).

Letting t = 0, we have ψ(0) = 1, φ′(0) = EX_i = µ, φ″(0) = EX_i² = µ² + σ², and therefore

ES = ψ′(0) = λφ′(0)ψ(0) = λµ,
ES² = ψ″(0) = λ(µ² + σ²) + λ²µ² = µ²(λ + λ²) + σ²λ,
Var S = ES² − (ES)² = λ(µ² + σ²).
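The two formulas can be checked by simulation. The choice X_i ~ N(µ, σ²) below is only an illustration; the identities ES = λµ and Var S = λ(µ² + σ²) hold for any i.i.d. claim distribution:

```python
import math, random, statistics

# Monte Carlo check of E S = λμ and Var S = λ(μ² + σ²) for the compound
# sum S = X_1 + ... + X_τ with τ ~ Poi(λ), here with X_i ~ N(μ, σ²).
random.seed(1)
lam, mu, sigma = 3.0, 2.0, 0.5

def poisson(l):
    """Sample from Poi(l) by Knuth's product-of-uniforms method."""
    L, k, p = math.exp(-l), 0, 1.0
    while p > L:
        k += 1
        p *= random.random()
    return k - 1

samples = []
for _ in range(100_000):
    tau = poisson(lam)
    samples.append(sum(random.gauss(mu, sigma) for _ in range(tau)))

print(statistics.fmean(samples), statistics.pvariance(samples))
# theory: E S = λμ = 6.0, Var S = λ(μ² + σ²) = 12.75
```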

8. (10 points) Consider the Markov chain with transition matrix A. For each transient state i, find the mean time m_i spent in i if you start from i.

Solution: There are three transient states: i = 1, 2, 5. The corresponding submatrix of transition probabilities is

P = [0.4 0 0; 0 0.3 0; 0 0.2 0.5].

Let us calculate (I_3 − P)^{−1}. We can compute this inverse by splitting it into two blocks: the upper-left corner element and the 2×2 lower-right corner matrix. The upper-left element of I_3 − P is 0.6, which inverts to 5/3. Let us invert the 2×2 block:

[0.7 0; −0.2 0.5]^{−1} = (1/0.35)·[0.5 0; 0.2 0.7] = [10/7 0; 4/7 2].

Therefore,

(I_3 − P)^{−1} = [5/3 0 0; 0 10/7 0; 0 4/7 2],

and the diagonal entries give the mean times:

m_1 = 5/3, m_2 = 10/7, m_5 = 2.
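The diagonal of (I_3 − P)^{−1} can be recomputed exactly with rational arithmetic. The submatrix P below is the one reconstructed from the printed inverse (the original display was garbled), so treat it as inferred rather than quoted:

```python
from fractions import Fraction as F

# Exact fundamental-matrix computation for the transient states 1, 2, 5.
# P is inferred from the inverse printed in the solution.
P = [[F(2, 5), F(0), F(0)],
     [F(0), F(3, 10), F(0)],
     [F(0), F(1, 5), F(1, 2)]]

n = 3
M = [[(1 if i == j else 0) - P[i][j] for j in range(n)] for i in range(n)]

# M = I - P is lower-triangular here, so invert it by forward
# substitution, solving M x = e_j column by column.
inv = [[F(0)] * n for _ in range(n)]
for j in range(n):
    for i in range(n):
        rhs = F(1) if i == j else F(0)
        rhs -= sum(M[i][k] * inv[k][j] for k in range(i))
        inv[i][j] = rhs / M[i][i]

print([inv[i][i] for i in range(n)])  # mean times m_1, m_2, m_5
```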

9. (10 points) For a random walk (X_n)_{n≥0}, starting from X_0 = −2, with p = 0.4, q = 0.6, find

P( X_6 = 0, X_12 = 2, X_n ≥ −2, n = 0, ..., 12 ).

Solution: First, let us find the number of trajectories of the random walk from (0, −2) to (6, 0) which stay above the line y = −3. This number is equal to the total number of trajectories from (0, −2) to (6, 0), minus the number of such trajectories which touch or cross the line y = −3. The latter number, by the reflection principle, is equal to the number of trajectories from (0, −2) to the reflection of (6, 0) with respect to the line y = −3, that is, to (6, −6).

The number of trajectories from (0, −2) to (6, 0): if they make a steps up and b steps down, then a + b = 6 and a − b = 0 − (−2) = 2, so a = 4 and b = 2. We need to choose 2 downward steps from the total of 6 steps. This can be done in C(6, 2) = (6·5)/2 = 15 ways.

The number of trajectories from (0, −2) to (6, −6): if they make a steps up and b steps down, then a + b = 6 and a − b = −6 − (−2) = −4, so a = 1 and b = 5. We need to choose 1 upward step from the total of 6 steps. This can be done in 6 ways.

Thus, the number of trajectories from (0, −2) to (6, 0) which stay above y = −3 is equal to 15 − 6 = 9. Each such trajectory has 4 steps upward and 2 steps downward, so its probability is p⁴q². Therefore,

P( X_6 = 0, X_n ≥ −2, n = 0, ..., 6 ) = 9p⁴q².

Now, consider the next half of the trajectory. Let us find the number of trajectories of the random walk from (6, 0) to (12, 2) which stay above the line y = −3. Every such trajectory makes a steps upward and b steps downward, with a + b = 6 and a − b = 2, so a = 4 and b = 2. The trajectory makes only 2 steps downward, so starting from (6, 0) it simply cannot touch the line y = −3. So we need only to count the trajectories from (6, 0) to (12, 2). This is equivalent to choosing 4 upward steps from 6 total steps, which can be done in C(6, 4) = (6·5)/2 = 15 ways.

Each such trajectory has probability p⁴q², so the total probability that the random walk goes from (6, 0) to (12, 2) this way is 15p⁴q². Multiplying by the probability 9p⁴q² of the first half, we get the answer: 135p⁸q⁴.
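With only 2^12 = 4096 possible paths, the answer can be verified by brute force:

```python
from itertools import product

# Brute-force check of 135 p^8 q^4: enumerate every ±1 path of length 12
# started at X_0 = -2 and add the probabilities of those with X_6 = 0,
# X_12 = 2, and X_n >= -2 throughout.
p, q = 0.4, 0.6
total = 0.0
for steps in product((1, -1), repeat=12):
    x, path = -2, [-2]
    for s in steps:
        x += s
        path.append(x)
    if path[6] == 0 and path[12] == 2 and min(path) >= -2:
        ups = steps.count(1)
        total += p ** ups * q ** (12 - ups)

print(total, 135 * p ** 8 * q ** 4)  # the two numbers agree
```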

10. (10 points) Find the probability that the Markov chain with transition matrix A, starting from 3, hits 4 before hitting 1.

Solution: Suppose that p(x) is the probability that, starting from x, the Markov chain hits 4 before 1. Then p(4) = 1, p(1) = 0, and by the Markov property we have:

p(3) = 0.5p(2) + 0.1p(3) + 0.4p(4), p(2) = 0.5p(1) + 0.5p(3).

We can rewrite these equations as

0.9p(3) = 0.5p(2) + 0.4, p(2) = 0.5p(3).

Solving them:

0.9p(3) = 0.25p(3) + 0.4, so 0.65p(3) = 0.4, and p(3) = 0.4/0.65 = 8/13.
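The substitution can be carried out in exact arithmetic:

```python
from fractions import Fraction as F

# Solve the two hitting-probability equations by substitution:
#   0.9 p3 = 0.5 p2 + 0.4,   p2 = 0.5 p3
# => 0.9 p3 = 0.25 p3 + 0.4 => p3 = 0.4 / 0.65.
p3 = F(4, 10) / (F(9, 10) - F(5, 10) * F(5, 10))
p2 = F(1, 2) * p3
print(p3, p2)
```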

11. (10 points) Chebyshev's inequality states that, for λ > 0 and a random variable X, we have:

P( |X − EX| ≥ λ ) ≤ Var X / λ².

For λ = 2, find an example of X which turns it into an equality, but with positive left- and right-hand sides.

Solution: Try X with the distribution P(X = 2) = P(X = −2) = 0.5. Then EX = 2 · 0.5 + (−2) · 0.5 = 0, and Var X = EX² − (EX)² = EX² = 4, because X² = 4 always. Therefore, the left-hand side is P(|X| ≥ 2) = 1 and the right-hand side is 4/2² = 1: both sides of the inequality are equal to 1.
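A two-line check that both sides equal 1:

```python
# Chebyshev equality case: X = ±2 with probability 1/2 each, λ = 2.
outcomes = {2: 0.5, -2: 0.5}
EX = sum(x * p for x, p in outcomes.items())
VarX = sum((x - EX) ** 2 * p for x, p in outcomes.items())
lhs = sum(p for x, p in outcomes.items() if abs(x - EX) >= 2)  # P(|X-EX| >= λ)
rhs = VarX / 2 ** 2                                            # Var X / λ²
print(lhs, rhs)  # both equal 1.0
```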

12. (10 points) Suppose the company has N_1 clients, who have car insurance for n_1 = 10 years, and N_2 clients, who have car insurance for n_2 = 12 years. In each group, three-quarters of the clients are careful drivers, and one-quarter are wild drivers. Each month, a careful driver can have an accident with probability p_1, and a wild driver can have an accident with probability p_2. All accidents occur independently. For each accident, the company has to pay $1000. The company wants to collect enough money from the premiums so that it can pay all the claims with probability greater than or equal to 95%. Find the amount of money needed to do this.

Solution: First, let us model the total number X of accidents with a Poisson random variable. Each client with insurance for n_i years is insured for 12n_i months, and in each month a randomly chosen client has an accident with probability (3/4)p_1 + (1/4)p_2. Therefore the parameter is

λ = 12( (3/4)p_1 + (1/4)p_2 )(n_1N_1 + n_2N_2).

Let us find the smallest n such that P(X ≤ n) ≥ 95%. We have:

P(X = k) = λ^k e^{−λ}/k!, k = 0, 1, 2, ...,

and after calculations we find n = 3. So the company needs $3000.
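The values of N_1, N_2, p_1, p_2 (and hence λ) are illegible in this copy, so the final step is illustrated with a hypothetical λ; the structure of the computation is unchanged. (For the hypothetical λ = 1.3 below the answer happens to be n = 3 as well.)

```python
import math

# Smallest n with P(X <= n) >= 0.95 for X ~ Poi(λ), by accumulating the
# Poisson probability mass function until the target is reached.
def poisson_quantile(lam, target=0.95):
    cdf, k = 0.0, 0
    while True:
        cdf += lam ** k / math.factorial(k) * math.exp(-lam)
        if cdf >= target:
            return k
        k += 1

lam = 1.3  # hypothetical stand-in for the exam's garbled λ
n = poisson_quantile(lam)
print(n, 1000 * n)  # number of claims to cover, and dollars needed
```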

Cumulative Probabilities of the Standard Normal Distribution

The table gives the probabilities α = Φ(z) to the left of given z values for the standard normal distribution. For example, the probability that a standard normal random variable Z is less than 1.53 is found at the intersection of the 1.5 row and the 0.03 column: Φ(1.53) = P(Z ≤ 1.53) = 0.9370. Due to symmetry, it holds that Φ(−z) = 1 − Φ(z) for all z.

Quantiles of the Standard Normal Distribution

For selected probabilities α, the table shows the values of the quantiles z_α such that Φ(z_α) = P(Z ≤ z_α) = α, where Z is a standard normal random variable. The quantiles satisfy the relation z_{1−α} = −z_α.



More information

MATH 118 FINAL EXAM STUDY GUIDE

MATH 118 FINAL EXAM STUDY GUIDE MATH 118 FINAL EXAM STUDY GUIDE Recommendations: 1. Take the Final Practice Exam and take note of questions 2. Use this study guide as you take the tests and cross off what you know well 3. Take the Practice

More information

Basic Probability space, sample space concepts and order of a Stochastic Process

Basic Probability space, sample space concepts and order of a Stochastic Process The Lecture Contains: Basic Introduction Basic Probability space, sample space concepts and order of a Stochastic Process Examples Definition of Stochastic Process Marginal Distributions Moments Gaussian

More information

Recitation 2: Probability

Recitation 2: Probability Recitation 2: Probability Colin White, Kenny Marino January 23, 2018 Outline Facts about sets Definitions and facts about probability Random Variables and Joint Distributions Characteristics of distributions

More information

Notes 15 : UI Martingales

Notes 15 : UI Martingales Notes 15 : UI Martingales Math 733 - Fall 2013 Lecturer: Sebastien Roch References: [Wil91, Chapter 13, 14], [Dur10, Section 5.5, 5.6, 5.7]. 1 Uniform Integrability We give a characterization of L 1 convergence.

More information

18.440: Lecture 28 Lectures Review

18.440: Lecture 28 Lectures Review 18.440: Lecture 28 Lectures 17-27 Review Scott Sheffield MIT 1 Outline Continuous random variables Problems motivated by coin tossing Random variable properties 2 Outline Continuous random variables Problems

More information

More on Distribution Function

More on Distribution Function More on Distribution Function The distribution of a random variable X can be determined directly from its cumulative distribution function F X. Theorem: Let X be any random variable, with cumulative distribution

More information

18.440: Lecture 19 Normal random variables

18.440: Lecture 19 Normal random variables 18.440 Lecture 19 18.440: Lecture 19 Normal random variables Scott Sheffield MIT Outline Tossing coins Normal random variables Special case of central limit theorem Outline Tossing coins Normal random

More information

On Finite-Time Ruin Probabilities in a Risk Model Under Quota Share Reinsurance

On Finite-Time Ruin Probabilities in a Risk Model Under Quota Share Reinsurance Applied Mathematical Sciences, Vol. 11, 217, no. 53, 269-2629 HIKARI Ltd, www.m-hikari.com https://doi.org/1.12988/ams.217.7824 On Finite-Time Ruin Probabilities in a Risk Model Under Quota Share Reinsurance

More information

The Geometric Random Walk: More Applications to Gambling

The Geometric Random Walk: More Applications to Gambling MATH 540 The Geometric Random Walk: More Applications to Gambling Dr. Neal, Spring 2008 We now shall study properties of a random walk process with only upward or downward steps that is stopped after the

More information

CS 246 Review of Proof Techniques and Probability 01/14/19

CS 246 Review of Proof Techniques and Probability 01/14/19 Note: This document has been adapted from a similar review session for CS224W (Autumn 2018). It was originally compiled by Jessica Su, with minor edits by Jayadev Bhaskaran. 1 Proof techniques Here we

More information

n px p x (1 p) n x. p x n(n 1)... (n x + 1) x!

n px p x (1 p) n x. p x n(n 1)... (n x + 1) x! Lectures 3-4 jacques@ucsd.edu 7. Classical discrete distributions D. The Poisson Distribution. If a coin with heads probability p is flipped independently n times, then the number of heads is Bin(n, p)

More information

Solutions to Problem Set 4

Solutions to Problem Set 4 UC Berkeley, CS 174: Combinatorics and Discrete Probability (Fall 010 Solutions to Problem Set 4 1. (MU 5.4 In a lecture hall containing 100 people, you consider whether or not there are three people in

More information

1 Gambler s Ruin Problem

1 Gambler s Ruin Problem 1 Gambler s Ruin Problem Consider a gambler who starts with an initial fortune of $1 and then on each successive gamble either wins $1 or loses $1 independent of the past with probabilities p and q = 1

More information

ELEG 3143 Probability & Stochastic Process Ch. 2 Discrete Random Variables

ELEG 3143 Probability & Stochastic Process Ch. 2 Discrete Random Variables Department of Electrical Engineering University of Arkansas ELEG 3143 Probability & Stochastic Process Ch. 2 Discrete Random Variables Dr. Jingxian Wu wuj@uark.edu OUTLINE 2 Random Variable Discrete Random

More information

1. If X has density. cx 3 e x ), 0 x < 0, otherwise. Find the value of c that makes f a probability density. f(x) =

1. If X has density. cx 3 e x ), 0 x < 0, otherwise. Find the value of c that makes f a probability density. f(x) = 1. If X has density f(x) = { cx 3 e x ), 0 x < 0, otherwise. Find the value of c that makes f a probability density. 2. Let X have density f(x) = { xe x, 0 < x < 0, otherwise. (a) Find P (X > 2). (b) Find

More information

P i [B k ] = lim. n=1 p(n) ii <. n=1. V i :=

P i [B k ] = lim. n=1 p(n) ii <. n=1. V i := 2.7. Recurrence and transience Consider a Markov chain {X n : n N 0 } on state space E with transition matrix P. Definition 2.7.1. A state i E is called recurrent if P i [X n = i for infinitely many n]

More information

Problem #1 #2 #3 #4 Total Points /5 /7 /8 /4 /24

Problem #1 #2 #3 #4 Total Points /5 /7 /8 /4 /24 STAT/MATH 395 A - Winter Quarter 17 - Midterm - February 17, 17 Name: Student ID Number: Problem #1 # #3 #4 Total Points /5 /7 /8 /4 /4 Directions. Read directions carefully and show all your work. Define

More information

4 Branching Processes

4 Branching Processes 4 Branching Processes Organise by generations: Discrete time. If P(no offspring) 0 there is a probability that the process will die out. Let X = number of offspring of an individual p(x) = P(X = x) = offspring

More information

. Find E(V ) and var(v ).

. Find E(V ) and var(v ). Math 6382/6383: Probability Models and Mathematical Statistics Sample Preliminary Exam Questions 1. A person tosses a fair coin until she obtains 2 heads in a row. She then tosses a fair die the same number

More information

Selected Exercises on Expectations and Some Probability Inequalities

Selected Exercises on Expectations and Some Probability Inequalities Selected Exercises on Expectations and Some Probability Inequalities # If E(X 2 ) = and E X a > 0, then P( X λa) ( λ) 2 a 2 for 0 < λ

More information

7 Poisson random measures

7 Poisson random measures Advanced Probability M03) 48 7 Poisson random measures 71 Construction and basic properties For λ 0, ) we say that a random variable X in Z + is Poisson of parameter λ and write X Poiλ) if PX n) e λ λ

More information

Recap of Basic Probability Theory

Recap of Basic Probability Theory 02407 Stochastic Processes? Recap of Basic Probability Theory Uffe Høgsbro Thygesen Informatics and Mathematical Modelling Technical University of Denmark 2800 Kgs. Lyngby Denmark Email: uht@imm.dtu.dk

More information

2. Suppose (X, Y ) is a pair of random variables uniformly distributed over the triangle with vertices (0, 0), (2, 0), (2, 1).

2. Suppose (X, Y ) is a pair of random variables uniformly distributed over the triangle with vertices (0, 0), (2, 0), (2, 1). Name M362K Final Exam Instructions: Show all of your work. You do not have to simplify your answers. No calculators allowed. There is a table of formulae on the last page. 1. Suppose X 1,..., X 1 are independent

More information

Random Processes. DS GA 1002 Probability and Statistics for Data Science.

Random Processes. DS GA 1002 Probability and Statistics for Data Science. Random Processes DS GA 1002 Probability and Statistics for Data Science http://www.cims.nyu.edu/~cfgranda/pages/dsga1002_fall17 Carlos Fernandez-Granda Aim Modeling quantities that evolve in time (or space)

More information

Chapter 3, 4 Random Variables ENCS Probability and Stochastic Processes. Concordia University

Chapter 3, 4 Random Variables ENCS Probability and Stochastic Processes. Concordia University Chapter 3, 4 Random Variables ENCS6161 - Probability and Stochastic Processes Concordia University ENCS6161 p.1/47 The Notion of a Random Variable A random variable X is a function that assigns a real

More information

Formulas for probability theory and linear models SF2941

Formulas for probability theory and linear models SF2941 Formulas for probability theory and linear models SF2941 These pages + Appendix 2 of Gut) are permitted as assistance at the exam. 11 maj 2008 Selected formulae of probability Bivariate probability Transforms

More information

6.1 Moment Generating and Characteristic Functions

6.1 Moment Generating and Characteristic Functions Chapter 6 Limit Theorems The power statistics can mostly be seen when there is a large collection of data points and we are interested in understanding the macro state of the system, e.g., the average,

More information

Stochastic Models in Computer Science A Tutorial

Stochastic Models in Computer Science A Tutorial Stochastic Models in Computer Science A Tutorial Dr. Snehanshu Saha Department of Computer Science PESIT BSC, Bengaluru WCI 2015 - August 10 to August 13 1 Introduction 2 Random Variable 3 Introduction

More information

Joint Distribution of Two or More Random Variables

Joint Distribution of Two or More Random Variables Joint Distribution of Two or More Random Variables Sometimes more than one measurement in the form of random variable is taken on each member of the sample space. In cases like this there will be a few

More information

Math 416 Lecture 3. The average or mean or expected value of x 1, x 2, x 3,..., x n is

Math 416 Lecture 3. The average or mean or expected value of x 1, x 2, x 3,..., x n is Math 416 Lecture 3 Expected values The average or mean or expected value of x 1, x 2, x 3,..., x n is x 1 x 2... x n n x 1 1 n x 2 1 n... x n 1 n 1 n x i p x i where p x i 1 n is the probability of x i

More information

Random Walks Conditioned to Stay Positive

Random Walks Conditioned to Stay Positive 1 Random Walks Conditioned to Stay Positive Bob Keener Let S n be a random walk formed by summing i.i.d. integer valued random variables X i, i 1: S n = X 1 + + X n. If the drift EX i is negative, then

More information

3. DISCRETE RANDOM VARIABLES

3. DISCRETE RANDOM VARIABLES IA Probability Lent Term 3 DISCRETE RANDOM VARIABLES 31 Introduction When an experiment is conducted there may be a number of quantities associated with the outcome ω Ω that may be of interest Suppose

More information