4 Branching Processes


Organise by generations: discrete time. If P(no offspring) > 0 there is a probability that the process will die out.

Let $X$ = number of offspring of an individual, and let $p(x) = P(X = x)$ be the offspring probability function. Assume:

(i) $p$ is the same for all individuals;
(ii) individuals reproduce independently;
(iii) the process starts with a single individual at time 0.

Assumptions (i) and (ii) define the Galton-Watson discrete-time branching process. Two random variables of interest:

$Z_n$ = number of individuals at time $n$ ($Z_0 = 1$ by (iii));
$T_n$ = total number born up to and including generation $n$.

e.g. $p(0) = r$, $p(1) = q$, $p(2) = p$. What is the probability that the second generation will contain 0 or 1 members?

$$P(Z_2 = 0) = P(Z_1 = 0) + P(Z_1 = 1)\,P(Z_2 = 0 \mid Z_1 = 1) + P(Z_1 = 2)\,P(Z_2 = 0 \mid Z_1 = 2) = r + qr + pr^2$$

$$P(Z_2 = 1) = P(Z_1 = 1)\,P(Z_2 = 1 \mid Z_1 = 1) + P(Z_1 = 2)\,P(Z_2 = 1 \mid Z_1 = 2) = q^2 + p(rq + qr) = q^2 + 2pqr.$$

Note: things can be complicated, because $Z_2 = X_1 + X_2 + \dots + X_{Z_1}$ with $Z_1$ a random variable.
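For concreteness, here is a minimal Monte Carlo sketch of the first two generations; the particular values of $p$, $q$, $r$ below are illustrative assumptions, not taken from the notes.

```python
import random

p, q = 0.3, 0.5            # assumed offspring law: p(0)=r, p(1)=q, p(2)=p
r = 1.0 - p - q

def offspring():
    """Sample X with P(X=0)=r, P(X=1)=q, P(X=2)=p."""
    u = random.random()
    return 0 if u < r else (1 if u < r + q else 2)

def generation_size(n):
    """Simulate Z_n for a Galton-Watson process with Z_0 = 1."""
    z = 1
    for _ in range(n):
        z = sum(offspring() for _ in range(z))
    return z

trials = 200_000
zs = [generation_size(2) for _ in range(trials)]
print("P(Z_2 = 0): sim", zs.count(0) / trials, " theory", r + q*r + p*r**2)
print("P(Z_2 = 1): sim", zs.count(1) / trials, " theory", q**2 + 2*p*q*r)
```

With this many replicates the empirical frequencies typically agree with the closed forms to two or three decimal places.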

4.1 Revision: Probability generating functions

Suppose a discrete random variable $X$ takes values in $\{0, 1, 2, \dots\}$ and has probability function $p(x)$. Then the p.g.f. is

$$\Pi_X(s) = E(s^X) = \sum_{x=0}^{\infty} p(x) s^x.$$

Note: $\Pi(0) = p(0)$ and $\Pi(1) = \sum_x p(x) = 1$. The pgf uniquely determines the distribution and vice versa.

4.2 Some important pgfs

| Distribution | pdf | Range | pgf |
|---|---|---|---|
| Bernoulli$(p)$ | $p^x q^{1-x}$ | $0, 1$ | $q + ps$ |
| Binomial$(n, p)$ | $\binom{n}{x} p^x q^{n-x}$ | $0, 1, 2, \dots, n$ | $(q + ps)^n$ |
| Poisson$(\mu)$ | $e^{-\mu} \mu^x / x!$ | $0, 1, 2, \dots$ | $e^{-\mu(1-s)}$ |
| Geometric, $G_1(p)$ | $q^{x-1} p$ | $1, 2, \dots$ | $\dfrac{ps}{1-qs}$ |
| Negative Binomial | $\binom{x-1}{r-1} q^{x-r} p^r$ | $r, r+1, \dots$ | $\left(\dfrac{ps}{1-qs}\right)^r$ |
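The table entries can be sanity-checked by truncating the defining series $\sum_x p(x)s^x$ numerically; a small sketch, with assumed values of $p$, $s$ and $\mu$ (any $|s| \leq 1$ works):

```python
import math

p, s, mu = 0.4, 0.7, 2.3   # assumed illustrative values
q = 1 - p

# Geometric G_1(p): sum of q^(x-1) p s^x over x = 1, 2, ...  vs  ps/(1 - qs)
geo = sum(q**(x - 1) * p * s**x for x in range(1, 500))
print(geo, p * s / (1 - q * s))

# Poisson(mu): sum of e^(-mu) mu^x / x! * s^x over x = 0, 1, ...  vs  e^(-mu(1-s))
poi = sum(math.exp(-mu) * mu**x / math.factorial(x) * s**x for x in range(100))
print(poi, math.exp(-mu * (1 - s)))
```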

4.3 Calculating moments using pgfs

With $\Pi(s) = \sum_{x=0}^{\infty} p(x) s^x = E(s^X)$, we have

$$\Pi'(s) = E(X s^{X-1}) \qquad \Pi'(1) = E(X).$$

Likewise

$$\Pi''(s) = E\left[X(X-1) s^{X-2}\right] \qquad \Pi''(1) = E[X(X-1)] = E(X^2) - E(X).$$

So

$$\mathrm{var}(X) = E(X^2) - E^2(X) = \left[\Pi''(1) + E(X)\right] - \Pi'(1)^2$$
$$\mathrm{var}(X) = \Pi''(1) + \Pi'(1) - \Pi'(1)^2$$
$$\mu = \Pi'(1); \qquad \sigma^2 = \Pi''(1) + \mu - \mu^2.$$
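These moment formulas are easy to verify symbolically; a sketch using sympy (assumed available), applied to the Poisson pgf from the table, which should return mean $\mu$ and variance $\mu$:

```python
import sympy as sp

s, mu = sp.symbols('s mu', positive=True)
Pi = sp.exp(-mu * (1 - s))      # Poisson(mu) pgf from the table

mean = sp.diff(Pi, s).subs(s, 1)                                    # Pi'(1)
var = (sp.diff(Pi, s, 2) + sp.diff(Pi, s) - sp.diff(Pi, s)**2).subs(s, 1)
print(sp.simplify(mean), sp.simplify(var))                          # mu mu
```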

4.4 Distribution of sums of independent rvs

Let $X, Y$ be independent discrete rvs on $\{0, 1, 2, \dots\}$ and let $Z = X + Y$. Then

$$\Pi_Z(s) = E(s^Z) = E(s^{X+Y}) = E(s^X)E(s^Y) \text{ (indep.)} = \Pi_X(s)\Pi_Y(s).$$

In general: if $Z = \sum_{i=1}^n X_i$ with $X_i$ independent discrete rvs with pgfs $\Pi_i(s)$, $i = 1, \dots, n$, then

$$\Pi_Z(s) = \Pi_1(s)\Pi_2(s)\cdots\Pi_n(s).$$

In particular, if the $X_i$ are identically distributed with pgf $\Pi(s)$, then $\Pi_Z(s) = [\Pi(s)]^n$.

e.g. Q: If $X_i \sim G_1(p)$, $i = 1, \dots, n$ (number of trials up to and including the 1st success) are independent, find the pgf of $Z = \sum_{i=1}^n X_i$, and hence identify the distribution of $Z$.

A: The pgf of $G_1(p)$ is

$$\Pi_X(s) = \frac{ps}{1-qs},$$

so the pgf of $Z$ is

$$\Pi_Z(s) = \left(\frac{ps}{1-qs}\right)^n,$$

which is the pgf of a negative binomial distribution. Intuitively: Neg. bin. = number of trials up to and including the $n$th success = sum of $n$ sequences of trials, each consisting of a number of failures followed by a success = sum of $n$ geometrics.
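A simulation makes the "sum of $n$ geometrics" picture concrete; the values of $p$ and $n$ are assumed for illustration, and the empirical pmf is compared with the negative binomial pmf from the table.

```python
import math, random

p, n = 0.4, 5              # assumed illustrative values
q = 1 - p

def geometric_g1(p):
    """Number of Bernoulli(p) trials up to and including the first success."""
    t = 1
    while random.random() >= p:
        t += 1
    return t

trials = 200_000
z = [sum(geometric_g1(p) for _ in range(n)) for _ in range(trials)]

# P(Z = x) = C(x-1, n-1) q^(x-n) p^n for x = n, n+1, ...
for x in range(n, n + 4):
    theory = math.comb(x - 1, n - 1) * q**(x - n) * p**n
    print(x, z.count(x) / trials, theory)
```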

4.5 Distribution of sums of a random number of independent rvs

Let $Z = X_1 + X_2 + \dots + X_N$, where $N$ is a rv on $\{0, 1, 2, \dots\}$ and the $X_i$ are iid rvs on $\{0, 1, 2, \dots\}$ (convention: $Z = 0$ when $N = 0$). Then

$$\Pi_Z(s) = \sum_{z=0}^{\infty} P(Z = z) s^z, \qquad P(Z = z) = \sum_{n=0}^{\infty} P(Z = z \mid N = n) P(N = n) \quad \text{(Thm of Total Prob.)}$$

so

$$\Pi_Z(s) = \sum_{n=0}^{\infty} P(N = n) \sum_{z=0}^{\infty} P(Z = z \mid N = n) s^z = \sum_{n=0}^{\infty} P(N = n) \left[\Pi_X(s)\right]^n \quad (X_i \text{ iid}) = \Pi_N\left[\Pi_X(s)\right].$$

To summarise: if $Z$ is the sum of $N$ independent discrete rvs $X_1, X_2, \dots, X_N$, each with range $\{0, 1, 2, \dots\}$ and pgf $\Pi_X(s)$, and $N$ is a rv with range $\{0, 1, 2, \dots\}$ and pgf $\Pi_N(s)$, then

$$\Pi_Z(s) = \Pi_N\left[\Pi_X(s)\right]$$

and $Z$ has a compound distribution.

e.g. Q: Suppose $N \sim G_0(p)$ and each $X_i \sim \text{Binomial}(1, \theta)$ (independent). Find the distribution of $Z = \sum_{i=1}^N X_i$.

A: Here

$$\Pi_N(s) = \frac{q}{1-ps}, \qquad \Pi_X(s) = 1 - \theta + \theta s.$$

So

$$\Pi_Z(s) = \frac{q}{1 - p(1 - \theta + \theta s)} = \frac{q}{q + p\theta - p\theta s} = \frac{q/(q + p\theta)}{1 - \left(p\theta/(q + p\theta)\right) s},$$

which is the pgf of a $G_0\!\left(\frac{p\theta}{q + p\theta}\right)$ distribution.

Note: even in cases where $Z = \sum_{i=1}^N X_i$ does not have a recognisable pgf, we can still use the resulting pgf to find properties (e.g. moments) of the distribution of $Z$.

We have the probability generating function (pgf) $\Pi_X(s) = E(s^X)$, and also the moment generating function (mgf) $\Phi_X(t) = E(e^{tX})$. Taking the transformation $s = e^t$ gives $\Pi_X(e^t) = E(e^{tX})$: the mgf has many properties in common with the pgf, but can be used for a wider class of distributions.
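A sketch checking the compound-geometric example by simulation (numpy assumed; parameter values are illustrative). Note that numpy's geometric sampler is supported on $\{1, 2, \dots\}$, so it is shifted to match $G_0$:

```python
import numpy as np

rng = np.random.default_rng(0)
p, theta = 0.6, 0.5        # assumed illustrative values
q = 1 - p

trials = 500_000
# N ~ G_0(p) with pgf q/(1 - ps), i.e. P(N = n) = q p^n, n = 0, 1, 2, ...
N = rng.geometric(q, size=trials) - 1
Z = rng.binomial(N, theta)             # sum of N Bernoulli(theta) rvs

p_star = p * theta / (q + p * theta)   # predicted G_0 parameter
for z in range(4):
    print(z, np.mean(Z == z), (1 - p_star) * p_star**z)
```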

4.6 Branching processes and pgfs

Recall $Z_n$ = number of individuals at time $n$ ($Z_0 = 1$), and $X_i$ = number of offspring of individual $i$. We have $Z_2 = X_1 + X_2 + \dots + X_{Z_1}$, so

$$\Pi_2(s) = \Pi_1[\Pi_1(s)].$$

e.g. consider the branching process in which $p(0) = r$, $p(1) = q$, $p(2) = p$. So

$$\Pi(s) = \Pi_1(s) = \sum_{x=0}^{2} p(x)s^x = r + qs + ps^2,$$

and

$$\Pi_2(s) = \Pi_1[\Pi_1(s)] = r + q\Pi_1(s) + p\Pi_1(s)^2 = r + q(r + qs + ps^2) + p(r + qs + ps^2)^2$$
$$= r + qr + pr^2 + (q^2 + 2pqr)s + (pq + pq^2 + 2p^2 r)s^2 + 2p^2 q s^3 + p^3 s^4.$$

The coefficient of $s^x$ ($x = 0, 1, \dots, 4$) gives $P(Z_2 = x)$.

What about the $n$th generation? Let $Y_i$ = number of offspring of the $i$th member of the $(n-1)$th generation. Then $Z_n = Y_1 + Y_2 + \dots + Y_{Z_{n-1}}$, so

$$\Pi_n(s) = \Pi_{n-1}[\Pi(s)] = \Pi_{n-2}[\Pi[\Pi(s)]] = \dots = \underbrace{\Pi[\Pi[\dots[\Pi(s)]\dots]]}_{n}.$$
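The composition can also be carried out mechanically: numpy's `Polynomial` objects compose when called on one another, so $\Pi_2 = \Pi(\Pi)$ is a single line. The parameter values below are assumed for illustration.

```python
from numpy.polynomial import Polynomial

p, q = 0.3, 0.5            # assumed values for the offspring law r + qs + ps^2
r = 1 - p - q

Pi = Polynomial([r, q, p])     # Pi(s) = r + q s + p s^2
Pi2 = Pi(Pi)                   # Pi_2(s) = Pi(Pi(s)): composition of polynomials
print(Pi2.coef)                # coefficient of s^x is P(Z_2 = x), x = 0, ..., 4
print(r + q*r + p*r**2, q**2 + 2*p*q*r)    # hand-computed P(Z_2=0), P(Z_2=1)
```

Calling `Pi(Pi2)` in the same way gives $\Pi_3$, and so on up the generations.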

Writing this out explicitly can be complicated, but sometimes we get lucky. e.g. $X \sim \text{Binomial}(1, p)$; then $\Pi_X(s) = q + ps$. So

$$\Pi_2(s) = q + p(q + ps) = q + pq + p^2 s$$
$$\Pi_3(s) = q + p(q + p(q + ps)) = q + pq + p^2 q + p^3 s$$
$$\vdots$$
$$\Pi_n(s) = q + pq + p^2 q + \dots + p^{n-1} q + p^n s.$$

Now $\Pi_n(1) = 1$, so $1 - p^n = q + pq + p^2 q + \dots + p^{n-1} q$, giving

$$\Pi_n(s) = 1 - p^n + p^n s.$$

This is the pgf of a Binomial$(1, p^n)$ distribution: the distribution of the number of cases in the $n$th generation is Bernoulli with parameter $p^n$, i.e.

$$P(Z_n = 1) = p^n, \qquad P(Z_n = 0) = 1 - p^n.$$

4.7 Mean and variance of the size of the nth generation of a branching process

Mean: let $\mu = E(X)$ and let $\mu_n = E(Z_n)$, so $\mu = \Pi'(1)$. Differentiating $\Pi_n(s) = \Pi_{n-1}[\Pi(s)]$:

$$\Pi_n'(s) = \Pi_{n-1}'[\Pi(s)]\,\Pi'(s)$$
$$\Pi_n'(1) = \Pi_{n-1}'[\Pi(1)]\,\Pi'(1) = \Pi_{n-1}'(1)\,\Pi'(1),$$

so $\mu_n = \mu_{n-1}\mu = \mu_{n-2}\mu^2 = \dots = \mu^n$.

Note: as $n \to \infty$,

$$\mu_n = \mu^n \to \begin{cases} \infty & \mu > 1 \\ 1 & \mu = 1 \\ 0 & \mu < 1 \end{cases}$$

so, at first sight, it looks as if the generation size will either increase unboundedly (if $\mu > 1$) or die out (if $\mu < 1$) - but, as we shall see, the story is slightly more complicated.

Variance: let $\sigma^2 = \mathrm{var}(X)$ and let $\sigma_n^2 = \mathrm{var}(Z_n)$. Differentiating $\Pi_n'(s) = \Pi_{n-1}'[\Pi(s)]\,\Pi'(s)$ again:

$$\Pi_n''(s) = \Pi_{n-1}''[\Pi(s)]\,\Pi'(s)^2 + \Pi_{n-1}'[\Pi(s)]\,\Pi''(s) \qquad (1)$$

Now $\Pi(1) = 1$, $\Pi'(1) = \mu$, $\Pi''(1) = \sigma^2 - \mu + \mu^2$. Also, since $\sigma_n^2 = \Pi_n''(1) + \mu_n - \mu_n^2$, we have $\Pi_n''(1) = \sigma_n^2 - \mu^n + \mu^{2n}$ and $\Pi_{n-1}''(1) = \sigma_{n-1}^2 - \mu^{n-1} + \mu^{2n-2}$. From (1),

$$\Pi_n''(1) = \Pi_{n-1}''(1)\,\Pi'(1)^2 + \Pi_{n-1}'(1)\,\Pi''(1)$$
$$\sigma_n^2 - \mu^n + \mu^{2n} = (\sigma_{n-1}^2 - \mu^{n-1} + \mu^{2n-2})\mu^2 + \mu^{n-1}(\sigma^2 - \mu + \mu^2)$$
$$\sigma_n^2 = \mu^2 \sigma_{n-1}^2 + \mu^{n-1}\sigma^2,$$

leading to $\sigma_n^2 = \mu^{n-1}\sigma^2(1 + \mu + \mu^2 + \dots + \mu^{n-1})$. So we have

$$\sigma_n^2 = \begin{cases} \mu^{n-1}\sigma^2\,\dfrac{1 - \mu^n}{1 - \mu} & \mu \neq 1 \\[2ex] n\sigma^2 & \mu = 1 \end{cases}$$
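A simulation sketch of these two results (the offspring law and horizon $n$ are assumed): estimate $E(Z_n)$ and $\mathrm{var}(Z_n)$ and compare with $\mu^n$ and the displayed formula for $\sigma_n^2$.

```python
import numpy as np

rng = np.random.default_rng(1)
p, q = 0.4, 0.35           # assumed offspring law p(0)=r, p(1)=q, p(2)=p
r = 1 - p - q
mu = q + 2*p               # Pi'(1)
sigma2 = 2*p + mu - mu**2  # Pi''(1) + mu - mu^2, with Pi''(1) = 2p here

def z_n(n):
    z = 1
    for _ in range(n):
        z = rng.choice([0, 1, 2], p=[r, q, p], size=z).sum() if z else 0
    return z

n, trials = 6, 20_000
zs = np.array([z_n(n) for _ in range(trials)])
print("mean:", zs.mean(), " theory:", mu**n)
print("var :", zs.var(),  " theory:", mu**(n-1) * sigma2 * (1 - mu**n) / (1 - mu))
```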

4.8 Total number of individuals

Let $T_n$ be the total number up to and including generation $n$. Then

$$E(T_n) = E(Z_0 + Z_1 + Z_2 + \dots + Z_n) = 1 + E(Z_1) + E(Z_2) + \dots + E(Z_n) = 1 + \mu + \mu^2 + \dots + \mu^n = \begin{cases} \dfrac{\mu^{n+1} - 1}{\mu - 1} & \mu \neq 1 \\[2ex] n + 1 & \mu = 1 \end{cases}$$

$$\lim_{n \to \infty} E(T_n) = \begin{cases} \infty & \mu \geq 1 \\[1ex] \dfrac{1}{1 - \mu} & \mu < 1 \end{cases}$$

4.9 Probability of ultimate extinction

It is necessary that $P(X = 0) = p(0) \neq 0$. Let

$$\theta_n = P(n\text{th generation contains 0 individuals}) = P(\text{extinction occurs by } n\text{th generation}),$$

so $\theta_n = P(Z_n = 0) = \Pi_n(0)$. Now P(extinct by $n$th generation) = P(extinct by $(n-1)$th) + P(extinct at $n$th), so

$$\theta_n = \theta_{n-1} + P(\text{extinct at } n\text{th}) \implies \theta_n \geq \theta_{n-1}.$$

Also, $\Pi_n(s) = \Pi[\Pi_{n-1}(s)]$, so $\Pi_n(0) = \Pi[\Pi_{n-1}(0)]$, i.e.

$$\theta_n = \Pi(\theta_{n-1}).$$

$\theta_n$ is a non-decreasing sequence that is bounded above by 1 (it is a probability); hence, by the monotone convergence theorem, $\lim_{n\to\infty}\theta_n = \theta$ exists and $\theta \leq 1$. Now $\lim_{n\to\infty}\theta_n = \Pi(\lim_{n\to\infty}\theta_{n-1})$, so $\theta$ satisfies

$$\theta = \Pi(\theta), \qquad \theta \in [0, 1].$$

Consider $\Pi(\theta) = \sum_{x=0}^{\infty} p(x)\theta^x$. We have $\Pi(0) = p(0)\ (> 0)$ and $\Pi(1) = 1$; also $\Pi'(\theta) > 0$ and, for $\theta > 0$, $\Pi''(\theta) > 0$, so $\Pi(\theta)$ is a convex increasing function for $\theta \in [0, 1]$, and the solutions of $\theta = \Pi(\theta)$ are determined by the slope of $\Pi(\theta)$ at $\theta = 1$, i.e. by $\Pi'(1) = \mu$.

So:

1. If $\mu < 1$ there is one solution, at $\theta = 1$: extinction is certain.
2. If $\mu > 1$ there are two solutions, $\theta < 1$ and $\theta = 1$; as $\theta_n$ is increasing from $\theta_0 = 0$, we want the smaller solution: extinction is NOT certain.
3. If $\mu = 1$ the only solution is $\theta = 1$: extinction is certain.

Note: the mean size of the $n$th generation is $\mu^n$, so if extinction does not occur the size will increase without bound.

Summary: P(ultimate extinction of a branching process) = smallest positive solution of $\theta = \Pi(\theta)$.

1. $\mu \leq 1 \implies \theta = 1$: ultimate extinction certain.
2. $\mu > 1 \implies \theta < 1$: ultimate extinction not certain.

e.g. $X \sim \text{Binomial}(3, p)$; then $\theta = \Pi(\theta)$ gives $\theta = (q + p\theta)^3$, i.e.

$$p^3\theta^3 + 3p^2 q\theta^2 + (3pq^2 - 1)\theta + q^3 = 0. \qquad (2)$$

Now $E(X) = \mu = 3p$, so $\mu > 1$ when $p > 1/3$, and P(extinction) = smallest solution of $\theta = \Pi(\theta)$. Since we know $\theta = 1$ is a solution, we can factorise (2):

$$(\theta - 1)\left(p^3\theta^2 + (3p^2 q + p^3)\theta - q^3\right) = 0.$$

e.g. if $p = 1/2$ (so $p > 1/3$ is satisfied), $\theta$ satisfies $\theta^2 + 4\theta - 1 = 0$, giving $\theta = \sqrt{5} - 2 \approx 0.236$.
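Since $\theta_n = \Pi(\theta_{n-1})$ with $\theta_0 = 0$, the extinction probability can also be found by straightforward fixed-point iteration; a sketch for the Binomial$(3, \frac{1}{2})$ example, which should converge to the smaller root $\sqrt{5} - 2$:

```python
p = 0.5
q = 1 - p
Pi = lambda t: (q + p*t)**3     # pgf of Binomial(3, p)

theta = 0.0                     # theta_0 = P(Z_0 = 0) = 0
for _ in range(100):
    theta = Pi(theta)           # theta_n = Pi(theta_{n-1})
print(theta, 5**0.5 - 2)        # both approximately 0.2360679...
```

Because the iterates increase monotonically from 0, they converge to the smallest non-negative root of $\theta = \Pi(\theta)$, never to the root at 1.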

4.10 Generalizations of the simple branching process

1: k individuals in generation 0. Let $Z_{ni}$ = number of individuals in the $n$th generation descended from the $i$th ancestor, and let $S_n = Z_{n1} + Z_{n2} + \dots + Z_{nk}$. Then

$$\Pi_{S_n}(s) = [\Pi_n(s)]^k.$$

2: Immigration. $W_n$ immigrants arrive at the $n$th generation and start to reproduce. Let the pgf for the number of immigrants be $\Psi(s)$, and let $Z_n^*$ = size of the $n$th generation ($n = 0, 1, 2, \dots$) with pgf $\Pi_n^*(s)$, where $Z_0^* = 1$. Then

$$Z_1^* = W_1 + Z_1 \implies \Pi_1^*(s) = \Psi(s)\Pi(s)$$
$$Z_2^* = W_2 + (\text{offspring of generation 1}) \implies \Pi_2^*(s) = \Psi(s)\Pi_1^*(\Pi(s)),$$

as $\Pi_1^*(\Pi(s))$ is the pgf of the number of offspring of the $Z_1^*$ members of generation 1, each of which has offspring according to a distribution with pgf $\Pi(s)$. In general, writing $\Pi^{(k)}(s)$ for the $k$-fold composition $\Pi[\Pi[\dots[\Pi(s)]\dots]]$,

$$\Pi_n^*(s) = \Psi(s)\Pi_{n-1}^*[\Pi(s)] = \Psi(s)\Psi[\Pi(s)]\Pi_{n-2}^*[\Pi[\Pi(s)]] = \dots = \Psi(s)\,\Psi[\Pi(s)]\cdots\Psi[\Pi^{(n-1)}(s)]\;\Pi^{(n)}(s).$$

e.g. Suppose that the number of offspring, $X$, has a Bernoulli distribution and the number of immigrants has a Poisson distribution.

1. Derive the pgf $\Pi_n^*(s)$ of the size of the $n$th generation, and

2. investigate its behaviour as $n \to \infty$.

A: 1. $X \sim \text{Binomial}(1, p)$, $W_n \sim \text{Poisson}(\mu)$, $n = 1, 2, \dots$. So

$$\Pi(s) = q + ps, \qquad \Psi(s) = e^{-\mu(1-s)}.$$

$$\Pi_1^*(s) = e^{-\mu(1-s)}(q + ps)$$
$$\Pi_2^*(s) = \Psi(s)\Pi_1^*(\Pi(s)) = e^{-\mu(1-s)}\Pi_1^*(q + ps) = e^{-\mu(1-s)}(q + p(q + ps))\,e^{-\mu(1 - q - ps)} = e^{-\mu(1+p)(1-s)}(1 - p^2 + p^2 s)$$
$$\Pi_3^*(s) = \Psi(s)\Pi_2^*(\Pi(s)) = e^{-\mu(1-s)(1 + p + p^2)}(1 - p^3 + p^3 s)$$
$$\vdots$$
$$\Pi_n^*(s) = e^{-\mu(1-s)(1 + p + p^2 + \dots + p^{n-1})}(1 - p^n + p^n s) = e^{-\mu(1-s)(1 - p^n)/(1 - p)}(1 - p^n + p^n s).$$

2. As $n \to \infty$, $p^n \to 0$ (since $0 < p < 1$), so

$$\Pi_n^*(s) \to e^{-\mu(1-s)/(1-p)} \quad \text{as } n \to \infty.$$

This is the pgf of a Poisson distribution with parameter $\mu/(1-p)$.

Without immigration, a branching process either becomes extinct or increases unboundedly. With immigration there is also the possibility that there is a limiting distribution for the generation size.
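A simulation sketch of this chain (parameter values assumed), checking the limiting Poisson$(\mu/(1-p))$ distribution through its mean and variance along a long trajectory:

```python
import numpy as np

rng = np.random.default_rng(2)
p, mu = 0.6, 1.5           # assumed: Bernoulli(p) offspring, Poisson(mu) immigration

steps, z = 100_000, 1
sizes = []
for _ in range(steps):
    z = rng.binomial(z, p) + rng.poisson(mu)   # offspring of current gen + immigrants
    sizes.append(z)

tail = np.array(sizes[1000:])                  # discard burn-in
print("mean:", tail.mean(), " theory:", mu / (1 - p))
print("var :", tail.var(),  " theory:", mu / (1 - p))   # Poisson: var = mean
```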

5 Random Walks

Consider a particle at some position on a line, moving with the following transition probabilities:

- with prob $p$ it moves 1 unit to the right;
- with prob $q$ it moves 1 unit to the left;
- with prob $r$ it stays where it is.

The position at time $n$ is given by

$$X_n = Z_1 + \dots + Z_n, \qquad Z_i = \begin{cases} +1 & \text{with prob } p \\ -1 & \text{with prob } q \\ 0 & \text{with prob } r. \end{cases}$$

A random process $\{X_n;\ n = 0, 1, 2, \dots\}$ is a random walk if, for $n \geq 1$,

$$X_n = Z_1 + \dots + Z_n$$

where $\{Z_i\}$, $i = 1, 2, \dots$ is a sequence of iid rvs. If the only possible values for the $Z_i$ are $\pm 1$, the process is a simple random walk.

5.1 Random walks with barriers

Absorbing barriers. Flip a fair coin: $p = q = \frac{1}{2}$, $r = 0$. H: you win \$1; T: you lose \$1. Let $Z_n$ = amount you win on the $n$th flip; then $X_n$ = total amount you have won up to and including the $n$th flip. BUT say you decide to stop playing if you lose \$50. The state space is $\{-50, -49, \dots\}$ and $-50$ is an absorbing barrier (once entered it cannot be left).

Reflecting barriers. A particle moves on a line between points $a$ and $b$ (integers with $b > a$), with the following transition probabilities:

$$P(X_n = x + 1 \mid X_{n-1} = x) = \tfrac{2}{3}, \qquad P(X_n = x - 1 \mid X_{n-1} = x) = \tfrac{1}{3}, \qquad a < x < b$$

$$P(X_n = a + 1 \mid X_{n-1} = a) = 1, \qquad P(X_n = b - 1 \mid X_{n-1} = b) = 1.$$

Here $a$ and $b$ are reflecting barriers. One can also have

$$P(X_n = a + 1 \mid X_{n-1} = a) = p, \qquad P(X_n = a \mid X_{n-1} = a) = 1 - p,$$

and similarly for $b$.

Note: random walks satisfy the Markov property, i.e. the distribution of $X_n$ is determined by the value of $X_{n-1}$ (earlier history gives no extra information). A stochastic process in discrete time which has the Markov property is a Markov chain.

$$X \text{ a random walk} \implies X \text{ a Markov chain}; \qquad X \text{ a Markov chain} \not\implies X \text{ a random walk}.$$

Since the $Z_i$ in a random walk are iid, the transition probabilities are independent of the current position, i.e. $P(X_n = a + 1 \mid X_{n-1} = a) = P(X_n = b + 1 \mid X_{n-1} = b)$.

5.2 Gambler's ruin

Two players A and B. A starts with \$j, B with \$(a - j). They play a series of independent games until one or other is ruined. Let $Z_i$ = amount A wins in the $i$th game $= \pm 1$, with

$$P(Z_i = 1) = p, \qquad P(Z_i = -1) = 1 - p = q.$$

After $n$ games A has $X_n = X_{n-1} + Z_n$, $0 < X_{n-1} < a$. Stop if $X_n = 0$ (A loses) or $X_n = a$ (A wins).

This is a random walk with state space $\{0, 1, \dots, a\}$ and absorbing barriers at 0 and $a$. What is the probability that A loses?

Let $R_j$ = event that A is ruined given that he starts with \$j, and let $q_j = P(R_j)$, so $q_0 = 1$ and $q_a = 0$. For $0 < j < a$,

$$P(R_j) = P(R_j \mid W)P(W) + P(R_j \mid \overline{W})P(\overline{W}),$$

where $W$ = event that A wins the first bet. Now $P(W) = p$, $P(\overline{W}) = q$, and $P(R_j \mid W) = P(R_{j+1}) = q_{j+1}$ because, if he wins the first bet, he has \$(j+1). So

$$q_j = q_{j+1}p + q_{j-1}q, \qquad j = 1, \dots, (a - 1) \qquad \text{RECURRENCE RELATION}$$

To solve this, try $q_j = cx^j$:

$$cx^j = pcx^{j+1} + qcx^{j-1} \implies x = px^2 + q \qquad \text{AUXILIARY/CHARACTERISTIC EQUATION}$$
$$0 = px^2 - x + q = (px - q)(x - 1) \implies x = q/p \text{ or } x = 1.$$

General solutions:

case 1: if the roots are distinct ($p \neq q$): $q_j = c_1(q/p)^j + c_2$.
case 2: if the roots are equal ($p = q = \frac{1}{2}$): $q_j = c_1 + c_2 j$.

Particular solutions: using the boundary conditions $q_0 = 1$, $q_a = 0$ gives

case 1: $p \neq q$:

$$1 = q_0 = c_1 + c_2, \qquad 0 = q_a = c_1(q/p)^a + c_2,$$

giving

$$q_j = \frac{(q/p)^j - (q/p)^a}{1 - (q/p)^a} \quad \text{(check!)}$$

case 2: $p = q = \frac{1}{2}$:

$$1 = q_0 = c_1, \qquad 0 = q_a = c_1 + ac_2,$$

so $q_j = 1 - \dfrac{j}{a}$.

i.e. if A begins with \$j, the probability that A is ruined is

$$q_j = \begin{cases} \dfrac{(q/p)^j - (q/p)^a}{1 - (q/p)^a} & p \neq q \\[2ex] 1 - \dfrac{j}{a} & p = q = \frac{1}{2} \end{cases}$$

5.2.1 B with unlimited resources (e.g. a casino)

case 1: $p \neq q$, $q_j = \dfrac{(q/p)^j - (q/p)^a}{1 - (q/p)^a}$.

(a) $p > q$: as $a \to \infty$, $q_j \to (q/p)^j$.
(b) $p < q$: as $a \to \infty$, $q_j = \dfrac{(p/q)^{a-j} - 1}{(p/q)^a - 1} \to 1$.

case 2: $p = q = 1/2$: as $a \to \infty$, $q_j = 1 - j/a \to 1$.

So: if B has unlimited resources, A's probability of ultimate ruin when beginning with \$j is

$$q_j = \begin{cases} 1 & p \leq q \\ (q/p)^j & p > q \end{cases}$$
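A Monte Carlo check of the ruin-probability formula (the values of $j$, $a$, $p$ are assumed for illustration):

```python
import random

def ruin_prob(j, a, p, trials=100_000):
    """Estimate the probability that A, starting with $j, is ruined."""
    ruined = 0
    for _ in range(trials):
        x = j
        while 0 < x < a:
            x += 1 if random.random() < p else -1
        ruined += (x == 0)
    return ruined / trials

j, a, p = 3, 10, 0.45      # assumed illustrative values
q = 1 - p
theory = ((q/p)**j - (q/p)**a) / (1 - (q/p)**a)
print(ruin_prob(j, a, p), theory)
```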

5.2.2 Expected duration

Let $X$ = duration of the game when A starts with \$j, and let $E(X) = D_j$. Let $Y$ = A's winnings on the first bet, so $P(Y = +1) = p$, $P(Y = -1) = q$. Then

$$E(X) = E_Y[E(X \mid Y)] = \sum_y E(X \mid Y = y)P(Y = y) = E(X \mid Y = 1)p + E(X \mid Y = -1)q.$$

Now $E(X \mid Y = 1) = 1 + D_{j+1}$ and $E(X \mid Y = -1) = 1 + D_{j-1}$. Hence, for $0 < j < a$,

$$D_j = (1 + D_{j+1})p + (1 + D_{j-1})q$$
$$D_j = pD_{j+1} + qD_{j-1} + 1,$$

a second-order, non-homogeneous recurrence relation - so, add a particular solution to the general solution of the corresponding homogeneous recurrence relation.

case 1: $p \neq q$ (one player has the advantage). As before, the general solution of $D_j = pD_{j+1} + qD_{j-1}$ is $D_j = c_1 + c_2(q/p)^j$. Now find a particular solution of $D_j = pD_{j+1} + qD_{j-1} + 1$: try $D_j = j/(q - p)$:

$$\frac{j}{q - p} = \frac{p(j+1)}{q - p} + \frac{q(j-1)}{q - p} + 1,$$

which holds since $p(j+1) + q(j-1) = j - (q - p)$. So the general solution to the non-homogeneous problem is

$$D_j = c_1 + c_2\left(\frac{q}{p}\right)^j + \frac{j}{q - p}.$$

Find $c_1$ and $c_2$ from the boundary conditions:

$$0 = D_0 = c_1 + c_2, \qquad 0 = D_a = c_1 + c_2\left(\frac{q}{p}\right)^a + \frac{a}{q - p}$$

$$\implies c_2\left[\left(\frac{q}{p}\right)^a - 1\right] = -\frac{a}{q - p} \implies c_2 = \frac{a}{(q - p)\left[1 - (q/p)^a\right]}, \qquad c_1 = -\frac{a}{(q - p)\left[1 - (q/p)^a\right]}.$$

case 2: $p = q$. As before, the general solution of $D_j = pD_{j+1} + qD_{j-1}$ is $D_j = c_1 + c_2 j$. A particular solution is $D_j = -j^2$. So the general solution to the non-homogeneous problem is

$$D_j = c_1 + c_2 j - j^2.$$

Find $c_1$ and $c_2$ from the boundary conditions:

$$0 = D_0 = c_1, \qquad 0 = D_a = ac_2 - a^2 \implies c_1 = 0,\ c_2 = a.$$

So $D_j = j(a - j)$.

Note: this may not match your intuition. e.g. one player starts with \$1000 and the other with \$1. They each place \$1 bets on a fair coin, until one or other is ruined. What is the expected duration of the game? We have $a = 1001$, $j = 1$, $p = q = \frac{1}{2}$, so the expected duration is $D_j = j(a - j) = 1(1001 - 1) = 1000$ games!
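A simulation of the fair case, using smaller (assumed) stakes than the \$1000 example so that it runs quickly; the theory above says $D_j = j(a - j)$:

```python
import random

def mean_duration(j, a, trials=20_000):
    """Average number of fair $1 bets until one player is ruined (p = q = 1/2)."""
    total = 0
    for _ in range(trials):
        x, n = j, 0
        while 0 < x < a:
            x += 1 if random.random() < 0.5 else -1
            n += 1
        total += n
    return total / trials

j, a = 10, 30              # assumed illustrative values
print(mean_duration(j, a), j * (a - j))   # simulation vs theory: 200
```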

5.3 Unrestricted random walks (walks without barriers)

Various questions of interest: what is the probability of return to the origin? Is eventual return certain? How far from the origin is the particle likely to be after $n$ steps?

Let $R$ = event that the particle eventually returns to the origin, $A$ = event that the first step is to the right, and $\overline{A}$ = event that the first step is to the left; $P(A) = p$, $P(\overline{A}) = q = 1 - p$. Then

$$P(R) = P(R \mid A)P(A) + P(R \mid \overline{A})P(\overline{A}).$$

Now the event $R \mid A$ is the event of eventual ruin when a gambler with a starting amount of \$1 is playing against a casino with unlimited funds, so

$$P(R \mid A) = \begin{cases} 1 & p \leq q \\ q/p & p > q \end{cases}$$

Similarly (by interchanging $p$ and $q$),

$$P(R \mid \overline{A}) = \begin{cases} 1 & p \geq q \\ p/q & p < q \end{cases}$$

So:

$$p < q:\ P(R) = 2p; \qquad p = q:\ P(R) = 1; \qquad p > q:\ P(R) = 2q,$$

i.e. return to the origin is certain only when $p = q$.

$p = q$: the random walk is symmetric, and in this case it is recurrent - return to the origin is certain.
$p \neq q$: return is not certain; there is a non-zero probability that the walk never returns - the random walk is transient.

Note: the same arguments apply to every state - all states are either recurrent or transient, and accordingly the random walk itself is called recurrent or transient.

5.4 Distribution of X_n - the position after n steps

Suppose the walk has made $x$ steps to the right and $y$ to the left. Then $x + y = n$, so $X_n = x - y = 2x - n$. So

$$n \text{ even} \iff X_n \text{ even}, \qquad n \text{ odd} \iff X_n \text{ odd}.$$

In particular, $P(X_n = k) = 0$ if $n$ and $k$ are not either both even or both odd. Let $W_n$ = number of positive steps in the first $n$ steps; then $W_n \sim \text{Binomial}(n, p)$:

$$P(W_n = x) = \binom{n}{x}p^x q^{n-x}, \qquad 0 \leq x \leq n$$
$$P(X_n = 2x - n) = \binom{n}{x}p^x q^{n-x}$$
$$P(X_n = k) = \binom{n}{(n+k)/2}p^{(n+k)/2}q^{(n-k)/2}.$$

$X_n$ is a sum of $n$ iid rvs $Z_i$, so use the CLT to see the large-$n$ behaviour: $X_n$ is approximately $N(E(X_n), \mathrm{var}(X_n))$ for large $n$, where

$$E(X_n) = \sum E(Z_i) = n[1 \cdot p + (-1) \cdot q] = n(p - q)$$
$$\mathrm{var}(X_n) = \sum \mathrm{var}(Z_i) = n\left[E(Z_i^2) - E^2(Z_i)\right] = n[(1 \cdot p + 1 \cdot q) - (p - q)^2] = 4npq.$$
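A sketch comparing the exact point probabilities with the CLT approximation (the values of $n$, $p$ and $k$ are assumed). The factor of 2 in the normal density accounts for $X_n$ living on a lattice of spacing 2:

```python
import math

def exact(n, k, p):
    """P(X_n = k) = C(n, (n+k)/2) p^((n+k)/2) q^((n-k)/2), when parity allows."""
    if (n + k) % 2:
        return 0.0
    x = (n + k) // 2
    return math.comb(n, x) * p**x * (1 - p)**(n - x)

def normal_approx(n, k, p):
    """N(n(p-q), 4npq) density, doubled because X_n takes every other integer."""
    q = 1 - p
    m, v = n * (p - q), 4 * n * p * q
    return 2 * math.exp(-(k - m)**2 / (2 * v)) / math.sqrt(2 * math.pi * v)

n, p = 400, 0.55           # assumed illustrative values
for k in (30, 40, 50):     # n + k even, so these states are reachable
    print(k, exact(n, k, p), normal_approx(n, k, p))
```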

So:

- if $p > q$ the particle drifts to the right as $n$ increases, and this drift is faster the larger $p$;
- the variance increases with $n$;
- the variance is smaller the larger is $p$.

5.5 Return probabilities

Recall: the probability of return of a SRW (simple random walk) with $p + q = 1$ is 1 if the walk is symmetric ($p = q$), and $< 1$ otherwise ($p \neq q$). When does the return occur? Let

$$f_n = P(\text{first return occurs at } n) = P(X_n = 0 \text{ and } X_r \neq 0 \text{ for } 0 < r < n)$$
$$u_n = P(\text{some return occurs at } n) = P(X_n = 0).$$

Since $X_0 = 0$: $u_0 = 1$. Define $f_0 = 0$ for convenience. We also have $f_1 = u_1 = P(X_1 = 0)$. We have already found $u_n$:

$$u_n = P(X_n = 0) = \begin{cases} \dbinom{n}{n/2}p^{n/2}q^{n/2} & n \text{ even} \\ 0 & n \text{ odd} \end{cases}$$

Let $R$ = event that a return eventually occurs, with $f = P(R)$, and let $R_n$ = event that the first return is at $n$, with $f_n = P(R_n)$. Then

$$f = f_1 + f_2 + \dots + f_n + \dots$$

To decide if a RW is recurrent or not we could find the $f_n$; it is easier to find a relationship between $f_n$ and $u_n$. Let

$$F(s) = \sum_{n=0}^{\infty} f_n s^n \quad \text{- a pgf if } \textstyle\sum f_n = 1\text{: true if recurrent}$$
$$U(s) = \sum_{n=0}^{\infty} u_n s^n \quad \text{- not a pgf, because } \textstyle\sum u_n \neq 1 \text{ in general.}$$

For any random walk,

$$u_1 = f_1$$
$$u_2 = f_1 u_1 + f_2 = f_0 u_2 + f_1 u_1 + f_2 u_0$$
$$u_3 = f_0 u_3 + f_1 u_2 + f_2 u_1 + f_3 u_0$$

In general,

$$u_n = f_0 u_n + f_1 u_{n-1} + \dots + f_{n-1}u_1 + f_n u_0, \qquad n \geq 1. \qquad (3)$$

Now,

$$F(s)U(s) = \left(\sum_{r=0}^{\infty} f_r s^r\right)\left(\sum_{t=0}^{\infty} u_t s^t\right) = \sum_{n=0}^{\infty}(f_0 u_n + f_1 u_{n-1} + \dots + f_n u_0)s^n = \sum_{n=1}^{\infty} u_n s^n \quad \text{(from (3), since } f_0 u_0 = 0\text{)}$$
$$= \sum_{n=0}^{\infty} u_n s^n - u_0 = U(s) - 1.$$

That is,

$$U(s) = 1 + F(s)U(s); \qquad F(s) = 1 - \frac{1}{U(s)}, \quad U(s) \neq 0.$$

Let $s \to 1$ to give

$$\sum f_n = 1 - \frac{1}{\sum u_n},$$

so $\sum f_n = 1$ iff $\sum u_n = \infty$: a RW is recurrent iff the sum of its return probabilities is infinite.
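The criterion can be explored numerically by accumulating partial sums of $u_n$ and applying $f = 1 - 1/\sum u_n$. For $p \neq q$ the sum converges (in fact to $1/|p - q|$, giving $f = 2\min(p, q)$, in agreement with §5.3), while for $p = q$ the partial sums diverge and $f \to 1$. A sketch; the term count is an arbitrary assumption:

```python
def u_sum(p, terms=20_000):
    """Partial sum of u_{2m} = C(2m, m) (pq)^m, built up via the term ratio."""
    q = 1 - p
    total, t = 1.0, 1.0            # u_0 = 1; t holds the current u_{2m}
    for m in range(1, terms):
        t *= (2*m) * (2*m - 1) / (m * m) * (p * q)
        total += t
    return total

for p in (0.5, 0.6, 0.8):
    s = u_sum(p)
    print(p, round(s, 3), " f =", round(1 - 1/s, 4), " 2 min(p,q) =", 2*min(p, 1-p))
```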
