Discrete Markov Processes

1. Introduction

1.1. Probability Spaces and Random Variables

Sample space Ω. A model for events: 𝓕 is a family of subsets of Ω such that
(1) if A ∈ 𝓕, then A^c ∈ 𝓕,
(2) if A_1, A_2, ... ∈ 𝓕, then A_1 ∪ A_2 ∪ ... ∈ 𝓕,
(3) Ω ∈ 𝓕.
Such a family is called a σ-algebra. A probability measure is a function P: 𝓕 → [0, 1] such that
(a) P(Ω) = 1,
(b) if the A_i's are disjoint, then
P(A_1 ∪ A_2 ∪ ...) = P(A_1) + P(A_2) + ....
Events A_1, A_2, ... are independent if, for any subset of indices n_1, n_2, ...,
P(A_{n_1} ∩ A_{n_2} ∩ ...) = P(A_{n_1}) P(A_{n_2}) ....
The triple (Ω, 𝓕, P) is a probability space.
X: Ω → ℝ is a random variable if, for all x, {ω : X(ω) ≤ x} ∈ 𝓕. A collection of random variables {X_t, t ∈ T}, defined on (Ω, 𝓕, P), is a stochastic process.
F(x) = P(X ≤ x) is the cumulative distribution function of X; F'(x) = f(x) is the density function of X (if it exists).
G(s) = E[exp(sX)] is the moment generating function of X. Abbreviations: c.d.f., d.f., and m.g.f.

Ex. 1. X ~ N(0, 1), or X has the standard normal distribution, if f(x) = (2π)^{-1/2} exp(-x^2/2).
Ex. 2. X ~ Exp(λ), λ > 0, or X has the exponential distribution, if F(x) = 1 - exp(-λx) for x > 0.
Ex. 3. X ~ Po(λ), λ > 0, or X has the Poisson distribution, if P(X = k) = exp(-λ) λ^k / k!, for k = 0, 1, 2, ...
Ex. 4. X ~ Bin(n, p), 0 < p < 1 and n = 1, 2, ..., or X has the binomial distribution, if P(X = k) = (n!/(k!(n - k)!)) p^k (1 - p)^{n-k}, for k = 0, 1, ..., n.
Ex. 5. X ~ Geo(p), 0 < p < 1, or X has the geometric distribution, if P(X = k) = p(1 - p)^{k-1}, k = 1, 2, ...
Ex. 6. T = {1, 2, 3, ...}, X_t ~ Ber(p), t ∈ T, independent.
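The discrete distributions in Ex. 3–5 are easy to sanity-check numerically. A minimal sketch in plain Python (the function names are mine, and the formulas are exactly those stated above):

```python
import math

def poisson_pmf(k, lam):
    # Ex. 3: P(X = k) = exp(-lam) * lam^k / k!
    return math.exp(-lam) * lam**k / math.factorial(k)

def binomial_pmf(k, n, p):
    # Ex. 4: P(X = k) = n!/(k!(n-k)!) * p^k * (1-p)^(n-k)
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

def geometric_pmf(k, p):
    # Ex. 5: P(X = k) = p * (1-p)^(k-1), for k = 1, 2, ...
    return p * (1 - p)**(k - 1)

# Each pmf should sum to (approximately) 1 over its support, and the
# means should match the standard values E[Po(lam)] = lam,
# E[Bin(n, p)] = np, E[Geo(p)] = 1/p.
lam, n, p = 2.0, 10, 0.3
print(sum(poisson_pmf(k, lam) for k in range(100)))          # ~ 1
print(sum(k * binomial_pmf(k, n, p) for k in range(n + 1)))  # ~ np = 3
print(sum(k * geometric_pmf(k, p) for k in range(1, 2000)))  # ~ 1/p
```

The infinite supports (Poisson, geometric) are truncated; the neglected tails are astronomically small at these cutoffs.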
Then {X_t, t ∈ T} is a Bernoulli process. Here, Ω = {0, 1} × {0, 1} × ..., and 𝓕 = all subsets of Ω.

1.2. Structure of Gambling Problems

T = {1, 2, 3, ...} corresponds to a sequence of games.
Def. X_t = 1 if the player wins game t, X_t = 0 otherwise. Assume the Bernoulli process model of Ex. 6. Suppose each game costs the player 1 unit. If the player wins, he gets y + 1 units.
Def. Y_t = result (or "winnings") of game t, i.e.
Y_t = -1, if the player loses,
Y_t = y > 0, if the player wins.
If y = (1 - p)/p, then E[Y_t] = 0, i.e. the game is fair.
Def. N_t = X_1 + ... + X_t = # times the player wins in the first t games
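The fairness condition y = (1 - p)/p can be checked directly, since E[Y_t] = p·y - (1 - p). A small sketch (function names and the choice p = 0.4 are illustrative):

```python
import random

def expected_winnings(p, y):
    # One game: win y with probability p, lose 1 with probability 1 - p.
    return p * y - (1 - p)

def simulate_S(p, y, t, rng):
    # Total winnings S_t = Y_1 + ... + Y_t for t independent Ber(p) games.
    return sum(y if rng.random() < p else -1 for _ in range(t))

p = 0.4
y = (1 - p) / p                     # the fair payoff from the text
print(expected_winnings(p, y))      # 0 (up to floating-point rounding)

rng = random.Random(0)
print(simulate_S(p, y, 10_000, rng) / 10_000)  # empirical mean of Y_t, near 0
```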
S_t = Y_1 + ... + Y_t = total winnings in the first t games.
Ex. If 𝓕_t = the history up to time t, then E[N_{t+1} | 𝓕_t] = N_t + p.
One can apply the Central Limit Theorem to N_t and S_t!

Borel-Cantelli lemma. Let A_1, A_2, ... be events with P(A_k) = p_k. Suppose Σ_k p_k < ∞. Then, with probability 1, only finitely many of the events A_k occur.
Proof. Let B_n = ∪_{k ≥ n} A_k = "at least one of the events A_n, A_{n+1}, ... occurs", and B = ∩_{n ≥ 1} B_n = "infinitely many of the events A_1, A_2, ... occur". Then P(B) ≤ P(B_n) ≤ Σ_{k ≥ n} p_k → 0.

Application to gambling: a Bernoulli process with 0 < p < 1/2, and y = 1 (unfair!).
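The heart of the proof is that a summable sequence has vanishing tail sums. A quick numerical illustration with the example sequence p_k = 2^{-k} (my choice, not from the text), whose tail Σ_{k ≥ n} p_k equals 2^{-(n-1)}:

```python
def tail_sum(n, terms=10_000):
    # Partial tail sum sum_{k >= n} p_k for p_k = 2^{-k}; the omitted
    # remainder beyond `terms` summands is negligible in floating point.
    return sum(2.0**(-k) for k in range(n, n + terms))

# P(B) <= P(B_n) <= sum_{k >= n} p_k, and the right-hand side -> 0:
for n in (1, 5, 10, 20):
    print(n, tail_sum(n))   # 2^{-(n-1)}: 1.0, 0.0625, ...
```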
P(the player breaks even infinitely often)
= P(n wins in the first 2n games for infinitely many n) = 0.
(Use Stirling's formula: n! ~ (2πn)^{1/2} n^n e^{-n}.)

1.3. Conditional Probabilities and Expectations

Def. A, B ∈ 𝓕, P(B) > 0. The conditional probability of A given B is P(A | B) = P(A ∩ B)/P(B).
Ex. A = {X > a}, B = {X > b}, a > b > 0, where X ~ Exp(λ). Then P(A | B) = exp(-λ(a - b)).
Def. X, Y discrete rv's with P(Y = y) > 0. The conditional expectation of X given Y = y is E[X | Y = y] = Σ_x x P(X = x | Y = y). Note that E[X | Y] is a rv that depends on Y, but is a constant w.r.t. X! Recall: E[E[X | Y]] = E[X].
Ex. X_i ~ Po(λ_i), λ_i > 0, i = 1, 2, independent. Y = X_1 + X_2. Then the distribution of X_1 given Y = y is (X_1 | Y = y) ~ Bin(y, λ_1/(λ_1 + λ_2)). Therefore, E[X_1 | Y = y] = y λ_1/(λ_1 + λ_2).
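The Poisson-splitting example above can be verified exactly, without simulation: P(X_1 = k | Y = y) = P(X_1 = k) P(X_2 = y - k) / P(Y = y), where Y ~ Po(λ_1 + λ_2), and this should coincide with the Bin(y, λ_1/(λ_1 + λ_2)) pmf. A sketch (function names and the values λ_1 = 1.5, λ_2 = 2.5, y = 6 are illustrative):

```python
import math

def po(k, lam):
    # Poisson pmf from Ex. 3 of Section 1.1.
    return math.exp(-lam) * lam**k / math.factorial(k)

def bin_pmf(k, n, p):
    # Binomial pmf from Ex. 4 of Section 1.1.
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

lam1, lam2, y = 1.5, 2.5, 6
for k in range(y + 1):
    cond = po(k, lam1) * po(y - k, lam2) / po(y, lam1 + lam2)
    binom = bin_pmf(k, y, lam1 / (lam1 + lam2))
    print(k, cond, binom)   # the two columns agree term by term
```

Algebraically, the exponentials cancel and the ratio collapses to C(y, k) λ_1^k λ_2^{y-k} / (λ_1 + λ_2)^y, which is exactly the binomial pmf with p = λ_1/(λ_1 + λ_2).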
Def. X, Y discrete rv's with P(Y = y) > 0 for all y. The conditional variance of X given Y is
Var(X | Y) = E[X^2 | Y] - (E[X | Y])^2.
Th. Var(X) = E[Var(X | Y)] + Var(E[X | Y]).

1.4. Expectation of a Waiting Time

A rv X ≥ 0 defined on (Ω, 𝓕, P) can be interpreted as a waiting time, with survival function p(x) = P(X > x). Define a Bernoulli process {X_t, t ∈ T} such that X_t(ω) = 1 if X(ω) > t, and X_t(ω) = 0 otherwise, for all ω. Then, for integer-valued X, X = Σ_{t ≥ 0} X_t. It follows that E[X] = Σ_{t ≥ 0} P(X > t) = Σ_{t ≥ 0} p(t).
Ex. 1. X ~ Exp(λ), λ > 0, and a > 0. Then E[X] = 1/λ, and E[X | X > a] = a + 1/λ.
Ex. 2. X ~ Geo(p), 0 < p < 1, satisfies P(X > k) = (1 - p)^k, so E[X] = 1/p.
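The tail-sum identity E[X] = Σ_{t ≥ 0} P(X > t) can be checked against Ex. 2: for X ~ Geo(p), both the tail series Σ (1-p)^k and the direct mean Σ k·p(1-p)^{k-1} should equal 1/p. A sketch (p = 0.25 is an illustrative choice; the sums are truncated where the terms are negligible):

```python
p = 0.25
tail_series = sum((1 - p)**k for k in range(5_000))               # sum of P(X > k)
direct_mean = sum(k * p * (1 - p)**(k - 1) for k in range(1, 5_000))  # sum of k P(X = k)
print(tail_series, direct_mean, 1 / p)   # all three agree, = 4
```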
1.5. Two Applications

Ex. 1. St. Petersburg paradox: if you increase your bets to cover past losses, but stop at the first win, you win with certainty in fair, and even in unfavorable, games!
Ex. 2. N ~ Po(λ) is the # of accidents, X_i ~ Po(μ) is the # hospitalized in accident i, and Y = X_1 + ... + X_N. Var(Y) = ?
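Ex. 2 is answered by the conditional-variance theorem of Section 1.3: conditioning on N, Var(Y) = E[N] Var(X_1) + Var(N) (E[X_1])^2 = λμ + λμ^2, since a Po(μ) variable has mean and variance both equal to μ. A simulation sketch under assumed values λ = 2, μ = 3 (the sampler is Knuth's standard method; all names are mine):

```python
import math
import random

def poisson_sample(lam, rng):
    # Knuth's method: multiply uniforms until the product drops below e^{-lam}.
    L = math.exp(-lam)
    k, prod = 0, 1.0
    while True:
        k += 1
        prod *= rng.random()
        if prod <= L:
            return k - 1

def compound_poisson_sample(lam, mu, rng):
    # Y = X_1 + ... + X_N with N ~ Po(lam), X_i ~ Po(mu), all independent.
    n = poisson_sample(lam, rng)
    return sum(poisson_sample(mu, rng) for _ in range(n))

rng = random.Random(1)
lam, mu = 2.0, 3.0
samples = [compound_poisson_sample(lam, mu, rng) for _ in range(100_000)]
mean = sum(samples) / len(samples)
var = sum((s - mean)**2 for s in samples) / len(samples)

# Law of total variance: Var(Y) = E[N] Var(X) + Var(N) E[X]^2 = lam*mu + lam*mu^2
print(mean, lam * mu)               # empirical vs. E[Y] = 6
print(var, lam * mu + lam * mu**2)  # empirical vs. Var(Y) = 24
```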