Solutions For Stochastic Process Final Exam

1.
(a) $\lambda_{\text{BMW}} = 20 \times 10\% = 2$, so $X_{\text{BMW}} \sim \text{Poisson}(2)$. Let $N_t$ be the number of BMWs that have passed during $[0, t]$. Then the probability in question is
$$P\{N_1 = 0\} = e^{-2}.$$

(b) $\lambda_{\text{non-BMW}} = 20 \times 90\% = 18$, so $X_{\text{non-BMW}} \sim \text{Poisson}(18)$. Let $M_t$ be the number of non-BMWs that have passed during $[0, t]$. Then the expectation in question is
$$E(M_{20} \mid M_{10} = 5) = E(M_{20} - M_{10} + M_{10} \mid M_{10} = 5) = E(M_{20} - M_{10}) + 5 = 5 + \lambda_{\text{non-BMW}} \cdot 10 = 5 + 180 = 185,$$
using the independence and stationarity of the increments.

(c)
$$P\{N_1 = 0 \mid N_3 = 5\} = \frac{P\{N_1 = 0,\ N_3 = 5\}}{P\{N_3 = 5\}} = \frac{P\{N_1 = 0,\ N_3 - N_1 = 5\}}{P\{N_3 = 5\}} = \frac{P\{N_1 = 0\}\,P\{N_3 - N_1 = 5\}}{P\{N_3 = 5\}} = \frac{e^{-2} \cdot e^{-4}\frac{4^5}{5!}}{e^{-6}\frac{6^5}{5!}} = \Big(\frac{2}{3}\Big)^5 = \frac{32}{243}.$$

(d) Since the BMW process $N$ and the non-BMW process $M$ are independent,
$$P\{N_4 = 6,\ M_4 = 6 \mid N_{10} = 8\} = \frac{P\{N_4 = 6,\ N_{10} = 8,\ M_4 = 6\}}{P\{N_{10} = 8\}} = \frac{P\{N_4 = 6,\ N_{10} - N_4 = 2,\ M_4 = 6\}}{P\{N_{10} = 8\}} = \frac{P\{N_4 = 6\}\,P\{N_{10} - N_4 = 2\}\,P\{M_4 = 6\}}{P\{N_{10} = 8\}} = \frac{e^{-8}\frac{8^6}{6!} \cdot e^{-12}\frac{12^2}{2!} \cdot e^{-72}\frac{72^6}{6!}}{e^{-20}\frac{20^8}{8!}}.$$

(e) Let $T_{50}$ be the arrival time of the 50th BMW. Then $E(T_{50}) = 50/\lambda_{\text{BMW}} = 50/2 = 25$, i.e., 25 minutes after 2:00 PM. The expected time in question is 2:25 PM.
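As a sanity check on (a) and (c), here is a minimal Monte Carlo sketch in Python (not part of the original solution; it assumes only the BMW rate $\lambda = 2$ and independent increments, and the variable names and trial count are ours):

```python
import numpy as np

rng = np.random.default_rng(0)
n_trials = 200_000

# (a) P{N_1 = 0} for the BMW process with rate 2: should be e^{-2} ~ 0.1353.
p_a = np.mean(rng.poisson(2.0, n_trials) == 0)

# (c) P{N_1 = 0 | N_3 = 5} via independent increments:
# N_1 ~ Poisson(2), N_3 - N_1 ~ Poisson(4); condition on their sum being 5.
n1 = rng.poisson(2.0, n_trials)
inc = rng.poisson(4.0, n_trials)
cond = (n1 + inc) == 5
p_c = np.mean(n1[cond] == 0)

print(p_a, np.exp(-2))        # ~0.1353
print(p_c, (2 / 3) ** 5)      # ~0.1317 = 32/243
```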
2. Write $Z(t) = \sum_{k=1}^{N(t)} \theta_k(t)$ with $\theta_k(t) = \xi_k e^{-\alpha(t - S_k)}$, where $S_k$ is the $k$-th arrival time. Firstly, we compute
$$E[Z(t) \mid N(t) = n] = E\Big[\sum_{k=1}^{n} \theta_k(t) \,\Big|\, N(t) = n\Big] = E\Big[\sum_{k=1}^{n} \xi_k e^{-\alpha(t - S_k)} \,\Big|\, N(t) = n\Big] = \sum_{k=1}^{n} E\big[\xi_k e^{-\alpha(t - S_k)} \mid N(t) = n\big].$$
Since the marks $\xi_k$ are independent of the arrival times and of $N(t)$,
$$= \sum_{k=1}^{n} E[\xi_k]\, e^{-\alpha t}\, E\big[e^{\alpha S_k} \mid N(t) = n\big] = E(\xi_1)\, e^{-\alpha t}\, E\big[e^{\alpha S_1} + e^{\alpha S_2} + \cdots + e^{\alpha S_n} \mid N(t) = n\big].$$
Given $N(t) = n$, the arrival times are distributed as the order statistics of $n$ i.i.d. $\text{Uniform}(0, t)$ random variables $U_1, \ldots, U_n$, and the sum above is symmetric in its arguments, so
$$= E(\xi_1)\, e^{-\alpha t}\, E\big[e^{\alpha U_1} + e^{\alpha U_2} + \cdots + e^{\alpha U_n}\big] = E(\xi_1)\, e^{-\alpha t}\, n\, E\big[e^{\alpha U_1}\big] = E(\xi_1)\, e^{-\alpha t}\, n\, \frac{e^{\alpha t} - 1}{\alpha t} = n\, E(\xi_1)\, \frac{1 - e^{-\alpha t}}{\alpha t}.$$
Then
$$E[Z(t)] = E\big[E[Z(t) \mid N(t)]\big] = E\Big[N(t)\, E(\xi_1)\, \frac{1 - e^{-\alpha t}}{\alpha t}\Big] = E(\xi_1)\, \frac{1 - e^{-\alpha t}}{\alpha t}\, E[N(t)] = \lambda\, E(\xi_1)\, \frac{1 - e^{-\alpha t}}{\alpha}.$$
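The closed form $E[Z(t)] = \lambda E(\xi_1)(1 - e^{-\alpha t})/\alpha$ can be checked by simulating the shot-noise process directly, using the same uniform-order-statistics fact. A minimal sketch with illustrative values $\lambda = 3$, $\alpha = 0.7$, $\xi_k \sim \text{Exp}(1)$, $t = 5$ (none of these values come from the exam):

```python
import numpy as np

# Monte Carlo check of E[Z(t)] = lam * E[xi] * (1 - exp(-a*t)) / a
# for the shot-noise process Z(t) = sum_k xi_k * exp(-a*(t - S_k)).
rng = np.random.default_rng(1)
lam, a, t, n_trials = 3.0, 0.7, 5.0, 100_000

total = 0.0
for _ in range(n_trials):
    n = rng.poisson(lam * t)
    s = rng.uniform(0.0, t, n)      # given N(t)=n, arrivals are i.i.d. Uniform(0,t)
    xi = rng.exponential(1.0, n)    # marks, E[xi] = 1
    total += np.sum(xi * np.exp(-a * (t - s)))

print(total / n_trials)                          # simulated E[Z(t)]
print(lam * 1.0 * (1 - np.exp(-a * t)) / a)      # formula: ~4.16
```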
3.
(a) Here $T$ is the time of the first six, with at most three rolls, i.e., $T = \min\{k : X_k = 6\} \wedge 3$. We have
$$\{T = 1\} = \{X_1 = 6\} \in \sigma(X_1),$$
$$\{T = 2\} = \{(X_1, X_2) : X_1 \in \{1, 2, \ldots, 5\},\ X_2 = 6\} \in \sigma(X_1, X_2),$$
$$\{T = 3\} = \{(X_1, X_2, X_3) : X_1 \in \{1, \ldots, 5\},\ X_2 \in \{1, \ldots, 5\},\ X_3 \in \{1, \ldots, 6\}\} \in \sigma(X_1, X_2, X_3),$$
and for $k > 3$, $\{T = k\} = \emptyset \in \sigma(X_1, X_2, X_3)$. Hence $T$ is a stopping time.

(b) $S_T = \sum_{i=1}^{T} X_i$, and
$$E(S_T) = E(X_1 + X_2 + \cdots + X_T) = E(X_1)\, E(T) \quad \text{(by Wald's equation)}$$
$$= \Big[(1 + 2 + \cdots + 6) \cdot \tfrac{1}{6}\Big] \cdot \Big[1 \cdot \tfrac{1}{6} + 2 \cdot \tfrac{5}{6} \cdot \tfrac{1}{6} + 3 \cdot \big(\tfrac{5}{6}\big)^2\Big] = \frac{7}{2} \cdot \frac{91}{36} = \frac{637}{72}.$$

(c)
$$n = 1:\quad E(S_T \mid T = 1) = E(X_1 \mid T = 1) = 6.$$
$$n = 2:\quad E(S_T \mid T = 2) = E(X_1 \mid T = 2) + E(X_2 \mid T = 2) = \tfrac{1}{5}(1 + 2 + \cdots + 5) + 6 = 9.$$
$$n = 3:\quad E(S_T \mid T = 3) = E(X_1 \mid T = 3) + E(X_2 \mid T = 3) + E(X_3 \mid T = 3) = \tfrac{1}{5}(1 + \cdots + 5) + \tfrac{1}{5}(1 + \cdots + 5) + \tfrac{1}{6}(1 + \cdots + 6) = \frac{19}{2}.$$

(d)
$$E(S_T) = \sum_n P(T = n)\, E(S_T \mid T = n) = \frac{1}{6} \cdot 6 + \frac{5}{6} \cdot \frac{1}{6} \cdot 9 + \Big(\frac{5}{6}\Big)^2 \cdot \frac{19}{2} = \frac{637}{72},$$
which agrees with (b).

4.
(a) Let $x$ denote the number of machines in operating condition at the end of a period, and let $y$ denote the number of machines that have been under repair for one period. Then the state space is $S = \{(x, y)\} = \{(2,0), (1,0), (1,1), (0,0), (0,1)\}$, and, with rows and columns in this order,
$$P = \begin{pmatrix} (1-\alpha)^2 & 2\alpha(1-\alpha) & 0 & \alpha^2 & 0 \\ 0 & 0 & 1-\beta & 0 & \beta \\ 1-\beta & \beta & 0 & 0 & 0 \\ 0 & 0 & 0 & 0 & 1 \\ 0 & 1 & 0 & 0 & 0 \end{pmatrix}.$$
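As a numeric check on this matrix, and on the limiting distribution computed in part (b) below, one can raise $P$ to a high power. A minimal sketch using the values $\alpha = 0.1$, $\beta = 0.2$ that appear in (b) (the state ordering matches the matrix above):

```python
import numpy as np

# States ordered (2,0), (1,0), (1,1), (0,0), (0,1); a = alpha, b = beta.
a, b = 0.1, 0.2
P = np.array([
    [(1 - a)**2, 2 * a * (1 - a), 0,     a**2, 0],
    [0,          0,               1 - b, 0,    b],
    [1 - b,      b,               0,     0,    0],
    [0,          0,               0,     0,    1],
    [0,          1,               0,     0,    0],
])

pi = np.linalg.matrix_power(P, 200)[0]   # rows of P^n converge to the limiting distribution
print(pi, pi[:3].sum())                  # P(at least one machine up) ~ 0.9508, as in (b)
```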
(b) Let $\pi = (\pi_1, \pi_2, \pi_3, \pi_4, \pi_5)$ be the limiting distribution of the Markov chain. Solve
$$(1-\alpha)^2 \pi_1 + (1-\beta)\pi_3 = \pi_1, \qquad 2\alpha(1-\alpha)\pi_1 + \beta\pi_3 + \pi_5 = \pi_2, \qquad (1-\beta)\pi_2 = \pi_3,$$
$$\alpha^2 \pi_1 = \pi_4, \qquad \beta\pi_2 + \pi_4 = \pi_5, \qquad \pi_1 + \pi_2 + \pi_3 + \pi_4 + \pi_5 = 1;$$
then, writing $D = (2\alpha^2 + 1)(1-\beta)^2 + 2\alpha(2-\alpha)$, we have
$$\pi_1 = \frac{(1-\beta)^2}{D}, \quad \pi_2 = \frac{\alpha(2-\alpha)}{D}, \quad \pi_3 = \frac{\alpha(1-\beta)(2-\alpha)}{D}, \quad \pi_4 = \frac{\alpha^2(1-\beta)^2}{D}, \quad \pi_5 = \frac{\alpha\beta(2-\alpha) + \alpha^2(1-\beta)^2}{D}.$$
The probability in question is
$$\pi_1 + \pi_2 + \pi_3 = \frac{(1-\beta)^2 + \alpha(2-\alpha)(2-\beta)}{D}\,\bigg|_{\alpha = 0.1,\ \beta = 0.2} = 95.08\%.$$

5.
(a)
$$P = \begin{pmatrix} 0 & 0 & \frac{c}{c+d} & \frac{d}{c+d} \\ 0 & 0 & \frac{c}{c+d} & \frac{d}{c+d} \\ \frac{c}{c+d} & \frac{d}{c+d} & 0 & 0 \\ \frac{c}{c+d} & \frac{d}{c+d} & 0 & 0 \end{pmatrix}.$$

(b)
$$P^2 = P \cdot P = \begin{pmatrix} \frac{c}{c+d} & \frac{d}{c+d} & 0 & 0 \\ \frac{c}{c+d} & \frac{d}{c+d} & 0 & 0 \\ 0 & 0 & \frac{c}{c+d} & \frac{d}{c+d} \\ 0 & 0 & \frac{c}{c+d} & \frac{d}{c+d} \end{pmatrix}, \qquad P^3 = P^2 \cdot P = P.$$
By induction, for $k \in \mathbb{N}$ and $n \in \mathbb{Z}_+$,
$$P^k = \begin{cases} P, & \text{for } k = 2n+1; \\ P^2, & \text{for } k = 2n. \end{cases}$$
Therefore,
$$\lim_{n \to \infty} P^{2n} = P^2 \qquad \text{and} \qquad \lim_{n \to \infty} P^{2n+1} = P,$$
with $P$ and $P^2$ as displayed above.
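The period-2 structure of $P$ is easy to confirm numerically. A minimal sketch with illustrative values $c = 1$, $d = 2$ (any positive $c$ and $d$ give the same conclusion):

```python
import numpy as np

# Check P^3 = P, so odd powers equal P and even powers equal P^2.
c, d = 1.0, 2.0
p, q = c / (c + d), d / (c + d)
P = np.array([
    [0, 0, p, q],
    [0, 0, p, q],
    [p, q, 0, 0],
    [p, q, 0, 0],
])

P2 = P @ P
print(np.allclose(P2 @ P, P))                              # True: P^3 = P
print(np.allclose(np.linalg.matrix_power(P, 100), P2))     # even powers equal P^2
print(np.allclose(np.linalg.matrix_power(P, 101), P))      # odd powers equal P
```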
6. Fix $\tau \le t$ and let $\mathcal{F}_\tau = \sigma\{N_u : u \le \tau\}$. Then, by the independence of the increments,
$$E(N_t - \lambda t \mid \mathcal{F}_\tau) = E(N_t - N_\tau + N_\tau - \lambda\tau + \lambda\tau - \lambda t \mid \mathcal{F}_\tau) = E(N_t - N_\tau) + N_\tau - \lambda\tau + \lambda\tau - \lambda t = \lambda(t - \tau) + N_\tau - \lambda\tau - \lambda(t - \tau) = N_\tau - \lambda\tau.$$
Hence $\{N_t - \lambda t : t \ge 0\}$ is a martingale.

7.
(a) Here $V_t = T_{N_t + 1} - t$ is the waiting time from $t$ until the next arrival, so
$$P\{V_t > s \mid N_u,\ u \le t\} = P\{N_{t+s} - N_t = 0 \mid N_u,\ u \le t\} = P\{N_{t+s} - N_t = 0\} = P\{N_s = 0\} = e^{-\lambda s}.$$
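Part (a) says the forward waiting time $V_t$ is $\text{Exp}(\lambda)$ regardless of $t$; a simulation sketch with illustrative values $\lambda = 2$, $t = 3$ (the values and names are ours):

```python
import numpy as np

# Empirical check that V_t = T_{N_t + 1} - t is Exp(lam), independent of t.
rng = np.random.default_rng(2)
lam, t, n_trials = 2.0, 3.0, 100_000

v = np.empty(n_trials)
for i in range(n_trials):
    arrival = 0.0
    while True:                      # generate arrivals until one passes t
        arrival += rng.exponential(1.0 / lam)
        if arrival > t:
            v[i] = arrival - t       # forward waiting (recurrence) time
            break

print(v.mean(), 1.0 / lam)                     # ~0.5
print(np.mean(v > 1.0), np.exp(-lam * 1.0))    # P{V_t > 1} ~ e^{-2}
```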
(b) Let $U_t = t - T_{N_t}$ be the age at time $t$, i.e., the time since the last arrival prior to $t$. If $U_t$ were a stopping time of the Poisson process, then $\{U_t \le s\}$ would be an event in the $\sigma$-algebra $\sigma\{N_u : u \le s\}$ for all $s > 0$. Mathematically, this amounts to the following relation:
$$\{U_t \le s\} \in \sigma\{N_u : u \le s\}, \quad \forall s > 0. \tag{1}$$
Each sample point $\omega$ in $\sigma\{N_u : u \le s\}$ is a coset (or "equivalence class") obtained by ignoring the information about each Poisson sample path after time $s$, so that if two Poisson sample paths differ from each other only after time $s$, both are members of the same coset $\omega$. Then, to verify (1), we need to answer the following query with a deterministic YES/NO answer:
$$\text{For each } \omega \in \sigma\{N_u : u \le s\}, \text{ is } \omega \in \{U_t \le s\}? \tag{2}$$
We note that $\{U_t \le s\} = \{t - T_{N_t} \le s\} = \{T_{N_t} \ge t - s\}$. In other words, the set $\{U_t \le s\}$ consists of all the sample points whose last arrival prior to $t$ occurred at or after time mark $t - s$. Without loss of generality, we assume that $t - s < s$; the other case can be proved similarly.

Suppose $\omega \in \sigma\{N_u : u \le s\}$ has an arrival in the interval $[t-s, s]$. In this case, regardless of what could happen after time $s$, we are certain that this $\omega$ (to be precise, every member of the coset $\omega$) has its last arrival prior to $t$ occur at or after $t - s$. The answer to query (2) is therefore YES for such an $\omega$.

On the other hand, suppose $\omega \in \sigma\{N_u : u \le s\}$ does not have an arrival in the interval $[t-s, s]$. Then some member of $\omega$ belongs to $\{T_{N_t} \ge t - s\}$ if it has an arrival in $(s, t]$, whereas some member of $\omega$ does not if it has no arrival in $(s, t]$. Since $\omega$ carries no information after $s$, both situations are possible. As a result, we do not have a definite YES or NO answer to query (2) for such an $\omega$. Hence $U_t$ is not a stopping time of the Poisson process.

Likewise, to check whether $V_t$ is a stopping time, we would have to verify $\{V_t \le s\} \in \sigma\{N_u : u \le s\}$. This cannot be true if $s < t$: given the information only up to time $s < t$, we can say nothing about what will happen after time $t$. In other words, $V_t$ is not a stopping time either.

(c) We first claim that $T_{N_t}$ is not a stopping time. If $s < t$, we have $\{T_{N_t} \le s\} = \{N_t - N_s = 0\}$. In other words, the last arrival before $t$ actually happened before an earlier time $s$ if, and only if, there is no arrival in $(s, t]$. However, $\{N_t - N_s = 0\} \notin \sigma\{N_u : u \le s\}$, since the event $\{N_t - N_s = 0\}$ requires information after time $s$. Hence $T_{N_t}$ is not a stopping time.

On the other hand, $T_{N_t + 1}$ can be shown to be a stopping time. If $s \le t$, then since $T_{N_t + 1} > t$ by definition, we have $\{T_{N_t + 1} \le s\} = \emptyset \in \sigma\{N_u : u \le s\}$. If $s > t$, then we can observe the information in $(t, s]$ for each $\omega$ in $\sigma\{N_u : u \le s\}$: if an $\omega$ has no arrival in $(t, s]$, it cannot belong to $\{T_{N_t + 1} \le s\}$; if an $\omega$ has at least one arrival in $(t, s]$, it belongs to $\{T_{N_t + 1} \le s\}$. Namely, we have just shown
$$\{T_{N_t + 1} \le s\} = \{N_s - N_t \ge 1\} \in \sigma\{N_u : u \le s\}.$$
The Poisson waiting-time paradox does not contradict the strong Markov property because $T_{N_t + 1} - T_{N_t}$ is not the difference of two stopping times: $T_{N_t}$ fails to be one. Therefore, the distribution of $T_{N_t + 1} - T_{N_t}$ is not necessarily exponential.

(d) By $E[T_{n+1} - T_n]$, we are averaging the inter-arrival time between the $n$th and the $(n+1)$th events over all possible sample paths $\omega$. On the other hand, given an observation time $t$, $E[T_{N_t + 1} - T_{N_t}]$ averages, for each $\omega$, the length of the particular interval $[T_{N_t}, T_{N_t + 1}]$ that covers $t$. Therefore, there is no paradox; after all, we are talking about two different things: $E[T_{n+1} - T_n]$ and $E[T_{N_t + 1} - T_{N_t}]$. In class, we have seen that the latter is actually larger.
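The gap between the two expectations is easy to see numerically: for $t$ large, the interval covering $t$ has mean length close to $2/\lambda$, twice the ordinary inter-arrival mean. A sketch with illustrative values $\lambda = 1$, $t = 20$ (so $t \gg 1/\lambda$ and edge effects are negligible):

```python
import numpy as np

# Simulate the interval [T_{N_t}, T_{N_t+1}] covering a fixed time t and compare
# its mean length with the plain inter-arrival mean 1/lam.
rng = np.random.default_rng(3)
lam, t, n_trials = 1.0, 20.0, 100_000

lengths = np.empty(n_trials)
for i in range(n_trials):
    prev, cur = 0.0, rng.exponential(1.0 / lam)
    while cur <= t:                  # advance arrivals until one passes t
        prev, cur = cur, cur + rng.exponential(1.0 / lam)
    lengths[i] = cur - prev          # T_{N_t+1} - T_{N_t}

print(lengths.mean())   # ~2/lam = 2.0: the length-biased ("inspected") interval
print(1.0 / lam)        # ~1.0: the ordinary inter-arrival mean E[T_{n+1} - T_n]
```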
Why is $E[T_{N_t + 1} - T_{N_t}]$ larger? In a Poisson process, the inter-arrival times are i.i.d. exponentially distributed with mean $1/\lambda$. More precisely, the length of the time interval $[T_1, T_2]$ is random, depending on $\omega$, and its average is $E[T_2 - T_1] = 1/\lambda$. Similarly, the length of the time interval $[T_5, T_6]$ is random, depending on $\omega$, but independent of the length $(T_2 - T_1)(\omega)$, again obeying the exponential distribution with average $E[T_6 - T_5] = 1/\lambda$. In general, we have $E[T_{n+1} - T_n] = 1/\lambda$ for any $n$. This average is a sort of "space average," as the average is taken over the sample space.

Do we have a time average for the inter-arrival times here? The answer is positive, as the Poisson process is Markovian. Let us consider
$$E\Big[\frac{1}{m}\sum_{k=1}^{m}(T_k - T_{k-1})\Big]. \tag{3}$$
It is easily computed that this expected value is $1/\lambda$, since each term is $1/\lambda$. Given a sample path $\omega$, the quantity $\frac{1}{m}\sum_{k=1}^{m}(T_k - T_{k-1})(\omega)$ is the average of the first $m$ inter-arrival times for this $\omega$; taking the mean over all possible $\omega$'s of these averages is the meaning of (3). Now let $m$ tend to infinity: the limit is $1/\lambda$, which is the long-run mean of the inter-arrival times, over all sample paths and over all inter-arrival times.

Now we can interpret why $E[T_{N_t + 1} - T_{N_t}]$ is longer. We first notice that, for any $\omega$, its inter-arrival times are not of uniform length, but consist of short and long periods. According to (3), the average length of the short and long ones together is $1/\lambda$ over all possible $\omega$'s. If we make a random observation at time $t$ on a particular $\omega$, and assume that the observation is independent of everything in the Poisson process, it is more likely that we hit one of the longer intervals of $\omega$. Similarly, if another $\omega$ is observed, we again have a better chance of hitting the longer intervals of that $\omega$. In other words, for almost all $\omega$, we are confident that we have observed a longer interval. Consequently, the average observed interval is longer than the average inter-arrival time $1/\lambda$, so that
$$E[T_{N_t + 1} - T_{N_t}] > 1/\lambda = E[T_{n+1} - T_n].$$
On a particular day $\omega$ on which you ride a bus whose arrival process is Poisson, the majority of passengers will find that they have hit one of the longer inter-arrival periods of the day. As a result, upon your arrival (think of yourself as belonging to the majority, unless you get lucky), regardless of when the previous bus left the station, the expected remaining time until the next bus is still $1/\lambda$. That is, if the information says that the bus runs on a 15-minute basis, then no matter when you arrive at the station, you should be prepared to wait 15 minutes on average.
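Finally, the time average (3) and its limit $1/\lambda$ can be checked in a couple of lines (illustrative values $\lambda = 2$, $m = 10^6$, not from the exam):

```python
import numpy as np

# Time-average check for (3): (1/m) * sum of the first m inter-arrival times -> 1/lam.
rng = np.random.default_rng(4)
lam, m = 2.0, 1_000_000
gaps = rng.exponential(1.0 / lam, m)   # i.i.d. Exp(lam) inter-arrival times T_k - T_{k-1}
print(gaps.mean(), 1.0 / lam)          # both ~0.5
```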