IEOR 6711, HMWK 5, Professor Sigman

1. Semi-Markov processes: Consider an irreducible positive recurrent discrete-time Markov chain $\{X_n\}$ with transition matrix $P = (P_{i,j})$, $i, j \in S$, and finite state space. Suppose that, just as for a CTMC, it is embedded into continuous time, to get a continuous-time process $\{X(t) : t \ge 0\}$, but the holding times $H_i$ are positive rvs with a general distribution $F_i(x) = P(H_i \le x)$, $x \ge 0$, with finite first moment $0 < E(H_i) = 1/a_i < \infty$, $i \in S$. So, just as for a CTMC, the process makes transitions according to a Markov chain $\{X_n\}$, and when making a transition from $i$ to $j$ (probability $P_{i,j}$, independent of the past) the chain remains in state $j$, independent of the past, for an amount of time $H_j \sim F_j$; but because the holding times are not assumed exponential, the Markov property does not hold in continuous time for $\{X(t)\}$. Nonetheless, show that the limiting distribution
\[
P_j = \lim_{t \to \infty} \frac{1}{t} \int_0^t I\{X(s) = j\}\,ds, \quad \text{wp1},\ j \in S,
\]
still exists and is exactly the same as when the $F_i$ are exponential at rate $a_i$, $i \in S$; the same as for the CTMC case:
\[
P_j = \frac{\pi_j / a_j}{\sum_i \pi_i / a_i}. \tag{1}
\]
(Recall Proposition 1.3, Page 1 of your Lecture Notes on Continuous-Time Markov Chains.) Thus the limiting distribution of a semi-Markov process depends on the $F_i$ only through their means.

SOLUTION: The proof is exactly that given in Proposition 1.3, Page 1, mentioned above; nowhere did the proof use the exponential distribution of the holding times, only that they have finite and non-zero first moments. The Renewal Reward Theorem still yields $P_j = \frac{1/a_j}{E(T_{j,j})}$, where $T_{j,j}$ is the continuous-time return time back to state $j$ given that initially $X(0) = j$ with holding time $H_j$. Here we are also assuming a finite state space, so that $E(T_{j,j}) < \infty$ and the denominator in formula (1) is always finite.

2. Consider a stable FIFO M/M/1 queue, $0 < \rho < 1$. Let $X(t)$ denote the number in system at time $t$, and let $P_n = (1-\rho)\rho^n$, $n \ge 0$, denote the stationary distribution.
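(A numerical aside on Problem 1, before continuing: the claim that the limit (1) depends on the $F_i$ only through their means can be checked with a short simulation. This is only a sketch; the three-state chain, its transition matrix, and the Uniform holding-time distributions below are all invented for illustration.)

```python
import random

# Invented example: a 3-state semi-Markov process whose holding times are
# Uniform(0, 2*m_i) -- non-exponential, but with mean m_i = 1/a_i.
random.seed(1)

P = [[0.0, 0.5, 0.5],    # embedded-chain transition matrix
     [0.7, 0.0, 0.3],
     [0.4, 0.6, 0.0]]
m = [1.0, 2.0, 0.5]      # holding-time means 1/a_i

# stationary distribution pi of the embedded chain, via power iteration
pi = [1 / 3] * 3
for _ in range(500):
    pi = [sum(pi[i] * P[i][j] for i in range(3)) for j in range(3)]

# formula (1): P_j = (pi_j / a_j) / sum_i (pi_i / a_i)
denom = sum(pi[i] * m[i] for i in range(3))
limit = [pi[j] * m[j] / denom for j in range(3)]

# simulate occupation times over many transitions
occ = [0.0, 0.0, 0.0]
state = 0
for _ in range(200_000):
    occ[state] += random.uniform(0.0, 2.0 * m[state])  # non-exponential
    state = random.choices((0, 1, 2), weights=P[state])[0]

total = sum(occ)
empirical = [occ[j] / total for j in range(3)]
print([round(x, 3) for x in limit])
print([round(x, 3) for x in empirical])
```

The two printed vectors should agree to within Monte Carlo error, even though the holding times are uniform rather than exponential.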
Show (by direct calculation) that if $X(0) \sim (P_n)$ (e.g., the chain is started off with its stationary distribution, hence is a stationary process), then the time until the first departure (after time $t = 0$), $t_1^d$, has an exponential distribution at rate $\lambda$.

SOLUTION: We condition on whether the server is busy or not at time $t = 0$. If busy (probability $P(X(0) > 0) = \rho = 1 - P_0$), then the next departure will occur after the (remaining) service completion of length $S$, which is exponential at rate $\mu$ by the memoryless property; $t_1^d = S \sim \exp(\mu)$. If not busy (probability $P(X(0) = 0) = 1 - \rho = P_0$), then first an arrival must come (remaining length $T \sim \exp(\lambda)$ by the memoryless property), and then this customer must enter service for an independent amount of time $S \sim \exp(\mu)$; so in this case $t_1^d = T + S \sim \exp(\lambda) * \exp(\mu)$ (a convolution). We can now compute the Laplace transform of $t_1^d$, yielding (after some algebra) that for any $s \ge 0$,
\[
E\!\left[e^{-s t_1^d}\right] = \rho\,\frac{\mu}{\mu+s} + (1-\rho)\,\frac{\lambda}{\lambda+s}\,\frac{\mu}{\mu+s} = \frac{\lambda}{\lambda+s},
\]
and indeed $t_1^d \sim \exp(\lambda)$.

3. Consider the M/M/1 queue (arrival rate $\lambda$, service time rate $\mu$) with the following twist: each customer independently will get impatient after an amount of time that is exponentially distributed at rate $\gamma$ while waiting in line (queue), and will leave before ever entering service, without ever returning. A customer who does enter service completes service (e.g., customers are only impatient while waiting in the line, not when in service).

(a) You arrive finding exactly one customer in the system (hence they are in service) and you join the queue to wait. What is the probability that you will get served?

SOLUTION: Letting $C$ denote your independent $\exp(\gamma)$ rv, independent of the $\exp(\mu)$ remaining service time $S$, we want $P(C > S) = \mu/(\mu+\gamma)$.

(b) You arrive finding exactly two (2) customers in the system (one in service, one in line) and you join the end of the queue to wait. What is the probability that you will get served?

SOLUTION: Let $S$ denote the remaining service time of whoever is in service ($\exp(\mu)$ by the memoryless property), and let $S_1$ denote the service time of the customer waiting in front of you (and let $C_1$ denote their exponential impatience time, and $C_2$ yours). $S, S_1, C_1, C_2$ are independent exponential rvs. Let $D$ denote the length of time until the server will be free to serve you: $D = S + S_1 I\{C_1 > S\}$. We need to compute $P(D < C_2)$. Denote the desired probability by $p$. Observe that $p = p_1 p_2$, where $p_1$ is the probability that you will move into the next position (e.g., that of the customer in front of you; first in line), and $p_2$ is the (conditional) probability that you will get served given that you do move into the next position. By the memoryless property of exponential distributions, $p_2$ is simply the same as the answer to (a): $p_2 = P(C_1 > S) = \mu/(\mu+\gamma)$.
Moreover, $p_1 = P(C_2 > \min\{S, C_1\})$, because $\{C_2 > \min\{S, C_1\}\}$ is the event that you move into the next position: either you do so because the customer in service departs ($S$ completes) before either of the two of you in line gets impatient, or because the customer in front of you gets impatient before $S$ is complete and before you get impatient. (Both scenarios cause you to move to the next position (head of the line).) $\min\{S, C_1\} \sim \exp(\mu+\gamma)$, independent of $C_2 \sim \exp(\gamma)$, so $p_1 = (\mu+\gamma)/(\mu+2\gamma)$. Thus
\[
p = p_1 p_2 = \frac{\mu}{\mu+2\gamma}.
\]
This method can be generalized (by induction) to handle computing the probability that you get served if upon arrival you find $n \ge 1$ in front of you, and the answer is
\[
p = \frac{\mu}{\mu+n\gamma}.
\]

(c) Model this as a birth and death process; give the birth and death rates $\lambda_n, \mu_n$, $n \ge 0$.

SOLUTION: $X(t)$, the number of customers in the system at time $t$, forms a B&D process with
\[
\lambda_n = \lambda,\ n \ge 0, \qquad \mu_n = (n-1)\gamma + \mu,\ n \ge 1\ (\mu_0 = 0).
\]
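The formula $p = \mu/(\mu + n\gamma)$ from parts (a)–(b) can be checked by simulation, using the memoryless race among the service clock, the impatience clocks of the $n-1$ waiters ahead of you, and your own clock. This is a sketch; the rates $\mu$ and $\gamma$ below are invented.

```python
import random

random.seed(2)
mu, gamma = 1.3, 0.7   # invented illustrative rates

def served(n_ahead):
    """One trial with n_ahead customers ahead of you (first one in service).

    With k ahead, the competing exponential clocks are: service (rate mu),
    the k-1 waiters ahead of you (rate gamma each), and you (rate gamma).
    Your clock rings first with probability gamma / (mu + k*gamma); any
    other event moves you up one position, and memorylessness restarts
    all clocks.  Returns True if you reach the server."""
    k = n_ahead
    while k > 0:
        if random.random() < gamma / (mu + k * gamma):
            return False           # your patience expired first
        k -= 1                     # head of line left (served or reneged)
    return True

results = {}
for n in (1, 2, 5):
    trials = 100_000
    results[n] = sum(served(n) for _ in range(trials)) / trials
    print(n, round(results[n], 3), round(mu / (mu + n * gamma), 3))
```

Each printed pair (empirical vs. $\mu/(\mu + n\gamma)$) should agree to within Monte Carlo error; the race probabilities telescope exactly as in the induction argument above.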
(d) Set up the birth and death balance equations for the limiting probabilities $P_n$ (but do not try to solve them).

SOLUTION: $\lambda P_n = (n\gamma + \mu)P_{n+1}$, $n \ge 0$.

(e) Compute the ratio $P_{n+1}/P_n$ and prove, using the ratio test from calculus, that the limiting probabilities exist for all values of $\lambda > 0$, $\mu > 0$, $\gamma > 0$. Thus this chain is always positive recurrent (e.g., a condition such as $\rho < 1$ is not needed); explain intuitively why this should be so.

SOLUTION:
\[
P_{n+1}/P_n = \frac{\lambda}{n\gamma + \mu} \to 0, \ \text{as } n \to \infty.
\]
In particular, the ratio is eventually strictly bounded above by a constant less than 1; hence $\sum_n P_n < \infty$. Intuitively, the system is always stable because if the line gets too big, then more and more customers will get impatient and leave, hence reducing the congestion.

4. Time-reversible CTMCs: Just as in discrete time, if we consider a positive recurrent CTMC in stationarity that has been started since the infinite past, $\{X(t) : -\infty < t < \infty\}$, then the (stationary) time-reversal $X^{(r)}(t) = X(-t)$, $t \ge 0$, is itself a CTMC. It has the same holding time rates $\{a_i\}$ and the same stationary distribution $P = (P_j)$ as the original forward-time CTMC. The only things that can differ are the infinitesimal generators $Q$ and $Q^{(r)}$ (equivalently, the embedded chain transition matrices). A positive recurrent CTMC is called time-reversible if the time-reversed process $\{X^{(r)}(t) : t \ge 0\}$ has the same distribution as the forward-time process $\{X(t) : t \ge 0\}$. In words: the long-run rate that the chain moves from $i$ to $j$ equals the long-run rate that the chain moves from $j$ to $i$, for any two states $i, j \in S$. Thus, because the holding time rates $\{a_i\}$ and stationary distribution $P = (P_j)$ are the same, this is equivalent to saying that a positive recurrent CTMC is time-reversible if for all pairs of states $i, j \in S$,
\[
a_i P_i P_{i,j} = a_j P_j P_{j,i}. \tag{2}
\]

(a) Show that if the embedded chain $\{X_n\}$ is also positive recurrent, then (2) reduces to $\{X_n\}$ being time-reversible; $\pi_i P_{i,j} = \pi_j P_{j,i}$, as we would expect.
SOLUTION: Recall (Problem 1 above) that in this case
\[
P_j = \frac{\pi_j / a_j}{\sum_i \pi_i / a_i}, \ j \in S.
\]
Using this formula for $P_i$ and $P_j$ in Equation (2) above yields the result.

(b) Recall that in general, if $\{X_n\}$ is a positive recurrent MC, then its time-reversal MC exists and has transition matrix
\[
P^{(r)}_{i,j} = \frac{\pi_j}{\pi_i} P_{j,i}.
\]
Now let us consider our positive recurrent CTMC $\{X(t)\}$ and suppose that its embedded chain $\{X_n\}$ is null recurrent, hence $\pi$ does not exist. Nonetheless, $\{X^{(r)}(t)\}$ must have an embedded Markov chain. Find its transition matrix in this case (denote it by $P^{(r)}_{i,j}$).

SOLUTION: From the Equation (2) discussion, the general (time-reversible or not) version would be: the long-run rate that the reversed continuous-time chain moves from $i$ to $j$ equals the long-run rate that the forward continuous-time chain moves from $j$ to $i$, for any two states $i, j \in S$. This yields
\[
a_i P_i P^{(r)}_{i,j} = a_j P_j P_{j,i}. \tag{3}
\]
Thus we get
\[
P^{(r)}_{i,j} = \frac{a_j P_j}{a_i P_i} P_{j,i}.
\]
Notice that when $\{X_n\}$ is positive recurrent, then $\frac{a_j P_j}{a_i P_i} = \frac{\pi_j}{\pi_i}$, via using $P_j = \frac{\pi_j / a_j}{\sum_i \pi_i / a_i}$, $j \in S$.

(c) Since a birth and death process can only make transitions of magnitude $\pm 1$, we see that in such a case Equation (2) reduces to "the long-run rate that the chain moves from $i$ to $i+1$ equals the long-run rate that the chain moves from $i+1$ to $i$, for all states $i \in S$"; the birth and death balance equations. We conclude: every positive recurrent birth and death (B&D) process is time-reversible. We now apply this to the M/M/1 queue: $X(t)$, the number of customers in a FIFO M/M/1 queue, is a B&D process, so we conclude that when $\rho < 1$, it is time-reversible. Assume that it is started at time $t = 0$ with its stationary distribution. Let $\psi = \{t_n : n \ge 1\}$ denote the Poisson arrival times starting from time $t = 0$: $0 < t_1 < t_2 < \cdots$, and let $\psi^{(d)} = \{t_n^d : n \ge 1\}$ denote the point process of departure times after time $t = 0$: $0 < t_1^d < t_2^d < \cdots$. We know from Exercise 2 above that $t_1^d$ has an exponential distribution at rate $\lambda$, but here you will deduce more. Argue (from time-reversibility) that $\psi^{(d)}$ must have the same distribution as $\psi$, and hence must itself be a Poisson process at rate $\lambda$: the departure process from a stationary M/M/1 queue is itself a Poisson process.
SOLUTION: The points of $\psi$ are precisely the times at which the forward process jumps up by 1 (e.g., the times at which births occur). Thus they must have the same distribution (from time-reversibility) as the points in time at which the time-reversed process jumps up by 1; but such times have the same distribution as the departure times $\psi^{(d)}$ (e.g., the times at which deaths occur).
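A quick simulation supports this conclusion. The sketch below (with invented rates $\lambda$ and $\mu$) starts the queue from its geometric stationary distribution, collects interdeparture gaps, and checks that their sample mean and variance match those of an $\exp(\lambda)$ rv, namely $1/\lambda$ and $1/\lambda^2$.

```python
import random

random.seed(3)
lam, mu = 0.8, 1.0          # invented rates; rho = 0.8 < 1
rho = lam / mu

# start in stationarity: P(X(0) = n) = (1 - rho) * rho^n (geometric)
x = 0
while random.random() < rho:
    x += 1

t, deps = 0.0, []
while len(deps) < 200_000:
    if x == 0:
        t += random.expovariate(lam)        # must wait for an arrival
        x = 1
    else:
        t += random.expovariate(lam + mu)   # next event (arrival/departure)
        if random.random() < lam / (lam + mu):
            x += 1                          # arrival won the race
        else:
            x -= 1                          # departure
            deps.append(t)

gaps = [b - a for a, b in zip(deps, deps[1:])]
mean = sum(gaps) / len(gaps)
var = sum((g - mean) ** 2 for g in gaps) / len(gaps)
print(round(mean, 3), round(1 / lam, 3))    # exp(lam) mean is 1/lam
print(round(var, 3), round(1 / lam**2, 3))  # exp(lam) variance is 1/lam^2
```

Matching mean and variance is of course only a partial check of the Poisson property, but it is exactly what the time-reversibility argument predicts for a stationary start.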
5. Consider a CTMC $\{X(t)\}$ (with $P_{i,i} = 0$, $i \in S$) with embedded chain transition matrix $P = (P_{i,j})$ and holding time rates $\{a_i\}$. Assume that $a = \sup\{a_i : i \in S\} < \infty$. Consider an alternative CTMC $\{\bar X(t)\}$ for which all holding time rates are fixed at the constant rate $a$; $\bar a_i = a$, and which has embedded chain transition probabilities given by
\[
\bar P_{i,j} = \begin{cases} \frac{a_i}{a} P_{i,j}, & j \ne i, \\[4pt] 1 - \frac{a_i}{a}, & j = i. \end{cases}
\]
(So $\bar P_{i,i} > 0$ is possible.)

(a) Let $N(t)$ denote the number of transitions by time $t$ for $\{\bar X(t)\}$. Explain why $\{N(t) : t \ge 0\}$ forms a Poisson process at rate $a$.

SOLUTION: By the Markov property, for any state $i$, given $\bar X(t) = i$, the chain will spend an exponential amount of time at rate $a$ in state $i$, independent of the past, and then move. But this rate $a$ does not even depend on $i$, so the sequence of consecutive holding times forms an iid sequence of rvs distributed as exponential at rate $a$: a Poisson process at rate $a$.

(b) Show that the balance equations are the same for the two chains.

SOLUTION: The balance equations for $\bar X(t)$ are
\[
a P_j = a P_j \Big(1 - \frac{a_j}{a}\Big) + \sum_{i \ne j} P_i\, a \Big(\frac{a_i}{a} P_{i,j}\Big), \ j \in S, \tag{4}
\]
which algebraically reduce to
\[
a_j P_j = \sum_{i \ne j} P_i a_i P_{i,j}, \ j \in S. \tag{5}
\]

(c) Explain why $\{X(t)\}$ and $\{\bar X(t)\}$ have the same distribution as stochastic processes; $P_{i,j}(t) = \bar P_{i,j}(t)$, $t \ge 0$, for all pairs $i, j$.

SOLUTION: Recall that a geometric sum of iid exponentials is again exponential. So all that is happening here is that each original holding time $H_i \sim \exp(a_i)$ has been broken down and re-expressed as a geometric sum of iid exponentials at rate $a$, in which the geometric has success probability $a_i/a$. That is what the $\bar P_{i,j}$ do. So when state $i$ is entered, in both models the total amount of time spent there until changing to a state $j \ne i$ is exponential at rate $a_i$.

6. Sampling a CTMC to obtain a discrete-time Markov chain: Suppose that $\{X(t)\}$ is an irreducible (non-explosive) CTMC with infinitesimal generator (rates matrix) $Q = (q_{i,j})$, and $P(t) = (P_{i,j}(t)) = e^{Qt}$, $t \ge 0$.
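For a concrete feel for the transition function $P(t) = e^{Qt}$ before working the parts below, consider a two-state generator (entries invented for illustration). It admits a well-known closed form that can be compared against a truncated matrix-exponential series; the rows of $P(t)$ sum to 1, and Chapman–Kolmogorov gives $P(2) = P(1)P(1)$.

```python
import math

a, b = 2.0, 3.0                  # invented rates: q_{0,1} = a, q_{1,0} = b
Q = [[-a, a], [b, -b]]

def matmul(A, B):
    """2x2 matrix product."""
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def exp_series(Q, t, terms=60):
    """Truncated series sum_k (Qt)^k / k! for e^{Qt}."""
    result = [[1.0, 0.0], [0.0, 1.0]]
    term = [[1.0, 0.0], [0.0, 1.0]]
    for k in range(1, terms):
        term = matmul(term, [[q * t / k for q in row] for row in Q])
        result = [[result[i][j] + term[i][j] for j in range(2)]
                  for i in range(2)]
    return result

def exp_closed(t):
    """Closed form for the 2-state chain (eigenvalues 0 and -(a+b))."""
    e, s = math.exp(-(a + b) * t), a + b
    return [[(b + a * e) / s, (a - a * e) / s],
            [(b - b * e) / s, (a + b * e) / s]]

P1 = exp_series(Q, 1.0)
print([round(sum(row), 6) for row in P1])   # each row sums to 1
P2 = exp_series(Q, 2.0)                     # Chapman-Kolmogorov check data
P11 = matmul(P1, P1)
```

The closed form and the series agree to machine precision, and $P(2) = P(1)^2$ as the semigroup property requires.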
(a) Argue that $Z_n = X(n)$, $n \ge 0$, is an irreducible discrete-time Markov chain with transition matrix $\tilde P = (\tilde P_{i,j})$ given by $\tilde P_{i,j} = P_{i,j}(1)$; $\tilde P = e^{Q}$.

SOLUTION: It is Markov because
\[
P(X(n+1) = j \mid X(n) = i, X(n-1) = i_{n-1}, \ldots, X(0) = i_0) = P(X(n+1) = j \mid X(n) = i)
\]
via the Markov property assumed on $\{X(t)\}$: given the present state $X(n)$, the future $X(n+1)$ is independent of all of the past $\{X(u) : u < n\}$. Moreover,
\[
P(X(n+1) = j \mid X(n) = i) = P(X(1) = j \mid X(0) = i) = P_{i,j}(1); \quad \tilde P = e^{Q}.
\]
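A simulation sketch of part (a) (with invented rates): sampling a two-state CTMC at integer times and tallying the empirical transition frequencies out of state 0 should recover $P_{0,1}(1)$, the $(0,1)$ entry of $e^{Q}$, which for this chain has a closed form.

```python
import random, math

random.seed(4)
a, b = 0.6, 0.9      # invented rates: 0 -> 1 at rate a, 1 -> 0 at rate b

def p01(t):
    """Closed-form P_{0,1}(t) for the 2-state chain with generator
    Q = [[-a, a], [b, -b]]."""
    return (a / (a + b)) * (1.0 - math.exp(-(a + b) * t))

def run(state, horizon):
    """Run the 2-state CTMC from `state` for `horizon` time units."""
    t = 0.0
    while True:
        t += random.expovariate(a if state == 0 else b)
        if t > horizon:
            return state
        state = 1 - state        # every jump flips the state

# sample at integer times: Z_n = X(n); estimate tilde-P_{0,1} = P_{0,1}(1)
from_0 = to_1 = 0
z = 0
for _ in range(200_000):
    nxt = run(z, 1.0)
    if z == 0:
        from_0 += 1
        to_1 += (nxt == 1)
    z = nxt
print(round(to_1 / from_0, 3), round(p01(1.0), 3))
```

The empirical frequency and $P_{0,1}(1)$ agree to within Monte Carlo error, as the Markov-property argument above predicts.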
(b) We now generalize (a) to sampling at the times of a Poisson process. Independently of $\{X(t)\}$, let $\{t_n : n \ge 1\}$ be a Poisson process at rate $\lambda$. Let $Z_n = X(t_n)$, $n \ge 1$, $Z_0 = X(0)$. Argue that $\{Z_n : n \ge 0\}$ is an irreducible discrete-time Markov chain with transition matrix $\tilde P = (\tilde P_{i,j})$ given by
\[
\tilde P = [I - (Q/\lambda)]^{-1}.
\]

SOLUTION: By independence, each $t_n$ is a stopping time, and hence (via the strong Markov property) given $Z_n$, $\{X(t_n + t) : t > 0\}$ is again the same CTMC but with initial condition $Z_n$, independent of the past. (This would be so for any independent renewal process used for sampling.) If we let $T_n = t_{n+1} - t_n$ denote the iid interarrival times for the renewal process, then given $Z_n$ we only need the independent (and independent of the past) rv $T_n$ to predict the future value $Z_{n+1}$. $\tilde P_{i,j} = P(Z_{n+1} = j \mid Z_n = i)$ does not depend on $n$ because the $T_n$, $n \ge 0$, are iid. Let $T = t_1 \sim \exp(\lambda)$; it has density $\lambda e^{-\lambda t}$, $t \ge 0$. We know that $P(t) = e^{Qt}$, $t \ge 0$; thus, conditional on $T = t$, the one-step transition matrix is $P(t) = e^{Qt}$. Taking expected values then yields
\[
\begin{aligned}
\tilde P &= E(P(T)) &&(6)\\
&= \int_0^\infty P(t)\,\lambda e^{-\lambda t}\,dt &&(7)\\
&= \int_0^\infty P(u/\lambda)\,e^{-u}\,du &&(8)\\
&= \int_0^\infty P(u/\lambda)\,e^{-uI}\,du &&(9)\\
&= \int_0^\infty e^{Qu/\lambda}\,e^{-uI}\,du &&(10)\\
&= \int_0^\infty e^{-u(I-(Q/\lambda))}\,du &&(11)\\
&= [I-(Q/\lambda)]^{-1}. &&(12)
\end{aligned}
\]
Irreducibility follows since in fact $\tilde P_{i,j} > 0$ for all pairs $(i,j)$, a condition known as strong irreducibility: we know that $P_{i,j}(t) > 0$ for some $t$, and since holding times are continuous (exponential) it follows that $P_{i,j}(t) > 0$ over a time interval $t \in (s_1, s_2)$, $s_1 < s_2$; hence
\[
\tilde P_{i,j} = \int_0^\infty P_{i,j}(t)\,\lambda e^{-\lambda t}\,dt \ge \int_{s_1}^{s_2} P_{i,j}(t)\,\lambda e^{-\lambda t}\,dt > 0.
\]

(c) Prove that $\{X(t)\}$ is positive recurrent if and only if $\{Z_n\}$ is positive recurrent, in which case $\pi = P$; they share the same stationary distribution.

SOLUTION: By irreducibility, it suffices to prove that a probability solution to $\pi = \pi \tilde P$ (for $\{Z_n\}$) exists if and only if a probability solution to $PQ = 0$ (for $\{X(t)\}$) exists, and that $\pi = P$. This is immediate: $\pi = \pi \tilde P$ if and only if $\pi = \pi[I - (Q/\lambda)]^{-1}$, if and only if $\pi[I - (Q/\lambda)] = \pi$, if and only if $\pi Q = 0$.
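The resolvent identity $\tilde P = [I - (Q/\lambda)]^{-1}$ and the fixed-point claim of part (c) can be verified in closed form for a two-state generator (all numbers invented): the exponentially weighted integral of $P(t)$ agrees entry by entry with the matrix inverse, and the $\pi$ solving $\pi Q = 0$ is a fixed point of the sampled chain.

```python
# Two-state generator Q = [[-a, a], [b, -b]]; invented values.
a, b, lam = 2.0, 3.0, 1.5
s = a + b
pi = [b / s, a / s]                 # stationary distribution: pi Q = 0

# Closed-form integral: P(t) = Pi + e^{-st}(I - Pi) row by row, and
# int_0^inf lam e^{-lam t} e^{-st} dt = lam / (lam + s) =: w.
w = lam / (lam + s)
Pt = [[pi[0] + w * (1 - pi[0]), pi[1] - w * pi[1]],
      [pi[0] - w * pi[0],       pi[1] + w * (1 - pi[1])]]

# Direct 2x2 inverse of I - Q/lam
M = [[1 + a / lam, -a / lam], [-b / lam, 1 + b / lam]]
det = M[0][0] * M[1][1] - M[0][1] * M[1][0]
Minv = [[ M[1][1] / det, -M[0][1] / det],
        [-M[1][0] / det,  M[0][0] / det]]

print([[round(x, 6) for x in r] for r in Pt])
print([[round(x, 6) for x in r] for r in Minv])

# part (c): pi is stationary for the sampled chain, pi * tilde-P = pi
row = [pi[0] * Minv[0][j] + pi[1] * Minv[1][j] for j in range(2)]
print([round(x, 6) for x in row])
```

Both matrices print identically, and the last line reproduces $\pi$, illustrating (c) without any simulation.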