Queuing Analysis: Time Reversibility and Burke's Theorem. Hongwei Zhang. http://www.cs.wayne.edu/~hzhang Acknowledgement: this lecture is partially based on the slides of Dr. Yannis A. Korilis.
Outline: Time-Reversal of Markov Chains; Reversibility; Truncating a Reversible Markov Chain; Burke's Theorem; Queues in Tandem
Time-Reversed Markov Chains. {X_n: n = 0, 1, ...} irreducible aperiodic Markov chain with transition probabilities P_ij ≥ 0, Σ_j P_ij = 1, i = 0, 1, ... Unique stationary distribution (π_j > 0) if and only if the global balance equations (GBE) hold: π_j = Σ_i π_i P_ij, j = 0, 1, ... Process in steady state: Pr{X_n = j} = π_j = lim_{n→∞} Pr{X_n = j | X_0 = i}. Either the chain starts at n = -∞, that is {X_n: n = ..., -1, 0, 1, ...}, or the initial state is chosen according to the stationary distribution. How does {X_n} look reversed in time?
Time-Reversed Markov Chains. Define Y_n = X_{τ-n}, for arbitrary τ > 0; then {Y_n} is the reversed process. Proposition 1: {Y_n} is a Markov chain with transition probabilities P*_ij = π_j P_ji / π_i, i, j = 0, 1, ... {Y_n} has the same stationary distribution {π_j} as the forward chain {X_n}. The reversed chain corresponds to the same process, looked at in the reversed-time direction.
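Proposition 1 is easy to check numerically. The sketch below (a hypothetical 3-state chain; the matrix P is illustrative, not from the slides) builds P*_ij = π_j P_ji / π_i and verifies that P* is a proper stochastic matrix with the same stationary distribution:

```python
# Numerically illustrate Proposition 1: the reversed chain has transition
# probabilities P*_ij = pi_j * P_ji / pi_i. The 3-state chain is illustrative.

def stationary_distribution(P, iterations=10_000):
    """Approximate the stationary distribution by power iteration."""
    n = len(P)
    pi = [1.0 / n] * n
    for _ in range(iterations):
        pi = [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]
    return pi

def reversed_chain(P, pi):
    """Transition probabilities of the time-reversed chain."""
    n = len(P)
    return [[pi[j] * P[j][i] / pi[i] for j in range(n)] for i in range(n)]

P = [[0.0, 0.7, 0.3],
     [0.4, 0.1, 0.5],
     [0.6, 0.2, 0.2]]

pi = stationary_distribution(P)
P_star = reversed_chain(P, pi)

# Rows of P* sum to 1, and pi is stationary for P* as well.
row_sums = [sum(row) for row in P_star]
pi_star = stationary_distribution(P_star)
```

Note that the row sums of P* equal 1 precisely because π is stationary for P, which is the first step of the proof on the next slide.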
Time-Reversed Markov Chains. Proof of Proposition 1: with n = τ - m - 1,
P*_ij = P{Y_{m+1} = j | Y_m = i, Y_{m-1} = i_1, ..., Y_{m-k} = i_k}
= P{X_{τ-m-1} = j | X_{τ-m} = i, X_{τ-m+1} = i_1, ..., X_{τ-m+k} = i_k}
= P{X_n = j | X_{n+1} = i, X_{n+2} = i_1, ..., X_{n+k+1} = i_k}
= P{X_n = j, X_{n+1} = i, X_{n+2} = i_1, ..., X_{n+k+1} = i_k} / P{X_{n+1} = i, X_{n+2} = i_1, ..., X_{n+k+1} = i_k}
= [P{X_{n+2} = i_1, ..., X_{n+k+1} = i_k | X_n = j, X_{n+1} = i} P{X_n = j, X_{n+1} = i}] / [P{X_{n+2} = i_1, ..., X_{n+k+1} = i_k | X_{n+1} = i} P{X_{n+1} = i}]
= P{X_n = j, X_{n+1} = i} / P{X_{n+1} = i}   (the two conditional probabilities are equal by the Markov property)
= P{X_{n+1} = i | X_n = j} P{X_n = j} / P{X_{n+1} = i} = π_j P_ji / π_i.
Moreover, {π_j} is stationary for the reversed chain: Σ_i π_i P*_ij = Σ_i π_j P_ji = π_j Σ_i P_ji = π_j.
Outline: Time-Reversal of Markov Chains; Reversibility; Truncating a Reversible Markov Chain; Burke's Theorem; Queues in Tandem
Reversibility. Stochastic process {X(t)} is called reversible if (X(t_1), X(t_2), ..., X(t_n)) and (X(τ-t_1), X(τ-t_2), ..., X(τ-t_n)) have the same probability distribution, for all τ, t_1, ..., t_n. Markov chain {X_n} is reversible if and only if the transition probabilities of the forward and reversed chains are equal, i.e., P_ij = P*_ij, or equivalently π_i P_ij = π_j P_ji, i, j = 0, 1, ... Detailed Balance Equations (DBE) ⇔ Reversibility.
Reversibility: Discrete-Time Chains. Theorem 1: If there exists a set of positive numbers {π_j} that sum up to 1 and satisfy π_i P_ij = π_j P_ji, i, j = 0, 1, ..., then: 1. {π_j} is the unique stationary distribution. 2. The Markov chain is reversible. Example: discrete-time birth-death processes are reversible, since they satisfy the DBE.
Example: Birth-Death Process. One-dimensional Markov chain with transitions only between neighboring states: P_ij = 0 if |i - j| > 1. [State-transition diagram with probabilities P_{n,n+1}, P_{n+1,n} and self-loops P_{nn} omitted.] Detailed Balance Equations (DBE): π_n P_{n,n+1} = π_{n+1} P_{n+1,n}, n = 0, 1, ... Proof: the GBE applied to the cut S = {0, 1, ..., n} give Σ_{j=0}^{n} Σ_{i=n+1}^{∞} π_j P_ji = Σ_{j=0}^{n} Σ_{i=n+1}^{∞} π_i P_ij, which reduces to π_n P_{n,n+1} = π_{n+1} P_{n+1,n}.
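The DBE recursion can be checked directly: solving π_{n+1} = π_n P_{n,n+1}/P_{n+1,n} and normalizing yields a distribution that also satisfies the GBE. A minimal sketch (the truncation level N and the birth/death probabilities are illustrative, not from the slides):

```python
# Sketch: for a discrete-time birth-death chain, the DBE
# pi_n * P(n, n+1) = pi_{n+1} * P(n+1, n) determine pi by a simple recursion.

N = 10                      # truncate the state space to {0, ..., N}
p_up, p_down = 0.3, 0.5     # birth/death probabilities (illustrative)

# Unnormalized DBE solution: pi_{n+1} = pi_n * p_up / p_down.
pi = [1.0]
for n in range(N):
    pi.append(pi[-1] * p_up / p_down)
total = sum(pi)
pi = [x / total for x in pi]

# Build the transition matrix (self-loops absorb the leftover probability)
# and check the global balance equations.
P = [[0.0] * (N + 1) for _ in range(N + 1)]
for n in range(N + 1):
    if n < N:
        P[n][n + 1] = p_up
    if n > 0:
        P[n][n - 1] = p_down
    P[n][n] = 1.0 - sum(P[n])

gbe_error = max(
    abs(pi[j] - sum(pi[i] * P[i][j] for i in range(N + 1)))
    for j in range(N + 1)
)
```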
Time-Reversed Markov Chains (Revisited). Theorem 2: Irreducible Markov chain with transition probabilities P_ij. If there exist: a set of transition probabilities P*_ij, with Σ_j P*_ij = 1, i ≥ 0, and a set of positive numbers {π_j} that sum up to 1, such that π_i P*_ij = π_j P_ji, i, j ≥ 0, then: P*_ij are the transition probabilities of the reversed chain, and {π_j} is the stationary distribution of the forward and the reversed chains. Remark: used to find the stationary distribution by guessing the transition probabilities of the reversed chain, even if the process is not reversible.
Continuous-Time Markov Chains. {X(t): -∞ < t < ∞} irreducible Markov chain with transition rates q_ij, i ≠ j. Unique stationary distribution (p_j > 0) if and only if the GBE hold: p_j Σ_{i≠j} q_ji = Σ_{i≠j} p_i q_ij, j = 0, 1, ... Process in steady state, e.g., started at t = -∞: Pr{X(t) = j} = p_j = lim_{t→∞} Pr{X(t) = j | X(0) = i}. If {π_j} is the stationary distribution of the embedded discrete-time (jump) chain, then p_j = (π_j / ν_j) / Σ_i (π_i / ν_i), where ν_j = Σ_{i≠j} q_ji, j = 0, 1, ...
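The conversion p_j ∝ π_j / ν_j from the embedded jump chain to the continuous-time stationary distribution can be verified on a small example. The 3-state rate matrix below is illustrative, not from the slides:

```python
# Sketch: continuous-time stationary distribution from the embedded chain,
# p_j proportional to pi_j / nu_j, for an illustrative 3-state chain.

q = [[0.0, 2.0, 1.0],
     [3.0, 0.0, 1.0],
     [1.0, 2.0, 0.0]]
n = 3
nu = [sum(q[i]) for i in range(n)]                           # rate out of each state
P = [[q[i][j] / nu[i] for j in range(n)] for i in range(n)]  # embedded jump chain

# Stationary distribution of the jump chain by power iteration.
pi = [1.0 / n] * n
for _ in range(5000):
    pi = [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]

w = [pi[j] / nu[j] for j in range(n)]
p = [x / sum(w) for x in w]

# p solves the continuous-time GBE: p_j * nu_j = sum_i p_i * q_ij.
gbe_error = max(abs(p[j] * nu[j] - sum(p[i] * q[i][j] for i in range(n)))
                for j in range(n))
```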
Reversed Continuous-Time Markov Chains. Reversed chain {Y(t)}, with Y(t) = X(τ - t), for arbitrary τ > 0. Proposition 2: 1. {Y(t)} is a continuous-time Markov chain with transition rates q*_ij = p_j q_ji / p_i, i, j = 0, 1, ..., i ≠ j. 2. {Y(t)} has the same stationary distribution {p_j} as the forward chain. Remark: the transition rate out of state i in the reversed chain is equal to the transition rate out of state i in the forward chain: Σ_{j≠i} q*_ij = Σ_{j≠i} p_j q_ji / p_i = Σ_{j≠i} q_ij = ν_i, i = 0, 1, ... (using the GBE).
Reversibility: Continuous-Time Chains. Markov chain {X(t)} is reversible if and only if the transition rates of the forward and reversed chains are equal, q_ij = q*_ij, or equivalently p_i q_ij = p_j q_ji, i, j = 0, 1, ..., i ≠ j, i.e., Detailed Balance Equations ⇔ Reversibility. Theorem 3: If there exists a set of positive numbers {p_j} that sum up to 1 and satisfy p_i q_ij = p_j q_ji, i, j = 0, 1, ..., i ≠ j, then: 1. {p_j} is the unique stationary distribution. 2. The Markov chain is reversible.
Example: Birth-Death Process. Transitions only between neighboring states: q_{i,i+1} = λ_i, q_{i,i-1} = μ_i, q_ij = 0 for |i - j| > 1. [State-transition diagram with rates λ_0, ..., λ_n and μ_1, ..., μ_{n+1} omitted.] Detailed Balance Equations: λ_n p_n = μ_{n+1} p_{n+1}, n = 0, 1, ... Proof: the GBE applied to the cut S = {0, 1, ..., n} give Σ_{j=0}^{n} Σ_{i=n+1}^{∞} p_j q_ji = Σ_{j=0}^{n} Σ_{i=n+1}^{∞} p_i q_ij, i.e., λ_n p_n = μ_{n+1} p_{n+1}. Examples: M/M/1, M/M/c, M/M/∞.
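For the M/M/1 special case (λ_n = λ, μ_n = μ), the DBE yield the familiar geometric distribution p_n = (1 - ρ)ρ^n. A quick numeric check (the rates are illustrative, not from the slides):

```python
# Sketch: the continuous-time DBE lambda * p_n = mu * p_{n+1} hold term by
# term for the M/M/1 geometric distribution p_n = (1 - rho) * rho**n.

lam, mu = 2.0, 5.0
rho = lam / mu
p = [(1 - rho) * rho**n for n in range(50)]

# Detailed balance holds for every adjacent pair of states.
dbe_error = max(abs(lam * p[n] - mu * p[n + 1]) for n in range(49))
```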
Reversed Continuous-Time Markov Chains (Revisited). Theorem 4: Irreducible continuous-time Markov chain with transition rates q_ij. If there exist: a set of transition rates q*_ij, with Σ_{j≠i} q*_ij = Σ_{j≠i} q_ij for all i ≥ 0, and a set of positive numbers {p_j} that sum up to 1, such that p_i q*_ij = p_j q_ji, i, j ≥ 0, i ≠ j, then: q*_ij are the transition rates of the reversed chain, and {p_j} is the stationary distribution of the forward and the reversed chains. Remark: used to find the stationary distribution by guessing the transition rates of the reversed chain, even if the process is not reversible.
Reversibility: Trees. Theorem 5: Consider an irreducible Markov chain with transition rates that satisfy q_ij > 0 ⇔ q_ji > 0. Form a graph for the chain, where the states are the nodes and, for each q_ij > 0, there is a directed arc i → j. Then, if the graph is a tree, i.e., contains no loops, the Markov chain is reversible. [Example: a tree-shaped transition graph on states 0 to 7, with rate pairs such as q_01/q_10, q_12/q_21, q_16/q_61, q_23/q_32, q_67/q_76, omitted.] Remarks: this is a sufficient condition for reversibility; it generalizes the one-dimensional birth-death process.
Kolmogorov's Criterion (Discrete Chain). The detailed balance equations determine whether a Markov chain is reversible based on the stationary distribution and the transition probabilities. We should be able to derive a reversibility criterion based only on the transition probabilities! Theorem 6: A discrete-time Markov chain is reversible if and only if
P_{i_1 i_2} P_{i_2 i_3} ··· P_{i_{n-1} i_n} P_{i_n i_1} = P_{i_1 i_n} P_{i_n i_{n-1}} ··· P_{i_3 i_2} P_{i_2 i_1}
for any finite sequence of states i_1, i_2, ..., i_n and any n. Intuition: the probability of traversing any loop i_1 → i_2 → ... → i_n → i_1 is equal to the probability of traversing the same loop in the reverse direction i_1 → i_n → ... → i_2 → i_1.
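The criterion can be tested by brute force on small chains, comparing forward and backward products over all loops up to a given length. The two example matrices below are illustrative: a tridiagonal (birth-death) chain, which is reversible, and a chain with a one-way cycle, which is not:

```python
# Sketch: brute-force check of Kolmogorov's criterion on all loops up to a
# fixed length (feasible only for tiny chains; matrices are illustrative).

from itertools import product

def kolmogorov_reversible(P, max_len=4):
    n = len(P)
    for L in range(2, max_len + 1):
        for states in product(range(n), repeat=L):
            fwd, bwd = 1.0, 1.0
            for k in range(L):
                fwd *= P[states[k]][states[(k + 1) % L]]   # loop forward
                bwd *= P[states[(k + 1) % L]][states[k]]   # same loop, reversed
            if abs(fwd - bwd) > 1e-12:
                return False
    return True

# A birth-death chain (tridiagonal P) passes the criterion ...
P_bd = [[0.5, 0.5, 0.0],
        [0.3, 0.2, 0.5],
        [0.0, 0.4, 0.6]]

# ... while a chain biased around the cycle 0 -> 1 -> 2 -> 0 fails it.
P_cycle = [[0.1, 0.9, 0.0],
           [0.0, 0.1, 0.9],
           [0.9, 0.0, 0.1]]
```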
Kolmogorov's Criterion (Continuous Chain). The detailed balance equations determine whether a Markov chain is reversible based on the stationary distribution and the transition rates. We should be able to derive a reversibility criterion based only on the transition rates! Theorem 7: A continuous-time Markov chain is reversible if and only if
q_{i_1 i_2} q_{i_2 i_3} ··· q_{i_{n-1} i_n} q_{i_n i_1} = q_{i_1 i_n} q_{i_n i_{n-1}} ··· q_{i_3 i_2} q_{i_2 i_1}
for any finite sequence of states i_1, i_2, ..., i_n and any n. Intuition: the product of transition rates along any loop i_1 → i_2 → ... → i_n → i_1 is equal to the product of transition rates along the same loop traversed in the reverse direction i_1 → i_n → ... → i_2 → i_1.
Kolmogorov's Criterion (Proof). Proof of Theorem 6. Necessity: if the chain is reversible, the DBE hold:
π_{i_1} P_{i_1 i_2} = π_{i_2} P_{i_2 i_1}
π_{i_2} P_{i_2 i_3} = π_{i_3} P_{i_3 i_2}
...
π_{i_n} P_{i_n i_1} = π_{i_1} P_{i_1 i_n}
Multiplying these equations side by side and canceling the positive π's gives
P_{i_1 i_2} P_{i_2 i_3} ··· P_{i_{n-1} i_n} P_{i_n i_1} = P_{i_1 i_n} P_{i_n i_{n-1}} ··· P_{i_2 i_1}.
Sufficiency: fixing two states i_1 = i and i_n = j, and summing the criterion over all intermediate states i_2, ..., i_{n-1}, we have
P^{(n-1)}_{ij} P_{ji} = P_{ij} P^{(n-1)}_{ji},
where P^{(n-1)}_{ij} is the (n-1)-step transition probability. Taking the limit n → ∞,
lim P^{(n-1)}_{ij} P_{ji} = P_{ij} lim P^{(n-1)}_{ji} ⇒ π_j P_{ji} = P_{ij} π_i.
Example: M/M/2 Queue with Heterogeneous Servers. M/M/2 queue with servers A and B, with service rates μ_A and μ_B respectively. When the system is empty, arrivals go to A with probability α and to B with probability 1 - α; otherwise, the head of the queue takes the first free server. We need to keep track of which server is busy when there is 1 customer in the system, so denote the two possible one-customer states by 1A and 1B. [State-transition diagram with states 0, 1A, 1B, 2, 3, ... omitted.] Reversibility: by Kolmogorov's criterion we only need to check the loop 0 → 1A → 2 → 1B → 0:
q_{0,1A} q_{1A,2} q_{2,1B} q_{1B,0} = αλ · λ · μ_A · μ_B
q_{0,1B} q_{1B,2} q_{2,1A} q_{1A,0} = (1-α)λ · λ · μ_B · μ_A
Reversible if and only if α = 1/2. What happens when μ_A = μ_B and α ≠ 1/2?
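The single loop check above is a one-liner in code. A quick sketch (the rates λ, μ_A, μ_B are illustrative, not from the slides):

```python
# Sketch: Kolmogorov's criterion for the heterogeneous two-server queue
# reduces to comparing the rate products around the loop 0 -> 1A -> 2 -> 1B -> 0.

def loop_products(lam, mu_a, mu_b, alpha):
    forward = (alpha * lam) * lam * mu_a * mu_b         # 0 -> 1A -> 2 -> 1B -> 0
    backward = ((1 - alpha) * lam) * lam * mu_b * mu_a  # 0 -> 1B -> 2 -> 1A -> 0
    return forward, backward

f_half, b_half = loop_products(lam=1.0, mu_a=2.0, mu_b=3.0, alpha=0.5)
f_biased, b_biased = loop_products(lam=1.0, mu_a=2.0, mu_b=3.0, alpha=0.3)
```

With α = 1/2 the two products match (reversible); with any other α they differ.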
Example: M/M/2 Queue with Heterogeneous Servers (contd.) Balance equations (using cuts S_1 = {0}, S_2 = {0, 1A, 1B}, and S_n for n ≥ 2):
λ p_0 = μ_A p_{1A} + μ_B p_{1B}
(λ + μ_A) p_{1A} = αλ p_0 + μ_B p_2
(λ + μ_B) p_{1B} = (1-α)λ p_0 + μ_A p_2
(λ + μ_A + μ_B) p_2 = λ (p_{1A} + p_{1B}) + (μ_A + μ_B) p_3
p_n = (λ / (μ_A + μ_B))^{n-2} p_2, n = 2, 3, ...
Solving:
p_{1A} = λ [λ + α(μ_A + μ_B)] / [μ_A (2λ + μ_A + μ_B)] p_0
p_{1B} = λ [λ + (1-α)(μ_A + μ_B)] / [μ_B (2λ + μ_A + μ_B)] p_0
p_2 = λ (p_{1A} + p_{1B}) / (μ_A + μ_B)
and p_0 is determined by the normalization p_0 + p_{1A} + p_{1B} + Σ_{n≥2} p_n = 1.
Outline: Time-Reversal of Markov Chains; Reversibility; Truncating a Reversible Markov Chain; Burke's Theorem; Queues in Tandem
Multidimensional Markov Chains. Theorem 8: Let {X_1(t)} and {X_2(t)} be independent Markov chains, with each {X_i(t)} reversible. Then the vector-valued process {X(t)}, with X(t) = (X_1(t), X_2(t)), is a Markov chain and is reversible. Multidimensional chains arise when, e.g.: studying a queueing system with two classes of customers, each having its own stochastic properties, and tracking the number of customers from each class; or studying the joint evolution of two queueing systems, tracking the number of customers in each system.
Example: Two Independent M/M/1 Queues. Two independent M/M/1 queues; the arrival and service rates at queue i are λ_i and μ_i respectively. Assume ρ_i = λ_i / μ_i < 1. {(N_1(t), N_2(t))} is a Markov chain, and the probability of n_1 customers at queue 1 and n_2 at queue 2, at steady state, is the product-form distribution
p(n_1, n_2) = p_1(n_1) p_2(n_2) = (1-ρ_1) ρ_1^{n_1} (1-ρ_2) ρ_2^{n_2}.
This generalizes to any number K of independent queues (M/M/1, M/M/c, or M/M/∞): if p_i(n_i) is the stationary distribution of queue i, then p(n_1, n_2, ..., n_K) = p_1(n_1) p_2(n_2) ··· p_K(n_K).
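The product form is easy to evaluate and sanity-check numerically. A minimal sketch (the loads ρ_1, ρ_2 are illustrative, not from the slides):

```python
# Sketch: joint stationary distribution of two independent M/M/1 queues is
# the product of two geometric marginals.

rho1, rho2 = 0.5, 0.25

def p_joint(n1, n2):
    return (1 - rho1) * rho1**n1 * (1 - rho2) * rho2**n2

# Summing over a large truncated grid should give (essentially) 1.
total = sum(p_joint(n1, n2) for n1 in range(200) for n2 in range(200))
```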
Example (contd.) Stationary distribution: p(n_1, n_2) = (1-ρ_1) ρ_1^{n_1} (1-ρ_2) ρ_2^{n_2}. Detailed Balance Equations:
μ_1 p(n_1 + 1, n_2) = λ_1 p(n_1, n_2)
μ_2 p(n_1, n_2 + 1) = λ_2 p(n_1, n_2)
Verify that the Markov chain is reversible, e.g., via the Kolmogorov criterion. [Two-dimensional state-transition diagram over states (n_1, n_2), with horizontal rates λ_1, μ_1 and vertical rates λ_2, μ_2, omitted.]
Truncation of a Reversible Markov Chain. Theorem 9: {X(t)} reversible Markov process with state space S and stationary distribution {p_j: j ∈ S}. Truncate it to a set E ⊂ S, such that the resulting chain {Y(t)} is irreducible. Then {Y(t)} is reversible and has stationary distribution
p̃_j = p_j / Σ_{k∈E} p_k, j ∈ E.
Remark: this is the conditional probability that, in steady state, the original process is at state j, given that it is somewhere in E. Proof: verify that the DBE hold: p̃_j q_ji = p̃_i q_ij, i, j ∈ E, which follows from p_j q_ji = p_i q_ij; moreover Σ_{j∈E} p̃_j = Σ_{j∈E} p_j / Σ_{k∈E} p_k = 1.
Example: Two Queues with Joint Buffer. The two independent M/M/1 queues of the previous example now share a common buffer of size B: an arrival that finds B customers waiting is blocked. The state space is restricted to E = {(n_1, n_2): (n_1 - 1)^+ + (n_2 - 1)^+ ≤ B}. Distribution of the truncated chain:
p(n_1, n_2) = ρ_1^{n_1} ρ_2^{n_2} p(0,0), (n_1, n_2) ∈ E
Normalizing: p(0,0) = [Σ_{(n_1,n_2)∈E} ρ_1^{n_1} ρ_2^{n_2}]^{-1}.
The theorem specifies the joint distribution up to the normalization constant; calculating the normalization constant is often tedious. [State diagram omitted.]
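The "tedious" normalization is mechanical in code: enumerate the truncated state space E, sum the unnormalized product-form weights, and divide. A sketch (loads and buffer size are illustrative, not from the slides):

```python
# Sketch: Theorem 9 applied to two M/M/1 queues sharing a waiting buffer of
# size B; the truncated chain keeps the product form up to normalization.

rho1, rho2, B = 0.5, 0.25, 2

def in_E(n1, n2):
    """Shared-buffer constraint: customers beyond the one in service wait."""
    return max(n1 - 1, 0) + max(n2 - 1, 0) <= B

states = [(n1, n2) for n1 in range(B + 2) for n2 in range(B + 2) if in_E(n1, n2)]
weights = {s: rho1 ** s[0] * rho2 ** s[1] for s in states}
G = sum(weights.values())                      # normalization constant
p = {s: w / G for s, w in weights.items()}
```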
Outline: Time-Reversal of Markov Chains; Reversibility; Truncating a Reversible Markov Chain; Burke's Theorem; Queues in Tandem
Birth-Death Process. {X(t)} birth-death process with stationary distribution {p_j}. Arrival epochs: points of increase of {X(t)}. Departure epochs: points of decrease of {X(t)}. {X(t)} completely determines the corresponding arrival and departure processes. [Sample-path figure showing arrival and departure epochs omitted.]
Forward & Reversed Chains of M/M/* Queues. Poisson arrival process: λ_j = λ for all j; such a birth-death process is called a (λ, μ)-process. Examples: M/M/1, M/M/c, M/M/∞ queues. Poisson arrivals satisfy LAA (lack of anticipation): for any time t, future arrivals are independent of {X(s): s ≤ t}. A (λ, μ)-process at steady state is reversible: the forward and reversed chains are stochastically identical.
=> The arrival processes of the forward and reversed chains are stochastically identical.
=> The arrival process of the reversed chain is Poisson with rate λ; moreover, the arrival epochs of the reversed chain are the departure epochs of the forward chain.
=> The departure process of the forward chain is Poisson with rate λ.
Forward & Reversed Chains of M/M/* Queues (contd.) [Sample-path figure of the forward and reversed chains omitted.] Reversed chain: arrivals after time t are independent of the chain history up to time t (LAA). => Forward chain: departures prior to time t and the future of the chain {X(s): s ≥ t} are independent.
Burke's Theorem. Theorem 10: Consider an M/M/1, M/M/c, or M/M/∞ system with arrival rate λ. Suppose that the system starts at steady state. Then: 1. The departure process is Poisson with rate λ. 2. At each time t, the number of customers in the system is independent of the departure times prior to t. This is a fundamental result for the study of networks of M/M/* queues, where the output process of one queue is the input process of another.
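Part 1 of the theorem can be illustrated with a small event-driven simulation: over a long horizon, the departure rate of a stable M/M/1 queue matches the arrival rate λ. A sketch (rates, horizon, and seed are illustrative, not from the slides; this checks only the long-run rate, not the full Poisson property):

```python
# Sketch: event-driven M/M/1 simulation; the long-run departure rate should
# be close to the arrival rate lambda, as Burke's theorem predicts.

import random

def simulate_mm1(lam, mu, horizon, seed=1):
    rng = random.Random(seed)
    t, n = 0.0, 0                      # current time, customers in system
    departures = []
    next_arrival = rng.expovariate(lam)
    next_departure = float("inf")
    while t < horizon:
        if next_arrival < next_departure:
            t = next_arrival
            n += 1
            if n == 1:                 # server was idle: start a service
                next_departure = t + rng.expovariate(mu)
            next_arrival = t + rng.expovariate(lam)
        else:
            t = next_departure
            n -= 1
            departures.append(t)
            next_departure = t + rng.expovariate(mu) if n > 0 else float("inf")
    return departures

deps = simulate_mm1(lam=1.0, mu=2.0, horizon=20_000.0)
rate = len(deps) / 20_000.0            # empirical departure rate
```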
Outline: Time-Reversal of Markov Chains; Reversibility; Truncating a Reversible Markov Chain; Burke's Theorem; Queues in Tandem
Single-Server Queues in Tandem. [Diagram: Poisson(λ) → Station 1 → Station 2 → out.] Customers arrive at queue 1 according to a Poisson process with rate λ. Service times at queue i are exponential with mean 1/μ_i; assume that the service times of a customer in the two queues are independent, and that ρ_i = λ/μ_i < 1. What is the joint stationary distribution of N_1 and N_2, the numbers of customers in each queue? Result: in steady state the queues are independent and
p(n_1, n_2) = (1-ρ_1) ρ_1^{n_1} (1-ρ_2) ρ_2^{n_2} = p_1(n_1) p_2(n_2).
Single-Server Queues in Tandem (contd.) Q_1 is an M/M/1 queue; at steady state its departure process is Poisson with rate λ, thus Q_2 is also M/M/1. Marginal stationary distributions:
p_1(n_1) = (1-ρ_1) ρ_1^{n_1}, n_1 = 0, 1, ...
p_2(n_2) = (1-ρ_2) ρ_2^{n_2}, n_2 = 0, 1, ...
To complete the proof, establish independence at steady state. For Q_1 at steady state: at time t, N_1(t) is independent of the departures prior to t, which are exactly the arrivals at Q_2 up to t, and these determine N_2(t). Thus N_1(t) and N_2(t) are independent:
P{N_1(t) = n_1, N_2(t) = n_2} = P{N_1(t) = n_1} P{N_2(t) = n_2} = p_1(n_1) P{N_2(t) = n_2}.
Letting t → ∞, the joint stationary distribution is
p(n_1, n_2) = p_1(n_1) p_2(n_2) = (1-ρ_1) ρ_1^{n_1} (1-ρ_2) ρ_2^{n_2}.
Queues in Tandem. Theorem 11: Consider a network of K single-server queues in tandem, where service times at queue i are exponential with rate μ_i, independent of the service times at any queue j ≠ i, and arrivals at the first queue are Poisson with rate λ. The stationary distribution of the network is
p(n_1, ..., n_K) = Π_{i=1}^{K} (1-ρ_i) ρ_i^{n_i}, n_i = 0, 1, ...; i = 1, ..., K.
At steady state the queues are independent; the distribution of queue i is that of an isolated M/M/1 queue with arrival rate λ and service rate μ_i:
p_i(n_i) = (1-ρ_i) ρ_i^{n_i}, n_i = 0, 1, ...
Are the queues independent if not in steady state? Are the stochastic processes {N_1(t)} and {N_2(t)} independent?
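Theorem 11's K-queue product form can be checked by summing the joint pmf over a truncated grid. A sketch (the arrival rate and the three service rates are illustrative, not from the slides):

```python
# Sketch: product-form joint distribution for K = 3 tandem M/M/1 queues;
# summing over a large truncated grid should give (essentially) 1.

from itertools import product

lam = 1.0
mus = [2.0, 4.0, 5.0]
rhos = [lam / mu for mu in mus]   # each must be < 1 for stability

def p_joint(ns):
    out = 1.0
    for n_i, rho_i in zip(ns, rhos):
        out *= (1 - rho_i) * rho_i ** n_i
    return out

total = sum(p_joint(ns) for ns in product(range(60), repeat=3))
```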
Queues in Tandem: State-Dependent Service Rates. Theorem 12: Consider a network of K queues in tandem, where service times at queue i are exponential with rate μ_i(n_i) when there are n_i customers in the queue, independent of the service times at any queue j ≠ i, and arrivals at the first queue are Poisson with rate λ. The stationary distribution of the network is
p(n_1, ..., n_K) = Π_{i=1}^{K} p_i(n_i), n_i = 0, 1, ...; i = 1, ..., K,
where {p_i(n_i)} is the stationary distribution of queue i in isolation with Poisson arrivals of rate λ. Examples: ./M/c and ./M/∞ queues. If queue i is ./M/∞, then
p_i(n_i) = e^{-λ/μ_i} (λ/μ_i)^{n_i} / n_i!, n_i = 0, 1, ...
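The ./M/∞ marginal above is a Poisson distribution with mean λ/μ_i, which a few lines of Python confirm (the rates are illustrative, not from the slides):

```python
# Sketch: the ./M/infinity marginal in Theorem 12 is Poisson with mean
# a = lambda / mu_i; check that the pmf sums to 1 and has mean a.

from math import exp, factorial

lam, mu_i = 3.0, 1.5
a = lam / mu_i                      # offered load, here a = 2.0

def p_inf(n):
    return exp(-a) * a ** n / factorial(n)

total = sum(p_inf(n) for n in range(100))
mean = sum(n * p_inf(n) for n in range(100))
```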
Summary: Time-Reversal of Markov Chains; Reversibility; Truncating a Reversible Markov Chain; Multidimensional Markov Chains; Burke's Theorem; Queues in Tandem