Lecture 10: Networks of queues

In this lecture we shall finally get around to considering what happens when queues are part of networks (which, after all, is the topic of the course). Firstly we shall need an important result about time reversibility and Markov chains.

Burke's Theorem

Consider an ergodic Markov chain $\{X_j : j \in \mathbb{N}\}$ which is Markov$(\lambda, P)$, running backwards (or, if you prefer, consider the reverse iterates of the Markov chain). This asks questions about probabilities of the form
\begin{align*}
P[X_n = j \mid X_{n+1} = i, X_{n+2} = i_2, \ldots, X_{n+k} = i_k]
&= \frac{P[X_n = j, X_{n+1} = i, X_{n+2} = i_2, \ldots, X_{n+k} = i_k]}{P[X_{n+1} = i, X_{n+2} = i_2, \ldots, X_{n+k} = i_k]} \\
&= \frac{P[X_n = j, X_{n+1} = i]\, P[X_{n+2} = i_2, \ldots, X_{n+k} = i_k \mid X_n = j, X_{n+1} = i]}{P[X_{n+1} = i]\, P[X_{n+2} = i_2, \ldots, X_{n+k} = i_k \mid X_{n+1} = i]} \\
&= \frac{P[X_n = j, X_{n+1} = i]}{P[X_{n+1} = i]} \\
&= \frac{P[X_n = j]\, P[X_{n+1} = i \mid X_n = j]}{P[X_{n+1} = i]} \\
&= \frac{\pi_j p_{ji}}{\pi_i},
\end{align*}
where the cancellation of the conditional terms in the third line follows from the Markov property and, for the last equality, the assumption has been made that the chain has reached its steady-state equilibrium probabilities. Denote by $p^*_{ij}$ the reversed transition probability
\[
p^*_{ij} = P[X_n = j \mid X_{n+1} = i] = \frac{\pi_j p_{ji}}{\pi_i}. \tag{1}
\]
The reversed chain is ergodic and has the same equilibrium probabilities (check that $\pi_j = \sum_{i=0}^{\infty} \pi_i p^*_{ij}$). Intuitively, think of a film of the Markov chain in action being shown backwards.

A chain is called time reversible if $p^*_{ij} = p_{ij}$ for all $i$ and $j$. Clearly, from equation (1), this occurs if and only if
\[
p_{ij}\pi_i = p_{ji}\pi_j \quad \text{for all } i, j.
\]
Note that this means that Birth-Death processes are time reversible (the proof of this is left as an exercise for the student). Therefore all the queues which can be modelled as such are themselves time reversible; this includes the M/M/1, the M/M/m and the M/M/$\infty$ queues. Therefore, any queue which can be represented as a Birth-Death process can be considered time reversible once it has reached the steady state. An important point here is that the departure process of the forward system is the arrival process of the reversed system. This leads us to Burke's Theorem.

Theorem 1. For an M/M/1, M/M/m or M/M/$\infty$ queue in the steady state:
(1) The departure process is Poisson with rate $\lambda$.
(2) At time $t$ the number of customers in the system is independent of the sequence of departure times prior to $t$.
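Before the proof, here is a minimal simulation sketch (not part of the original notes; the rates and run length are purely illustrative) of part (1): simulate an M/M/1 queue event by event and check that the inter-departure times have the mean and variance of an Exp($\lambda$) distribution.

```python
import random

def mm1_departures(lam, mu, n_events=200_000, seed=1):
    """Simulate an M/M/1 queue event by event and return the departure times."""
    rng = random.Random(seed)
    t, n = 0.0, 0                                  # current time, customers in system
    next_arrival = rng.expovariate(lam)
    next_departure = float("inf")
    departures = []
    for _ in range(n_events):
        if next_arrival < next_departure:          # next event is an arrival
            t = next_arrival
            n += 1
            next_arrival = t + rng.expovariate(lam)
            if n == 1:                             # server was idle: start a service
                next_departure = t + rng.expovariate(mu)
        else:                                      # next event is a departure
            t = next_departure
            n -= 1
            departures.append(t)
            next_departure = t + rng.expovariate(mu) if n > 0 else float("inf")
    return departures

lam, mu = 2.0, 5.0                                 # illustrative rates (rho = 0.4)
deps = mm1_departures(lam, mu)
gaps = [b - a for a, b in zip(deps[10_000:], deps[10_001:])]   # drop a warm-up period
mean = sum(gaps) / len(gaps)
var = sum((g - mean) ** 2 for g in gaps) / len(gaps)
# Burke part (1): the gaps should look Exp(lambda), i.e. mean ~ 1/lam and var ~ 1/lam^2.
print(f"mean gap {mean:.3f} (1/lambda = {1/lam:.3f}), variance {var:.3f} (1/lambda^2 = {1/lam**2:.3f})")
```

A fuller check would also verify that successive gaps are uncorrelated; matching the exponential moments alone does not establish the Poisson property.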

Proof. Part (1) follows immediately from the fact that the arrival process is Poisson with rate $\lambda$ and the time reverse of this is also Poisson with rate $\lambda$. Part (2) follows from the fact that the departures prior to $t$ in the forward system form the same process as the arrivals after $t$ in the reversed system, and in a system with Poisson arrivals the number in the queue at time $t$ is clearly independent of the arrivals after that point.

Note that these results are more than a little counter-intuitive. In (1) the exit rate is not a function of the service rate. In (2) a sequence of closely spaced exits does not mean that the system is at all likely to have a large queue at the moment (though it may imply that the system did have a large queue until recently).

A Useful Approach

Given an ergodic Markov chain with transition probabilities $p_{ij}$ and a vector $\pi = (\pi_0, \pi_1, \ldots)$ such that $\pi_i > 0$ and $\sum_{i=0}^{\infty} \pi_i = 1$, then if the scalars
\[
p^*_{ij} = \frac{\pi_j p_{ji}}{\pi_i} \tag{2}
\]
form a stochastic matrix, that is
\[
\sum_{j=0}^{\infty} p^*_{ij} = 1 \quad \text{for all } i = 0, 1, \ldots, \tag{3}
\]
then $\pi$ is the equilibrium distribution and the $p^*_{ij}$ are the reversed transition probabilities. The proof of this is left as an exercise for the student. Note that this property holds even if the chain is not time reversible. Therefore, if insight (or a lucky guess) provides such a $\pi$ and $p^*_{ij}$, then proving that equations (2) and (3) hold proves that the invariant distribution and the reversed transition probabilities have been found. (A small numerical sketch of this check is given below, after the Jackson network definitions.)

Jackson's Theorem

Consider a network of $K$ queues, each with a single server whose service times are exponentially distributed. These queues are connected in a network so that, on exiting one queue, customers may leave the network or join another queue. Customers enter the network at any queue as a Poisson process. On exiting any queue, they either move to a new queue (possibly rejoining the same one) or leave the network entirely, at random.

Definitions:

$P_{ij}$ is the probability that a customer leaving queue $i$ goes on to queue $j$.
$r_j$ is the arrival rate of new customers at queue $j$, as a Poisson process.
$\lambda_j$ is the total arrival rate at queue $j$ (all customers).
$\mu_j$ is the service rate of queue $j$.
$\rho_j = \lambda_j/\mu_j$ is the utilisation of queue $j$; this is assumed by hypothesis to be less than 1 for all queues.

It must be that
\[
\sum_{j=1}^{K} P_{ij} \le 1,
\]
and this is assumed to be a strict inequality ($< 1$) for at least one queue, so that there is at least one place where customers leave the system forever.
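As promised above, here is a minimal numerical sketch of the "useful approach" (not part of the original notes; the chain and the numbers are invented purely for illustration). We take a small chain that is not time reversible, guess the equilibrium vector $\pi$, build the candidate reversed probabilities from equation (2) and check the stochasticity condition (3).

```python
import numpy as np

# A small ergodic chain that is NOT time reversible: a biased cycle on 3 states.
# Each row sums to 1; each column also sums to 1 (doubly stochastic), which
# motivates the "lucky guess" that pi is uniform.
P = np.array([[0.0, 0.7, 0.3],
              [0.3, 0.0, 0.7],
              [0.7, 0.3, 0.0]])
pi = np.array([1/3, 1/3, 1/3])                     # the guessed equilibrium vector

# Candidate reversed probabilities from equation (2): p*_ij = pi_j * p_ji / pi_i.
P_star = (P.T * pi) / pi[:, None]

# Equation (3): the candidate reversed matrix must be stochastic.
print("rows of P* sum to 1:", np.allclose(P_star.sum(axis=1), 1.0))   # True

# By the result above this already proves pi is invariant; confirm directly.
print("pi P == pi:", np.allclose(pi @ P, pi))                          # True

# The chain itself is not time reversible: P* differs from P.
print("time reversible (P* == P):", np.allclose(P_star, P))            # False
```

Here the reversed chain is just the cycle run the other way round ($P^* = P^{\top}$ when $\pi$ is uniform), which matches the film-run-backwards intuition.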

It is also assumed that each customer entering the system can reach a queue from which there is a positive probability of leaving the system (so each customer will eventually leave the system almost surely). The probability that a customer leaving queue $i$ leaves the system forever is given by
\[
P[\text{customer leaving queue } i \text{ leaves the system}] = 1 - \sum_{j=1}^{K} P_{ij}.
\]

The state of the system is represented by a vector $n = (n_1, \ldots, n_K)$ where $n_i \in \mathbb{Z}_+$ is the number of customers in queue $i$. The system can now be viewed as a Markov chain with states in $\mathbb{Z}_+^K$. A special notation will be used to indicate transitions between states. State $n(j^+)$ is the state corresponding to a new arrival at queue $j$; that is,
\[
n(j^+) = (n_1, \ldots, n_j + 1, \ldots, n_K).
\]
The probability of such a transition is given by
\[
p_{n\,n(j^+)} = r_j. \tag{4}
\]
State $n(j^-)$ similarly corresponds to a customer leaving the system completely from queue $j$; that is,
\[
n(j^-) = (n_1, \ldots, n_j - 1, \ldots, n_K).
\]
The probability of such a transition is given by
\[
p_{n\,n(j^-)} = \mu_j \Big(1 - \sum_i P_{ji}\Big). \tag{5}
\]
Finally, state $n(i^+, j^-)$ corresponds to a customer leaving queue $j$ and joining queue $i$; that is,
\[
n(i^+, j^-) = (n_1, \ldots, n_i + 1, \ldots, n_j - 1, \ldots, n_K).
\]
The probability of such a transition is given by
\[
p_{n\,n(i^+,j^-)} = \mu_j P_{ji}. \tag{6}
\]

Let $\pi(n) = \pi(n_1, \ldots, n_K)$ be the equilibrium probability of the state $n$. By finding $\pi(n)$, the probability of any given distribution of customers over the queues is known and the expected queue lengths can be determined. Jackson's theorem is the remarkable claim that, under the conditions given, the queues can be treated as $K$ independent M/M/1 queues.

Theorem 2. Assuming that $\rho_j < 1$ for all $j$, then for $n_1, \ldots, n_K \in \mathbb{Z}_+$:
\[
\pi(n) = \pi_1(n_1)\,\pi_2(n_2)\cdots\pi_K(n_K), \tag{7}
\]
where
\[
\pi_j(n_j) = \rho_j^{n_j}(1 - \rho_j) \quad \text{for all } n_j \in \mathbb{Z}_+. \tag{8}
\]
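Before the proof, a quick numerical sanity check of the product form (not from the original notes; the utilisations are simply assumed here rather than derived from a routing matrix): truncating each queue length, the probabilities (7)-(8) should sum to essentially 1 and give the familiar M/M/1 means $\rho_j/(1-\rho_j)$.

```python
# Sanity check of the product form (7)-(8) for K = 2 queues with assumed utilisations.
rho = [0.5, 0.8]                          # illustrative values, each < 1
N = 400                                   # truncation of each queue length

def pi_j(rho_j, n):                       # equation (8): geometric marginal
    return (1 - rho_j) * rho_j ** n

total = sum(pi_j(rho[0], n1) * pi_j(rho[1], n2)        # equation (7): product form
            for n1 in range(N) for n2 in range(N))
mean1 = sum(n1 * pi_j(rho[0], n1) for n1 in range(N))
mean2 = sum(n2 * pi_j(rho[1], n2) for n2 in range(N))

print(f"total probability ~ {total:.6f}")               # ~ 1
print(f"mean queue lengths ~ {mean1:.3f}, {mean2:.3f}")  # rho/(1-rho): 1.0 and 4.0
```

Of course the real content of the theorem is the independence across the queues; the proof below establishes it.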

Proof. Assume without loss of generality that $\lambda_j > 0$ for all $j$. (If $\lambda_j = 0$ then $\pi_j(0) = 1$ and $\pi_j(n_j) = 0$ for $n_j > 0$, so queue $j$ can simply be removed from equation (7).) Take two states, $n$ and $n'$. The probability of a transition between them is $p_{nn'}$ and the reversed probabilities are denoted by $p^*_{nn'}$. (Note that the chain is not in general reversible.) Now, from equations (2) and (3), the theorem is proved if, given equations (7) and (8), the probabilities given by
\[
p^*_{nn'} = \frac{\pi(n')\, p_{n'n}}{\pi(n)} \tag{9}
\]
satisfy, for all $n$,
\[
\sum_{n'} p^*_{nn'} = 1. \tag{10}
\]
Note from equation (9) that $p^*_{nn} = p_{nn}$, and also that $\sum_{n'} p_{nn'} = 1$. Therefore, equation (10) is equivalent to showing that, for all $n$,
\[
\sum_{n' \ne n} p^*_{nn'} = \sum_{n' \ne n} p_{nn'}. \tag{11}
\]
The following transitions have already been defined:
\[
p_{n\,n(j^+)} = r_j, \qquad p_{n\,n(j^-)} = \mu_j\Big(1 - \sum_i P_{ji}\Big), \qquad p_{n\,n(i^+,j^-)} = \mu_j P_{ji}.
\]
For all other $n' \ne n$, $p_{nn'} = 0$. Therefore, for the forward system, for all $n$,
\begin{align*}
\sum_{n' \ne n} p_{nn'} &= \sum_{j=1}^{K} p_{n\,n(j^+)} + \sum_{j:\, n_j > 0} p_{n\,n(j^-)} + \sum_{i,\;j:\, n_j > 0} p_{n\,n(i^+,j^-)} \\
&= \sum_{j=1}^{K} r_j + \sum_{j:\, n_j > 0} \mu_j\Big(1 - \sum_i P_{ji}\Big) + \sum_{i,\;j:\, n_j > 0} \mu_j P_{ji},
\end{align*}
which finally gives
\[
\sum_{n' \ne n} p_{nn'} = \sum_{j=1}^{K} r_j + \sum_{j:\, n_j > 0} \mu_j. \tag{12}
\]
Now, taking the reverse transitions, from equation (8),
\[
\pi(n(j^+)) = \rho_j\,\pi(n), \qquad \pi(n(j^-)) = \pi(n)/\rho_j, \qquad \pi(n(i^+, j^-)) = \rho_i\,\pi(n)/\rho_j.
\]

From equation (9) and the above three relations, the reversed transitions are given by
\begin{align*}
p^*_{n\,n(j^+)} &= \frac{\pi(n(j^+))}{\pi(n)}\, p_{n(j^+)\,n} = \rho_j\,\mu_j\Big(1 - \sum_i P_{ji}\Big) = \lambda_j\Big(1 - \sum_i P_{ji}\Big), \\
p^*_{n\,n(j^-)} &= \frac{\pi(n(j^-))}{\pi(n)}\, p_{n(j^-)\,n} = \frac{r_j}{\rho_j} = \frac{\mu_j r_j}{\lambda_j}, \\
p^*_{n\,n(i^+,j^-)} &= \frac{\pi(n(i^+,j^-))}{\pi(n)}\, p_{n(i^+,j^-)\,n} = \frac{\rho_i}{\rho_j}\,\mu_i P_{ij} = \frac{\mu_j \lambda_i P_{ij}}{\lambda_j}.
\end{align*}
As before, for all other $n' \ne n$, $p^*_{nn'} = 0$. Therefore, summing over all $n'$ for the reversed system,
\begin{align*}
\sum_{n' \ne n} p^*_{nn'} &= \sum_{j=1}^{K} p^*_{n\,n(j^+)} + \sum_{j:\, n_j > 0} p^*_{n\,n(j^-)} + \sum_{i,\;j:\, n_j > 0} p^*_{n\,n(i^+,j^-)} \\
&= \sum_{j=1}^{K} \lambda_j\Big(1 - \sum_{i=1}^{K} P_{ji}\Big) + \sum_{j:\, n_j > 0} \frac{\mu_j r_j}{\lambda_j} + \sum_{i,\;j:\, n_j > 0} \frac{\mu_j \lambda_i P_{ij}}{\lambda_j} \\
&= \sum_{j=1}^{K} \lambda_j\Big(1 - \sum_{i=1}^{K} P_{ji}\Big) + \sum_{j:\, n_j > 0} \frac{\mu_j\big(r_j + \sum_{i=1}^{K} \lambda_i P_{ij}\big)}{\lambda_j}.
\end{align*}
Now, summing all the processes entering queue $j$ (either from another queue or from outside), and remembering that the exit rate of queue $j$ must equal its input rate (if the queue is not to grow forever), gives the traffic equations
\[
\lambda_j = r_j + \sum_{i=1}^{K} \lambda_i P_{ij} \quad \text{for all } j = 1, \ldots, K,
\]
so the last sum above is simply $\sum_{j:\, n_j > 0} \mu_j$, and the reversed sum finally becomes
\[
\sum_{n' \ne n} p^*_{nn'} = \sum_{j=1}^{K} \lambda_j\Big(1 - \sum_{i=1}^{K} P_{ji}\Big) + \sum_{j:\, n_j > 0} \mu_j. \tag{13}
\]
Rearranging the traffic equations gives
\[
r_j = \lambda_j - \sum_{i=1}^{K} \lambda_i P_{ij},
\]
and summing over all queues gives
\[
\sum_{j=1}^{K} r_j = \sum_{j=1}^{K} \lambda_j\Big(1 - \sum_{i=1}^{K} P_{ji}\Big). \tag{14}
\]
Combining equations (12), (13) and (14) gives
\[
\sum_{n' \ne n} p^*_{nn'} = \sum_{n' \ne n} p_{nn'},
\]
which is equation (11), as required, and hence the theorem is proved.
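To make the theorem concrete, here is a minimal sketch (not from the original notes; the routing matrix and rates are invented for illustration) that solves the traffic equations $\lambda = r + P^{\top}\lambda$ for a small open network, computes the utilisations and then uses the product form to obtain mean queue lengths and the probability of a particular state.

```python
import numpy as np

# Illustrative 3-queue open network: routing matrix, external arrival rates, service rates.
P = np.array([[0.0, 0.6, 0.3],     # row i gives where a customer leaving queue i goes;
              [0.0, 0.0, 0.8],     # the remaining probability is leaving the system
              [0.2, 0.0, 0.0]])
r = np.array([1.0, 0.5, 0.0])      # external Poisson arrival rates r_j
mu = np.array([4.0, 3.0, 5.0])     # service rates mu_j

# Traffic equations: lambda_j = r_j + sum_i lambda_i P_ij, i.e. lambda = r + P^T lambda.
lam = np.linalg.solve(np.eye(3) - P.T, r)
rho = lam / mu                     # must all be < 1 for Jackson's theorem to apply

# Jackson: in equilibrium each queue behaves as an independent M/M/1 queue.
mean_queue = rho / (1 - rho)       # expected number of customers in each queue
print("lambda:", lam.round(3))
print("rho:   ", rho.round(3))
print("E[N_j]:", mean_queue.round(3))

# Joint probability of a particular state n = (n_1, n_2, n_3) from the product form (7)-(8).
n = np.array([2, 0, 1])
pi_n = np.prod((1 - rho) * rho ** n)
print("pi(2,0,1) =", round(pi_n, 5))
```

Once the $\lambda_j$ are known, every single-queue M/M/1 formula (queue-length distribution, waiting times and so on) can be applied to each queue separately.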

Jackson's Theorem Example

Amnesia House is a house which deals with people who have trouble remembering things. Forgetful people decide to leave the building as a Poisson process with rate $\lambda$. The queue to leave is a single-server queue with exponential service at rate $\mu_1$. Unfortunately, on leaving, a proportion $p$ of them remember something they have forgotten and must join a separate queue, established for people who have forgotten something and wish to re-enter; this is also a single-server queue with exponential service at rate $\mu_2$. If the forgetful people are chosen at random (and people may forget multiple times), find $N_1$, the average number of people queuing to leave, and $N_2$, the average number of people queuing to get back in (assuming that $\mu_1$ and $\mu_2$ are sufficiently large that the system is ergodic). Find also $N$, the expected total number of people in the two queues.

Define $\lambda_1$ as the input rate to the queue to leave and $\lambda_2$ as the input rate to the queue to re-enter. The rates to each queue are
\[
\lambda_1 = \lambda + \lambda_2, \qquad \lambda_2 = p\lambda_1.
\]
Solving gives
\[
\lambda_1 = \frac{\lambda}{1-p}, \qquad \lambda_2 = \frac{\lambda p}{1-p}.
\]
Therefore
\[
\rho_1 = \frac{\lambda_1}{\mu_1} = \frac{\lambda}{(1-p)\mu_1}, \qquad \rho_2 = \frac{\lambda_2}{\mu_2} = \frac{\lambda p}{(1-p)\mu_2}.
\]
By Jackson's theorem the two queues are independent M/M/1 queues. Therefore
\[
P[n_1, n_2] = \rho_1^{n_1}(1-\rho_1)\,\rho_2^{n_2}(1-\rho_2).
\]
Also
\[
N_1 = \frac{\rho_1}{1-\rho_1}, \qquad N_2 = \frac{\rho_2}{1-\rho_2}.
\]
Hence
\[
N = \frac{\rho_1}{1-\rho_1} + \frac{\rho_2}{1-\rho_2}.
\]
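A quick numerical evaluation of the example (the parameter values below are invented for illustration; any values giving $\rho_1, \rho_2 < 1$ will do):

```python
# Amnesia House example with illustrative numbers.
lam, mu1, mu2, p = 2.0, 5.0, 3.0, 0.25    # external rate, service rates, re-entry probability

lam1 = lam / (1 - p)                      # input rate to the queue to leave
lam2 = lam * p / (1 - p)                  # input rate to the queue to re-enter
rho1, rho2 = lam1 / mu1, lam2 / mu2       # utilisations (both must be < 1)

N1 = rho1 / (1 - rho1)                    # mean number queuing to leave
N2 = rho2 / (1 - rho2)                    # mean number queuing to get back in
print(f"rho1={rho1:.3f}, rho2={rho2:.3f}, N1={N1:.2f}, N2={N2:.2f}, N={N1 + N2:.2f}")
```

With these numbers $\rho_1 \approx 0.53$ and $\rho_2 \approx 0.22$, giving $N_1 \approx 1.14$, $N_2 \approx 0.29$ and $N \approx 1.43$.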