Lecture 20: Reversible Processes and Queues

1 Examples of reversible processes

1.1 Birth-death processes

We define two non-negative sequences of birth and death rates, denoted by $\{\lambda_n : n \in \mathbb{N}_0\}$ and $\{\mu_n : n \in \mathbb{N}_0\}$. A Markov process $\{X_t \in \mathbb{N}_0 : t \in \mathbb{R}\}$ on the state space $\mathbb{N}_0$ is called a birth-death process if its infinitesimal transition probabilities satisfy
\[
P_{n,n+m}(h) =
\begin{cases}
\lambda_n h + o(h), & m = 1,\\
\mu_n h + o(h), & m = -1,\\
o(h), & |m| > 1.
\end{cases}
\]
Here we write $f(h) = o(h)$ if $\lim_{h \to 0} f(h)/h = 0$. In other words, a birth-death process is any CTMC with a generator of the form
\[
Q = \begin{pmatrix}
-\lambda_0 & \lambda_0 & 0 & 0 & \cdots\\
\mu_1 & -(\lambda_1 + \mu_1) & \lambda_1 & 0 & \cdots\\
0 & \mu_2 & -(\lambda_2 + \mu_2) & \lambda_2 & \cdots\\
0 & 0 & \mu_3 & -(\lambda_3 + \mu_3) & \cdots\\
\vdots & & & & \ddots
\end{pmatrix}.
\]

Proposition 1.1. An ergodic birth-death process in steady state is time-reversible.

Proof. Since the process is stationary, the probability flux must balance across any cut of the form $A = \{0, 1, 2, \ldots, i\}$, $i \geq 0$. But this is precisely the detailed balance equation
\[
\pi_i \lambda_i = \pi_{i+1} \mu_{i+1},
\]
since no other transitions across the cut are possible. So the process is time-reversible.

In fact, the following more general statement can be proven using similar ideas.

Proposition 1.2. Consider an ergodic CTMC on a countable state space $I$ with the following property: for any pair of states $i \neq j \in I$, there is a unique path $i = i_0 \to i_1 \to \cdots \to i_{n(i,j)} = j$ of distinct states having positive probability. Then the CTMC in steady state is reversible.

1.2 Truncated Markov Processes

Consider a transition rate matrix $(Q_{ij})_{i,j \in I}$ on the countable state space $I$. Given a nonempty subset $A \subseteq I$, the truncation of $Q$ to $A$ is the transition rate matrix $\{Q^A_{ij} : i, j \in A\}$, where for all $i, j \in A$,
\[
Q^A_{ij} =
\begin{cases}
Q_{ij}, & j \neq i,\\
-\sum_{k \in A,\, k \neq i} Q_{ik}, & j = i.
\end{cases}
\]
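As a quick numerical illustration of the two constructions above (a sketch of my own, not part of the original notes; it assumes constant rates $\lambda_n = \lambda$ and $\mu_n = \mu$, so the finite generator below is exactly the truncation of the infinite birth-death generator to $\{0, \ldots, K\}$), one can build the truncated generator, solve for its stationary distribution, and check detailed balance across every cut:

```python
import numpy as np

# Minimal sketch (illustration only): a birth-death generator with constant
# rates lam and mu, truncated to the finite state space {0, 1, ..., K}.
lam, mu, K = 1.0, 2.0, 10

Q = np.zeros((K + 1, K + 1))
for i in range(K + 1):
    if i < K:
        Q[i, i + 1] = lam          # birth
    if i > 0:
        Q[i, i - 1] = mu           # death
    Q[i, i] = -Q[i].sum()          # truncated diagonal: rows sum to zero

# Stationary distribution: solve pi Q = 0 together with sum(pi) = 1.
A = np.vstack([Q.T, np.ones(K + 1)])
b = np.zeros(K + 2); b[-1] = 1.0
pi = np.linalg.lstsq(A, b, rcond=None)[0]

# Detailed balance across every cut {0, ..., i}: pi_i * lam = pi_{i+1} * mu.
assert np.allclose(pi[:-1] * lam, pi[1:] * mu)

# The result is the geometric distribution rho^i renormalized to {0, ..., K}.
rho = lam / mu
geom = rho ** np.arange(K + 1)
assert np.allclose(pi, geom / geom.sum())
print(pi)
```

Anticipating Proposition 1.3 and Section 2.1.3 below, the stationary distribution of the truncated chain is just the geometric distribution $\rho^i$ renormalized to the finite state space.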

Proposition 1.3. Suppose $\{X_t : t \in \mathbb{R}\}$ is an irreducible, time-reversible CTMC on the countable state space $I$, with generator $Q = \{Q_{ij} : i, j \in I\}$ and stationary probabilities $\pi = \{\pi_j : j \in I\}$. Suppose the truncation $Q^A$ is irreducible for some $A \subseteq I$. Then any stationary CTMC with state space $A$ and generator $Q^A$ is also time-reversible, with stationary probabilities
\[
\pi^A_j = \frac{\pi_j}{\sum_{i \in A} \pi_i}, \quad j \in A.
\]

Proof. It is clear that $\pi^A$ is a distribution on the state space $A$. We must show reversibility with respect to this distribution $\pi^A$, that is, for all $i, j \in A$,
\[
\pi^A_i Q^A_{ij} = \pi^A_j Q^A_{ji}.
\]
For $i \neq j$ we have $Q^A_{ij} = Q_{ij}$ and $\pi^A_i \propto \pi_i$, so the equation reduces to $\pi_i Q_{ij} = \pi_j Q_{ji}$, which holds because the original chain is time-reversible.

1.3 The Metropolis-Hastings algorithm

Let $\{a_j \in \mathbb{R}_+ : j \in [m]\}$ be a set of known positive numbers with $A = \sum_{i=1}^m a_i$. Suppose our goal is to build a sampler for a random variable with probability mass function
\[
\pi_j = \frac{a_j}{A}, \quad j \in [m],
\]
where $m$ is large and $A$ is difficult to compute directly. This rules out direct evaluation of the fraction $a_j / A$.

Idea. A clever way of (approximately) generating a sample from the distribution $\pi = \{\pi_j : j \in [m]\}$ is to construct an easy-to-simulate Markov chain with limiting (stationary) distribution $\pi$. We simply run this Markov chain long enough and return the sample (state) at the end.

Let $M$ be an irreducible transition probability matrix on $[m]$ such that $M = M^T$; an example is the transition matrix of an i.i.d. sequence of uniform random variables on $[m]$. Consider the Markov chain $\{X_i : i \in \mathbb{N}\}$ on the state space $[m]$ with the transition probabilities
\[
P_{ij} =
\begin{cases}
M_{ij} \min\!\left(1, \frac{a_j}{a_i}\right), & j \neq i,\\[4pt]
1 - \sum_{k \neq i} M_{ik} \min\!\left(1, \frac{a_k}{a_i}\right), & j = i.
\end{cases}
\]
Note that the key property that allows us to easily simulate this Markov chain is that only the ratios $a_j / a_i$ are required, and not $A$. It can be directly verified that (1) this Markov chain is irreducible, and (2) it is reversible with equilibrium distribution $\pi$.
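The following is a minimal sketch of the sampler just described (my own illustration, not from the notes), using the symmetric uniform proposal $M_{ij} = 1/m$ mentioned in the text; observe that only the ratios $a_j / a_i$ ever appear, never the normalizing constant $A$:

```python
import numpy as np

rng = np.random.default_rng(0)

# Unnormalized weights a_j; the normalizing constant A is never computed.
a = rng.uniform(0.1, 1.0, size=50)           # m = 50 hypothetical weights

def metropolis_sample(a, n_steps=100_000):
    """Run the chain with a symmetric (uniform) proposal M and return its path."""
    m = len(a)
    x = rng.integers(m)                      # arbitrary initial state
    path = np.empty(n_steps, dtype=int)
    for t in range(n_steps):
        j = rng.integers(m)                  # propose j with M_ij = 1/m (symmetric)
        if rng.random() < min(1.0, a[j] / a[x]):
            x = j                            # accept: move to j
        path[t] = x                          # otherwise stay at i (self-loop term)
    return path

path = metropolis_sample(a)
empirical = np.bincount(path, minlength=len(a)) / len(path)
print(np.max(np.abs(empirical - a / a.sum())))   # small for long runs
```

As the chain runs longer, the empirical state frequencies converge to $a_j / A$, even though $A$ itself is never evaluated.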

1.4 Random walks on edge-weighted graphs

Consider an undirected graph $G = (I, E)$ with vertex set $I$ and edge set $E$, a subset of unordered pairs of elements of $I$. Assume a positive weight $w_{ij}$ is associated with each edge $\{i, j\} \in E$, and define $w_{ij} = 0$ whenever $\{i, j\}$ is not an edge of the graph. Suppose a particle moves in discrete time from one vertex to another in the following manner: if the particle is presently at vertex $i$, then it next moves to vertex $j$ with probability
\[
P_{ij} = \frac{w_{ij}}{\sum_k w_{ik}}.
\]
The Markov chain describing the sequence of vertices visited by the particle is a random walk on an undirected edge-weighted graph. Google's PageRank algorithm, which estimates the relative importance of webpages, is essentially a random walk on a graph.

Proposition 1.4. Consider an irreducible Markov chain that describes a random walk on an edge-weighted graph with a finite number of vertices. In steady state, this Markov chain is time-reversible, with stationary probability of being in state $i \in I$ given by
\[
\pi_i = \frac{\sum_j w_{ij}}{\sum_k \sum_j w_{kj}}. \qquad (1)
\]

Proof. Using the definition of the transition probabilities for this Markov chain, the detailed balance equation for each pair of states $i, j \in I$ reduces to
\[
\pi_i \frac{w_{ij}}{\sum_k w_{ik}} = \pi_j \frac{w_{ji}}{\sum_k w_{jk}}.
\]
From the symmetry of edge weights in undirected graphs, $w_{ij} = w_{ji}$. Hence the distribution $\pi$ defined in (1) solves these equations, and we get the desired result.

The following dual result also holds.

Lemma 1.5. Let $\{X_n\}$ be a reversible Markov chain on a finite state space $I$ with transition probability matrix $P$. Then there exists a random walk on a weighted, undirected graph $G$ with the same transition probability matrix $P$.

Proof. We create a graph $G = (I, E)$, where $\{i, j\} \in E$ if and only if $P_{ij} > 0$. We then set the edge weights to $w_{ij} = \pi_i P_{ij} = \pi_j P_{ji} = w_{ji}$, where $\pi$ is the stationary distribution of $X$. With this choice of weights, it is easy to check that $w_i = \sum_j w_{ij} = \pi_i$, and the transition matrix associated with a random walk on this graph is exactly $P$.
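A small sketch (again my own illustration, using a hypothetical fully connected weighted graph) of both Proposition 1.4 and Lemma 1.5: the stationary probability of a vertex is proportional to the total weight incident to it, and conversely a reversible chain yields edge weights $w_{ij} = \pi_i P_{ij}$ that reproduce the same transition matrix:

```python
import numpy as np

rng = np.random.default_rng(1)

# A hypothetical symmetric weight matrix on 6 vertices (zero diagonal,
# w_ij = w_ji > 0 exactly when {i, j} is an edge; here the graph is complete).
W = rng.uniform(0.0, 1.0, size=(6, 6))
W = np.triu(W, k=1)
W = W + W.T

# Random-walk transition matrix: P_ij = w_ij / sum_k w_ik.
P = W / W.sum(axis=1, keepdims=True)

# Proposition 1.4: the stationary distribution is proportional to the row sums.
pi = W.sum(axis=1) / W.sum()
assert np.allclose(pi @ P, pi)                              # pi is stationary
assert np.allclose(pi[:, None] * P, (pi[:, None] * P).T)    # detailed balance

# Lemma 1.5: recover edge weights from the reversible chain, w_ij = pi_i P_ij,
# and check that the induced random walk has the same transition matrix.
W2 = pi[:, None] * P
assert np.allclose(W2, W2.T)
assert np.allclose(W2 / W2.sum(axis=1, keepdims=True), P)
```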

2 General queueing theory

The notation A/B/C/D/E for a queueing system indicates: A, the inter-arrival time distribution; B, the service time distribution; C, the number of servers; D, the maximum number of jobs that can be waiting or in service at any time ($\infty$ by default); and E, the queueing service discipline (FIFO by default).

Theorem 2.1 (PASTA: Poisson arrivals see time averages). At any time $t$, denote the system state by $N(t)$ and the number of arrivals in $[0, t)$ by $A(t)$. Let $A_n$ denote the $n$th arrival instant and let $B \subseteq \mathbb{R}_+$ be a Borel measurable set. Then
\[
\tau_B \triangleq \lim_{t \to \infty} \frac{1}{t} \int_0^t \mathbb{1}\{N(u) \in B\}\, du
= \lim_{n \to \infty} \frac{1}{n} \sum_{i=1}^n \mathbb{1}\{N(A_i^-) \in B\} \triangleq c_B .
\]

Proof. We will show the special case in which $N(t)$ is the number of customers in the system at time $t$ and $B = \{k\}$. We define, for $k \in \mathbb{N}_0$,
\[
P_k \triangleq \lim_{t \to \infty} \Pr\{N(t) = k\}, \qquad
A_k \triangleq \lim_{t \to \infty} \Pr\{N(t^-) = k \mid \text{an arrival occurs at time } t\},
\]
where the conditioning on an arrival at $t$ is understood as the limit $h \to 0$ of conditioning on $\{A(t+h) - A(t) = 1\}$. Using the independent increments property of the Poisson arrival process, Bayes' rule, and continuity of probabilities, we can write the second limiting probability as
\[
A_k = \lim_{t \to \infty} \lim_{h \to 0}
\frac{\Pr\{N(t^-) = k,\ A(t+h) - A(t) = 1\}}{\Pr\{A(t+h) - A(t) = 1\}}
= \lim_{t \to \infty} \Pr\{N(t) = k\} = P_k,
\]
since the increment $A(t+h) - A(t)$ is independent of $N(t^-)$, which is a function of the arrivals and services up to time $t$.

Theorem 2.2 (Little's law). Consider a stable single-server queue. Let $T_i$ be the waiting time of customer $i$, $N(t)$ the number of customers in the system at time $t$, and $A(t)$ the number of customers that entered the system during $[0, t)$. Then
\[
\lim_{t \to \infty} \frac{1}{t} \int_0^t N(u)\, du
= \lim_{t \to \infty} \frac{A(t)}{t} \cdot \frac{1}{A(t)} \sum_{i=1}^{A(t)} T_i .
\]

Proof. Let $A(t)$ and $D(t)$ respectively denote the number of arrivals and departures in $[0, t)$. Then we have
\[
\sum_{i=1}^{D(t)} T_i \;\leq\; \int_0^t N(u)\, du \;\leq\; \sum_{i=1}^{A(t)} T_i .
\]
Further, for a stable queue we have
\[
\lim_{t \to \infty} \frac{D(t)}{t} = \lim_{t \to \infty} \frac{A(t)}{t}.
\]
Combining these two results, the theorem follows.

2.1 The M/M/1 queue

The M/M/1 queue is the simplest and most studied model of queueing systems. We assume a continuous-time queueing model with the following components. There is a single queue for waiting that can accommodate an arbitrarily large number of customers. Arrivals to the queue occur according to a Poisson process with rate $\lambda > 0$; that is, if $A_n$ denotes the arrival instant of the $n$th customer, then the sequence of inter-arrival times $\{A_n - A_{n-1} : n \in \mathbb{N}\}$ is i.i.d. exponentially distributed with rate $\lambda$. There is a single server, and the service time of the $n$th customer is denoted by a random variable $S_n$. The sequence of service times $\{S_n : n \in \mathbb{N}\}$ is i.i.d. exponentially distributed with rate $\mu > 0$, independent of the Poisson arrival process. We assume that customers join the tail of the queue, and hence begin service in the order in which they arrive, i.e. first-in-first-out (FIFO).

Let $X(t)$ denote the number of customers in the system at time $t \in \mathbb{R}_+$, where "system" means the queue plus the service area. For example, $X(t) = 2$ means that there is one customer in service and one waiting in line. Due to the continuous distributions of inter-arrival and service times, a transition can only occur at a customer arrival or departure time, and departures occur exactly at service completions. Let $D_n$ denote the $n$th departure time from the system. At an arrival time $A_n$, the number in system jumps up by one, $X(A_n) = X(A_n^-) + 1$, whereas at a departure time $D_n$, it jumps down by one, $X(D_n) = X(D_n^-) - 1$. For the M/M/1 queue, one can argue that $\{X(t) : t \in \mathbb{R}_+\}$ is a CTMC on the state space $\mathbb{N}_0$. We will soon see that a stable M/M/1 queue is time-reversible.
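Before analyzing this chain, here is a minimal simulation sketch (my own illustration, not part of the notes) that checks Little's law and PASTA empirically for the M/M/1 model; the geometric stationary distribution it compares against is derived in Section 2.1.2 below:

```python
import numpy as np

rng = np.random.default_rng(2)

# Minimal M/M/1 simulation (illustration only); lam < mu so the queue is stable.
lam, mu, n = 1.0, 2.0, 100_000

A = np.cumsum(rng.exponential(1.0 / lam, n))    # arrival times
S = rng.exponential(1.0 / mu, n)                # service times
D = np.empty(n)                                 # departure times (FIFO)
t = 0.0
for i in range(n):
    t = max(A[i], t) + S[i]                     # service starts when the server frees up
    D[i] = t

T = D - A                                       # sojourn times

# Little's law: time-average number in system ~ arrival rate * mean sojourn time.
horizon = A[-1]
L = np.clip(np.minimum(D, horizon) - A, 0.0, None).sum() / horizon
print(L, (n / horizon) * T.mean())              # the two estimates should be close

# PASTA: the number in system seen by the i-th arrival is i minus the number of
# departures before A_i; its empirical distribution matches the time-stationary
# geometric distribution derived in Section 2.1.2.
seen = np.arange(n) - np.searchsorted(D, A)
rho = lam / mu
print(np.mean(seen == 0), 1.0 - rho)            # empirical vs (1 - rho)
```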

2.1.1 Transition rates

Given the current state $\{X(t) = i\}$, the only transitions possible in an infinitesimal time interval are (a) a single customer arrives, or (b) a single customer leaves (if $i \geq 1$). It follows that the infinitesimal generator for the CTMC $\{X(t)\}_t$ is
\[
Q_{ij} =
\begin{cases}
\lambda, & j = i + 1,\\
\mu, & j = i - 1,\\
0, & |j - i| > 1.
\end{cases}
\]
Since $\lambda, \mu > 0$, this defines an irreducible CTMC.

2.1.2 Equilibrium distribution and reversibility

The M/M/1 queue's generator defines a birth-death process. Hence, if it is stationary, then it must be time-reversible, with the equilibrium distribution $\pi$ satisfying detailed balance for each $i \in \mathbb{N}_0$:
\[
\pi_i \lambda = \pi_{i+1} \mu .
\]
This yields $\pi_{i+1} = \frac{\lambda}{\mu} \pi_i$. Since $\sum_{i \geq 0} \pi_i = 1$, we must have $\rho \triangleq \frac{\lambda}{\mu} < 1$, giving for each $i \in \mathbb{N}_0$
\[
\pi_i = (1 - \rho)\rho^i .
\]
In other words, if $\lambda < \mu$, then the equilibrium distribution of the number of customers in the system is geometric with parameter $\rho = \lambda / \mu$. We say that the M/M/1 queue is in the stable regime when $\rho < 1$. We have thus shown:

Corollary 2.3. The number of customers in an M/M/1 queueing system at equilibrium is a reversible Markov process.

Further, since the M/M/1 queue is a reversible CTMC, the following theorem follows.

Theorem 2.4 (Burke). Departures from a stable M/M/1 queue form a Poisson process with the same rate as the arrivals.

2.1.3 Limiting waiting room: M/M/1/K

Consider a variant of the M/M/1 queueing system that has a finite buffer capacity of at most $K$ customers. Customers that arrive when there are already $K$ customers present are rejected. It follows that the CTMC for this system is simply the M/M/1 CTMC truncated to the state space $\{0, 1, \ldots, K\}$, and so it must be time-reversible with stationary distribution
\[
\pi_i = \frac{\rho^i}{\sum_{j=0}^{K} \rho^j}, \quad 0 \leq i \leq K .
\]

Example (two queues with a joint waiting room). Consider two independent M/M/1 queues with arrival and service rates $\lambda_i$ and $\mu_i$ respectively, for $i \in [2]$. The joint distribution of the two queues is
\[
\pi(n_1, n_2) = (1 - \rho_1)\rho_1^{n_1} (1 - \rho_2)\rho_2^{n_2}, \quad n_1, n_2 \in \mathbb{N}_0 .
\]
Suppose now that both queues share a common waiting room of size $R$, so that an arriving customer who finds $R$ customers already waiting leaves the system. The joint process is then the original reversible process truncated to the set $A \subseteq \mathbb{N}_0 \times \mathbb{N}_0$ of states allowed by the shared waiting room, so by Proposition 1.3 its stationary distribution is the product form renormalized over $A$:
\[
\pi^A(n_1, n_2) = \frac{(1 - \rho_1)\rho_1^{n_1} (1 - \rho_2)\rho_2^{n_2}}{\sum_{(m_1, m_2) \in A} (1 - \rho_1)\rho_1^{m_1} (1 - \rho_2)\rho_2^{m_2}}, \quad (n_1, n_2) \in A .
\]
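To close, a small numerical sketch of the joint-waiting-room example (my own illustration with hypothetical rates; for concreteness the shared constraint is taken here to be $n_1 + n_2 \leq R$): the stationary distribution is the product form truncated to $A$ and renormalized, exactly as Proposition 1.3 prescribes:

```python
import numpy as np

# Illustration of the shared-waiting-room example (assumed constraint: at most
# R customers in total, i.e. n1 + n2 <= R; rates are hypothetical).
lam1, mu1, lam2, mu2, R = 1.0, 3.0, 2.0, 5.0, 4
rho1, rho2 = lam1 / mu1, lam2 / mu2

# Unnormalized product-form weights on the truncated state space A; the
# constants (1 - rho1)(1 - rho2) cancel in the normalization.
states = [(n1, n2) for n1 in range(R + 1) for n2 in range(R + 1 - n1)]
weights = np.array([rho1 ** n1 * rho2 ** n2 for n1, n2 in states])
pi_A = weights / weights.sum()      # Proposition 1.3: renormalize over A

# For example, the probability that the shared room is completely full.
full = sum(p for (n1, n2), p in zip(states, pi_A) if n1 + n2 == R)
print(dict(zip(states, np.round(pi_A, 4))), full)
```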