WHEN IS A MAP POISSON?

N.G. Bean, D.A. Green and P.G. Taylor
Department of Applied Mathematics
University of Adelaide
Adelaide 5005

Abstract

In a recent paper, Olivier and Walrand (1994) claimed that the departure process of an MMPP/M/1 queue is not a MAP unless the queue is a stationary M/M/1 queue. They also conjectured that the result extends to MAP/PH/1 queues. In the first part of this paper we show that their proof has an algebraic error, which leaves the above question open.

There is also a more fundamental problem with Olivier and Walrand's proof. In order to discuss this problem, it is essential to be able to determine from its generator when a stationary MAP is a Poisson process. This is not discussed in Olivier and Walrand (1994), nor does it appear to have been discussed in the literature. This deficiency is remedied in the second part of this paper, where we use ideas from non-linear filtering theory to give a characterisation of when a stationary MAP is a Poisson process.

1 INTRODUCTION.

A Markovian Arrival Process (MAP) is a process which counts transitions of a finite state Markov chain. It is possible for a process which counts transitions of an infinite state Markov chain to be statistically equivalent to a MAP. For example, consider an M/M/1 queue, with arrival rate $\lambda > 0$ and service rate $\mu > 0$, which is modelled by a Markov chain $x = \{x_t;\ t \geq 0\}$ on the state space $\mathbb{Z}_+$, where $x_t$ represents the number in the queue at time $t$.

This Markov chain has the following infinite transition rate matrix:

$$
Q = \begin{bmatrix}
-\lambda & \lambda & & & \\
\mu & -(\lambda+\mu) & \lambda & & \\
 & \mu & -(\lambda+\mu) & \lambda & \\
 & & \ddots & \ddots & \ddots
\end{bmatrix}, \qquad (1)
$$

where the number in the queue, $x$, increases with the row number of the matrix. In the situation where $\lambda < \mu$, the queue is positive recurrent and, under stationary conditions, the point process of occurrences of the transitions $(n+1, n)$, $n \geq 0$, is known (see Burke (1956)) to be a Poisson process of rate $\lambda$, which is of course a trivial MAP.

A natural question to ask is whether a similar property holds for other queues. That is, for more general queues, does there exist a finite state Markov chain and a set of transitions for which the counting process of the transitions is identical to the departure process of the original queue? Olivier and Walrand (1994) presented an argument to show that there exists no such finite state chain for an MMPP/M/1 queue, and conjectured that this is also true for a MAP/M/1 queue. Unfortunately there is an algebraic error in the argument of Olivier and Walrand, which we point out in Section 3, and so the question of whether the output of an MMPP/M/1 queue can be a MAP still remains open.

There is also a more fundamental problem when addressing this question, which was not discussed in Olivier and Walrand (1994). Since it is possible for the arrival process of a MAP/M/1 queue to be Poisson, but with a possibly complicated description, and since we know that the output of such a queue is a MAP (as mentioned above, it is Poisson), it is essential to be able to tell from its generator when a MAP is, in fact, Poisson.
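Before moving on to the general MAP/M/1 structure, here is a small numerical aside on the M/M/1 example above (an added illustration, not part of the paper): a finite truncation of the generator (1) has a stationary distribution that is geometric with ratio $\lambda/\mu$, the fact that underlies Burke's result quoted above. The rates $\lambda = 1$, $\mu = 2$ and the truncation level are arbitrary choices.

```python
import numpy as np

# Hypothetical illustration: truncate the infinite generator (1) at N levels.
lam, mu, N = 1.0, 2.0, 200

Q = np.zeros((N + 1, N + 1))
for n in range(N + 1):
    if n < N:
        Q[n, n + 1] = lam          # arrival: n -> n+1
    if n > 0:
        Q[n, n - 1] = mu           # departure: n -> n-1
    Q[n, n] = -Q[n].sum()          # conservative diagonal

# Stationary distribution: solve pi Q = 0 with sum(pi) = 1.
A = np.vstack([Q.T, np.ones(N + 1)])
b = np.zeros(N + 2); b[-1] = 1.0
pi, *_ = np.linalg.lstsq(A, b, rcond=None)

# The ratio pi_{n+1}/pi_n should be close to lambda/mu = 0.5 away from the boundary.
print(pi[1] / pi[0], pi[10] / pi[9], lam / mu)
```

In stationarity the departure epochs of this queue form a Poisson stream of rate $\lambda$, which is the content of Burke's result cited above.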

2 MARKOV ARRIVAL PROCESSES (MAPs).

We generalise the arrival process for the transition rate matrix in (1) by relaxing the requirement that the inter-arrival times be negative exponentially distributed. This is achieved by adding auxiliary states, or phases, to the arrival process and associating certain phase changes with each arrival. The Markovian simplicity is still preserved, since the sojourn times within phases of the auxiliary process are still negative exponentially distributed. We define a two-dimensional Markov chain $(x, y)$, where $x$ represents the number in the queue at time $t$ and $y$ represents the phase of the arrival process at time $t$. Following the notation of Neuts (1981) and letting the number of phases be $m$, we get the following block matrix form for the conservative rate matrix of the MAP/M/1 queue:

$$
Q = \begin{bmatrix}
B & A_0 & & & \\
A_2 & A_1 & A_0 & & \\
 & A_2 & A_1 & A_0 & \\
 & & \ddots & \ddots & \ddots
\end{bmatrix}. \qquad (2)
$$

When the queue is empty, the matrix $B$ governs the transitions of the arrival process which do not correspond to an arrival, and $A_0$ governs those transitions which do. When the queue is occupied, the matrix $A_2 = \mu I_{m \times m}$ governs departures, $A_0$ governs arrivals, and $A_1 = B - A_2$ governs those transitions which correspond to neither an arrival nor a departure. Note that the matrix

$$S = A_0 + B = A_0 + A_1 + A_2 \qquad (3)$$

is a conservative rate matrix, so that

$$S\,\mathbf{1} = \mathbf{0}. \qquad (4)$$

As in Neuts (1981), we assume that $Q$ defines an irreducible, regular Markov chain. Necessary conditions for this are that the $m \times m$ matrices $B$ and $A_1$ are non-singular. Hence this Markov chain has at most one stationary distribution $\pi$ such that $\pi Q = \mathbf{0}$. This stationary distribution has a matrix-geometric form and is given by

$$\pi = [\pi_0,\ \pi_0 R,\ \pi_0 R^2,\ \ldots,\ \pi_0 R^x,\ \ldots],$$

where $R$ is the minimal non-negative solution to the matrix quadratic equation

$$R^2 A_2 + R A_1 + A_0 = 0 \qquad (5)$$

and $\pi_0$ is the unique positive solution to the system of equations

$$\pi_0 (B + R A_2) = \mathbf{0} \quad\text{and}\quad \pi_0 (I - R)^{-1}\mathbf{1} = 1. \qquad (6)$$
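The matrix-geometric quantities in (5) and (6) are straightforward to compute numerically. The sketch below is an added illustration (the two-phase MMPP rates and the service rate are invented choices): it obtains $R$ by the classical successive-substitution iteration starting from $R = 0$, which converges to the minimal non-negative solution of (5), and then solves (6) for $\pi_0$.

```python
import numpy as np

# Hypothetical 2-phase MMPP/M/1 example (all rates are illustrative choices).
A0 = np.diag([1.0, 3.0])                      # phase-dependent arrival rates
C = np.array([[-0.5, 0.5], [0.7, -0.7]])      # phase process generator, S in (3)
mu = 5.0

B = C - A0                                    # S = A_0 + B
A2 = mu * np.eye(2)
A1 = B - A2

# Minimal non-negative solution of R^2 A2 + R A1 + A0 = 0, by successive
# substitution starting from R = 0 (the classical iteration of Neuts (1981)).
R = np.zeros((2, 2))
for _ in range(500):
    R = -(A0 + R @ R @ A2) @ np.linalg.inv(A1)

print("residual of (5):", np.abs(R @ R @ A2 + R @ A1 + A0).max())

# pi_0 from (6): pi_0 (B + R A2) = 0 together with pi_0 (I - R)^{-1} 1 = 1.
IminusRinv = np.linalg.inv(np.eye(2) - R)
M = np.vstack([(B + R @ A2).T, (IminusRinv @ np.ones(2))[None, :]])
b = np.array([0.0, 0.0, 1.0])
pi0, *_ = np.linalg.lstsq(M, b, rcond=None)

print("pi_0 =", pi0, "  pi_0 (B + R A2) =", pi0 @ (B + R @ A2))
```

With these quantities the stationary distribution of the queue is the matrix-geometric vector $\pi = [\pi_0, \pi_0 R, \pi_0 R^2, \ldots]$ given above.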

2.1 PH-RANDOM VARIABLES AND PH-RENEWAL PROCESSES.

Any continuous distribution on $[0, \infty)$ which can be obtained as the distribution of the time until absorption in a continuous-time, finite-state Markov chain with a single absorbing state into which absorption is certain, is said to be of phase-type. Consider the following Markov chain with $m+1$ states, initial probability vector $(\alpha, \alpha_{m+1})$ and transition rate matrix

$$Q = \begin{bmatrix} T & T^0 \\ \mathbf{0} & 0 \end{bmatrix},$$

where $T$ is a non-singular $m \times m$ matrix with $T_{ii} < 0$ and $T_{ij} \geq 0$ for all $i \neq j$, and $T^0$ is an $m$-vector such that $T\mathbf{1} + T^0 = \mathbf{0}$. The conditional probabilities $r_j(t)$ that the process is in state $j$ at time $t$, with initial conditions $(r_1(0), r_2(0), \ldots, r_{m+1}(0)) = r(0) = (\alpha, \alpha_{m+1})$, satisfy the differential equations

$$\frac{dr(t)}{dt} = r(t)\, Q,$$

which have solution

$$r(t) = (\alpha, \alpha_{m+1})\, e^{Qt}.$$

The conditional probability vector $v(t)$ that the process is still in one of the states $1, \ldots, m$ at time $t$ is given by

$$v(t) = \alpha\, e^{Tt}.$$

Thus

$$F(t) = 1 - \alpha\, e^{Tt}\,\mathbf{1}$$

is the probability distribution of the time until absorption into state $m+1$. This is classified as a PH-type distribution with representation $(\alpha, T)$. If we consider the $(m+1)$st state of the PH-random variable as an instantaneous state, in that we instantaneously restart the process using the probability vector $\alpha$, then the process consisting of the absorption epochs is a PH-renewal process with representation $(\alpha, T)$.

3 THE QUESTION OF A MAP OUTPUT FROM A STATIONARY MAP/M/1 QUEUE.

We use the techniques of non-linear filtering, as given in Walrand (1988). Consider a Markov chain $x = \{x_t;\ t \geq 0\}$ having a countably infinite state space $X$, with transition rate matrix $Q$. For $i \neq j \in X$, let $0 \leq Q_1(i, j) \leq Q(i, j)$; for $i \in X$ let $Q_1(i, i) = 0$; and let

$$Q_0 = Q - Q_1.$$

The transitions with rates $Q_1$ are observed, while those in $Q_0$ are hidden. Let $J_t$ count the number of observed transitions up to time $t$, and let

$$\pi_t(k) = P\{x_t = k \mid J_t\},$$

so that $\pi_t(k)$ is the probability of being in state $k$ at time $t$, conditioned on the number of observed jumps up to time $t$. Also let $\pi_t$ be the row vector $\pi_t = \{\pi_t(k),\ k \in X\}$.

A Markov chain with rate matrix $Q$ is a MAP process if and only if there exists a finite state Markov chain with rate matrix $\hat{Q}$ and corresponding matrices $\hat{Q}_0$ and $\hat{Q}_1$, such that $\hat{Q} = \hat{Q}_0 + \hat{Q}_1$ and

$$\lambda_t = \pi_t\, Q_1\, \mathbf{1} = \hat{\pi}_t\, \hat{Q}_1\, \mathbf{1} = \hat{\lambda}_t, \qquad \text{for all } t \in \mathbb{R}_+,$$

where $\mathbf{1}$ is a column of $1$'s of the appropriate dimension.
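To make the observed/hidden split concrete for the departure-process question, the following sketch (an added illustration; the MMPP rates, the service rate and the truncation level are invented) builds a truncated MMPP/M/1 generator, puts exactly the departure transitions into $Q_1$, and checks that the stationary observed rate $\pi Q_1 \mathbf{1}$ agrees with the mean arrival rate of the MMPP, as it must for a stable queue.

```python
import numpy as np

# Truncated MMPP/M/1 queue: departures are the *observed* transitions (Q_1),
# arrivals and phase changes are hidden (Q_0).  All rates are illustrative choices.
lam = np.array([1.0, 3.0])                    # arrival rate in each phase
C = np.array([[-0.5, 0.5], [0.7, -0.7]])      # phase generator of the MMPP
mu, N, m = 5.0, 60, 2                         # service rate, truncation level, phases

size = (N + 1) * m
Q0 = np.zeros((size, size))
Q1 = np.zeros((size, size))

def idx(n, i):
    return n * m + i

for n in range(N + 1):
    for i in range(m):
        if n > 0:
            Q1[idx(n, i), idx(n - 1, i)] = mu          # departure (n,i) -> (n-1,i)
        if n < N:
            Q0[idx(n, i), idx(n + 1, i)] = lam[i]      # arrival  (n,i) -> (n+1,i)
        for j in range(m):
            if j != i:
                Q0[idx(n, i), idx(n, j)] = C[i, j]     # phase change

# Put the compensating diagonal in Q0, so that Q = Q0 + Q1 is conservative.
np.fill_diagonal(Q0, -(Q0 + Q1).sum(axis=1))
Q = Q0 + Q1

# Stationary distribution of the truncated chain.
A = np.vstack([Q.T, np.ones(size)])
b = np.zeros(size + 1); b[-1] = 1.0
pi, *_ = np.linalg.lstsq(A, b, rcond=None)

# Observed (departure) rate pi Q_1 1 versus the mean arrival rate of the MMPP.
phase = np.array([C[1, 0], C[0, 1]]); phase = phase / phase.sum()
print(pi @ Q1 @ np.ones(size), phase @ lam)
```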

A necessary condition for the above to hold is given in Olivier and Walrand (1994) to be

$$\pi_t\, (Q_0)^y (Q_1)^x\, \mathbf{1} = \hat{\pi}_t\, (\hat{Q}_0)^y (\hat{Q}_1)^x\, \mathbf{1}, \qquad \text{for all } x, y \geq 0. \qquad (7)$$

Olivier and Walrand (1994) considered the special case of an MMPP/M/1 queue and, from the necessary condition given in equation (7), derived the following for all $x \geq 0$:

$$\pi\, (Q_0)^x\, Q_1\, \mathbf{1} = \hat{\pi}\, (\hat{Q}_0)^x\, \hat{Q}_1\, \mathbf{1}, \qquad (8)$$

where $\pi = [\pi_0, \pi_0 R, \pi_0 R^2, \ldots]$, with $\pi_0$ defined by equation (6), and $\hat{\pi}$ is the stationary distribution of the conservative generator $\hat{Q}$. Recalling the definition of the matrix $S$ given in equation (3), we let $\varphi = \pi_0 (I - R)^{-1}$, which is the stationary distribution admitted by $S$, so that $\varphi S = \mathbf{0}$. Using this and equation (4), Olivier and Walrand erroneously concluded that

$$\varphi\, B\, R^x\, \mathbf{1} = \varphi\, R^x (A_1 + S)\, \mathbf{1} = \varphi\, A_1\, R^x\, \mathbf{1}. \qquad (9)$$

From this point they argued by contradiction that no finite state space equivalent Markov chain defined by $\hat{Q}$ could exist. Equation (9), however, is incorrect. For example, a value of $x = 3$ will yield

$$\varphi\, R^3 (A_1 + S)\, \mathbf{1} = \varphi\, R^3 A_1\, \mathbf{1} \neq \varphi\, A_1\, R^3\, \mathbf{1},$$

since $R$ and $A_1$ do not in general commute.

3.1 A COUNTER-EXAMPLE TO OLIVIER AND WALRAND'S ASSERTION.

It is, in fact, possible for equation (8) to be satisfied for a non-trivial MAP/M/1 queue. Consider the case where the arrival process is a PH-renewal process with the inter-event time distribution defined by an initial distribution $\alpha$ and transition matrix $B$ (that is, a PH-renewal process with representation $(\alpha, B)$). Note that $A_0$ can then be written as

$$A_0 = B^0 \alpha, \qquad (10)$$

where $B^0$ is the $m$-vector satisfying $B\mathbf{1} + B^0 = \mathbf{0}$. Let $\bar{x}$ be the vector whose $i$th entry is the stationary probability, in the original queue, that a departure leaves the system empty with the phase of the arrival process in state $i$, for $i \in \{1, \ldots, m\}$. Then, if the $(m+1) \times (m+1)$ matrices

$$\hat{Q}_0 = \begin{bmatrix} B & A_0\,\mathbf{1} \\ \mathbf{0} & -\mu \end{bmatrix} \quad\text{and}\quad \hat{Q}_1 = \begin{bmatrix} \mathbf{0} & \mathbf{0} \\ \mu\,\bar{x} & \mu\,(1 - \bar{x}\,\mathbf{1}) \end{bmatrix} \qquad (11)$$

are substituted into equation (8), it is satisfied.

The stationary solution for the conservative generator $\hat{Q} = \hat{Q}_0 + \hat{Q}_1$ in this case is given by

$$\hat{\pi} = (\pi_0,\ 1 - \pi_0\,\mathbf{1}),$$

where $\pi_0$ is the $m$-vector defined previously in equation (6). It must be noted, however, that equation (8) is necessary but not sufficient for the output process to be equivalent to a MAP. A necessary and sufficient condition is given in the next section, in equation (12). As a result, the question of whether the output process from an MMPP/M/1 queue can be a MAP is still an open question, as is the similar question for any MAP/M/1 queue or MAP/PH/1 queue.

4 WHEN IS A STATIONARY MAP POISSON?

The question of when a stationary MAP is Poisson was not discussed in Olivier and Walrand (1994). This also appears to be the case in the literature, even though it is a very natural question. We take up this problem using the techniques of non-linear filtering. From Walrand (1988) we see that two point processes defined by $Q = Q_0 + Q_1$ and $\hat{Q} = \hat{Q}_0 + \hat{Q}_1$ have the same finite dimensional distributions if and only if, for any given initial distributions of states $\pi$ and $\hat{\pi}$ respectively,

$$\frac{\pi\, e^{Q_0 t_1} Q_1\, e^{Q_0 t_2} \cdots Q_1\, e^{Q_0 t_k}\, Q_1 \mathbf{1}}{\pi\, e^{Q_0 t_1} Q_1\, e^{Q_0 t_2} \cdots Q_1\, e^{Q_0 t_k}\, \mathbf{1}} = \frac{\hat{\pi}\, e^{\hat{Q}_0 t_1} \hat{Q}_1\, e^{\hat{Q}_0 t_2} \cdots \hat{Q}_1\, e^{\hat{Q}_0 t_k}\, \hat{Q}_1 \mathbf{1}}{\hat{\pi}\, e^{\hat{Q}_0 t_1} \hat{Q}_1\, e^{\hat{Q}_0 t_2} \cdots \hat{Q}_1\, e^{\hat{Q}_0 t_k}\, \mathbf{1}}, \qquad \text{for all } k \geq 1,\ t_k \in [0, \infty). \qquad (12)$$

Until now we have only considered the case where $Q$ is an infinite matrix and $\hat{Q}$ is a finite matrix; however, equation (12) can be used to compare any two point processes. Therefore, for this section we rewrite equation (12) for the case where the right-hand side is a Poisson process of rate $\lambda > 0$ and the left-hand side is a MAP:

$$\frac{\pi\, e^{Q_0 t_1} Q_1\, e^{Q_0 t_2} \cdots Q_1\, e^{Q_0 t_k}\, Q_1 \mathbf{1}}{\pi\, e^{Q_0 t_1} Q_1\, e^{Q_0 t_2} \cdots Q_1\, e^{Q_0 t_k}\, \mathbf{1}} = \lambda, \qquad \text{for all } k \geq 1,\ t_k \in [0, \infty), \qquad (13)$$

where $Q = Q_0 + Q_1$. Any process $Q = Q_0 + Q_1$ which is equivalent to a Poisson process of rate $\lambda > 0$ must satisfy equation (13). We will consider the matrix $Q_0$ in its spectral form, concentrating on the situation where $Q_0$ has distinct eigenvalues and can be written

$$Q_0 = \sum_{i=1}^{m} \gamma_i\, r_i\, l_i, \qquad (14)$$

where the $\gamma_i$ are the eigenvalues of $Q_0$ and $r_i$ and $l_i$ are the corresponding right (column) and left (row) eigenvectors, normalised so that $l_i r_i = 1$.

Note that, trivially, any MAP which has the same arrival rate in every phase of the arrival process will satisfy equation (13), which can be seen by noticing that for such a MAP $Q_1 \mathbf{1} = \lambda \mathbf{1}$, giving a Poisson process of rate $\lambda$. This result is not affected by the initial distribution and corresponds to the case where $\mathbf{1}$ is a right eigenvector of the matrix $Q_1$ and, as $Q$ is conservative, also a right eigenvector of $Q_0$ and $Q$.
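To make this remark concrete, here is a small numerical check (an added illustration; both MAPs below are invented): for a MAP whose total arrival rate is the same in every phase, the $k = 1$ ratio in (13) is constant in $t$ and equal to $\lambda$, while for a MAP with unequal phase arrival rates it is not.

```python
import numpy as np
from scipy.linalg import expm

def stationary(Q):
    # Solve pi Q = 0 with pi 1 = 1 by least squares.
    n = Q.shape[0]
    A = np.vstack([Q.T, np.ones(n)])
    b = np.zeros(n + 1); b[-1] = 1.0
    pi, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pi

def k1_ratio(Q0, Q1, t):
    # k = 1 case of the left-hand side of (13), started from the stationary
    # distribution of Q = Q0 + Q1.
    pi = stationary(Q0 + Q1)
    one = np.ones(Q0.shape[0])
    return (pi @ expm(Q0 * t) @ Q1 @ one) / (pi @ expm(Q0 * t) @ one)

# MAP 1: the same total arrival rate (lambda = 2) in every phase, so Q1 1 = 2 * 1.
Q1a = np.array([[1.2, 0.8], [2.0, 0.0]])
Q0a = np.array([[-3.0, 1.0], [0.5, -2.5]])

# MAP 2: unequal arrival rates in the two phases.
Q1b = np.array([[0.5, 0.0], [0.0, 3.0]])
Q0b = np.array([[-1.5, 1.0], [0.8, -3.8]])

for t in (0.1, 1.0, 3.0):
    print(t, k1_ratio(Q0a, Q1a, t), k1_ratio(Q0b, Q1b, t))
```

The constancy for the first MAP is exactly the observation that $Q_1\mathbf{1} = \lambda\mathbf{1}$ makes the numerator of (13) a multiple of the denominator.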

4.1 A PHASE (PH) RANDOM VARIABLE.

We consider initially the equivalence of a PH-random variable to a negative exponential random variable, and must assume that $\alpha \mathbf{1} = 1$ (that is, $\alpha_{m+1} = 0$), so that there is no atom of probability at $t = 0$, as equivalence would then not be possible.

Theorem 4.1 A PH $(\alpha, T)$ random variable, where $T$ is irreducible and has distinct eigenvalues, is negative exponential with parameter $\lambda > 0$ (where $-\lambda$ is the eigenvalue of $T$ of maximal real part) if and only if, for all $i \geq 2$,

$$\alpha\, r_i = 0 \quad \text{or} \quad l_i\, \mathbf{1} = 0.$$

Proof: If $\alpha\, e^{Tt}\, \mathbf{1} = e^{-\lambda t}$, then, writing $T$ in the spectral form of equation (14), we have

$$\sum_{i=1}^{m} e^{\gamma_i t}\, \alpha\, r_i\, l_i\, \mathbf{1} = e^{-\lambda t}. \qquad (15)$$

For this to be true for all $t \in [0, \infty)$, it must be that the eigenvalue of $T$ of maximal real part satisfies $\gamma_1 = -\lambda$, by the asymptotic behaviour of PH-distributions, Neuts (1981). For equation (15) to be true for all $t \in [0, \infty)$ also requires that $\alpha\, r_i\, l_i\, \mathbf{1} = 0$ for all $i \geq 2$, since $e^{\gamma_1 t} > 0$ and $\gamma_i \neq \gamma_1$. A necessary and sufficient condition for this to be true is that either $\alpha\, r_i = 0$ or $l_i\, \mathbf{1} = 0$ for $i \geq 2$. This is always true if $\alpha$ is the left eigenvector, or $\mathbf{1}$ is the right eigenvector, of $T$ corresponding to $\gamma_1$, but this is not necessary.
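Theorem 4.1 is easy to test numerically: given a representation $(\alpha, T)$, compute the spectral data of $T$, test whether $\alpha r_i = 0$ or $l_i \mathbf{1} = 0$ for every $i \geq 2$, and compare the survival function $\alpha e^{Tt}\mathbf{1}$ with $e^{-\lambda t}$. The sketch below is an added illustration; both representations are invented examples.

```python
import numpy as np
from scipy.linalg import expm

def exponential_by_theorem_41(alpha, T, tol=1e-10):
    """Test the condition of Theorem 4.1 for a PH representation (alpha, T)."""
    m = T.shape[0]
    one = np.ones(m)
    gam, Rv = np.linalg.eig(T)      # columns of Rv are right eigenvectors r_i
    Lv = np.linalg.inv(Rv)          # rows of Lv are left eigenvectors l_i, l_i r_i = 1
    order = np.argsort(-gam.real)   # gamma_1 = eigenvalue of maximal real part
    gam, Rv, Lv = gam[order], Rv[:, order], Lv[order, :]
    lam = -gam[0].real
    condition = all(abs(alpha @ Rv[:, i]) < tol or abs(Lv[i, :] @ one) < tol
                    for i in range(1, m))
    # Direct check: survival function alpha e^{Tt} 1 against e^{-lambda t}.
    max_err = max(abs(alpha @ expm(T * t) @ one - np.exp(-lam * t))
                  for t in np.linspace(0.1, 5.0, 20))
    return lam, condition, max_err

# Here 1 is a right eigenvector of T (all row sums equal -1), so the condition holds
# and the PH random variable is exponential with rate 1, whatever alpha is.
print(exponential_by_theorem_41(np.array([0.3, 0.7]),
                                np.array([[-2.0, 1.0], [1.5, -2.5]])))
# Here the condition fails and the survival function is not exponential.
print(exponential_by_theorem_41(np.array([1.0, 0.0]),
                                np.array([[-3.0, 1.0], [0.5, -1.0]])))
```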

4.2 A PH-RENEWAL PROCESS $(\alpha, T)$.

For the special case of a PH-renewal process, with $Q_0 = T$, $Q_1 = T^0 \alpha$ and initial distribution $\pi$, we have, for all $k \geq 1$ and $t_k \in [0, \infty)$,

$$\frac{\pi\, e^{Q_0 t_1} Q_1 \cdots Q_1\, e^{Q_0 t_k}\, Q_1 \mathbf{1}}{\pi\, e^{Q_0 t_1} Q_1 \cdots Q_1\, e^{Q_0 t_k}\, \mathbf{1}}
= \frac{(\pi\, e^{T t_1} T^0)\, (\alpha\, e^{T t_2} T^0) \cdots (\alpha\, e^{T t_k} T^0)}{(\pi\, e^{T t_1} T^0)\, (\alpha\, e^{T t_2} T^0) \cdots (\alpha\, e^{T t_{k-1}} T^0)\, (\alpha\, e^{T t_k}\, \mathbf{1})}
= \begin{cases}
\dfrac{\pi\, e^{T t_1}\, T^0}{\pi\, e^{T t_1}\, \mathbf{1}}, & \text{for } k = 1, \\[2ex]
\dfrac{\alpha\, e^{T t_k}\, T^0}{\alpha\, e^{T t_k}\, \mathbf{1}}, & \text{for } k > 1.
\end{cases} \qquad (16)$$

Corollary 4.2 A stationary PH-renewal process $(\alpha, T)$, where $T$ is irreducible and has distinct eigenvalues, is a Poisson process of rate $\lambda > 0$ (where $-\lambda$ is the eigenvalue of $T$ of maximal real part) if and only if, for all $i \geq 2$,

$$\alpha\, r_i = 0 \quad \text{or} \quad l_i\, \mathbf{1} = 0.$$

Proof: From equations (13), (14) and (16) we see that, for a stationary PH-renewal process to be Poisson of rate $\lambda$,

$$-\,\frac{\displaystyle\sum_{i=1}^{m} \gamma_i\, e^{\gamma_i t}\, \nu_p\, r_i\, l_i\, \mathbf{1}}{\displaystyle\sum_{i=1}^{m} e^{\gamma_i t}\, \nu_p\, r_i\, l_i\, \mathbf{1}} = \lambda, \qquad \text{for all } t \in [0, \infty) \text{ and } p \in \{1, 2\}, \qquad (17)$$

where $\nu_1 = \pi$, $\nu_2 = \alpha$, and we have used the fact that $T^0 = -T\,\mathbf{1}$. Note that we are considering the stationary process, and therefore the initial distribution $\pi$ is the stationary distribution admitted by the PH-renewal process. In the case where $p = 2$, the proof is essentially the same as that for the PH-random variable, so we consider only the case $p = 1$. In this case we write equation (16), combined with (13), as

$$-\,\pi\, T\, e^{Tt}\, \mathbf{1} = \lambda\, \pi\, e^{Tt}\, \mathbf{1}, \qquad \text{for all } t \in [0, \infty), \qquad (18)$$

by noting that $T$ and $e^{Tt}$ commute. The next step is to establish a relationship between the stationary distribution $\pi$ and the renewal probability vector $\alpha$. We start by considering

$$\pi\, Q = \pi\, (T + T^0 \alpha) = \mathbf{0},$$

from which we see that

$$\pi\, T = -(\pi\, T^0)\, \alpha.$$

From the assumption of irreducibility we know that $T$ is non-singular, and so we can write

$$\pi = -(\pi\, T^0)\, \alpha\, T^{-1}.$$

Then, by substituting into equation (18), noticing that $(\pi\, T^0)$ is a scalar quantity, and equating to $0$, we rearrange to get

$$\alpha\, e^{Tt}\, \mathbf{1} + \lambda\, \alpha\, T^{-1}\, e^{Tt}\, \mathbf{1} = 0, \qquad \text{for all } t \in [0, \infty).$$

Using the spectral form for the inverse of the matrix $T$, we get

$$\sum_{i=1}^{m} e^{\gamma_i t}\, \alpha\, r_i\, l_i\, \mathbf{1} + \lambda \sum_{i=1}^{m} \gamma_i^{-1}\, e^{\gamma_i t}\, \alpha\, r_i\, l_i\, \mathbf{1} = 0, \qquad \text{for all } t \in [0, \infty),$$

which implies

$$\sum_{i=1}^{m} \gamma_i^{-1} (\gamma_i + \lambda)\, e^{\gamma_i t}\, \alpha\, r_i\, l_i\, \mathbf{1} = 0, \qquad \text{for all } t \in [0, \infty).$$

Now, $r_1$ and $l_1$ are positive since $T$ is an irreducible generator-type matrix (see Seneta (1981)), so $\alpha\, r_1\, l_1\, \mathbf{1} > 0$. Therefore $\gamma_1 = -\lambda$ and $\alpha\, r_i\, l_i\, \mathbf{1} = 0$ for $i \geq 2$. The result now follows by an argument similar to that of Theorem 4.1.

4.3 GENERAL MAPs.

Theorem 4.3 A stationary general MAP, where $Q_0$ is irreducible and has distinct eigenvalues, is Poisson of rate $\lambda > 0$ (where $-\lambda$ is the eigenvalue of $Q_0$ of maximal real part) if and only if, for all $i \geq 2$,

$$\pi\, r_i = 0 \qquad (19)$$

or

$$l_i\, \mathbf{1} = 0, \qquad (20)$$

and, for

$$I_R \overset{\text{def}}{=} \{\, i : \pi\, r_i \neq 0 \,\} \qquad (21)$$

and

$$I_L \overset{\text{def}}{=} \{\, i : l_i\, \mathbf{1} \neq 0 \,\}, \qquad (22)$$

we have

$$l_i\, Q_1\, r_j = 0, \qquad \text{for all } (i, j) \in I_R \times \{ I_L \setminus \{1\} \}. \qquad (23)$$

Proof: From equations (13) and (14) we see that, for such a MAP to be Poisson of rate $\lambda > 0$, we have to satisfy the following:

$$\frac{\displaystyle\sum_{i=1}^{m} e^{\gamma_i t_1}\, \pi\, r_i\, l_i\, Q_1\ \cdots\ \sum_{l=1}^{m} e^{\gamma_l t_k}\, r_l\, l_l\, Q_1\, \mathbf{1}}{\displaystyle\sum_{i=1}^{m} e^{\gamma_i t_1}\, \pi\, r_i\, l_i\, Q_1\ \cdots\ \sum_{l=1}^{m} e^{\gamma_l t_k}\, r_l\, l_l\, \mathbf{1}} = \lambda, \qquad \text{for all } k \geq 1,\ t_k \in [0, \infty). \qquad (24)$$

Since $r_1$ and $l_1$ are strictly positive (see Seneta (1981)), $\pi\, r_1\, l_1\, \mathbf{1} > 0$, so that if we choose $k = 1$ in (24) we get the following necessary condition:

$$\gamma_1 = -\lambda \quad \text{and either} \quad \pi\, r_i = 0 \ \text{ or } \ l_i\, \mathbf{1} = 0, \quad \text{for all } i \geq 2. \qquad (25)$$

Choosing $k = 2$, we look at equation (24) again, which for all $t_1, t_2 \in [0, \infty)$ yields

$$\frac{\displaystyle\sum_{i=1}^{m} e^{\gamma_i t_1}\, \pi\, r_i\, l_i\, Q_1 \left( -\sum_{j=2}^{m} \gamma_j\, e^{\gamma_j t_2}\, r_j\, l_j\, \mathbf{1} + \lambda\, e^{-\lambda t_2}\, r_1\, l_1\, \mathbf{1} \right)}{\displaystyle\sum_{i=1}^{m} e^{\gamma_i t_1}\, \pi\, r_i\, l_i\, Q_1 \left( \sum_{j=2}^{m} e^{\gamma_j t_2}\, r_j\, l_j\, \mathbf{1} + e^{-\lambda t_2}\, r_1\, l_1\, \mathbf{1} \right)} = \lambda.$$

This in turn gives us another necessary condition,

$$\pi\, r_i\, l_i\, Q_1\, r_j\, l_j\, \mathbf{1} = 0, \qquad \text{for } j \neq 1 \text{ and all } i, \qquad (26)$$

which, by (21) and (22), reduces to the condition that

$$l_i\, Q_1\, r_j = 0, \qquad \text{for } (i, j) \in I_R \times \{ I_L \setminus \{1\} \}. \qquad (27)$$

We now show that (25) and (27) are also sufficient, by substituting them into equation (24), which gives us

$$\frac{\displaystyle\sum_{i \in I_R} e^{\gamma_i t_1}\, \pi\, r_i\, l_i\, Q_1\ \cdots\ \sum_{j \in I_R} e^{\gamma_j t_{k-1}}\, r_j\, l_j\, Q_1\, \lambda\, e^{-\lambda t_k}\, r_1\, l_1\, \mathbf{1}}{\displaystyle\sum_{i \in I_R} e^{\gamma_i t_1}\, \pi\, r_i\, l_i\, Q_1\ \cdots\ \sum_{j \in I_R} e^{\gamma_j t_{k-1}}\, r_j\, l_j\, Q_1\, e^{-\lambda t_k}\, r_1\, l_1\, \mathbf{1}} = \lambda, \qquad \text{for all } k \geq 1,\ t_k \in [0, \infty). \qquad (28)$$

Note that if $\pi$ or $\mathbf{1}$ are left or right eigenvectors, respectively, corresponding to $\gamma_1$, then all of the above conditions are trivially satisfied.

Acknowledgements.

The authors would like to thank Søren Asmussen for the argument that led to Theorem 4.1. We would also like to acknowledge the financial support of the Australian Research Council through grant A692762.

References

[1] Burke, P.J. (1956). The output of a queueing system, Operations Research, 4:699-704.

[2] Neuts, M. (1981). Matrix-Geometric Solutions in Stochastic Models, The Johns Hopkins University Press, Baltimore, MD.

[3] Olivier, C. and Walrand, J. (1994). On the existence of finite-dimensional filters for Markov-modulated traffic, Journal of Applied Probability, 31:515-525.

[4] Seneta, E. (1981). Non-negative Matrices and Markov Chains, Springer-Verlag, New York, Heidelberg, Berlin.

[5] Walrand, J. (1988). An Introduction to Queueing Networks, Prentice-Hall, Englewood Cliffs, NJ.