Continuous Time Processes


Chapter 7: Continuous Time Processes

7.1 Introduction

In a continuous time stochastic process (with discrete state space), a change of state can occur at any time instant. The associated point process consists of those instants $t_1, t_2, \dots$ (with $t_0 = 0$) at which a change of state (or transition) occurs. The intervals $g_n = t_n - t_{n-1}$, $n \ge 1$, are called gaps or sojourn times. A point process can be specified by giving a (probabilistic) mechanism for determining either the times $t_1, t_2, \dots$ or the gaps $g_1, g_2, \dots$. To specify the continuous time process we must also give a mechanism for determining the transitions at $t_1, t_2, \dots$.

7.2 Counting Processes

Such processes are particularly simple: the state variable is just a count, which increases by 1 at each transition point $t_i$. A stochastic process $\{N(t), t \ge 0\}$ is said to be a counting process if $N(t)$ represents the total number of transitions or events that have occurred in $(0, t]$. $N(t)$ must satisfy the following conditions:

(i) $N(t)$ is a count r.v., with $N(0) = 0$;
(ii) if $s < t$, then $N(s) \le N(t)$;
(iii) for $s < t$, $N(t) - N(s)$ is equal to the number of events that have occurred in $(s, t]$.

A counting process is said to possess

(i) independent increments if the numbers of events which occur in disjoint (i.e. non-overlapping) time intervals are independent;
(ii) stationary or time-homogeneous increments if the probability distribution of the number of events in the interval $(u, u + t]$ ($u \ge 0$), i.e. of $N(u + t) - N(u)$, depends only on $t$, i.e. on the length of the time interval and not on its position on the time axis.

Let the r.v. $T_n$ denote the time at which the $n$th event after $t = 0$ occurs, i.e. $T_n = \inf\{t : N(t) = n\}$, $n \ge 1$, and let $T_0 = 0$. Note that $T_0 \le T_1 \le T_2 \le \dots$ and

$$N(t) = \max\{n : T_n \le t\}.$$

[Figure: a typical realization of a counting process: the step function $N(t)$ rises from 0 to 5 as $t$ passes the event times $t_1, t_2, t_3, t_4, t_5$.]

7.3 Poisson Process

7.3.1 Definition and distribution of N(t)

The counting process $\{N(t), t \ge 0\}$ is said to be a Poisson process having rate $\lambda$ if

(i) the process has independent increments;
(ii) for any short time interval $(t, t + \delta t]$, $P(1 \text{ event occurs in } (t, t + \delta t]) = \lambda\,\delta t + o(\delta t)$ (as $\delta t \to 0$), i.e. $P(N(t + \delta t) - N(t) = 1) = \lambda\,\delta t + o(\delta t)$ or, to put it another way, $P(N(t + \delta t) = n + 1 \mid N(t) = n) = \lambda\,\delta t + o(\delta t)$, $n \ge 0$;
(iii) $P(k \text{ events occur in } (t, t + \delta t]) = o(\delta t)$, $k \ge 2$, or, alternatively, $P(2 \text{ or more events occur in } (t, t + \delta t]) = o(\delta t)$,

each of which properties can be similarly expressed in terms of $N(t)$.
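As a sanity check on the definition, the interarrival-time characterisation derived in Section 7.3.2 below (independent exponential gaps) can be used to simulate a Poisson process and confirm that $E[N(t)] = \lambda t$. The following is an illustrative sketch only; the rate, horizon and function names are arbitrary choices, not from the notes.

```python
import random

def poisson_process_events(rate, horizon, rng):
    """Event times t_1 < t_2 < ... <= horizon, built from independent
    Exp(rate) gaps (the interarrival-time characterisation)."""
    events, t = [], 0.0
    while True:
        t += rng.expovariate(rate)  # next gap X_n ~ Exp(rate)
        if t > horizon:
            return events
        events.append(t)

lam, horizon, runs = 2.0, 10.0, 20000
rng = random.Random(42)
counts = [len(poisson_process_events(lam, horizon, rng)) for _ in range(runs)]
mean_N = sum(counts) / runs
print(abs(mean_N - lam * horizon) < 0.5)  # E[N(t)] = lam * t = 20
```

Averaging over many independent runs, the sample mean of $N(10)$ should be close to $\lambda t = 20$.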

Theorem. If $\{N(t), t \ge 0\}$ is a Poisson process, then, for each $t > 0$, the r.v. $N(t)$ has the Poisson distribution with parameter $\lambda t$, i.e.

$$P(N(t) = n) = \frac{(\lambda t)^n e^{-\lambda t}}{n!}, \qquad n = 0, 1, 2, \dots \tag{7.1}$$

Proof: For convenience write $p_n(t)$ for $P(N(t) = n)$. Since $N(0) = 0$, we have that

$$p_0(0) = 1, \qquad p_n(0) = 0 \text{ for } n > 0. \tag{7.2}$$

Now, invoking independence,

$$p_n(t + \delta t) = P(n \text{ events in } (0, t + \delta t]) = \sum_{k=0}^{n} P(n - k \text{ events in } (t, t + \delta t]) \cdot P(k \text{ events in } (0, t]).$$

But since

$$P(0 \text{ events in } (t, t + \delta t]) + P(1 \text{ event in } (t, t + \delta t]) + P(2 \text{ or more events in } (t, t + \delta t]) = 1,$$

we have that

$$P(0 \text{ events occur in } (t, t + \delta t]) = 1 - \lambda\,\delta t + o(\delta t).$$

Then

$$p_n(t + \delta t) = (1 - \lambda\,\delta t + o(\delta t))\,p_n(t) + (\lambda\,\delta t + o(\delta t))\,p_{n-1}(t) + o(\delta t), \qquad n \ge 1,$$

i.e. $p_n(t + \delta t) = (1 - \lambda\,\delta t)p_n(t) + \lambda\,\delta t\,p_{n-1}(t) + o(\delta t)$, $n \ge 1$. Also

$$p_0(t + \delta t) = p_0(t)(1 - \lambda\,\delta t + o(\delta t)) = (1 - \lambda\,\delta t)p_0(t) + o(\delta t).$$

So

$$\frac{p_n(t + \delta t) - p_n(t)}{\delta t} = \lambda p_{n-1}(t) - \lambda p_n(t) + \frac{o(\delta t)}{\delta t}, \ n \ge 1, \qquad \frac{p_0(t + \delta t) - p_0(t)}{\delta t} = -\lambda p_0(t) + \frac{o(\delta t)}{\delta t}.$$

Now let $\delta t \to 0$: we obtain

$$\frac{dp_n(t)}{dt} = \lambda p_{n-1}(t) - \lambda p_n(t), \ n \ge 1, \qquad \frac{dp_0(t)}{dt} = -\lambda p_0(t). \tag{7.3}$$

Using the initial conditions (7.2), this set of difference-differential equations can be solved successively for $n = 0, 1, 2, \dots$ to give $p_0(t) = e^{-\lambda t}$, $p_1(t) = \lambda t e^{-\lambda t}, \dots$ and hence by induction to give the general result (7.1) for $p_n(t)$.

Alternatively, we can use PGFs to solve (7.3). Let $G(s, t)$ be the PGF of $\{p_n(t), n \ge 0\}$, i.e.

$$G(s, t) = \sum_{n=0}^{\infty} p_n(t) s^n \tag{7.4}$$

(for fixed $t \ge 0$, $G(s, t)$ has the usual PGF properties). Multiply (7.3) by $s^n$ and sum over $n = 0, 1, 2, \dots$ to get

$$\frac{dp_0(t)}{dt} s^0 + \sum_{n=1}^{\infty} \frac{dp_n(t)}{dt} s^n = -\lambda p_0(t) s^0 + \sum_{n=1}^{\infty} \lambda \{p_{n-1}(t) - p_n(t)\} s^n$$

i.e.

$$\frac{\partial G(s, t)}{\partial t} = \sum_{n=0}^{\infty} \frac{dp_n(t)}{dt} s^n = -\lambda \sum_{n=0}^{\infty} p_n(t) s^n + \lambda s \sum_{n=1}^{\infty} p_{n-1}(t) s^{n-1} = -\lambda G(s, t) + \lambda s G(s, t).$$

Thus

$$\frac{1}{G}\frac{\partial G}{\partial t} = \lambda(s - 1), \quad \text{i.e.} \quad \frac{\partial}{\partial t}\{\log_e G(s, t)\} = \lambda(s - 1), \quad \text{i.e.} \quad \log G(s, t) = \lambda(s - 1)t + C(s),$$

where $C(s)$ is constant with respect to $t$. The initial conditions (7.2) give $G(s, 0) = 1$, so $0 = 0 + C(s)$, i.e. $C(s) = 0$. It follows that

$$G(s, t) = \exp\{\lambda t(s - 1)\}, \tag{7.5}$$

which is the PGF of the Poisson distribution with parameter $\lambda t$; hence the result.

Corollary.

$$P[N(u + t) - N(u) = n] = \frac{(\lambda t)^n e^{-\lambda t}}{n!}, \qquad n \ge 0, \text{ for } u \ge 0, \ t > 0, \tag{7.6}$$

i.e. the Poisson process has stationary increments (as we would expect from conditions (ii) and (iii)). The proof proceeds along the same lines (let $p_n(u, t)$ be the above probability).

7.3.2 Other associated distributions

The interarrival (or inter-event) times in a Poisson process are the r.v.s $X_1, X_2, \dots$ given by

$$X_n = T_n - T_{n-1}, \qquad n \ge 1. \tag{7.7}$$

Then

$$P(X_1 > x) = P(N(x) = 0) = e^{-\lambda x}, \qquad x \ge 0,$$

so the c.d.f. of $X_1$ is $1 - e^{-\lambda x}$, i.e. $X_1 \sim \exp(\lambda)$. Also

$$P(X_2 > x \mid X_1 = x_1) = P(\text{no arrival in } (x_1, x_1 + x] \mid X_1 = x_1).$$

In this conditional probability, the first event concerns arrivals in $(x_1, x_1 + x]$ and the second (conditioning) event concerns arrivals in $[0, x_1]$, so the events are independent. Hence

$$P(X_2 > x \mid X_1 = x_1) = P(\text{no arrival in } (x_1, x_1 + x]) = e^{-\lambda x}$$

(using the Corollary); i.e. $X_2$ is independent of $X_1$ and has the same distribution. Similarly

$$P(X_{n+1} > x \mid X_1 = x_1, \dots, X_n = x_n) = P(\text{no arrival in } (t, t + x]), \quad \text{where } t = x_1 + x_2 + \dots + x_n.$$

It follows by induction on $n$ that $X_1, X_2, X_3, \dots$ are independent exponential r.v.s, each with parameter $\lambda$.
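The difference-differential equations (7.3) can also be checked numerically: integrating them by a crude forward-Euler scheme from the initial conditions (7.2) should reproduce the Poisson probabilities (7.1). A minimal sketch (the step size, truncation level and parameter values are arbitrary choices):

```python
import math

# Forward-Euler integration of (7.3): dp_0/dt = -lam*p_0,
# dp_n/dt = lam*p_{n-1} - lam*p_n, truncated at n_max states.
lam, t_end, dt, n_max = 1.5, 2.0, 1e-4, 40
p = [1.0] + [0.0] * n_max            # initial conditions (7.2)
for _ in range(int(round(t_end / dt))):
    new = p[:]
    new[0] -= dt * lam * p[0]
    for n in range(1, n_max + 1):
        new[n] += dt * (lam * p[n - 1] - lam * p[n])
    p = new

# Closed form (7.1): Poisson probabilities with parameter lam * t_end
exact = [(lam * t_end) ** n * math.exp(-lam * t_end) / math.factorial(n)
         for n in range(n_max + 1)]
err = max(abs(a - b) for a, b in zip(p, exact))
print(err < 1e-3)
```

With this step size the Euler solution agrees with (7.1) to within about $10^{-3}$ in every component.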

The waiting time for the $r$th event is the r.v.

$$W_r = X_1 + X_2 + \dots + X_r, \tag{7.8}$$

and so $W_r \sim \text{Gamma}(r, \lambda)$.

An alternative derivation of this result is as follows. $W_r \le w \iff N(w) \ge r$. So

$$F_{W_r}(w) = P(W_r \le w) = P(N(w) \ge r) = \sum_{j=r}^{\infty} \frac{(\lambda w)^j e^{-\lambda w}}{j!}.$$

Differentiating to get the p.d.f., we again find that $W_r \sim \text{Gamma}(r, \lambda)$. More generally, this is also the distribution of the time between the $m$th and $(m + r)$th events.

7.4 Markov Processes

A discrete-state continuous-time stochastic process $\{X(t), t \ge 0\}$ is said to be a Markov process if

$$P(X(t_n) = j \mid X(t_0) = i_0, X(t_1) = i_1, \dots, X(t_{n-1}) = i) = P(X(t_n) = j \mid X(t_{n-1}) = i) \tag{7.9}$$

for all $t_0 < t_1 < \dots < t_n$ and all relevant values of $i, j, i_0, i_1, \dots$.

A Markov process is said to be time-homogeneous or stationary if

$$P(X(u + t) = j \mid X(u) = i) = P(X(t) = j \mid X(0) = i) = p_{ij}(t) \tag{7.10}$$

for all $i, j$ and $u > 0$, $t > 0$.

A stationary Markov process is completely described by its transition probability functions $\{p_{ij}(t)\}$ and its initial probability distribution. In modelling Markov processes it is generally assumed that

$$p_{ij}(\delta t) = \begin{cases} \lambda_{ij}\,\delta t + o(\delta t), & i \ne j \\ 1 + \lambda_{ii}\,\delta t + o(\delta t), & i = j, \end{cases} \tag{7.11}$$

where the constants $\lambda_{ij}$ are called transition rates and satisfy

$$\lambda_{ij} \ge 0 \ (i \ne j), \qquad \lambda_{ii} < 0, \qquad \sum_j \lambda_{ij} = 0 \ \text{for all } i \quad \text{(since } \textstyle\sum_j p_{ij}(\delta t) = 1\text{)}. \tag{7.12}$$

Example. The Poisson process with rate $\lambda$ is a simple example of a Markov process, with transition rates $\lambda_{n, n+1} = \lambda$, $\lambda_{n, n} = -\lambda$, $\lambda_{n, m} = 0$ for $m \ne n, n + 1$.

We do not pursue the theory of general Markov processes in this module: instead we study a particular Markov process from first principles.
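The tail-sum expression for $F_{W_r}(w)$ used in the alternative derivation of $W_r \sim \text{Gamma}(r, \lambda)$ above can be verified against a direct numerical integration of the Gamma$(r, \lambda)$ density. A small sketch (the particular values of $\lambda$, $r$ and $w$ are arbitrary):

```python
import math

lam, r, w = 2.0, 3, 1.7

# Tail sum: P(N(w) >= r) = sum_{j>=r} (lam*w)^j e^{-lam*w} / j!
tail = sum((lam * w) ** j * math.exp(-lam * w) / math.factorial(j)
           for j in range(r, 100))

# Gamma(r, lam) density, integrated over [0, w] by the trapezoid rule
def gamma_pdf(x):
    return lam ** r * x ** (r - 1) * math.exp(-lam * x) / math.factorial(r - 1)

steps = 200000
h = w / steps
cdf = h * (0.5 * gamma_pdf(0.0) + 0.5 * gamma_pdf(w)
           + sum(gamma_pdf(k * h) for k in range(1, steps)))

print(abs(tail - cdf) < 1e-6)  # F_{W_r}(w) computed two ways agrees
```

The two expressions for $F_{W_r}(w)$ agree to well within the quadrature error of the trapezoid rule.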

7.5 Birth-and-Death Process

Here $X(t)$ may be thought of as the size of a population at time $t$ (with possible values $0, 1, 2, \dots$), with births and deaths occurring from time to time in such a way that

$$p_{ij}(\delta t) = \begin{cases} \alpha_i\,\delta t + o(\delta t), & i \ge 0, \ j = i + 1 \\ \beta_i\,\delta t + o(\delta t), & i \ge 1, \ j = i - 1 \\ 1 - (\alpha_i + \beta_i)\,\delta t + o(\delta t), & i \ge 0, \ j = i \\ o(\delta t), & \text{otherwise.} \end{cases} \tag{7.13}$$

The process is conveniently represented on a transition rate diagram: states $0, 1, 2, \dots, n, n + 1, \dots$, with rightward arrows $0 \to 1 \to 2 \to \dots$ labelled $\alpha_0, \alpha_1, \alpha_2, \dots$ and leftward arrows $\dots \to 2 \to 1 \to 0$ labelled $\beta_1, \beta_2, \beta_3, \dots$. (The nodes represent possible states $X(t)$ and the arrows possible transitions to other states. Note that the transition rates are in general state-dependent.)

For convenience, we again write $p_n(t) = P(X(t) = n)$, $n = 0, 1, 2, \dots$. Then

$$P(X(t + \delta t) = n) = \sum_{i=0}^{\infty} P(X(t + \delta t) = n \mid X(t) = i) \cdot P(X(t) = i),$$

i.e.

$$p_n(t + \delta t) = \sum_{i=0}^{\infty} p_{in}(\delta t)\,p_i(t).$$

Inserting (7.13) gives

$$p_n(t + \delta t) = (1 - \alpha_n\,\delta t - \beta_n\,\delta t)p_n(t) + (\alpha_{n-1}\,\delta t)p_{n-1}(t) + (\beta_{n+1}\,\delta t)p_{n+1}(t) + o(\delta t), \quad n \ge 1,$$

and $p_0(t + \delta t) = (1 - \alpha_0\,\delta t)p_0(t) + (\beta_1\,\delta t)p_1(t) + o(\delta t)$. Then, letting $\delta t \to 0$ as before,

$$\frac{dp_n(t)}{dt} = -(\alpha_n + \beta_n)p_n(t) + \alpha_{n-1}p_{n-1}(t) + \beta_{n+1}p_{n+1}(t), \ n \ge 1, \qquad \frac{dp_0(t)}{dt} = -\alpha_0 p_0(t) + \beta_1 p_1(t). \tag{7.14}$$

We shall solve this set of difference-differential equations in a special case, again using the PGF.
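Equations (7.14) can also be integrated numerically. For the linear rates $\alpha_n = n\alpha$, $\beta_n = n\beta$ of the example that follows, multiplying (7.14) by $n$ and summing gives $dE\{X(t)\}/dt = (\alpha - \beta)E\{X(t)\}$, so $E\{X(t)\} = a e^{(\alpha - \beta)t}$. The sketch below checks a forward-Euler solution of (7.14) against this mean (parameter values, step size and truncation level are arbitrary choices):

```python
import math

# Forward-Euler solution of the birth-death equations (7.14) with the
# linear rates alpha_n = n*alpha, beta_n = n*beta, started at X(0) = a.
alpha, beta, a = 0.8, 1.0, 3
n_max, dt, t_end = 100, 2e-4, 2.0
p = [0.0] * (n_max + 1)
p[a] = 1.0
for _ in range(int(round(t_end / dt))):
    new = p[:]
    for n in range(n_max + 1):
        new[n] -= dt * n * (alpha + beta) * p[n]       # leave state n
        if n >= 1:
            new[n] += dt * (n - 1) * alpha * p[n - 1]  # birth from n-1
        if n < n_max:
            new[n] += dt * (n + 1) * beta * p[n + 1]   # death from n+1
    p = new

mean = sum(n * q for n, q in enumerate(p))
print(abs(mean - a * math.exp((alpha - beta) * t_end)) < 1e-2)
```

Truncating at a generous $n_{\max}$ is harmless here because, with $\alpha < \beta$, the probability mass near the truncation boundary is negligible.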

Example: Linear birth-and-death process. Suppose that

$$\alpha_n = n\alpha, \ n \ge 0; \qquad \beta_n = n\beta, \ n \ge 1. \tag{7.15}$$

(Note: since $\alpha_0 = 0$, $X(t) = 0$ is absorbing.) Suppose that initially $X(0) = a$ ($> 0$), i.e. $p_a(0) = 1$, $p_m(0) = 0$ for $m \ne a$. The PGF of $X(t)$ is

$$G(s, t) = \sum_{n=0}^{\infty} p_n(t) s^n, \tag{7.16}$$

where $G(s, 0) = s^a$. Multiply (7.14) by $s^n$ and sum over $n$ to get

$$\frac{\partial G(s, t)}{\partial t} = [-(\alpha + \beta)s + \alpha s^2 + \beta] \sum_{n=1}^{\infty} n p_n(t) s^{n-1} = (\alpha s - \beta)(s - 1)\frac{\partial G(s, t)}{\partial s}.$$

It can be verified that the solution satisfying the initial condition $G(s, 0) = s^a$ is

$$G(s, t) = \begin{cases} \left[\dfrac{\beta(s - 1) - (\alpha s - \beta)e^{(\beta - \alpha)t}}{\alpha(s - 1) - (\alpha s - \beta)e^{(\beta - \alpha)t}}\right]^a, & \text{if } \alpha \ne \beta \\[2ex] \left[\dfrac{s - \alpha t(s - 1)}{1 - \alpha t(s - 1)}\right]^a, & \text{if } \alpha = \beta. \end{cases} \tag{7.17}$$

Then $p_n(t)$ is the coefficient of $s^n$ when $G(s, t)$ is expanded as a power series in $s$. The probability of extinction by time $t$ is

$$p_0(t) = G(0, t) = \begin{cases} \left[\dfrac{\beta - \beta e^{(\beta - \alpha)t}}{\alpha - \beta e^{(\beta - \alpha)t}}\right]^a, & \text{if } \alpha \ne \beta \\[2ex] \left[\dfrac{\alpha t}{\alpha t + 1}\right]^a, & \text{if } \alpha = \beta. \end{cases} \tag{7.18}$$

Thus

$$p_0(t) \to \begin{cases} 1, & \text{if } \alpha \le \beta \\ (\beta/\alpha)^a, & \text{if } \alpha > \beta \end{cases} \quad \text{as } t \to \infty, \tag{7.19}$$

i.e. extinction is certain if and only if $\alpha \le \beta$. The mean population size is

$$E\{X(t)\} = \left.\frac{\partial G(s, t)}{\partial s}\right|_{s=1} = \begin{cases} a e^{(\alpha - \beta)t}, & \text{if } \alpha \ne \beta \\ a, & \text{if } \alpha = \beta. \end{cases} \tag{7.20}$$

So, as $t \to \infty$,

$$E\{X(t)\} \to \begin{cases} 0, & \text{if } \alpha < \beta \\ a, & \text{if } \alpha = \beta \\ \infty, & \text{if } \alpha > \beta. \end{cases} \tag{7.21}$$
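The extinction probability (7.18) can be checked by direct (Gillespie-style) simulation of the linear birth-and-death process: in state $n$ the holding time is exponential with rate $n(\alpha + \beta)$, and the next jump is a birth with probability $\alpha/(\alpha + \beta)$. A sketch under arbitrary parameter choices (here $\alpha < \beta$, so extinction is eventually certain):

```python
import math
import random

def extinct_by(alpha, beta, a, t_end, rng):
    """One Gillespie run of the linear birth-death process; returns True
    if state 0 (absorbing) is reached by time t_end."""
    n, t = a, 0.0
    while n > 0:
        t += rng.expovariate(n * (alpha + beta))  # holding time in state n
        if t > t_end:
            return False
        if rng.random() < alpha / (alpha + beta):
            n += 1  # birth
        else:
            n -= 1  # death
    return True

alpha, beta, a, t_end, runs = 1.0, 2.0, 2, 3.0, 40000
rng = random.Random(7)
sim = sum(extinct_by(alpha, beta, a, t_end, rng) for _ in range(runs)) / runs

e = math.exp((beta - alpha) * t_end)
exact = ((beta - beta * e) / (alpha - beta * e)) ** a  # formula (7.18)
print(abs(sim - exact) < 0.01)
```

The empirical fraction of runs extinct by $t = 3$ matches (7.18) to within Monte Carlo error.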

7.6 Steady-State Distribution

Often it is sufficient to study a Markov process in the limit $t \to \infty$. Under certain conditions

$$p_n(t) \to \pi_n \quad \text{as } t \to \infty, \tag{7.22}$$

where $\{\pi_n\}$ is the equilibrium or steady-state distribution, obtained by setting $dp_n/dt = 0$, $p_n(t) = \pi_n$ for all $n$ in the appropriate differential equations. So for the birth-and-death process we have

$$0 = -(\alpha_n + \beta_n)\pi_n + \alpha_{n-1}\pi_{n-1} + \beta_{n+1}\pi_{n+1}, \ n \ge 1, \qquad 0 = -\alpha_0\pi_0 + \beta_1\pi_1,$$

or

$$(\alpha_n + \beta_n)\pi_n = \alpha_{n-1}\pi_{n-1} + \beta_{n+1}\pi_{n+1}, \qquad n \ge 0, \tag{7.23}$$

where we have defined $\alpha_{-1} = \beta_0 = 0$ for mathematical convenience. Now sum these equations from $n = 0$ to $n = m$, where $m \ge 0$: we obtain

$$\sum_{n=0}^{m} (\alpha_n + \beta_n)\pi_n = \sum_{n=0}^{m} \alpha_{n-1}\pi_{n-1} + \sum_{n=0}^{m} \beta_{n+1}\pi_{n+1} = \sum_{n=0}^{m-1} \alpha_n\pi_n + \sum_{n=1}^{m+1} \beta_n\pi_n,$$

from which we deduce that $\alpha_m\pi_m = \beta_{m+1}\pi_{m+1}$, $m \ge 0$, i.e.

$$\pi_m = \frac{\alpha_0\alpha_1 \dots \alpha_{m-1}}{\beta_1\beta_2 \dots \beta_m}\,\pi_0, \qquad m \ge 1. \tag{7.24}$$

The normalisation requirement

$$\sum_{m=0}^{\infty} \pi_m = 1 \tag{7.25}$$

yields

$$\pi_0 = S^{-1}, \tag{7.26}$$

where

$$S = 1 + \frac{\alpha_0}{\beta_1} + \frac{\alpha_0\alpha_1}{\beta_1\beta_2} + \dots \tag{7.27}$$

The steady-state distribution (if it exists) can be derived directly, without appealing to the differential equations. It can be shown that when a general Markov process has a steady-state distribution $(\pi_0, \pi_1, \dots) = \boldsymbol{\pi}$, it satisfies the equations

$$\boldsymbol{\pi} Q = \mathbf{0}, \tag{7.28}$$

where $Q = (\lambda_{ij})$ and $\mathbf{0} = (0, 0, \dots)$, together with the normalisation condition $\sum_i \pi_i = 1$. For a birth-and-death process

$$Q = \begin{pmatrix} -\alpha_0 & \alpha_0 & 0 & 0 & \cdots \\ \beta_1 & -(\alpha_1 + \beta_1) & \alpha_1 & 0 & \cdots \\ 0 & \beta_2 & -(\alpha_2 + \beta_2) & \alpha_2 & \cdots \\ 0 & 0 & \beta_3 & -(\alpha_3 + \beta_3) & \cdots \\ \vdots & \vdots & \vdots & \vdots & \ddots \end{pmatrix}$$

and substitution in (7.28) yields the equations (7.23) derived above.
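For a finite (truncated) birth-and-death chain, the product formula (7.24)-(7.27) and the direct linear system $\boldsymbol{\pi} Q = \mathbf{0}$ of (7.28) must give the same answer, which makes a convenient numerical cross-check. The rates below are arbitrary illustrative values:

```python
# Steady state of a finite birth-and-death chain computed two ways:
# the product formula (7.24)-(7.27) and the linear system pi Q = 0 (7.28).
alphas = [1.0, 2.0, 1.5, 0.5]   # alpha_0 .. alpha_3 (capacity 4, alpha_4 = 0)
betas = [2.0, 1.0, 2.5, 3.0]    # beta_1 .. beta_4
N = len(alphas)                 # states 0 .. N

# Product formula: pi_m proportional to alpha_0...alpha_{m-1} / beta_1...beta_m
w = [1.0]
for m in range(N):
    w.append(w[-1] * alphas[m] / betas[m])
S = sum(w)
pi_formula = [x / S for x in w]

# Generator matrix Q = (lambda_ij): rows sum to zero
Q = [[0.0] * (N + 1) for _ in range(N + 1)]
for i in range(N):
    Q[i][i + 1] = alphas[i]
    Q[i + 1][i] = betas[i]
for i in range(N + 1):
    Q[i][i] = -sum(Q[i])

# Solve pi Q = 0 with sum(pi) = 1: transpose Q, replace the (redundant)
# last balance equation by normalisation, then Gauss-Jordan eliminate.
A = [[Q[j][i] for j in range(N + 1)] for i in range(N + 1)]
b = [0.0] * N + [1.0]
A[N] = [1.0] * (N + 1)
for col in range(N + 1):
    piv = max(range(col, N + 1), key=lambda r: abs(A[r][col]))
    A[col], A[piv] = A[piv], A[col]
    b[col], b[piv] = b[piv], b[col]
    for r in range(N + 1):
        if r != col:
            f = A[r][col] / A[col][col]
            A[r] = [x - f * y for x, y in zip(A[r], A[col])]
            b[r] -= f * b[col]
pi_solve = [b[i] / A[i][i] for i in range(N + 1)]

diff = max(abs(x - y) for x, y in zip(pi_formula, pi_solve))
print(diff < 1e-8)
```

Replacing one balance equation by the normalisation condition is legitimate because the columns of $Q$ sum to zero, so the balance equations are linearly dependent.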

7.7 Simple Queueing Systems

7.7.1 Introduction

Queueing is a common human activity, and hence it is convenient to use human terminology in discussing queueing systems (although such a system might involve machines waiting for service, messages arriving at a switchboard, etc.). Customers arriving at a service point require service from servers (channels, counters), but are not always able to get it immediately. A queueing system is specified by:

(i) The arrival pattern. This is generally described by specifying the probability distribution of the inter-arrival times; these are usually considered to be independent, but may depend e.g. on the queue size.

(ii) The queue discipline. This determines the order in which customers are served: many everyday queues are first-in, first-out (FIFO). Some queues operate on a priority basis (e.g. in an accident and emergency department of a hospital), and sometimes the system has limited capacity.

(iii) The service mechanism. The service time (nearly always considered to be independent of the arrival pattern) is usually modelled by some continuous distribution. There may be several servers, or waiting customers may be served in batches (e.g. a queue for a lift or bus).

For relatively simple systems a queue is described by the notation A/B/s (due to Kendall), where A is the inter-arrival time distribution, B the service time distribution, and s the number of servers. Examples:

(i) The D/M/1 queue has a constant inter-arrival time (D for deterministic) and an exponential service time distribution (M for Markov), with 1 server.

(ii) The M/M/2 queue has an exponential inter-arrival time distribution (or equivalently the arrival pattern is a Poisson process) and a (different) exponential service time distribution, with 2 servers.
Amongst the aspects of interest are the probability distribution of the number of customers in the queue; the distribution of a customer's waiting time (queueing time + service time); and the busy (or idle) time of the servers. Much of the simple theory has been developed under the assumption that the queueing system is in the equilibrium or steady state, and we shall confine our discussion to this case.

7.7.2 Queues as birth-and-death processes

Consider a queueing system in which, if the current queue size (that is, the number of customers waiting or being served) is $n$, then

(i) the time to the next arrival is exponentially distributed with parameter $\lambda_n$;
(ii) the time to the next departure (upon completion of service) is exponentially distributed with parameter $\mu_n$.

Then the system can be modelled as a birth-and-death process with $X(t)$ = number of customers in the system at time $t$, and $\alpha_n = \lambda_n$, $\beta_n = \mu_n$. If a steady-state distribution $\{\pi_n\}$ exists, then by (7.24) and (7.26)

$$\pi_n = \frac{\lambda_0\lambda_1 \dots \lambda_{n-1}}{\mu_1\mu_2 \dots \mu_n}\,\pi_0, \quad n \ge 1; \qquad \pi_0 = S^{-1}, \quad S = 1 + \frac{\lambda_0}{\mu_1} + \frac{\lambda_0\lambda_1}{\mu_1\mu_2} + \dots$$

7.7.3 Specific Queues

(i) M/M/1 queue ("simple queue"). In this case the arrival and service rates are constant:

$$\lambda_n = \lambda, \ n \ge 0; \qquad \mu_n = \mu, \ n \ge 1. \tag{7.29}$$

All inter-arrival times have the same negative exponential distribution, with parameter $\lambda$, i.e. arrivals follow a Poisson process with parameter $\lambda$. There is one server, whose service time is negative exponential with mean $\mu^{-1}$, i.e. service completions follow a Poisson process with parameter $\mu$. Let

$$\rho = \frac{\lambda}{\mu} = \frac{\text{mean service time}}{\text{mean inter-arrival time}}$$

(called the traffic intensity). Then

$$\pi_n = \left(\frac{\lambda}{\mu}\right)^n \pi_0 = \rho^n \pi_0, \qquad \pi_0 = S^{-1}, \quad S = 1 + \rho + \rho^2 + \dots = \frac{1}{1 - \rho},$$

provided $0 \le \rho < 1$. Thus $0 \le \rho < 1$ is the condition for the existence of a steady-state distribution, and then

$$\pi_n = (1 - \rho)\rho^n, \qquad n \ge 0, \tag{7.30}$$

a modified geometric distribution with mean $\rho/(1 - \rho)$ (being the average queue size).
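The M/M/1 mean queue size $\rho/(1 - \rho)$ can be checked by discrete-event simulation, exploiting the memoryless property of the exponential distribution to schedule only the next arrival and the next departure. A sketch (the rates, horizon and seed are arbitrary; the long-run time average of the queue length should be close to $\rho/(1 - \rho)$):

```python
import random

def mm1_time_average(lam, mu, horizon, rng):
    """Event-driven M/M/1 simulation; returns the time-average number
    in system over [0, horizon]. Memorylessness lets us schedule only
    the next arrival and the next departure."""
    t, n, area = 0.0, 0, 0.0
    next_arr = rng.expovariate(lam)
    next_dep = float('inf')  # no customer in service yet
    while t < horizon:
        t_next = min(next_arr, next_dep, horizon)
        area += n * (t_next - t)
        t = t_next
        if t >= horizon:
            break
        if next_arr <= next_dep:         # arrival
            n += 1
            next_arr = t + rng.expovariate(lam)
            if n == 1:
                next_dep = t + rng.expovariate(mu)
        else:                            # departure
            n -= 1
            next_dep = t + rng.expovariate(mu) if n > 0 else float('inf')
    return area / horizon

lam, mu = 0.7, 1.0
rho = lam / mu
sim = mm1_time_average(lam, mu, 500000, random.Random(1))
print(abs(sim - rho / (1 - rho)) < 0.15)  # mean queue size rho/(1-rho)
```

With $\rho = 0.7$ the time average settles near $0.7/0.3 \approx 2.33$; the tolerance is loose because the queue-length process is strongly autocorrelated.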

(ii) Queue with discouragement. This is the situation where the arrival rate decreases as $n$, the number of customers in the system, increases. Suppose for example that

$$\lambda_n = \frac{\lambda}{n + 1}, \ n \ge 0; \qquad \mu_n = \mu, \ n \ge 1. \tag{7.31}$$

Then

$$\pi_n = \frac{\rho^n}{n!}e^{-\rho}, \quad n \ge 0, \qquad \rho = \lambda/\mu. \tag{7.32}$$

Note that in this case the steady-state distribution exists for all $\rho \ge 0$.

(iii) Queue with limited capacity. This can be viewed as a special case of the previous situation: now $\lambda_n$ falls abruptly to zero at some critical value of $n$ and above. For example, suppose that we have a simple queue with the additional restriction that there is room in the system for at most $N$ customers. Then we set

$$\lambda_n = \begin{cases} \lambda, & n < N \\ 0, & n \ge N. \end{cases} \tag{7.33}$$

(iv) M/M/s queue. In this case there are $s$ independent servers; suppose that each serves customers at rate $\mu$. Then we set

$$\mu_n = \begin{cases} n\mu, & n < s \\ s\mu, & n \ge s. \end{cases} \tag{7.34}$$

For, so long as all the servers are occupied ($n \ge s$), the service completions comprise $s$ independent Poisson processes and therefore together follow a single Poisson process with rate $s\mu$; otherwise we have $n$ independent Poisson processes, equivalent to a single Poisson process with rate $n\mu$.

(v) M/M/k queue with capacity k. An example of such a system is a telephone exchange with $k$ lines and a "blocked calls cleared" protocol, i.e. any call which arrives to find all lines being utilised is lost to the system. In this case we take

$$\lambda_n = \begin{cases} \lambda, & n < k \\ 0, & n \ge k, \end{cases} \qquad \mu_n = \begin{cases} n\mu, & 1 \le n \le k \\ 0, & n > k. \end{cases} \tag{7.35}$$

(vi) Queue with ample independent servers. "Ample" means that a server is always available to serve a customer immediately upon arrival, so that there are no waiting customers at all! Then, in the simplest case, we have

$$\lambda_n = \lambda, \ n \ge 0; \qquad \mu_n = n\mu, \ n \ge 1. \tag{7.36}$$

For further analysis of some of the above cases, see Homework 9.

*****
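For the queue with discouragement, the general product formula (7.24) with $\lambda_n = \lambda/(n + 1)$ and $\mu_n = \mu$ should reproduce the Poisson form of $\pi_n$ directly. A quick numerical confirmation (the rates and truncation level are arbitrary):

```python
import math

lam, mu = 3.0, 2.0
rho = lam / mu
N = 60  # truncation level; the tail beyond this is negligible

# Product formula (7.24) with lambda_n = lam/(n+1), mu_n = mu
w = [1.0]
for n in range(N):
    w.append(w[-1] * (lam / (n + 1)) / mu)
S = sum(w)
pi = [x / S for x in w]

# Closed form: pi_n = rho^n e^{-rho} / n!
closed = [rho ** n * math.exp(-rho) / math.factorial(n) for n in range(N + 1)]
diff = max(abs(a - b) for a, b in zip(pi, closed))
print(diff < 1e-12)
```

The products telescope to $\rho^n/n!$, so $S$ converges to $e^{\rho}$ and the Poisson distribution drops out, confirming that a steady state exists for every $\rho \ge 0$.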