QUEUING MODELS AND MARKOV PROCESSES

Queues form when customer demand for a service cannot be met immediately. They occur because of fluctuations in demand, so models of queuing are intrinsically stochastic.

Some definitions

The number of servers is s.

The mean arrival rate (number of customers arriving per unit of time) is λ. Arrivals are assumed to follow a Poisson distribution,

P_n = λ^n e^{-λ} / n!

where P_n is the probability of n arrivals in the time period.

The mean service rate (number of customers served per unit of time per server) is µ. Service times are assumed to follow an exponential distribution,

P(t) = 1 - e^{-µt}

where P(t) is the probability of a customer being served by time t.

We assume the arrival and service processes λ and µ are independent, and we require the condition s µ > λ; otherwise the queue will grow indefinitely. The average service time is given by 1 / µ.
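As a quick illustration of these two distributions, here is a minimal Python sketch (standard library only) computing the probability of n arrivals in one period and the probability that a service completes by time t. The rates lam and mu in the example call are arbitrary illustrative values, not taken from the text.

```python
import math

def poisson_arrivals(n, lam):
    """Probability of n arrivals in one period when arrivals are Poisson with rate lam."""
    return lam ** n * math.exp(-lam) / math.factorial(n)

def served_by(t, mu):
    """Probability that an exponential service with rate mu completes by time t."""
    return 1.0 - math.exp(-mu * t)

if __name__ == "__main__":
    lam, mu = 4.0, 5.0                 # example arrival and service rates
    print(poisson_arrivals(2, lam))    # P(2 arrivals in one period)
    print(served_by(0.5, mu))          # P(service finished within 0.5 time units)
```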

Costs of Queuing

Waiting Cost (WC). This depends on the average time spent waiting in line:

WC = C_w L

where L is the average number of customers in the system (L falls as s rises, i.e. ∂L/∂s < 0) and C_w is the waiting cost per customer per unit of time.

Service Cost (SC). This is directly proportional to the number of servers s:

SC = C_s s

where C_s is the cost per server per unit of time.

Increasing the number of servers s decreases WC but increases SC.

Total Cost (TQC).

TQC = WC + SC

Total cost is nonlinear in s, with a minimum at s*, the optimal level of service, as sketched below.

[Figure: WC falls and SC rises with the level of service; their sum TQC is U-shaped with its minimum at s*.]
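A minimal sketch of this trade-off in Python is shown below. It assumes a helper avg_customers(lam, mu, s) that returns L for a queue with s servers (for example, the M/M/s formula given in the next section); the helper name and the search bound are illustrative assumptions, not from the text.

```python
import math

def total_cost(lam, mu, s, c_w, c_s, avg_customers):
    """TQC = WC + SC = C_w * L + C_s * s for a given number of servers s."""
    return c_w * avg_customers(lam, mu, s) + c_s * s

def optimal_servers(lam, mu, c_w, c_s, avg_customers, max_servers=50):
    """Search for s* by evaluating TQC over all stable server counts."""
    s_min = math.floor(lam / mu) + 1          # need s*mu > lam for stability
    costs = {s: total_cost(lam, mu, s, c_w, c_s, avg_customers)
             for s in range(s_min, max_servers + 1)}
    return min(costs, key=costs.get)
```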

A Simple Model with One Server

Consider the simplest model, with s = 1. Then the following results hold:

the probability of the system being busy: ρ = λ / µ
the probability of n customers: P_n = (1 - ρ) ρ^n
the average number of customers: L = λ / (µ - λ)
the average time spent in the system: W = L / λ
the average time spent queuing: W_q = λ / [µ (µ - λ)] = W - 1/µ

A general result for a queuing process in steady state is that L = λ W. This relationship is known as Little's law.

A Model with s Servers

Consider a model with s servers. Then the following results hold:

the probability of the system being busy: ρ = λ / (s µ)
the probability of 0 customers:

P_0 = 1 / [ Σ_{n=0}^{s-1} (λ/µ)^n / n!  +  (λ/µ)^s µ / ((s - 1)! (sµ - λ)) ]

the probability of n customers:

P_n = (λ/µ)^n / (s! s^{n-s}) P_0   for n > s
P_n = (λ/µ)^n / n! P_0             for 0 < n ≤ s

the average number of customers:

L = [ (λ/µ)^s λ µ / ((s - 1)! (sµ - λ)^2) ] P_0 + λ/µ

the average time spent in the system: W = L / λ
the average time spent queuing: W_q = W - 1/µ

As before, note that Little's law holds and L = λ W.
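The sketch below implements these formulas directly in Python (standard library only), under the M/M/1 and M/M/s assumptions stated above; the function names are illustrative.

```python
import math

def mm1_metrics(lam, mu):
    """M/M/1: utilisation rho, average number in system L, time in system W, time queuing Wq."""
    assert mu > lam, "need mu > lam for a stable queue"
    rho = lam / mu
    L = lam / (mu - lam)
    W = L / lam
    Wq = W - 1.0 / mu
    return rho, L, W, Wq

def mms_p0(lam, mu, s):
    """Probability that an M/M/s system is empty."""
    r = lam / mu
    total = sum(r ** n / math.factorial(n) for n in range(s))
    total += r ** s * mu / (math.factorial(s - 1) * (s * mu - lam))
    return 1.0 / total

def mms_metrics(lam, mu, s):
    """M/M/s: average number in system L, time in system W, time queuing Wq."""
    assert s * mu > lam, "need s*mu > lam for a stable queue"
    r = lam / mu
    p0 = mms_p0(lam, mu, s)
    L = (r ** s * lam * mu) / (math.factorial(s - 1) * (s * mu - lam) ** 2) * p0 + r
    W = L / lam
    Wq = W - 1.0 / mu
    return L, W, Wq
```

For example, with the illustrative rates λ = 4, µ = 5 and s = 2 servers, mms_metrics(4.0, 5.0, 2) gives roughly L ≈ 0.95, W ≈ 0.24 and W_q ≈ 0.04.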

MARKOV PROCESSES

Markov processes are used to describe a system moving over time between different states. Suppose that there are n states, S_1, ..., S_n, and let the probability of being in each state at time t be p_1(t), ..., p_n(t). These probabilities are assumed to change over time according to a simple stochastic process. They can be collected in a row vector p(t) = [ p_1(t) ... p_n(t) ].

Transitional Probability

The transitional probability P_ij(t) is the probability of moving from state i to state j at time t. It is the conditional probability

P_ij(t) = Prob( state S_j at time t+1 | state S_i at time t ).

A crucial assumption of the Markov model is that the transitional probabilities are independent of time, so that

P_ij = Prob( state S_j at time t+1 | state S_i at time t )   for all t.

The transitional probabilities can be represented in a transition matrix P of dimension n x n:

P = | P_11 ... P_1n |
    | ...  P_ij ... |
    | P_n1 ... P_nn |

The elements in each row of the matrix must sum to unity since, whatever state you are in at time t, you must end up in one of the n states in period t+1.
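A small sketch of how a transition matrix might be represented and validated in Python (using numpy) is given below; the two-state matrix is an arbitrary illustrative example, not from the text.

```python
import numpy as np

def validate_transition_matrix(P):
    """Check that P is square, non-negative, and that each row sums to one."""
    P = np.asarray(P, dtype=float)
    assert P.ndim == 2 and P.shape[0] == P.shape[1], "P must be square"
    assert (P >= 0).all(), "probabilities must be non-negative"
    assert np.allclose(P.sum(axis=1), 1.0), "each row must sum to one"
    return P

# Example: an arbitrary two-state chain (illustrative values only).
P = validate_transition_matrix([[0.9, 0.1],
                                [0.5, 0.5]])
```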

The First Order Markov Process

The simplest Markov process is the first order process, defined by the matrix equation

p(t+1) = p(t) P

or, equivalently, p_j(t+1) = Σ_i p_i(t) P_ij. The first order Markov process is called a zero memory process because the state next period depends only on the current state and not on any past states.

Higher Order Markov Processes

More generally, a k-th order Markov process can be defined by

p(t+1) = p(t) P_1 + p(t-1) P_2 + ... + p(t-k+1) P_k

where P_1, P_2, ..., P_k are n x n transition matrices. The k-th order Markov process depends on states as far as k-1 periods back.

Predicting Future States

Starting from an initial state p(0) at time t = 0 and assuming a first order Markov process, the state at time t = 1 is determined by the matrix equation p(1) = p(0) P. For period t = 2 we have p(2) = p(1) P = p(0) P^2, where P^2 = P P is the matrix product of P with itself. In general, by substitution, it can be seen that p(t) = p(0) P^t.
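The sketch below (Python with numpy) propagates a first order chain forward using p(t) = p(0) P^t; the initial distribution and step count are supplied by the caller.

```python
import numpy as np

def predict(p0, P, t):
    """State distribution after t steps of a first order chain: p(t) = p(0) P^t."""
    p0 = np.asarray(p0, dtype=float)
    P = np.asarray(P, dtype=float)
    return p0 @ np.linalg.matrix_power(P, t)
```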

The Steady State of a Markov Process

Most Markov processes eventually converge to a steady state. This is because, in general, the limit of P^t as t → ∞ is a fixed matrix. When this is true, after a certain time p(t) no longer changes, so that p(t) = p(t-1) = p*, and p* satisfies the equation

p* = p* P.

The steady state, if it exists, can be computed analytically by solving this system of n equations subject to the adding-up restriction that the elements of p* must sum to one, or formally, Σ_i p*_i = 1. This gives n+1 equations in n unknowns, but the system can be solved by dropping one of the equations.

Special case: Absorbing states

An absorbing state is a state from which, once the state is entered, there is no possibility of exit. An absorbing state is analogous to a black hole in astrophysics. In the transition matrix, an absorbing state has a row with a one on the diagonal and zeroes elsewhere. When a transition matrix has absorbing states, then in the limit everything ends up in these states (provided every other state can reach an absorbing state).
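A minimal sketch of the analytic solution in Python with numpy: it solves p* = p* P with one equation replaced by the adding-up restriction Σ_i p*_i = 1, exactly as described above.

```python
import numpy as np

def steady_state(P):
    """Solve p* = p* P together with sum(p*) = 1 by dropping one equation."""
    P = np.asarray(P, dtype=float)
    n = P.shape[0]
    # p* (P - I) = 0  is equivalent to  (P - I)^T p*^T = 0; replace the last
    # of these n equations with the adding-up restriction sum_i p*_i = 1.
    A = (P - np.eye(n)).T
    A[-1, :] = 1.0
    b = np.zeros(n)
    b[-1] = 1.0
    return np.linalg.solve(A, b)
```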

An Example

Suppose the transition matrix is given by

P = | 0.6 0.2 0.2 |
    | 0.2 0.7 0.1 |
    | 0.2 0.3 0.5 |

with initial state p(0) = [ 0.324 0.441 0.235 ]. Then

p(1) = p(0) P = [ 0.330 0.444 0.226 ]

and

p(2) = p(1) P = p(0) P^2 = [ 0.332 0.445 0.224 ].

Continuing to project forwards,

p(11) = p(0) P^11 = [ 0.333 0.444 0.222 ]
p(12) = p(0) P^12 = [ 0.333 0.444 0.222 ]

at which point the process has converged to a steady state (to 3 decimal places), so that

p(t > 12) = p* = [ 0.333 0.444 0.222 ].
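As a check, the numbers in this example can be reproduced with the predict and steady_state sketches given earlier (assuming those helpers are in scope); this snippet is illustrative and not part of the original text.

```python
import numpy as np
# predict and steady_state as defined in the sketches above

P = np.array([[0.6, 0.2, 0.2],
              [0.2, 0.7, 0.1],
              [0.2, 0.3, 0.5]])
p0 = np.array([0.324, 0.441, 0.235])

print(np.round(predict(p0, P, 1), 3))   # -> [0.33  0.444 0.226]
print(np.round(predict(p0, P, 12), 3))  # -> [0.333 0.444 0.222]
print(np.round(steady_state(P), 3))     # -> [0.333 0.444 0.222]
```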