Solutions to Homework Discrete Stochastic Processes MIT, Spring 2011


Exercise 6.5: Consider the Markov process illustrated below. The transitions are labelled by the rate q_{ij} at which those transitions occur. The process can be viewed as a single-server queue where arrivals become increasingly discouraged as the queue lengthens. The word time-average below refers to the limiting time-average over each sample path of the process, except for a set of sample paths of probability 0.

[Figure: birth-death process on states 0, 1, 2, 3, 4, ... with upward rates λ, λ/2, λ/3, λ/4, λ/5, ... and downward rate µ on every downward transition.]

Part a) Find the time-average fraction of time p_i spent in each state i > 0 in terms of p_0 and then solve for p_0. Hint: First find an equation relating p_i to p_{i+1} for each i. It also may help to recall the power series expansion of e^x.

Solution: From equation (6.36) we know that

    p_i λ/(i+1) = p_{i+1} µ,   for i ≥ 0.

Iterating over i gives

    p_{i+1} = p_i λ/(µ(i+1)) = p_0 (λ/µ)^{i+1}/(i+1)!,   for i ≥ 0.

Normalizing,

    1 = Σ_{i≥0} p_i = p_0 [1 + Σ_{i≥1} (λ/µ)^i / i!] = p_0 e^{λ/µ},

where the last step uses the Taylor expansion of e^x. Thus

    p_0 = e^{-λ/µ},   p_i = (λ/µ)^i e^{-λ/µ} / i!,   for i ≥ 1.
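The steady-state fractions in part (a) form a Poisson distribution with parameter ρ = λ/µ, which is easy to confirm numerically. The following sketch is not part of the original solution; the rates lam, mu and the truncation level N are arbitrary choices made only for the check.

```python
import math

# Hypothetical rates chosen only for the numerical check.
lam, mu = 2.0, 3.0
rho = lam / mu
N = 60  # truncation level; the tail beyond N is negligible for these rates

# Build p_i from the recursion p_{i+1} = p_i * lam / (mu * (i + 1)), starting from p_0 = 1,
# then normalize so the probabilities sum to 1.
unnormalized = [1.0]
for i in range(N):
    unnormalized.append(unnormalized[-1] * lam / (mu * (i + 1)))
total = sum(unnormalized)
p = [x / total for x in unnormalized]

# Compare p_0 with e^{-rho} and p_i with the Poisson(rho) pmf.
assert abs(p[0] - math.exp(-rho)) < 1e-12
for i in range(N):
    poisson_i = rho**i * math.exp(-rho) / math.factorial(i)
    assert abs(p[i] - poisson_i) < 1e-12
print("p_0 =", p[0], "  e^{-rho} =", math.exp(-rho))
```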

Part b) Find a closed-form solution to Σ_i p_i ν_i, where ν_i is the departure rate from state i. Show that the process is positive recurrent for all choices of λ > 0 and µ > 0 and explain intuitively why this must be so.

Solution: We observe that ν_0 = λ and ν_i = µ + λ/(i+1) for i ≥ 1. Then

    Σ_i p_i ν_i = λ e^{-λ/µ} + Σ_{i≥1} [µ + λ/(i+1)] (λ/µ)^i e^{-λ/µ} / i!
                = λ e^{-λ/µ} + µ e^{-λ/µ} Σ_{i≥1} (λ/µ)^i / i! + µ e^{-λ/µ} Σ_{i≥1} (λ/µ)^{i+1} / (i+1)!
                = λ e^{-λ/µ} + µ e^{-λ/µ} (e^{λ/µ} - 1) + µ e^{-λ/µ} (e^{λ/µ} - 1 - λ/µ)
                = 2µ (1 - e^{-λ/µ}).

The value of Σ_i p_i ν_i is finite for all values of λ > 0 and µ > 0, so this process is positive recurrent for all choices of the transition rates λ and µ.

We saw that p_{i+1} = p_i λ/(µ(i+1)), so p_i must decrease rapidly in i for sufficiently large i. Thus the fraction of time spent in very high numbered states must be negligible, which suggests that the steady-state equations for the p_i must have a solution. Since every ν_i is at most µ + λ, it is intuitively clear that Σ_i ν_i p_i is finite, so the embedded chain must be positive recurrent.
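The closed form in part (b) can be sanity-checked by truncating the sum. This sketch is not part of the original solution; the rates and the truncation level N are arbitrary choices for the check.

```python
import math

# Hypothetical rates for a quick numerical check of part (b).
lam, mu = 2.0, 3.0
rho = lam / mu
N = 80  # arbitrary truncation level; the Poisson tail beyond N is negligible

p = [rho**i * math.exp(-rho) / math.factorial(i) for i in range(N)]   # part (a)
nu = [lam] + [mu + lam / (i + 1) for i in range(1, N)]                # departure rates

lhs = sum(pi * nui for pi, nui in zip(p, nu))
rhs = 2 * mu * (1 - math.exp(-rho))
assert abs(lhs - rhs) < 1e-9
print("sum_i p_i nu_i =", lhs, "  2*mu*(1 - e^{-lam/mu}) =", rhs)
```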

Part c) For the embedded Markov chain corresponding to this process, find the steady-state probabilities π_i for each i ≥ 0 and the transition probabilities P_{ij} for each i, j.

Solution: Since, as shown in part (b), Σ_{i≥0} p_i ν_i = 1/(Σ_{i≥0} π_i/ν_i) < ∞, we know that for all j ≥ 0, π_j is proportional to p_j ν_j:

    π_j = p_j ν_j / Σ_{k≥0} p_k ν_k,   for j ≥ 0.

With ρ = λ/µ, this gives

    π_0 = λ e^{-λ/µ} / (2µ(1 - e^{-λ/µ})) = ρ / (2(e^ρ - 1)),

and for j ≥ 1,

    π_j = [µ + λ/(j+1)] (ρ^j/j!) e^{-ρ} / (2µ(1 - e^{-ρ}))
        = (ρ^j/j!) [1 + ρ/(j+1)] / (2(e^ρ - 1)).

The embedded Markov chain is a birth-death chain on the states 0, 1, 2, ... in which state 0 moves to state 1 with probability 1 and, for i ≥ 1, state i moves up with probability λ/(λ+(i+1)µ) and down with probability (i+1)µ/(λ+(i+1)µ). The transition probabilities are

    P_{0,1} = 1,
    P_{i,i-1} = (i+1)µ / (λ + (i+1)µ),   for i ≥ 1,
    P_{i,i+1} = λ / (λ + (i+1)µ),        for i ≥ 1.

Finding the steady-state distribution of this Markov chain gives the same result as found above.

Part d) For each i, find both the time-average interval and the time-average number of overall state transitions between successive visits to i.

Solution: Look at this process as a delayed renewal-reward process in which each entry to state i is a renewal and the inter-renewal intervals are independent. The reward is equal to 1 whenever the process is in state i. Given that transition n of the embedded chain enters state i, the interval U_n is exponential with rate ν_i, so E[U_n | X_n = i] = 1/ν_i. During this interval the reward is 1, and it is then 0 until the next renewal. The limiting time-average fraction of time spent in state i is p_i with probability 1, so in steady state p_i must equal the expected time spent in state i in one inter-renewal interval, 1/ν_i, divided by the expected inter-renewal interval W_i:

    p_i = (1/ν_i) / W_i,   so   W_i = 1 / (p_i ν_i).

Thus

    W_0 = e^ρ / λ,
    W_i = (i+1)! e^ρ / (µ ρ^i (i+1+ρ)),   for i ≥ 1.

Applying Theorem 5.1.4 to the embedded chain, the expected number of transitions from one visit to state i to the next is E[T_{ii}] = 1/π_i.
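Part (d) can also be checked by simulation. The following Monte Carlo sketch is not from the original solution; the rates, the target state i0, the seed, and the number of returns are all arbitrary choices. It simulates the discouraged-arrivals process and compares the empirical mean time between entries to state i0 with W_i = 1/(p_i ν_i).

```python
import math
import random

# Hypothetical parameters for a quick Monte Carlo check of part (d).
lam, mu = 2.0, 3.0
rho = lam / mu
i0 = 1                       # state whose mean return time we estimate
random.seed(1)

def step(state):
    """One transition of the process: return (holding time, next state)."""
    up = lam / (state + 1)                     # q_{i,i+1}
    down = mu if state > 0 else 0.0            # q_{i,i-1}
    nu = up + down
    hold = random.expovariate(nu)              # exponential holding time, rate nu_i
    nxt = state + 1 if random.random() < up / nu else state - 1
    return hold, nxt

# Accumulate the total time over many successive returns to state i0.
n_returns, total_time, state = 20000, 0.0, i0
returns = 0
while returns < n_returns:
    hold, state = step(state)
    total_time += hold
    if state == i0:
        returns += 1

W_estimate = total_time / n_returns
p_i0 = rho**i0 * math.exp(-rho) / math.factorial(i0)
nu_i0 = mu + lam / (i0 + 1)
W_theory = 1.0 / (p_i0 * nu_i0)
print("simulated mean return time:", W_estimate, "  theory 1/(p_i nu_i):", W_theory)
```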

Exercise 6.9: Let q_{i,i+1} = 2^{i-1} for all i ≥ 0 and let q_{i,i-1} = 2^{i-1} for all i ≥ 1. All other transition rates are 0.

Part a) Solve the steady-state equations and show that p_i = 2^{-i-1} for all i ≥ 0.

Solution: The defined Markov process is a birth-death process on the states 0, 1, 2, ... with both the upward and the downward rate from state i equal to 2^{i-1}. For each i ≥ 0, the steady-state equations reduce to

    p_i q_{i,i+1} = p_{i+1} q_{i+1,i},   i.e.,   p_i 2^{i-1} = p_{i+1} 2^i,

so p_{i+1} = p_i/2 and p_i = p_0 2^{-i} for i ≥ 1. Normalizing,

    1 = Σ_{j≥0} p_j = 2 p_0,

so p_0 = 1/2 and p_i = 2^{-i-1} for all i ≥ 0.

Part b) Find the transition probabilities for the embedded Markov chain and show that the chain is null recurrent.

Solution: Since q_{i,i+1} = q_{i,i-1} for every i ≥ 1, the embedded Markov chain is the birth-death chain with P_{0,1} = 1 and P_{i,i-1} = P_{i,i+1} = 1/2 for all i ≥ 1. Its steady-state probabilities would have to satisfy π_0 = (1/2)π_1 and (1/2)π_i = (1/2)π_{i+1} for i ≥ 1, so 2π_0 = π_1 = π_2 = π_3 = ⋯, which cannot be normalized. Since the chain is recurrent (it is a simple symmetric random walk reflected at 0), it is null recurrent, as essentially shown in Exercise 5.2.
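The balance equations in parts (a) and (b) can be verified mechanically. The sketch below is not part of the original solution; it uses the rates as written above, q_{i,i+1} = q_{i,i-1} = 2^{i-1}, and an arbitrary truncation level N.

```python
# Numerical check of Exercise 6.9 parts (a) and (b); N is a hypothetical truncation level.
N = 40
q_up = [2.0**(i - 1) for i in range(N)]                  # q_{i,i+1}
q_down = [None] + [2.0**(i - 1) for i in range(1, N)]    # q_{i,i-1}, undefined at i = 0
p = [2.0**(-i - 1) for i in range(N)]                    # claimed steady state

# Detailed balance p_i q_{i,i+1} = p_{i+1} q_{i+1,i} for every edge (i, i+1).
for i in range(N - 1):
    assert abs(p[i] * q_up[i] - p[i + 1] * q_down[i + 1]) < 1e-12

# The truncated probabilities sum to 1 up to the geometric tail 2^{-N}.
assert abs(sum(p) - 1.0) < 2.0**(-N + 1)

# Embedded-chain transition probabilities: 1 out of state 0, 1/2 up and down otherwise.
for i in range(1, N - 1):
    nu = q_up[i] + q_down[i]
    assert abs(q_up[i] / nu - 0.5) < 1e-12 and abs(q_down[i] / nu - 0.5) < 1e-12
print("steady-state and embedded-chain checks passed")
```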

Part c) For any state i, consider the renewal process for which the Markov process starts in state i and renewals occur on each transition to state i. Show that, for each i ≥ 1, the expected inter-renewal interval is equal to 2. Hint: Use renewal-reward theory.

Solution: As explained in Exercise 6.5 part (d), the expected inter-renewal interval between recurrences of state i, W_i, satisfies p_i = 1/(ν_i W_i). Hence, for i ≥ 1,

    W_i = 1/(ν_i p_i) = 1/(2^i · 2^{-(i+1)}) = 2,

where ν_i = q_{i,i+1} + q_{i,i-1} = 2^i.

Part d) Show that the expected number of transitions between each entry into state i is infinite. Explain why this does not mean that an infinite number of transitions can occur in a finite time.

Solution: We have seen in part b) that the embedded chain is null recurrent, which means precisely that the expected number of transitions between successive entries to state i is infinite. Recurrence, however, also means that, given X_0 = i, a return to i must happen in a finite number of transitions with probability 1 (i.e., lim_{n→∞} F_{ii}(n) = 1). We have seen many rv's that have an infinite expectation but, being rv's, have a finite sample value WP1.

Exercise 6.14: A small bookie shop has room for at most two customers. Potential customers arrive at a Poisson rate of 10 customers per hour; they enter if there is room and are turned away, never to return, otherwise. The bookie serves the admitted customers in order, requiring an exponentially distributed time of mean 4 minutes per customer.

Part a) Find the steady-state distribution of the number of customers in the shop.

Solution: The arrival rate of customers is 10 customers per hour and the service times are exponentially distributed with rate 15 customers per hour (equivalently, with mean 4 minutes per customer). The Markov process corresponding to this bookie shop is a three-state birth-death process on the states 0, 1, 2 with upward rate 10 and downward rate 15. To find the steady-state distribution we use the facts p_0 q_{0,1} = p_1 q_{1,0} and p_1 q_{1,2} = p_2 q_{2,1}, together with p_0 + p_1 + p_2 = 1:

    10 p_0 = 15 p_1,    10 p_1 = 15 p_2,    p_0 + p_1 + p_2 = 1.

Thus p_0 = 9/19, p_1 = 6/19, p_2 = 4/19.
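The steady-state vector in part (a) can be double-checked by solving the balance equations exactly. A minimal sketch (not part of the original solution) using the generator of the three-state process:

```python
from fractions import Fraction

# Generator of the three-state birth-death process: arrivals at rate 10/hr, service at 15/hr.
lam, mu = Fraction(10), Fraction(15)
Q = [[-lam,        lam,    0],
     [  mu, -(lam + mu),  lam],
     [   0,         mu,  -mu]]

# Solve p Q = 0 with p_0 + p_1 + p_2 = 1 using the birth-death balance equations
# lam * p_0 = mu * p_1 and lam * p_1 = mu * p_2.
p0 = Fraction(1) / (1 + lam/mu + (lam/mu)**2)
p = [p0, p0 * lam/mu, p0 * (lam/mu)**2]

assert p == [Fraction(9, 19), Fraction(6, 19), Fraction(4, 19)]
# Confirm p Q = 0 componentwise.
for j in range(3):
    assert sum(p[i] * Q[i][j] for i in range(3)) == 0
print("steady state:", p)
```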

Part b) Find the rate at which potential customers are turned away.

Solution: Customers are turned away only when the process is in state 2, and while the process is in state 2 they are turned away at rate λ = 10. So the overall rate at which customers are turned away is λ p_2 = 10 × 4/19 = 40/19 customers per hour.

Part c) Suppose the bookie hires an assistant; the bookie and assistant, working together, now serve each customer in an exponentially distributed time of mean 2 minutes, but there is only room for one customer (i.e., the customer being served) in the shop. Find the new rate at which customers are turned away.

Solution: The new Markov process is a two-state birth-death process on the states 0, 1 with arrival rate 10 and service rate 30 customers per hour. The new steady-state probabilities satisfy 10 p_0 = 30 p_1 and p_0 + p_1 = 1. Thus p_1 = 1/4 and p_0 = 3/4. Customers are turned away when the process is in state 1, at rate λ = 10, so the overall rate at which customers are turned away is λ p_1 = 10 × 1/4 = 5/2 = 2.5 customers per hour.
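A small numeric sketch (not part of the original solution) that re-derives the turn-away rates of parts (b) and (c) by treating each configuration as a finite-capacity M/M/1 queue; the helper loss_rate is an illustrative name introduced here.

```python
from fractions import Fraction

def loss_rate(lam, mu, rooms):
    """Turn-away rate for an M/M/1 queue with capacity `rooms`: lam times the blocking probability."""
    rho = Fraction(lam, mu)
    weights = [rho**n for n in range(rooms + 1)]   # unnormalized p_0, ..., p_rooms
    p_full = weights[-1] / sum(weights)
    return lam * p_full

# Original shop: 10 arrivals/hr, 15 served/hr, room for 2 customers.
print("without assistant:", loss_rate(10, 15, 2))   # 40/19 per hour
# With the assistant: 10 arrivals/hr, 30 served/hr, room for 1 customer.
print("with assistant:   ", loss_rate(10, 30, 1))   # 5/2 per hour
```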

Exercise 6.16: Consider the job-sharing computer system illustrated below. Incoming jobs arrive from the left in a Poisson stream of rate λ. Each job, independently of other jobs, requires pre-processing in system 1 with probability Q. Jobs in system 1 are served FCFS and the service times for successive jobs entering system 1 are IID with an exponential distribution of mean 1/µ_1. The jobs entering system 2 are also served FCFS and successive service times are IID with an exponential distribution of mean 1/µ_2. The service times in the two systems are independent of each other and of the arrival times. Assume that µ_1 > λQ and that µ_2 > λ. Assume that the combined system is in steady state.

[Figure: a Poisson arrival stream of rate λ; a fraction Q of the jobs passes through system 1 (service rate µ_1) and then enters system 2 (service rate µ_2); the remaining fraction 1-Q enters system 2 directly.]

Part a) Is the input to system 1 Poisson? Explain.

Solution: Yes. The incoming jobs from the left form a Poisson process. This process is split into two processes, each job independently requiring pre-processing in system 1 with probability Q. We know that if a Poisson process is split in this way, each of the resulting processes is also Poisson. So the stream of jobs entering system 1 is Poisson with rate λQ.

Part b) Are each of the two input processes coming into system 2 Poisson?

Solution: By Burke's theorem, the output process of an M/M/1 queue in steady state is a Poisson process with the same rate as the input process. So both streams entering system 2 are Poisson: the first (the output of system 1) has rate Qλ and the second (the jobs that skip pre-processing) has rate (1-Q)λ. The overall input to system 2 is the merge of these two streams, which is Poisson with rate λ, since the two streams are independent of each other.

Part d) Give the joint steady-state PMF of the number of jobs in the two systems. Explain briefly.

Solution: Let X_1(t) be the number of jobs in system 1 at time t and X_2(t) the number of jobs in system 2 at time t. The splitting of the input arrivals from the left produces two independent Poisson processes of rates Qλ and (1-Q)λ. The first process goes into system 1 and determines X_1(t). By Burke's theorem, the departures from system 1 up to time t are independent of X_1(t). Thus the input sequence of system 2 is independent of the state of system 1, and the two input processes of system 2 are independent of each other. Hence X_1(t) is the number of customers in an M/M/1 queue with input rate Qλ and service rate µ_1, X_2(t) is the number of customers in an M/M/1 queue with input rate λ and service rate µ_2, and X_1(t) is independent of X_2(t).

System 1 can be modeled as a birth-death process with q_{i,i+1} = Qλ and q_{i,i-1} = µ_1, so in steady state Pr{X_1 = i} = (1-ρ_1)ρ_1^i, where ρ_1 = Qλ/µ_1. The same is true for system 2: Pr{X_2 = j} = (1-ρ_2)ρ_2^j, where ρ_2 = λ/µ_2. By independence, the joint steady-state PMF is

    Pr{X_1 = i, X_2 = j} = (1-ρ_1)ρ_1^i (1-ρ_2)ρ_2^j,   for i, j ≥ 0.

The total number of customers in the combined system is X(t) = X_1(t) + X_2(t), where X_1(t) is independent of X_2(t).

Due to the independence of X_1 and X_2,

    Pr{X = k} = Pr{X_1 + X_2 = k}
              = Σ_{i=0}^{k} Pr{X_1 = i, X_2 = k-i}
              = Σ_{i=0}^{k} Pr{X_1 = i} Pr{X_2 = k-i}
              = Σ_{i=0}^{k} (1-ρ_1)ρ_1^i (1-ρ_2)ρ_2^{k-i}
              = (1-ρ_1)(1-ρ_2) ρ_2^k Σ_{i=0}^{k} (ρ_1/ρ_2)^i
              = (1-ρ_1)(1-ρ_2) (ρ_2^{k+1} - ρ_1^{k+1}) / (ρ_2 - ρ_1).

Part e) What is the probability that the first job to leave system 1 after time t is the same as the first job that entered the entire system after time t?

Solution: The first job that enters the entire system after time t is the same as the first job to leave system 1 after time t if and only if system 1 is empty at time t, i.e., X_1(t) = 0 (otherwise a job already in system 1 would leave system 1 first), and the first job to enter the whole system after t needs pre-processing and so is routed to system 1, which happens with probability Q. Since these two events are independent, the probability of the desired event is

    (1 - ρ_1) Q = (1 - Qλ/µ_1) Q.

Part f) What is the probability that the first job to leave system 2 after time t both passed through system 1 and arrived at system 1 after time t?

Solution: This is the event that both systems are empty at time t, the first arriving job is routed to system 1, and its service in system 1 finishes before the first job that does not need pre-processing enters system 2. These three events are independent of each other. The probability that both systems are empty at time t in steady state is Pr{X_1(t) = 0, X_2(t) = 0} = (1-ρ_1)(1-ρ_2). The probability that the first arriving job is routed to system 1 is Q. Let Y be the service time of this first job in system 1, which is exponentially distributed with rate µ_1, and let Z be the arrival time after t of the first job that does not need pre-processing, which is exponentially distributed with rate (1-Q)λ. The probability that the job finishes service in system 1 before the first job without pre-processing enters system 2 is

    Pr{Y < Z} = µ_1 / (µ_1 + (1-Q)λ).

Hence, the total probability of the described event is

    (1 - Qλ/µ_1)(1 - λ/µ_2) Q µ_1 / (µ_1 + (1-Q)λ).
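The closed form for Pr{X = k} in part (d) can be sanity-checked against the direct convolution of the two geometric marginals, and the expressions from parts (e) and (f) can be evaluated at the same point. This is a minimal sketch, not from the original solution; the values of λ, Q, µ_1, µ_2 are arbitrary choices satisfying µ_1 > λQ and µ_2 > λ.

```python
# Hypothetical parameters satisfying mu1 > lam*Q and mu2 > lam.
lam, Q, mu1, mu2 = 3.0, 0.4, 2.0, 4.0
rho1, rho2 = Q * lam / mu1, lam / mu2

def pmf_total_closed(k):
    """Closed form from part (d) for Pr{X_1 + X_2 = k} (assumes rho1 != rho2)."""
    return (1 - rho1) * (1 - rho2) * (rho2**(k + 1) - rho1**(k + 1)) / (rho2 - rho1)

def pmf_total_convolution(k):
    """Direct convolution of the two independent geometric marginals."""
    return sum((1 - rho1) * rho1**i * (1 - rho2) * rho2**(k - i) for i in range(k + 1))

for k in range(20):
    assert abs(pmf_total_closed(k) - pmf_total_convolution(k)) < 1e-12

# Part (e) and part (f) expressions, evaluated at the same parameters.
p_e = (1 - rho1) * Q
p_f = (1 - rho1) * (1 - rho2) * Q * mu1 / (mu1 + (1 - Q) * lam)
print("Pr from part (e):", p_e, "  Pr from part (f):", p_f)
```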

MIT OpenCourseWare, 6.262 Discrete Stochastic Processes, Spring 2011. For information about citing these materials or our Terms of Use, visit:
