IEOR 6711, HMWK 5, Professor Sigman

1. Semi-Markov processes: Consider an irreducible positive recurrent discrete-time Markov chain {X_n} with transition matrix P = (P_{i,j}), i, j ∈ S, and finite state space. Suppose that, just as for a CTMC, it is embedded into continuous time to obtain a continuous-time process {X(t) : t ≥ 0}, but the holding times H_i are positive rvs distributed as a general distribution F_i(x) = P(H_i ≤ x), x ≥ 0, with finite first moment 0 < E(H_i) = 1/a_i < ∞, i ∈ S. So, just as for a CTMC, the process makes transitions according to the Markov chain {X_n}, and when making a transition from i to j (probability P_{i,j}, independent of the past) the chain remains in state j, independent of the past, for an amount of time H_j ~ F_j; but because the holding times are not assumed exponential, the Markov property does not hold in continuous time for {X(t)}. Nonetheless, show that the limiting distribution

P_j = lim_{t→∞} (1/t) ∫_0^t I{X(s) = j} ds,  w.p.1,  j ∈ S,

still exists and is exactly the same as when the F_i are exponential at rate a_i, i ∈ S; the same as for the CTMC case:

P_j = (π_j/a_j) / ∑_i (π_i/a_i).    (1)

(Recall Proposition 1.3, Page 1 of your Lecture Notes on Continuous-Time Markov Chains.) Thus the limiting distribution of a semi-Markov process depends on the F_i only through their means.

SOLUTION: The proof is exactly that given in the Proposition 1.3 (Page 1) mentioned above; nowhere did that proof use the exponential distribution of the holding times, only that they have finite, non-zero first moments. The Renewal Reward Theorem still yields P_j = (1/a_j)/E(T_{j,j}), where T_{j,j} is the continuous-time return time back to state j given that initially X(0) = j with holding time H_j. Here we are also assuming a finite state space, so that E(T_{j,j}) < ∞ and the denominator in formula (1) is always finite.

2. Consider a stable FIFO M/M/1 queue, 0 < ρ < 1. Let X(t) denote the number in system at time t, and let P_n = (1 - ρ)ρ^n, n ≥ 0, denote the stationary distribution. Show (by direct calculation) that if X(0) ~ (P_n) (i.e., the chain is started off with its stationary distribution, hence is a stationary process), then the time until the first departure (after time t = 0), t^d_1, has an exponential distribution at rate λ.

SOLUTION: We condition on whether or not the server is busy at time t = 0. If busy (probability P(X(0) > 0) = ρ = 1 - P_0), then the next departure occurs after the (remaining) service completion of length S, which is exponential at rate µ by the memoryless property; t^d_1 = S ~ exp(µ). If not busy (probability P(X(0) = 0) = 1 - ρ = P_0), then first an arrival must come (remaining length T ~ exp(λ) by the memoryless property), and then this customer must enter service for an independent amount of time S ~ exp(µ); so in this case t^d_1 = T + S ~ exp(λ) * exp(µ) (a convolution). We can now compute the Laplace transform of t^d_1, yielding (after some algebra) that for any s ≥ 0,

E[e^{-s t^d_1}] = ρ · µ/(µ + s) + (1 - ρ) · λ/(λ + s) · µ/(µ + s) = λ/(λ + s),

and indeed t^d_1 ~ exp(λ).
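The conclusion of Problem 1 is easy to see numerically. Below is a minimal simulation sketch (not part of the original assignment) that runs a small semi-Markov process with non-exponential holding times and compares the observed long-run time fractions with formula (1); the 3-state chain, the uniform holding-time distributions, and all parameter values are illustrative assumptions.

    import numpy as np

    # Illustrative example (not from the assignment): 3-state embedded chain,
    # uniform (hence non-exponential) holding times with means 1/a_i.
    rng = np.random.default_rng(0)

    P = np.array([[0.0, 0.7, 0.3],
                  [0.5, 0.0, 0.5],
                  [0.2, 0.8, 0.0]])           # embedded transition matrix
    means = np.array([1.0, 0.5, 2.0])         # E[H_i] = 1/a_i

    # stationary distribution pi of the embedded chain (left eigenvector for eigenvalue 1)
    w, v = np.linalg.eig(P.T)
    pi = np.real(v[:, np.argmin(np.abs(w - 1.0))])
    pi /= pi.sum()

    formula = pi * means / np.sum(pi * means)  # formula (1): (pi_j/a_j)/sum_i(pi_i/a_i)

    # simulate: holding time in state i ~ Uniform(0, 2*means[i]) (mean means[i])
    t_total, occupation, state = 0.0, np.zeros(3), 0
    while t_total < 1e5:
        h = rng.uniform(0.0, 2.0 * means[state])
        occupation[state] += h
        t_total += h
        state = rng.choice(3, p=P[state])

    print("simulated time fractions:", np.round(occupation / t_total, 4))
    print("formula (1):             ", np.round(formula, 4))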

3. Consider the M/M/1 queue (arrival rate λ, service rate µ) with the following twist: each customer independently gets impatient after an amount of time that is exponentially distributed at rate γ while waiting in line (queue), and leaves before ever entering service, never to return. A customer who does enter service completes service (i.e., customers are only impatient while waiting in line, not while in service).

(a) You arrive finding exactly one customer in the system (hence they are in service) and you join the queue to wait. What is the probability that you will get served?

SOLUTION: Letting C denote your independent exp(γ) impatience time, independent of the exp(µ) remaining service time S, we want P(C > S) = µ/(µ + γ).

(b) You arrive finding exactly two (2) customers in the system (one in service, one in line) and you join the end of the queue to wait. What is the probability that you will get served?

SOLUTION: Let S denote the remaining service time of the customer in service (exp(µ) by the memoryless property), and let S_1 denote the service time of the customer waiting in front of you (let C_1 denote their exponential impatience time and C_2 yours). S, S_1, C_1, C_2 are independent exponential rvs. Let D denote the length of time until the server will be free to serve you: D = S + S_1 I{C_1 > S}. We need to compute P(D < C_2). Denote the desired probability by p. Observe that p = p_1 p_2, where p_1 is the probability that you move up into the next position (that of the customer in front of you; first in line), and p_2 is the (conditional) probability that you get served given that you do move into the next position. By the memoryless property of exponential distributions, p_2 is simply the same as the answer to (a): p_2 = µ/(µ + γ). Moreover, p_1 = P(C_2 > min{S, C_1}), because {C_2 > min{S, C_1}} is the event that you move into the next position: either the customer in service departs (S completes) before either of the two of you in line gets impatient, or the customer in front of you gets impatient before S is complete and before you get impatient. (Both scenarios move you to the next position, the head of the line.) Since min{S, C_1} ~ exp(µ + γ) is independent of C_2 ~ exp(γ), we get p_1 = (µ + γ)/(µ + 2γ). Thus

p = p_1 p_2 = µ/(µ + 2γ).

This method can be generalized (by induction) to compute the probability that you get served if upon arrival you find n ≥ 1 customers in front of you; the answer is

p = µ/(µ + nγ).

(c) Model this system as a Birth and Death process; give the birth and death rates λ_n, µ_n, n ≥ 0.

SOLUTION: X(t) = the number of customers in the system at time t forms a B&D process with λ_n = λ, n ≥ 0, and µ_n = (n - 1)γ + µ, n ≥ 1 (with µ_0 = 0).
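As a sanity check on the formula p = µ/(µ + nγ), here is a short simulation sketch (not part of the assignment). By memorylessness, the race can be simulated one position at a time using only the competing exponential rates; the parameter values are illustrative assumptions.

    import numpy as np

    # With k customers ahead of you (one in service, k-1 waiting), the next event is a
    # competition of independent exponentials: service completion (rate mu), an
    # abandonment ahead of you (rate (k-1)*gamma), or your own abandonment (rate gamma).
    # Either of the first two moves you up one position; the third means you never get served.
    rng = np.random.default_rng(1)
    mu, gamma, n, reps = 1.0, 0.7, 3, 200_000   # illustrative values

    served = 0
    for _ in range(reps):
        k = n
        while k > 0:
            total = mu + k * gamma               # = mu + (k-1)*gamma (ahead) + gamma (you)
            if rng.uniform(0.0, total) < gamma:  # you abandon first
                break
            k -= 1                               # service completed or someone ahead abandoned
        else:
            served += 1                          # reached the server, hence served

    print("simulated:", served / reps)
    print("formula  :", mu / (mu + n * gamma))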

(d) Set up the Birth and Death balance equations for the limiting probabilities P_n (but do not try to solve them).

SOLUTION:

λ P_n = (nγ + µ) P_{n+1},  n ≥ 0.

(e) Compute the ratio P_{n+1}/P_n and prove, using the ratio test from calculus, that the limiting probabilities exist for all values of λ > 0, µ > 0, γ > 0. Thus this chain is always positive recurrent (i.e., a condition such as ρ < 1 is not needed); explain intuitively why this should be so.

SOLUTION:

P_{n+1}/P_n = λ/(nγ + µ) → 0,  as n → ∞.

In particular, the ratio is eventually bounded above by a constant strictly less than 1; hence ∑_n P_n < ∞. Intuitively, the system is always stable because if the line gets too long, then more and more customers get impatient and leave, reducing the congestion.
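The recursion from (d) also gives a quick numerical illustration of (e): the limiting probabilities can be computed by truncation even when λ > µ. The sketch below is illustrative only; the parameter values and the truncation level are assumptions.

    import numpy as np

    # Balance equations: lambda * P_n = (n*gamma + mu) * P_{n+1}, i.e.
    # P_{n+1} = P_n * lambda / (n*gamma + mu).  The ratio tends to 0, so the
    # unnormalized terms are summable for ANY lambda, mu, gamma > 0.
    lam, mu, gamma = 3.0, 1.0, 0.5     # illustrative; note lam > mu, yet the chain is stable
    N = 200                            # truncation level (tail is negligible here)

    p = np.zeros(N)
    p[0] = 1.0
    for n in range(N - 1):
        p[n + 1] = p[n] * lam / (n * gamma + mu)
    p /= p.sum()                       # normalize to a probability distribution

    print("P_0..P_5:", np.round(p[:6], 4))
    print("mean number in system:", round(float(np.arange(N) @ p), 3))
    print("last retained term (should be ~0):", p[-1])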

4. Time-reversible CTMCs: Just as in discrete time, if we consider a positive recurrent CTMC in stationarity that has been started in the infinite past, {X(t) : -∞ < t < ∞}, then the (stationary) time-reversal X^{(r)}(t) = X(-t), t ≥ 0, is itself a CTMC. It has the same holding-time rates {a_i} and the same stationary distribution P = (P_j) as the original forward-time CTMC. The only things that can differ are the infinitesimal generators Q and Q^{(r)} (equivalently, the embedded-chain transition matrices). A positive recurrent CTMC is called time-reversible if the time-reversed process {X^{(r)}(t) : t ≥ 0} has the same distribution as the forward-time process {X(t) : t ≥ 0}. In words: the long-run rate at which the chain moves from i to j equals the long-run rate at which the chain moves from j to i, for any two states i, j ∈ S. Thus, because the holding-time rates {a_i} and the stationary distribution P = (P_j) are the same, this is equivalent to saying that a positive recurrent CTMC is time-reversible if, for all pairs of states i, j ∈ S,

a_i P_i P_{i,j} = a_j P_j P_{j,i}.    (2)

(a) Show that if the embedded chain {X_n} is also positive recurrent, then (2) reduces to {X_n} being time-reversible, π_i P_{i,j} = π_j P_{j,i}, as we would expect.

SOLUTION: Recall (Problem 1 above) that in this case

P_j = (π_j/a_j) / ∑_i (π_i/a_i),  j ∈ S.

Using this formula for P_i and P_j in Equation (2) above yields the result.

(b) Recall that, in general, if {X_n} is a positive recurrent MC, then its time-reversal MC exists and has transition matrix

P^{(r)}_{i,j} = (π_j/π_i) P_{j,i}.

Now let us consider our positive recurrent CTMC {X(t)} and suppose that its embedded chain {X_n} is null recurrent, hence π does not exist. Nonetheless, {X^{(r)}(t)} must have an embedded Markov chain. Find its transition matrix in this case (denote it by P^{(r)}_{i,j}).

SOLUTION: From the Equation (2) discussion, the general (time-reversible or not) version would be: the long-run rate at which the reversed continuous-time chain moves from i to j equals the long-run rate at which the forward continuous-time chain moves from j to i, for any two states i, j ∈ S. This yields

a_i P_i P^{(r)}_{i,j} = a_j P_j P_{j,i},    (3)

and thus

P^{(r)}_{i,j} = (a_j P_j / a_i P_i) P_{j,i}.

Notice that when {X_n} is positive recurrent, then a_j P_j / (a_i P_i) = π_j/π_i, via P_j = (π_j/a_j) / ∑_i (π_i/a_i), j ∈ S.

(c) Since a birth and death process can only make transitions of magnitude ±1, in that case Equation (2) reduces to "the long-run rate at which the chain moves from i to i + 1 equals the long-run rate at which the chain moves from i + 1 to i, for all states i ∈ S"; the birth and death balance equations. We conclude: every positive recurrent birth and death (B&D) process is time-reversible.

We now apply this to the M/M/1 queue: X(t) = the number of customers in a FIFO M/M/1 queue is a B&D process, so we conclude that when ρ < 1 it is time-reversible. Assume that it is started at time t = 0 with its stationary distribution. Let ψ = {t_n : n ≥ 1} denote the Poisson arrival times starting from time t = 0, 0 < t_1 < t_2 < ···, and let ψ^{(d)} = {t^d_n : n ≥ 1} denote the point process of departure times after time t = 0, 0 < t^d_1 < t^d_2 < ···. We know from Exercise 2 above that t^d_1 has an exponential distribution at rate λ, but here you will deduce more. Argue (from time-reversibility) that ψ^{(d)} must have the same distribution as ψ, and hence must itself be a Poisson process at rate λ: the departure process from a stationary M/M/1 queue is itself a Poisson process.

SOLUTION: The points of ψ are precisely the times at which the forward process jumps up by 1 (i.e., the times at which births occur). Thus they must have the same distribution (from time-reversibility) as the times at which the time-reversed process jumps up by 1; but such times have the same distribution as the departure times ψ^{(d)} (i.e., the times at which deaths occur).
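The Burke's-theorem conclusion in part (c) is easy to observe in simulation. The sketch below (not part of the assignment) starts an M/M/1 queue from its stationary distribution, records departure epochs, and checks that the inter-departure times have the exponential(λ) signature (mean and standard deviation both near 1/λ); all parameter values are illustrative assumptions.

    import numpy as np

    # Stationary M/M/1: start X(0) from P_n = (1-rho)rho^n and run the birth-death
    # dynamics; by the result above the departure point process is Poisson(lambda).
    rng = np.random.default_rng(2)
    lam, mu, T_end = 1.0, 1.5, 1e5          # illustrative rates and run length
    rho = lam / mu

    x = rng.geometric(1.0 - rho) - 1        # stationary start: P(X(0)=n) = (1-rho) rho^n
    t, departures = 0.0, []
    while t < T_end:
        rate = lam + (mu if x > 0 else 0.0)
        t += rng.exponential(1.0 / rate)    # time of the next transition
        if x > 0 and rng.uniform() < mu / rate:
            x -= 1                          # a death: record a departure epoch
            departures.append(t)
        else:
            x += 1                          # a birth: an arrival

    gaps = np.diff(departures)
    print("mean inter-departure time:", gaps.mean(), "(1/lambda =", 1.0 / lam, ")")
    print("std  inter-departure time:", gaps.std(), "(exponential => std = mean)")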

5. Consider a CTMC {X(t)} (with P_{i,i} = 0, i ∈ S) with embedded-chain transition matrix P = (P_{i,j}) and holding-time rates {a_i}. Assume that a = sup{a_i : i ∈ S} < ∞. Consider an alternative CTMC {X̄(t)} for which all holding-time rates are fixed at the constant rate a, and which has embedded-chain transition probabilities given by

P̄_{i,j} = (a_i/a) P_{i,j} if j ≠ i,    P̄_{i,i} = 1 - a_i/a.

(So P̄_{i,i} > 0 is possible.)

(a) Let N̄(t) denote the number of transitions by time t for {X̄(t)}. Explain why {N̄(t) : t ≥ 0} forms a Poisson process at rate a.

SOLUTION: By the Markov property, for any state i, given X̄(t) = i, the chain will spend an exponential amount of time at rate a in state i, independent of the past, and then move. But this rate a does not even depend on i, so the sequence of consecutive holding times always forms an iid sequence of rvs distributed as exponential at rate a; hence the transition epochs form a Poisson process at rate a.

(b) Show that the balance equations are the same for the two chains.

SOLUTION: The balance equations for {X̄(t)} are

a P_j = a P_j (1 - a_j/a) + ∑_{i≠j} P_i a ((a_i/a) P_{i,j}),  j ∈ S,    (4)

which algebraically reduce to

a_j P_j = ∑_{i≠j} P_i a_i P_{i,j},  j ∈ S,    (5)

the balance equations for {X(t)}.

(c) Explain why {X(t)} and {X̄(t)} have the same distribution as stochastic processes; P_{i,j}(t) = P̄_{i,j}(t), t ≥ 0, for all pairs i, j.

SOLUTION: Recall that a geometric sum of iid exponentials is again exponential. So all that is happening here is that each original holding time H_i ~ exp(a_i) has been broken down and re-expressed as a geometric sum of iid exponentials at rate a, in which the geometric has success probability a_i/a. That is what the P̄_{i,j} do. So when state i is entered, in both models the total amount of time spent there until changing to a state j ≠ i is exponential at rate a_i.

6. Sampling a CTMC to obtain a discrete-time Markov chain: Suppose that {X(t)} is an irreducible (non-explosive) CTMC with infinitesimal generator (rates matrix) Q = (q_{i,j}), and P(t) = (P_{i,j}(t)) = e^{Qt}, t ≥ 0.

(a) Argue that Z_n = X(n), n ≥ 0, is an irreducible discrete-time Markov chain with transition matrix P̃ = (P̃_{i,j}) given by P̃_{i,j} = P_{i,j}(1); that is, P̃ = e^Q.

SOLUTION: It is Markov because

P(X(n+1) = j | X(n) = i, X(n-1) = i_{n-1}, ..., X(0) = i_0) = P(X(n+1) = j | X(n) = i),

via the Markov property assumed on {X(t)}: given the present state X(n), the future X(n+1) is independent of all of the past {X(u) : u < n}. Moreover,

P(X(n+1) = j | X(n) = i) = P(X(1) = j | X(0) = i) = P_{i,j}(1),

so P̃ = e^Q.
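The uniformization identity behind 5(c), e^{Qt} = ∑_k e^{-at} (at)^k/k! · P̄^k, can be checked numerically; setting t = 1 also recovers the sampled-chain matrix P̃ = e^Q of 6(a). The sketch below is not part of the assignment, and the 3-state example and chosen t are illustrative assumptions.

    import numpy as np
    from scipy.linalg import expm

    # Illustrative 3-state example (not from the assignment): rates a_i and embedded
    # matrix P (zero diagonal); the uniformized chain uses constant rate a = max a_i
    # and embedded matrix P_bar = I + Q/a.
    a_i = np.array([1.0, 2.0, 0.5])
    P = np.array([[0.0, 0.7, 0.3],
                  [0.5, 0.0, 0.5],
                  [0.2, 0.8, 0.0]])

    Q = np.diag(a_i) @ (P - np.eye(3))   # q_{ij} = a_i P_{ij} (i != j), q_{ii} = -a_i
    a = a_i.max()
    P_bar = np.eye(3) + Q / a            # off-diagonal (a_i/a)P_{ij}, diagonal 1 - a_i/a

    t = 1.3
    direct = expm(Q * t)                 # P(t) computed from the original chain

    # P(t) computed from the uniformized chain: Poisson(a*t) mixture of powers of P_bar
    mix, term, Pk = np.zeros((3, 3)), np.exp(-a * t), np.eye(3)
    for k in range(200):                 # truncate the sum once the terms are negligible
        mix += term * Pk
        Pk = Pk @ P_bar
        term *= a * t / (k + 1)

    print("max |difference|:", np.max(np.abs(direct - mix)))   # ~ machine precision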

(b) We now generalize (a) to sampling at the times of a Poisson process. Independently of {X(t)}, let {t_n : n ≥ 1} be a Poisson process at rate λ. Let Z_n = X(t_n), n ≥ 1, and Z_0 = X(0). Argue that {Z_n : n ≥ 0} is an irreducible discrete-time Markov chain with transition matrix P̃ = (P̃_{i,j}) given by

P̃ = [I - (Q/λ)]^{-1}.

SOLUTION: By independence, each t_n is a stopping time, and hence (via the strong Markov property) given Z_n, {X(t_n + t) : t > 0} is again the same CTMC but with initial condition Z_n, independent of the past. This would be so for any independent renewal process used for sampling. If we let T_n = t_{n+1} - t_n denote the iid interarrival times of the renewal process, then given Z_n we only need the independent (and independent of the past) rv T_n to determine the next value Z_{n+1}; P̃_{i,j} = P(Z_{n+1} = j | Z_n = i) does not depend on n because the T_n, n ≥ 0, are iid. Let T = t_1 ~ exp(λ); it has density λe^{-λt}, t ≥ 0. We know that P(t) = e^{Qt}, t ≥ 0. Thus, conditional on T = t, the transition matrix of the sampled pair is P(t) = e^{Qt}, and so, given T, it is e^{QT}. Taking expected values then yields

P̃ = E[P(T)]    (6)
   = ∫_0^∞ P(t) λ e^{-λt} dt    (7)
   = ∫_0^∞ P(u/λ) e^{-u} du    (8)
   = ∫_0^∞ P(u/λ) e^{-uI} du    (9)
   = ∫_0^∞ e^{Qu/λ} e^{-uI} du    (10)
   = ∫_0^∞ e^{-u(I - (Q/λ))} du    (11)
   = [I - (Q/λ)]^{-1},    (12)

where in (8) we substituted u = λt. Irreducibility follows since in fact P̃_{i,j} > 0 for all pairs (i, j), a condition known as strong irreducibility: we know that P_{i,j}(t) > 0 for some t, and since holding times are continuous (exponential), it follows that P_{i,j}(t) > 0 throughout some time interval t ∈ (s_1, s_2), s_1 < s_2, and hence

P̃_{i,j} = ∫_0^∞ P_{i,j}(t) λ e^{-λt} dt ≥ ∫_{s_1}^{s_2} P_{i,j}(t) λ e^{-λt} dt > 0.

(c) Prove that {X(t)} is positive recurrent if and only if {Z_n} is positive recurrent, in which case π = P; they share the same stationary distribution.

SOLUTION: By irreducibility, it suffices to prove that a probability solution π to π = πP̃ (for {Z_n}) exists if and only if a probability solution P to PQ = 0 (for {X(t)}) exists, and that π = P. This is immediate: π = πP̃ if and only if π = π[I - (Q/λ)]^{-1}, if and only if π[I - (Q/λ)] = π, if and only if πQ = 0.
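A small numerical sketch of (b) and (c), not part of the assignment: the closed form [I - (Q/λ)]^{-1} is compared with a Monte Carlo estimate of E[e^{QT}], and the stationary distribution of P̃ is checked to satisfy πQ = 0. The generator (the same 3-state one produced in the previous sketch), the rate λ, and the sample size are illustrative assumptions.

    import numpy as np
    from scipy.linalg import expm

    # Sampling the CTMC at Poisson(lambda) epochs: P_tilde = (I - Q/lambda)^{-1}.
    rng = np.random.default_rng(3)
    Q = np.array([[-1.0,  0.7,  0.3],
                  [ 1.0, -2.0,  1.0],
                  [ 0.1,  0.4, -0.5]])      # illustrative generator (rows sum to 0)
    lam = 2.0

    P_tilde = np.linalg.inv(np.eye(3) - Q / lam)

    # Monte Carlo version of E[e^{QT}] with T ~ exp(lambda)
    T = rng.exponential(1.0 / lam, size=10_000)
    mc = np.mean([expm(Q * t) for t in T], axis=0)
    print("max |closed form - Monte Carlo|:", np.max(np.abs(P_tilde - mc)))  # small (MC error)

    # Stationary distribution of P_tilde (left eigenvector for eigenvalue 1) ...
    w, v = np.linalg.eig(P_tilde.T)
    pi = np.real(v[:, np.argmin(np.abs(w - 1.0))])
    pi /= pi.sum()
    print("pi:", np.round(pi, 4))
    print("pi Q:", np.round(pi @ Q, 10))    # ... also satisfies pi Q = 0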
