
IEOR 4106: Introduction to Operations Research: Stochastic Models
Spring 2005, Professor Whitt, Second Midterm Exam
Chapters 5-6 in Ross, Thursday, March 31, 11:00am-1:00pm
Open Book: but only the Ross textbook plus one 8.5 × 11 page of notes
Justify your answers; show your work.

1. The IEOR Department Ricoh Printer (30 points)

The Columbia IEOR Department has a versatile Ricoh printer that can rapidly print one-sided and two-sided copies, but unfortunately it often goes down. Ricoh is alternately up (working) and down (waiting for repair or under repair). The average up time (time until breakdown) is 4 days, while the average down time (time until repair) is 3 days. Assume continuous operation. Let X(t) = 1 if the Ricoh is working at time t, and let X(t) = 0 otherwise.

(a) What do we need to assume about the successive up and down times in order to make the stochastic process {X(t) : t ≥ 0} a continuous-time Markov chain (CTMC)?

We need the successive up times and down times to be mutually independent random variables. In addition, these random variables should have exponential distributions. The lack-of-memory property of the exponential distribution is critical for obtaining the Markov property for the stochastic process {X(t) : t ≥ 0}. The exponential distribution has a single parameter, which can be taken to be its mean. Since the means are already specified, nothing more needs to be assumed, beyond assuming that the means agree with the specified averages. Henceforth assume that these extra assumptions are in place, so that the stochastic process {X(t) : t ≥ 0} is indeed a CTMC.

(b) Construct the CTMC; i.e., specify the model.

A CTMC can be specified by its rate matrix Q. That takes a simple form here because there are only two states: 0 and 1. Ordering the two states 0 and 1, the rate matrix is

    Q = ( −1/3    1/3 )
        (  1/4   −1/4 )

where the rates are expressed per day. That is, the repair rate is 1/3 per day, while the failure rate is 1/4 per day.
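As a quick numerical aside (not part of the exam solution; the use of Python's `fractions` is just a convenience), the balance equation for this two-state chain gives the long-run up/down split directly:

```python
from fractions import Fraction

# Two-state up/down chain for the printer: repair rate 1/3 per day (down -> up),
# failure rate 1/4 per day (up -> down).
repair = Fraction(1, 3)
fail = Fraction(1, 4)

# Balance: alpha_down * repair = alpha_up * fail, with alpha_down + alpha_up = 1.
alpha_up = repair / (repair + fail)    # long-run fraction of time up
alpha_down = fail / (repair + fail)    # long-run fraction of time down

print(alpha_down, alpha_up)  # -> 3/7 4/7
```

This anticipates the 4/7 answer obtained in part (e) below.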
In this case we have a birth-and-death (BD) process, so that λ_0 = Q_{0,1} = 1/3, while μ_1 = Q_{1,0} = 1/4. That is an equivalent specification; i.e., we say that it is BD and we specify these single λ_i and μ_i values.

(c) Assuming that Ricoh has been working continuously for 7 days, what is the probability that it will remain working at least 8 more days?

The holding time in each state is exponentially distributed. The exponential up time has mean 4 days, and thus rate 1/4, as indicated above. Let T be the failure time. Then, by the lack-of-memory property,

    P(T > 8 + 7 | T > 7) = P(T > 8) = e^{−(1/4)·8} = e^{−2}.

(d) Suppose that Ricoh has been working continuously for 12 days. From that moment forward, let T be the time until the second breakdown. What is the expected value E[T]?

First, by the lack-of-memory property, the history (the 12 days) does not matter. Let U_i be the i-th up time and let D_i be the i-th down time, starting from the initial time. Then T = U_1 + D_1 + U_2, which is the sum of three independent exponential random variables. Thus

    E[T] = E[U_1 + D_1 + U_2] = E[U_1] + E[D_1] + E[U_2] = 4 + 3 + 4 = 11 days.

(e) What is the long-run proportion of time that Ricoh is up?

You find the steady-state distribution by solving αQ = 0, or by using the local balance equation α_0 λ_0 = α_1 μ_1, which here is α_0 (1/3) = α_1 (1/4), used together with α_0 + α_1 = 1. The answer agrees with intuition:

    lim_{t→∞} P(X(t) = 1) = E[U] / (E[U] + E[D]) = 4 / (4 + 3) = 4/7.

That is the formula for an alternating renewal process. As we will see in Chapter 7, the limiting steady-state distribution holds even if the up and down times do not have exponential distributions.

(f) Suppose that Ricoh is initially down. Approximately, what is the probability that Ricoh will break down at least 9 more times within the next 93 days?

If Ricoh is initially down, then we have a down time D and an up time U before each successive new breakdown. The time at which the 9th breakdown occurs is the sum S_9 = X_1 + · · · + X_9, where X_i is distributed as D_i + U_i. Notice that E[X_1] = 3 + 4 = 7 days, while Var(X_1) = 4² + 3² = 16 + 9 = 25. By the central limit theorem, S_9 is approximately normally distributed with mean E[S_9] = 9 × 7 = 63 days and variance Var(S_9) = 9 × 25 = 225. Hence we can use a normal approximation:

    P(S_9 ≤ 93) = P( (S_9 − 63)/√225 ≤ (93 − 63)/√225 ) ≈ P(N(0, 1) ≤ 2) ≈ 0.977.
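As an aside, both E[T] = 11 from part (d) and the normal approximation in part (f) are easy to confirm by simulation; here is a minimal Monte Carlo sketch (the sample size and seed are our own arbitrary choices, not part of the solution):

```python
import random

random.seed(1)

reps = 100_000
t_total = 0.0     # accumulates T = U1 + D1 + U2 from part (d)
hits = 0          # counts runs with S_9 <= 93 days, as in part (f)
for _ in range(reps):
    ups = [random.expovariate(1/4) for _ in range(9)]    # up times, mean 4 days
    downs = [random.expovariate(1/3) for _ in range(9)]  # down times, mean 3 days
    t_total += ups[0] + downs[0] + ups[1]                # part (d): expect about 11
    hits += sum(ups) + sum(downs) <= 93                  # part (f): S_9 = 9 full cycles

print(t_total / reps)  # near 11
print(hits / reps)     # compare with the normal approximation 0.977
```

The simulated tail probability typically lands slightly below 0.977, since S_9 is right-skewed while the normal approximation is symmetric.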

See the normal table on page 81.

Bonus Question. (5 points) Give the moment generating function of T (defined in part (d) above).

Continuing from part (d),

    E[e^{sT}] = E[e^{s(U_1 + D_1 + U_2)}] = E[e^{sU_1} e^{sD_1} e^{sU_2}]
              = E[e^{sU_1}] E[e^{sD_1}] E[e^{sU_2}]
              = ( (1/4) / ((1/4) − s) ) ( (1/3) / ((1/3) − s) ) ( (1/4) / ((1/4) − s) );   (1)

see Example 2.4.1 on page 66, which can be found by looking up moment generating function (MGF) in the index. Since the random variables are independent, the MGF of the sum is the product of the MGFs; see the middle of page 68.

2. The Toledo Taxi Stand (42 points)

In the city of Toledo there is a small taxi stand served by three taxis. Prospective groups of customers arrive at the taxi stand at a rate of 5 per hour. (Assume that each group of customers can be served by a single taxi.) The groups are served on a first-come first-served basis by the first available taxi. The taxis take the customers to their desired destinations and then return to the taxi stand. Suppose that the time of each taxi round trip, from the taxi stand and back, is an exponentially distributed random variable with mean 10 minutes. The taxis wait in order of arrival if there are no customers to serve. The groups of customers also wait if there are no taxis, except that groups will not wait at all if there are already four groups waiting for taxis. Moreover, the waiting groups of customers have limited patience. Each group is only willing to wait an exponentially distributed time with mean 15 minutes before they will leave, without receiving service.

Part I. (16 points) For this part only, suppose that at some instant of time there is precisely one taxi at the taxi stand.

(a) What is the probability that nothing happens (no arrivals of taxis or customers) during the next 10 minutes?

When there is one taxi present, there are two things that can happen: the arrival of one of the other two taxis or the arrival of a group of customers.
The time until each of the other taxis returns is an exponential random variable with mean 10 minutes (and thus rate 6 per hour). The time until the next group arrives is an exponential random variable with mean 12 minutes (and thus rate 5 per hour). The time until one of these three events occurs is the minimum of three independent exponential random variables, and so is itself an exponential random variable with a rate equal to the sum of the rates. The rate of this exponential random variable is thus 6 + 6 + 5 = 17 per hour. Since 10 minutes is 1/6 hour, the time, say T, until

the next event is exponential with rate 17 per hour. Thus

    P(T > 1/6) = e^{−17(1/6)} = e^{−17/6}.

(b) What is the expected time until one of the other two taxis returns?

As indicated in part (a), the time until each of the other taxis returns is exponential with rate 6 per hour. The time until the first of these two events occurs is exponential with rate 6 + 6 = 12 per hour. The expected time is the reciprocal of the rate, which is 1/12 hour, or 5 minutes.

(c) What is the probability that one of the other taxis arrives before another group of customers arrives?

The rate for one of the two taxis arriving is 12 per hour; the rate for the group to arrive is 5 per hour. Thus the probability that one of the two taxis arrives before the group arrives is 12/(12 + 5) = 12/17.

(d) What is the probability that both of the other two taxis arrive before any groups of customers arrive?

We want the probability of the intersection of two independent events, so it is the product of two probabilities. The first event is that one of the taxis arrives before the group of customers. Conditioned on that event, the second event is that the other remaining taxi arrives before the group of customers. Thus the overall probability is (12/17) × (6/11) = 72/187 ≈ 0.385.

Part II. (10 points)

(e) Construct a Markov stochastic process for the taxi stand that can be used to find the long-run proportion of time that any specified number of taxis is available to serve arriving groups of customers.

We present two modelling approaches, which are actually equivalent. The first modelling approach is to recognize that we can regard this as a standard M/M/s/r + M Markovian queueing model, having a Poisson arrival process (the first M), IID exponential service times (the second M), s = 3 servers, r = 4 extra waiting spaces, and IID exponential times to abandon for waiting customers (the +M). That gives us the M/M/3/4+M model.
This model is characterized by three parameters: the arrival rate λ = 5 per hour, the individual service rate μ = 6 per hour, and the individual abandonment rate θ = 4 per hour. We get the service rate μ = 6 from the given mean 1/μ = 1/6 hour = 10 minutes; we get the abandonment rate θ = 4 per hour from the given mean 1/θ = 1/4 hour = 15 minutes. To have this model fit our circumstances, we let X(t) be the number of customers

in the system, either waiting or being served, where we consider the customer to be in service until the taxi he took returns to the taxi stand. With this interpretation, the possible states are 0, 1, ..., 7. For example, state 4 means that three customers are being served and one is waiting, while state 2 means that 2 taxis are away while 1 is at the taxi stand, and state 0 means that all three taxis are at the taxi stand. The advantage of this modelling approach is that it uses a familiar model. We draw the rate diagram for the birth-and-death process X(t) in Figure 1.

Figure 1: A rate diagram showing the transition rates for the birth-and-death process on the states 0, 1, ..., 7.

To complete the model specification, we need to specify the birth rates λ_i, 0 ≤ i ≤ 6, and the death rates μ_i, 1 ≤ i ≤ 7. We define these as follows. Since the arrival rate is λ = 5, we have λ_i = λ = 5 for all i, 0 ≤ i ≤ 6. We let λ_7 = 0 because arrivals cannot occur when the system is full. The death rates are more complicated, because we need to account for both service completions and abandonments. We have μ_1 = 6 because only one taxi is out, available to return; μ_2 = 12 because two taxis are out, available to return; and μ_3 = 18 because three taxis are out, available to return. Then, for i ≥ 4, we must include abandonment. Each waiting customer abandons at rate 4. Thus μ_4 = 18 + 4 = 22, μ_5 = 18 + 8 = 26, μ_6 = 18 + 12 = 30 and μ_7 = 18 + 16 = 34. We have thus specified all the individual birth rates and death rates.

We now describe the second modelling approach, which is chosen to more directly describe the system. With this second modelling approach, let the state k designate the difference between the number of customer groups present and the number of taxis present. With that scheme the states range from −3 (all three taxis are there) to +4 (there are 4 customer groups present).
(Additional customer groups would balk (or be blocked) and not wait, by assumption.) Hence there are again 8 states: −3, −2, −1, 0, 1, 2, 3, 4. Alternatively, we could label the states in the customary way: 0, 1, 2, 3, 4, 5, 6, 7; we then understand that state j means the number of customers in the system, where we regard a customer as being

in the system if that customer is either waiting or its taxi has not yet returned. In other words, we say that the customer is in the system until its taxi has returned to the taxi stand. But that is just relabeling the same 8 states. The new state is the original state minus 3. The corresponding rates are the same. The associated rate matrix, numbering the states in increasing order (either −3, −2, −1, 0, 1, 2, 3, 4 or 0, 1, 2, 3, 4, 5, 6, 7), is

    Q = (  −5.0    5.0    0.0    0.0    0.0    0.0    0.0    0.0 )
        (   6.0  −11.0    5.0    0.0    0.0    0.0    0.0    0.0 )
        (   0.0   12.0  −17.0    5.0    0.0    0.0    0.0    0.0 )
        (   0.0    0.0   18.0  −23.0    5.0    0.0    0.0    0.0 )
        (   0.0    0.0    0.0   22.0  −27.0    5.0    0.0    0.0 )
        (   0.0    0.0    0.0    0.0   26.0  −31.0    5.0    0.0 )
        (   0.0    0.0    0.0    0.0    0.0   30.0  −35.0    5.0 )
        (   0.0    0.0    0.0    0.0    0.0    0.0   34.0  −34.0 )

(f) Without performing any detailed calculations, indicate how the limiting steady-state distribution for this Markov process can be efficiently calculated.

In general for a CTMC, we can find the limiting steady-state probability vector α by solving αQ = 0, which corresponds to a system of 8 linear equations in 8 unknowns. In addition, if we number the states 0, 1, 2, ..., 7, then we need to use the equation α_0 + · · · + α_7 = 1. But here we have a BD process, so we can instead solve the local-balance equations α_i λ_i = α_{i+1} μ_{i+1} for all i, together with the equation α_0 + · · · + α_7 = 1. That leads to the explicit formulas

    α_i = α_0 (λ_0 λ_1 · · · λ_{i−1}) / (μ_1 μ_2 · · · μ_i)  for 1 ≤ i ≤ 7,  where
    α_0 = 1 / ( 1 + Σ_{k=1}^{7} (λ_0 · · · λ_{k−1}) / (μ_1 · · · μ_k) );

see page 371 of Ross.

Part III. (16 points) For this part, assume that the limiting steady-state distribution of the Markov process has been found, specifying the steady-state probability that the process is in each of the states. Introduce notation for the states and the steady-state probabilities of those states. Use that notation to answer the following questions.

(g) Give an expression for the probability that a group of potential customers will be able to be served by a taxi immediately upon arrival.

Let the states be labeled as initially: −3, −2, ..., 4. Let α_i be the steady-state probability of state i, −3 ≤ i ≤ 4. Then the answer here is α_{−3} + α_{−2} + α_{−1}.

(h) Give an expression for the long-run proportion of groups of potential customers that leave immediately upon arrival without receiving service.

α_4.

(i) Give an expression for the long-run average number of taxis waiting at any time.

3α_{−3} + 2α_{−2} + 1α_{−1}.

(j) Give an expression for the long-run proportion of arriving groups of customers that elect to wait upon arrival, but abandon before receiving service.

This one is more complicated. First, the question is not too clearly worded, so it might not be clear what is being asked. Suppose that we are looking for the conditional probability that a customer abandons given that he enters and must wait. We can write

    P(Abandon | enters and waits) = (abandonment rate) / (rate of arrivals that enter and wait)
      = (α_1 θ + α_2 2θ + α_3 3θ + α_4 4θ) / (λ (1 − α_{−3} − α_{−2} − α_{−1} − α_4))
      = (4α_1 + 8α_2 + 12α_3 + 16α_4) / (5 (1 − α_{−3} − α_{−2} − α_{−1} − α_4)).

We could also proceed in other ways. For example, we could write

    P(Abandon | enters and waits) = P(enters and waits and then abandons) / P(enters and waits).

We then need to calculate the numerator and denominator. The denominator is

    P(enters and waits) = α_0 + α_1 + α_2 + α_3.

The numerator is more complicated. We find the probability that there are i customers waiting upon arrival. Then we note that the arriving customer makes an additional customer present. Then we find the probability in question:

    P(enters and waits and then abandons)
      = α_0 (4/(18 + 4))
      + α_1 [ (4/(18 + 8)) + (22/(18 + 8)) (4/(18 + 4)) ]
      + α_2 [ (4/(18 + 12)) + (26/(18 + 12)) [ (4/(18 + 8)) + (22/(18 + 8)) (4/(18 + 4)) ] ]
      + α_3 [ (4/(18 + 16)) + (30/(18 + 16)) [ (4/(18 + 12)) + (26/(18 + 12)) [ (4/(18 + 8)) + (22/(18 + 8)) (4/(18 + 4)) ] ] ].
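(A numerical aside, not part of the exam: the α_i needed in the expressions above follow from the product formula of part (f), and with them one can check that the two approaches to (j) agree exactly. The sketch below uses the 0-7 labeling, so the states with 1, ..., 4 waiting groups are 4, ..., 7, and Python fractions are a convenience.)

```python
from fractions import Fraction

# Birth rates lambda_0..lambda_6 and death rates mu_1..mu_7 for the
# M/M/3/4+M taxi-stand model (states 0..7, rates per hour).
lam = [5, 5, 5, 5, 5, 5, 5]
mu = [6, 12, 18, 22, 26, 30, 34]

# Product formula: alpha_i proportional to (lambda_0...lambda_{i-1})/(mu_1...mu_i).
weights = [Fraction(1)]
for l, m in zip(lam, mu):
    weights.append(weights[-1] * Fraction(l, m))
total = sum(weights)
alpha = [w / total for w in weights]

# Part (j), first approach: abandonment rate over the rate of arrivals that wait.
# In the 0..7 labeling, states 4..7 hold 1..4 waiting groups.
num1 = 4 * sum(k * alpha[3 + k] for k in range(1, 5))
den1 = 5 * sum(alpha[3:7])

# Part (j), second approach: a customer who arrives to find k groups waiting
# becomes number k+1 in line and abandons with probability p_{k+1}, where
# p_j = 4/(18 + 4j) + ((18 + 4(j-1))/(18 + 4j)) * p_{j-1},  p_0 = 0.
p = [Fraction(0)]
for j in range(1, 5):
    p.append(Fraction(4, 18 + 4 * j) + Fraction(18 + 4 * (j - 1), 18 + 4 * j) * p[-1])
num2 = sum(alpha[3 + k] * p[k + 1] for k in range(4))
den2 = sum(alpha[3:7])

print(float(num1 / den1), float(num2 / den2))  # the two approaches agree
```

Both expressions reduce to the same fraction, approximately 0.213.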

To explain, consider the first term. The arrival finding 0 waiting will himself be the sole waiting customer. Then 4/(18 + 4) is the probability that an abandonment is the next event, whereas 18/(18 + 4) is the probability that a taxi arrives and the newly arrived customer group enters service. When the customer group is initially number k in line, it can abandon as the first event, the second event, and so on, up to the k-th event. Next consider the second term. Here is what can happen: our new customer can abandon, which happens with probability 4/(18 + 8); we divide by 18 + 8 now because 3 taxis can arrive (3 × 6 = 18) and 2 customers can abandon (2 × 4 = 8). Alternatively, one of the other events can occur (the other customer abandons or one of the taxis returns), which happens with probability 22/(18 + 8); then our customer will become first in line. Thereafter he abandons with probability 4/(18 + 4). We then move on to the α_2 term, and so forth.

Part IV. Bonus Questions. (8 points)

(k) Can the Markov process constructed in part (e) above be made time reversible? If so, how?

Yes, the stochastic process can be made reversible, provided that we initialize with the stationary probability vector α, because it is a birth-and-death process. To elaborate, first we say that a stochastic process {X(t) : −∞ < t < ∞} is (time) reversible if {X(−t) : −∞ < t < ∞} has the same probability law as {X(t) : −∞ < t < ∞}. For any Markov process to be reversible, we require that the Markov process have a steady-state limiting probability vector α. That is always the case for a birth-and-death process with a finite state space. (However, that might not be true with an infinite state space. With an infinite state space, we need to verify that a proper steady-state limiting vector α exists.) Moreover, we must look at the stochastic process in steady state. That is accomplished by letting the initial distribution be α; i.e., we let P(X(0) = j) = α_j for all j.
With that special initial condition, we have P(X(t) = j) = α_j for all j and for all t. If we initialize a Markov process in that way, there always is a well-defined reverse-time Markov process, also having steady-state limiting vector α. The question is whether the reverse-time Markov process has the same probability law as the original (forward-time) Markov process. That is true if and only if the detailed balance condition holds; see Section 6.6. That is the case for birth-and-death processes. In that sense, all ergodic birth-and-death processes are reversible; see Proposition 6.5. So we must initialize the process with the steady-state vector α or, equivalently, we must consider the process in equilibrium (steady state). Then it becomes reversible. If it does not have the right initial conditions, then it is not reversible.

(l) Consider the stochastic process counting the arrival times of taxis to the stand, beginning at some time in steady state. Is that stochastic process a Poisson process? Why or why not?

No, the arrival counting process for the arrivals of taxis is not a Poisson process. To understand why, it is useful to look at the M/M/3/4 model formulation, with states 0-7. With that model formulation, the arrival process of taxis corresponds to the departure process from the queueing system. We might think that implies Poisson, because we know that the

stochastic process X(t) counting the number of customers in the system is reversible. Since the external arrival process is a Poisson process, we might expect that reversibility implies that the departure process is also a Poisson process, by virtue of Corollary 6.6 on page 378. However, that result does not apply because of the finite waiting room. For our model, the departure process becomes an arrival process in reverse time, so it has the same distribution as the process counting arrivals that actually enter the system. But some of the original arrivals are blocked; indeed, all arrivals finding the system full are blocked. Moreover, the blocking process does not act as an independent thinning of the external arrival process. Instead, the blocking only occurs in a special state, when the waiting room is full. Thus, because of reversibility, the departure process does have the same distribution as the process counting customers entering the system, but that entering counting process is not itself a Poisson process.

3. The Columbia Space Company (28 points)

Columbia University has decided to start the Columbia Space Company, which will launch satellites from its planned Manhattanville launch site beginning in 2010, referred to henceforth as time 0. Allowing for steady growth, the Columbia Space Company plans to launch satellites at an increasing rate, beginning at time 0. Specifically, they anticipate that they will launch satellites according to a nonhomogeneous Poisson process with rate λ(t) = 2t satellites per year for t ≥ 0. Suppose that the successive times satellites stay up in space are independent random variables, each uniformly distributed between 3 years and 5 years.

(a) What is the probability (according to this model) that no satellites will actually be launched during the first three years (between times t = 0 and t = 3)?

The arrival process is a nonhomogeneous Poisson process.
The mean number of arrivals between 0 and 3 is

    m(0, 3) = ∫_0^3 λ(t) dt = ∫_0^3 2t dt = t² |_0^3 = 9 − 0 = 9.

Let N(t) denote the number of satellites that have arrived in the interval [0, t]. Then P(N(3) = 0) = e^{−9}.

(b) What is the expected number of satellites launched during the second year (between times t = 1 and t = 2)?

    m(1, 2) ≡ E[N(2) − N(1)] = E[N(2)] − E[N(1)] = 4 − 1 = 3.

(c) What is the probability that precisely 7 satellites will be launched during the second year (between times t = 1 and t = 2)?

Using part (b),

    P(N(2) − N(1) = 7) = e^{−3} 3⁷ / 7!.

(d) Let S(t) be the number of satellites in space at time t. What is the expected value E[S(6)]?

Now we exploit properties of the M_t/GI/∞ model, as discussed in the paper "The Physics of the M_t/G/∞ Queue," by Steven G. Eick, William A. Massey and Ward Whitt, Operations Research, Vol. 41, No. 4, 1993, pp. 731-742, posted on the lecture notes web page. See especially Section 1 up to Remark 1 (about one full journal page). Theorem 1 there states that S(t) has a Poisson distribution with mean

    E[S(t)] = ∫_0^t λ(s) [1 − G(t − s)] ds.

The idea is that the arrivals together with their service times can be represented as a Poisson random measure in the plane. The intensity at a point (s, x) in [0, ∞) × [0, ∞) is λ(s)g(x), where g is the probability density function associated with the service-time cdf G. We actually get the single integral above from the double integral

    E[S(t)] = ∫_0^t ( ∫_{t−s}^∞ λ(s) g(x) dx ) ds.

In our case, t = 6 and g(x) = 1/2 for 3 ≤ x ≤ 5, while g(x) = 0 elsewhere. Hence G(t) = (t − 3)/2 for 3 ≤ t ≤ 5. The final region of integration is as shown in Figure 2 below. We want to integrate the intensity λ(s)g(x) = 2s × (1/2) = s over the shaded rectangle and triangle, appearing above the 45-degree line between 1 and 6. To explain the rectangle for 3 ≤ s ≤ 6, note that all satellites launched in the interval [3, 6] will still be in space; the expected number of these is

    m(3, 6) ≡ E[N(6)] − E[N(3)] = 36 − 9 = 27.

Any launches before time 1 will have returned, and so need not be considered. So we need to consider more carefully the interval [1, 3]. That is the shaded triangle in Figure 2 for 1 ≤ s ≤ 3. To do so, write

    ∫_1^3 λ(s) [1 − G(6 − s)] ds = ∫_1^3 2s (s − 1)/2 ds = ∫_1^3 (s² − s) ds = 14/3.

Hence, E[S(6)] = 27 + 14/3 = 95/3.

(e) Let R(t) be the number of satellites that have been launched and returned to earth by time t. What is the covariance between S(6) and R(6)?
The covariance is 0, because the random variables R(t) and S(t) are independent Poisson random variables. To see that independence implies 0 covariance, look at pages 52 and 53 of Ross. From the M_t/GI/∞ model, these two random variables represent the numbers of points in disjoint subsets of the plane (above and below the 45-degree line), when we focus on the Poisson process in the plane; again, see the Physics paper.
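A simulation sketch can confirm E[S(6)] = 95/3 from part (d); the time-change construction, sample size and seed below are our own choices, not part of the solution. Since Λ(t) = t², a rate-1 Poisson process run to Λ(6) = 36 and mapped back through t = √Λ gives the launch times.

```python
import math
import random

random.seed(3)

# Launches: nonhomogeneous Poisson process with lambda(t) = 2t, Lambda(t) = t^2.
# Up-times: Uniform(3, 5).  A satellite is in space at time 6 if launch + up > 6.
reps = 10_000
total_in_space = 0
for _ in range(reps):
    clock = 0.0                              # time on the rate-1 (transformed) scale
    while True:
        clock += random.expovariate(1.0)     # unit-rate Poisson inter-event gap
        if clock > 36.0:                     # Lambda(6) = 36: past the horizon
            break
        launch = math.sqrt(clock)            # map back to the original time scale
        if launch + random.uniform(3.0, 5.0) > 6.0:
            total_in_space += 1

print(total_in_space / reps)  # should be near 95/3, about 31.67
```

Because S(6) is Poisson with mean 95/3, the standard error over 10,000 replications is about 0.06, so the estimate lands very close to 31.67.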

Figure 2: Diagram for an infinite-server queue, showing the region in the (s, x) plane over which the intensity λ(s)g(x) is integrated.

(f) Give an expression for the joint probability P(S(6) = 7, R(6) = 8).

Since N(6) = S(6) + R(6), we have E[R(6)] = E[N(6)] − E[S(6)] = 36 − (95/3) = 13/3. By the independence,

    P(S(6) = 7, R(6) = 8) = ( e^{−a} a⁷ / 7! ) ( e^{−b} b⁸ / 8! ),

where a = 95/3 and b = 13/3.

(g) Give an expression for the probability P(S(6) + R(6) = 15).

Since N(6) = S(6) + R(6),

    P(S(6) + R(6) = 15) = P(N(6) = 15) = e^{−36} (36)¹⁵ / 15!.

4. Ultimate Bonus Questions (6 points)

(a) Who is buried in Grant's Tomb?

The main answer is Grant. The more refined answer is Ulysses S. Grant, Civil War general and U.S. President from 1869 to 1877. But his wife, Julia Dent Grant, is also buried there. See http://www.nps.gov/gegr/index.htm

(b) Where is Grant's Tomb?

122nd Street and Riverside Drive, about 4 blocks away from Mudd. Right across the street from the International House.

(c) Where does this question come from?

The "You Bet Your Life" quiz program hosted by Groucho Marx.