IEOR 3106: Introduction to Operations Research: Stochastic Models
Fall 2006, Professor Whitt

SOLUTIONS to Final Exam
Chapters 4-7 and 10 in Ross, Tuesday, December 19, 4:10pm-7:00pm

Open Book: but only the Ross textbook, three 8 × 11 pages of notes, and the class lecture notes on Markov mouse (9/21 and 9/28), continuous-time Markov chains (10/24) and Martingales and Brownian motion (12/07).

1. Xu Li's Barbershop (25 points)

Xu Li operates a small barbershop with room for at most three customers, one in service and two waiting. Suppose that potential customers arrive according to a Poisson process at a rate of 10 per hour. Suppose that potential arrivals finding the barbershop full, with a customer in service and two other customers waiting, will leave and not affect future arrivals. Suppose that successive service times are independent exponential random variables with mean 30 minutes. Suppose that waiting customers have limited patience, with each waiting customer being willing to wait only an independent random, exponentially distributed, time with mean 20 minutes before starting service; if the customer has not started service by that time, then the customer will abandon, leaving without receiving service.

(a) What is the probability that the time until the first customer arrives, starting empty, is greater than 10 minutes?

The times between arrivals in a Poisson process are independent and identically distributed (i.i.d.) exponential random variables. Let T_1 be the time until the first arrival. Then T_1 has an exponential distribution with rate 10 (per hour) and thus mean 0.1 hour or 6 minutes. Measuring time in minutes, we have λ = 1/6 per minute, so that

P(T_1 > 10) = e^{-λ·10} = e^{-10/6} = e^{-1.666...} ≈ 0.189.

(b) What is the variance of the time until the second customer arrives, starting empty, assuming that we measure time in minutes?

The time until the second customer arrives is the sum of two independent exponential random variables, each with a mean of 6 minutes. The variance of an exponential random variable is the square of its mean, and the variance of the sum of two independent random variables is the sum of the variances. If we measure time in minutes, then the variance is 6^2 + 6^2 = 72. (If instead we were to measure time in hours, then the variance would be (0.1)^2 + (0.1)^2 = 0.02.)
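Parts (a) and (b) are easy to confirm numerically. A minimal Python sketch, assuming NumPy is available; the variable names are illustrative:

    # Check part (a): P(T1 > 10) for T1 exponential with mean 6 minutes,
    # and part (b): Var(T1 + T2) for two independent such times.
    import numpy as np

    rng = np.random.default_rng(0)
    rate = 1.0 / 6.0                      # arrival rate per minute (10 per hour)

    print(np.exp(-rate * 10))             # exact tail: e^{-10/6} ~ 0.189

    t1 = rng.exponential(scale=6.0, size=10**6)
    t2 = rng.exponential(scale=6.0, size=10**6)
    print((t1 > 10).mean())               # simulation estimate for part (a)
    print((t1 + t2).var())                # near 6^2 + 6^2 = 72 for part (b)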

(c) Let N(t) be the number of customers that arrive in the time interval [0, t] to find a totally empty barbershop, again starting empty. What kind of stochastic process is {N(t) : t ≥ 0}?

The stochastic process {N(t) : t ≥ 0} is first clearly a stochastic counting process, because it assumes nonnegative integer values and counts the number of events in the interval [0, t]. Since the state of the system is a Markov process, the epochs at which an arrival finds an empty system are renewal points. The successive intervals between such events are i.i.d. Thus the stochastic process {N(t) : t ≥ 0} is a renewal process. Since there is a renewal at time 0, it is a genuine renewal process, not a delayed renewal process; the first interval has the same distribution as all the others. It is not a Poisson process, because the intervals between renewals are not exponentially distributed; they have some other, more complicated distribution.

(d) What proportion of time is Xu Li busy serving customers in the long run?

Let X(t) be the number of customers in the system at time t. Now we use the fact that the stochastic process {X(t) : t ≥ 0} can be modelled as a birth-and-death stochastic process; see Chapter 6 of Ross or the CTMC lecture notes. The birth rate in state i is denoted by λ_i, while the death rate in state i is denoted by μ_i. The rate diagram for this specific birth-and-death process (with state space {0, 1, 2, 3}) takes the simple linear form shown in Figure 1.

[Figure 1: A rate diagram showing the transition rates for our birth-and-death process: birth rates λ_0 = λ_1 = λ_2 = 10 on the arrows 0 → 1 → 2 → 3, and death rates μ_1 = 2, μ_2 = 5, μ_3 = 8 on the arrows 3 → 2 → 1 → 0.]

For a birth-and-death process, the CTMC transition rates take the special form

Q_{i,i+1} = λ_i, Q_{i,i-1} = μ_i, and Q_{i,j} = 0 if j ∉ {i-1, i, i+1}, for 1 ≤ i ≤ n-1,   (1)

Q_{0,1} = λ_0, Q_{0,j} = 0 if j ∉ {0, 1}; Q_{n,n-1} = μ_n and Q_{n,j} = 0 if j ∉ {n-1, n}.   (2)
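Equations (1) and (2) translate directly into code. A minimal sketch of the generator matrix for this particular chain, assuming NumPy; the names birth and death are illustrative:

    # Build the 4x4 generator Q from the rates in Figure 1.
    import numpy as np

    birth = [10.0, 10.0, 10.0]            # lambda_0, lambda_1, lambda_2 (per hour)
    death = [2.0, 5.0, 8.0]               # mu_1, mu_2, mu_3 (per hour)

    n = len(birth) + 1                    # states 0, 1, 2, 3
    Q = np.zeros((n, n))
    for i in range(n - 1):
        Q[i, i + 1] = birth[i]            # Q_{i,i+1} = lambda_i
        Q[i + 1, i] = death[i]            # Q_{i+1,i} = mu_{i+1}
    for i in range(n):
        Q[i, i] = -Q[i].sum()             # diagonal entry makes the row sum zero

    print(Q)
    print(Q.sum(axis=1))                  # all zeros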

As before, the row sums of Q are zero (so that Q_{i,i} = -(λ_i + μ_i), with μ_0 ≡ 0). Here λ_i = 10 for i = 0, 1, 2, while μ_1 = 2, μ_2 = 5, μ_3 = 8, as in Figure 1.

Now we want to compute the limiting steady-state probabilities. Let

α_j ≡ lim_{t→∞} P(X(t) = j | X(0) = i),

which is independent of the initial state i. Then, since we have a BD process, we have the general formula

α_j = r_j / Σ_{k=0}^{3} r_k,

where

r_k ≡ (λ_0 λ_1 ··· λ_{k-1}) / (μ_1 μ_2 ··· μ_k) for 1 ≤ k ≤ 3, with r_0 ≡ 1.

In this particular case we have

k      0    1    2    3
r_k    1    5    10   12.5

so that Σ_{k=0}^{3} r_k = 28.5. Hence we have the limiting steady-state probabilities:

k      0      1       2       3
α_k    2/57   10/57   20/57   25/57

Finally, we must answer the question. The proportion of time that the one barber Xu Li is busy is 1 - α_0 = 55/57. Xu Li is very busy; he could use some help.

(e) What proportion of all potential customers are served in the long run? Is it greater than 1/3?

The long-run rate of service completions is (1 - α_0)μ_1 = (55/57) × 2 per hour, which is clearly less than 2 per hour. On the other hand, potential arrivals arrive at the rate of 10 per hour. So the proportion of all potential customers that are served in the long run is (55/57) × 2/10 = 110/570 = 11/57. Since 11/57 < 11/55 = 0.2, the proportion is less than 1/5, which in turn is less than 1/3. So the final answer is no.
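The computation in parts (d) and (e) can be reproduced exactly in a few lines. A sketch using rational arithmetic; the names are illustrative:

    # Limiting probabilities alpha_j from the product formula, then the
    # busy fraction for part (d) and the served proportion for part (e).
    from fractions import Fraction as F

    birth = [F(10), F(10), F(10)]              # lambda_0, lambda_1, lambda_2
    death = [F(2), F(5), F(8)]                 # mu_1, mu_2, mu_3

    r = [F(1)]                                 # r_0 = 1
    for k in range(3):
        r.append(r[-1] * birth[k] / death[k])  # r_k = (lambda products)/(mu products)

    alpha = [rk / sum(r) for rk in r]          # [2/57, 10/57, 20/57, 25/57]
    print(alpha)

    busy = 1 - alpha[0]                        # part (d): 55/57
    served = busy * death[0] / birth[0]        # part (e): (55/57)(2/10) = 11/57
    print(busy, served, float(served))         # 11/57 ~ 0.193 < 1/3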

2. Brownian motion (25 points)

Let {B(t) : t ≥ 0} be standard Brownian motion, having E[B(t)] = 0 and Var(B(t)) = t for all t ≥ 0. Let t_1 and t_2 be deterministic times satisfying 0 < t_1 < t_2 < ∞. Let T_a be the first time that Brownian motion hits a; i.e., T_a ≡ min {t ≥ 0 : B(t) = a}.

(a) Is the stochastic process {B(t) + 2t : t ≥ 0} a Markov process? Explain.

Yes, the stochastic process {B(t) + 2t : t ≥ 0} is a Markov process. First, the stochastic process {B(t) : t ≥ 0} is itself a Markov process. Both are Markov processes, because the probability of a future event, say at time t, conditional on the present state, say at time s, where 0 ≤ s < t, and conditional on other information about the past, depends only on the present state. That follows from the independent-increments property. The extra deterministic term +2t does not change the independence properties: if we know B(t), then we know B(t) + 2t, and vice versa.

(b) What is E[B(t_1) + B(t_1)B(t_2)]?

First, the expectation of a sum of random variables is the sum of the individual expectations, so that

E[B(t_1) + B(t_1)B(t_2)] = E[B(t_1)] + E[B(t_1)B(t_2)] = 0 + E[B(t_1)B(t_2)] = E[B(t_1)B(t_2)].

Next, we write B(t_2) = B(t_1) + (B(t_2) - B(t_1)) and

E[B(t_1)B(t_2)] = E[B(t_1)(B(t_1) + (B(t_2) - B(t_1)))] = E[B(t_1)^2] + E[B(t_1)(B(t_2) - B(t_1))],

but E[B(t_1)^2] = Var(B(t_1)) = t_1, while, because of independent increments,

E[B(t_1)(B(t_2) - B(t_1))] = E[B(t_1)] E[B(t_2) - B(t_1)] = 0 × 0 = 0.

Hence the answer is E[B(t_1) + B(t_1)B(t_2)] = t_1.

(c) What is the probability P(T_2 < T_{-1} < T_5)?

This part is just like homework exercise 10.5. By the strong Markov property, after the Brownian motion first hits 2 it starts afresh from level 2, so hitting -1 before 5 from level 2 is the same as hitting -3 before +3 from 0, which has probability 1/2 by symmetry. Using the gambler's-ruin hitting probabilities for Brownian motion,

P(T_2 < T_{-1} < T_5) = P(T_2 < T_{-1}) P(T_{-1} < T_5 | T_2 < T_{-1}) = P(T_2 < T_{-1}) P(T_{-3} < T_3) = (1/3)(1/2) = 1/6.
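Part (c) can be checked by Monte Carlo. Brownian motion observed at its successive hitting times of the integers moves like a simple symmetric random walk, so unit steps reproduce the integer-barrier hitting probabilities exactly; a sketch, with the illustrative helper name hits_first:

    # Estimate P(T_2 < T_{-1} < T_5) ~ 1/6 by simulating the embedded walk.
    import random

    random.seed(1)

    def hits_first(start, lo, hi):
        """Barrier (lo or hi) that a symmetric +/-1 walk from start hits first."""
        x = start
        while lo < x < hi:
            x += random.choice((-1, 1))
        return x

    trials = 200_000
    wins = sum(
        hits_first(0, -1, 2) == 2 and hits_first(2, -1, 5) == -1
        for _ in range(trials)
    )
    print(wins / trials)                       # should be near 1/6 ~ 0.1667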

(d) What is the conditional expectation E[B(t_2)^2 - t_2 | B(t_1) = x]?

Again, to exploit independent increments, write B(t_2) = B(t_1) + (B(t_2) - B(t_1)) and get

E[B(t_2)^2 - t_2 | B(t_1) = x] = E[(B(t_1) + (B(t_2) - B(t_1)))^2 - t_2 | B(t_1) = x]
= E[(x + (B(t_2) - B(t_1)))^2] - t_2
= x^2 + 2x E[B(t_2) - B(t_1)] + E[(B(t_2) - B(t_1))^2] - t_2
= x^2 + 2x·0 + (t_2 - t_1) - t_2 = x^2 - t_1.

(e) What does part (d) imply about the stochastic process {B(t)^2 - t : t ≥ 0}?

The fact that E[B(t_2)^2 - t_2 | B(t_1)] = B(t_1)^2 - t_1, together with the Markov property, implies that the stochastic process {B(t)^2 - t : t ≥ 0} is a continuous-time martingale with respect to its internal history, or the history of Brownian motion, i.e.,

E[B(t_2)^2 - t_2 | B(s), 0 ≤ s ≤ t_1] = B(t_1)^2 - t_1.

Note that this is a special case of

E[X(t_2) | Y(s), 0 ≤ s ≤ t_1] = X(t_1),

where the stochastic process {X(t) : t ≥ 0} is a martingale with respect to {Y(t) : t ≥ 0}; here X(t) ≡ B(t)^2 - t and Y(t) ≡ B(t). We also need to verify the finite-moment condition in the definition of a martingale. See the notes of December 7.

3. Patterns (25 points)

Consider successive independent rolls of a single six-sided die (one of two dice). On each roll, the die shows one of the numbers {1, 2, 3, 4, 5, 6}, each with probability 1/6. A given segment of finitely many consecutive outcomes is called a pattern. The pattern is said to occur at roll n if the pattern is completed at roll n. For example, the pattern A ≡ 121 occurs at rolls 7 and 9 in the sequence

123612121251136623543131236666...

and at no other times among the first 30 rolls. Note that the pattern occurrences at rolls 7 and 9 overlap, in that the last 1 of the pattern occurring at n = 7 is simultaneously the first 1 in the pattern occurring at n = 9. That is allowed.

(a) What is the probability that pattern 121 occurs on roll 43?

This is the pattern problem discussed on November 30. The outcomes of successive rolls are i.i.d. random variables. Since we must have the specified outcomes on rolls 41, 42 and 43, the probability is (1/6)^3 = 1/216.

(b) What is the probability that pattern 121 occurs on rolls 43, 48 and 55 (all three)? Explain.

These three occurrences involve disjoint sets of rolls ({41, 42, 43}, {46, 47, 48} and {53, 54, 55}), so the three events are independent, each with probability 1/216. Hence, the probability is (1/216)^3 = (1/6)^9.

(c) Let N(n) be a random variable counting the number of times that pattern 121 appears among the first n rolls of the die. What kind of stochastic process is {N(n) : n ≥ 0}? Explain.

The stochastic process {N(n) : n ≥ 0} is a delayed renewal process. It can be considered a standard continuous-time delayed renewal process {N(t) : t ≥ 0}, where the intervals between renewals take values only in the positive integers. It is delayed because the time until the first
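A short simulation supports the renewal description in part (c): occurrences of 121 arrive at long-run rate 1/216 per roll. A minimal sketch:

    # Long-run occurrence rate of the pattern 121 in i.i.d. die rolls.
    import random

    random.seed(2)
    n = 2_000_000
    rolls = [random.randint(1, 6) for _ in range(n)]
    count = sum(
        rolls[i - 2] == 1 and rolls[i - 1] == 2 and rolls[i] == 1
        for i in range(2, n)
    )
    print(count / n, 1 / 216)                  # the two rates should be close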

renewal has a different distribution from the other times between renewals. See Section 7.9 and the lecture notes of November 30.

(d) Given that pattern 121 occurs at roll 43, what is the expected number of rolls before pattern 121 appears again? Explain.

By Blackwell's renewal theorem, the expected number of renewals at roll n, which here equals the probability of a renewal at roll n, converges to 1/E[X] as n → ∞, where X is the time between renewals. In this example the limit is attained exactly: by part (a), the probability of a renewal at roll n is 1/216 for all n ≥ 3. Hence 1/E[X] = 1/216, so the mean number of rolls until the pattern appears again is E[X] = 216.

(e) What is the expected number of rolls, from the beginning, until pattern 121 first appears?

Let T_A be the number of rolls after pattern A appears until pattern A appears again. Let N_A be the number of rolls until pattern A first appears. Let N_{A→B} be the number of rolls until pattern B first appears after pattern A. Then, because immediately after an occurrence of 121 the progress toward the next occurrence is exactly a trailing 1,

E[N_121] = E[N_1] + E[N_{1→121}] = E[T_1] + E[T_121] = 6 + 216 = 222.
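The answer 222 is also easy to check by simulation. A sketch, with the illustrative helper name rolls_until_121:

    # Estimate the mean number of rolls until 121 first appears.
    import random

    random.seed(3)

    def rolls_until_121():
        last_two = (0, 0)                  # the two most recent rolls
        n = 0
        while True:
            n += 1
            r = random.randint(1, 6)
            if last_two == (1, 2) and r == 1:
                return n
            last_two = (last_two[1], r)

    samples = 20_000
    print(sum(rolls_until_121() for _ in range(samples)) / samples)  # ~ 222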

4. The Knight Errant (The Random Knight) (25 points)

A knight is placed alone on one of the corner squares of a chessboard (having 8 × 8 = 64 squares) and then it is allowed to make a sequence of random moves, taking each of its legal moves in each step with equal probability, independent of the history of its moves up to that time. (Recall that the knight can move either (i) two squares horizontally (left or right) plus one square vertically (up or down) or (ii) one square horizontally (left or right) plus two squares vertically (up or down), provided of course it ends up at one of the other squares on the board.)

(a) Let N(n) be a random variable counting the number of times that the knight visits its initial starting square among the first n random moves. What kind of stochastic process is {N(n) : n ≥ 0}? Explain.

The stochastic process {N(n) : n ≥ 0} is a renewal process. The times between successive visits are i.i.d. The renewal process is not delayed, because the first time has the same distribution as subsequent times between visits. It is not Poisson, because the distribution of each time between visits is not exponential.

(b) What is the long-run proportion of moves that the knight ends up at its initial square? Explain.

This is homework exercise 4.76. We use time reversibility, as in Section 4.8. The long-run proportion of moves that the knight ends up at its initial square is equal to the number of possible moves from its initial square (which is 2) divided by the sum, over all 64 squares, of the number of moves from that square. There are 4 corner squares from which there are 2 moves; there are 8 squares from which there are 3 moves; there are 20 squares from which there are 4 moves; there are 16 squares from which there are 6 moves; and there are 16 squares from which there are 8 moves. Hence, the long-run proportion (or stationary probability solving π = πP) is

2/336 = 1/168.

(c) What is the expected number of moves until the knight first returns to its initial square? Explain.

By Remark (ii) on p. 208, the mean number is the reciprocal of the stationary probability, and so is 168.

(d) What is the approximate probability that the knight is again at its initial square after 1,000,000 moves? Explain.

This is tricky; we need to be careful. The discrete-time Markov chain is periodic, with period 2. This is similar to Markov mouse. It is easy to see if the squares are colored red and black, because then the knight can only move to a red square from a black square, and can only move to a black square from a red square. That is seen also if the squares are numbered consecutively starting in one corner: then the squares of one color all have odd numbers, while the squares of the other color all have even numbers. As a consequence of this periodicity, the chain can only be back in its initial state after an even number of moves; otherwise the probability is zero. Here the number of moves is even, so the probability is positive, but then this probability is approximately twice the stationary probability. Thus the answer here is 2/168 = 1/84. This is consistent with part (b): since the probabilities tend to alternate between 2/168 and 0, the long-run proportion of moves is 1/168.
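Parts (b)-(d) can all be verified by direct computation on the knight's-move graph. A sketch, assuming NumPy; the names are illustrative:

    # Degree counts, stationary probability and the million-move return
    # probability for the random knight on an 8 x 8 board.
    import numpy as np

    JUMPS = [(1, 2), (2, 1), (-1, 2), (-2, 1),
             (1, -2), (2, -1), (-1, -2), (-2, -1)]

    def neighbors(r, c):
        return [(r + dr, c + dc) for dr, dc in JUMPS
                if 0 <= r + dr < 8 and 0 <= c + dc < 8]

    squares = [(r, c) for r in range(8) for c in range(8)]
    index = {s: k for k, s in enumerate(squares)}
    deg = np.array([len(neighbors(r, c)) for r, c in squares])

    print(deg.sum())                           # 336
    print(deg[index[(0, 0)]] / deg.sum())      # part (b): 2/336 = 1/168
    print(deg.sum() / deg[index[(0, 0)]])      # part (c): 168

    P = np.zeros((64, 64))
    for s in squares:
        for m in neighbors(*s):
            P[index[s], index[m]] = 1.0 / deg[index[s]]

    Pn = np.linalg.matrix_power(P, 10**6)      # fast via repeated squaring
    print(Pn[index[(0, 0)], index[(0, 0)]])    # part (d): near 1/84 ~ 0.0119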

(e) Explain how you could compute the expected number of moves until the knight first visits the corner square diagonally opposite to its starting square.

We can create an absorbing Markov chain with a single absorbing state, corresponding to the destination corner square. We then solve for the fundamental matrix; see the lecture notes of September 28. The absorbing-chain transition matrix then has the block form

P = [ I  0 ]
    [ R  Q ],

where I is an identity matrix (1's on the diagonal and 0's elsewhere) and 0 (zero) is a matrix of zeros. In this case, I would be 1 × 1, R is 63 × 1 and Q is 63 × 63. The matrix Q describes the probabilities of motion among the transient states, while the matrix R gives the probabilities of absorption in one step (going from one of the transient states to the single absorbing state in a single step). In general, Q would be square, say m × m, while R would be m × k, and I would be k × k.

We then calculate the fundamental matrix

N = (I - Q)^{-1}.

Then N_{i,j} represents the expected number of times that the chain visits transient state j before absorption, starting in transient state i. We pick the initial state i corresponding to the starting corner. To find the expected total number of moves until absorption, starting in state i (our initial square), we sum N_{i,j} over j.
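This recipe is straightforward to carry out numerically. A sketch, assuming NumPy, that makes the opposite corner (7, 7) absorbing and sums the starting corner's row of N; since the exam only asks for the method, the value is printed rather than asserted:

    # Expected number of moves from corner (0,0) to the opposite corner (7,7).
    import numpy as np

    JUMPS = [(1, 2), (2, 1), (-1, 2), (-2, 1),
             (1, -2), (2, -1), (-1, -2), (-2, -1)]

    def neighbors(r, c):
        return [(r + dr, c + dc) for dr, dc in JUMPS
                if 0 <= r + dr < 8 and 0 <= c + dc < 8]

    transient = [(r, c) for r in range(8) for c in range(8) if (r, c) != (7, 7)]
    index = {s: k for k, s in enumerate(transient)}

    Q = np.zeros((63, 63))
    for s, k in index.items():
        moves = neighbors(*s)
        for m in moves:
            if m in index:                     # moves into (7,7) belong to R
                Q[k, index[m]] = 1.0 / len(moves)

    N = np.linalg.inv(np.eye(63) - Q)          # fundamental matrix
    print(N[index[(0, 0)]].sum())              # expected moves until absorption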