IEOR 3106: Introduction to Operations Research: Stochastic Models
Fall 2006, Professor Whitt
SOLUTIONS to Final Exam
Chapters 4-7 and 10 in Ross, Tuesday, December 19, 4:10pm-7:00pm
Open Book: but only the Ross textbook, three 8.5 × 11 pages of notes, and the class lecture notes on Markov mouse (9/21 and 9/28), continuous-time Markov chains (10/24) and martingales and Brownian motion (12/07).

1. Xu Li's Barbershop (25 points)

Xu Li operates a small barbershop with room for at most three customers: one in service and two waiting. Suppose that potential customers arrive according to a Poisson process at a rate of 10 per hour. Potential arrivals finding the barbershop full, with a customer in service and two other customers waiting, leave and do not affect future arrivals. Successive service times are independent exponential random variables with mean 30 minutes. Waiting customers have limited patience: each waiting customer is willing to wait only an independent, exponentially distributed time with mean 20 minutes before starting service; if the customer has not started service by that time, then the customer abandons, leaving without receiving service.

(a) What is the probability that the time until the first customer arrives, starting empty, is greater than 10 minutes?

The times between arrivals in a Poisson process are independent and identically distributed (i.i.d.) exponential random variables. Let T_1 be the time until the first arrival. Then T_1 has an exponential distribution with rate 10 (per hour), and thus mean 0.1 hour, or 6 minutes. Measuring time in minutes, we have λ = 1/6 per minute, so that

P(T_1 > 10) = e^{-10λ} = e^{-10/6} = e^{-5/3} ≈ 0.189.

(b) What is the variance of the time until the second customer arrives, starting empty, assuming that we measure time in minutes?

The time until the second customer arrives is the sum of two independent exponential random variables, each with a mean of 6 minutes.
The variance of an exponential random variable is the square of its mean, and the variance of a sum of independent random variables is the sum of the variances. If we measure time in minutes, then the variance is 6^2 + 6^2 = 36 + 36 = 72 square minutes. (If instead we were to measure time in hours, then the variance would be (0.1)^2 + (0.1)^2 = 0.02.)

(c) Let N(t) be the number of customers that arrive in the time interval [0, t] to find a totally empty barbershop, again starting empty. What kind of stochastic process is {N(t) : t ≥ 0}?

The stochastic process {N(t) : t ≥ 0} is first clearly a stochastic counting process, because it assumes nonnegative integer values and counts the number of events in the interval [0, t].
Since the state of the system is a Markov process, the epochs at which an arrival finds an empty system are renewal points, and the successive intervals between such events are i.i.d. Thus the stochastic process {N(t) : t ≥ 0} is a renewal process. Since there is a renewal at time 0, it is a genuine renewal process, not a delayed renewal process; the first interval has the same distribution as all the others. It is not a Poisson process, because the intervals between renewals are not exponentially distributed; they have some other, more complicated distribution.

(d) What proportion of time is Xu Li busy serving customers in the long run?

Let X(t) be the number of customers in the system at time t. Now we use the fact that the stochastic process {X(t) : t ≥ 0} can be modelled as a birth-and-death (BD) process; see Chapter 6 of Ross or the CTMC lecture notes. The birth rate in state i is denoted by λ_i, while the death rate in state i is denoted by µ_i. The rate diagram for this specific birth-and-death process (with state space {0, 1, 2, 3}) takes a simple linear form.

[Figure 1: A rate diagram showing the transition rates for our birth-and-death process: birth rates λ_0 = λ_1 = λ_2 = 10 and death rates µ_1 = 2, µ_2 = 5, µ_3 = 8.]

For a birth-and-death process, the CTMC transition rates take the special form

Q_{i,i+1} = λ_i,  Q_{i,i-1} = µ_i  and  Q_{i,j} = 0 if j ∉ {i-1, i, i+1},  for 1 ≤ i ≤ n-1,   (1)

Q_{0,1} = λ_0,  Q_{0,j} = 0 if j ∉ {0, 1},  Q_{n,n-1} = µ_n  and  Q_{n,j} = 0 if j ∉ {n-1, n}.   (2)
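As a numerical sketch (not part of the exam solution), the generator Q for this model and its steady-state distribution can be computed directly, using the rates shown in Figure 1 (λ_0 = λ_1 = λ_2 = 10, µ_1 = 2, µ_2 = 5, µ_3 = 8):

```python
from fractions import Fraction as F

lam = [F(10), F(10), F(10)]   # birth rates in states 0, 1, 2 (per hour)
mu = [F(2), F(5), F(8)]       # death rates in states 1, 2, 3 (per hour)

# Build the generator Q following (1)-(2), choosing each diagonal entry
# so that the row sums are zero.
n = 4
Q = [[F(0)] * n for _ in range(n)]
for i in range(n - 1):
    Q[i][i + 1] = lam[i]      # birth: i -> i+1
    Q[i + 1][i] = mu[i]       # death: i+1 -> i
for i in range(n):
    Q[i][i] = -sum(Q[i][j] for j in range(n) if j != i)
assert all(sum(row) == 0 for row in Q)   # row sums are zero

# Steady state via the birth-and-death product formula:
# r_k = (lam_0 ... lam_{k-1}) / (mu_1 ... mu_k), alpha_k = r_k / sum(r).
r = [F(1)]
for k in range(n - 1):
    r.append(r[-1] * lam[k] / mu[k])
alpha = [x / sum(r) for x in r]

print(alpha)                        # alpha = (2/57, 10/57, 20/57, 25/57)
print(1 - alpha[0])                 # long-run busy fraction, 55/57
print((1 - alpha[0]) * mu[0] / 10)  # long-run fraction of arrivals served, 11/57
```

Exact rational arithmetic with `fractions.Fraction` reproduces the fractions with denominator 57 in the solution, rather than floating-point approximations.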
As before, the row sums of Q are zero. Here λ_i = 10 for i = 0, 1, 2, while µ_1 = 2, µ_2 = 5 and µ_3 = 8, as in Figure 1. (The service rate is 60/30 = 2 per hour and each waiting customer abandons at rate 60/20 = 3 per hour, so µ_1 = 2, µ_2 = 2 + 3 = 5 and µ_3 = 2 + 3 + 3 = 8.) Now we want to compute the limiting steady-state probabilities. Let α_j ≡ lim_{t→∞} P(X(t) = j | X(0) = i), which is independent of the initial state i. Then, since we have a BD process, we have the general formula

α_j = r_j / (r_0 + r_1 + r_2 + r_3),  where r_k ≡ (λ_0 λ_1 ⋯ λ_{k-1}) / (µ_1 µ_2 ⋯ µ_k) for 1 ≤ k ≤ 3, with r_0 ≡ 1.

In this particular case we have

k:    0   1   2    3
r_k:  1   5   10   12.5

so that r_0 + r_1 + r_2 + r_3 = 28.5. Hence we have the limiting steady-state probabilities

k:    0     1      2      3
α_k:  2/57  10/57  20/57  25/57

Finally, we must answer the question: the proportion of time that the one barber, Xu Li, is busy is 1 - α_0 = 55/57. Xu Li is very busy; he could use some help.

(e) What proportion of all potential customers are served in the long run? Is it greater than 1/3?

The rate of service completions is (1 - α_0)µ_1 = (55/57) × 2 per hour, which is clearly less than 2 per hour. On the other hand, potential customers arrive at the rate of 10 per hour. So the proportion of all potential customers that are served in the long run is (55/57) × 2/10 = 11/57. Since 11/57 < 11/55 = 0.2, the proportion is less than 1/5, which in turn is less than 1/3. So the final answer is no.

2. Brownian motion (25 points)

Let {B(t) : t ≥ 0} be standard Brownian motion, having E[B(t)] = 0 and Var(B(t)) = t for all t ≥ 0. Let t_1 and t_2 be deterministic times satisfying 0 < t_1 < t_2 < ∞. Let T_a be the first time that Brownian motion hits a; i.e., T_a ≡ min{t ≥ 0 : B(t) = a}.

(a) Is the stochastic process {B(t) + 2t : t ≥ 0} a Markov process? Explain.

Yes, the stochastic process {B(t) + 2t : t ≥ 0} is a Markov process; so is the stochastic process {B(t) : t ≥ 0} itself. Both are Markov processes, because the probability
of a future event, say at time t, conditional on the present state, say at time s, where 0 ≤ s < t, and conditional on other information about the past, depends only on the present state. That follows from the independent-increments property. The extra deterministic term +2t does not change the independence properties: if we know B(t), then we know B(t) + 2t, and vice versa.

(b) What is E[B(t_1) + B(t_1)B(t_2)]?

First, the expectation of a sum of random variables is the sum of the individual expectations, so that

E[B(t_1) + B(t_1)B(t_2)] = E[B(t_1)] + E[B(t_1)B(t_2)] = 0 + E[B(t_1)B(t_2)] = E[B(t_1)B(t_2)].

Next, we write B(t_2) = B(t_1) + (B(t_2) - B(t_1)) and

E[B(t_1)B(t_2)] = E[B(t_1)(B(t_1) + (B(t_2) - B(t_1)))] = E[B(t_1)B(t_1)] + E[B(t_1)(B(t_2) - B(t_1))],

but E[B(t_1)B(t_1)] = Var(B(t_1)) = t_1, while, because of independent increments,

E[B(t_1)(B(t_2) - B(t_1))] = E[B(t_1)] E[B(t_2) - B(t_1)] = 0 × 0 = 0.

Hence the answer is E[B(t_1) + B(t_1)B(t_2)] = t_1.

(c) What is the probability P(T_2 < T_{-1} < T_5)?

This part is just like a homework exercise. Using the gambler's-ruin hitting probabilities for Brownian motion, we have

P(T_2 < T_{-1} < T_5) = P(T_2 < T_{-1}) P(T_{-1} < T_5 | T_2 < T_{-1}) = P(T_2 < T_{-1}) P(T_{-3} < T_3) = (1/3) × (1/2) = 1/6,

because, starting from 0, the probability of hitting 2 before -1 is 1/(2 + 1) = 1/3, and then, starting from 2, hitting -1 before 5 means going down 3 before going up 3, which has probability 1/2.

(d) What is the conditional expectation E[B(t_2)^2 - t_2 | B(t_1) = x]?

Again, to exploit independent increments, write B(t_2) = B(t_1) + (B(t_2) - B(t_1)) and get

E[B(t_2)^2 - t_2 | B(t_1) = x] = E[(B(t_1) + (B(t_2) - B(t_1)))^2 - t_2 | B(t_1) = x]
= E[(x + (B(t_2) - B(t_1)))^2] - t_2
= x^2 + 2x E[B(t_2) - B(t_1)] + E[(B(t_2) - B(t_1))^2] - t_2
= x^2 + 2x × 0 + (t_2 - t_1) - t_2 = x^2 - t_1.
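As a quick Monte Carlo sketch (not part of the exam solution), the hitting-probability answer 1/6 can be checked with a simple symmetric random walk, which has exactly the same hitting probabilities between integer levels as standard Brownian motion; the barriers 2, -1 and 5 are the ones the factorization (1/3)(1/2) corresponds to:

```python
import random

def hits_up_before_down(up, down, rng):
    """Symmetric +-1 random walk from 0: return True if it reaches +up
    before -down.  For integer barriers these hitting probabilities
    agree exactly with those of standard Brownian motion."""
    pos = 0
    while -down < pos < up:
        pos += rng.choice((-1, 1))
    return pos == up

rng = random.Random(2006)
n = 20000
count = 0
for _ in range(n):
    # Stage 1: hit +2 before -1?  (probability 1/3)
    if hits_up_before_down(2, 1, rng):
        # Stage 2: from level 2, hit -1 before +5, i.e. go down 3
        # before going up 3 relative to the current level (probability 1/2).
        if not hits_up_before_down(3, 3, rng):
            count += 1
estimate = count / n
print(estimate)   # should be close to 1/6 = 0.1667
```

With 20,000 replications the standard error of the estimate is about 0.003, so the estimate should land well within 0.02 of 1/6.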
(e) What does part (d) imply about the stochastic process {B(t)^2 - t : t ≥ 0}?

The fact that E[B(t_2)^2 - t_2 | B(t_1)] = B(t_1)^2 - t_1, together with the Markov property, implies that the stochastic process {B(t)^2 - t : t ≥ 0} is a continuous-time martingale with respect to its internal history, or with respect to the history of Brownian motion, i.e.,

E[B(t_2)^2 - t_2 | B(s), 0 ≤ s ≤ t_1] = B(t_1)^2 - t_1.

Note that this is a special case of E[X(t_2) | Y(s), 0 ≤ s ≤ t_1] = X(t_1), where the stochastic process {X(t) : t ≥ 0} is a martingale with respect to {Y(t) : t ≥ 0}; here X(t) ≡ B(t)^2 - t and Y(t) ≡ B(t). We also need to verify the finite-moment condition in the definition of a martingale. See the notes of December 7.

3. Patterns (25 points)

Consider successive independent rolls of a single six-sided die ("die" being the singular of "dice"). On each roll, the die shows one of the numbers {1, 2, 3, 4, 5, 6}, each with probability 1/6. A given segment of finitely many consecutive outcomes is called a pattern. The pattern is said to occur at roll n if the pattern is completed at roll n. For example, the pattern A = 121 occurs at rolls 7 and 9 in the given sequence of rolls, and at no other times among the first 30 rolls. Note that the pattern occurrences at rolls 7 and 9 overlap, in that the last 1 of the pattern occurring at n = 7 is simultaneously the first 1 of the pattern occurring at n = 9. That is allowed.

(a) What is the probability that pattern 121 occurs on roll 43?

This is the pattern problem discussed on November 30. The outcomes of successive rolls are i.i.d. random variables. Since we must have the specified outcomes on rolls 41, 42 and 43, the probability is (1/6)^3 = 1/216.

(b) What is the probability that pattern 121 occurs on rolls 43, 48 and 55 (all three)? Explain.

These three events are independent, because they are determined by disjoint sets of rolls (rolls 41-43, 46-48 and 53-55), and each has probability 1/216. Hence, the probability is (1/216)^3 = (1/6)^9.

(c) Let N(n) be a random variable counting the number of times that pattern 121 appears among the first n rolls of the die. What kind of stochastic process is {N(n) : n ≥ 0}? Explain.
The stochastic process {N(n) : n ≥ 0} is a delayed renewal process. It can be considered a standard continuous-time delayed renewal process {N(t) : t ≥ 0} in which the intervals between renewals take values only in the positive integers. It is delayed because the time until the first
renewal has a different distribution from the subsequent times between renewals. See Section 7.9 and the lecture notes of November 30.

(d) Given that pattern 121 occurs at roll 43, what is the expected number of rolls before pattern 121 appears again? Explain.

By Blackwell's renewal theorem, the expected number of renewals at roll n converges, as n → ∞, to 1/E[X], where X is the time between renewals; in this example the limit is actually attained for every n ≥ 3. The expected number of renewals at roll n equals the probability of a renewal at roll n, which we determined in part (a) to be 1/216. Hence E[X] = 216, so the mean number of rolls until the pattern appears again is 216.

(e) What is the expected number of rolls, from the beginning, until pattern 121 first appears?

Let T_A be the number of rolls after pattern A appears until pattern A appears again. Let N_A be the number of rolls until pattern A first appears. Let N_{A→B} be the number of rolls until pattern B first appears after pattern A appears. Then

E[N_121] = E[N_1] + E[N_{1→121}] = E[T_1] + E[T_121] = 6 + 216 = 222.

4. The Knight Errant (The Random Knight) (25 points)

A knight is placed alone on one of the corner squares of a chessboard (having 8 × 8 = 64 squares) and then is allowed to make a sequence of random moves, taking each of its legal moves in each step with equal probability, independent of the history of its moves up to that time. (Recall that the knight can move either (i) two squares horizontally (left or right) plus one square vertically (up or down), or (ii) one square horizontally (left or right) plus two squares vertically (up or down), provided of course that it ends up on one of the other squares on the board.)

(a) Let N(n) be a random variable counting the number of times that the knight visits its initial starting square among the first n random moves. What kind of stochastic process is {N(n) : n ≥ 0}? Explain.

The stochastic process {N(n) : n ≥ 0} is a renewal process. The times between successive visits are i.i.d.
The renewal process is not delayed, because the first time has the same distribution as the subsequent times between visits. It is not Poisson, because the distribution of each time is not exponential.

(b) What is the long-run proportion of moves after which the knight ends up at its initial square? Explain.

This is like a homework exercise; we use time reversibility, as in Section 4.8. The long-run proportion of moves after which the knight ends up at its initial square is equal to the number of
possible moves from its initial square (which is 2) divided by the sum, over all 64 squares, of the number of possible moves from that square. There are 4 corner squares from which there are 2 moves; there are 8 squares from which there are 3 moves; there are 20 squares from which there are 4 moves; there are 16 squares from which there are 6 moves; and there are 16 squares from which there are 8 moves. The total is 4 × 2 + 8 × 3 + 20 × 4 + 16 × 6 + 16 × 8 = 336, so the long-run proportion (or stationary probability, solving π = πP) is 2/336 = 1/168.

(c) What is the expected number of moves until the knight first returns to its initial square? Explain.

By Remark (ii) on p. 208, the mean number is the reciprocal of the stationary probability, and so is 168.

(d) What is the approximate probability that the knight is again at its initial square after 1,000,000 moves? Explain.

This is tricky; we need to be careful. The discrete-time Markov chain is periodic, with period 2. This is similar to Markov mouse. It is easy to see if the squares are colored red and black, because then the knight can only move to a red square from a black square, and can only move to a black square from a red square. The same thing can be seen by numbering the squares so that consecutive numbers are adjacent, snaking back and forth along the rows: then the squares of one color receive all the odd numbers, while the squares of the other color receive all the even numbers. As a consequence of this periodicity, the chain can be back in its initial state only after an even number of moves; after an odd number of moves, the probability is zero. Here the number of moves is even, so the probability is positive, and for large even n it is approximately twice the stationary probability. Thus the answer here is 2/168 = 1/84. This is consistent with part (b): since the probabilities tend to alternate between 2/168 and 0, the long-run proportion of times at the initial square is 1/168.

(e) Explain how you could compute the expected number of moves until the knight first visits the corner square diagonally opposite to its starting square.
We can create an absorbing Markov chain with a single absorbing state, corresponding to the destination corner square, and then solve for the fundamental matrix; see the lecture notes of September 28. The absorbing-chain transition matrix then has the block form

P = ( I  0 )
    ( R  Q ),

where I is an identity matrix (1's on the diagonal and 0's elsewhere) and 0 is a matrix of zeros. In this case, I is 1 × 1, R is 63 × 1 and Q is 63 × 63. The matrix Q describes the probabilities of motion among the transient states, while the matrix R gives the probabilities of absorption in one step (going from one of the transient states to the single absorbing state in a single step). In general, Q would be square, say m × m, while R would be m × k and I would be k × k.
We then calculate the fundamental matrix

N = (I - Q)^{-1}.

Then N_{i,j} represents the expected number of times that the chain visits transient state j before absorption, starting in transient state i. We pick the initial state i corresponding to the starting corner. To find the expected total number of moves until absorption, starting in state i (our initial square), we sum N_{i,j} over j.
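As a sketch of this recipe (not part of the exam answer), the computation can be carried out numerically. Here squares are indexed 0 to 63 in row-major order, so squares 0 and 63 are diagonally opposite corners (an indexing convention of this illustration), and rather than inverting I - Q we solve the linear system (I - Q)t = 1, whose solution t gives the row sums of N directly:

```python
# Random-knight hitting times via the absorbing-chain construction.
KNIGHT_STEPS = [(1, 2), (2, 1), (-1, 2), (-2, 1),
                (1, -2), (2, -1), (-1, -2), (-2, -1)]

def legal_moves(sq):
    """Squares reachable by a knight from square sq (0..63, row-major)."""
    r, c = divmod(sq, 8)
    return [8 * (r + dr) + (c + dc) for dr, dc in KNIGHT_STEPS
            if 0 <= r + dr < 8 and 0 <= c + dc < 8]

def expected_moves_to(target):
    """Expected number of moves to first reach `target` from each other
    square, found by solving (I - Q) t = 1, i.e. t_i = sum_j N_{i,j}."""
    transient = [sq for sq in range(64) if sq != target]
    index = {sq: i for i, sq in enumerate(transient)}
    m = len(transient)                           # 63 transient states
    A = [[0.0] * m + [1.0] for _ in range(m)]    # augmented matrix [I - Q | 1]
    for sq in transient:
        i = index[sq]
        A[i][i] += 1.0
        for nb in legal_moves(sq):
            if nb != target:                     # moves into `target` are absorbed
                A[i][index[nb]] -= 1.0 / len(legal_moves(sq))
    for col in range(m):                         # Gauss-Jordan with partial pivoting
        piv = max(range(col, m), key=lambda row: abs(A[row][col]))
        A[col], A[piv] = A[piv], A[col]
        for row in range(m):
            if row != col and A[row][col] != 0.0:
                f = A[row][col] / A[col][col]
                for c in range(col, m + 1):
                    A[row][c] -= f * A[col][c]
    return {sq: A[index[sq]][m] / A[index[sq]][index[sq]] for sq in transient}

degrees = [len(legal_moves(sq)) for sq in range(64)]
assert sum(degrees) == 336                       # total move count from part (b)

# Consistency check: the mean return time to the starting corner is
# 1 + the average hitting time of the corner from its two neighbors,
# which must equal 168, the answer to part (c).
t0 = expected_moves_to(0)
print(1 + sum(t0[nb] for nb in legal_moves(0)) / len(legal_moves(0)))   # 168 (up to roundoff)

# Expected number of moves from corner 0 to the opposite corner 63.
t63 = expected_moves_to(63)
print(t63[0])
```

The 168 cross-check validates the linear-algebra machinery against part (c); by the board's 180-degree rotational symmetry, the hitting time from one corner to the opposite corner is the same in both directions.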
LIMITS FOR QUEUES AS THE WAITING ROOM GROWS by Daniel P. Heyman Ward Whitt Bell Communications Research AT&T Bell Laboratories Red Bank, NJ 07701 Murray Hill, NJ 07974 May 11, 1988 ABSTRACT We study the
More informationContinuous time Markov chains
Continuous time Markov chains Alejandro Ribeiro Dept. of Electrical and Systems Engineering University of Pennsylvania aribeiro@seas.upenn.edu http://www.seas.upenn.edu/users/~aribeiro/ October 16, 2017
More informationContinuous Time Markov Chains
Continuous Time Markov Chains Stochastic Processes - Lecture Notes Fatih Cavdur to accompany Introduction to Probability Models by Sheldon M. Ross Fall 2015 Outline Introduction Continuous-Time Markov
More informationReadings: Finish Section 5.2
LECTURE 19 Readings: Finish Section 5.2 Lecture outline Markov Processes I Checkout counter example. Markov process: definition. -step transition probabilities. Classification of states. Example: Checkout
More informationSolutions For Stochastic Process Final Exam
Solutions For Stochastic Process Final Exam (a) λ BMW = 20 0% = 2 X BMW Poisson(2) Let N t be the number of BMWs which have passes during [0, t] Then the probability in question is P (N ) = P (N = 0) =
More informationStatistics 253/317 Introduction to Probability Models. Winter Midterm Exam Monday, Feb 10, 2014
Statistics 253/317 Introduction to Probability Models Winter 2014 - Midterm Exam Monday, Feb 10, 2014 Student Name (print): (a) Do not sit directly next to another student. (b) This is a closed-book, closed-note
More informationAn Introduction to Stochastic Modeling
F An Introduction to Stochastic Modeling Fourth Edition Mark A. Pinsky Department of Mathematics Northwestern University Evanston, Illinois Samuel Karlin Department of Mathematics Stanford University Stanford,
More informationStatistics 253/317 Introduction to Probability Models. Winter Midterm Exam Friday, Feb 8, 2013
Statistics 253/317 Introduction to Probability Models Winter 2014 - Midterm Exam Friday, Feb 8, 2013 Student Name (print): (a) Do not sit directly next to another student. (b) This is a closed-book, closed-note
More informationQueuing Networks: Burke s Theorem, Kleinrock s Approximation, and Jackson s Theorem. Wade Trappe
Queuing Networks: Burke s Theorem, Kleinrock s Approximation, and Jackson s Theorem Wade Trappe Lecture Overview Network of Queues Introduction Queues in Tandem roduct Form Solutions Burke s Theorem What
More informationLecture 4a: Continuous-Time Markov Chain Models
Lecture 4a: Continuous-Time Markov Chain Models Continuous-time Markov chains are stochastic processes whose time is continuous, t [0, ), but the random variables are discrete. Prominent examples of continuous-time
More informationOutlines. Discrete Time Markov Chain (DTMC) Continuous Time Markov Chain (CTMC)
Markov Chains (2) Outlines Discrete Time Markov Chain (DTMC) Continuous Time Markov Chain (CTMC) 2 pj ( n) denotes the pmf of the random variable p ( n) P( X j) j We will only be concerned with homogenous
More informationPerformance Evaluation of Queuing Systems
Performance Evaluation of Queuing Systems Introduction to Queuing Systems System Performance Measures & Little s Law Equilibrium Solution of Birth-Death Processes Analysis of Single-Station Queuing Systems
More informationIEOR 3106: Introduction to Operations Research: Stochastic Models. Professor Whitt. SOLUTIONS to Homework Assignment 1
IEOR 3106: Introduction to Operations Research: Stochastic Models Professor Whitt SOLUTIONS to Homework Assignment 1 Probability Review: Read Chapters 1 and 2 in the textbook, Introduction to Probability
More information56:171 Operations Research Fall 1998
56:171 Operations Research Fall 1998 Quiz Solutions D.L.Bricker Dept of Mechanical & Industrial Engineering University of Iowa 56:171 Operations Research Quiz
More informationName of the Student:
SUBJECT NAME : Probability & Queueing Theory SUBJECT CODE : MA 6453 MATERIAL NAME : Part A questions REGULATION : R2013 UPDATED ON : November 2017 (Upto N/D 2017 QP) (Scan the above QR code for the direct
More informationMARKOV PROCESSES. Valerio Di Valerio
MARKOV PROCESSES Valerio Di Valerio Stochastic Process Definition: a stochastic process is a collection of random variables {X(t)} indexed by time t T Each X(t) X is a random variable that satisfy some
More informationNon Markovian Queues (contd.)
MODULE 7: RENEWAL PROCESSES 29 Lecture 5 Non Markovian Queues (contd) For the case where the service time is constant, V ar(b) = 0, then the P-K formula for M/D/ queue reduces to L s = ρ + ρ 2 2( ρ) where
More informationExamples of Countable State Markov Chains Thursday, October 16, :12 PM
stochnotes101608 Page 1 Examples of Countable State Markov Chains Thursday, October 16, 2008 12:12 PM Homework 2 solutions will be posted later today. A couple of quick examples. Queueing model (without
More informationExercises Solutions. Automation IEA, LTH. Chapter 2 Manufacturing and process systems. Chapter 5 Discrete manufacturing problems
Exercises Solutions Note, that we have not formulated the answers for all the review questions. You will find the answers for many questions by reading and reflecting about the text in the book. Chapter
More informationBirth-death chain models (countable state)
Countable State Birth-Death Chains and Branching Processes Tuesday, March 25, 2014 1:59 PM Homework 3 posted, due Friday, April 18. Birth-death chain models (countable state) S = We'll characterize the
More informationBIRTH DEATH PROCESSES AND QUEUEING SYSTEMS
BIRTH DEATH PROCESSES AND QUEUEING SYSTEMS Andrea Bobbio Anno Accademico 999-2000 Queueing Systems 2 Notation for Queueing Systems /λ mean time between arrivals S = /µ ρ = λ/µ N mean service time traffic
More informationECE534, Spring 2018: Solutions for Problem Set #4 Due Friday April 6, 2018
ECE534, Spring 2018: s for Problem Set #4 Due Friday April 6, 2018 1. MMSE Estimation, Data Processing and Innovations The random variables X, Y, Z on a common probability space (Ω, F, P ) are said to
More informationDISCRETE STOCHASTIC PROCESSES Draft of 2nd Edition
DISCRETE STOCHASTIC PROCESSES Draft of 2nd Edition R. G. Gallager January 31, 2011 i ii Preface These notes are a draft of a major rewrite of a text [9] of the same name. The notes and the text are outgrowths
More informationL = λ W time average number in Line or system arrival rate W average Waiting time per customer
IEOR 4615, Lecture 3, January 27, 2015 L λ Little s Law* L = λ W time average number in Line or system arrival rate W average Waiting time per customer *J. D. C. Little, A proof of the queueing formula:
More informationEXAM IN COURSE TMA4265 STOCHASTIC PROCESSES Wednesday 7. August, 2013 Time: 9:00 13:00
Norges teknisk naturvitenskapelige universitet Institutt for matematiske fag Page 1 of 7 English Contact: Håkon Tjelmeland 48 22 18 96 EXAM IN COURSE TMA4265 STOCHASTIC PROCESSES Wednesday 7. August, 2013
More informationTime Reversibility and Burke s Theorem
Queuing Analysis: Time Reversibility and Burke s Theorem Hongwei Zhang http://www.cs.wayne.edu/~hzhang Acknowledgement: this lecture is partially based on the slides of Dr. Yannis A. Korilis. Outline Time-Reversal
More informationQUEUING MODELS AND MARKOV PROCESSES
QUEUING MODELS AND MARKOV ROCESSES Queues form when customer demand for a service cannot be met immediately. They occur because of fluctuations in demand levels so that models of queuing are intrinsically
More information(implicitly assuming time-homogeneity from here on)
Continuous-Time Markov Chains Models Tuesday, November 15, 2011 2:02 PM The fundamental object describing the dynamics of a CTMC (continuous-time Markov chain) is the probability transition (matrix) function:
More informationIEOR 3106: Introduction to Operations Research: Stochastic Models. Fall 2011, Professor Whitt. Class Lecture Notes: Thursday, September 15.
IEOR 3106: Introduction to Operations Research: Stochastic Models Fall 2011, Professor Whitt Class Lecture Notes: Thursday, September 15. Random Variables, Conditional Expectation and Transforms 1. Random
More informationQueueing Systems: Lecture 3. Amedeo R. Odoni October 18, Announcements
Queueing Systems: Lecture 3 Amedeo R. Odoni October 18, 006 Announcements PS #3 due tomorrow by 3 PM Office hours Odoni: Wed, 10/18, :30-4:30; next week: Tue, 10/4 Quiz #1: October 5, open book, in class;
More informationQueueing Review. Christos Alexopoulos and Dave Goldsman 10/6/16. (mostly from BCNN) Georgia Institute of Technology, Atlanta, GA, USA
1 / 24 Queueing Review (mostly from BCNN) Christos Alexopoulos and Dave Goldsman Georgia Institute of Technology, Atlanta, GA, USA 10/6/16 2 / 24 Outline 1 Introduction 2 Queueing Notation 3 Transient
More informationUNIVERSITY OF YORK. MSc Examinations 2004 MATHEMATICS Networks. Time Allowed: 3 hours.
UNIVERSITY OF YORK MSc Examinations 2004 MATHEMATICS Networks Time Allowed: 3 hours. Answer 4 questions. Standard calculators will be provided but should be unnecessary. 1 Turn over 2 continued on next
More informationCDA6530: Performance Models of Computers and Networks. Chapter 3: Review of Practical Stochastic Processes
CDA6530: Performance Models of Computers and Networks Chapter 3: Review of Practical Stochastic Processes Definition Stochastic process X = {X(t), t2 T} is a collection of random variables (rvs); one rv
More informationMath 493 Final Exam December 01
Math 493 Final Exam December 01 NAME: ID NUMBER: Return your blue book to my office or the Math Department office by Noon on Tuesday 11 th. On all parts after the first show enough work in your exam booklet
More informationIntroduction to Queuing Networks Solutions to Problem Sheet 3
Introduction to Queuing Networks Solutions to Problem Sheet 3 1. (a) The state space is the whole numbers {, 1, 2,...}. The transition rates are q i,i+1 λ for all i and q i, for all i 1 since, when a bus
More informationPBW 654 Applied Statistics - I Urban Operations Research
PBW 654 Applied Statistics - I Urban Operations Research Lecture 2.I Queuing Systems An Introduction Operations Research Models Deterministic Models Linear Programming Integer Programming Network Optimization
More informationHomework 3 posted, due Tuesday, November 29.
Classification of Birth-Death Chains Tuesday, November 08, 2011 2:02 PM Homework 3 posted, due Tuesday, November 29. Continuing with our classification of birth-death chains on nonnegative integers. Last
More information