IEOR 4106: Spring. Solutions to Homework Assignment 7, due on a Tuesday in March.

More of Chapter 5: Read the rest of Section 5.3, skipping the coupon-collecting and insurance-claims examples and Subsection 5.3.6 (software reliability). Read Section 5.4 up to (but not including) the example on busy periods in the single-server queue. Topics: the Poisson process and the nonhomogeneous Poisson process.

31. Let T be the amount of time (in minutes) that the 1:30 pm appointment spends at the doctor's office. Let A be the event {the 1:00 pm appointment leaves before 1:30 pm}, with complement A^c = {the 1:00 pm appointment leaves after 1:30 pm}. Denote the service times of the 1:00 pm and 1:30 pm appointments by S_1 and S_2, respectively. By assumption, S_1 and S_2 are iid exponential with rate 1/30 (mean 30 minutes). By conditioning on whether the 1:00 pm appointment has left or not when the 1:30 pm appointment arrives on time, we obtain

E[T] = E[T | A] P(A) + E[T | A^c] P(A^c)
     = 30 P(S_1 ≤ 30) + (30 + 30) P(S_1 > 30)
     = 30 (1 + P(S_1 > 30))
     = 30 (1 + e^{−30/30})
     = 30 (1 + e^{−1}).

(Here E[T | A^c] = 30 + 30 because, by the lack-of-memory property, the remaining service time of the 1:00 pm appointment at 1:30 is again exponential with mean 30.)

34. Let T_A and T_B be the lifetimes of the two individuals A and B, respectively. Let T_1 be the time until the first kidney arrives and T_2 the time between the first and the second kidney arrivals. Then T_A ~ exp(µ_A), T_B ~ exp(µ_B), and T_i ~ exp(λ), i = 1, 2. All these random variables are mutually independent.

(a) The event that A obtains a new kidney happens when the first kidney arrives before A dies, i.e., when T_1 < T_A, so

P(A obtains a new kidney) = P(T_1 < T_A) = λ/(λ + µ_A).

(b) To obtain the probability that B obtains a new kidney, we condition on the first event, i.e., on whether the first kidney arrives before either A or B dies, whether A dies first, or whether B dies first. We need not consider the case in which B dies first, because then B clearly does not get a kidney.
P(B obtains a new kidney)
= P(B obtains a new kidney | T_1 < T_A, T_1 < T_B) P(T_1 < T_A, T_1 < T_B)
  + P(B obtains a new kidney | T_A < T_1, T_A < T_B) P(T_A < T_1, T_A < T_B)
= P(T_2 < T_B) P(T_1 < T_A, T_1 < T_B) + P(T_1 < T_B) P(T_A < T_1, T_A < T_B)
= (λ/(λ + µ_B)) (λ/(λ + µ_A + µ_B)) + (λ/(λ + µ_B)) (µ_A/(λ + µ_A + µ_B))
= (λ/(λ + µ_B)) ((λ + µ_A)/(λ + µ_A + µ_B)).

We get the last line directly: for B to obtain a kidney, first either a kidney must arrive or A must die (in either case before B dies), and then, afterwards (hence the product, by the lack-of-memory property), a kidney must arrive before B dies.
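These two answers are easy to sanity-check numerically. The following Monte Carlo sketch is not part of the original solutions; the rate values lam, mu_a, mu_b are arbitrary illustrative choices, and the variable names are mine. It simulates Problem 31 directly and plays out the kidney waiting list of Problem 34 event by event.

```python
import math
import random

random.seed(42)
N = 200_000

# Problem 31: time the 1:30 appointment spends at the office.
# Service times are iid exponential with mean 30 minutes.
total = 0.0
for _ in range(N):
    s1 = random.expovariate(1 / 30)   # 1:00 pm appointment's service time
    s2 = random.expovariate(1 / 30)   # 1:30 pm appointment's service time
    wait = max(s1 - 30.0, 0.0)        # time until the server frees up at 1:30
    total += wait + s2
print(total / N, 30 * (1 + math.exp(-1)))   # both should be near 41.04

# Problem 34: kidney waiting list (rates are arbitrary choices).
lam, mu_a, mu_b = 1.0, 0.5, 0.25
a_gets = b_gets = 0
for _ in range(N):
    t1 = random.expovariate(lam)      # arrival time of the first kidney
    t2 = random.expovariate(lam)      # gap between first and second kidneys
    ta = random.expovariate(mu_a)     # A's lifetime
    tb = random.expovariate(mu_b)     # B's lifetime
    if t1 < ta:                       # A is alive: first kidney goes to A
        a_gets += 1
        if t1 + t2 < tb:              # B must survive until the second kidney
            b_gets += 1
    elif t1 < tb:                     # A already dead; B takes the first kidney
        b_gets += 1
print(a_gets / N, lam / (lam + mu_a))
print(b_gets / N, (lam / (lam + mu_b)) * (lam + mu_a) / (lam + mu_a + mu_b))
```

Each printed pair puts the simulation estimate next to the formula from the solution above; they agree to about two decimal places at this sample size.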
39. Recall Problem 4(e) on the midterm exam. We can use a normal approximation, by virtue of the central limit theorem; see Section 2.7 if you need a refresher. By the assumptions, the lifetime is the sum of 196 iid exponential random variables, each with mean 1/2.5 years, i.e.,

S_196 = X_1 + ⋯ + X_196, where E[X_i] = 1/2.5.

(a) The mean is E[S_196] = 196/2.5 = 78.4.

(b) The variance is Var(S_196) = 196 (1/2.5)² = 31.36, so the standard deviation is √31.36 = 5.6.

For parts (c)–(e), use the normal approximation, which follows from the central limit theorem; see Section 2.7. Use the normal probabilities in Table 2.3. Let Z be a standard normal random variable (with mean 0 and variance 1).

(c) P(S_196 < 67.2) ≈ P(Z < (67.2 − 78.4)/5.6) = P(Z < −2.0) ≈ .0228.

(d) P(S_196 > 90) ≈ P(Z > (90 − 78.4)/5.6) = P(Z > 2.07) ≈ .0192.

(e) P(S_196 > 100) ≈ P(Z > (100 − 78.4)/5.6) = P(Z > 3.857) ≈ .00006.

41. (a) S_4 is just the sum of four iid exponential random variables, so E[S_4] = 4/λ.

(b) The conditioning event says that there have been two events up to time 1. Hence, by the lack-of-memory property,

E[S_4 | N(1) = 2] = 1 + E[time for 2 more events] = 1 + 2/λ.

(c) Recall that a Poisson process has stationary and independent increments. So

E[N(4) − N(2) | N(1) = 3] = E[N(4) − N(2)] = E[N(2)] = 2λ.

60. Let {N(t), t ≥ 0} be the Poisson process with rate λ, with time measured in hours. By assumption, N(1) = 2.

(a) The probability that both arrived during the first 20 minutes is
P(N(1/3) = 2 | N(1) = 2)
= P(N(1/3) = 2, N(1) = 2) / P(N(1) = 2)
= P(N(1/3) = 2, N(1) − N(1/3) = 0) / P(N(1) = 2)
= P(N(1/3) = 2) P(N(1) − N(1/3) = 0) / P(N(1) = 2)   (by independent increments)
= [e^{−λ/3} (λ/3)²/2!] e^{−2λ/3} / [e^{−λ} λ²/2!]
= (1/3)² = 1/9.

(b) The probability that at least one arrived during the first 20 minutes is

P(N(1/3) ≥ 1 | N(1) = 2)
= 1 − P(N(1/3) = 0 | N(1) = 2)
= 1 − P(N(1/3) = 0, N(1) = 2) / P(N(1) = 2)
= 1 − P(N(1/3) = 0, N(1) − N(1/3) = 2) / P(N(1) = 2)
= 1 − P(N(1/3) = 0) P(N(1) − N(1/3) = 2) / P(N(1) = 2)   (by independent increments)
= 1 − e^{−λ/3} [e^{−2λ/3} (2λ/3)²/2!] / [e^{−λ} λ²/2!]
= 1 − 4/9 = 5/9.

77 in the 9th edition. (The intended problem; not hard.) Using results for a nonhomogeneous Poisson process with mean-value function m(t),

P(n events between t = 4 and t = 5) = e^{−(m(5)−m(4))} (m(5) − m(4))^n / n! = e^{−2} 2^n / n!,

since here m(5) − m(4) = 2.

77 in the 10th edition. (Rather difficult.) Let T_n be the time between the (n−1)st arrival and the nth arrival, and let S_n be the service time of the nth customer.

(a) P(N = 1) = P(T_2 > S_1) = µ/(µ + λ).

(b) P(N = 2) = P(N > 1) P(N = 2 | N > 1) = P(T_2 < S_1) P(min{U_1, U_2} < T_3)
= (λ/(λ + µ)) (2µ/(λ + 2µ)) = 2λµ / ((λ + µ)(λ + 2µ)),

where we apply the lack-of-memory property to take U_1 and U_2 to be iid exponential random variables distributed as the service times S_n: they are the remaining service times of the two customers in service when the second customer arrives.

(c) Similarly to part (b), one can establish that

P(N > j | N > j − 1) = λ/(λ + jµ),

because there are j + 1 exponential variables competing to be the next event. That is, we exploit the lack-of-memory property to start over after the jth arrival. The remaining time to the next arrival is again exponential with rate λ. There are j customers in service, each with an exponential remaining service time; the first of these to finish is their minimum, which is exponential with rate equal to the sum of the rates, jµ. We then get

P(N > j) = P(N > 1) P(N > 2 | N > 1) ⋯ P(N > j | N > j − 1)
         = (λ/(λ + µ)) (λ/(λ + 2µ)) ⋯ (λ/(λ + jµ)).

Then one can write P(N = j) = P(N > j − 1) − P(N > j).

(d) Given that there are j customers in the system at the moment of the first departure, their remaining service times are iid exponential, so every customer has the same chance to finish first. Therefore,

P(the first customer departs first | N = j) = 1/j,

and

P(the first to arrive is the first to depart) = Σ_{j≥1} (1/j) P(N = j).

(e) Let Z be the time until the first departure. Given that N = j, the time until the first departure is the sum of j + 1 exponential random variables with successively decreasing means. In particular, the conditional mean time until the first departure is

E[Z | N = j] = 1/λ + Σ_{k=1}^{j} 1/(λ + kµ).

Recall that the minimum of independent exponentials is itself exponential with rate equal to the sum of the rates, so while k customers are in service the time to the next event is exponential with rate λ + kµ. Hence,

E[Z] = Σ_{j≥1} E[Z | N = j] P(N = j),

where E[Z | N = j] is given above.
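The "competing exponentials" argument in parts (a)–(c) can be checked by simulation. The sketch below is not part of the original solution; lam and mu are arbitrary illustrative rates, and the helper name is mine. It runs the race round by round, using the fact that the minimum of j exponential(µ) service clocks is exponential with rate jµ and that, by lack of memory, the arrival clock can be redrawn after every event.

```python
import random

random.seed(7)
lam, mu = 1.0, 1.5        # arbitrary illustrative rates
TRIALS = 200_000

def first_departure_count(lam, mu):
    """Return N, the number of customers in the system at the first departure."""
    j = 1                                     # the first customer has arrived
    while True:
        next_arrival = random.expovariate(lam)
        next_departure = random.expovariate(j * mu)  # min of j exp(mu) clocks
        if next_departure < next_arrival:
            return j
        j += 1                                # another arrival beat all services

counts = {}
for _ in range(TRIALS):
    n = first_departure_count(lam, mu)
    counts[n] = counts.get(n, 0) + 1

p1_hat = counts.get(1, 0) / TRIALS
p2_hat = counts.get(2, 0) / TRIALS
p1 = mu / (mu + lam)                                    # part (a)
p2 = (lam / (lam + mu)) * (2 * mu / (lam + 2 * mu))     # part (b)
print(p1_hat, p1)
print(p2_hat, p2)
```

With these rates the formulas give P(N = 1) = 0.6 and P(N = 2) = 0.3, and the empirical frequencies land within sampling error of both.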
78. Measure time t in hours of the day, so the store is open from t = 8 (8 am) to t = 17 (5 pm): the arrival rate is 4 per hour from 8 to 10, 8 per hour from 10 to 12, rises linearly from 8 to 10 per hour between noon and 2 pm (λ(t) = t − 4), and falls linearly from 10 to 4 per hour between 2 pm and 5 pm (λ(t) = −2t + 38). The number of customers that enter the store on a given day is Poisson with mean

m(17) − m(8) = ∫_8^{17} λ(t) dt
             = ∫_8^{10} 4 dt + ∫_{10}^{12} 8 dt + ∫_{12}^{14} (t − 4) dt + ∫_{14}^{17} (−2t + 38) dt
             = 8 + 16 + 18 + 21 = 63.

Two-dimensional Poisson process. Let x be a point in the plane and let B(x, r) be the circular region of radius r centered at x. Let X be the distance from x to the nearest event of a planar Poisson process with rate λ.

(a) P(X > t) = P(no event in B(x, t)) = e^{−λ · Area(B(x,t))} = e^{−λπt²}.

(b) E[X] = ∫_0^∞ P(X > t) dt = ∫_0^∞ e^{−λπt²} dt = 1/(2√λ),

because e^{−λπt²} is proportional to a normal density with mean 0 and variance σ² = 1/(2πλ): the integral over the whole line is √(2πσ²) = 1/√λ, and we take half of it.
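The nearest-event result E[X] = 1/(2√λ) can also be checked empirically. The sketch below is not from the original solutions; the rate lam, the window half-width L, and the helper name are my own choices. It scatters a planar Poisson process on a square large enough that the nearest event to the origin is essentially never outside it, then averages the nearest-event distance.

```python
import math
import random

random.seed(11)
lam = 1.0        # events per unit area (arbitrary choice)
L = 4.0          # half-width of the square; P(X > L) = e^{-16*pi} is negligible
TRIALS = 20_000

def nearest_event_distance(lam, L):
    """Scatter a planar Poisson process on [-L, L]^2 and return the
    distance from the origin to the nearest event."""
    mean = lam * (2 * L) ** 2            # expected number of points in the square
    # Draw a Poisson(mean) count as the number of unit-rate exponential
    # gaps that fit before time `mean`.
    n, t = 0, random.expovariate(1.0)
    while t < mean:
        n += 1
        t += random.expovariate(1.0)
    best = float("inf")
    for _ in range(n):                   # points are uniform given the count
        x = random.uniform(-L, L)
        y = random.uniform(-L, L)
        best = min(best, math.hypot(x, y))
    return best

total = sum(nearest_event_distance(lam, L) for _ in range(TRIALS))
print(total / TRIALS, 1 / (2 * math.sqrt(lam)))   # both should be near 0.5
```

The design choice worth noting is that the point count must be drawn as a Poisson variate (not fixed) for the empty-disk probability e^{−λπt²} in part (a) to hold exactly.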