Massachusetts Institute of Technology


6.041/6.431: Probabilistic Systems Analysis (Fall 2010)
Problem Set 7: Solutions

1. (a) The event of the ith success occurring before the jth failure is equivalent to the ith success occurring within the first (i + j - 1) trials (since the ith success must occur no later than the trial right before the jth failure). This is equivalent to the event that i or more successes occur in the first (i + j - 1) trials (where we can have, at most, (i + j - 1) successes). Let S_i be the time of the ith success, F_j be the time of the jth failure, and N_k be the number of successes in the first k trials (so N_k is a binomial random variable over k trials). So we have:

P(S_i < F_j) = P(N_{i+j-1} \geq i) = \sum_{k=i}^{i+j-1} \binom{i+j-1}{k} p^k (1-p)^{i+j-1-k}

(b) Let K be the number of successes which occur before the jth failure, and L be the number of trials to get to the jth failure. L is simply a Pascal random variable of order j with parameter 1 - p (since we are now interested in the failures, not the successes). Plugging into the formulas for a jth-order Pascal random variable,

E[L] = \frac{j}{1-p}, \qquad \sigma_L^2 = \frac{jp}{(1-p)^2}.

Since K = L - j,

E[K] = \frac{j}{1-p} - j = \frac{jp}{1-p}, \qquad \sigma_K^2 = \sigma_L^2 = \frac{jp}{(1-p)^2}.

(c) This expression is the same as saying that at most 6 successes occur in the first b trials, i.e., that more than b trials are needed to get the 7th success. Hence a = 6, with b the number of trials referred to in the problem statement.

2. A successful call (door answered and a dog in residence) occurs with probability p = (3/4)(2/3) = 1/2.

(a) Fred will give away his first sample on the third call if the first two calls are failures and the third is a success. Since the trials are independent, the probability of this sequence of events is simply

(1-p)(1-p)p = (1/2)^3 = 1/8.

(b) The event of interest requires failures on the ninth and tenth trials and a success on the eleventh trial. For a Bernoulli process, the outcomes of these three trials are independent of the results of any other trials, and again our answer is

(1-p)(1-p)p = (1/2)^3 = 1/8.

(c) We desire the probability that L_2, the time of the second arrival, is equal to five trials. We know that p_{L_2}(l) is a Pascal PMF of order 2, and we have

p_{L_2}(5) = \binom{4}{1} p^2 (1-p)^3 = 4 \cdot (1/2)^5 = \frac{1}{8}.
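The formula in 1(a) and the Bernoulli calculations in 2(a)-(c) are easy to sanity-check numerically. The short Python sketch below is not part of the original solutions; it compares the closed-form expression for P(S_i < F_j) against a brute-force Monte Carlo estimate. The function names and the example values i = 3, j = 4, p = 0.4 are arbitrary illustrative choices.

import random
from math import comb

def p_success_before_failure(i, j, p):
    # Closed form from 1(a): P(ith success occurs before the jth failure)
    # = P(at least i successes in the first i + j - 1 trials).
    n = i + j - 1
    return sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(i, n + 1))

def simulate(i, j, p, trials=200_000, seed=0):
    # Run Bernoulli trials until either the ith success or the jth failure
    # occurs, and report the fraction of runs in which the success came first.
    rng = random.Random(seed)
    wins = 0
    for _ in range(trials):
        successes = failures = 0
        while successes < i and failures < j:
            if rng.random() < p:
                successes += 1
            else:
                failures += 1
        wins += successes == i
    return wins / trials

if __name__ == "__main__":
    i, j, p = 3, 4, 0.4          # arbitrary example values
    print(p_success_before_failure(i, j, p))
    print(simulate(i, j, p))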

(d) Here we require the conditional probability that the experimental value of L_2 is equal to 5, given that it is greater than 2:

P(L_2 = 5 | L_2 > 2) = \frac{p_{L_2}(5)}{P(L_2 > 2)} = \frac{p_{L_2}(5)}{1 - p_{L_2}(2)} = \frac{\binom{4}{1} p^2 (1-p)^3}{1 - \binom{1}{1} p^2 (1-p)^0} = \frac{1/8}{3/4} = \frac{1}{6}.

(e) The probability that Fred will complete at least five calls before he needs a new supply is equal to the probability that the experimental value of L_2 is greater than or equal to 5:

P(L_2 \geq 5) = 1 - P(L_2 \leq 4) = 1 - \sum_{l=2}^{4} \binom{l-1}{1} p^2 (1-p)^{l-2} = 1 - \left[ \left(\tfrac{1}{2}\right)^2 + 2\left(\tfrac{1}{2}\right)^3 + 3\left(\tfrac{1}{2}\right)^4 \right] = \frac{5}{16}.

(f) Let discrete random variable F represent the number of failures before Fred runs out of samples on his mth successful call. Since L_m is the number of trials up to and including the mth success, we have F = L_m - m. Given that Fred makes L_m calls before he needs a new supply, we can regard each of the F unsuccessful calls as a trial in another Bernoulli process with parameter r, where r is the probability of a "success" (a disappointed dog) obtained by

r = P(dog lives there | Fred did not leave a sample) = \frac{P(\text{dog lives there AND door not answered})}{1 - P(\text{giving away a sample})} = \frac{(2/3)(1/4)}{1/2} = \frac{1}{3}.

We define X_i to be a Bernoulli random variable with parameter r. Then the number of dogs passed up before Fred runs out, D_m, is equal to the sum of F Bernoulli random variables, each with parameter r = 1/3, where F is itself a random variable. In other words,

D_m = X_1 + X_2 + X_3 + \cdots + X_F.

Note that D_m is a sum of a random number of independent random variables. Further, F is independent of the X_i's, since the X_i's are defined in the conditional universe where the door is not answered, in which case whether there is a dog or not does not affect the probability of that trial being a failed trial or not. From our results in class, we can calculate its expectation and variance by

E[D_m] = E[F] E[X], \qquad var(D_m) = E[F] var(X) + (E[X])^2 var(F),

where we make the following substitutions:

E[F] = E[L_m - m] = \frac{m}{p} - m = m.
var(F) = var(L_m - m) = var(L_m) = \frac{m(1-p)}{p^2} = 2m.
E[X] = r = \frac{1}{3}.
var(X) = r(1-r) = \frac{2}{9}.
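As a check on part (f), one can simulate Fred's calls directly and compare the empirical mean and variance of D_m with m/3 and 4m/9. The following Python sketch, not part of the original solutions, assumes the parameters quoted above (door answered with probability 3/4, dog present with probability 2/3, independently); the function name and the choice m = 5 are illustrative only.

import random
import statistics

def simulate_Dm(m, n_runs=100_000, seed=1):
    # Simulate Fred's calls: door answered w.p. 3/4, dog present w.p. 2/3,
    # independently. Count D_m = calls with a dog present but the door
    # unanswered, occurring before the mth successful call (door AND dog).
    rng = random.Random(seed)
    samples = []
    for _ in range(n_runs):
        successes = passed_up = 0
        while successes < m:
            door = rng.random() < 3 / 4
            dog = rng.random() < 2 / 3
            if door and dog:
                successes += 1
            elif dog and not door:
                passed_up += 1
        samples.append(passed_up)
    return statistics.mean(samples), statistics.variance(samples)

if __name__ == "__main__":
    m = 5                          # illustrative value
    mean, var = simulate_Dm(m)
    print(mean, m / 3)             # theory: E[D_m] = m/3
    print(var, 4 * m / 9)          # theory: var(D_m) = 4m/9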

Finally, substituting these values, we have

E[D_m] = m \cdot \frac{1}{3} = \frac{m}{3}, \qquad var(D_m) = m \cdot \frac{2}{9} + \left(\frac{1}{3}\right)^2 \cdot 2m = \frac{4m}{9}.

3. We view the random variables T_1 and T_2 as the first two interarrival times in a Poisson process with rate \lambda, and S as the interarrival time in a second Poisson process (independent from the first) with rate \mu. We are interested in the expected value of the time Z until either the first process has had two arrivals or the second process has had an arrival. Given that the first arrival in the merged process was from the second process, the expected wait time for that arrival would be 1/(\mu + \lambda); the probability of an arrival from the second process is \mu/(\mu + \lambda). Given that the first arrival was from the first process, the expected wait time would be that for the first arrival, 1/(\mu + \lambda), plus the expected wait time for another arrival from the merged process. Similarly, the probability of an arrival from the first process is \lambda/(\mu + \lambda). Thus,

E[Z] = P(\text{Arrival from second process}) E[\text{wait time} | \text{Arrival from second process}] + P(\text{Arrival from first process}) E[\text{wait time} | \text{Arrival from first process}]
     = \frac{\mu}{\mu + \lambda} \cdot \frac{1}{\mu + \lambda} + \frac{\lambda}{\mu + \lambda} \left( \frac{1}{\mu + \lambda} + \frac{1}{\mu + \lambda} \right).

After some simplification, we see that

E[Z] = \frac{1}{\mu + \lambda} + \frac{\lambda}{(\mu + \lambda)^2}.

4. The dot location on the yarn, as related to the size of the pieces of yarn cut for any particular customer, can be viewed in light of the random incidence paradox.

(a) Here, the length of each piece of yarn is exponentially distributed. As explained in the text's discussion of random incidence, due to the memorylessness of the exponential, the distribution of the length of the piece of yarn containing the red dot is a second-order Erlang. Thus,

E[R] = 2 E[L] = \frac{2}{\lambda}.

(b) Think of exponentially spaced marks being made on the yarn, so that the lengths requested by the customers each involve three such sections of exponentially distributed length (since the PDF of L is a third-order Erlang). The piece of yarn with the dot will have the dot in any one of these three sections; the expected length of that section, by (a), will be 2/\lambda, while the expected lengths of the other two sections will be 1/\lambda each. Thus, the total expected length of the piece containing the dot is 4/\lambda.

In general, for processes in which the interarrival intervals, with distribution F_X(x), are IID, the expected length of an arbitrarily chosen interval is E[X^2]/E[X]. We see that for the above parts, this formula is certainly valid.

(c) Using the formula stated above with the PDF of L given in the problem, the two moments follow from the integrals

\int l^2 e^{-l}\, dl = -e^{-l}(l^2 + 2l + 2) + C, \qquad \int l^3 e^{-l}\, dl = -e^{-l}(l^3 + 3l^2 + 6l + 6) + C,

evaluated over the range of L, and then

E[R] = \frac{E[L^2]}{E[L]}.
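Problem 3's answer can be verified by simulating Z = min(T_1 + T_2, S) directly, with T_1, T_2 i.i.d. exponential with rate \lambda and S an independent exponential with rate \mu. A minimal Python sketch (not part of the original solutions), with arbitrary example rates \lambda = 2 and \mu = 3:

import random

def expected_Z(lam, mu, n=500_000, seed=2):
    # Monte Carlo estimate of E[min(T1 + T2, S)], with T1, T2 ~ Exp(lam)
    # i.i.d. and S ~ Exp(mu) independent of them.
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        erlang2 = rng.expovariate(lam) + rng.expovariate(lam)
        s = rng.expovariate(mu)
        total += min(erlang2, s)
    return total / n

if __name__ == "__main__":
    lam, mu = 2.0, 3.0             # arbitrary example rates
    closed_form = 1 / (mu + lam) + lam / (mu + lam) ** 2
    print(expected_Z(lam, mu), closed_form)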

5. (a) We know there are n arrivals in t amount of time, so we are looking for how many extra arrivals there are in the additional s amount of time:

p_{M|N}(m|n) = \frac{(\lambda s)^{m-n} e^{-\lambda s}}{(m-n)!}, \qquad m \geq n \geq 0.

(b) By definition:

p_{N,M}(n,m) = p_{M|N}(m|n)\, p_N(n) = \frac{\lambda^m s^{m-n} t^n e^{-\lambda(s+t)}}{(m-n)!\, n!}, \qquad m \geq n \geq 0.

(c) By definition (using the fact that M is Poisson with parameter \lambda(s+t)):

p_{N|M}(n|m) = \frac{p_{M,N}(m,n)}{p_M(m)} = \binom{m}{n} \frac{s^{m-n} t^n}{(s+t)^m}, \qquad m \geq n \geq 0.

(d) We want to find P(N = n | M = m). Given M = m, we know that the m arrivals are uniformly distributed between 0 and t + s. Consider each arrival a success if it occurs before time t, and a failure otherwise. Therefore, given M = m, N is a binomial random variable with m trials and probability of success t/(t+s). We have the desired probability:

P(N = n | M = m) = \binom{m}{n} \left(\frac{t}{t+s}\right)^n \left(\frac{s}{t+s}\right)^{m-n}, \qquad m \geq n \geq 0.

(e) We can rewrite the expectation as:

E[NM] = E[N(M-N) + N^2] = E[N] E[M-N] + E[N^2] = (\lambda t)(\lambda s) + var(N) + (E[N])^2 = (\lambda t)(\lambda s) + \lambda t + (\lambda t)^2,

where the second equality is obtained via the independent increments property of the Poisson process.

6. The described process for cars passing the checkpoint is a Poisson process with an arrival rate of \lambda = 2 cars per minute.

(a) The first and third moments are, respectively,

E[T] = \frac{1}{\lambda} = \frac{1}{2}, \qquad E[T^3] = \int_0^{\infty} t^3 \lambda e^{-\lambda t}\, dt = \frac{3!}{\lambda^3} \underbrace{\int_0^{\infty} \frac{\lambda^4 t^3 e^{-\lambda t}}{3!}\, dt}_{=1} = \frac{3!}{\lambda^3} = \frac{3}{4},

where we recognized the underbraced integrand to be a 4th-order Erlang PDF, and therefore integrating it over the entire range of the random variable must yield unity.
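Part 5(e)'s formula E[NM] = (\lambda t)(\lambda s) + \lambda t + (\lambda t)^2 can be checked by simulating a single Poisson process and counting arrivals in [0, t] and [0, t + s]. A rough Python sketch, not part of the original solutions; the helper name and the example values \lambda = 1.5, t = 2, s = 3 are my own choices:

import random

def counts_N_M(lam, t, s, rng):
    # One realization of a rate-lam Poisson process on [0, t + s]:
    # return (N, M) = (arrivals in [0, t], arrivals in [0, t + s]).
    n = m = 0
    clock = rng.expovariate(lam)
    while clock <= t + s:
        m += 1
        if clock <= t:
            n += 1
        clock += rng.expovariate(lam)
    return n, m

if __name__ == "__main__":
    lam, t, s = 1.5, 2.0, 3.0      # arbitrary example values
    rng = random.Random(3)
    runs = 200_000
    avg_NM = sum(n * m for n, m in (counts_N_M(lam, t, s, rng) for _ in range(runs))) / runs
    closed_form = (lam * t) * (lam * s) + lam * t + (lam * t) ** 2
    print(avg_NM, closed_form)     # Monte Carlo vs. part 5(e)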

(b) The Poisson process is memoryless, and thus the history of events in the previous 4 minutes does not affect the future. So the conditional PMF for K is equivalent to the unconditional PMF that describes the number of Poisson arrivals in an interval of time, which in this case is \tau = 6 minutes, so that \lambda\tau = 12:

p_K(k) = \frac{12^k e^{-12}}{k!}, \qquad k = 0, 1, 2, \ldots

(c) The first dozen computer cards are used up upon the 36th car arrival. Letting D denote this total time, D = T_1 + T_2 + \cdots + T_{36}, where each independent T_i is exponentially distributed with parameter \lambda = 2. The distribution of D is therefore a 36th-order Erlang distribution, with PDF and expected value of, respectively,

f_D(d) = \frac{2^{36} d^{35} e^{-2d}}{35!}, \quad d \geq 0, \qquad E[D] = 36\, E[T] = 18 \text{ minutes}.

(d) In both experiments, because a card is completed after registering three cars, we are considering the amount of time it takes for three cars to pass the checkpoint. In the second experiment, however, note that the manner in which the particular card is selected is biased towards cards that are in service longer. That is, the time instant at which we come to the corner is more likely to fall within a longer interarrival period: one of the three interarrival times that add up to the total time the card is in service is selected by random incidence (see the end of Section 6.2 in the text).

i. The service time of any particular completed card is given by Y = T_1 + T_2 + T_3, and thus Y is described by a 3rd-order Erlang distribution with parameter \lambda = 2:

E[Y] = \frac{3}{\lambda} = \frac{3}{2}, \qquad var(Y) = \frac{3}{\lambda^2} = \frac{3}{4}.

ii. The service time of a particular completed card with one of the three interarrival times selected by random incidence is W = T_1 + T_2 + L, where L is the interarrival period that contains the time instant we arrived at the corner. Following the arguments in the text, L is Erlang of order two, and thus W is described by a 4th-order Erlang distribution with parameter \lambda = 2:

E[W] = \frac{4}{\lambda} = 2, \qquad var(W) = \frac{4}{\lambda^2} = 1.

G1†. For simplicity, introduce the notation N_i = N(G_i) for i = 1, \ldots, n and N_G = N(G). Then, assuming k_1 + \cdots + k_n = k (otherwise the conditional probability is zero),

P(N_1 = k_1, \ldots, N_n = k_n | N_G = k) = \frac{P(N_1 = k_1, \ldots, N_n = k_n, N_G = k)}{P(N_G = k)}
= \frac{P(N_1 = k_1) \cdots P(N_n = k_n)}{P(N_G = k)}
= \frac{\left(\dfrac{(c_1\lambda)^{k_1} e^{-c_1\lambda}}{k_1!}\right) \cdots \left(\dfrac{(c_n\lambda)^{k_n} e^{-c_n\lambda}}{k_n!}\right)}{\dfrac{(c\lambda)^{k} e^{-c\lambda}}{k!}}
= \frac{k!}{k_1! \cdots k_n!} \left(\frac{c_1}{c}\right)^{k_1} \cdots \left(\frac{c_n}{c}\right)^{k_n}
= \binom{k}{k_1, \ldots, k_n} \left(\frac{c_1}{c}\right)^{k_1} \cdots \left(\frac{c_n}{c}\right)^{k_n}
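The contrast between 6(d)i and 6(d)ii, an ordinary Erlang-3 card versus the card found in service at a random inspection instant, can be reproduced by simulation. The Python sketch below (not part of the original solutions) uses \lambda = 2 as in the problem; the sampling scheme, long runs of consecutive cards with a uniformly chosen inspection time, is my own illustrative construction, so treat the exact setup as an assumption rather than part of the original solution.

import random
import statistics

def erlang3(lam, rng):
    # Service time of one card: three exponential interarrival times.
    return sum(rng.expovariate(lam) for _ in range(3))

if __name__ == "__main__":
    lam = 2.0                      # cars per minute, as in the problem
    rng = random.Random(4)

    # (i) a completed card chosen without length bias: plain Erlang-3
    fresh = [erlang3(lam, rng) for _ in range(100_000)]
    print(statistics.mean(fresh), statistics.variance(fresh))   # ~ 3/2, ~ 3/4

    # (ii) the card in service at a random inspection instant (random incidence)
    sampled = []
    for _ in range(2_000):
        # lay 1000 consecutive cards end to end, then inspect a uniform time
        durations = [erlang3(lam, rng) for _ in range(1_000)]
        horizon = sum(durations)
        u = rng.uniform(0.0, horizon)
        start = 0.0
        for d in durations:
            if start <= u < start + d:
                sampled.append(d)
                break
            start += d
    print(statistics.mean(sampled), statistics.variance(sampled))  # ~ 2, ~ 1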

The result can be interpreted as a multinomial distribution. Imagine we throw an n-sided die k times, where side i comes up with probability p_i = c_i/c. The probability that side i comes up k_i times, for each i, is given by the expression above. Now, relating it back to the Poisson process that we have, each side corresponds to an interval that we sample, and the probability that we sample it depends directly on its relative length. This is consistent with the intuition that, given a number of Poisson arrivals in a specified interval, the arrivals are uniformly distributed over that interval.

† Required for 6.431; optional for 6.041.
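The multinomial interpretation in G1 can also be checked empirically: generate a rate-\lambda Poisson process over disjoint intervals of lengths c_1, ..., c_n, condition on the total count being k, and compare the empirical frequencies of the count vectors with the multinomial PMF. A small Python sketch, not part of the original solutions, with arbitrary lengths (1, 2, 3), \lambda = 1, and k = 4:

import random
from collections import Counter
from math import factorial

def bucket_counts(lam, lengths, rng):
    # Arrival counts of a rate-lam Poisson process in consecutive disjoint
    # intervals of the given lengths, laid end to end.
    counts = [0] * len(lengths)
    total = sum(lengths)
    t = rng.expovariate(lam)
    while t <= total:
        edge = 0.0
        for i, c in enumerate(lengths):
            edge += c
            if t <= edge:
                counts[i] += 1
                break
        t += rng.expovariate(lam)
    return tuple(counts)

def multinomial_pmf(ks, lengths):
    # P(side i comes up ks[i] times in sum(ks) throws of a die whose face
    # probabilities are lengths[i] / sum(lengths)).
    c, k = sum(lengths), sum(ks)
    coef = factorial(k)
    for ki in ks:
        coef //= factorial(ki)
    prob = float(coef)
    for ki, ci in zip(ks, lengths):
        prob *= (ci / c) ** ki
    return prob

if __name__ == "__main__":
    lam, lengths, k = 1.0, (1.0, 2.0, 3.0), 4   # arbitrary example values
    rng, hist = random.Random(5), Counter()
    for _ in range(300_000):
        counts = bucket_counts(lam, lengths, rng)
        if sum(counts) == k:                     # condition on N(G) = k
            hist[counts] += 1
    n_cond = sum(hist.values())
    for ks in sorted(hist):
        print(ks, hist[ks] / n_cond, multinomial_pmf(ks, lengths))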

MIT OpenCourseWare
http://ocw.mit.edu

6.041SC Probabilistic Systems Analysis and Applied Probability
Fall 2013

For information about citing these materials or our Terms of Use, visit: http://ocw.mit.edu/terms.