Chapter 2. Poisson Processes. Prof. Shun-Ren Yang Department of Computer Science, National Tsing Hua University, Taiwan

Outline
- Introduction to Poisson processes
  - Definition of arrival process
  - Definition of renewal process
  - Definition of Poisson process
- Properties of Poisson processes
  - Inter-arrival time distribution
  - Waiting time distribution
  - Superposition and decomposition
- Non-homogeneous Poisson processes (relaxing stationary increments)
- Compound Poisson processes (relaxing single arrivals)
- Modulated Poisson processes (relaxing independent increments)
- Poisson Arrivals See Time Averages (PASTA)

Introduction
[Figure: a sample path of ñ(t) with arrival epochs S̃_0, S̃_1, S̃_2, S̃_3, S̃_4 on the time axis and inter-arrival times x̃_1, x̃_2, x̃_3 between them.]
(i) The nth arrival epoch S̃_n is
$$\tilde{S}_n = \tilde{x}_1 + \tilde{x}_2 + \cdots + \tilde{x}_n = \sum_{i=1}^{n}\tilde{x}_i, \qquad \tilde{S}_0 = 0.$$
(ii) The number of arrivals by time t is ñ(t). Notice that
$$\{\tilde{n}(t) \ge n\} \iff \{\tilde{S}_n \le t\}, \qquad \{\tilde{n}(t) = n\} \iff \{\tilde{S}_n \le t \text{ and } \tilde{S}_{n+1} > t\}.$$

Introduction
Arrival process:
  X = {x̃_i, i = 1, 2, ...}; the x̃_i's can be arbitrary
  S = {S̃_i, i = 0, 1, 2, ...}; the S̃_i's can be arbitrary
  N = {ñ(t), t ≥ 0}; called an arrival process
Renewal process:
  X = {x̃_i, i = 1, 2, ...}; the x̃_i's are i.i.d.
  S = {S̃_i, i = 0, 1, 2, ...}; the S̃_i's have general distributions
  N = {ñ(t), t ≥ 0}; called a renewal process
Poisson process:
  X = {x̃_i, i = 1, 2, ...}; the x̃_i's are i.i.d. and exponentially distributed
  S = {S̃_i, i = 0, 1, 2, ...}; the S̃_i's are Erlang distributed
  N = {ñ(t), t ≥ 0}; called a Poisson process
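
The characterization of a Poisson process through i.i.d. exponential inter-arrival times translates directly into a simulator. The following is a minimal Python sketch (NumPy assumed; the helper name poisson_arrival_epochs is illustrative, not from the slides) that builds the arrival epochs S̃_n by accumulating exponential gaps with mean 1/λ.

```python
import numpy as np

def poisson_arrival_epochs(rate, t_end, rng=None):
    """Arrival epochs S_1 < S_2 < ... <= t_end of a Poisson process with
    the given rate, built by summing i.i.d. exponential inter-arrival
    times with mean 1/rate."""
    rng = np.random.default_rng() if rng is None else rng
    epochs, t = [], 0.0
    while True:
        t += rng.exponential(1.0 / rate)   # next inter-arrival time x_i
        if t > t_end:
            return np.array(epochs)
        epochs.append(t)

if __name__ == "__main__":
    s = poisson_arrival_epochs(rate=2.0, t_end=1000.0, rng=np.random.default_rng(0))
    print("observed arrivals:", len(s), " expected about:", 2.0 * 1000.0)
```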

Counting Processes
A stochastic process N = {ñ(t), t ≥ 0} is said to be a counting process if ñ(t) represents the total number of events that have occurred up to time t. From the definition we see that a counting process ñ(t) must satisfy:
1. ñ(t) ≥ 0.
2. ñ(t) is integer valued.
3. If s < t, then ñ(s) ≤ ñ(t).
4. For s < t, ñ(t) − ñ(s) equals the number of events that have occurred in the interval (s, t].

Definition 1: Poisson Processes
The counting process N = {ñ(t), t ≥ 0} is a Poisson process with rate λ (λ > 0) if:
1. ñ(0) = 0.
2. Independent increments (relaxing this assumption gives the modulated Poisson process):
$$P[\tilde{n}(t) - \tilde{n}(s) = k_1 \mid \tilde{n}(r) = k_2,\ r \le s < t] = P[\tilde{n}(t) - \tilde{n}(s) = k_1].$$
3. Stationary increments (relaxing this assumption gives the non-homogeneous Poisson process):
$$P[\tilde{n}(t+s) - \tilde{n}(t) = k] = P[\tilde{n}(l+s) - \tilde{n}(l) = k].$$
4. Single arrivals (relaxing this assumption gives the compound Poisson process):
$$P[\tilde{n}(h) = 1] = \lambda h + o(h), \qquad P[\tilde{n}(h) \ge 2] = o(h).$$

Definition 2: Poisson Processes
The counting process N = {ñ(t), t ≥ 0} is a Poisson process with rate λ (λ > 0) if:
1. ñ(0) = 0.
2. The process has independent increments.
3. The number of events in any interval of length t is Poisson distributed with mean λt; that is, for all s, t ≥ 0,
$$P[\tilde{n}(t+s) - \tilde{n}(s) = n] = e^{-\lambda t}\,\frac{(\lambda t)^n}{n!}, \qquad n = 0, 1, \ldots$$
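
The two definitions can also be compared numerically: build the process from i.i.d. exponential inter-arrival times and check that the count in an interval of length t matches the Poisson pmf with mean λt. This is a rough sketch under the same NumPy assumption; the helper names are mine, not from the slides.

```python
import math
import numpy as np

def count_in_interval(rate, t, rng):
    """Number of arrivals in (0, t] for a process with i.i.d. exponential
    inter-arrival times of mean 1/rate."""
    n, s = 0, rng.exponential(1.0 / rate)
    while s <= t:
        n += 1
        s += rng.exponential(1.0 / rate)
    return n

def poisson_pmf(k, mean):
    return math.exp(-mean) * mean ** k / math.factorial(k)

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    rate, t, runs = 3.0, 2.0, 100_000
    counts = np.array([count_in_interval(rate, t, rng) for _ in range(runs)])
    for k in range(5):
        print(k, "empirical:", (counts == k).mean(), " Poisson pmf:", poisson_pmf(k, rate * t))
```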

Theorem: Definitions 1 and 2 are equivalent.
Proof. We show that Definition 1 implies Definition 2, and leave the reverse direction to the reader. To start, fix u ≥ 0 and let
$$g(t) = E\left[e^{-u\tilde{n}(t)}\right].$$
We derive a differential equation for g(t) as follows:
$$\begin{aligned}
g(t+h) &= E\left[e^{-u\tilde{n}(t+h)}\right] \\
&= E\left[e^{-u\tilde{n}(t)}\,e^{-u[\tilde{n}(t+h)-\tilde{n}(t)]}\right] \\
&= E\left[e^{-u\tilde{n}(t)}\right]E\left[e^{-u[\tilde{n}(t+h)-\tilde{n}(t)]}\right] &&\text{(by independent increments)} \\
&= g(t)\,E\left[e^{-u\tilde{n}(h)}\right] &&\text{(by stationary increments).} \qquad (1)
\end{aligned}$$

Theorem: Definitions 1 and 2 are equivalent.
Conditioning on whether ñ(h) = 0, ñ(h) = 1, or ñ(h) ≥ 2 yields
$$E\left[e^{-u\tilde{n}(h)}\right] = 1 - \lambda h + o(h) + e^{-u}(\lambda h + o(h)) + o(h) = 1 - \lambda h + e^{-u}\lambda h + o(h). \qquad (2)$$
From (1) and (2), we obtain
$$g(t+h) = g(t)\left(1 - \lambda h + e^{-u}\lambda h\right) + o(h),$$
implying that
$$\frac{g(t+h) - g(t)}{h} = g(t)\,\lambda(e^{-u} - 1) + \frac{o(h)}{h}.$$

Theorem: Definitions 1 and 2 are equivalent.
Letting h → 0 gives
$$g'(t) = g(t)\,\lambda(e^{-u} - 1), \quad\text{or equivalently}\quad \frac{g'(t)}{g(t)} = \lambda(e^{-u} - 1).$$
Integrating, and using g(0) = 1, shows that
$$\log g(t) = \lambda t\,(e^{-u} - 1), \quad\text{or}\quad g(t) = e^{\lambda t(e^{-u} - 1)},$$
which is the Laplace transform of a Poisson random variable with mean λt. Since g(t) is also the Laplace transform of ñ(t), ñ(t) is a Poisson random variable.

The Inter-Arrival Time Distribution
Theorem. A Poisson process has exponentially distributed inter-arrival times; i.e., {x̃_n, n = 1, 2, ...} are i.i.d. and exponentially distributed with parameter λ (mean inter-arrival time 1/λ).
Proof. For x̃_1:
$$P(\tilde{x}_1 > t) = P(\tilde{n}(t) = 0) = \frac{e^{-\lambda t}(\lambda t)^0}{0!} = e^{-\lambda t},$$
so x̃_1 is exponentially distributed with parameter λ. For x̃_2:
$$\begin{aligned}
P(\tilde{x}_2 > t \mid \tilde{x}_1 = s) &= P\{\text{0 arrivals in }(s, s+t] \mid \tilde{x}_1 = s\} \\
&= P\{\text{0 arrivals in }(s, s+t]\} &&\text{(by independent increments)} \\
&= P\{\text{0 arrivals in }(0, t]\} &&\text{(by stationary increments)} \\
&= e^{-\lambda t}.
\end{aligned}$$
Hence x̃_2 is independent of x̃_1, and x̃_2 is exponentially distributed with parameter λ. The same argument applies to the remaining x̃_i's.

The Arrival Time Distribution of the nth Event
Theorem. The arrival time of the nth event, S̃_n (also called the waiting time until the nth event), is Erlang distributed with parameters (n, λ).
Proof.
Method 1:
$$P[\tilde{S}_n \le t] = P[\tilde{n}(t) \ge n] = \sum_{k=n}^{\infty}\frac{e^{-\lambda t}(\lambda t)^k}{k!} \;\Longrightarrow\; f_{\tilde{S}_n}(t) = \frac{\lambda e^{-\lambda t}(\lambda t)^{n-1}}{(n-1)!} \quad\text{(exercise).}$$
Method 2:
$$\begin{aligned}
f_{\tilde{S}_n}(t)\,dt &= dF_{\tilde{S}_n}(t) = P[t < \tilde{S}_n < t + dt] \\
&= P\{n-1 \text{ arrivals in }(0, t] \text{ and } 1 \text{ arrival in }(t, t+dt)\} + o(dt) \\
&= P[\tilde{n}(t) = n-1 \text{ and } 1 \text{ arrival in }(t, t+dt)] + o(dt) \\
&= P[\tilde{n}(t) = n-1]\,P[1 \text{ arrival in }(t, t+dt)] + o(dt) \quad\text{(why?)} \\
&= \frac{e^{-\lambda t}(\lambda t)^{n-1}}{(n-1)!}\,[\lambda\,dt + o(dt)] + o(dt).
\end{aligned}$$
Taking the limit,
$$f_{\tilde{S}_n}(t) = \lim_{dt \to 0}\frac{f_{\tilde{S}_n}(t)\,dt}{dt} = \frac{\lambda e^{-\lambda t}(\lambda t)^{n-1}}{(n-1)!}.$$
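
Method 1 rests on the identity P[S̃_n ≤ t] = P[ñ(t) ≥ n]. The sketch below (NumPy assumed; names are illustrative) checks it by simulating S̃_n directly as a sum of n exponential inter-arrival times and comparing with the complementary Poisson series.

```python
import math
import numpy as np

def erlang_cdf_via_poisson(n, rate, t):
    """P[S_n <= t] = P[N(t) >= n] = 1 - sum_{k=0}^{n-1} e^{-rate*t} (rate*t)^k / k!"""
    return 1.0 - sum(math.exp(-rate * t) * (rate * t) ** k / math.factorial(k)
                     for k in range(n))

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    n, rate, t = 4, 1.5, 3.0
    # S_n simulated directly as a sum of n i.i.d. exponential inter-arrival times.
    s_n = rng.exponential(1.0 / rate, size=(200_000, n)).sum(axis=1)
    print("simulated P[S_n <= t]:", (s_n <= t).mean())
    print("series    P[S_n <= t]:", erlang_cdf_via_poisson(n, rate, t))
```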

Conditional Distribution of the Arrival Times
Theorem. Given that ñ(t) = n, the n arrival times S̃_1, S̃_2, ..., S̃_n have the same distribution as the order statistics corresponding to n i.i.d. random variables uniformly distributed on (0, t).
Order Statistics. Let x̃_1, x̃_2, ..., x̃_n be n i.i.d. continuous random variables with common pdf f. Define x̃_(k) as the kth smallest value among the x̃_i's, i.e., x̃_(1) ≤ x̃_(2) ≤ ... ≤ x̃_(n); then x̃_(1), ..., x̃_(n) are known as the order statistics corresponding to x̃_1, ..., x̃_n. The joint pdf of x̃_(1), x̃_(2), ..., x̃_(n) is
$$f_{\tilde{x}_{(1)},\ldots,\tilde{x}_{(n)}}(x_1, x_2, \ldots, x_n) = n!\,f(x_1)f(x_2)\cdots f(x_n), \qquad x_1 < x_2 < \cdots < x_n$$
(check the textbook [Ross]).

Conditional Distribution of the Arrival Times
Proof. Let 0 < t_1 < t_2 < ... < t_{n+1} = t and let h_i be small enough so that t_i + h_i < t_{i+1}, i = 1, ..., n. Then
$$\begin{aligned}
P[t_i < \tilde{S}_i < t_i + h_i,\ i = 1, \ldots, n \mid \tilde{n}(t) = n]
&= \frac{P\{\text{exactly one arrival in each }[t_i, t_i + h_i],\ i = 1, \ldots, n,\ \text{and no arrival elsewhere in }[0, t]\}}{P[\tilde{n}(t) = n]} \\
&= \frac{(e^{-\lambda h_1}\lambda h_1)(e^{-\lambda h_2}\lambda h_2)\cdots(e^{-\lambda h_n}\lambda h_n)\,e^{-\lambda(t - h_1 - h_2 - \cdots - h_n)}}{e^{-\lambda t}(\lambda t)^n / n!} \\
&= \frac{n!\,(h_1 h_2 h_3 \cdots h_n)}{t^n},
\end{aligned}$$
so
$$\frac{P[t_i < \tilde{S}_i < t_i + h_i,\ i = 1, \ldots, n \mid \tilde{n}(t) = n]}{h_1 h_2 \cdots h_n} = \frac{n!}{t^n}.$$

Conditional Distribution of the Arrival Times
Taking the limit as h_i → 0, i = 1, ..., n, we obtain
$$f_{\tilde{S}_1, \tilde{S}_2, \ldots, \tilde{S}_n \mid \tilde{n}(t)}(t_1, t_2, \ldots, t_n \mid n) = \frac{n!}{t^n}, \qquad 0 < t_1 < t_2 < \cdots < t_n < t.$$
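
The order-statistics property can be seen by simulation: conditioned on ñ(t) = n, the arrival times should be distributed like n sorted Uniform(0, t) variables. A hedged sketch follows (NumPy assumed; it keeps only the runs with exactly n arrivals, so it is simple rather than efficient).

```python
import numpy as np

def arrivals_given_n(rate, t, n, runs, rng):
    """Collect arrival-time vectors of a Poisson process on (0, t],
    keeping only realizations with exactly n arrivals."""
    kept = []
    for _ in range(runs):
        epochs, s = [], rng.exponential(1.0 / rate)
        while s <= t:
            epochs.append(s)
            s += rng.exponential(1.0 / rate)
        if len(epochs) == n:
            kept.append(epochs)
    return np.array(kept)

if __name__ == "__main__":
    rng = np.random.default_rng(3)
    rate, t, n = 2.0, 1.0, 3
    s = arrivals_given_n(rate, t, n, runs=100_000, rng=rng)
    u = np.sort(rng.uniform(0.0, t, size=(len(s), n)), axis=1)  # uniform order statistics
    print("conditional mean arrival times:", s.mean(axis=0))
    print("uniform order-statistic means: ", u.mean(axis=0))
```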

Conditional Distribution of the Arrival Times
Example (see Ref. [Ross], Ex. 2.3(A), p. 68). Suppose that travellers arrive at a train depot in accordance with a Poisson process with rate λ. If the train departs at time t, what is the expected sum of the waiting times of the travellers arriving in (0, t)? That is, what is
$$E\left[\sum_{i=1}^{\tilde{n}(t)}(t - \tilde{S}_i)\right]?$$
[Figure: arrival epochs S̃_1, S̃_2, S̃_3, ..., S̃_{ñ(t)} on the time axis (0, t).]

Conditional Distribution of the Arrival Times
Answer.
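
As a numerical companion to the example above, the following sketch estimates E[Σ_{i=1}^{ñ(t)} (t − S̃_i)] by simulation; by the order-statistics property this expectation equals λt²/2 (Ross, Ex. 2.3(A)). NumPy is assumed and the helper name is mine.

```python
import numpy as np

def expected_total_wait(rate, t, runs, rng):
    """Monte Carlo estimate of E[sum_{i <= N(t)} (t - S_i)] for a Poisson
    process with the given rate observed over (0, t)."""
    total = 0.0
    for _ in range(runs):
        s = rng.exponential(1.0 / rate)
        while s <= t:
            total += t - s
            s += rng.exponential(1.0 / rate)
    return total / runs

if __name__ == "__main__":
    rate, t = 4.0, 2.0
    est = expected_total_wait(rate, t, runs=100_000, rng=np.random.default_rng(4))
    print("simulated:", est, " closed form lambda*t^2/2:", rate * t * t / 2)
```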

Superposition of Independent Poisson Processes
Theorem. The superposition of N independent Poisson processes with rates λ_i, i = 1, ..., N, is also a Poisson process, with rate $\sum_{i=1}^{N}\lambda_i$.
[Figure: N independent Poisson streams with rates λ_1, λ_2, ..., λ_N merged into a single stream with rate $\sum_{i=1}^{N}\lambda_i$.]
<Homework> Prove the theorem (note that a Poisson process must satisfy Definition 1 or 2).
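
A quick numerical check of the superposition theorem (NumPy assumed; helper names are mine): merge several independently generated streams and verify that the merged count over (0, t] has mean and variance both close to (Σ_i λ_i)t, as a Poisson count should.

```python
import numpy as np

def poisson_epochs(rate, t_end, rng):
    """Arrival epochs of a Poisson process with the given rate on (0, t_end]."""
    out, s = [], rng.exponential(1.0 / rate)
    while s <= t_end:
        out.append(s)
        s += rng.exponential(1.0 / rate)
    return np.array(out)

if __name__ == "__main__":
    rng = np.random.default_rng(5)
    rates, t_end, runs = [1.0, 2.5, 0.5], 1.0, 50_000
    counts = []
    for _ in range(runs):
        merged = np.sort(np.concatenate([poisson_epochs(r, t_end, rng) for r in rates]))
        counts.append(len(merged))
    counts = np.array(counts)
    lam = sum(rates) * t_end
    # For a Poisson(lam) count, mean and variance both equal lam.
    print("mean:", counts.mean(), " variance:", counts.var(), " expected:", lam)
```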

Decomposition of a Poisson Process
Theorem. Given a Poisson process N = {ñ(t), t ≥ 0} with rate λ, suppose that
- ñ_i(t) represents the number of type-i events that occur by time t, i = 1, 2; and
- an arrival occurring at time s is a type-1 arrival with probability p(s) and a type-2 arrival with probability 1 − p(s).
Then ñ_1 and ñ_2 are independent, ñ_1(t) ∼ P(k; λtp̄) and ñ_2(t) ∼ P(k; λt(1 − p̄)), where
$$\bar{p} = \frac{1}{t}\int_0^t p(s)\,ds.$$

Decomposition of a Poisson Process
[Figure: a Poisson stream of rate λ is split, each arrival at time s routed to stream 1 with probability p(s) and to stream 2 with probability 1 − p(s); over (0, t) the two output streams are Poisson with means $\lambda\int_0^t p(s)\,ds$ and $\lambda\int_0^t [1 - p(s)]\,ds$, respectively.]
Special case: if p(s) = p is constant, then the two output streams are Poisson processes with rates λp and λ(1 − p), respectively.
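
The constant-p special case is easy to check by thinning a simulated stream (NumPy assumed; names are mine): the two sub-streams should have Poisson counts with means λpt and λ(1 − p)t and be essentially uncorrelated.

```python
import numpy as np

def split_poisson(rate, p, t_end, rng):
    """Thin a rate-`rate` Poisson stream on (0, t_end]: each arrival is
    type 1 with probability p and type 2 otherwise."""
    n1 = n2 = 0
    s = rng.exponential(1.0 / rate)
    while s <= t_end:
        if rng.random() < p:
            n1 += 1
        else:
            n2 += 1
        s += rng.exponential(1.0 / rate)
    return n1, n2

if __name__ == "__main__":
    rng = np.random.default_rng(6)
    rate, p, t_end, runs = 3.0, 0.3, 1.0, 100_000
    counts = np.array([split_poisson(rate, p, t_end, rng) for _ in range(runs)])
    print("type-1 mean:", counts[:, 0].mean(), " expected:", rate * p * t_end)
    print("type-2 mean:", counts[:, 1].mean(), " expected:", rate * (1 - p) * t_end)
    print("covariance (should be near 0):", np.cov(counts.T)[0, 1])
```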

Decomposition of a Poisson Process
Proof.

Decomposition of a Poisson Process
Example (An Infinite-Server Queue, textbook [Ross]). Customers arrive according to a Poisson process with rate λ and each immediately enters service.
[Figure: Poisson(λ) arrivals entering a station with service distribution G; departures leave after service.]
- G_s(t) = P(S̃ ≤ t), where S̃ is the service time; the service times are independent of each other and of the arrival process.
- ñ_1(t): the number of customers that have left before t.
- ñ_2(t): the number of customers that are still in the system at time t.
What are the distributions of ñ_1(t) and ñ_2(t)?

Decomposition of a Poisson Process
Answer.
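
As a numerical companion to the infinite-server example, the sketch below simulates such a system with Uniform(0, 1) service times and compares the counts against the standard M/G/∞ result (Ross) that the number still in service at time t is Poisson with mean $\lambda\int_0^t [1 - G(y)]\,dy$, while the number departed is Poisson with mean $\lambda\int_0^t G(y)\,dy$. NumPy is assumed; the helper names and the choice of G are mine.

```python
import numpy as np

def mg_inf_counts(rate, t, service_sampler, runs, rng):
    """For an M/G/infinity system observed at time t, return pairs
    (departed, still in system) over independent replications."""
    out = []
    for _ in range(runs):
        epochs, s = [], rng.exponential(1.0 / rate)   # Poisson(rate) arrivals on (0, t]
        while s <= t:
            epochs.append(s)
            s += rng.exponential(1.0 / rate)
        epochs = np.array(epochs)
        done = (epochs + service_sampler(rng, len(epochs))) <= t
        out.append((done.sum(), (~done).sum()))
    return np.array(out)

if __name__ == "__main__":
    rng = np.random.default_rng(7)
    rate, t = 3.0, 2.0
    service = lambda rng, k: rng.uniform(0.0, 1.0, size=k)   # G = Uniform(0, 1)
    c = mg_inf_counts(rate, t, service, runs=50_000, rng=rng)
    in_system_mean = rate * 0.5   # lambda * integral_0^t (1 - G(y)) dy = lambda/2 for t >= 1
    print("mean in system:", c[:, 1].mean(), " expected:", in_system_mean)
    print("mean departed :", c[:, 0].mean(), " expected:", rate * t - in_system_mean)
```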

Non-homogeneous Poisson Processes
The counting process N = {ñ(t), t ≥ 0} is said to be a non-stationary or non-homogeneous Poisson process with time-varying intensity function λ(t), t ≥ 0, if:
1. ñ(0) = 0.
2. N has independent increments.
3. P[ñ(t + h) − ñ(t) ≥ 2] = o(h).
4. P[ñ(t + h) − ñ(t) = 1] = λ(t)h + o(h).
Define the integrated intensity function $m(t) = \int_0^t \lambda(t')\,dt'$.
Theorem.
$$P[\tilde{n}(t+s) - \tilde{n}(t) = n] = e^{-[m(t+s) - m(t)]}\,\frac{[m(t+s) - m(t)]^n}{n!}.$$
Proof. <Homework>.
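
A common way to simulate a non-homogeneous Poisson process is thinning (Lewis–Shedler), a technique not covered on the slides: generate a homogeneous process at a rate λ_max that dominates λ(t) and accept an epoch at time s with probability λ(s)/λ_max. The sketch below (NumPy assumed; names are mine) also checks that E[ñ(t)] = m(t).

```python
import math
import numpy as np

def nhpp_epochs(intensity, lam_max, t_end, rng):
    """Simulate a non-homogeneous Poisson process on (0, t_end] by
    thinning a homogeneous process of rate lam_max >= intensity(s)."""
    epochs, s = [], 0.0
    while True:
        s += rng.exponential(1.0 / lam_max)          # candidate epoch
        if s > t_end:
            return np.array(epochs)
        if rng.random() < intensity(s) / lam_max:    # accept w.p. lambda(s)/lam_max
            epochs.append(s)

if __name__ == "__main__":
    rng = np.random.default_rng(8)
    intensity = lambda s: 2.0 + math.sin(s)          # example intensity, bounded by 3
    t_end = 4.0
    counts = [len(nhpp_epochs(intensity, 3.0, t_end, rng)) for _ in range(50_000)]
    m = 2.0 * t_end + (1.0 - math.cos(t_end))        # m(t) = integral of the intensity
    print("mean count:", np.mean(counts), " m(t_end):", m)
```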

Non-homogeneous Poisson Processes
Example. The output (departure) process of the M/G/∞ queue is a non-homogeneous Poisson process with intensity function λ(t) = λG(t), where G is the service distribution.
Hint. Let D(s, s + r) denote the number of service completions in the interval (s, s + r] ⊂ (0, t]. If we can show that D(s, s + r) follows a Poisson distribution with mean $\lambda\int_s^{s+r} G(y)\,dy$, and that the numbers of service completions in disjoint intervals are independent, then we are finished, by the definition of a non-homogeneous Poisson process.

Non-homogeneous Poisson Processes
Answer.

Non-homogeneous Poisson Processes
Because of the independent-increments assumption of the Poisson arrival process, and the fact that there are always servers available for arrivals, the departure process has independent increments.

Compound Poisson Processes
A stochastic process {x̃(t), t ≥ 0} is said to be a compound Poisson process if it can be represented as
$$\tilde{x}(t) = \sum_{i=1}^{\tilde{n}(t)}\tilde{y}_i, \qquad t \ge 0,$$
where
- {ñ(t), t ≥ 0} is a Poisson process, and
- {ỹ_i, i ≥ 1} is a family of independent and identically distributed random variables that are also independent of {ñ(t), t ≥ 0}.
The random variable x̃(t) is said to be a compound Poisson random variable.
E[x̃(t)] = ? and Var[x̃(t)] = ?
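
The two blanks above are the standard compound-Poisson identities E[x̃(t)] = λt E[ỹ_1] and Var[x̃(t)] = λt E[ỹ_1²] (obtained, e.g., by conditioning on ñ(t)). The sketch below (NumPy assumed; the names and the batch-size distribution are mine) verifies both numerically.

```python
import numpy as np

def compound_poisson(rate, t, y_sampler, runs, rng):
    """Simulate x(t) = sum_{i=1}^{N(t)} y_i with N(t) ~ Poisson(rate*t)
    and i.i.d. y_i drawn by y_sampler, independent of N(t)."""
    n = rng.poisson(rate * t, size=runs)
    return np.array([y_sampler(rng, k).sum() for k in n])

if __name__ == "__main__":
    rng = np.random.default_rng(9)
    rate, t = 2.0, 3.0
    batch = lambda rng, k: rng.integers(1, 5, size=k)   # batch sizes uniform on {1, 2, 3, 4}
    x = compound_poisson(rate, t, batch, runs=200_000, rng=rng)
    ey, ey2 = 2.5, (1 + 4 + 9 + 16) / 4                 # E[y], E[y^2] for this batch law
    print("mean:", x.mean(), " lambda*t*E[y]  :", rate * t * ey)
    print("var :", x.var(),  " lambda*t*E[y^2]:", rate * t * ey2)
```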

Compound Poisson Processes
Example (Batch Arrival Process). Consider a parallel-processing system where each job arrival consists of a possibly random number of tasks. Then we can model the arrival process as a compound Poisson process, which is also called a batch arrival process. Let ỹ_i be a random variable that denotes the number of tasks comprising a job. We derive the probability generating function P_{x̃(t)}(z) as follows:
$$\begin{aligned}
P_{\tilde{x}(t)}(z) &= E\left[z^{\tilde{x}(t)}\right] = E\left[z^{\tilde{y}_1 + \cdots + \tilde{y}_{\tilde{n}(t)}}\right] \\
&= E\left[E\left[z^{\tilde{y}_1 + \cdots + \tilde{y}_{\tilde{n}(t)}} \mid \tilde{n}(t)\right]\right] &&\text{(by independence of } \tilde{n}(t) \text{ and } \{\tilde{y}_i\}\text{)} \\
&= E\left[E\left[z^{\tilde{y}_1}\right]\cdots E\left[z^{\tilde{y}_{\tilde{n}(t)}}\right]\right] &&\text{(by independence of } \tilde{y}_1, \ldots, \tilde{y}_{\tilde{n}(t)}\text{)} \\
&= E\left[\left(P_{\tilde{y}}(z)\right)^{\tilde{n}(t)}\right] = P_{\tilde{n}(t)}\left(P_{\tilde{y}}(z)\right).
\end{aligned}$$

Modulated Poisson Processes
Assume that there are two states, 0 and 1, for a modulating process.
[Figure: two-state diagram with transitions between state 0 and state 1.]
When the state of the modulating process equals 0, the arrival rate of customers is λ_0; when it equals 1, the arrival rate is λ_1. The residence time in a particular modulating state is exponentially distributed with parameter µ and, after expiration of this time, the modulating process changes state. The initial state of the modulating process is randomly selected and is equally likely to be state 0 or 1.

Modulated Poisson Processes
For a given period of time (0, t), let Υ be a random variable that indicates the total amount of time that the modulating process has been in state 0, and let x̃(t) be the number of arrivals in (0, t). Then, given Υ, the value of x̃(t) is distributed as a non-homogeneous Poisson process, and thus
P[x̃(t) = n | Υ = τ] = ?
As µ → 0, the probability that the modulating process makes no transitions within t seconds converges to 1, and we expect for this case that
P[x̃(t) = n] = ?

Modulated Poisson Processes
As µ → ∞, the modulating process makes an infinite number of transitions within t seconds, and we expect for this case that
P[x̃(t) = n] = ?, where β = (λ_0 + λ_1)/2.
Example (Modeling Voice). A basic feature of speech is that it comprises an alternation of silent periods and non-silent (talk-spurt) periods. The arrival rate of packets during a talk-spurt period is Poisson with rate λ_1, and silent periods produce a Poisson stream with rate λ_0 ≈ 0. The durations of the talk and silent periods are exponentially distributed with parameters µ_1 and µ_0, respectively. The arrival stream of packets is then modeled by a modulated Poisson process.
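
A two-state modulated Poisson process is also straightforward to simulate: alternate exponentially distributed residence periods and generate Poisson arrivals at the rate of the current state. The sketch below (NumPy assumed; names are mine) uses equal switching rates, so the long-run arrival rate is β = (λ_0 + λ_1)/2.

```python
import numpy as np

def modulated_poisson_count(lam, mu, t_end, rng):
    """Arrivals in (0, t_end] of a two-state modulated Poisson process:
    in state i arrivals occur at rate lam[i]; the residence time in
    state i is exponential with parameter mu[i]."""
    state = int(rng.integers(0, 2))              # initial state equally likely 0 or 1
    t, count = 0.0, 0
    while t < t_end:
        stay = rng.exponential(1.0 / mu[state])
        seg = min(stay, t_end - t)
        count += rng.poisson(lam[state] * seg)   # arrivals while in this state
        t += stay
        state = 1 - state
    return count

if __name__ == "__main__":
    rng = np.random.default_rng(10)
    lam, mu, t_end = (0.2, 5.0), (1.0, 1.0), 10.0
    counts = [modulated_poisson_count(lam, mu, t_end, rng) for _ in range(50_000)]
    beta = (lam[0] + lam[1]) / 2                 # long-run rate when mu_0 = mu_1
    print("mean count:", np.mean(counts), " beta * t:", beta * t_end)
```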

Poisson Arrivals See Time Averages (PASTA)
PASTA says: as t → ∞,
Fraction of arrivals who see the system in a given state upon arrival (arrival average)
= Fraction of time the system is in a given state (time average)
= Probability that the system is in the given state at any random time, once the system has reached steady state.
Counter-example (textbook [Kao], Example 2.7.1):
[Figure: timeline from 0 to 5 with deterministic arrivals (inter-arrival time = 1), each requiring service time 1/2.]

Poisson Arrivals See Time Averages (PASTA)
For the counter-example above:
Arrival average that an arrival will see an idle system = ?
Time average of the system being idle = ?
Mathematically, let X = {x̃(t), t ≥ 0} be a stochastic process with state space S, and let B ⊆ S. Define an indicator random variable
$$\tilde{u}(t) = \begin{cases} 1, & \text{if } \tilde{x}(t) \in B \\ 0, & \text{otherwise.}\end{cases}$$
Let N = {ñ(t), t ≥ 0} be a Poisson process with rate λ denoting the arrival process.

Poisson Arrivals See Time Averages (PASTA)
Then, as t → ∞, the arrival average of ũ equals the time average of ũ.
Condition. For PASTA to hold, we need the lack of anticipation assumption (LAA): for each t ≥ 0, the arrival process {ñ(t + u) − ñ(t), u ≥ 0} is independent of {x̃(s), 0 ≤ s ≤ t} and of {ñ(s), 0 ≤ s ≤ t}.
Application: to find the waiting time distribution of any arriving customer.
Given: P[system is idle] = 1 − ρ; P[system is busy] = ρ.

Poisson Arrivals See Time Averages (PASTA)
Case 1: the system is idle upon (Poisson) arrival.
Case 2: the system is busy upon (Poisson) arrival.
$$P(\tilde{w} \le t) = P(\tilde{w} \le t \mid \text{idle})\,P(\text{idle upon arrival}) + P(\tilde{w} \le t \mid \text{busy})\,P(\text{busy upon arrival}).$$
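
PASTA can be illustrated on an M/M/1 queue, where the time-average probability that the server is busy is ρ = λ/µ. The sketch below (NumPy assumed; it uses a Lindley-recursion simulator introduced here, not taken from the slides) compares the fraction of Poisson arrivals that find the server busy with the time-averaged busy fraction.

```python
import numpy as np

def mm1_pasta_check(lam, mu, n_arrivals, rng):
    """Simulate an M/M/1 queue and compare the fraction of arrivals that
    find the server busy with the time-average utilization."""
    inter = rng.exponential(1.0 / lam, n_arrivals)   # Poisson arrival stream
    serv = rng.exponential(1.0 / mu, n_arrivals)     # exponential service times
    arrivals = np.cumsum(inter)
    wait = np.zeros(n_arrivals)                      # Lindley recursion for queueing delay
    for k in range(1, n_arrivals):
        wait[k] = max(0.0, wait[k - 1] + serv[k - 1] - inter[k])
    sees_busy = (wait > 0).mean()                    # arrival average
    # Approximate time average: the work brought by the first n-1 customers
    # is (almost entirely) served by the time of the last arrival.
    time_avg_busy = serv[:-1].sum() / arrivals[-1]
    return sees_busy, time_avg_busy

if __name__ == "__main__":
    rng = np.random.default_rng(11)
    lam, mu = 0.7, 1.0
    print(mm1_pasta_check(lam, mu, 200_000, rng), " rho =", lam / mu)
```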

Memoryless Property of the Exponential Distribution
A random variable x̃ is said to be without memory, or memoryless, if
$$P[\tilde{x} > s + t \mid \tilde{x} > t] = P[\tilde{x} > s] \quad\text{for all } s, t \ge 0. \qquad (3)$$
The condition in Equation (3) is equivalent to
$$\frac{P[\tilde{x} > s + t,\ \tilde{x} > t]}{P[\tilde{x} > t]} = P[\tilde{x} > s], \quad\text{or}\quad P[\tilde{x} > s + t] = P[\tilde{x} > s]\,P[\tilde{x} > t]. \qquad (4)$$
Since Equation (4) is satisfied when x̃ is exponentially distributed (for $e^{-\lambda(s+t)} = e^{-\lambda s}e^{-\lambda t}$), it follows that exponential random variables are memoryless. Not only is the exponential distribution memoryless, but it is the unique continuous distribution possessing this property.
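
A short numerical check of the memoryless property (NumPy assumed): estimate P[x̃ > s + t | x̃ > t] from samples and compare it with P[x̃ > s] = e^{-λs}.

```python
import numpy as np

if __name__ == "__main__":
    rng = np.random.default_rng(12)
    lam, s, t, runs = 1.5, 0.7, 1.2, 2_000_000
    x = rng.exponential(1.0 / lam, runs)
    cond = (x > s + t).sum() / (x > t).sum()   # empirical P[x > s + t | x > t]
    print("conditional tail:", cond, "  P[x > s]:", np.exp(-lam * s))
```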

Comparison of Two Exponential Random Variables
Suppose that x̃_1 and x̃_2 are independent exponential random variables with respective means 1/λ_1 and 1/λ_2. What is P[x̃_1 < x̃_2]?
$$\begin{aligned}
P[\tilde{x}_1 < \tilde{x}_2] &= \int_0^{\infty} P[\tilde{x}_1 < \tilde{x}_2 \mid \tilde{x}_1 = x]\,\lambda_1 e^{-\lambda_1 x}\,dx \\
&= \int_0^{\infty} P[x < \tilde{x}_2]\,\lambda_1 e^{-\lambda_1 x}\,dx \\
&= \int_0^{\infty} e^{-\lambda_2 x}\,\lambda_1 e^{-\lambda_1 x}\,dx \\
&= \int_0^{\infty} \lambda_1 e^{-(\lambda_1 + \lambda_2)x}\,dx \\
&= \frac{\lambda_1}{\lambda_1 + \lambda_2}.
\end{aligned}$$
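
The closed form λ_1/(λ_1 + λ_2) is easy to confirm by sampling (NumPy assumed):

```python
import numpy as np

if __name__ == "__main__":
    rng = np.random.default_rng(13)
    lam1, lam2, runs = 2.0, 3.0, 1_000_000
    x1 = rng.exponential(1.0 / lam1, runs)
    x2 = rng.exponential(1.0 / lam2, runs)
    print("simulated P[x1 < x2]:", (x1 < x2).mean())
    print("closed form         :", lam1 / (lam1 + lam2))
```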

Minimum of Exponential Random Variables
Suppose that x̃_1, x̃_2, ..., x̃_n are independent exponential random variables, with x̃_i having rate µ_i, i = 1, ..., n. It turns out that the smallest of the x̃_i is exponential with a rate equal to the sum of the µ_i:
$$\begin{aligned}
P[\min(\tilde{x}_1, \tilde{x}_2, \ldots, \tilde{x}_n) > x] &= P[\tilde{x}_i > x \text{ for each } i = 1, \ldots, n] \\
&= \prod_{i=1}^{n} P[\tilde{x}_i > x] &&\text{(by independence)} \\
&= \prod_{i=1}^{n} e^{-\mu_i x} = \exp\left\{-\left(\sum_{i=1}^{n}\mu_i\right)x\right\}.
\end{aligned}$$
How about max(x̃_1, x̃_2, ..., x̃_n)? (exercise)
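
A matching check for the minimum (NumPy assumed): the empirical tail P[min > x] should agree with exp(−(Σ_i µ_i)x).

```python
import numpy as np

if __name__ == "__main__":
    rng = np.random.default_rng(14)
    mus, runs, x = np.array([1.0, 2.0, 0.5]), 1_000_000, 0.4
    samples = rng.exponential(1.0 / mus, size=(runs, len(mus)))  # column i ~ exp(rate mus[i])
    print("simulated P[min > x]:", (samples.min(axis=1) > x).mean())
    print("exp(-sum(mu) * x)   :", np.exp(-mus.sum() * x))
```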