ECE 313 Probability with Engineering Applications, Fall 2000
Lecture 24: The Exponential Distribution and the Poisson Process
Dilip V. Sarwate, University of Illinois at Urbana-Champaign. All rights reserved.

Exponential random variables

Exponential random variables arise in studies of waiting times, service times, etc. X is called an exponential random variable with parameter λ if its pdf is given by

  f(u) = λ exp(−λu) for u ≥ 0
  f(u) = 0 for u < 0

The scale parameter is λ > 0, and

  E[X] = 1/λ, var(X) = 1/λ²

CDF of an exponential RV

Let X be an exponential RV with parameter λ. Then

  F(t) = P{X ≤ t} = area under the pdf to the left of t = 1 − exp(−λt)
  P{X > t} = exp(−λt) (the complementary CDF)
  E[X] = ∫₀^∞ P{X > t} dt = 1/λ
  P{X > t+τ | X > t} = P{X > t+τ} / P{X > t} = exp(−λτ)

The last identity is the memoryless property of exponential RVs: no matter how long we have already waited, the remaining wait is distributed like a fresh exponential wait.

Gamma random variables

X is called a gamma random variable with parameters (t, λ) if its pdf is given by

  f(u) = λ exp(−λu) (λu)^(t−1) / Γ(t) for u > 0
  f(u) = 0 for u ≤ 0

where t > 0 is the order parameter, λ > 0 is the scale parameter, and Γ(t) is the value of the gamma function at t.

What's this Γ(t) stuff, anyway?

The value of Γ(t) is given by

  Γ(t) = ∫₀^∞ x^(t−1) exp(−x) dx, t > 0

and satisfies the recursion

  Γ(t) = (t−1) Γ(t−1) = (t−1)(t−2) Γ(t−2) = ⋯

If t is an integer, Γ(t) = (t−1)!
If t is not an integer, Γ(t) = (t−1)(t−2) ⋯ Γ(t−⌊t⌋), where t − ⌊t⌋ is the fractional part of t.
For 0 < t < 1, numerical integration must be used to evaluate Γ(t).

[Figure: gamma pdfs f(u) on 0 ≤ u ≤ 10 for fixed λ = 1 and orders t = 1, 2, 3, 4, 5.]
[Figure: gamma pdfs f(u) on 0 ≤ u ≤ 10 for fixed order t = 3 and λ = 1, 2, 3.]
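A quick numerical check of the memoryless property above; a minimal simulation sketch assuming NumPy is available (the values λ = 0.5, t = 1.0, τ = 2.0 are arbitrary choices for illustration):

    import numpy as np

    rng = np.random.default_rng(0)
    lam = 0.5                                         # rate parameter λ (arbitrary)
    x = rng.exponential(scale=1/lam, size=1_000_000)  # exponential(λ) samples

    t, tau = 1.0, 2.0
    # empirical P{X > t+τ | X > t} versus the theoretical value exp(−λτ)
    cond = (x > t + tau).sum() / (x > t).sum()
    print(cond, np.exp(-lam * tau))                   # the two should nearly agree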

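The gamma-function facts above are equally easy to verify numerically; a sketch assuming SciPy is available (scipy.special.gamma evaluates Γ, and scipy.integrate.quad performs the numerical integration mentioned for 0 < t < 1):

    import math
    from scipy.integrate import quad
    from scipy.special import gamma

    t = 3.7
    print(gamma(t), (t - 1) * gamma(t - 1))   # recursion Γ(t) = (t−1) Γ(t−1)
    print(gamma(5), math.factorial(4))        # integer case: Γ(5) = 4!

    # for 0 < t < 1, evaluate the defining integral numerically (t = 0.3 here);
    # the range is split at x = 1 to keep the endpoint singularity manageable
    f = lambda x: x**(0.3 - 1) * math.exp(-x)
    head, _ = quad(f, 0, 1)
    tail, _ = quad(f, 1, math.inf)
    print(head + tail, gamma(0.3))            # both ≈ Γ(0.3)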
Mean & variance of gamma RVs If X is a gamma RV with parameters (t, λ), then E[X] = t/λ and var(x) = t/λ 2 A gamma RV with order parameter t = 1 is an exponential RV with parameter λ A gamma RV with order parameter t = n is called an n-erlang random variable A gamma RV with t = n/2, λ = 1/2 is a chi-square RV with n degrees of freedom An analogy The exponential RV with parameter λ is analogous to the geometric RV with parameter p both denote the waiting time till something occurs E[X] = 1/λ E[X] = 1/p The gamma RV with parameters (n, λ) is analogous to the negative binomial RV with parameters (n, p) both denote the waiting time for the n-th occurrence E[X] = n/λ E[X] = n/p More on the analogy With geometric and negative binomial RVs, we are, in effect, measuring the waiting time till something occurs in discrete steps With exponential and gamma RVs, we are modeling time as a continuous variable With negative binomial and gamma RV, the waiting time for the n-th occurrence of something is the sum of n (independent) waiting times for one thing to occur ECE 313 - Lecture 24 2000 Dilip V. Sarwate, University of Illinois at Urbana-Champaign, All Rights Reserved Slide 7 of 42 ECE 313 - Lecture 24 2000 Dilip V. Sarwate, University of Illinois at Urbana-Champaign, All Rights Reserved Slide 8 of 42 ECE 313 - Lecture 24 2000 Dilip V. Sarwate, University of Illinois at Urbana-Champaign, All Rights Reserved Slide 9 of 42 What thing occurs? We are going to study occurrences of various interesting phenomena telephone off-hook signals jobs arriving at a processor packets arriving at a router α-particle emissions gate failures in a TMR system cars passing a checkpoint What s common to all these? The occurrences of the phenomenon, e.g. a telephone going off-hook, a packet arriving at a router are not very frequent Actually, this infrequency depends on the time scale being used For a telephone, the chances that it will go off hook in the next millisecond are small P{off-hook at some time during next day} is close to 1 Occurrences at random The time interval between successive occurrences (called the inter-arrival time) varies at random The inter-arrival time is modeled as a random variable What is the probabilistic behavior of this random variable? Answer: geometric for discrete time exponential for continuous time ECE 313 - Lecture 24 2000 Dilip V. Sarwate, University of Illinois at Urbana-Champaign, All Rights Reserved Slide 10 of 42 ECE 313 - Lecture 24 2000 Dilip V. Sarwate, University of Illinois at Urbana-Champaign, All Rights Reserved Slide 11 of 42 ECE 313 - Lecture 24 2000 Dilip V. Sarwate, University of Illinois at Urbana-Champaign, All Rights Reserved Slide 12 of 42 University of Illinois at Urbana-Champaign 25.2

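The discrete-to-continuous side of the analogy can be made concrete the same way: a geometric waiting time counted in steps of size Δ, with success probability p = λΔ per step, behaves like an exponential(λ) waiting time once Δ is small. A sketch, with λ = 1.5 and Δ = 0.01 as arbitrary choices:

    import numpy as np

    rng = np.random.default_rng(2)
    lam, delta = 1.5, 0.01
    p = lam * delta                          # per-step success probability
    steps = rng.geometric(p, size=200_000)   # geometric number of steps
    wait = steps * delta                     # waiting time in continuous units
    print(wait.mean(), 1 / lam)              # ≈ 1/λ, as for the exponential
    print((wait > 2.0).mean(), np.exp(-lam * 2.0))  # tail ≈ exp(−λt)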
Independence of occurrences

The observed value of a random variable depends on the outcome of an experiment. Here, an experiment commences at t = 0 and ends with the first arrival (i.e. occurrence) at some random time (say t = X_1) later. At this time a new experiment is started, and it ends with the second arrival at some time t = X_2, and so on. These experiments are independent.

[Figure: timeline showing arrivals at times X_1, X_2, X_3, X_4; the 1st experiment starts at t = 0, the 2nd at t = X_1, the 3rd at t = X_2, and the gaps between successive arrivals are the inter-arrival times.]

Long-term average = expectation

The inter-arrival time is a random variable, and the successive inter-arrival times are the observed values of this random variable on independent trials. The long-term observed average, (total time till the N-th arrival occurs)/N, is roughly equal to the expectation of the inter-arrival time.

The arrival rate

Suppose the inter-arrival time has expected value (average) 1/µ. For large N, the total of N inter-arrival times is roughly N · (1/µ), so the number of occurrences of the phenomenon of interest (arrivals) in a long time interval of duration T is roughly µT. µ is called the arrival rate: on average, µT arrivals in T seconds.

Random arrivals in time

We are considering some phenomenon that occurs at random times. Sometimes these occurrences are called events, but here "event" does not have its usual meaning in probability theory, viz. a subset of the sample space. To avoid confusion, we shall call these occurrences arrivals or points.

Poisson process with arrival rate µ

The sequence of random arrivals or points (with independent inter-arrival times having average value 1/µ) is called a Poisson process with arrival rate µ. Why "Poisson"? Let N(t_1, t_2] denote the number of arrivals in the time interval (t_1, t_2] (note the endpoints). N(t_1, t_2] is a Poisson random variable with parameter

  µ(t_2 − t_1) = µ × duration of the interval
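A sketch that builds the arrival sequence directly from independent exponential inter-arrival times and checks the arrival-rate claim that roughly µT arrivals occur in T seconds (µ = 3.0 and T = 1000.0 are arbitrary choices):

    import numpy as np

    rng = np.random.default_rng(3)
    mu, T = 3.0, 1000.0
    # generate more than enough inter-arrival times to cover (0, T]
    gaps = rng.exponential(scale=1/mu, size=int(2 * mu * T))
    arrivals = np.cumsum(gaps)            # arrival times X_1, X_2, ...
    n_T = np.searchsorted(arrivals, T)    # number of arrivals in (0, T]
    print(n_T, mu * T)                    # roughly µT arrivals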

Basic assumptions

At any time instant t, at most one arrival can occur. Assumption: for small values of ΔT,

  P{N(t, t+ΔT] = 1} ≈ µ ΔT
  P{N(t, t+ΔT] = 0} ≈ 1 − µ ΔT

The approximation improves as ΔT → 0, and is nonsense if ΔT > 1/µ; "small values of ΔT" means ΔT << 1/µ.

Is this consistent with Poissonity?

We have not yet proved that N(t_1, t_2] is a Poisson RV with parameter µ(t_2 − t_1). But if N(t, t+ΔT] were a Poisson random variable with parameter µΔT << 1, then

  P{N(t, t+ΔT] = 0} = exp(−µΔT) = 1 − µΔT + (µΔT)²/2! − ⋯ ≈ 1 − µΔT for small µΔT
  P{N(t, t+ΔT] = 1} = (µΔT) exp(−µΔT) ≈ µΔT

The independence assumption

Let t_1 < t_2 ≤ t_3 < t_4 ≤ t_5 < t_6 ≤ ⋯ Independence assumption: the random variables N(t_1, t_2], N(t_3, t_4], N(t_5, t_6], … are independent. That is, the numbers of arrivals in disjoint (i.e. non-overlapping) intervals of time are independent. This is the memoryless property of the process.

Distribution of the first arrival time

The first arrival occurs at random time X_1. We set up and solve a differential equation for

  P_0(t) = P{no arrivals in (0, t]} = P{N(0, t] = 0} = P{X_1 > t}

The differential equation for P_0(t)

  P_0(t+ΔT) = P{N(0, t+ΔT] = 0}
            = P[{N(0, t] = 0} ∩ {N(t, t+ΔT] = 0}]
            = P{N(0, t] = 0} · P{N(t, t+ΔT] = 0}   (independence over disjoint intervals)
            ≈ P_0(t) · (1 − µΔT)

since P{N(0, t] = 0} = P_0(t) by definition and P{N(t, t+ΔT] = 0} ≈ 1 − µΔT. Hence

  [P_0(t+ΔT) − P_0(t)] / ΔT ≈ −µ P_0(t)

Solving the diff. eq. for P_0(t)

Letting ΔT → 0 gives

  dP_0(t)/dt = −µ P_0(t)

This is a first-order linear differential equation with constant coefficients. Initial condition: P_0(0) = P{no arrivals in (0, 0]} = P{no arrivals in ∅} = 1 (not zero!). The solution is

  P_0(t) = exp(−µt) for t ≥ 0
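As a sanity check on the closed-form solution, the differential equation can be handed to a numerical ODE solver; a sketch assuming SciPy's solve_ivp, with µ = 2.0 as an arbitrary choice:

    import numpy as np
    from scipy.integrate import solve_ivp

    mu = 2.0
    # dP0/dt = −µ P0 with initial condition P0(0) = 1
    sol = solve_ivp(lambda t, p: -mu * p, (0.0, 3.0), [1.0],
                    t_eval=[0.5, 1.0, 2.0])
    print(sol.y[0])              # numerically integrated P0(t)
    print(np.exp(-mu * sol.t))   # closed form exp(−µt) at the same times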

Thoughts about P_0(t)

  P_0(t) = P{no arrivals in (0, t]} = P{N(0, t] = 0} = P{X_1 > t} = exp(−µt)

P_0(t) decays away exponentially as t → ∞. Also,

  P_0(Δt) = P{no arrivals in (0, Δt]} = exp(−µΔt) = 1 − µΔt + (µΔt)²/2! − ⋯ ≈ 1 − µΔt

for small values of Δt, consistent with the basic assumptions.

Distribution of the first arrival time

From P{X_1 > t} = exp(−µt) for t ≥ 0:

  P{X_1 ≤ t} = F_{X_1}(t) = 1 − exp(−µt) for t ≥ 0
  f_{X_1}(t) = derivative of the CDF F_{X_1}(t) = µ exp(−µt) for t ≥ 0

The first arrival time X_1 is an exponential random variable with parameter µ, and E[X_1] = 1/µ.

Distribution of the inter-arrival time

Suppose the first arrival occurs at time X_1 = τ. At t = τ, the experiment begins again. Let X_2 be the second arrival time; then X_2 − X_1 is the inter-arrival time. Given X_1 = τ, we set up a similar differential equation for P{no arrivals in (τ, τ+t]} and conclude that the inter-arrival time X_2 − X_1 is an exponential RV with parameter µ.

[Figure: timeline showing arrivals at times X_1, X_2, X_3, X_4; the 1st experiment starts at t = 0, the 2nd at t = X_1, the 3rd at t = X_2, and the gaps between successive arrivals are the inter-arrival times.]

Distribution of inter-arrival times

The experiment begins again after each arrival, and successive experiments are repeated independent trials. X_k − X_{k−1} is the k-th inter-arrival time. Proceeding as before, all the inter-arrival times are independent exponential RVs with parameter µ.

Is X_1 also an inter-arrival time? Yes, if there was an arrival at t = 0.

Distribution of waiting times

X_1 is an inter-arrival time if there was an arrival at t = 0. Otherwise, X_1 is the waiting time for the next arrival after t = 0; its pdf is the same as the pdf of the inter-arrival times. More generally, the waiting time for the next arrival after t = τ also has the same pdf, and it is not necessary to have an arrival at τ for this result to hold (checked by simulation in the sketch below).
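A simulation sketch of the waiting-time claim above: the wait from an arbitrary inspection time τ (with no arrival required at τ) to the next arrival is again exponential with parameter µ. Here µ = 1.0 and τ = 10.0 are arbitrary choices:

    import numpy as np

    rng = np.random.default_rng(4)
    mu, tau, runs = 1.0, 10.0, 50_000
    # enough inter-arrival times per run to be essentially sure of passing τ
    gaps = rng.exponential(scale=1/mu, size=(runs, 40))
    arrivals = np.cumsum(gaps, axis=1)
    idx = (arrivals <= tau).sum(axis=1)            # arrivals up to time τ
    wait = arrivals[np.arange(runs), idx] - tau    # wait from τ to next arrival
    print(wait.mean(), 1 / mu)                     # ≈ 1/µ
    print((wait > 1.0).mean(), np.exp(-mu * 1.0))  # tail ≈ exp(−µt)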

[Figure: timeline showing arrivals at X_1, X_2, X_3; X_1 is an inter-arrival time if there was an arrival at t = 0, and a waiting time otherwise.]

Distribution of the k-th arrival time

The k-th arrival occurs at random time X_k. We set up and solve a differential equation for

  P_k(t) = P{exactly k arrivals in (0, t]} = P{N(0, t] = k}

Exactly k arrivals in (0, t+ΔT] can happen in two ways: k arrivals in (0, t] and none in (t, t+ΔT], or k−1 arrivals in (0, t] and one in (t, t+ΔT]. Thus

  P_k(t+ΔT) = P[{N(0, t] = k} ∩ {N(t, t+ΔT] = 0}] + P[{N(0, t] = k−1} ∩ {N(t, t+ΔT] = 1}]

The differential equation for P_k(t)

By independence over disjoint intervals,

  P_k(t+ΔT) ≈ P_k(t)(1 − µΔT) + P_{k−1}(t)(µΔT)
  [P_k(t+ΔT) − P_k(t)] / ΔT ≈ −µ P_k(t) + µ P_{k−1}(t)
  dP_k(t)/dt = −µ P_k(t) + µ P_{k−1}(t)

Solving the diff. eq. for P_k(t)

We can set k = 1 and solve for P_1(t), since we know that P_0(t) = exp(−µt); next, set k = 2 and solve for P_2(t) since we now know P_1(t), and so on. Alternatively, we can use Laplace transforms.

Laplace transforms for P_k(t)

Initial condition: P_k(0) = P{k arrivals in (0, 0]} = P{k arrivals in ∅} = 0 for k ≥ 1 (not one!). Transforming dP_k(t)/dt = −µ P_k(t) + µ P_{k−1}(t) gives

  s L[P_k(t)] = −µ L[P_k(t)] + µ L[P_{k−1}(t)]

so that

  L[P_k(t)] = (µ/(s+µ)) L[P_{k−1}(t)] = (µ/(s+µ))² L[P_{k−2}(t)] = ⋯ = (µ/(s+µ))^k L[P_0(t)] = µ^k / (s+µ)^(k+1)

since L[P_0(t)] = L[exp(−µt)] = 1/(s+µ).

Invert the Laplace transform to get P_k(t)

Inverting µ^k / (s+µ)^(k+1) gives

  P_k(t) = (µt)^k exp(−µt) / k! = P{N(0, t] = k}

So for any fixed value of t, P{N(0, t] = k} = (µt)^k exp(−µt)/k!, i.e. N(0, t] is a Poisson RV with parameter µt.
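The chain of differential equations can also be integrated numerically all at once and compared with the Poisson probabilities obtained from the Laplace-transform inversion; a sketch assuming SciPy, with µ = 2.0 and orders up to K = 5 as arbitrary choices:

    import numpy as np
    from scipy.integrate import solve_ivp
    from scipy.stats import poisson

    mu, K = 2.0, 5

    def rhs(t, P):
        # dP_k/dt = −µ P_k + µ P_{k−1}, with no P_{k−1} term for k = 0
        dP = -mu * P
        dP[1:] += mu * P[:-1]
        return dP

    P_init = np.zeros(K + 1)
    P_init[0] = 1.0                # P_0(0) = 1 and P_k(0) = 0 for k ≥ 1
    sol = solve_ivp(rhs, (0.0, 1.5), P_init, t_eval=[1.5], rtol=1e-8)
    print(sol.y[:, 0])                              # numerical P_k(1.5)
    print(poisson.pmf(np.arange(K + 1), mu * 1.5))  # (µt)^k exp(−µt)/k!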

It's a Poisson process!

For any fixed value of t,

  P{N(0, t] = k} = (µt)^k exp(−µt) / k!

so N(0, t] is a Poisson RV with parameter µt. If we start counting arrivals after time t_1, then N(t_1, t_2] is a Poisson RV with parameter µ(t_2 − t_1) = µ × duration of the interval. The RVs N(t_1, t_2], N(t_3, t_4], N(t_5, t_6], … are independent if the intervals (t_1, t_2], (t_3, t_4], (t_5, t_6], … don't overlap.

Distribution of the k-th arrival time

The k-th arrival occurs at random time X_k, and X_k > t exactly when at most k−1 arrivals occur in (0, t]:

  P{X_k > t} = P{N(0, t] ≤ k−1} = exp(−µt) [1 + (µt) + (µt)²/2! + ⋯ + (µt)^(k−1)/(k−1)!]

  f_{X_k}(t) = derivative of the CDF F_{X_k}(t) = −(derivative of P{X_k > t})
             = µ ((µt)^(k−1)/(k−1)!) exp(−µt) for t > 0
             = µ exp(−µt) (µt)^(k−1) / Γ(k)

The k-th arrival time X_k is a gamma random variable with parameters (k, µ).

Waiting time for the next k arrivals

Starting at any time t = τ, the waiting time for the next k arrivals has the same pdf as X_k; it is not necessary to have an arrival at τ for this result to hold. Also,

  E[X_k] = k/µ = k E[X_1]

i.e. the average waiting time for k arrivals is k times the average waiting time for one arrival.

A final observation

The k-th arrival time X_k is a gamma random variable with parameters (k, µ). But

  X_k = (X_k − X_{k−1}) + (X_{k−1} − X_{k−2}) + ⋯ + (X_2 − X_1) + X_1

is the sum of k inter-arrival times, i.e. of k independent (1, µ) gamma RVs. This is a special case of a general result: the sum of independent gamma RVs with the same scale parameter is a gamma RV whose order is the sum of the orders and whose scale is the same.

Summary

We have discussed the Poisson process in some detail. Basic assumptions:
- The arrival rate is µ.
- P{one arrival in an interval of length ΔT} ≈ µΔT.
- P{no arrival in an interval of length ΔT} ≈ 1 − µΔT.
- Arrivals in disjoint time intervals are independent (the process is memoryless).

Consequences of the assumptions

- Inter-arrival times are exponential random variables with parameter µ; the average inter-arrival time is 1/µ.
- The time of the k-th arrival (equivalently, the waiting time for the next k arrivals) is a gamma RV with parameters (k, µ); the average waiting time for the k-th arrival is k/µ.
- The number of arrivals in a time interval of length t is a Poisson random variable with parameter µt.
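As a closing check, all three consequences can be observed in a single simulation built solely from independent exponential inter-arrival times, with nothing Poisson assumed up front (µ = 2.0, k = 3, t = 1.0 and the run count are arbitrary choices):

    import numpy as np

    rng = np.random.default_rng(5)
    mu, k, t, runs = 2.0, 3, 1.0, 200_000
    gaps = rng.exponential(scale=1/mu, size=(runs, 20))  # inter-arrival times
    arrivals = np.cumsum(gaps, axis=1)                   # X_1, X_2, ... per run

    # 1. average inter-arrival time is 1/µ
    print(gaps.mean(), 1 / mu)
    # 2. the k-th arrival time is gamma (k, µ): mean k/µ, variance k/µ²
    print(arrivals[:, k - 1].mean(), k / mu)
    print(arrivals[:, k - 1].var(), k / mu**2)
    # 3. N(0, t] is Poisson with parameter µt: mean and variance both ≈ µt
    counts = (arrivals <= t).sum(axis=1)
    print(counts.mean(), counts.var(), mu * t)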