Poisson processes and their properties


Poisson processes. A collection {N(t) : t ∈ [0, ∞)} of random variables indexed by time t is called a continuous-time stochastic process. Furthermore, we call N(t) a Poisson process if

(a) starting with N(0) = 0, the process N(t) takes a non-negative integer value 0, 1, 2, ... for all t > 0;
(b) the increment N(t + s) − N(t) is surely nonnegative for any s > 0;
(c) the increments N(t_1), N(t_2) − N(t_1), ..., N(t_n) − N(t_{n−1}) are independent for any 0 < t_1 < t_2 < ... < t_{n−1} < t_n;
(d) the increment N(t + s) − N(t) has a distribution which depends on the value s > 0 but is independent of t > 0.

Counting processes. A stochastic process satisfying (a) and (b) is called a counting process, in which N(t) represents the total number of events counted up to time t. Properties (c) and (d) are respectively called independent and stationary increments. In particular, by applying (a) and (d) together, we obtain

    P(N(t + s) − N(t) = k) = P(N(s) = k)    (2.1)

for all k = 0, 1, 2, ....

Arrival times and sojourn times. Events counted by a Poisson process N(t) are called Poisson events. Let T_n denote the time when the n-th Poisson event occurs. Here we call T_n an arrival time (also referred to as an occurrence time), and define the sojourn time W_n (or interarrival time) by W_n := T_n − T_{n−1} for n = 1, 2, ..., where T_0 = 0 for convenience.

Properties of sojourn times. We can observe that the event {W_n > s} for a sojourn time is equivalently expressed as the event {N(T_{n−1} + s) − N(T_{n−1}) = 0} in terms of the counting process, and that its probability equals P(N(s) = 0). This justifies the following properties of the sojourn times: (a) the sojourn times W_1, W_2, ... are independent; (b) the sojourn time W_n has a distribution which is independent of n.

Survival function. Consider the probability K(s) := P(W_1 > s). The function K(s) is known as a survival function. Then we obtain

    K(t + s) = P(W_1 > t + s)
             = P(N(t) = 0, N(t + s) − N(t) = 0)
             = P(N(t) = 0) P(N(t + s) − N(t) = 0)
             = P(N(t) = 0) P(N(s) = 0)
             = P(W_1 > t) P(W_1 > s) = K(t) K(s).

Special lecture, June 2016
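As a quick sanity check on the functional equation K(t + s) = K(t)K(s), one can compare its two sides against the empirical survival function of simulated exponential sojourn times. This is a sketch added to the notes, not part of the original derivation; the rate λ = 1.3 and the evaluation points are arbitrary choices.

```python
import random

# Empirical check of K(t + s) = K(t) K(s) for exponential sojourn times.
# The rate lam = 1.3 and the evaluation points t, s are arbitrary choices.
rng = random.Random(0)
lam, trials = 1.3, 200_000
samples = [rng.expovariate(lam) for _ in range(trials)]  # draws of W_1

def K(x):
    """Empirical survival function P(W_1 > x)."""
    return sum(1 for w in samples if w > x) / trials

t, s = 0.7, 0.5
lhs, rhs = K(t + s), K(t) * K(s)
print(lhs, rhs)  # the two sides agree up to Monte Carlo error
```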

Then the sojourn time W_1 must have an exponential distribution with parameter λ; see Problem 1 below.

Problem 1. Let X be a non-negative random variable satisfying, for s, t ≥ 0,

    P(X > t + s | X > s) = P(X > t).    (2.2)

The above property is generally referred to as the memoryless property. By completing the following questions, we will show that X must be an exponential random variable.
(a) Let K(t) = P(X > t) be the survival function of X. Show that the memoryless property implies that K(s + t) = K(s)K(t) for s, t ≥ 0.
(b) Let κ = K(1). Argue that κ > 0.
(c) Show that K(1/n) = κ^{1/n} and K(m/n) = κ^{m/n}.
Therefore, we can obtain K(t) = κ^t for any t ≥ 0. By letting λ = −ln κ, we can write K(t) = e^{−λt}, and therefore we can find the pdf f(t) = λe^{−λt}, t ≥ 0, for X.

Problem 2. Customers arrive at a service facility according to a Poisson process of rate λ. Let N(t) be the number of customers that have arrived up to time t, and let T_1, T_2, ... be the successive arrival times of the customers.
(a) Find E(T_4 − t | N(t) = 3).
(b) Find E(T_4 | N(t) = 3).
(c) Find E(T_5 | N(t) = 3).

Poisson distribution. Since W_1, W_2, ... are independent and exponentially distributed with the common parameter λ, the arrival time T_n = Σ_{k=1}^{n} W_k has the gamma distribution with parameter (n, λ). As we have already seen in the previous lecture note, the discrete random variable N(t) with a fixed time t has the Poisson distribution with parameter λt:

    P(N(t) = k) = e^{−λt} (λt)^k / k!    for k = 0, 1, 2, ....

Problem 3. Let N(t) be a Poisson process with rate λ, and let 0 < s < t.
(a) Find P(N(t) − N(s) ≥ 1).
(b) Find P(N(s) = 0, N(t) ≥ 1).
(c) Find P(N(s) = 1 | N(t) = 1).
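The connection between exponential sojourn times and the Poisson distribution of N(t) can be illustrated numerically: build N(t) by accumulating exponential interarrival times and compare the empirical frequency of {N(t) = k} with the Poisson pmf. A minimal sketch (the rate, time, and trial count are my choices):

```python
import math
import random

def sample_N(lam, t, rng):
    """Sample N(t) by accumulating exponential sojourn times W_1, W_2, ...
    until their running sum exceeds t."""
    n, s = 0, rng.expovariate(lam)
    while s <= t:
        n += 1
        s += rng.expovariate(lam)
    return n

def poisson_pmf(k, mu):
    """P(N(t) = k) = e^{-mu} mu^k / k! with mu = lam * t."""
    return math.exp(-mu) * mu ** k / math.factorial(k)

rng = random.Random(1)
lam, t, trials = 1.5, 2.0, 20_000
freq = sum(sample_N(lam, t, rng) == 3 for _ in range(trials)) / trials
print(freq, poisson_pmf(3, lam * t))  # empirical vs exact P(N(2) = 3)
```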

Distribution of arrival times. Consider the small probability that a Poisson event occurs in the time interval [t, t + dt). By noticing that e^{−λ dt} ≈ 1, we can find this probability as P(N(dt) = 1) ≈ λ dt, and we call λ an arrival rate. Let f_1(s) be the conditional density of the arrival time T_1 given N(t) = 1. Then we can compute the infinitesimal probability P(s < T_1 ≤ s + ds | N(t) = 1) = f_1(s) ds by

    P(s < T_1 ≤ s + ds | N(t) = 1)
      = P(N(s) = 0, N(s + ds) − N(s) = 1, N(t) − N(s + ds) = 0 | N(t) = 1)
      = P(N(s) = 0) P(N(ds) = 1) P(N(t − s − ds) = 0) / P(N(t) = 1)
      = e^{−λt} λ ds / (e^{−λt} λt) = (1/t) ds.    (2.3)

Thus, f_1(s) is the uniform pdf on (0, t).

General cases. Suppose that T_1, T_2, ..., T_n are the arrival times up to the n-th occurrence in a Poisson process. Then we can generalize the notion of infinitesimal probability to a joint distribution of the arrival times T_1, T_2, ..., T_n, and express an infinitesimal probability by

    P(s_1 < T_1 ≤ s_1 + ds_1, ..., s_n < T_n ≤ s_n + ds_n | N(t) = n) = f(s_1, s_2, ..., s_n) ds_1 ds_2 ... ds_n,    (2.4)

where f(s_1, s_2, ..., s_n) is the joint density function.

Problem 4. Show that the joint density function of (2.4) is given by f(s_1, s_2, ..., s_n) = n!/t^n for 0 < s_1 < s_2 < ... < s_n < t.

Joint distribution of arrival times. Let U_1, U_2, ..., U_n be independent and identically distributed (iid) uniform random variables on (0, t), and let U_(1) < U_(2) < ... < U_(n) be the order statistics of the U_i's. Then the joint infinitesimal probability is given by

    P(s_1 < U_(1) ≤ s_1 + ds_1, s_2 < U_(2) ≤ s_2 + ds_2, ..., s_n < U_(n) ≤ s_n + ds_n) = (n!/t^n) ds_1 ds_2 ... ds_n.

Thus, conditioned upon N(t) = n, the joint distribution of the arrival times T_1, T_2, ..., T_n is identical to that of the order statistics U_(1), U_(2), ..., U_(n) of iid uniform random variables on (0, t).

Problem 5. Let T_1, T_2, ..., T_n be the arrival times in a Poisson process N(t) as in Problem 4.
(a) Find E(T_i | N(1) = n) for i = 1, ..., n.
(b) Find E(T_1 + T_2 + ... + T_n | N(1) = n).
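The order-statistics characterization is easy to probe by simulation: rejection-sample arrival times conditioned on N(t) = n and compare the mean of T_1 with the mean of the smallest of n uniforms. This is a sketch added here; all parameter values are arbitrary choices.

```python
import random

def arrivals_given_n(n, t, lam, rng):
    """Rejection-sample the arrival times of a rate-lam Poisson process
    conditioned on the event N(t) = n."""
    while True:
        times, s = [], rng.expovariate(lam)
        while s <= t:
            times.append(s)
            s += rng.expovariate(lam)
        if len(times) == n:
            return times

rng = random.Random(2)
n, t, lam, trials = 3, 1.0, 2.0, 4000
mean_T1_poisson = sum(arrivals_given_n(n, t, lam, rng)[0]
                      for _ in range(trials)) / trials
mean_U1 = sum(min(rng.uniform(0.0, t) for _ in range(n))
              for _ in range(trials)) / trials
print(mean_T1_poisson, mean_U1)  # the two means agree up to Monte Carlo error
```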

Problem 6. Let N(t) be a Poisson process with rate λ, representing the number of customers entering a store. Each customer spends a random duration of time in the store. Let X(t) denote the number of customers remaining in the store at time t. Assuming that the customers' durations are independent random variables identically distributed with the pdf g(v), determine the distribution of X(t).

Problem 7. Alpha particles are emitted according to a Poisson process with rate λ. Each particle exists for a random duration and is then annihilated. Suppose that the lifetimes of the particles are independent random variables, identically and exponentially distributed with parameter β. Find the expected number of particles existing at time t.

Superposition of Poisson processes. The moment generating function (mgf) M_X(t) for a Poisson random variable X with parameter λ can be calculated as M_X(t) = e^{λ(e^t − 1)}. Suppose that X and Y are independent Poisson random variables with respective parameters λ and µ. Then the mgf for X + Y is given by

    M_{X+Y}(t) = M_X(t) M_Y(t) = e^{(λ+µ)(e^t − 1)},

which implies that X + Y is also a Poisson random variable with parameter λ + µ. Now consider two independent Poisson processes N(t) and M(t) having the respective arrival rates λ and µ. Then the superposition L(t) = N(t) + M(t) becomes a Poisson process with rate λ + µ.

Random sums. Let X_1, X_2, ... be iid random variables with mean µ = E[X_i] and variance σ^2 = Var(X_i) for all i. Suppose that a discrete random variable N takes a nonnegative integer value 0, 1, 2, ..., and is independent of the random variables X_1, X_2, .... Then we define a random sum by

    Z = Σ_{i=1}^{N} X_i.

For the random variable Z observe that E(Z | N) = µN and Var(Z | N) = E((Z − µN)^2 | N) = σ^2 N. We can compute

    E[Z] = E[E(Z | N)] = µ E[N],
    Var(Z) = Var(E(Z | N)) + E[Var(Z | N)] = µ^2 Var(N) + σ^2 E[N].

Compound Poisson processes. Now suppose that N(t) is a Poisson process with arrival rate λ, independent of X_1, X_2, .... Then we can introduce a compound Poisson process by

    Z(t) = Σ_{i=1}^{N(t)} X_i.
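A direct simulation of the random sum Z(t) just defined can be used to check the formulas E[Z] = µE[N] and Var(Z) = µ^2 Var(N) + σ^2 E[N]. This is a sketch; the uniform jump distribution and all parameter values are my choices.

```python
import random

def compound_poisson(lam, t, sample_x, rng):
    """One draw of Z(t) = X_1 + ... + X_{N(t)} for a rate-lam Poisson N(t)."""
    n, s = 0, rng.expovariate(lam)
    while s <= t:
        n += 1
        s += rng.expovariate(lam)
    return sum(sample_x(rng) for _ in range(n))

# Jumps X_i ~ Uniform(0, 2), so mu = 1 and sigma^2 = 1/3.
rng = random.Random(3)
lam, t, trials = 2.0, 3.0, 40_000
draws = [compound_poisson(lam, t, lambda r: r.uniform(0.0, 2.0), rng)
         for _ in range(trials)]
mean = sum(draws) / trials
var = sum((z - mean) ** 2 for z in draws) / trials
print(mean, var)  # compare with mu*lam*t = 6 and (mu^2 + sigma^2)*lam*t = 8
```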
The compound Poisson process is essentially the random sum above. Since E[N(t)] = Var(N(t)) = λt, we obtain E[Z(t)] = µλt and Var(Z(t)) = (µ^2 + σ^2)λt.

Non-homogeneous Poisson processes. For an interval A on [0, ∞) we can introduce the total number N(A) of Poisson events occurring on the time interval A, and calculate its expected

number by E[N(A)] = ∫_A λ dt. Suppose that the stationary increment property (d) does not hold [but (a)–(c) do], and that the expectation is instead determined by

    E[N(A)] = Λ(A) := ∫_A τ(t) dt

with a nonnegative intensity function τ(t). Then the random variable N(A) has the Poisson distribution with parameter Λ(A), and the process N(t) is called a non-homogeneous Poisson process.

Spatial Poisson processes. Consider a multi-dimensional space S. We can introduce a Poisson point process on S by extending the concept of a non-homogeneous Poisson process. For a subset A of S, let N(A) denote the total number of Poisson events each of which occurs at a point in the subset A. Suppose that (i) the random variables N(A_1), N(A_2), ..., N(A_n) are independent for every sequence A_1, A_2, ..., A_n of mutually disjoint subsets, and (ii) the expected number E[N(A)] is given by

    E[N(A)] = Λ(A) := ∫_A τ(x) dx

with a nonnegative intensity function τ(x) on S. Then the random variable N(A) has the Poisson distribution with parameter Λ(A). In particular, when τ(x) ≡ λ, we call the process homogeneous.

Problem 8. The number of bacteria distributed throughout a volume of liquid can be considered as a spatial Poisson process. Suppose that the intensity function is τ(x) = 0.6 (organisms per mm^3). Find the probability that more than two bacteria are detected in a 10 mm^3 volume.

Marked point processes. Consider a Poisson process N(t) with arrival rate λ, and iid random variables X_1, X_2, ... having a common pdf g(x) on a space S. Let T_1, T_2, ... denote the arrival times of the Poisson process. Then we can define a marked Poisson point process by introducing Poisson events at the points (T_1, X_1), (T_2, X_2), ... in the space [0, ∞) × S. For any subset A of [0, ∞) × S, the number N(A) of Poisson events in A becomes a non-homogeneous Poisson point process with

    E[N(A)] = Λ(A) := ∬_A λ g(x) dx dt.

Problem 9. Customers arrive according to a Poisson process with rate λ = 8. Suppose that each customer is independently classified as high priority with probability 0.2, or low priority with probability 0.8.
Then find the probability that three high priority and five low priority customers arrive by time t = 1.
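A non-homogeneous Poisson process with bounded intensity τ(t) ≤ τ_max can be simulated by thinning: generate a homogeneous process of rate τ_max and keep each candidate arrival at time t with probability τ(t)/τ_max. The sketch below is an addition to the notes; the linear intensity and all numbers are arbitrary examples.

```python
import random

def thinning(tau, tau_max, horizon, rng):
    """Arrival times on [0, horizon] of a non-homogeneous Poisson process
    with intensity tau(t) <= tau_max, obtained by thinning a homogeneous
    process of rate tau_max."""
    arrivals, t = [], 0.0
    while True:
        t += rng.expovariate(tau_max)        # candidate arrival, rate tau_max
        if t > horizon:
            return arrivals
        if rng.random() < tau(t) / tau_max:  # keep with prob tau(t)/tau_max
            arrivals.append(t)

rng = random.Random(4)
tau = lambda t: 1.0 + 0.5 * t  # example intensity; Lambda([0, 4]) = 8
pts = thinning(tau, 3.0, 4.0, rng)
print(len(pts))  # one draw of N([0, 4]), a Poisson variable with mean 8
```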

Problem solutions

Problem 1. (a) Observe that

    P(X > t + s | X > s) = P(X > t + s) / P(X > s) = K(t + s) / K(s).

By applying this to (2.2) we obtain K(s + t) = K(s)K(t).
(b) The existence of the conditional probability in (2.2) requires that P(X > s) > 0; thus, κ = P(X > 1) > 0.
(c) Since K(1) = [K(1/n)]^n and K(m/n) = [K(1/n)]^m, we have K(1/n) = κ^{1/n} and K(m/n) = κ^{m/n}.

Problem 2. Recall that the waiting time V_t = T_{N(t)+1} − t has an exponential distribution with parameter λ, and that V_t is independent of N(t).
(a) E(T_4 − t | N(t) = 3) = E(T_{N(t)+1} − t | N(t) = 3) = E(V_t | N(t) = 3) = 1/λ.
(b) E(T_4 | N(t) = 3) = E(T_{N(t)+1} | N(t) = 3) = E(t + V_t | N(t) = 3) = t + 1/λ.
(c) E(T_5 | N(t) = 3) = E(T_4 + W_5 | N(t) = 3) = E(T_4 | N(t) = 3) + E(W_5 | N(t) = 3) = t + 2/λ, where W_5 is a sojourn time and is independent of the event that N(t) = 3.

Problem 3. (a) P(N(t) − N(s) ≥ 1) = P(N(t − s) ≥ 1) = 1 − P(N(t − s) = 0) = 1 − e^{−λ(t−s)}.
(b) P(N(s) = 0, N(t) ≥ 1) = P(N(s) = 0, N(t) − N(s) ≥ 1) = P(N(s) = 0) P(N(t) − N(s) ≥ 1) = e^{−λs} (1 − e^{−λ(t−s)}).
(c) P(N(s) = 1 | N(t) = 1) = P(N(s) = 1, N(t) − N(s) = 0) / P(N(t) = 1) = [e^{−λs}(λs)] [e^{−λ(t−s)}] / [e^{−λt}(λt)] = s/t.

Problem 4. We can demonstrate the extension of (2.3) in the case of n = 2:

    P(s_1 < T_1 ≤ s_1 + ds_1, s_2 < T_2 ≤ s_2 + ds_2 | N(t) = 2)
      = P(N(s_1) = 0, N(s_1 + ds_1) − N(s_1) = 1, N(s_2) − N(s_1 + ds_1) = 0,
          N(s_2 + ds_2) − N(s_2) = 1, N(t) − N(s_2 + ds_2) = 0 | N(t) = 2)
      = P(N(s_1) = 0) P(N(ds_1) = 1) P(N(s_2 − s_1 − ds_1) = 0) P(N(ds_2) = 1) P(N(t − s_2 − ds_2) = 0) / P(N(t) = 2)
      = e^{−λt} (λ ds_1)(λ ds_2) / [e^{−λt} (λt)^2 / 2!] = (2!/t^2) ds_1 ds_2.

The further extension for n ≥ 3 is analogous.

Problem 5. Let U_1, U_2, ..., U_n be iid uniform random variables on (0, 1), and let U_(1) < U_(2) < ... < U_(n) be the order statistics of the U_i's.
(a) E(T_i | N(1) = n) = E[U_(i)] = i/(n + 1).

(b) E(T_1 + T_2 + ... + T_n | N(1) = n) = Σ_{i=1}^{n} i/(n + 1) = n/2.

Problem 6. Provided that N(t) = n, we call the first n customers 1, 2, ..., n. Let U_i be the arrival time of customer i, and let V_i be the duration for which the same customer remains in the store [which is distributed with the pdf g(v)]. The pair (U_i, V_i) of random variables has the joint density function

    f(u, v) = (1/t) g(v),    0 ≤ u ≤ t, v ≥ 0.

We can calculate

    p = P(U_i + V_i > t) = ∫_0^t du ∫_{t−u}^∞ f(u, v) dv = (1/t) ∫_0^t du ∫_{t−u}^∞ g(v) dv = (1/t) ∫_0^t dz ∫_z^∞ g(v) dv,

substituting z = t − u. Then the number X(t) of customers remaining in the store at time t counts all the events {U_i + V_i > t}, i = 1, ..., n. We obtain

    P(X(t) = k | N(t) = n) = C(n, k) p^k (1 − p)^{n−k},

and therefore,

    P(X(t) = k) = Σ_{n=k}^∞ P(X(t) = k | N(t) = n) P(N(t) = n)
                = Σ_{n=k}^∞ [e^{−λt} (λpt)^k / k!] [(λ(1 − p)t)^{n−k} / (n − k)!]
                = e^{−λpt} (λpt)^k / k!,

which implies that X(t) has a Poisson distribution with parameter

    λpt = λ ∫_0^t dz ∫_z^∞ g(v) dv.

Problem 7. This is an immediate application of Problem 6 with g(v) = βe^{−βv}. Thus, we obtain

    E[X(t)] = λpt = λ ∫_0^t dz ∫_z^∞ βe^{−βv} dv = λ ∫_0^t e^{−βz} dz = (λ/β)(1 − e^{−βt}).

Problem 8. Since Λ(A) = (0.6)(10) = 6, we obtain

    P(N(A) > 2) = 1 − Σ_{k=0}^{2} e^{−Λ(A)} Λ(A)^k / k! ≈ 0.938.

Problem 9. Consider the marked Poisson point process on [0, ∞) × {0, 1} with probability mass function g(0) = 0.8 and g(1) = 0.2. Let A_low = [0, 1] × {0} and A_high = [0, 1] × {1}. Then we can calculate the probability as

    P(N(A_low) = 5, N(A_high) = 3) = P(N(A_low) = 5) P(N(A_high) = 3)
      = [e^{−6.4} (6.4)^5 / 5!] [e^{−1.6} (1.6)^3 / 3!] ≈ 0.0205.
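The numerical answers to Problems 8 and 9 can be double-checked with a few lines of Python using only the Poisson pmf:

```python
import math

def poisson_pmf(k, mu):
    """P(N = k) for N ~ Poisson(mu)."""
    return math.exp(-mu) * mu ** k / math.factorial(k)

# Problem 8: Lambda(A) = 0.6 * 10 = 6, and we want P(N(A) > 2).
p8 = 1.0 - sum(poisson_pmf(k, 6.0) for k in range(3))
# Problem 9: independent counts, low priority mean 6.4, high priority mean 1.6.
p9 = poisson_pmf(5, 6.4) * poisson_pmf(3, 1.6)
print(round(p8, 3), round(p9, 4))  # 0.938 0.0205
```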