Inria September 22, 2015
Outline
- Poisson processes
- Markov jump processes
- Some queueing networks
The Poisson distribution (Siméon-Denis Poisson, 1781-1840)

Poisson(λ): P(X = n) = e^{-λ} λ^n / n!, n ∈ N. As prevalent as the Gaussian distribution.

Law of rare events (a.k.a. law of small numbers): let p_{n,i} ≥ 0 be such that lim_n sup_i p_{n,i} = 0 and lim_n Σ_i p_{n,i} = λ > 0. Then X_n = Σ_i Z_{n,i}, with Z_{n,i} independent Bernoulli(p_{n,i}), verifies X_n →^D Poisson(λ).
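A quick numerical illustration of the law of small numbers (a sketch, not part of the slides): taking p_{n,i} = λ/n makes X_n a Binomial(n, λ/n) variable, so the convergence can be measured exactly via the total-variation distance to Poisson(λ).

```python
import math

def binom_pmf(n, p, k):
    """P(Binomial(n, p) = k)."""
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

def poisson_pmf(mean, k):
    """P(Poisson(mean) = k)."""
    return math.exp(-mean) * mean**k / math.factorial(k)

def tv_distance(n, lam, kmax=100):
    """Total-variation distance between Binomial(n, lam/n) and Poisson(lam),
    truncated at kmax (the neglected tail mass is negligible here)."""
    p = lam / n
    return 0.5 * sum(abs(binom_pmf(n, p, k) - poisson_pmf(lam, k))
                     for k in range(min(n, kmax) + 1))

lam = 2.0
d10, d1000 = tv_distance(10, lam), tv_distance(1000, lam)
# d1000 is much smaller: the sum of many small Bernoullis is nearly Poisson
```

By Le Cam's inequality the distance is at most n(λ/n)² = λ²/n, consistent with what the computation shows.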
Point process on R_+

Definition: a point process on R_+ is a collection of random times {T_n}_{n>0} with 0 < T_1 < T_2 < ...

Alternative description: the counting process {N_t}_{t ∈ R_+} with N_t := Σ_{n>0} I_{T_n ∈ [0,t]}.

Yet another description: the collection {N(C)} over all measurable C ⊂ R_+, where N(C) := Σ_{n>0} I_{T_n ∈ C}.
Poisson process on R_+

Definition: a point process such that for all s_0 = 0 < s_1 < s_2 < ... < s_n,
1. the increments {N_{s_i} - N_{s_{i-1}}}_{1≤i≤n} are independent;
2. the law of N_{t+s} - N_s depends only on t;
3. for some λ > 0, N_t ∼ Poisson(λt).

In fact, (3) follows from (1)-(2). λ is called the intensity of the process.
A first construction

Given λn i.i.d. numbers U_{n,i}, uniform on [0, n], let N_t^{(n)} := Σ_{i=1}^{λn} I_{U_{n,i} ≤ t}. Then for any k ∈ N and s_0 = 0 < s_1 < s_2 < ... < s_k,
{N_{s_i}^{(n)} - N_{s_{i-1}}^{(n)}}_{1≤i≤k} →^D ⊗_{1≤i≤k} Poisson(λ(s_i - s_{i-1})).

Proof: the increments {N_{s_i}^{(n)} - N_{s_{i-1}}^{(n)}}_{1≤i≤k} follow a multinomial distribution; conclude by convergence of the Laplace transform E exp(-Σ_{i=1}^k α_i (N_{s_i}^{(n)} - N_{s_{i-1}}^{(n)})) for all α_1^k ∈ R_+^k.

Suggests Poisson processes exist and are limits of this construction.
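The construction can be simulated directly; the Monte Carlo sketch below (illustrative only, with arbitrary parameter choices) checks that the mean count in a window [0, t] is close to λt.

```python
import random

random.seed(0)

def count_in_window(lam, n, t):
    """One sample of N_t^(n): drop int(lam*n) i.i.d. uniform points
    on [0, n] and count those falling in [0, t]."""
    return sum(1 for _ in range(int(lam * n)) if random.uniform(0, n) <= t)

lam, n, t, trials = 1.5, 200.0, 4.0, 2000
mean_count = sum(count_in_window(lam, n, t) for _ in range(trials)) / trials
# E[N_t^(n)] = int(lam*n) * (t/n) = lam * t = 6.0 exactly here
```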
A second characterization

Proposition: for a Poisson process {T_n}_{n>0} of intensity λ, its interarrival times τ_i = T_{i+1} - T_i, where T_0 = 0, verify: {τ_n}_{n≥0} i.i.d. with common distribution Exp(λ).

Density of Exp(λ): λ e^{-λx} I_{x>0}.

Key property: an Exponential random variable τ is memoryless, i.e. for all s, t > 0, P(τ > t + s | τ > t) = P(τ > s).
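The memoryless property follows in one line from the survival function P(τ > x) = e^{-λx}; a trivial but clarifying sketch (with an arbitrary rate):

```python
import math

def exp_survival(lam, x):
    """P(tau > x) for tau ~ Exp(lam)."""
    return math.exp(-lam * x)

lam, t, s = 0.7, 2.0, 1.5
conditional = exp_survival(lam, t + s) / exp_survival(lam, t)  # P(tau > t+s | tau > t)
unconditional = exp_survival(lam, s)                           # P(tau > s)
# the two agree: e^{-lam(t+s)} / e^{-lam t} = e^{-lam s}
```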
A third characterization

Proposition: a process with i.i.d. Exp(λ) interarrival times {τ_i}_{i≥0} can be constructed on [0, t] by
1) drawing N_t ∼ Poisson(λt);
2) placing N_t points U_1, ..., U_{N_t} on [0, t], where the U_i are i.i.d. uniform on [0, t].

Proof: establish, for all n ∈ N and φ : R_+^n → R, the identity
E[φ(τ_0, τ_0 + τ_1, ..., τ_0 + ... + τ_{n-1}) I_{N_t = n}]
= e^{-λt} (λt)^n / n! × (n!/t^n) ∫_{(0,t]^n} φ(s_1, s_2, ..., s_n) I_{s_1 < s_2 < ... < s_n} Π_{i=1}^n ds_i
= P(Poisson(λt) = n) E[φ(S_1, ..., S_n)],
where S_1^n is the sorted version of n i.i.d. variables uniform on [0, t].
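The two descriptions (exponential interarrivals vs. a Poisson number of uniform points) can be compared by simulation. This sketch checks that counts built from Exp(λ) interarrivals have the Poisson(λt) mean and the right probability of being zero, P(N_t = 0) = e^{-λt}.

```python
import random

random.seed(1)

def count_via_exponentials(lam, t):
    """Number of arrivals in [0, t] when interarrival times are i.i.d. Exp(lam)."""
    total, n = 0.0, 0
    while True:
        total += random.expovariate(lam)
        if total > t:
            return n
        n += 1

lam, t, trials = 1.0, 1.0, 5000
counts = [count_via_exponentials(lam, t) for _ in range(trials)]
mean_n = sum(counts) / trials          # should be close to lam * t = 1
frac_zero = counts.count(0) / trials   # should be close to e^{-lam*t} ~ 0.368
```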
Laplace transform of Poisson processes

Definition: the Laplace transform of a point process N ≡ {T_n}_{n>0} is the functional whose evaluation at f : R_+ → R_+ is
L_N(f) := E exp(-N(f)) = E exp(-Σ_{n>0} f(T_n)).

Proposition: knowledge of L_N(f) on a sufficiently rich class of functions f : R_+ → R_+ (e.g. piecewise continuous with compact support) characterizes the law of the point process N.

Laplace transform of a Poisson process with intensity λ:
L_N(f) = exp(-λ ∫_{R_+} (1 - e^{-f(x)}) dx).

(i) The previous construction yields this expression for L_N(f). (ii) For f = Σ_i α_i I_{C_i}, N(C_i) ∼ Poisson(λ ∫_{C_i} dx), with independence for disjoint C_i. Hence existence of the Poisson process...
Poisson process with general space and intensity

Definition: for λ : R^d → R_+ a locally integrable function, a point process N ≡ {T_n}_{n>0} on R^d is Poisson with intensity function λ if and only if for measurable, disjoint C_i ⊂ R^d, i = 1, ..., n, the N(C_i) are independent, Poisson(∫_{C_i} λ(x) dx).

Proposition: such a process exists and admits the Laplace transform
L_N(f) = exp(-∫_{R^d} λ(x)(1 - e^{-f(x)}) dx).
Further properties

Strong Markov property: for a Poisson process N with intensity λ and a stopping time T (i.e. ∀t ≥ 0, {T ≤ t} ∈ σ(N_s, s ≤ t)), on {T < +∞} the process {N_{T+t} - N_T}_{t≥0} is Poisson with intensity λ and independent of {N_s}_{s≤T}.

Superposition: for independent Poisson processes N_i with intensities λ_i, i = 1, ..., n, the sum N = Σ_i N_i is Poisson with intensity Σ_i λ_i.

Thinning: for a Poisson process N ≡ {T_n}_{n>0} with intensity λ and {Z_n}_{n>0} independent of N, i.i.d., valued in [k], the processes N_i defined by N_i(C) = Σ_{n>0} I_{T_n ∈ C} I_{Z_n = i} are independent, Poisson with intensities λ_i = λ P(Z_n = i).
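The thinning property can be verified numerically in a simple case (a sketch with arbitrary rates): conditioning on the total count N_t ∼ Poisson(λt) and keeping each point independently with probability p, the number kept is again Poisson, with mean λpt.

```python
import math

def poisson_pmf(mean, k):
    """P(Poisson(mean) = k)."""
    return math.exp(-mean) * mean**k / math.factorial(k)

def thinned_pmf(lam_t, p, k, nmax=100):
    """P(k kept points) when N ~ Poisson(lam_t) points are each kept
    independently with probability p: sum over the total count n."""
    return sum(poisson_pmf(lam_t, n) * math.comb(n, k) * p**k * (1 - p)**(n - k)
               for n in range(k, nmax))

lam_t, p = 5.0, 0.3
# the thinned distribution should coincide with Poisson(lam_t * p)
err = max(abs(thinned_pmf(lam_t, p, k) - poisson_pmf(lam_t * p, k))
          for k in range(10))
```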
Markov jump processes

A process {X_t}_{t ∈ R_+} with values in E, countable or finite, is Markov if
P(X_{t_n} = x_n | X_{t_{n-1}} = x_{n-1}, ..., X_{t_1} = x_1) = P(X_{t_n} = x_n | X_{t_{n-1}} = x_{n-1})
for all t_1^n ∈ R_+^n with t_1 < ... < t_n and x_1^n ∈ E^n.

Homogeneous if P(X_{t+s} = y | X_s = x) =: p_{xy}(t) is independent of s, for all s, t ∈ R_+, x, y ∈ E.

Semi-group property: p_{xy}(t+s) = Σ_{z ∈ E} p_{xz}(t) p_{zy}(s), or P(t+s) = P(t)P(s) with P(t) = {p_{xy}(t)}_{x,y ∈ E}.

Definition: {X_t}_{t ∈ R_+} is a pure jump Markov process if in addition
(i) it spends with probability 1 a strictly positive time in each state;
(ii) trajectories t ↦ X_t are right-continuous.
Markov jump processes: examples

Poisson process {N_t}_{t ∈ R_+}: a Markov jump process with p_{xy}(t) = P(Poisson(λt) = y - x).

Single-server queue with FIFO ("first-in-first-out") discipline, arrival times given by a Poisson(λ) process N, service times i.i.d. Exp(µ) independent of N. X_t = number of customers present at time t: a Markov jump process, by the memoryless property of the Exponential distribution plus the Markov property of the Poisson process (the M/M/1/∞ queue).

Infinite-server queue with Poisson arrivals and Exponential service times: a customer arrived at T_n stays in the system until T_n + σ_n, where σ_n is its service time. X_t = number of customers present at time t: a Markov jump process (the M/M/∞/∞ queue).
Structure of Markov jump processes

Infinitesimal generator: for all x, y ∈ E with y ≠ x, the limits
q_x := lim_{t→0} (1 - p_{xx}(t))/t,   q_{xy} := lim_{t→0} p_{xy}(t)/t
exist in R_+ and satisfy Σ_{y≠x} q_{xy} = q_x.

q_{xy}: jump rate from x to y. Q := {q_{xy}}_{x,y ∈ E}, where q_{xx} = -q_x: infinitesimal generator of the process {X_t}_{t ∈ R_+}. Formally: Q = lim_{h→0} (1/h)[P(h) - I], where I is the identity matrix.

Structure of Markov jump processes: the sequence {Y_n}_{n ∈ N} of visited states is a Markov chain with transition matrix p_{xy} = I_{x≠y} q_{xy}/q_x. Conditionally on {Y_n}_{n ∈ N}, the sojourn times {τ_n}_{n ∈ N} in the successive states Y_n are independent, with distributions Exp(q_{Y_n}).
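This jump-chain/holding-time structure translates directly into a simulation recipe. A minimal sketch (the generator encoding as a dict of off-diagonal rates, and the M/M/1 example with finite buffer capacity 50 and rates λ = 1, µ = 2, are illustrative choices):

```python
import random

random.seed(2)

def simulate_jump_process(Q, x0, t_end):
    """Simulate a pure jump Markov process with generator Q (a dict mapping
    each state to its dict of off-diagonal rates q_xy): sojourn Exp(q_x)
    in state x, then jump to y with probability q_xy / q_x."""
    t, x, path = 0.0, x0, [(0.0, x0)]
    while True:
        rates = Q[x]
        q_x = sum(rates.values())
        if q_x == 0.0:                      # absorbing state
            return path
        t += random.expovariate(q_x)        # sojourn time in x
        if t > t_end:
            return path
        r, acc = random.uniform(0.0, q_x), 0.0
        for y, q in rates.items():          # next state of the jump chain
            acc += q
            if r <= acc:
                x = y
                break
        path.append((t, x))

# illustrative generator: M/M/1 queue with arrival rate 1.0, service rate 2.0,
# truncated at capacity C so that the state space is finite
C = 50
Q = {n: {**({n + 1: 1.0} if n < C else {}), **({n - 1: 2.0} if n > 0 else {})}
     for n in range(C + 1)}
path = simulate_jump_process(Q, 0, 100.0)
```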
Examples

For a Poisson process (λ): the only non-zero jump rate is q_{x,x+1} = λ = q_x, x ∈ N.

For the FIFO M/M/1/∞ queue, non-zero rates: q_{x,x+1} = λ, q_{x,x-1} = µ I_{x>0}, x ∈ N, hence q_x = λ + µ I_{x>0}.

For the M/M/∞/∞ queue, non-zero rates: q_{x,x+1} = λ, q_{x,x-1} = µx, x ∈ N, hence q_x = λ + µx.
Structure of Markov jump processes (continued)

Let T_n := Σ_{k=0}^{n-1} τ_k: time of the n-th jump. If T_∞ = +∞ almost surely, the trajectory is determined on all of R_+, hence the generator Q determines the law of the process {X_t}_{t ∈ R_+}.

The process is called explosive if instead T_∞ < +∞ with positive probability. Then the process is not completely characterized by its generator.

Sufficient conditions for non-explosiveness:
- sup_{x ∈ E} q_x < +∞;
- recurrence of the induced chain {Y_n}_{n ∈ N};
- for birth and death processes (i.e. E = N, with only non-zero rates β_n = q_{n,n+1}, the birth rate, and δ_n = q_{n,n-1}, the death rate), non-explosiveness holds if Σ_{n>0} 1/(β_n + δ_n) = +∞.
Kolmogorov's forward and backward equations

Formal differentiation of P(t+h) = P(t)P(h) = P(h)P(t) yields
d/dt P(t) = P(t)Q, i.e. d/dt p_{xy}(t) = Σ_{z ∈ E} p_{xz}(t) q_{zy}   (Kolmogorov's forward equation);
d/dt P(t) = QP(t), i.e. d/dt p_{xy}(t) = Σ_{z ∈ E} q_{xz} p_{zy}(t)   (Kolmogorov's backward equation).

These follow directly from Q = lim_{h→0} (1/h)[P(h) - I] for finite E, in which case P(t) = exp(tQ), t ≥ 0. They hold more generally, in particular for non-explosive processes, with a more involved proof (justifying the exchange of summation and differentiation).
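For a two-state chain the transition probabilities are explicit, so the forward equation can be checked numerically. A sketch with hypothetical rates a = q_{01}, b = q_{10}, for which p_{00}(t) = b/(a+b) + (a/(a+b)) e^{-(a+b)t}:

```python
import math

a, b = 1.3, 0.7   # hypothetical rates: q_01 = a, q_10 = b

def p00(t):
    """Closed-form transition probability of the two-state chain."""
    s = a + b
    return b / s + (a / s) * math.exp(-s * t)

def p01(t):
    return 1.0 - p00(t)

t, h = 0.5, 1e-6
lhs = (p00(t + h) - p00(t - h)) / (2 * h)   # numerical d/dt p00(t)
rhs = p00(t) * (-a) + p01(t) * b            # (P(t) Q)_00: forward equation
```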
Stationary distributions and measures

Definition: a measure {π_x}_{x ∈ E} is stationary if it satisfies π^T Q = 0, or equivalently the global balance equations:
∀x ∈ E,  π_x Σ_{y≠x} q_{xy} (flow out of x) = Σ_{y≠x} π_y q_{yx} (flow into x).

Kolmogorov's equations suggest that if X_0 ∼ π for stationary π, then X_t ∼ π for all t ≥ 0.

Example: stationarity for birth and death processes:
π_0 β_0 = π_1 δ_1,  π_x (β_x + δ_x) = π_{x-1} β_{x-1} + π_{x+1} δ_{x+1}, x ≥ 1.
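A sketch verifying global balance for a concrete birth-death process: the M/M/1 queue with illustrative rates λ = 1, µ = 2, whose stationary distribution is geometric with ratio ρ = λ/µ.

```python
lam, mu = 1.0, 2.0
rho = lam / mu

def pi(x):
    """Stationary distribution of the M/M/1 queue: geometric with ratio rho."""
    return (1 - rho) * rho**x

# global balance: flow out of x equals flow into x
err0 = abs(pi(0) * lam - pi(1) * mu)
err_interior = max(abs(pi(x) * (lam + mu) - (pi(x - 1) * lam + pi(x + 1) * mu))
                   for x in range(1, 30))
```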
Irreducibility, recurrence, invariance

Definition: the process {X_t}_{t ∈ R_+} is irreducible (respectively, irreducible recurrent) if the induced chain {Y_n}_{n ∈ N} is. State x is positive recurrent if E_x(R_x) < +∞, where R_x = inf{t > τ_0 : X_t = x}. Measure π is invariant for the process {X_t}_{t ∈ R_+} if for all t > 0, π^T P(t) = π^T, i.e. ∀x ∈ E, Σ_{y ∈ E} π_y p_{yx}(t) = π_x.
Limit theorems 1

Theorem: for an irreducible recurrent {X_t}_{t ∈ R_+}, the invariant measure π is unique up to a scalar factor. It can be defined, for any x ∈ E, by
∀y ∈ E,  π_y = E_x ∫_0^{R_x} I_{X_t = y} dt,
or alternatively, with T_x := inf{n > 0 : Y_n = x},
∀y ∈ E,  π_y = (1/q_y) E_x Σ_{n=1}^{T_x} I_{Y_n = y}.

Corollaries:
- {π̂_y} invariant for {Y_n}_{n ∈ N} ⟺ {π̂_y/q_y} invariant for {X_t}_{t ∈ R_+};
- for irreducible recurrent {X_t}_{t ∈ R_+}, either all states or no state x ∈ E is positive recurrent.
Limit theorems 2

Theorem: {X_t}_{t ∈ R_+} is ergodic (i.e. irreducible, positive recurrent) iff it is irreducible, non-explosive and there exists a probability distribution π satisfying the global balance equations. Then π is also the unique invariant probability distribution.

Theorem: for ergodic {X_t}_{t ∈ R_+} with stationary distribution π, any initial distribution of X_0 and π-integrable f,
almost surely lim_{t→∞} (1/t) ∫_0^t f(X_s) ds = Σ_{x ∈ E} π_x f(x) (ergodic theorem),
and in distribution X_t →^D π as t → ∞.
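The ergodic theorem can be illustrated by simulation (a Monte Carlo sketch with illustrative rates, M/M/1 with λ = 1, µ = 2): the long-run fraction of time the queue is empty should approach π_0 = 1 - ρ = 1/2.

```python
import random

random.seed(3)

lam, mu, T = 1.0, 2.0, 20000.0
t, x, time_empty = 0.0, 0, 0.0
while t < T:
    rate = lam + (mu if x > 0 else 0.0)        # total jump rate in state x
    dt = min(random.expovariate(rate), T - t)  # sojourn, clipped at the horizon
    if x == 0:
        time_empty += dt
    t += dt
    if t >= T:
        break
    if random.uniform(0.0, rate) < lam:        # arrival vs departure
        x += 1
    else:
        x -= 1
frac_empty = time_empty / T
# ergodic theorem: frac_empty -> pi_0 = 1 - lam/mu = 0.5
```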
Limit theorems 3

Theorem: for an irreducible, non-ergodic {X_t}_{t ∈ R_+} and any initial distribution of X_0, for all x ∈ E, lim_{t→∞} P(X_t = x) = 0.
Time reversal and reversibility

For a stationary ergodic {X_t}_{t ∈ R} with stationary distribution π, the time-reversed process X̃_t = X_{-t} is Markov with transition rates q̃_{xy} = π_y q_{yx} / π_x.

Definition: a stationary ergodic {X_t}_{t ∈ R} with stationary distribution π is reversible iff it is distributed as its time-reversal {X̃_t}_{t ∈ R}, i.e.
∀x ≠ y ∈ E,  π_x q_{xy} (flow from x to y) = π_y q_{yx} (flow from y to x):
the detailed balance equations.

Detailed balance (i.e. reversibility) for π implies global balance for π. Example: for birth and death processes, detailed balance always holds for a stationary measure.
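A sketch checking detailed balance for a concrete birth-death example: the M/M/∞ queue (β_x = λ, δ_x = µx), whose stationary distribution is Poisson(ρ) with ρ = λ/µ; the rates are illustrative.

```python
import math

lam, mu = 3.0, 1.5
rho = lam / mu   # = 2.0

def pi(x):
    """Stationary distribution of the M/M/infinity queue: Poisson(rho)."""
    return math.exp(-rho) * rho**x / math.factorial(x)

# detailed balance: flow from x to x+1 equals flow from x+1 to x
err = max(abs(pi(x) * lam - pi(x + 1) * mu * (x + 1)) for x in range(40))
```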
Reversibility and truncation

Proposition: let generator Q on E admit reversible measure π. Then for a subset F ⊂ E, the truncated generator Q̂ defined by Q̂_{xy} = Q_{xy} for x ≠ y ∈ F and Q̂_{xx} = -Σ_{y ∈ F, y ≠ x} Q̂_{xy} for x ∈ F admits {π_x}_{x ∈ F} as reversible measure.
Erlang's model of telephone network

Call types s ∈ S: type-s calls arrive at the instants of a Poisson(λ_s) process and last (if accepted) for a duration Exponential(µ_s). Type-s calls require one circuit (unit of capacity) per link l ∈ s. Link l has capacity C_l circuits.

Stationary probability distribution:
π_x = (1/Z) Π_{s ∈ S} (ρ_s^{x_s} / x_s!) I_{∀l: Σ_{s ∋ l} x_s ≤ C_l},
where ρ_s = λ_s/µ_s and Z is a normalizing constant.

Basis for dimensioning studies of telephone networks (prediction of call rejection probabilities). More recent application: performance analysis of peer-to-peer systems for video streaming.
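For a single link and a single call type, the distribution above reduces to a truncated Poisson, and the rejection probability is the classical Erlang B formula. The sketch below (with illustrative load and capacity values) computes it via the standard stable recursion and checks it against the direct truncated-Poisson formula.

```python
import math

def erlang_b(rho, C):
    """Erlang B blocking probability via the stable recursion
    B(0) = 1, B(c) = rho*B(c-1) / (c + rho*B(c-1))."""
    B = 1.0
    for c in range(1, C + 1):
        B = rho * B / (c + rho * B)
    return B

rho, C = 4.0, 6   # illustrative load (erlangs) and capacity (circuits)
recursive = erlang_b(rho, C)
direct = (rho**C / math.factorial(C)) / sum(rho**k / math.factorial(k)
                                            for k in range(C + 1))
```

The recursion avoids the large factorials of the direct formula, which matters for realistic capacities of hundreds of circuits.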
Jackson networks

Stations i ∈ I receive external arrivals at Poisson rate λ_i. Station i, when processing x_i customers, completes service at rate µ_i φ_i(x_i) (e.g. φ_i(x_i) = min(x_i, n_i): a queue with n_i servers and Exponential(µ_i) service times). After completing service at station i, a customer joins station j with probability p_{ij}, j ∈ I, and leaves the system with probability 1 - Σ_{j ∈ I} p_{ij}. The matrix P = (p_{ij}) is sub-stochastic, such that (I - P)^{-1} exists.

Traffic equations: ∀i ∈ I, λ̄_i = λ_i + Σ_{j ∈ I} λ̄_j p_{ji}, or λ̄ = (I - P^T)^{-1} λ.
Jackson networks (continued)

Stationary measure:
π_x = Π_{i ∈ I} (ρ_i^{x_i} / Π_{m=1}^{x_i} φ_i(m)),
where ρ_i = λ̄_i/µ_i and the λ̄_i are the solutions of the traffic equations.

Application: the process is ergodic when π has finite mass; e.g. for φ_i(x) = min(x, n_i), ergodicity holds iff ∀i ∈ I, ρ_i < n_i.

Proof: verify the partial balance equations, for all x ∈ N^I:
∀i ∈ I,  π_x [Σ_{j≠i} q_{x, x-e_i+e_j} + q_{x, x-e_i}] = Σ_{j≠i} π_{x-e_i+e_j} q_{x-e_i+e_j, x} + π_{x-e_i} q_{x-e_i, x},
π_x Σ_{i ∈ I} q_{x, x+e_i} = Σ_{i ∈ I} π_{x+e_i} q_{x+e_i, x},
which imply the global balance equations
π_x [Σ_{i ∈ I} (q_{x, x-e_i} + q_{x, x+e_i} + Σ_{j≠i} q_{x, x-e_i+e_j})] = Σ_{i ∈ I} (π_{x-e_i} q_{x-e_i, x} + π_{x+e_i} q_{x+e_i, x} + Σ_{j≠i} π_{x-e_i+e_j} q_{x-e_i+e_j, x}).
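A sketch solving the traffic equations for a hypothetical two-station network by fixed-point iteration, which converges because P is sub-stochastic with (I - P) invertible:

```python
def solve_traffic(lam, P, iters=200):
    """Fixed-point iteration for the traffic equations
    lam_bar_i = lam_i + sum_j lam_bar_j * P[j][i]."""
    n = len(lam)
    lb = lam[:]
    for _ in range(iters):
        lb = [lam[i] + sum(lb[j] * P[j][i] for j in range(n)) for i in range(n)]
    return lb

# hypothetical network: external arrivals only at station 0;
# station 0 routes to station 1 w.p. 0.8, station 1 back to 0 w.p. 0.5
lam = [1.0, 0.0]
P = [[0.0, 0.8],
     [0.5, 0.0]]
lam_bar = solve_traffic(lam, P)
# residual of the traffic equations at the computed solution
err = max(abs(lam_bar[i] - (lam[i] + sum(lam_bar[j] * P[j][i] for j in range(2))))
          for i in range(2))
```

Solving by hand: λ̄_0 = 1 + 0.5 λ̄_1 and λ̄_1 = 0.8 λ̄_0 give λ̄_0 = 5/3, λ̄_1 = 4/3, which the iteration reproduces.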
Takeaway messages

The Poisson process is a fundamental continuous-time process, an adequate model for an aggregate of infrequent independent events.

Markov jump processes:
i) the generator Q characterizes the distribution if the process is not explosive;
ii) the balance equations characterize the invariant distribution if the process is irreducible and non-explosive;
iii) limit theorems: the stationary distribution reflects long-term performance.

Exactly solvable models include reversible processes, plus several other important classes (e.g. Jackson networks).