Markov processes and queueing networks


Inria September 22, 2015

Outline
- Poisson processes
- Markov jump processes
- Some queueing networks

The Poisson distribution (Siméon-Denis Poisson, 1781-1840)

$P(X = n) = e^{-\lambda} \frac{\lambda^n}{n!}$, $n \in \mathbb{N}$. As prevalent as the Gaussian distribution.

Law of rare events (a.k.a. law of small numbers): let $p_{n,i} \ge 0$ be such that $\lim_n \sup_i p_{n,i} = 0$ and $\lim_n \sum_i p_{n,i} = \lambda > 0$. Then $X_n = \sum_i Z_{n,i}$, with $Z_{n,i}$ independent Bernoulli($p_{n,i}$), verifies $X_n \xrightarrow{D} \mathrm{Poisson}(\lambda)$.
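
As a quick numerical illustration of the law of rare events (a minimal sketch, not part of the original slides; the parameter choices are arbitrary): the sum of $n$ i.i.d. Bernoulli($\lambda/n$) variables is Binomial($n, \lambda/n$), which should match the Poisson($\lambda$) pmf for large $n$.

```python
import math
import numpy as np

rng = np.random.default_rng(0)

# Sum of n i.i.d. Bernoulli(lam/n) variables = Binomial(n, lam/n),
# which converges in distribution to Poisson(lam) as n grows.
n, lam, trials = 10_000, 3.0, 200_000
samples = rng.binomial(n, lam / n, size=trials)

for k in range(7):
    empirical = np.mean(samples == k)
    pmf = math.exp(-lam) * lam**k / math.factorial(k)
    print(f"k={k}: empirical {empirical:.4f} vs Poisson pmf {pmf:.4f}")
```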

Point process on $\mathbb{R}_+$

Definition: a point process on $\mathbb{R}_+$ is a collection of random times $\{T_n\}_{n>0}$ with $0 < T_1 < T_2 < \dots$

Alternative description: the collection $\{N_t\}_{t \in \mathbb{R}_+}$ with $N_t := \sum_{n>0} I_{T_n \in [0,t]}$.

Yet another description: the collection $\{N(C)\}$ for all measurable $C \subset \mathbb{R}_+$, where $N(C) := \sum_{n>0} I_{T_n \in C}$.

Poisson process on $\mathbb{R}_+$

Definition: a point process such that for all $s_0 = 0 < s_1 < s_2 < \dots < s_n$:
1. the increments $\{N_{s_i} - N_{s_{i-1}}\}_{1 \le i \le n}$ are independent;
2. the law of $N_{t+s} - N_s$ depends only on $t$;
3. for some $\lambda > 0$, $N_t \sim \mathrm{Poisson}(\lambda t)$.

In fact, (3) follows from (1) and (2). $\lambda$ is called the intensity of the process.

A first construction

Given $\lambda n$ i.i.d. numbers $U_{n,i}$, uniform on $[0, n]$, let $N^{(n)}_t := \sum_{i=1}^{\lambda n} I_{U_{n,i} \le t}$. Then for any $k \in \mathbb{N}$ and $s_0 = 0 < s_1 < s_2 < \dots < s_k$,
$\{N^{(n)}_{s_i} - N^{(n)}_{s_{i-1}}\}_{1 \le i \le k} \xrightarrow{D} \bigotimes_{1 \le i \le k} \mathrm{Poisson}(\lambda(s_i - s_{i-1}))$

Proof:
- multinomial distribution of $\{N^{(n)}_{s_i} - N^{(n)}_{s_{i-1}}\}_{1 \le i \le k}$;
- convergence of the Laplace transform $E \exp(-\sum_{i=1}^k \alpha_i (N^{(n)}_{s_i} - N^{(n)}_{s_{i-1}}))$ for all $\alpha_1^k \in \mathbb{R}_+^k$.

This suggests that Poisson processes exist and arise as limits of this construction.

A second characterization

Proposition: for a Poisson process $\{T_n\}_{n>0}$ of intensity $\lambda$, the interarrival times $\tau_i = T_{i+1} - T_i$, where $T_0 = 0$, verify: $\{\tau_n\}_{n \ge 0}$ are i.i.d. with common distribution $\mathrm{Exp}(\lambda)$.

Density of $\mathrm{Exp}(\lambda)$: $\lambda e^{-\lambda x} I_{x>0}$.

Key property: an exponential random variable $\tau$ is memoryless, i.e. $\forall s, t > 0$, $P(\tau > t + s \mid \tau > t) = P(\tau > s)$.
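
A minimal simulation sketch of this characterization (not from the slides; parameters are arbitrary): build the process from i.i.d. $\mathrm{Exp}(\lambda)$ gaps and check that $N_t$ has the Poisson($\lambda t$) mean and variance.

```python
import numpy as np

rng = np.random.default_rng(1)

lam, t, trials = 2.0, 5.0, 20_000

def poisson_arrivals(lam, t, rng):
    """Arrival times T_1 < T_2 < ... <= t from cumulative Exp(lam) gaps."""
    arrivals = []
    s = rng.exponential(1.0 / lam)
    while s <= t:
        arrivals.append(s)
        s += rng.exponential(1.0 / lam)
    return arrivals

counts = np.array([len(poisson_arrivals(lam, t, rng)) for _ in range(trials)])
# Both should be close to lam * t = 10 for a Poisson(lam * t) count.
print("empirical mean of N_t:", counts.mean())
print("empirical variance of N_t:", counts.var())
```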

A third characterization

Proposition: a process with i.i.d. $\mathrm{Exp}(\lambda)$ interarrival times $\{\tau_i\}_{i \ge 0}$ can be constructed on $[0, t]$ by
1) drawing $N_t \sim \mathrm{Poisson}(\lambda t)$;
2) putting $N_t$ points $U_1, \dots, U_{N_t}$ on $[0, t]$, where the $U_i$ are i.i.d. uniform on $[0, t]$.

Proof: establish, for all $n \in \mathbb{N}$ and $\phi : \mathbb{R}^n_+ \to \mathbb{R}$, the identity
$E[\phi(\tau_0, \tau_0 + \tau_1, \dots, \tau_0 + \dots + \tau_{n-1}) I_{N_t = n}] = e^{-\lambda t} \frac{(\lambda t)^n}{n!} \cdot \frac{n!}{t^n} \int_{(0,t]^n} \phi(s_1, s_2, \dots, s_n) I_{s_1 < s_2 < \dots < s_n} \prod_{i=1}^n ds_i = P(\mathrm{Poisson}(\lambda t) = n) \, E[\phi(S_1, \dots, S_n)]$,
where $(S_1, \dots, S_n)$ is the sorted version of $n$ i.i.d. variables uniform on $[0, t]$.
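
This construction gives an exact sampler for the Poisson process on a bounded window. A minimal sketch (not from the slides; parameters arbitrary), cross-checked against the second characterization via the mean of the first arrival time:

```python
import numpy as np

rng = np.random.default_rng(2)

lam, t = 2.0, 5.0

def poisson_arrivals_uniform(lam, t, rng):
    """Order-statistics construction: N_t ~ Poisson(lam*t), then sorted uniforms."""
    n = rng.poisson(lam * t)
    return np.sort(rng.uniform(0.0, t, size=n))

first_arrivals = []
for _ in range(50_000):
    arr = poisson_arrivals_uniform(lam, t, rng)
    if arr.size > 0:
        first_arrivals.append(arr[0])

# T_1 should be (nearly) Exp(lam): mean close to 1/lam = 0.5, up to a
# negligible truncation effect since P(N_t = 0) = e^{-10} here.
print("mean of T_1:", np.mean(first_arrivals))
```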

Laplace transform of Poisson processes

Definition: the Laplace transform of a point process $N \equiv \{T_n\}_{n>0}$ is the functional whose evaluation at $f : \mathbb{R}_+ \to \mathbb{R}_+$ is
$L_N(f) := E \exp(-N(f)) = E \exp(-\sum_{n>0} f(T_n))$.

Proposition: knowledge of $L_N(f)$ on a sufficiently rich class of functions $f : \mathbb{R}_+ \to \mathbb{R}_+$ (e.g. piecewise continuous with compact support) characterizes the law of the point process $N$.

Laplace transform of a Poisson process with intensity $\lambda$:
$L_N(f) = \exp(-\lambda \int_{\mathbb{R}_+} (1 - e^{-f(x)}) \, dx)$

(i) The previous construction yields this expression for $L_N(f)$.
(ii) For $f = \sum_i \alpha_i I_{C_i}$: $N(C_i) \sim \mathrm{Poisson}(\lambda \int_{C_i} dx)$, with independence for disjoint $C_i$. Hence the existence of the Poisson process.

Poisson process with general space and intensity

Definition: for a locally integrable function $\lambda : \mathbb{R}^d \to \mathbb{R}_+$, a point process $N \equiv \{T_n\}_{n>0}$ on $\mathbb{R}^d$ is Poisson with intensity function $\lambda$ if and only if for measurable, disjoint $C_i \subset \mathbb{R}^d$, $i = 1, \dots, n$, the counts $N(C_i)$ are independent and $\mathrm{Poisson}(\int_{C_i} \lambda(x) dx)$.

Proposition: such a process exists and admits the Laplace transform
$L_N(f) = \exp(-\int_{\mathbb{R}^d} \lambda(x)(1 - e^{-f(x)}) \, dx)$

Further properties

Strong Markov property: for a Poisson process $N$ with intensity $\lambda$ and a stopping time $T$ (i.e. $\forall t \ge 0$, $\{T \le t\} \in \sigma(N_s, s \le t)$), on $\{T < +\infty\}$ the process $\{N_{T+t} - N_T\}_{t \ge 0}$ is Poisson with intensity $\lambda$ and independent of $\{N_s\}_{s \le T}$.

Superposition: for independent Poisson processes $N_i$ with intensities $\lambda_i$, $i = 1, \dots, n$, the superposition $N = \sum_i N_i$ is Poisson with intensity $\sum_i \lambda_i$.

Thinning: for a Poisson process $N \equiv \{T_n\}_{n>0}$ with intensity $\lambda$ and marks $\{Z_n\}_{n>0}$ independent of $N$, i.i.d., valued in $[k]$, the processes $N_i$ defined by $N_i(C) = \sum_{n>0} I_{T_n \in C} I_{Z_n = i}$ are independent Poisson with intensities $\lambda_i = \lambda P(Z_n = i)$.
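
A minimal sketch of thinning (not from the slides; parameters arbitrary): mark each point of a Poisson($\lambda$) process i.i.d. and check each thinned stream's count against its expected Poisson mean.

```python
import numpy as np

rng = np.random.default_rng(3)

lam, t, probs = 3.0, 100.0, [0.2, 0.5, 0.3]

# Order-statistics construction of the parent process on [0, t].
n = rng.poisson(lam * t)
times = np.sort(rng.uniform(0.0, t, size=n))
marks = rng.choice(len(probs), size=n, p=probs)   # i.i.d. marks Z_n

for i, p in enumerate(probs):
    count = int(np.sum(marks == i))
    print(f"stream {i}: {count} points, expected about {lam * p * t:.0f}")
```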

Markov jump processes

A process $\{X_t\}_{t \in \mathbb{R}_+}$ with values in a countable or finite set $E$ is Markov if
$P(X_{t_n} = x_n \mid X_{t_{n-1}} = x_{n-1}, \dots, X_{t_1} = x_1) = P(X_{t_n} = x_n \mid X_{t_{n-1}} = x_{n-1})$
for all $t_1 < \dots < t_n$ in $\mathbb{R}_+$ and all $x_1, \dots, x_n \in E$.

It is homogeneous if $P(X_{t+s} = y \mid X_s = x) =: p_{xy}(t)$ is independent of $s$, for all $s, t \in \mathbb{R}_+$, $x, y \in E$.

Semi-group property: $p_{xy}(t+s) = \sum_{z \in E} p_{xz}(t) p_{zy}(s)$, i.e. $P(t+s) = P(t)P(s)$ with $P(t) = \{p_{xy}(t)\}_{x,y \in E}$.

Definition: $\{X_t\}_{t \in \mathbb{R}_+}$ is a pure jump Markov process if in addition
(i) it spends, with probability 1, a strictly positive time in each visited state;
(ii) the trajectories $t \mapsto X_t$ are right-continuous.

Markov jump processes: examples

Poisson process $\{N_t\}_{t \in \mathbb{R}_+}$: a Markov jump process with $p_{xy}(t) = P(\mathrm{Poisson}(\lambda t) = y - x)$.

Single-server queue with FIFO ("first-in-first-out") discipline, arrival times given by a Poisson($\lambda$) process $N$, service times i.i.d. $\mathrm{Exp}(\mu)$ independent of $N$. Then $X_t$, the number of customers present at time $t$, is a Markov jump process, by the memoryless property of the exponential distribution plus the Markov property of the Poisson process (the M/M/1/∞ queue).

Infinite-server queue with Poisson arrivals and exponential service times: a customer arriving at $T_n$ stays in the system until $T_n + \sigma_n$, where $\sigma_n$ is its service time. Then $X_t$, the number of customers present at time $t$, is a Markov jump process (the M/M/∞ queue).

Structure of Markov jump processes

Infinitesimal generator: for all $x, y \in E$ with $y \ne x$, the limits
$q_x := \lim_{t \to 0} \frac{1 - p_{xx}(t)}{t}, \qquad q_{xy} := \lim_{t \to 0} \frac{p_{xy}(t)}{t}$
exist in $\mathbb{R}_+$ and satisfy $\sum_{y \ne x} q_{xy} = q_x$.

$q_{xy}$ is the jump rate from $x$ to $y$. $Q := \{q_{xy}\}_{x,y \in E}$, where $q_{xx} = -q_x$, is the infinitesimal generator of the process $\{X_t\}_{t \in \mathbb{R}_+}$. Formally: $Q = \lim_{h \to 0} \frac{1}{h}[P(h) - I]$, where $I$ is the identity matrix.

Structure: the sequence $\{Y_n\}_{n \in \mathbb{N}}$ of visited states is a Markov chain with transition matrix $p_{xy} = \frac{q_{xy}}{q_x} I_{x \ne y}$; conditionally on $\{Y_n\}_{n \in \mathbb{N}}$, the sojourn times $\{\tau_n\}_{n \in \mathbb{N}}$ in the successive states $Y_n$ are independent, with distributions $\mathrm{Exp}(q_{Y_n})$.
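
This structure translates directly into a simulation recipe (jump chain plus exponential sojourns). A minimal sketch, using as an assumed example the generator of an M/M/1 queue truncated at capacity 10:

```python
import numpy as np

rng = np.random.default_rng(4)

# Generator of an M/M/1 queue truncated at `cap` (assumed example).
lam, mu, cap = 1.0, 2.0, 10
Q = np.zeros((cap + 1, cap + 1))
for x in range(cap + 1):
    if x < cap:
        Q[x, x + 1] = lam              # arrival
    if x > 0:
        Q[x, x - 1] = mu               # departure
    Q[x, x] = -Q[x].sum()              # q_xx = -q_x

def simulate(Q, x0, t_end, rng):
    """Trajectory on [0, t_end]: Exp(q_x) sojourns, then jump-chain moves."""
    t, x = 0.0, x0
    times, states = [0.0], [x0]
    while True:
        q_x = -Q[x, x]
        if q_x == 0.0:                 # absorbing state
            break
        t += rng.exponential(1.0 / q_x)
        if t > t_end:
            break
        p = Q[x].copy(); p[x] = 0.0; p /= q_x   # jump-chain probabilities
        x = rng.choice(len(p), p=p)
        times.append(t); states.append(x)
    return times, states

times, states = simulate(Q, 0, 100.0, rng)
print("jumps:", len(times) - 1, "| final state:", states[-1])
```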

Examples

For the Poisson process ($\lambda$): the only non-zero jump rate is $q_{x,x+1} = \lambda = q_x$, $x \in \mathbb{N}$.

For the FIFO M/M/1/∞ queue, the non-zero rates are $q_{x,x+1} = \lambda$ and $q_{x,x-1} = \mu I_{x>0}$, $x \in \mathbb{N}$; hence $q_x = \lambda + \mu I_{x>0}$.

For the M/M/∞ queue, the non-zero rates are $q_{x,x+1} = \lambda$ and $q_{x,x-1} = \mu x$, $x \in \mathbb{N}$; hence $q_x = \lambda + \mu x$.

Structure of Markov jump processes (continued)

Let $T_n := \sum_{k=0}^{n-1} \tau_k$ be the time of the $n$-th jump. If $T_\infty = +\infty$ almost surely, the trajectory is determined on all of $\mathbb{R}_+$, hence the generator $Q$ determines the law of the process $\{X_t\}_{t \in \mathbb{R}_+}$.

The process is called explosive if instead $T_\infty < +\infty$ with positive probability. Then the process is not completely characterized by its generator.

Sufficient conditions for non-explosiveness:
- $\sup_{x \in E} q_x < +\infty$;
- recurrence of the induced chain $\{Y_n\}_{n \in \mathbb{N}}$.

For birth and death processes (i.e. $E = \mathbb{N}$, with only non-zero rates $\beta_n = q_{n,n+1}$, the birth rate, and $\delta_n = q_{n,n-1}$, the death rate), non-explosiveness holds if
$\sum_{n>0} \frac{1}{\beta_n + \delta_n} = +\infty$

Kolmogorov's forward and backward equations

Formal differentiation of $P(t+h) = P(t)P(h) = P(h)P(t)$ yields:

Kolmogorov's forward equation: $\frac{d}{dt} P(t) = P(t) Q$, i.e. $\frac{d}{dt} p_{xy}(t) = \sum_{z \in E} p_{xz}(t) q_{zy}$

Kolmogorov's backward equation: $\frac{d}{dt} P(t) = Q P(t)$, i.e. $\frac{d}{dt} p_{xy}(t) = \sum_{z \in E} q_{xz} p_{zy}(t)$

These follow directly from $Q = \lim_{h \to 0} \frac{1}{h}[P(h) - I]$ for finite $E$, in which case $P(t) = \exp(tQ)$, $t \ge 0$. They hold more generally, in particular for non-explosive processes, with a more involved proof (justifying the exchange of summation and differentiation).
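
For finite $E$ the solution $P(t) = \exp(tQ)$ can be computed with a matrix exponential. A minimal sketch (reusing the assumed truncated M/M/1 generator from above), including a finite-difference check of the forward equation:

```python
import numpy as np
from scipy.linalg import expm

lam, mu, cap = 1.0, 2.0, 10
Q = np.zeros((cap + 1, cap + 1))
for x in range(cap + 1):
    if x < cap:
        Q[x, x + 1] = lam
    if x > 0:
        Q[x, x - 1] = mu
    Q[x, x] = -Q[x].sum()

P = expm(5.0 * Q)                           # P(t) at t = 5
print("rows sum to 1:", np.allclose(P.sum(axis=1), 1.0))
print("p_00(5) =", P[0, 0])

# Forward equation d/dt P(t) = P(t) Q, checked by finite differences.
h = 1e-6
dP = (expm((5.0 + h) * Q) - P) / h
print("forward equation holds:", np.allclose(dP, P @ Q, atol=1e-4))
```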

Stationary distributions and measures

Definition: a measure $\{\pi_x\}_{x \in E}$ is stationary if it satisfies $\pi^T Q = 0$, or equivalently the global balance equations: for all $x \in E$,
$\pi_x \sum_{y \ne x} q_{xy} = \sum_{y \ne x} \pi_y q_{yx}$
(flow out of $x$ = flow into $x$).

Kolmogorov's equations suggest that if $X_0 \sim \pi$ for a stationary $\pi$, then $X_t \sim \pi$ for all $t \ge 0$.

Example: stationarity for birth and death processes reads
$\pi_0 \beta_0 = \pi_1 \delta_1, \qquad \pi_x(\beta_x + \delta_x) = \pi_{x-1} \beta_{x-1} + \pi_{x+1} \delta_{x+1}, \quad x \ge 1$.
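
On a finite state space, the global balance equations are a linear system: replace one equation by the normalization $\sum_x \pi_x = 1$ and solve. A minimal sketch for the assumed truncated M/M/1 generator, whose stationary law should be the truncated geometric $\pi_x \propto (\lambda/\mu)^x$:

```python
import numpy as np

lam, mu, cap = 1.0, 2.0, 10
Q = np.zeros((cap + 1, cap + 1))
for x in range(cap + 1):
    if x < cap:
        Q[x, x + 1] = lam
    if x > 0:
        Q[x, x - 1] = mu
    Q[x, x] = -Q[x].sum()

# Solve pi^T Q = 0 with one row replaced by the normalization constraint.
A = Q.T.copy()
A[-1, :] = 1.0
b = np.zeros(cap + 1); b[-1] = 1.0
pi = np.linalg.solve(A, b)

geometric = (lam / mu) ** np.arange(cap + 1)
geometric /= geometric.sum()
print("max deviation from truncated geometric:", np.abs(pi - geometric).max())
```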

Irreducibility, recurrence, invariance

Definition: the process $\{X_t\}_{t \in \mathbb{R}_+}$ is irreducible (respectively, irreducible recurrent) if the induced chain $\{Y_n\}_{n \in \mathbb{N}}$ is. A state $x$ is positive recurrent if $E_x(R_x) < +\infty$, where $R_x = \inf\{t > \tau_0 : X_t = x\}$. A measure $\pi$ is invariant for the process $\{X_t\}_{t \in \mathbb{R}_+}$ if for all $t > 0$, $\pi^T P(t) = \pi^T$, i.e. $\forall x \in E$, $\sum_{y \in E} \pi_y p_{yx}(t) = \pi_x$.

Limit theorems 1

Theorem: for irreducible recurrent $\{X_t\}_{t \in \mathbb{R}_+}$, the invariant measure $\pi$ is unique up to a scalar factor. It can be defined as, for any $x \in E$:
$\forall y \in E, \quad \pi_y = E_x \int_0^{R_x} I_{X_t = y} \, dt$,
or alternatively, with $T_x := \inf\{n > 0 : Y_n = x\}$:
$\forall y \in E, \quad \pi_y = \frac{1}{q_y} E_x \sum_{n=1}^{T_x} I_{Y_n = y}$.

Corollaries:
- $\{\hat\pi_y\}$ is invariant for $\{Y_n\}_{n \in \mathbb{N}}$ iff $\{\hat\pi_y / q_y\}$ is invariant for $\{X_t\}_{t \in \mathbb{R}_+}$.
- For irreducible recurrent $\{X_t\}_{t \in \mathbb{R}_+}$, either all states or no state $x \in E$ is positive recurrent.

Limit theorems 2

Theorem: $\{X_t\}_{t \in \mathbb{R}_+}$ is ergodic (i.e. irreducible, positive recurrent) iff it is irreducible, non-explosive and there exists a probability distribution $\pi$ satisfying the global balance equations. Then $\pi$ is also the unique invariant probability distribution.

Theorem: for ergodic $\{X_t\}_{t \in \mathbb{R}_+}$ with stationary distribution $\pi$, any initial distribution for $X_0$ and any $\pi$-integrable $f$,
$\lim_{t \to \infty} \frac{1}{t} \int_0^t f(X_s) \, ds = \sum_{x \in E} \pi_x f(x)$ almost surely (ergodic theorem),
and in distribution $X_t \xrightarrow{D} \pi$ as $t \to \infty$.
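
A minimal sketch of the ergodic theorem for the (untruncated) M/M/1 queue with $\lambda < \mu$ (parameters assumed): the long-run fraction of time spent in state 0 should approach $\pi_0 = 1 - \lambda/\mu$.

```python
import numpy as np

rng = np.random.default_rng(5)

lam, mu, t_end = 1.0, 2.0, 50_000.0

t, x, time_at_zero = 0.0, 0, 0.0
while t < t_end:
    rate = lam + (mu if x > 0 else 0.0)     # q_x = lam + mu * I_{x>0}
    sojourn = rng.exponential(1.0 / rate)
    if x == 0:
        time_at_zero += min(sojourn, t_end - t)
    t += sojourn
    # Jump up with probability lam/rate, otherwise down (only when x > 0).
    x = x + 1 if rng.random() < lam / rate else x - 1

print("fraction of time in state 0:", time_at_zero / t_end)
print("theory: pi_0 = 1 - lam/mu =", 1 - lam / mu)
```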

Limit theorems 3

Theorem: for irreducible, non-ergodic $\{X_t\}_{t \in \mathbb{R}_+}$ and any initial distribution for $X_0$, for all $x \in E$: $\lim_{t \to \infty} P(X_t = x) = 0$.

Time reversal and reversibility

For stationary ergodic $\{X_t\}_{t \in \mathbb{R}}$ with stationary distribution $\pi$, the time-reversed process $\tilde X_t = X_{-t}$ is Markov, with transition rates $\tilde q_{xy} = \frac{\pi_y q_{yx}}{\pi_x}$.

Definition: stationary ergodic $\{X_t\}_{t \in \mathbb{R}}$ with stationary distribution $\pi$ is reversible iff it is distributed as its time-reversal $\{\tilde X_t\}_{t \in \mathbb{R}}$, i.e. for all $x \ne y \in E$,
$\pi_x q_{xy} = \pi_y q_{yx}$
(flow from $x$ to $y$ = flow from $y$ to $x$): the detailed balance equations.

Detailed balance for $\pi$, i.e. reversibility, implies global balance for $\pi$. Example: for birth and death processes, detailed balance always holds for the stationary measure.
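
A minimal numerical check of detailed balance (assumed example) for a birth-death process with birth rate $\lambda$ and death rate $\mu x$ (the M/M/∞ rates), whose stationary law is Poisson($\lambda/\mu$):

```python
import math
import numpy as np

lam, mu, cap = 3.0, 1.0, 20

# Stationary law pi_x proportional to (lam/mu)^x / x!, truncated at `cap`.
pi = np.array([(lam / mu) ** x / math.factorial(x) for x in range(cap + 1)])
pi /= pi.sum()

up = pi[:-1] * lam                            # flow x -> x+1
down = pi[1:] * mu * np.arange(1, cap + 1)    # flow x+1 -> x
print("detailed balance holds:", np.allclose(up, down))
```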

Reversibility and truncation

Proposition: let a generator $Q$ on $E$ admit a reversible measure $\pi$. Then for any subset $F \subset E$, the truncated generator $\hat Q$, with $\hat Q_{xy} = Q_{xy}$ for $x \ne y \in F$ and $\hat Q_{xx} = -\sum_{y \ne x} \hat Q_{xy}$ for $x \in F$, admits $\{\pi_x\}_{x \in F}$ as a reversible measure.

Erlang's model of telephone network

Call types $s \in S$: type-$s$ calls arrive at the instants of a Poisson($\lambda_s$) process and last (if accepted) for a duration Exponential($\mu_s$); a type-$s$ call requires one circuit (unit of capacity) on each link $l \in s$. Link $l$ has a capacity of $C_l$ circuits.

Stationary probability distribution:
$\pi_x = \frac{1}{Z} \prod_{s \in S} \frac{\rho_s^{x_s}}{x_s!} \, I_{\forall l : \sum_{s \ni l} x_s \le C_l}$,
where $\rho_s = \lambda_s / \mu_s$ and $Z$ is a normalizing constant.

This is the basis for dimensioning studies of telephone networks (prediction of call rejection probabilities). A more recent application: performance analysis of peer-to-peer systems for video streaming.
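
In the single-link, single-type special case the stationary law reduces to a truncated Poisson, and the rejection probability is the classical Erlang-B formula $\pi_C$. A minimal sketch (parameters arbitrary):

```python
import math

def erlang_b(rho: float, C: int) -> float:
    """Blocking probability pi_C of one link of capacity C under load rho."""
    weights = [rho**k / math.factorial(k) for k in range(C + 1)]
    return weights[-1] / sum(weights)

# Load of 10 Erlangs on a 15-circuit link: roughly 3.6% of calls rejected.
print(erlang_b(rho=10.0, C=15))
```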

Jackson networks

Stations $i \in I$ receive external arrivals at Poisson rate $\lambda_i$. Station $i$, when processing $x_i$ customers, completes service at rate $\mu_i \phi_i(x_i)$ (e.g. $\phi_i(x_i) = \min(x_i, n_i)$: a queue with $n_i$ servers and Exponential($\mu_i$) service times). After completing service at station $i$, a customer joins station $j$ with probability $p_{ij}$, $j \in I$, and leaves the system with probability $1 - \sum_{j \in I} p_{ij}$. The matrix $P = (p_{ij})$ is sub-stochastic, such that $(I - P)^{-1}$ exists.

Traffic equations:
$\forall i \in I, \quad \bar\lambda_i = \lambda_i + \sum_{j \in I} \bar\lambda_j p_{ji}$, i.e. $\bar\lambda = (I - P^T)^{-1} \lambda$,
where $\bar\lambda_i$ denotes the effective (total) arrival rate at station $i$.
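
The traffic equations are a plain linear system. A minimal sketch for a hypothetical 3-station network (all rates and routing probabilities assumed for illustration):

```python
import numpy as np

lam = np.array([1.0, 0.5, 0.0])        # external arrival rates lambda_i
P = np.array([[0.0, 0.5, 0.3],         # routing probabilities p_ij;
              [0.2, 0.0, 0.4],         # row sums < 1, so customers
              [0.0, 0.1, 0.0]])        # eventually leave the system

# bar_lambda = (I - P^T)^{-1} lambda
bar_lam = np.linalg.solve(np.eye(len(lam)) - P.T, lam)
print("effective arrival rates:", bar_lam)
```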

Jackson networks (continued)

Stationary measure:
$\pi_x = \prod_{i \in I} \frac{\rho_i^{x_i}}{\prod_{m=1}^{x_i} \phi_i(m)}$,
where $\rho_i = \bar\lambda_i / \mu_i$ and the $\bar\lambda_i$ are the solutions of the traffic equations.

Application: the process is ergodic when $\pi$ has finite mass; e.g. for $\phi_i(x) = \min(x, n_i)$, ergodicity holds iff $\rho_i < n_i$ for all $i \in I$.

Proof: verify the partial balance equations, for all $x \in \mathbb{N}^I$:
$\forall i \in I, \quad \pi_x \Big[ \sum_{j \ne i} q_{x, x - e_i + e_j} + q_{x, x - e_i} \Big] = \sum_{j \ne i} \pi_{x - e_i + e_j} q_{x - e_i + e_j, x} + \pi_{x - e_i} q_{x - e_i, x}$,
$\pi_x \sum_{i \in I} q_{x, x + e_i} = \sum_{i \in I} \pi_{x + e_i} q_{x + e_i, x}$,
which imply the global balance equations
$\pi_x \Big[ \sum_{i \in I} \big( q_{x, x - e_i} + q_{x, x + e_i} + \sum_{j \ne i} q_{x, x - e_i + e_j} \big) \Big] = \sum_{i \in I} \big( \pi_{x - e_i} q_{x - e_i, x} + \pi_{x + e_i} q_{x + e_i, x} + \sum_{j \ne i} \pi_{x - e_i + e_j} q_{x - e_i + e_j, x} \big)$

Takeaway messages

- The Poisson process is a fundamental continuous-time process and an adequate model for aggregates of infrequent independent events.
- Markov jump processes: (i) the generator $Q$ characterizes the distribution if the process is not explosive; (ii) the balance equations characterize the invariant distribution if the process is irreducible and non-explosive; (iii) limit theorems: the stationary distribution reflects long-term performance.
- Exactly solvable models include reversible processes, plus several other important classes (e.g. Jackson networks).