Continuous time Markov chains


Continuous time Markov chains
Alejandro Ribeiro
Dept. of Electrical and Systems Engineering, University of Pennsylvania
aribeiro@seas.upenn.edu
http://www.seas.upenn.edu/users/~aribeiro/
October 16, 2017
Stoch. Systems Analysis Continuous time Markov chains 1

Exponential random variables
Counting processes and definition of Poisson processes
Properties of Poisson processes
Continuous time Markov chains
Transition probability function
Determination of transition probability function
Limit probabilities

Exponential distribution
Exponential RVs are used to model times at which events occur, or more generally the time elapsed between occurrences of random events.
RV T ~ exp(λ) is exponential with parameter λ if its pdf is
  f_T(t) = λ e^(−λt), for all t ≥ 0
The cdf, the integral of the pdf, is (t ≥ 0)
  F_T(t) = 1 − e^(−λt)
The complementary cdf (ccdf) is P(T > t) = 1 − F_T(t) = e^(−λt)
[Figure: pdf and cdf of an exponential RV]
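The pdf/cdf relations above are easy to sanity-check numerically. A minimal sketch (the helper `exp_cdf` and the parameter values are illustrative, not from the slides): sample exponentials and compare the empirical ccdf at one point against e^(−λt).

```python
import math
import random

def exp_cdf(t, lam):
    """Analytic cdf of an exponential RV: F(t) = 1 - exp(-lam * t)."""
    return 1.0 - math.exp(-lam * t)

random.seed(0)
lam = 2.0
samples = [random.expovariate(lam) for _ in range(100_000)]

# Empirical ccdf at t = 1 should be close to the analytic e^(-lam * t)
t = 1.0
empirical = sum(s > t for s in samples) / len(samples)
analytic = math.exp(-lam * t)
print(abs(empirical - analytic))  # small, on the order of 1e-3
```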

Expected value
The expected value of time T ~ exp(λ) is (integrating by parts with u = t, dv = λ e^(−λt) dt)
  E[T] = ∫_0^∞ t λ e^(−λt) dt = [−t e^(−λt)]_0^∞ + ∫_0^∞ e^(−λt) dt = 0 + 1/λ
Mean time is the inverse of the parameter λ
  λ is the rate/frequency of events happening at intervals T
  On average, λt events occur in time t
  The bigger λ, the smaller the expected times, the larger the frequency of events
[Figure: event times S_1, S_2, ... on the time axis; roughly 5 events by t = 5/λ and 10 events by t = 10/λ]

Second moment and variance
For the second moment also integrate by parts (u = t², dv = λ e^(−λt) dt)
  E[T²] = ∫_0^∞ t² λ e^(−λt) dt = [−t² e^(−λt)]_0^∞ + ∫_0^∞ 2t e^(−λt) dt
The first term is 0; the second is, except for a constant, the integral computed before
  E[T²] = (2/λ) ∫_0^∞ t λ e^(−λt) dt = 2/λ²
The variance is computed from the first and second moments
  var[T] = E[T²] − E[T]² = 2/λ² − 1/λ² = 1/λ²
The parameter λ controls both the mean and the variance of an exponential RV
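Both moments can be confirmed by simulation. A small sketch (seed and sample size chosen arbitrarily): the sample mean should approach 1/λ and the sample variance 1/λ².

```python
import random
from statistics import fmean, pvariance

random.seed(1)
lam = 3.0
samples = [random.expovariate(lam) for _ in range(200_000)]

mean = fmean(samples)      # should approach 1/lam ≈ 0.333
var = pvariance(samples)   # should approach 1/lam**2 ≈ 0.111
print(mean, var)
```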

Memoryless random times
Consider a random time T. We say time T is memoryless if
  P[T > s + t | T > t] = P[T > s]
The probability of waiting s extra units of time, given that we already waited t units, is just the probability of waiting s units
  The system does not remember that it has already waited t units
  Same probability irrespective of the time already elapsed
Ex: Chemical reaction A + B → AB occurs when molecules A and B collide. A, B move around randomly. Time T until the reaction
Ex: Group of molecules of type A and type B. A reaction occurs when any type A encounters any type B. Time T until the next reaction

Exponential RVs are memoryless
Write the memoryless property in terms of the joint distribution
  P[T > s + t | T > t] = P[T > s + t, T > t] / P[T > t] = P[T > s]
Notice that having T > s + t and T > t is equivalent to T > s + t
Replace P[T > s + t, T > t] = P[T > s + t] and reorder terms
  P[T > s + t] = P[T > t] P[T > s]
If T is exponentially distributed the ccdf is P[T > t] = e^(−λt). Then
  P[T > s + t] = e^(−λ(s+t)) = e^(−λt) e^(−λs) = P[T > t] P[T > s]
⇒ If random time T is exponential, then T is memoryless
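Memorylessness can also be seen empirically. A sketch under arbitrary illustrative parameters: among samples that already exceed t, the fraction exceeding t + s should match the unconditional P[T > s].

```python
import math
import random

random.seed(2)
lam, s, t = 1.5, 0.5, 1.0
samples = [random.expovariate(lam) for _ in range(500_000)]

# Condition on T > t: among survivors, check P[T > s + t | T > t]
survivors = [x for x in samples if x > t]
cond = sum(x > s + t for x in survivors) / len(survivors)
uncond = math.exp(-lam * s)   # P[T > s], what memorylessness predicts
print(abs(cond - uncond))     # small
```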

Continuous memoryless RVs are exponential
Consider a function g(t) with the property g(t + s) = g(t) g(s)
What is the functional form of g(t)? Take logarithms
  log g(t + s) = log g(t) + log g(s)
This can only be true for all t and s if log g(t) = ct for some constant c
Which in turn can only be true if g(t) = e^(ct) for some constant c
Compare this observation with the statement of the memoryless property
  P[T > s + t] = P[T > t] P[T > s]
It must be that P[T > t] = e^(ct) for some constant c
If T is continuous this can only be true for exponential T
If T is discrete it is geometric: P[T > t] = (1 − p)^t with (1 − p) = e^c
⇒ If a continuous random time T is memoryless, then T is exponential

Combining results
Theorem: A continuous random variable T is memoryless if and only if it is exponentially distributed. That is,
  P[T > s + t | T > t] = P[T > s]  if and only if  f_T(t) = λ e^(−λt) for some λ > 0
Exponential RVs are memoryless. They do not remember elapsed time
They are the only type of continuous memoryless RV
A discrete RV T is memoryless if and only if it is geometric
  Geometrics are discrete approximations of exponentials
  Exponentials are continuous limits of geometrics
  Exponential = time until success. Geometric = nr. of trials until success

Time to first event
Independent exponential RVs T_1, T_2 with parameters λ_1, λ_2
What is the probability distribution of the first event, i.e., T := min(T_1, T_2)?
To have T > t we need both T_1 > t and T_2 > t
Using independence of T_1 and T_2 we can write
  P[T > t] = P[T_1 > t] P[T_2 > t] = (1 − F_{T_1}(t))(1 − F_{T_2}(t))
Substituting the expressions of the exponential ccdfs
  P[T > t] = e^(−λ_1 t) e^(−λ_2 t) = e^(−(λ_1+λ_2)t)
T is exponentially distributed with parameter λ_1 + λ_2
The minimum of exponential variables is exponential
Given two events happening at random exponential times (T_1, T_2), the time to the first of them happening is also exponential

First event to happen
Prob. P[T_1 < T_2] of T_1 ~ exp(λ_1) happening before T_2 ~ exp(λ_2)
Condition on T_2 = t and integrate over the pdf of T_2
  P[T_1 < T_2] = ∫_0^∞ P[T_1 < t | T_2 = t] f_{T_2}(t) dt = ∫_0^∞ F_{T_1}(t) f_{T_2}(t) dt
Substitute the expressions for the exponential pdf and cdf
  P[T_1 < T_2] = ∫_0^∞ (1 − e^(−λ_1 t)) λ_2 e^(−λ_2 t) dt = λ_1 / (λ_1 + λ_2)
Either T_1 comes before T_2 or vice versa, then
  P[T_2 < T_1] = 1 − P[T_1 < T_2] = λ_2 / (λ_1 + λ_2)
Probabilities are the relative values of the parameters
Larger parameter ⇒ smaller average ⇒ larger probability of happening first
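The two facts above (min is exponential with rate λ_1 + λ_2, and T_1 wins with probability λ_1/(λ_1 + λ_2)) can be checked together by simulation. A sketch with illustrative rates λ_1 = 2, λ_2 = 3:

```python
import random

random.seed(3)
lam1, lam2 = 2.0, 3.0
n = 200_000

first_is_1 = 0
min_sum = 0.0
for _ in range(n):
    t1 = random.expovariate(lam1)
    t2 = random.expovariate(lam2)
    first_is_1 += t1 < t2       # event 1 happened first
    min_sum += min(t1, t2)      # time to the first event

p_hat = first_is_1 / n    # ≈ lam1 / (lam1 + lam2) = 0.4
mean_min = min_sum / n    # ≈ 1 / (lam1 + lam2) = 0.2
print(p_hat, mean_min)
```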

Probability of event in infinitesimal time
What is the probability of an event happening in infinitesimal time h?
Want P[T < h] for small h. Equivalent to
  P[T < h] = ∫_0^h λ e^(−λt) dt ≈ (d/dt) P[T < t] |_{t=0} · h = λh
Sometimes we also write P[T < h] = λh + o(h)
  o(h) implies lim_{h→0} o(h)/h = 0. Read as: negligible with respect to h
What about two independent events in infinitesimal time h?
  P[T_1 ≤ h, T_2 ≤ h] ≈ (λ_1 h)(λ_2 h) = λ_1 λ_2 h² = o(h)

Counting and Poisson processes
Exponential random variables
Counting processes and definition of Poisson processes
Properties of Poisson processes
Continuous time Markov chains
Transition probability function
Determination of transition probability function
Limit probabilities

Counting processes
A stochastic process N(t) taking integer values 0, 1, ..., i, ...
A counting process N(t) counts the number of events occurred by time t
  Nonnegative integer valued: N(0) = 0, N(t) ∈ {0, 1, 2, ...}
  Nondecreasing: for s < t it holds N(s) ≤ N(t)
  Event counter: N(t) − N(s) = number of events in the interval (s, t]
  Continuous on the right
Ex.1: # text messages (SMS) typed since the beginning of class
Ex.2: # economic crises since 1900
Ex.3: # customers at Wawa since 8am this morning
[Figure: staircase sample path of n(t), jumping by 1 at event times S_1, ..., S_6]

Independent increments
Numbers of events in disjoint time intervals are independent
Consider times s_1 < t_1 < s_2 < t_2 and intervals (s_1, t_1] and (s_2, t_2]
  N(t_1) − N(s_1) events occur in (s_1, t_1]; N(t_2) − N(s_2) in (s_2, t_2]
Independent increments implies the latter two are independent
  P[N(t_1) − N(s_1) = k, N(t_2) − N(s_2) = l] = P[N(t_1) − N(s_1) = k] P[N(t_2) − N(s_2) = l]
Ex.1: Likely true for SMS, except for have-to-reply messages
Ex.2: Most likely not true for economic crises (business cycle)
Ex.3: Likely true for Wawa, except for unforeseen events (storms)
This does not mean N(t) is independent of N(s)
  These are clearly not independent, since N(t) is at least N(s) for t ≥ s

Stationary increments
The probability distribution of the number of events depends only on the length of the interval
Consider time intervals (0, t] and (s, s + t]
  N(t) events occur in (0, t]; N(s + t) − N(s) events in (s, s + t]
Stationary increments implies the latter two have the same probability distribution
  P[N(s + t) − N(s) = k] = P[N(t) = k]
Ex.1: Likely true if the lecture is good and you keep interest in the class
Ex.2: Maybe true if you do not believe we are becoming better at preventing economic crises
Ex.3: Most likely not true because of, e.g., rush hours and slow days

Poisson process
A counting process is Poisson if it has the following properties:
(a) The process has stationary and independent increments
(b) The number of events in (0, t] has Poisson distribution with mean λt
  P[N(t) = n] = e^(−λt) (λt)^n / n!
An equivalent definition is the following:
(i) The process has stationary and independent increments
(ii) Prob. of one event in infinitesimal time: P[N(h) = 1] = λh + o(h)
(iii) At most one event in infinitesimal time: P[N(h) > 1] = o(h)
This is a more intuitive definition (even though difficult to believe now)
  Conditions (i) and (a) are the same
  That (b) implies (ii) and (iii) is obvious: just substitute small h in the Poisson pmf's expression for P[N(t) = n]
  To see that (ii) and (iii) imply (b) requires some work

Explanation of model (i)-(iii)
Consider time T and divide the interval (0, T] in n subintervals
  Subintervals are of duration h = T/n; h vanishes as n increases
  The m-th subinterval spans ((m−1)h, mh]
Define A_m as the number of events that occur in the m-th subinterval
  A_m = N(mh) − N((m−1)h)
The total number of events in (0, T] is the sum of the A_m's
  N(T) = Σ_{m=1}^n A_m = Σ_{m=1}^n [N(mh) − N((m−1)h)]
[Figure: (0, T] divided into subintervals of width h; N(T) = 5, with A_1, A_2, A_4, A_7, A_8 equal to 1 and A_3, A_5, A_6 equal to 0]

Probability distribution of A_m (intuitive argument)
Note first that since increments are stationary as per (i), it holds
  P[A_m = k] = P[N(mh) − N((m−1)h) = k] = P[N(h) = k]
In particular, using (ii) and (iii)
  P[A_m = 1] = P[N(h) = 1] = λh + o(h)
  P[A_m > 1] = P[N(h) > 1] = o(h)
Set aside the o(h) probabilities; they are negligible with respect to λh
  P[A_m = 1] ≈ λh,  P[A_m = 0] ≈ 1 − λh
⇒ A_m is Bernoulli with parameter λh

Probability distribution of N(T) (intuitive argument)
Since increments are also independent as per (i), the A_m are independent
  N(T) is a sum of n independent Bernoulli RVs with parameter λh
  N(T) is binomial with parameters (n, λh) = (n, λT/n)
As the interval length h → 0, the number of intervals n → ∞
  The product n(λh) = λT stays constant
  N(T) is Poisson with parameter λT
Then (ii)-(iii) imply (b) and the definitions are equivalent
Not a proof, because we neglected the o(h) terms. But it explains what a Poisson process is
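The binomial-to-Poisson limit is easy to observe numerically. A sketch (the helper names and parameter values are illustrative): hold λT fixed while n grows and watch the binomial pmf converge to the Poisson pmf.

```python
import math

def binom_pmf(k, n, p):
    """Binomial pmf: probability of k successes in n Bernoulli(p) trials."""
    return math.comb(n, k) * p**k * (1 - p) ** (n - k)

def poisson_pmf(k, mu):
    """Poisson pmf with mean mu."""
    return math.exp(-mu) * mu**k / math.factorial(k)

lam, T, k = 2.0, 1.0, 3
for n in (10, 100, 10_000):
    print(n, binom_pmf(k, n, lam * T / n))   # p = lam*T/n, so n*p = lam*T
print("Poisson:", poisson_pmf(k, lam * T))
```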

What is a Poisson process?
Events happen in a small interval h with probability λh, proportional to h
Whether an event happens in an interval has no effect on other intervals
Modeling questions:
  Do we expect the probability of an event to be proportional to the length of the interval?
  Do we expect subsequent intervals to behave independently?
If so, a Poisson process model is appropriate
Poisson processes typically arise in a large population of agents acting independently
  The larger the interval, the larger the chance that some agent takes an action
  The action of one agent has no effect on the actions of other agents
  It has therefore negligible effect on the action of the group

Examples of Poisson processes
Ex.1: Number of people arriving at a subway station. Number of cars arriving at a highway entrance. Number of bids in an auction. Number of customers entering a store. In each case, a large number of agents (people, drivers, bidders, customers) acting independently
Ex.2: SMS generated by all students in the class. Once you send an SMS you are likely to stay silent for a while. But in a large population this has a minimal effect on the probability of someone generating an SMS
Ex.3: Count of molecule reactions. Molecules are removed from the pool of reactants once they react. But the effect is negligible in a large population. Eventually the reactants are depleted, but on a small time scale the process is approximately Poisson

Formal argument
Define A_max = max_{m=1,...,n} (A_m), the maximum nr. of events in one subinterval
If A_max ≤ 1 all subintervals have 0 or 1 events ⇒ easy (binomial)
Consider the conditional probability P[N(T) = k | A_max ≤ 1]
For given h, N(T) conditioned on A_max ≤ 1 is binomial
  Parameters are n = T/h and p = λh + o(h)
As the interval length h → 0, the parameter p → 0 and the nr. of intervals n → ∞
  The product np satisfies lim_{h→0} np = lim_{h→0} (T/h)(λh + o(h)) = λT
N(T) conditioned on A_max ≤ 1 is Poisson with parameter λT
  P[N(T) = k | A_max ≤ 1] → e^(−λT) (λT)^k / k!

Formal argument (continued)
Separate the study into A_max ≤ 1 and A_max > 1. That is, condition
  P[N(T) = k] = P[N(T) = k | A_max ≤ 1] P[A_max ≤ 1] + P[N(T) = k | A_max > 1] P[A_max > 1]
Property (iii) implies that P[A_max > 1] vanishes as h → 0
  P[A_max > 1] ≤ Σ_{m=1}^n P[A_m > 1] = n o(h) = T o(h)/h → 0 as h → 0
Thus, as h → 0, P[A_max > 1] → 0 and P[A_max ≤ 1] → 1. Then
  lim_{h→0} P[N(T) = k] = lim_{h→0} P[N(T) = k | A_max ≤ 1]
The latter is, as already seen, Poisson
⇒ N(T) is Poisson with parameter λT

Properties of Poisson processes
Exponential random variables
Counting processes and definition of Poisson processes
Properties of Poisson processes
Continuous time Markov chains
Transition probability function
Determination of transition probability function
Limit probabilities

Interarrival times
Let T_1, T_2, ... be the sequence of times between events
  T_1 is the time until the first event (arrival). T_2 is the time between the second and the first event. T_i is the time between the (i−1)-st and i-th events
Ccdf of T_1:
  P[T_1 > t] = P[N(t) = 0] = e^(−λt)
T_1 has exponential distribution with parameter λ
Since we have independent increments, this is plausibly true for all T_i
Theorem: The interarrival times T_i of a Poisson process are independent identically distributed exponential random variables with parameter λ, i.e., P[T_i > t] = e^(−λt)
We have already proved this for T_1. Let us see the rest.

Interarrival times (proof)
Proof. Let S_i be the absolute time of the i-th event. Condition on S_i
  P[T_{i+1} > t] = ∫ P[T_{i+1} > t | S_i = s] f_{S_i}(s) ds
To have T_{i+1} > t given that S_i = s it must be that N(s + t) = N(s)
  P[T_{i+1} > t | S_i = s] = P[N(t + s) − N(s) = 0 | N(s)]
Since increments are independent, conditioning on N(s) is moot
  P[T_{i+1} > t | S_i = s] = P[N(t + s) − N(s) = 0]
Since increments are also stationary the latter is
  P[T_{i+1} > t | S_i = s] = P[N(t) = 0] = e^(−λt)
Substituting into the integral yields P[T_{i+1} > t] = e^(−λt)

Alternative definition of Poisson process
Start with a sequence of independent random times T_1, T_2, ...
  Times T_i ~ exp(λ) have exponential distribution with parameter λ
Define the time of the i-th event as
  S_i = T_1 + T_2 + ... + T_i
Define the counting process of events happening at the times S_i
  N(t) = max{i : S_i ≤ t}
Then N(t) is a Poisson process
[Figure: staircase N(t) with interarrival times T_1, ..., T_6 between jump times S_1, ..., S_6]
If N(t) is a Poisson process, the interarrival times T_i are exponential
To show that this definition is equivalent we have to show the converse
  I.e., if the interarrival times are exponential, the process is Poisson
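This time-centric definition is exactly how one simulates a Poisson process. A minimal sketch (the function name and parameters are illustrative): accumulate exp(λ) interarrival times and count how many arrivals land in (0, t_end]; the count should have mean λ · t_end.

```python
import random

def poisson_process_count(lam, t_end, rng):
    """Count events in (0, t_end] by summing exp(lam) interarrival times."""
    s, n = 0.0, 0
    while True:
        s += rng.expovariate(lam)   # next arrival time S_i = S_{i-1} + T_i
        if s > t_end:
            return n
        n += 1

rng = random.Random(4)
lam, t_end = 2.0, 3.0
counts = [poisson_process_count(lam, t_end, rng) for _ in range(100_000)]

mean = sum(counts) / len(counts)
print(mean)  # ≈ lam * t_end = 6
```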

Alternative definition of Poisson process (cont.)
Exponential i.i.d. interarrival times ⇒ Poisson process?
Show that this implies definition (i)-(iii)
  Stationarity: true because all T_i have the same distribution
  Independent increments: true because interarrival times are independent and exponential RVs are memoryless
We can have more than one event in (0, h] only if T_1 < h and T_2 < h
  P[N(h) > 1] ≤ P[T_1 ≤ h] P[T_2 ≤ h] = (1 − e^(−λh))² = (λh)² + o(h²) = o(h)
We have no event in (0, h] if T_1 > h
  P[N(h) = 0] = P[T_1 > h] = e^(−λh) = 1 − λh + o(h)
The remaining case is N(h) = 1, whose probability is
  P[N(h) = 1] = 1 − P[N(h) = 0] − P[N(h) > 1] = λh + o(h)

Three definitions of Poisson processes
Def. 1: Prob. of event proportional to interval width; intervals independent
  Physical model definition. Can a phenomenon be reasonably modeled as a Poisson process?
  The other two definitions are used for analysis and/or simulation
Def. 2: Prob. distribution of the number of events in (0, t] is Poisson
  Event centric definition: nr. of events in given time intervals
  Allows analysis and simulation
  Used when information about the nr. of events in a given time is desired

Three definitions of Poisson processes (continued)
Def. 3: Prob. distribution of the interarrival times is exponential
  Time centric definition: times at which events happen
  Allows analysis and simulation. Used when information about event times is of interest
Obs: The restrictions in Def. 1 are mild, yet they impose a lot of structure, as implied by Defs. 2 & 3

Example: Number of visitors to a web page
Nr. of unique visitors to a webpage between 6:00pm and 6:10pm
Def. 1: Poisson process? Probability proportional to time interval and independent intervals seem reasonable assumptions
  Model as a Poisson process with rate λ visits/second (v/s)
Def. 2: Arrivals in an interval of duration t are Poisson with parameter λt
  Expected nr. of visits in 10 minutes? E[N(600)] = 600λ
  Prob. of exactly 10 visits in 1 sec? P[N(1) = 10] = e^(−λ) λ^10 / 10!
Data shows an average of N visits in 10 minutes (600 s). Approximate λ
  Since E[N(600)] = 600λ we can estimate λ̂ = N/600

Number of visitors to a web page (continued)
Def. 3: Interarrival times T_i are exponential with parameter λ
  Expected time between visitors? E[T_i] = 1/λ
  Expected arrival time S_n of the n-th visitor?
  Can write S_n as a sum of the T_i, i.e., S_n = Σ_{i=1}^n T_i. Taking expected values
  E[S_n] = Σ_{i=1}^n E[T_i] = n/λ
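The quantities in this example are simple closed-form computations. A sketch with a hypothetical measurement (1200 visits in 10 minutes is an assumed number, not from the slides):

```python
import math

# Hypothetical measurement: 1200 visits observed in 10 minutes (600 s)
N_bar = 1200
lam_hat = N_bar / 600            # estimated rate in visits/second

expected_10min = 600 * lam_hat   # E[N(600)] = 600 * lam
# P[N(1) = 10] = e^(-lam) lam^10 / 10!
p_10_in_1s = math.exp(-lam_hat) * lam_hat**10 / math.factorial(10)
expected_gap = 1 / lam_hat       # E[T_i] = 1/lam
expected_s100 = 100 / lam_hat    # E[S_100] = 100/lam

print(lam_hat, expected_gap, expected_s100)
```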

Continuous time Markov chains
Exponential random variables
Counting processes and definition of Poisson processes
Properties of Poisson processes
Continuous time Markov chains
Transition probability function
Determination of transition probability function
Limit probabilities

Definition
Continuous time: positive variable t ∈ R_+
States X(t) taking values in a countable set, e.g., 0, 1, ..., i, ...
The stochastic process X(t) is a continuous time Markov chain (CTMC) if
  P[X(t + s) = j | X(s) = i, X(u) = x(u), u < s] = P[X(t + s) = j | X(s) = i]
Memoryless (Markov) property: the future X(t + s) given the present X(s) is independent of the past X(u) = x(u), u < s
In principle we need to specify the functions P[X(t + s) = j | X(s) = i]
  For all times t, for all times s, and for all pairs of states (i, j)

Notation, homogeneity
Notation:
  X[s : t] = state values for all times s ≤ u ≤ t, borders included
  X(s : t) = values for all times s < u < t, borders excluded
  X(s : t] = values for all times s < u ≤ t, left excluded, right included
  X[s : t) = values for all times s ≤ u < t, left included, right excluded
Homogeneous CTMC: P[X(t + s) = j | X(s) = i] constant for all s
  Still need P[X(t + s) = j | X(s) = i] for all t and pairs (i, j)
We restrict consideration to homogeneous CTMCs
  The memoryless property makes this somewhat simpler

Transition times
T_i = time until transition out of state i into any other state j
T_i is a RV called the transition time, with ccdf
  P[T_i > t] = P[X(0 : t] = i | X(0) = i]
Probability of T_i > t + s given that T_i > s? Use the ccdf expression
  P[T_i > t + s | T_i > s] = P[X(0 : t + s] = i | X[0 : s] = i]
                           = P[X(s : t + s] = i | X[0 : s] = i]
                           = P[X(s : t + s] = i | X(s) = i]
                           = P[X(0 : t] = i | X(0) = i]
Equalities true because: we already observed that X[0 : s] = i; the memoryless property; homogeneity
From the ccdf expression, P[T_i > t + s | T_i > s] = P[T_i > t]
⇒ Transition times are exponential RVs

Alternative definition
Exponential transition times are a fundamental property of CTMCs
  Can be used as an algorithmic definition of CTMCs
A continuous time stochastic process X(t) is a CTMC if
  Transition times T_i are exponential RVs with mean 1/ν_i
  When they occur, transitions out of i are into j with probability P_ij
    Σ_j P_ij = 1,  P_ii = 0
  Transition times T_i and the transitioned state j are independent
Define the matrix P grouping the transition probabilities P_ij
CTMC states evolve as a discrete time MC
  State transitions occur at random exponential intervals T_i ~ exp(ν_i)
  As opposed to occurring at fixed intervals
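This algorithmic definition translates directly into a simulator: draw an exp(ν_i) holding time, then draw the next state from row i of P. A minimal sketch (the function name and the toy two-state chain are illustrative, not from the slides):

```python
import random

def simulate_ctmc(nu, P, x0, t_end, rng):
    """Simulate a CTMC given exit rates nu[i] and jump probabilities P[i][j].

    Holding time in state i is exponential with rate nu[i]; the next state
    is drawn from row i of the embedded chain P. Returns jump times/states.
    """
    t, x = 0.0, x0
    times, states = [0.0], [x0]
    while True:
        t += rng.expovariate(nu[x])   # exponential holding time in state x
        if t > t_end:
            return times, states
        x = rng.choices(range(len(P[x])), weights=P[x])[0]  # embedded MC step
        times.append(t)
        states.append(x)

# Toy 2-state chain: leave state 0 at rate 1, state 1 at rate 2,
# always jumping to the other state (P has a null diagonal)
rng = random.Random(5)
nu = [1.0, 2.0]
P = [[0.0, 1.0], [1.0, 0.0]]
times, states = simulate_ctmc(nu, P, 0, 50.0, rng)
print(states[:6])
```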

Embedded discrete time MCs
Definition: Consider a CTMC with transition probs. P and rates ν_i. The (discrete time) MC with transition probs. P is the CTMC's embedded MC
Transition probabilities P describe a discrete time MC
  States do not transition into themselves (P_ii = 0, P's diagonal is null)
Can use the underlying discrete time MC to understand the CTMC
  Accessibility: state j is accessible from state i if it is accessible in the discrete time MC with transition probability matrix P
  Communication: states i and j communicate if they communicate in the discrete time MC with transition probability matrix P
  Communication is a class property. Proof: it is one for the discrete time MC
  Recurrent/transient: recurrence is a class property. Etc. More later

Transition rates
The expected value of the transition time T_i is E[T_i] = 1/ν_i
  Can interpret ν_i as the rate of transition out of state i
Of these transitions, a fraction P_ij are into state j
  Can interpret q_ij := ν_i P_ij as the transition rate from i to j. Define it so
Transition rates are yet another specification of a CTMC
If the q_ij are given we can recover ν_i as
  ν_i = ν_i Σ_{j≠i} P_ij = Σ_{j≠i} ν_i P_ij = Σ_{j≠i} q_ij
And we can recover P_ij as
  P_ij = q_ij / ν_i = q_ij (Σ_{j≠i} q_ij)^(−1)
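The recovery of (ν_i, P_ij) from the rates q_ij is a two-line computation. A sketch with a hypothetical 3-state rate table (the numbers are made up for illustration):

```python
def rates_to_embedded(q):
    """Recover exit rates nu[i] and embedded-chain probs P[i][j]
    from off-diagonal transition rates q[i][j] (diagonal ignored)."""
    n = len(q)
    nu = [sum(q[i][j] for j in range(n) if j != i) for i in range(n)]
    P = [[(q[i][j] / nu[i]) if j != i else 0.0 for j in range(n)]
         for i in range(n)]
    return nu, P

# Hypothetical 3-state rate table
q = [[0.0, 2.0, 1.0],
     [0.5, 0.0, 0.5],
     [3.0, 1.0, 0.0]]
nu, P = rates_to_embedded(q)
print(nu)     # exit rates: row sums of off-diagonal entries
print(P[0])   # embedded probabilities out of state 0
```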

Example: Birth and death processes
State X(t) = 0, 1, ...; interpret as the number of individuals
Births and deaths occur at state-dependent rates. When X(t) = i:
  Births: individuals are added at exponential times with mean 1/λ_i
    birth or arrival rate = λ_i births per unit of time
  Deaths: individuals are removed at exponential times with mean 1/μ_i
    death or departure rate = μ_i deaths per unit of time
Birth and death times are independent, as are subsequent birth and death times
Birth and death (BD) processes are then CTMCs

Transition times and probabilities
Transition times: leave state i ≠ 0 when a birth or a death occurs
  If T_B and T_D are the times to the next birth and death, T_i = min(T_B, T_D)
  Since T_B and T_D are exponential, so is T_i, with rate ν_i = λ_i + μ_i
When leaving state i we can go to i + 1 (birth first) or i − 1 (death first)
  Birth occurs before death with probability P_{i,i+1} = λ_i / (λ_i + μ_i)
  Death occurs before birth with probability P_{i,i−1} = μ_i / (λ_i + μ_i)
Leave state 0 only if a birth occurs (if at all), then ν_0 = λ_0 and P_{01} = 1
  If it leaves 0, it goes to 1 with probability 1. Might not leave 0 if λ_0 = 0

Transition rates
The rate of transition from i to i + 1 is (recall the definition q_ij = ν_i P_ij)
  q_{i,i+1} = ν_i P_{i,i+1} = (λ_i + μ_i) λ_i / (λ_i + μ_i) = λ_i
Likewise, the rate of transition from i to i − 1 is
  q_{i,i−1} = ν_i P_{i,i−1} = (λ_i + μ_i) μ_i / (λ_i + μ_i) = μ_i
For i = 0, q_{01} = ν_0 P_{01} = λ_0
[Figure: BD transition rate diagram over states 0, ..., i−1, i, i+1, ...; birth rates λ_0, ..., λ_{i−1}, λ_i, λ_{i+1} point right, death rates μ_1, ..., μ_i, μ_{i+1} point left]
A somewhat more natural representation, more similar to a discrete MC

M/M/1 queue
An M/M/1 queue is a BD process with constant rates λ_i = λ and μ_i = μ
  Customers arrive for service at a rate of λ per unit of time
  They are serviced at a rate of μ customers per time unit
[Figure: chain diagram over states 0, ..., i−1, i, i+1, ... with arrival rate λ between consecutive states and departure rate μ back]
The M/M is for Markov arrivals / Markov departures
The 1 is because there is only one server
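An M/M/1 queue is easy to simulate directly as a BD chain. A sketch (the function, parameters, and run length are illustrative): from state n the total exit rate is λ + μ (or λ when empty), and each jump is a birth with probability λ divided by that rate. For λ = 0.5, μ = 1 standard queueing results give a long-run average of ρ/(1 − ρ) = 1 customer in the system and an empty-system fraction of 1 − ρ = 0.5, which the time averages should approach.

```python
import random

def mm1_time_average(lam, mu, t_end, rng):
    """Simulate an M/M/1 queue; return the time-average number in system
    and the fraction of time the system is empty."""
    t, n = 0.0, 0
    area, empty_time = 0.0, 0.0
    while t < t_end:
        rate = lam + (mu if n > 0 else 0.0)   # total exit rate of state n
        dt = min(rng.expovariate(rate), t_end - t)
        area += n * dt
        if n == 0:
            empty_time += dt
        t += dt
        if t >= t_end:
            break
        # birth with probability lam/rate, death otherwise
        n += 1 if rng.random() < lam / rate else -1
    return area / t_end, empty_time / t_end

rng = random.Random(6)
avg_n, frac_empty = mm1_time_average(0.5, 1.0, 100_000.0, rng)
print(avg_n, frac_empty)  # theory: rho/(1-rho) = 1.0 and 1-rho = 0.5
```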

Transition probability function
Exponential random variables
Counting processes and definition of Poisson processes
Properties of Poisson processes
Continuous time Markov chains
Transition probability function
Determination of transition probability function
Limit probabilities

Definition
Two equivalent ways of specifying a CTMC:
  Transition time averages 1/ν_i + transition probabilities P_ij
    Easier description. Typical starting point for CTMC modeling
  Transition prob. function P_ij(t) := P[X(t + s) = j | X(s) = i]
    More complete description
Goal: compute the transition prob. function from the transition times and probs.
Notice two obvious properties
  P_ij(0) = 0 for j ≠ i,  P_ii(0) = 1

Roadmap to determine P_ij(t)

Goal is to obtain a differential equation whose solution is P_ij(t).
For that, we study the change in P_ij(t) when time changes slightly.
Separate into two subproblems:
(a) Transition probabilities for a small time h, P_ij(h).
(b) Transition probabilities at t + h as a function of those at t and h.
We can combine both results in two different ways:
Jump from 0 to t, then to t + h ⇒ process runs a little longer ⇒ changes where the process is going to ⇒ forward equations.
Jump from 0 to h, then to t + h ⇒ process starts a little later ⇒ changes where the process comes from ⇒ backward equations.

Transition probability in infinitesimal time

Theorem
The transition probability functions P_ii(t) and P_ij(t), i ≠ j, satisfy the following limits as t goes to 0:

   lim_{t→0} P_ij(t)/t = q_ij,    lim_{t→0} [1 − P_ii(t)]/t = ν_i

Since P_ij(0) = 0 and P_ii(0) = 1, the above limits are derivatives at t = 0:

   ∂P_ij(t)/∂t |_{t=0} = q_ij,    ∂P_ii(t)/∂t |_{t=0} = −ν_i

The limits also imply that for small h

   P_ij(h) = q_ij h + o(h),    P_ii(h) = 1 − ν_i h + o(h)

Transition rates q_ij are instantaneous transition probabilities ⇒ the transition probability coefficient for small time h.
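The infinitesimal behavior can be checked numerically: compute P(h) = e^{Rh} for a small h (matrix exponentials are justified later in these slides) and compare P_ij(h)/h and [1 − P_ii(h)]/h with q_ij and ν_i. A sketch for a hypothetical two-state chain, with a hand-rolled power-series matrix exponential to stay self-contained:

```python
import numpy as np

def expm(A, terms=40):
    """Matrix exponential via its power series (adequate for small ||A||)."""
    out, term = np.eye(len(A)), np.eye(len(A))
    for n in range(1, terms):
        term = term @ A / n
        out = out + term
    return out

q01, q10 = 2.0, 3.0             # hypothetical transition rates
R = np.array([[-q01, q01],
              [q10, -q10]])     # r_ij = q_ij off-diagonal, r_ii = -nu_i

h = 1e-4
P = expm(R * h)                 # P(h) for a small time h
rate_01 = P[0, 1] / h           # should approach q01
rate_out0 = (1 - P[0, 0]) / h   # should approach nu_0 = q01 (two-state chain)
```

Both ratios match the rates up to an o(h)/h correction of order h.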

Transition probability in infinitesimal time (proof)

Proof. Consider a small time h.
Since 1 − P_ii is the probability of transitioning out of state i,

   1 − P_ii(h) = P[T_i < h] = ν_i h + o(h)

Divide by h and take the limit.
For P_ij(h) notice that, since two or more transitions have probability o(h),

   P_ij(h) = P[X(h) = j | X(0) = i] = P_ij P[T_i ≤ h] + o(h)

Since T_i is exponential, P[T_i ≤ h] = ν_i h + o(h). Then

   P_ij(h) = ν_i P_ij h + o(h) = q_ij h + o(h)

Divide by h and take the limit. ∎

Chapman-Kolmogorov equations

Theorem
For all times s and t, the transition probability functions P_ij(t + s) are obtained from P_ik(t) and P_kj(s) as

   P_ij(t + s) = Σ_{k=0}^∞ P_ik(t) P_kj(s)

As for discrete-time MCs, to go from i to j in time t + s:
Go from i to a certain k at time t ⇒ P_ik(t)
In the remaining time s go from k to j ⇒ P_kj(s)
Sum over all possible intermediate states k.
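In matrix form the theorem says P(t + s) = P(t)P(s), which is easy to verify numerically once P(t) = e^{Rt} is available (established later in these slides). A sketch with a hypothetical 3-state generator, using a self-contained power-series matrix exponential:

```python
import numpy as np

def expm(A, terms=60):
    """Matrix exponential via power series (fine for small matrices)."""
    out, term = np.eye(len(A)), np.eye(len(A))
    for n in range(1, terms):
        term = term @ A / n
        out = out + term
    return out

# Hypothetical 3-state generator: off-diagonals are q_ij >= 0, rows sum to zero
R = np.array([[-3.0, 2.0, 1.0],
              [1.0, -1.5, 0.5],
              [0.5, 0.5, -1.0]])

t, s = 0.7, 1.3
lhs = expm(R * (t + s))              # P(t + s)
rhs = expm(R * t) @ expm(R * s)      # entries are sum_k P_ik(t) P_kj(s)
gap = np.abs(lhs - rhs).max()        # should be numerically zero
```

The rows of P(t + s) also sum to one, as any transition probability matrix must.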

Chapman-Kolmogorov equations (proof)

Proof.

   P_ij(t + s) = P[X(t + s) = j | X(0) = i]                                   (definition of P_ij(t + s))
              = Σ_{k=0}^∞ P[X(t + s) = j | X(t) = k, X(0) = i] P[X(t) = k | X(0) = i]
                                                                              (sum of conditional probabilities)
              = Σ_{k=0}^∞ P[X(t + s) = j | X(t) = k] P_ik(t)                  (memoryless property of CTMC and definition of P_ik(t))
              = Σ_{k=0}^∞ P[X(s) = j | X(0) = k] P_ik(t)                      (time invariance / homogeneity)
              = Σ_{k=0}^∞ P_kj(s) P_ik(t)                                     (definition of P_kj(s))  ∎

Combining both results

Let us combine the last two results to express P_ij(t + h).
Use Chapman-Kolmogorov's equations splitting at time t (jump 0 → t → t + h):

   P_ij(t + h) = Σ_{k=0}^∞ P_ik(t) P_kj(h) = P_ij(t) P_jj(h) + Σ_{k≠j} P_ik(t) P_kj(h)

Use the infinitesimal time expressions:

   P_ij(t + h) = P_ij(t)(1 − ν_j h) + Σ_{k≠j} P_ik(t) q_kj h + o(h)

Subtract P_ij(t) from both sides and divide by h:

   [P_ij(t + h) − P_ij(t)]/h = −ν_j P_ij(t) + Σ_{k≠j} P_ik(t) q_kj + o(h)/h

The left-hand side is a difference quotient. Let h → 0.

Kolmogorov's forward equations

Theorem
The transition probability functions P_ij(t) of a CTMC satisfy the system of differential equations (for all pairs i, j)

   ∂P_ij(t)/∂t = Σ_{k≠j} q_kj P_ik(t) − ν_j P_ij(t)

∂P_ij(t)/∂t = rate of change of P_ij(t)
q_kj P_ik(t) = (probability of transition into k during [0, t]) × (rate of moving from k into j in the next instant)
ν_j P_ij(t) = (probability of transition into j during [0, t]) × (rate of leaving j in the next instant)
Change in P_ij(t) = Σ_k (moving into j from k) − (leaving j)
Kolmogorov's forward equations are valid in most cases, but not always.
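The forward equations can be integrated numerically. A minimal sketch for a hypothetical two-state chain: Euler-step P'(t) = P(t)R from P(0) = I and compare P_00(t) with the standard closed form for the two-state chain (derived a few slides ahead):

```python
import numpy as np

q01, q10 = 2.0, 1.0            # hypothetical transition rates
R = np.array([[-q01, q01],
              [q10, -q10]])    # r_ij = q_ij off-diagonal, r_ii = -nu_i

# Euler-integrate the forward equations P'(t) = P(t) R from P(0) = I
P = np.eye(2)
dt, T = 1e-4, 2.0
for _ in range(int(round(T / dt))):
    P = P + (P @ R) * dt

# Standard closed form for P_00(t) of the two-state chain
s = q01 + q10
p00_exact = q10/s + (q01/s) * np.exp(-s * T)
err = abs(P[0, 0] - p00_exact)
```

Because the rows of R sum to zero, the Euler iteration preserves row sums exactly, so P(t) remains a proper transition probability matrix at every step.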

Kolmogorov's backward equations

For the forward equations we used Chapman-Kolmogorov's equations splitting at time t.
For the backward equations we split at time h (jump 0 → h → t + h) to express P_ij(t + h) as

   P_ij(t + h) = Σ_{k=0}^∞ P_ik(h) P_kj(t) = P_ii(h) P_ij(t) + Σ_{k≠i} P_ik(h) P_kj(t)

Use the infinitesimal time expressions:

   P_ij(t + h) = (1 − ν_i h) P_ij(t) + Σ_{k≠i} q_ik h P_kj(t) + o(h)

Subtract P_ij(t) from both sides and divide by h:

   [P_ij(t + h) − P_ij(t)]/h = −ν_i P_ij(t) + Σ_{k≠i} q_ik P_kj(t) + o(h)/h

Kolmogorov's backward equations

Theorem
The transition probability functions P_ij(t) of a CTMC satisfy the system of differential equations (for all pairs i, j)

   ∂P_ij(t)/∂t = Σ_{k≠i} q_ik P_kj(t) − ν_i P_ij(t)

ν_i P_ij(t) = (probability of transition into j during [h, t]) × (rate of leaving i in the initial instant; discount paths that leave i right away)
q_ik P_kj(t) = (rate of transition into k during [0, h]) × (probability of transition from k into j during [h, t])
Forward equations ⇒ change in P_ij(t) if we finish h later.
Backward equations ⇒ change in P_ij(t) if we start h later.
Where the process goes (forward) vs. where the process comes from (backward).

Determination of transition probability function

Exponential random variables
Counting processes and definition of Poisson processes
Properties of Poisson processes
Continuous time Markov chains
Transition probability function
Determination of transition probability function
Limit probabilities

A CTMC with two states

The simplest possible CTMC has only two states, say 0 and 1.
Transition rates are q_01 and q_10.
Given the transition rates we can find the rates of transition out of each state:

   ν_0 = Σ_{j≠0} q_0j = q_01,    ν_1 = Σ_{j≠1} q_1j = q_10

Use Kolmogorov's equations to find the transition probability functions P_00(t), P_01(t), P_10(t), P_11(t).
Note that transition probabilities out of each state sum up to one:

   P_00(t) + P_01(t) = 1,    P_10(t) + P_11(t) = 1

Kolmogorov's forward equations

Kolmogorov's forward equations (process runs a little longer):

   P'_ij(t) = Σ_{k≠j} q_kj P_ik(t) − ν_j P_ij(t)

For the two-state CTMC:

   P'_00(t) = q_10 P_01(t) − ν_0 P_00(t),    P'_01(t) = q_01 P_00(t) − ν_1 P_01(t)
   P'_10(t) = q_10 P_11(t) − ν_0 P_10(t),    P'_11(t) = q_01 P_10(t) − ν_1 P_11(t)

Probabilities out of 0 sum up to 1 ⇒ the two equations in the first row are equivalent.
Probabilities out of 1 sum up to 1 ⇒ the two equations in the second row are equivalent.
Pick the equations for P_00(t) and P_11(t).

Solution of Kolmogorov's equations

Use:
Relation between transition rates: ν_0 = q_01 and ν_1 = q_10.
Probabilities sum to 1: P_01(t) = 1 − P_00(t) and P_10(t) = 1 − P_11(t).

   P'_00(t) = q_10 [1 − P_00(t)] − q_01 P_00(t) = q_10 − (q_10 + q_01) P_00(t)
   P'_11(t) = q_01 [1 − P_11(t)] − q_10 P_11(t) = q_01 − (q_10 + q_01) P_11(t)

One can obtain the exact same pair of equations from the backward equations.
These are first-order linear differential equations. Solutions are exponential.
For P_00(t) propose the candidate solution (just take the derivative to check)

   P_00(t) = q_10/(q_10 + q_01) + c e^{−(q_10+q_01)t}

To determine c use the initial condition P_00(0) = 1.

Solution of Kolmogorov's equations (continued)

Evaluating the candidate solution at the initial condition P_00(0) = 1 yields

   1 = q_10/(q_10 + q_01) + c  ⇒  c = q_01/(q_10 + q_01)

Finally, the transition probability function P_00(t):

   P_00(t) = q_10/(q_10 + q_01) + [q_01/(q_10 + q_01)] e^{−(q_10+q_01)t}

Repeat for P_11(t). Same exponent, different constants:

   P_11(t) = q_01/(q_10 + q_01) + [q_10/(q_10 + q_01)] e^{−(q_10+q_01)t}

As time goes to infinity, P_00(t) and P_11(t) converge exponentially to their limits.
The convergence rate depends on the magnitude of q_10 + q_01.
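These closed forms can be cross-checked against the matrix-exponential solution P(t) = e^{Rt} introduced below. A sketch with hypothetical rates, using a self-contained power-series matrix exponential:

```python
import numpy as np

def expm(A, terms=60):
    """Matrix exponential via power series."""
    out, term = np.eye(len(A)), np.eye(len(A))
    for n in range(1, terms):
        term = term @ A / n
        out = out + term
    return out

q01, q10 = 3.0, 1.0            # hypothetical transition rates
R = np.array([[-q01, q01],
              [q10, -q10]])

t = 0.8
P = expm(R * t)                # numerical P(t)
s = q01 + q10
p00 = q10/s + (q01/s) * np.exp(-s * t)   # closed-form P_00(t)
p11 = q01/s + (q10/s) * np.exp(-s * t)   # closed-form P_11(t)
```

The diagonal entries of the numerical P(t) should agree with the closed forms to machine precision.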

Convergence of transition probabilities

Recall P_01 = 1 − P_00 and P_10 = 1 − P_11. Steady state distributions:

   lim_{t→∞} P_00(t) = q_10/(q_10 + q_01),    lim_{t→∞} P_01(t) = q_01/(q_10 + q_01)
   lim_{t→∞} P_11(t) = q_01/(q_10 + q_01),    lim_{t→∞} P_10(t) = q_10/(q_10 + q_01)

The limit distribution exists and is independent of the initial condition (compare across the diagonals).

Kolmogorov's forward equations in matrix form

Restrict attention to finite CTMCs with N states.
Define the matrix R ∈ R^{N×N} with elements r_ij = q_ij for i ≠ j and r_ii = −ν_i.
Rewrite Kolmogorov's forward equations as (process runs a little longer)

   P'_ij(t) = Σ_{k≠j} q_kj P_ik(t) − ν_j P_ij(t) = Σ_{k=1}^N r_kj P_ik(t)

The right-hand side defines the elements of a matrix product: P'_ij(t) is the (i, j) entry of P(t)R. In matrix form,

   P'(t) = P(t) R

Kolmogorov's backward equations in matrix form

Similarly, Kolmogorov's backward equations (process starts a little later):

   P'_ij(t) = Σ_{k≠i} q_ik P_kj(t) − ν_i P_ij(t) = Σ_{k=1}^N r_ik P_kj(t)

The right-hand side also defines a matrix product: P'_ij(t) is the (i, j) entry of RP(t). In matrix form,

   P'(t) = R P(t)

Kolmogorov's equations in matrix form

Matrix form of Kolmogorov's forward equations ⇒ P'(t) = P(t) R
Matrix form of Kolmogorov's backward equations ⇒ P'(t) = R P(t)
More similar than they appear, but not equivalent, because the matrix product is not commutative.
Notwithstanding, both equations have to accept the same solution.

Matrix exponential

Kolmogorov's equations are first-order linear differential equations.
They are coupled, though: P'_ij(t) depends on P_kj(t) for all k.
They still accept an exponential solution. Need to define the matrix exponential.

Definition
The matrix exponential e^{At} of the matrix At is defined as the series

   e^{At} = Σ_{n=0}^∞ (At)^n / n! = I + At + (At)²/2! + (At)³/3! + ...

Derivative of the matrix exponential with respect to t:

   ∂e^{At}/∂t = 0 + A + A²t + A³t²/2! + ... = A (I + At + (At)²/2! + ...) = A e^{At}

Putting A on the right side of the product shows that ∂e^{At}/∂t = e^{At} A.
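The series definition and the derivative identity are easy to sanity-check numerically. A sketch with a hypothetical test matrix A = [[0, 1], [−1, 0]], whose exponential is the rotation matrix [[cos t, sin t], [−sin t, cos t]]; the derivative is approximated by a central difference:

```python
import numpy as np

def expm(A, terms=60):
    """e^{A} = sum_n A^n / n!, truncated; adequate for small ||A||."""
    out, term = np.eye(len(A)), np.eye(len(A))
    for n in range(1, terms):
        term = term @ A / n
        out = out + term
    return out

A = np.array([[0.0, 1.0],
              [-1.0, 0.0]])    # hypothetical test matrix (rotation generator)

t, h = 1.0, 1e-6
deriv = (expm(A*(t+h)) - expm(A*(t-h))) / (2*h)  # d/dt e^{At}, numerically
left = A @ expm(A * t)                           # A e^{At}
right = expm(A * t) @ A                          # e^{At} A
```

Both products agree with the numerical derivative, and the series reproduces cos t on the diagonal.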

Solution of Kolmogorov's equations

Propose a solution of the form P(t) = e^{Rt}.
P(t) solves the backward equations. The derivative with respect to time is

   ∂P(t)/∂t = ∂e^{Rt}/∂t = R e^{Rt} = R P(t)

It also solves the forward equations:

   ∂P(t)/∂t = ∂e^{Rt}/∂t = e^{Rt} R = P(t) R

Also notice that P(0) = I, as it should be (P_ii(0) = 1 and P_ij(0) = 0).

Unconditional probabilities

P(t) contains the transition probabilities from states at time 0 to states at time t.
Define the unconditional probabilities at time t, p_j(t) := P[X(t) = j].
Group them in the vector p(t) = [p_1(t), p_2(t), ..., p_j(t), ...]^T.
Given an initial distribution p(0), find p_j(t) by conditioning on the initial state:

   p_j(t) = Σ_{i=0}^∞ P[X(t) = j | X(0) = i] P[X(0) = i] = Σ_{i=0}^∞ P_ij(t) p_i(0)

Using matrix-vector notation: p(t) = P^T(t) p(0).
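Putting the pieces together, the unconditional distribution at time t is p(t) = P^T(t) p(0) with P(t) = e^{Rt}. A sketch for a hypothetical symmetric two-state chain started deterministically in state 0; the distribution should relax toward (1/2, 1/2):

```python
import numpy as np

def expm(A, terms=80):
    """Matrix exponential via power series."""
    out, term = np.eye(len(A)), np.eye(len(A))
    for n in range(1, terms):
        term = term @ A / n
        out = out + term
    return out

q01, q10 = 2.0, 2.0            # hypothetical symmetric rates
R = np.array([[-q01, q01],
              [q10, -q10]])

p0 = np.array([1.0, 0.0])      # start in state 0 with probability 1
t = 2.0
p_t = expm(R * t).T @ p0       # p(t) = P(t)^T p(0)
```

After a few multiples of 1/(q_01 + q_10), p(t) is essentially the uniform limit distribution.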

Limit probabilities

Exponential random variables
Counting processes and definition of Poisson processes
Properties of Poisson processes
Continuous time Markov chains
Transition probability function
Determination of transition probability function
Limit probabilities

Recurrent and transient states

Recall the embedded discrete-time MC associated with any CTMC.
The transition probabilities of the embedded MC form the matrix P of the CTMC.
States do not transition into themselves (P_ii = 0, P's diagonal is null).
States i, j communicate in the CTMC if i, j communicate in the embedded MC.
Communication partitions the embedded MC into classes ⇒ induces a CTMC partition.
The CTMC is irreducible if the embedded MC contains a single class.
State i is recurrent if i is recurrent in the embedded MC.
State i is transient if i is transient in the embedded MC.
State i is positive recurrent if i is so in the embedded MC.
Transience and recurrence are shared by elements of a MC class ⇒ transience and recurrence are class properties of the CTMC.
Periodicity is not possible in CTMCs.

Limit probabilities

Theorem
Consider an irreducible, positive recurrent CTMC with transition rates ν_i and q_ij. Then lim_{t→∞} P_ij(t) exists and is independent of the initial state i, i.e.,

   P_j = lim_{t→∞} P_ij(t) exists for all i

Furthermore, the steady state probabilities P_j ≥ 0 are the unique nonnegative solution of the system of linear equations

   ν_j P_j = Σ_{k≠j} q_kj P_k,    Σ_{j=0}^∞ P_j = 1

The limit distribution exists and is independent of the initial condition.
It is obtained as the solution of a system of linear equations.
Analogous to discrete-time MCs; the algebraic equations are slightly different.

Algebraic relation to determine limit probabilities

As with discrete-time MCs, the difficult part is to prove that P_j = lim_{t→∞} P_ij(t) exists.
The algebraic relations are obtained from Kolmogorov's forward equations:

   ∂P_ij(t)/∂t = Σ_{k≠j} q_kj P_ik(t) − ν_j P_ij(t)

If the limit distribution exists we have, independent of the initial state i,

   lim_{t→∞} ∂P_ij(t)/∂t = 0,    lim_{t→∞} P_ij(t) = P_j

Taking the limit of Kolmogorov's forward equation then yields

   0 = Σ_{k≠j} q_kj P_k − ν_j P_j

Reordering terms, the limit distribution equations follow.

Example: Two state CTMC

Simplest CTMC, with two states 0 and 1 and transition rates q_01 and q_10.
From the transition rates, ν_0 = q_01 and ν_1 = q_10.
Stationary distribution equations:

   ν_0 P_0 = q_10 P_1  ⇒  q_01 P_0 = q_10 P_1
   ν_1 P_1 = q_01 P_0  ⇒  q_10 P_1 = q_01 P_0
   P_0 + P_1 = 1

The two balance equations coincide; with the normalization, the solution yields

   P_0 = q_10/(q_10 + q_01),    P_1 = q_01/(q_10 + q_01)

Larger rate q_10 of entering 0 ⇒ larger probability P_0 of being at 0.
Larger rate q_01 of entering 1 ⇒ larger probability P_1 of being at 1.
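For a finite chain, one balance equation plus the normalization constraint form a small linear system that can be solved directly. A sketch for the two-state example with hypothetical rates:

```python
import numpy as np

q01, q10 = 4.0, 1.0      # hypothetical transition rates

# Balance equation q01*P0 - q10*P1 = 0, normalization P0 + P1 = 1
A = np.array([[q01, -q10],
              [1.0, 1.0]])
b = np.array([0.0, 1.0])
P0, P1 = np.linalg.solve(A, b)
```

The solution matches the closed form: P_0 = q_10/(q_10 + q_01) and P_1 = q_01/(q_10 + q_01).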

Ergodicity

Consider the fraction T_i(t) of time spent in state i up to time t:

   T_i(t) := (1/t) ∫_0^t 1{X(s) = i} ds

T_i(t) is a time (ergodic) average. Its limit lim_{t→∞} T_i(t) is the ergodic limit.
If the CTMC is irreducible and positive recurrent, ergodicity holds:

   P_i = lim_{t→∞} T_i(t) = lim_{t→∞} (1/t) ∫_0^t 1{X(s) = i} ds    a.s.

The ergodic limit coincides with the limit probabilities (almost surely).
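Ergodicity can be illustrated by simulating one long trajectory of the two-state chain and measuring the fraction of time spent in state 0. A sketch with hypothetical rates; the time average should approach the limit probability P_0 = q_10/(q_01 + q_10):

```python
import random

random.seed(3)
q01, q10 = 1.0, 2.0      # hypothetical transition rates
T = 100_000.0            # simulation horizon

x, t, time_in_0 = 0, 0.0, 0.0
while t < T:
    rate = q01 if x == 0 else q10      # holding rate in the current state
    dt = random.expovariate(rate)
    dt = min(dt, T - t)                # clip the last holding time at the horizon
    if x == 0:
        time_in_0 += dt
    t += dt
    x = 1 - x                          # two states: always jump to the other one

T0 = time_in_0 / T       # ergodic time average of 1{X(s) = 0}
```

A single long sample path suffices: no averaging over independent runs is needed, which is exactly what the almost-sure ergodic statement asserts.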

Ergodicity (continued)

Consider a function f(i) associated with state i. We can write f(X(t)) as

   f(X(t)) = Σ_{i=1}^∞ f(i) 1{X(t) = i}

Consider the time average of the function of the state, f(X(t)):

   lim_{t→∞} (1/t) ∫_0^t f(X(s)) ds = lim_{t→∞} (1/t) ∫_0^t Σ_{i=1}^∞ f(i) 1{X(s) = i} ds

Interchange the summation with the integral and the limit to conclude

   lim_{t→∞} (1/t) ∫_0^t f(X(s)) ds = Σ_{i=1}^∞ f(i) lim_{t→∞} (1/t) ∫_0^t 1{X(s) = i} ds = Σ_{i=1}^∞ f(i) P_i

The function's ergodic limit equals the function's limit average value.

Limit distribution equations as balance equations

Reconsider the limit distribution equations:

   ν_j P_j = Σ_{k≠j} q_kj P_k

P_j = fraction of time spent in state j
ν_j = rate of transition out of state j given the CTMC is in state j
ν_j P_j = rate of transition out of state j (unconditional)
q_kj = rate of transition from k to j given the CTMC is in state k
q_kj P_k = rate of transition from k to j (unconditional)
Σ_{k≠j} q_kj P_k = rate of transition into j, from all states
Rate of transition out of state j = rate of transition into j.
Balance equations ⇒ balance the number of transitions in and out of state j.

Limit distribution for birth and death process

Births and deaths occur at state-dependent rates. When X(t) = i:
Births ⇒ individuals added at exponential times with mean 1/λ_i; birth rate = upward transition rate = q_i,i+1 = λ_i
Deaths ⇒ individuals removed at exponential times with mean 1/µ_i; death rate = downward transition rate = q_i,i−1 = µ_i
Rates of transition out of each state: ν_i = λ_i + µ_i

[Chain diagram: 0, 1, ..., i−1, i, i+1, ... with rightward arrows at rates λ_0, ..., λ_i, ... and leftward arrows at rates µ_1, ..., µ_{i+1}, ...]

Limit distribution / balance equations: rate out of j = rate into j

   (λ_i + µ_i) P_i = λ_{i−1} P_{i−1} + µ_{i+1} P_{i+1},    λ_0 P_0 = µ_1 P_1

Finding solution of balance equations

Start by expressing all probabilities in terms of P_0, telescoping through the balance equations.
Equation for P_0:

   λ_0 P_0 = µ_1 P_1

Sum it with the equation for P_1, (λ_1 + µ_1) P_1 = λ_0 P_0 + µ_2 P_2, to get

   λ_1 P_1 = µ_2 P_2

Sum the result with the equation for P_2, (λ_2 + µ_2) P_2 = λ_1 P_1 + µ_3 P_3, to get

   λ_2 P_2 = µ_3 P_3

In general, summing the running identity λ_{i−1} P_{i−1} = µ_i P_i with the equation for P_i, (λ_i + µ_i) P_i = λ_{i−1} P_{i−1} + µ_{i+1} P_{i+1}, yields

   λ_i P_i = µ_{i+1} P_{i+1}

Finding solution of balance equations (continued)

Recursive substitution in the equations λ_i P_i = µ_{i+1} P_{i+1}:

   P_1 = (λ_0/µ_1) P_0
   P_2 = (λ_1/µ_2) P_1 = (λ_1 λ_0)/(µ_2 µ_1) P_0
   P_3 = (λ_2/µ_3) P_2 = (λ_2 λ_1 λ_0)/(µ_3 µ_2 µ_1) P_0
   ...
   P_{i+1} = (λ_i/µ_{i+1}) P_i = (λ_i λ_{i−1} ... λ_0)/(µ_{i+1} µ_i ... µ_1) P_0

To find P_0 use Σ_{i=0}^∞ P_i = 1:

   1 = P_0 + Σ_{i=0}^∞ (λ_i λ_{i−1} ... λ_0)/(µ_{i+1} µ_i ... µ_1) P_0
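The product formula can be evaluated numerically by accumulating the weights λ_0…λ_{i−1} / µ_1…µ_i and normalizing. A sketch for the M/M/1 special case (λ_i = λ, µ_i = µ), where the formula collapses to the geometric distribution P_i = (1 − ρ)ρ^i with ρ = λ/µ; the infinite sum is truncated at a hypothetical level N where the tail is negligible:

```python
# Stationary distribution of a birth-death chain via the product formula:
# P_i = (lam_0 ... lam_{i-1}) / (mu_1 ... mu_i) * P_0, then normalize.
lam, mu = 1.0, 2.0      # M/M/1 rates (lam < mu so the normalizing sum converges)
N = 200                 # truncation level; tail is ~rho^N, negligible here

weights = [1.0]         # weight of state 0 relative to P_0
for i in range(1, N):
    weights.append(weights[-1] * lam / mu)   # multiply by lam_{i-1}/mu_i

Z = sum(weights)                 # approximates 1/P_0
P = [w / Z for w in weights]     # normalized stationary probabilities

rho = lam / mu                   # M/M/1 prediction: P_i = (1 - rho) * rho**i
```

For λ = 1, µ = 2 this gives P_0 = 1/2, matching the fraction of empty-queue time observed in the earlier M/M/1 simulation.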