Continuous time Markov chains
Continuous time Markov chains
Alejandro Ribeiro
Dept. of Electrical and Systems Engineering, University of Pennsylvania
October 16, 2017
Stoch. Systems Analysis
Outline
Exponential random variables
Counting processes and definition of Poisson processes
Properties of Poisson processes
Continuous time Markov chains
Transition probability function
Determination of transition probability function
Limit probabilities
Exponential distribution
Exponential RVs are used to model times at which events occur, or in general the time elapsed between occurrences of random events. A RV T ~ exp(λ) is exponential with parameter λ if its pdf is
  f_T(t) = λe^{-λt}, for all t ≥ 0
The cdf, the integral of the pdf, is (t ≥ 0)
  F_T(t) = 1 - e^{-λt}
The complementary cdf (ccdf) is
  P(T > t) = 1 - F_T(t) = e^{-λt}
[Figure: pdf and cdf of an exponential RV]
Expected value
The expected value of time T ~ exp(λ) is
  E[T] = ∫_0^∞ t λe^{-λt} dt = [-te^{-λt}]_0^∞ + ∫_0^∞ e^{-λt} dt = 1/λ
Integrated by parts with u = t, dv = λe^{-λt} dt. The mean time is the inverse of the parameter λ:
λ is the rate/frequency of events happening at intervals T
On average, λt events occur in time t
Bigger λ ⇒ smaller expected times ⇒ larger frequency of events
[Figure: timeline with events at times S_1, S_2, ...; roughly 5 events by t = 5/λ and 10 events by t = 10/λ]
Second moment and variance
For the second moment also integrate by parts (u = t², dv = λe^{-λt} dt)
  E[T²] = ∫_0^∞ t² λe^{-λt} dt = [-t²e^{-λt}]_0^∞ + ∫_0^∞ 2te^{-λt} dt
The first term is 0; the second is, except for a constant, the integral computed before
  E[T²] = (2/λ) ∫_0^∞ t λe^{-λt} dt = 2/λ²
The variance is computed from the first and second moments
  var[T] = E[T²] - E[T]² = 2/λ² - 1/λ² = 1/λ²
The parameter λ controls both the mean and the variance of an exponential RV
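The mean and variance formulas above can be checked with a short Monte Carlo sketch; λ = 2 and the sample size are arbitrary illustration choices.

```python
import random

# Monte Carlo check of E[T] = 1/λ and var[T] = 1/λ² for T ~ exp(λ).
random.seed(1)
lam = 2.0
n = 200_000
samples = [random.expovariate(lam) for _ in range(n)]

mean = sum(samples) / n
var = sum((t - mean) ** 2 for t in samples) / n
# mean should be close to 1/λ = 0.5 and var close to 1/λ² = 0.25
```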
Memoryless random times
Consider a random time T. We say time T is memoryless if
  P[T > s + t | T > t] = P[T > s]
The probability of waiting s extra units of time, given that we have already waited t units, is just the probability of waiting s units. The system does not remember it has already waited t units: same probability irrespective of the time already elapsed.
Ex: Chemical reaction A + B → AB occurs when molecules A and B collide. A, B move around randomly. Time T until reaction.
Ex: Group of molecules of type A and type B. Reaction occurs when any type A encounters any type B. Time T until next reaction.
Exponential RVs are memoryless
Write the memoryless property in terms of the joint distribution
  P[T > s + t | T > t] = P[T > s + t, T > t] / P[T > t] = P[T > s]
Notice that having T > s + t and T > t is equivalent to T > s + t. Replace P[T > s + t, T > t] = P[T > s + t] and reorder terms
  P[T > s + t] = P[T > t] P[T > s]
If T is exponentially distributed the ccdf is P[T > t] = e^{-λt}. Then
  P[T > s + t] = e^{-λ(s+t)} = e^{-λt} e^{-λs} = P[T > t] P[T > s]
If random time T is exponential ⇒ T is memoryless
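The memoryless property can also be observed empirically: the conditional probability of surviving s extra units matches the unconditional P[T > s]. The values λ = 1, t = 1, s = 0.5 below are arbitrary.

```python
import random

# Empirical check that P[T > s+t | T > t] = P[T > s] for exponential T.
random.seed(2)
lam, t, s = 1.0, 1.0, 0.5
samples = [random.expovariate(lam) for _ in range(500_000)]

survivors = [x for x in samples if x > t]                  # condition on T > t
cond_prob = sum(x > t + s for x in survivors) / len(survivors)
uncond_prob = sum(x > s for x in samples) / len(samples)   # P[T > s] = e^{-λs} ≈ 0.6065
```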
Continuous memoryless RVs are exponential
Consider a function g(t) with the property g(t + s) = g(t)g(s). What is the functional form of g(t)? Take logarithms
  log g(t + s) = log g(t) + log g(s)
This can only be true for all t and s if log g(t) = ct for some constant c, which in turn can only be true if g(t) = e^{ct} for some constant c.
Compare this observation with the statement of the memoryless property
  P[T > s + t] = P[T > t] P[T > s]
It must be that P[T > t] = e^{ct} for some constant c (with c < 0, so probabilities vanish as t grows). If T is continuous this can only be true for exponential T. If T is discrete it is geometric: P[T > t] = (1 - p)^t with (1 - p) = e^c.
If a continuous random time T is memoryless ⇒ T is exponential
Combining results
Theorem: A continuous random variable T is memoryless if and only if it is exponentially distributed. That is,
  P[T > s + t | T > t] = P[T > s]
if and only if f_T(t) = λe^{-λt} for some λ > 0.
Exponential RVs are memoryless: they do not remember elapsed time. They are the only type of continuous memoryless RV.
A discrete RV T is memoryless if and only if it is geometric. Geometrics are discrete approximations of exponentials; exponentials are continuous limits of geometrics.
Exponential = time until success. Geometric = nr. of trials until success.
Time to first event
Independent exponential RVs T_1, T_2 with parameters λ_1, λ_2. Probability distribution of the first event, i.e., T := min(T_1, T_2)?
For T > t we need both T_1 > t and T_2 > t. Using independence of T_1 and T_2 we can write
  P[T > t] = P[T_1 > t] P[T_2 > t] = (1 - F_{T_1}(t))(1 - F_{T_2}(t))
Substituting the expressions of the exponential cdfs
  P[T > t] = e^{-λ_1 t} e^{-λ_2 t} = e^{-(λ_1+λ_2)t}
T is exponentially distributed with parameter λ_1 + λ_2: the minimum of exponential variables is exponential.
Given two events happening at random exponential times (T_1, T_2), the time to either of them happening is also exponential.
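A quick simulation sketch of the claim that min(T_1, T_2) is exponential with rate λ_1 + λ_2, comparing an empirical ccdf value with e^{-(λ_1+λ_2)t}. Rates and the evaluation point are arbitrary.

```python
import math
import random

# min of two independent exponentials should be exp(λ1 + λ2).
random.seed(3)
l1, l2 = 1.0, 2.0
n = 300_000
mins = [min(random.expovariate(l1), random.expovariate(l2)) for _ in range(n)]

t = 0.4
emp_ccdf = sum(m > t for m in mins) / n
theory = math.exp(-(l1 + l2) * t)   # ccdf of exp(3) at t = 0.4
emp_mean = sum(mins) / n            # should be ≈ 1/(λ1+λ2) = 1/3
```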
First event to happen
Probability P[T_1 < T_2] of T_1 ~ exp(λ_1) happening before T_2 ~ exp(λ_2)? Condition on T_2 = t and integrate over the pdf of T_2
  P[T_1 < T_2] = ∫_0^∞ P[T_1 < t | T_2 = t] f_{T_2}(t) dt = ∫_0^∞ F_{T_1}(t) f_{T_2}(t) dt
Substitute the expressions for the exponential pdf and cdf
  P[T_1 < T_2] = ∫_0^∞ (1 - e^{-λ_1 t}) λ_2 e^{-λ_2 t} dt = λ_1 / (λ_1 + λ_2)
Either T_1 comes before T_2 or vice versa, then
  P[T_2 < T_1] = 1 - P[T_1 < T_2] = λ_2 / (λ_1 + λ_2)
Probabilities are relative values of the parameters. Larger parameter ⇒ smaller average ⇒ larger prob. of happening first.
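The race probability λ_1/(λ_1 + λ_2) can be estimated directly; λ_1 = 1, λ_2 = 3 are arbitrary illustration rates.

```python
import random

# Monte Carlo estimate of P[T1 < T2] for independent exponentials.
random.seed(4)
l1, l2 = 1.0, 3.0
n = 400_000
wins = sum(random.expovariate(l1) < random.expovariate(l2) for _ in range(n))
p_first = wins / n   # theory: λ1/(λ1+λ2) = 1/4
```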
Probability of event in infinitesimal time
Probability of an event happening in infinitesimal time h? Want P[T < h] for small h
  P[T < h] = ∫_0^h λe^{-λt} dt = 1 - e^{-λh} ≈ λh
Sometimes we also write P[T < h] = λh + o(h), where o(h) denotes a quantity with
  lim_{h→0} o(h)/h = 0
Read o(h) as "negligible with respect to h".
Two independent events in infinitesimal time h?
  P[T_1 ≤ h, T_2 ≤ h] ≈ (λ_1 h)(λ_2 h) = λ_1 λ_2 h² = o(h)
Counting and Poisson processes
Counting processes
A stochastic process N(t) taking integer values 0, 1, ..., i, .... A counting process N(t) counts the number of events occurred by time t:
Nonnegative integer valued: N(0) = 0, N(t) ∈ {0, 1, 2, ...}
Nondecreasing: for s < t it holds N(s) ≤ N(t)
Event counter: N(t) - N(s) = number of events in the interval (s, t]
Continuous on the right
Ex.1: # text messages (SMS) typed since the beginning of class
Ex.2: # economic crises since 1900
Ex.3: # customers at Wawa since 8am this morning
[Figure: staircase sample path n(t) with unit jumps at event times S_1, ..., S_6]
Independent increments
Numbers of events in disjoint time intervals are independent. Consider times s_1 < t_1 < s_2 < t_2 and intervals (s_1, t_1] and (s_2, t_2]:
N(t_1) - N(s_1) events occur in (s_1, t_1], and N(t_2) - N(s_2) in (s_2, t_2]. Independent increments implies the latter two are independent
  P[N(t_1) - N(s_1) = k, N(t_2) - N(s_2) = l] = P[N(t_1) - N(s_1) = k] P[N(t_2) - N(s_2) = l]
Ex.1: Likely true for SMS, except, e.g., for messages that prompt replies
Ex.2: Most likely not true for economic crises (business cycle)
Ex.3: Likely true for Wawa, except for unforeseen events (storms)
This does not mean N(t) is independent of N(s). Those are clearly not independent, since N(t) is at least N(s) for t > s.
Stationary increments
The probability distribution of the number of events depends only on the length of the interval. Consider time intervals (0, t] and (s, s + t]:
N(t) events occur in (0, t], and N(s + t) - N(s) events in (s, s + t]. Stationary increments implies the latter two have the same probability distribution
  P[N(s + t) - N(s) = k] = P[N(t) = k]
Ex.1: Likely true if the lecture is good and you keep interest in the class
Ex.2: Maybe true if you do not believe we are becoming better at preventing economic crises
Ex.3: Most likely not true because of, e.g., rush hours and slow days
Poisson process
A counting process is Poisson if it has the following properties:
(a) The process has stationary and independent increments
(b) The number of events in (0, t] has Poisson distribution with mean λt
  P[N(t) = n] = e^{-λt} (λt)^n / n!
An equivalent definition is the following:
(i) The process has stationary and independent increments
(ii) Prob. of an event in infinitesimal time: P[N(h) = 1] = λh + o(h)
(iii) At most one event in infinitesimal time: P[N(h) > 1] = o(h)
This is a more intuitive definition (even though difficult to believe now). Conditions (i) and (a) are the same. That (b) implies (ii) and (iii) is obvious: just substitute small h in the Poisson pmf's expression for P[N(t) = n]. To see that (ii) and (iii) imply (b) requires some work.
Explanation of model (i)-(iii)
Consider time T and divide the interval (0, T] into n subintervals. Subintervals are of duration h = T/n; h vanishes as n increases. The m-th subinterval spans ((m - 1)h, mh].
Define A_m as the number of events that occur in the m-th subinterval
  A_m = N(mh) - N((m - 1)h)
The total number of events in (0, T] is the sum of the A_m's
  N(T) = Σ_{m=1}^n A_m = Σ_{m=1}^n [N(mh) - N((m - 1)h)]
[Figure: (0, T] split into 8 subintervals of width h, with 5 events; N(T) = 5, where A_1, A_2, A_4, A_7, A_8 are 1 and A_3, A_5, A_6 are 0]
Probability distribution of A_m (intuitive argument)
Note first that since increments are stationary as per (i), it holds
  P[A_m = k] = P[N(mh) - N((m - 1)h) = k] = P[N(h) = k]
In particular, using (ii) and (iii)
  P[A_m = 1] = P[N(h) = 1] = λh + o(h)
  P[A_m > 1] = P[N(h) > 1] = o(h)
Set aside the o(h) probabilities ⇒ they are negligible with respect to λh
  P[A_m = 1] ≈ λh,   P[A_m = 0] ≈ 1 - λh
A_m is Bernoulli with parameter λh
Probability distribution of N(T) (intuitive argument)
Since increments are also independent as per (i), the A_m are independent.
N(T) is a sum of n independent Bernoulli RVs with parameter λh
⇒ N(T) is binomial with parameters (n, λh) = (n, λT/n)
As the interval length h → 0, the number of intervals n → ∞, while the product n(λh) = λT stays constant
⇒ N(T) is Poisson with parameter λT
Then (ii)-(iii) imply (b) and the definitions are equivalent. Not a proof, because we neglected the o(h) terms, but it explains what a Poisson process is.
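The binomial-to-Poisson limit can be seen numerically: for large n the binomial(n, λT/n) pmf is already very close to the Poisson(λT) pmf. The values λT = 3 and n = 10,000 are arbitrary.

```python
import math

# Compare binomial(n, λT/n) against Poisson(λT) pointwise.
lamT = 3.0
n = 10_000
p = lamT / n

def binom_pmf(k):
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

def poisson_pmf(k):
    return math.exp(-lamT) * lamT**k / math.factorial(k)

max_diff = max(abs(binom_pmf(k) - poisson_pmf(k)) for k in range(10))
```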
What is a Poisson process?
Events happen in a small interval h with probability λh, proportional to h. Whether an event happens in an interval has no effect on other intervals.
Modeling questions: Do you expect the probability of an event to be proportional to the length of the interval? Do you expect subsequent intervals to behave independently? Then a Poisson process model is appropriate.
Poisson processes typically arise in a large population of agents acting independently:
The larger the interval, the larger the chance an agent takes an action
The action of one agent has no effect on the actions of other agents, and has therefore a negligible effect on the action of the group
Examples of Poisson processes
Ex.1: Number of people arriving at a subway station. Number of cars arriving at a highway entrance. Number of bids in an auction. Number of customers entering a store. A large number of agents (people, drivers, bidders, customers) acting independently.
Ex.2: SMS generated by all students in the class. Once you send an SMS you are likely to stay silent for a while, but in a large population this has a minimal effect on the probability of someone generating an SMS.
Ex.3: Count of molecule reactions. Molecules are removed from the pool of reactants once they react, but the effect is negligible in a large population. Eventually reactants are depleted, but on a small time scale the process is approximately Poisson.
Formal argument
Define A_max = max_{m=1,...,n} (A_m), the maximum nr. of events in one interval. If A_max ≤ 1, all intervals have 0 or 1 events ⇒ easy (binomial).
Consider the conditional probability P[N(T) = k | A_max ≤ 1]. For given h, N(T) conditioned on A_max ≤ 1 is binomial with parameters n = T/h and p = λh + o(h).
As the interval length h → 0, the parameter p → 0 and the nr. of intervals n → ∞, while the product np satisfies
  lim_{h→0} np = lim_{h→0} (T/h)(λh + o(h)) = λT
⇒ N(T) conditioned on A_max ≤ 1 is Poisson with parameter λT
  P[N(T) = k | A_max ≤ 1] → e^{-λT} (λT)^k / k!
Formal argument (continued)
Separate the study into A_max ≤ 1 and A_max > 1. That is, condition
  P[N(T) = k] = P[N(T) = k | A_max ≤ 1] P[A_max ≤ 1] + P[N(T) = k | A_max > 1] P[A_max > 1]
Property (iii) implies that P[A_max > 1] vanishes as h → 0
  P[A_max > 1] ≤ Σ_{m=1}^n P[A_m > 1] = n o(h) = (T/h) o(h) → 0 as h → 0
Thus, as h → 0, P[A_max > 1] → 0 and P[A_max ≤ 1] → 1. Then
  lim_{h→0} P[N(T) = k] = lim_{h→0} P[N(T) = k | A_max ≤ 1]
The latter is, as already seen, Poisson. Then N(T) is Poisson with parameter λT.
Properties of Poisson processes
Interarrival times
Let T_1, T_2, ... be the sequence of times between events: T_1 is the time until the first event (arrival), T_2 the time between the second and the first events, and T_i the time between the (i-1)-st and i-th events.
Ccdf of T_1:
  P[T_1 > t] = P[N(t) = 0] = e^{-λt}
T_1 has exponential distribution with parameter λ. Since we have independent increments this is likely true for all T_i.
Theorem: The interarrival times T_i of a Poisson process are independent identically distributed exponential random variables with parameter λ, i.e.,
  P[T_i > t] = e^{-λt}
We have already proved this for T_1. Let us see the rest.
Interarrival times (proof)
Proof. Let S_i be the absolute time of the i-th event. Condition on S_i
  P[T_{i+1} > t] = ∫ P[T_{i+1} > t | S_i = s] f_{S_i}(s) ds
To have T_{i+1} > t given that S_i = s, it must be that N(s + t) = N(s)
  P[T_{i+1} > t | S_i = s] = P[N(t + s) - N(s) = 0 | N(s)]
Since increments are independent, conditioning on N(s) is moot
  P[T_{i+1} > t | S_i = s] = P[N(t + s) - N(s) = 0]
Since increments are also stationary, the latter is
  P[T_{i+1} > t | S_i = s] = P[N(t) = 0] = e^{-λt}
Substituting into the integral yields P[T_{i+1} > t] = e^{-λt}
Alternative definition of Poisson process
Start with a sequence of independent random times T_1, T_2, .... Times T_i ~ exp(λ) have exponential distribution with parameter λ. Define the time of the i-th event
  S_i = T_1 + T_2 + ... + T_i
and the counting process of events happening at the times S_i
  N(t) = max{i : S_i ≤ t}
Then N(t) is a Poisson process.
[Figure: staircase N(t) with jumps at S_1, ..., S_6 and interarrival times T_1, ..., T_6]
If N(t) is a Poisson process, the interarrival times T_i are exponential. To show that the definition is equivalent we have to show the converse, i.e., if interarrival times are exponential, the process is Poisson.
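This construction is also how a Poisson process is simulated in practice: accumulate exponential interarrivals until the horizon is passed. A minimal sketch, checking E[N(t)] ≈ λt; λ = 2 and t = 5 are arbitrary.

```python
import random

# Build N(t) from i.i.d. exponential interarrival times T_i.
random.seed(6)
lam, t = 2.0, 5.0

def count_events(lam, t):
    s, n = 0.0, 0
    while True:
        s += random.expovariate(lam)   # S_i = T_1 + ... + T_i
        if s > t:
            return n                   # N(t) = max{i : S_i <= t}
        n += 1

trials = 20_000
avg = sum(count_events(lam, t) for _ in range(trials)) / trials
# avg should be ≈ λt = 10
```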
Alternative definition of Poisson process (cont.)
Exponential i.i.d. interarrival times ⇒ Poisson process? Show that this implies definition (i)-(iii).
Stationarity is true because all T_i have the same distribution. Independent increments are true because interarrival times are independent and exponential RVs are memoryless.
We can have more than one event in (0, h] only if T_1 ≤ h and T_2 ≤ h
  P[N(h) > 1] ≤ P[T_1 ≤ h] P[T_2 ≤ h] = (1 - e^{-λh})² = (λh)² + o(h²) = o(h)
We have no event in (0, h] if T_1 > h
  P[N(h) = 0] = P[T_1 > h] = e^{-λh} = 1 - λh + o(h)
The remaining case is N(h) = 1, whose probability is
  P[N(h) = 1] = 1 - P[N(h) = 0] - P[N(h) > 1] = λh + o(h)
Three definitions of Poisson processes
Def. 1: Prob. of an event proportional to the interval width; intervals independent.
Physical model definition. Can a phenomenon be reasonably modeled as a Poisson process? The other two definitions are used for analysis and/or simulation.
Def. 2: Prob. distribution of the number of events in (0, t] is Poisson.
Event centric definition: nr. of events in given time intervals. Allows analysis and simulation. Used when information about the nr. of events in a given time is desired.
Three definitions of Poisson processes (continued)
Def. 3: Prob. distribution of interarrival times is exponential.
Time centric definition: times at which events happen. Allows analysis and simulation. Used when information about event times is of interest.
Obs: The restrictions in Def. 1 are mild, yet they impose a lot of structure, as implied by Defs. 2 & 3.
Example: Number of visitors to a web page
Nr. of unique visitors to a webpage between 6:00pm and 6:10pm.
Def. 1: Poisson process? Probability proportional to the time interval and independent intervals seem reasonable assumptions. Model as a Poisson process with rate λ visits/second (v/s).
Def. 2: Arrivals in an interval of duration t are Poisson with parameter λt.
Expected nr. of visits in 10 minutes? Prob. of exactly 10 visits in 1 sec?
  E[N(600)] = 600λ,   P[N(1) = 10] = e^{-λ} λ^{10} / 10!
Data shows N average visits in 10 minutes (600 s). To approximate λ, since E[N(600)] = 600λ we can make λ̂ = N/600.
Number of visitors to a web page (continued)
Def. 3: Interarrival times T_i are exponential with parameter λ.
Expected time between visitors? E[T_i] = 1/λ
Expected arrival time S_n of the n-th visitor? Write S_n as a sum of interarrival times, S_n = Σ_{i=1}^n T_i. Taking expected values
  E[S_n] = Σ_{i=1}^n E[T_i] = n/λ
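The example's quantities are easy to compute once λ̂ is fixed. The figure of 1200 observed visits per 10 minutes below is hypothetical, chosen only to make the numbers concrete.

```python
import math

# Worked numbers for the webpage example (N = 1200 visits is hypothetical).
N_avg = 1200
lam_hat = N_avg / 600                 # visits/second, from E[N(600)] = 600λ

mean_interarrival = 1 / lam_hat       # E[T_i] = 1/λ seconds
arrival_100th = 100 / lam_hat         # E[S_100] = 100/λ seconds
p10 = math.exp(-lam_hat) * lam_hat**10 / math.factorial(10)   # P[N(1) = 10]
```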
Continuous time Markov chains
Definition
Continuous time: positive variable t ∈ R_+. States X(t) taking values in a countable set, e.g., 0, 1, ..., i, ....
A stochastic process X(t) is a continuous time Markov chain (CTMC) if
  P[X(t + s) = j | X(s) = i, X(u) = x(u), u < s] = P[X(t + s) = j | X(s) = i]
Memoryless property ⇒ the future X(t + s) given the present X(s) is independent of the past X(u) = x(u), u < s.
In principle we need to specify the functions P[X(t + s) = j | X(s) = i] for all times t, all times s, and all pairs of states (i, j).
Notation, homogeneity
Notation:
X[s : t] denotes the state values for all times s ≤ u ≤ t, borders included
X(s : t) values for all times s < u < t, borders excluded
X(s : t] values for all times s < u ≤ t, left excluded, right included
X[s : t) values for all times s ≤ u < t, left included, right excluded
A CTMC is homogeneous if P[X(t + s) = j | X(s) = i] is constant for all s. We still need P[X(t + s) = j | X(s) = i] for all t and all pairs (i, j).
We restrict consideration to homogeneous CTMCs. Homogeneity makes the description somewhat simpler.
Transition times
T_i = time until transition out of state i into any other state j. T_i is a RV called the transition time, with ccdf
  P[T_i > t] = P[X(0 : t] = i | X(0) = i]
Probability of T_i > t + s given that T_i > s? Use the ccdf expression
  P[T_i > t + s | T_i > s] = P[X(0 : t + s] = i | X[0 : s] = i]
    = P[X(s : t + s] = i | X[0 : s] = i]
    = P[X(s : t + s] = i | X(s) = i]
    = P[X(0 : t] = i | X(0) = i]
The equalities are true because: we have already observed that X[0 : s] = i; the memoryless (Markov) property; homogeneity. From the ccdf expression
  P[T_i > t + s | T_i > s] = P[T_i > t]
⇒ Transition times are exponential RVs
Alternative definition
Exponential transition times are a fundamental property of CTMCs, and can be used as an algorithmic definition. A continuous time stochastic process X(t) is a CTMC if:
Transition times T_i are exponential RVs with mean 1/ν_i
When they occur, transitions out of i are into j with probability P_ij, with
  Σ_{j=1}^∞ P_ij = 1,   P_ii = 0
Transition times T_i and the transitioned state j are independent
Define the matrix P grouping the transition probabilities P_ij. CTMC states evolve as a discrete time MC, with state transitions occurring at random exponential intervals T_i ~ exp(ν_i), as opposed to occurring at fixed intervals.
Embedded discrete time MCs
Definition: Consider a CTMC with transition probs. P and rates ν_i. The (discrete time) MC with transition probs. P is the CTMC's embedded MC.
The transition probabilities P describe a discrete time MC. States do not transition into themselves (P_ii = 0, P's diagonal is null).
We can use the underlying discrete time MC to understand CTMCs:
Accessibility: state j is accessible from state i if it is accessible in the discrete time MC with transition probability matrix P
Communication: states i and j communicate if they communicate in the discrete time MC with transition probability matrix P
Communication is a class property. Proof: it is one for discrete time MCs.
Recurrent/transient states; recurrence is a class property. Etc. More later.
Transition rates
The expected value of the transition time T_i is E[T_i] = 1/ν_i, so ν_i can be interpreted as the rate of transition out of state i. Of these transitions, a fraction P_ij are into state j. We can interpret
  q_ij := ν_i P_ij
as the transition rate from i to j. Defined so, transition rates are yet another specification of a CTMC. If the q_ij are given, we can recover ν_i as
  ν_i = ν_i Σ_{j≠i} P_ij = Σ_{j≠i} ν_i P_ij = Σ_{j≠i} q_ij
and recover P_ij as
  P_ij = q_ij / ν_i = q_ij ( Σ_{j≠i} q_ij )^{-1}
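The rate-to-(ν, P) conversion above is mechanical; a small sketch with a made-up 3-state rate table (all numbers arbitrary):

```python
# Recover rates ν_i and jump probabilities P_ij from transition rates q_ij.
q = {
    (0, 1): 2.0, (0, 2): 1.0,
    (1, 0): 3.0,
    (2, 0): 0.5, (2, 1): 0.5,
}
states = {i for i, _ in q} | {j for _, j in q}

# ν_i = Σ_{j≠i} q_ij  (total rate out of i)
nu = {i: sum(r for (a, _), r in q.items() if a == i) for i in states}
# P_ij = q_ij / ν_i  (jump probabilities of the embedded MC)
P = {(i, j): r / nu[i] for (i, j), r in q.items()}
```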
Example: Birth and death processes
State X(t) = 0, 1, ..., interpreted as a number of individuals. Births and deaths occur at state-dependent rates. When X(t) = i:
Births ⇒ individuals are added at exponential times with mean 1/λ_i; the birth or arrival rate is λ_i births per unit of time
Deaths ⇒ individuals are removed at exponential times with mean 1/µ_i; the death or departure rate is µ_i deaths per unit of time
Birth and death times are independent, as are subsequent birth and death times. Birth and death (BD) processes are then CTMCs.
Transition times and probabilities
Transition times: we leave state i ≥ 1 when a birth or a death occurs. If T_B and T_D are the times to the next birth and death, then T_i = min(T_B, T_D). Since T_B and T_D are exponential, so is T_i, with rate
  ν_i = λ_i + µ_i
When leaving state i we can go to i + 1 (birth first) or i - 1 (death first):
Birth occurs before death with probability λ_i / (λ_i + µ_i) = P_{i,i+1}
Death occurs before birth with probability µ_i / (λ_i + µ_i) = P_{i,i-1}
We leave state 0 only if a birth occurs (if at all) ⇒ ν_0 = λ_0, P_01 = 1. If the chain leaves 0, it goes to 1 with probability 1. It might never leave 0 if λ_0 = 0.
Transition rates
The rate of transition from i to i + 1 is (recall the definition q_ij = ν_i P_ij)
  q_{i,i+1} = ν_i P_{i,i+1} = (λ_i + µ_i) λ_i / (λ_i + µ_i) = λ_i
Likewise, the rate of transition from i to i - 1 is
  q_{i,i-1} = ν_i P_{i,i-1} = (λ_i + µ_i) µ_i / (λ_i + µ_i) = µ_i
For i = 0, q_01 = ν_0 P_01 = λ_0
[Figure: BD chain diagram over states 0, ..., i-1, i, i+1, ...; birth rates λ_0, ..., λ_i point right, death rates µ_1, ..., µ_{i+1} point left]
This is a somewhat more natural representation, more similar to a discrete MC.
M/M/1 queue
An M/M/1 queue is a BD process with constant rates λ_i = λ and µ_i = µ. Customers arrive for service at a rate of λ per unit of time, and are serviced at a rate of µ customers per time unit.
[Figure: BD chain with all birth rates equal to λ and all death rates equal to µ]
The M/M is for Markov arrivals / Markov departures. The 1 is because there is only one server.
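The transition-time description gives a direct way to simulate the M/M/1 chain: in state i > 0 hold for an exp(λ + µ) time and jump up with probability λ/(λ + µ), down otherwise; in state 0 only births occur. As a sanity check, the long-run fraction of time spent empty should approach 1 - λ/µ, a standard M/M/1 steady-state result (not derived in this section). The rates λ = 1, µ = 2 are arbitrary.

```python
import random

# Simulate the M/M/1 queue as a CTMC via its transition times and probabilities.
random.seed(9)
lam, mu = 1.0, 2.0
horizon = 50_000.0

t, state, time_empty = 0.0, 0, 0.0
while t < horizon:
    if state == 0:
        hold = random.expovariate(lam)        # ν_0 = λ, next jump is a birth
        time_empty += hold
        nxt = 1
    else:
        hold = random.expovariate(lam + mu)   # ν_i = λ + µ
        # birth with prob λ/(λ+µ), death with prob µ/(λ+µ)
        nxt = state + 1 if random.random() < lam / (lam + mu) else state - 1
    t += hold
    state = nxt

frac_empty = time_empty / t   # should approach 1 - λ/µ = 0.5
```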
Transition probability function
Definition
Two equivalent ways of specifying a CTMC:
Transition time averages 1/ν_i + transition probabilities P_ij ⇒ easier description; typical starting point for CTMC modeling
Transition probability function P_ij(t) := P[X(t + s) = j | X(s) = i] ⇒ more complete description
Goal: compute the transition probability function from the transition times and probabilities. Notice two obvious properties:
  P_ij(0) = 0 for j ≠ i,   P_ii(0) = 1
Roadmap to determine P_ij(t)
The goal is to obtain a differential equation whose solution is P_ij(t). For that, we study the change in P_ij(t) when time changes slightly. Separate into two subproblems:
Transition probabilities for small time h, P_ij(h)
Transition probabilities at t + h as a function of those at t and at h
We can combine both results in two different ways:
Jump from 0 to t, then to t + h ⇒ process runs a little longer ⇒ changes where the process is going to ⇒ forward equations
Jump from 0 to h, then to t + h ⇒ process starts a little later ⇒ changes where the process comes from ⇒ backward equations
Transition probability in infinitesimal time
Theorem: The transition probability functions P_ii(t) and P_ij(t) satisfy the following limits as time t goes to 0
  lim_{t→0} P_ij(t)/t = q_ij,   lim_{t→0} [1 - P_ii(t)]/t = ν_i
Since P_ij(0) = 0 and P_ii(0) = 1, the above limits are derivatives at t = 0
  ∂P_ij(t)/∂t |_{t=0} = q_ij,   ∂P_ii(t)/∂t |_{t=0} = -ν_i
The limits also imply that for small h
  P_ij(h) = q_ij h + o(h),   P_ii(h) = 1 - ν_i h + o(h)
Transition rates q_ij are instantaneous transition probabilities ⇒ transition probability coefficients for small time h.
Transition probability in infinitesimal time (proof)
Proof. Consider a small time h. Since 1 - P_ii(h) is the probability of transitioning out of state i
  1 - P_ii(h) = P[T_i < h] = ν_i h + o(h)
Divide by h and take the limit.
For P_ij(h) notice that, since two or more transitions have o(h) probability,
  P_ij(h) = P[X(h) = j | X(0) = i] = P_ij P[T_i ≤ h] + o(h)
Since T_i is exponential, P[T_i ≤ h] = ν_i h + o(h). Then
  P_ij(h) = ν_i P_ij h + o(h) = q_ij h + o(h)
Divide by h and take the limit.
Chapman-Kolmogorov equations
Theorem: For all times s and t the transition probability functions P_ij(t + s) are obtained from P_ik(t) and P_kj(s) as
  P_ij(t + s) = Σ_{k=0}^∞ P_ik(t) P_kj(s)
As for discrete time MCs, to go from i to j in time t + s:
Go from i to a certain k at time t ⇒ P_ik(t)
In the remaining time s go from k to j ⇒ P_kj(s)
Sum over all possible intermediate states k.
Chapman-Kolmogorov equations (proof)
Proof.
  P_ij(t + s) = P[X(t + s) = j | X(0) = i]   (definition of P_ij(t + s))
  = Σ_{k=0}^∞ P[X(t + s) = j | X(t) = k, X(0) = i] P[X(t) = k | X(0) = i]   (sum of conditional probabilities)
  = Σ_{k=0}^∞ P[X(t + s) = j | X(t) = k] P_ik(t)   (memoryless property of the CTMC and definition of P_ik(t))
  = Σ_{k=0}^∞ P[X(s) = j | X(0) = k] P_ik(t)   (time invariance / homogeneity)
  = Σ_{k=0}^∞ P_kj(s) P_ik(t)   (definition of P_kj(s))
Combining both results
Let us combine the last two results to express P_ij(t + h). Use the Chapman-Kolmogorov equations for 0 → t → t + h
  P_ij(t + h) = Σ_{k=0}^∞ P_ik(t) P_kj(h) = P_ij(t) P_jj(h) + Σ_{k≠j} P_ik(t) P_kj(h)
Use the infinitesimal time expressions
  P_ij(t + h) = P_ij(t)(1 - ν_j h) + Σ_{k≠j} P_ik(t) q_kj h + o(h)
Subtract P_ij(t) from both sides and divide by h
  [P_ij(t + h) - P_ij(t)]/h = -ν_j P_ij(t) + Σ_{k≠j} P_ik(t) q_kj + o(h)/h
The left hand side is a derivative ratio. Let h → 0.
Kolmogorov's forward equations
Theorem: The transition probability functions P_ij(t) of a CTMC satisfy the system of differential equations (for all pairs i, j)
  ∂P_ij(t)/∂t = Σ_{k≠j} q_kj P_ik(t) - ν_j P_ij(t)
∂P_ij(t)/∂t = rate of change of P_ij(t)
q_kj P_ik(t) = (prob. of transitioning into k in 0 → t) × (rate of moving from k into j in the next instant)
ν_j P_ij(t) = (prob. of transitioning into j in 0 → t) × (rate of leaving j in the next instant)
Change in P_ij(t) = Σ_k (moving into j from k) - (leaving j)
Kolmogorov's forward equations are valid in most cases, but not always.
Kolmogorov's backward equations
For the forward equations we used Chapman-Kolmogorov for 0 → t → t + h. For the backward equations we use 0 → h → t + h to express P_ij(t + h) as
  P_ij(t + h) = Σ_{k=0}^∞ P_ik(h) P_kj(t) = P_ii(h) P_ij(t) + Σ_{k≠i} P_ik(h) P_kj(t)
Use the infinitesimal time expressions
  P_ij(t + h) = (1 - ν_i h) P_ij(t) + Σ_{k≠i} q_ik h P_kj(t) + o(h)
Subtract P_ij(t) from both sides and divide by h
  [P_ij(t + h) - P_ij(t)]/h = -ν_i P_ij(t) + Σ_{k≠i} q_ik P_kj(t) + o(h)/h
Kolmogorov's backward equations
Theorem: The transition probability functions P_ij(t) of a CTMC satisfy the system of differential equations (for all pairs i, j)
  ∂P_ij(t)/∂t = Σ_{k≠i} q_ik P_kj(t) - ν_i P_ij(t)
ν_i P_ij(t) = (prob. of transitioning into j in h → t) × (do not do that if we leave i in the initial instant)
q_ik P_kj(t) = (rate of transitioning into k in 0 → h) × (prob. of transitioning from k into j in h → t)
Forward equations ⇒ change in P_ij(t) if the process finishes h later
Backward equations ⇒ change in P_ij(t) if the process starts h later
Where the process goes (forward) vs. where the process comes from (backward)
Determination of transition probability function
A CTMC with two states
The simplest possible CTMC has only two states, say 0 and 1, with transition rates q_01 and q_10. Given the transition rates we can find the transition time rates as
  ν_0 = Σ_{j≠0} q_0j = q_01,   ν_1 = Σ_{j≠1} q_1j = q_10
Use Kolmogorov's equations to find the transition probability functions P_00(t), P_01(t), P_10(t), P_11(t). Note that the transition probs. out of each state sum up to one
  P_00(t) + P_01(t) = 1,   P_10(t) + P_11(t) = 1
Kolmogorov's forward equations
Kolmogorov's forward equations (process runs a little longer)
  P'_ij(t) = Σ_{k≠j} q_kj P_ik(t) - ν_j P_ij(t)
For the two node CTMC
  P'_00(t) = q_10 P_01(t) - ν_0 P_00(t),   P'_01(t) = q_01 P_00(t) - ν_1 P_01(t)
  P'_10(t) = q_10 P_11(t) - ν_0 P_10(t),   P'_11(t) = q_01 P_10(t) - ν_1 P_11(t)
Probabilities out of 0 sum up to 1 ⇒ the equations for P_00(t) and P_01(t) are equivalent
Probabilities out of 1 sum up to 1 ⇒ the equations for P_10(t) and P_11(t) are equivalent
Pick the equations for P_00(t) and P_11(t)
Solution of the equations
Use the relations between transition rates, ν_0 = q_01 and ν_1 = q_10, and that probabilities sum to 1, P_01(t) = 1 - P_00(t) and P_10(t) = 1 - P_11(t):
  P'_00(t) = q_10 [1 - P_00(t)] - q_01 P_00(t) = q_10 - (q_10 + q_01) P_00(t)
  P'_11(t) = q_01 [1 - P_11(t)] - q_10 P_11(t) = q_01 - (q_10 + q_01) P_11(t)
The exact same pair of equations can be obtained from the backward equations. These are first order linear differential equations, whose solutions are exponentials. For P_00(t) propose the candidate solution (just take the derivative to check)
  P_00(t) = q_10/(q_10 + q_01) + c e^{-(q_10+q_01)t}
To determine c use the initial condition P_00(0) = 1.
60 Solution of backward equations (continued)

Evaluating the candidate solution at the initial condition P_00(0) = 1 yields
1 = q_10/(q_10 + q_01) + c   ⇒   c = q_01/(q_10 + q_01)
Finally, the transition probability function P_00(t)
P_00(t) = q_10/(q_10 + q_01) + [q_01/(q_10 + q_01)] e^{−(q_10 + q_01)t}
Repeat for P_11(t). Same exponent, different constants
P_11(t) = q_01/(q_10 + q_01) + [q_10/(q_10 + q_01)] e^{−(q_10 + q_01)t}
As time goes to infinity, P_00(t) and P_11(t) converge exponentially
The convergence rate depends on the magnitude of q_10 + q_01

Stoch. Systems Analysis Continuous time Markov chains 60
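The closed-form expressions above can be checked numerically. The sketch below uses hypothetical rates q01 = 2, q10 = 3 (illustrative values, not from the slides) and compares the closed-form P_00(t) against a forward-Euler integration of the differential equation P'_00(t) = q_10 − (q_10 + q_01) P_00(t) derived on the previous slide.

```python
import math

# Two-state CTMC with hypothetical rates (illustrative values, not from the slides)
q01, q10 = 2.0, 3.0

def p00(t):
    """Closed form P_00(t) = q10/(q10+q01) + [q01/(q10+q01)] e^{-(q10+q01)t}."""
    s = q01 + q10
    return q10 / s + (q01 / s) * math.exp(-s * t)

def p00_euler(t, steps=100000):
    """Forward-Euler integration of P'_00(t) = q10 - (q10+q01) P_00(t), P_00(0) = 1."""
    p, dt = 1.0, t / steps
    for _ in range(steps):
        p += dt * (q10 - (q01 + q10) * p)
    return p

# The two computations should agree to several decimal places, e.g. at t = 0.5
```

The ODE solution also makes the convergence claim concrete: for large t both computations approach q_10/(q_10 + q_01) = 0.6 at rate q_10 + q_01 = 5.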
61 Convergence of transition probabilities

Recall P_01(t) = 1 − P_00(t) and P_10(t) = 1 − P_11(t). Steady state distributions
lim_{t→∞} P_00(t) = q_10/(q_10 + q_01),    lim_{t→∞} P_01(t) = q_01/(q_10 + q_01)
lim_{t→∞} P_11(t) = q_01/(q_10 + q_01),    lim_{t→∞} P_10(t) = q_10/(q_10 + q_01)
The limit distribution exists and is independent of the initial condition
⇒ Compare across diagonals

Stoch. Systems Analysis Continuous time Markov chains 61
62 Kolmogorov's forward equations in matrix form

Restrict attention to finite CTMCs with N states
Define the matrix R ∈ R^{N×N} with elements r_ij = q_ij for i ≠ j, and r_ii = −ν_i
Rewrite Kolmogorov's forward eqs. as (process runs a little longer)
P'_ij(t) = ∑_{k=1,k≠j}^N q_kj P_ik(t) − ν_j P_ij(t) = ∑_{k=1}^N r_kj P_ik(t)
The right-hand side is the (i,j)-th element of a matrix product:
the i-th row of P(t) times the j-th column of R
⇒ P'(t) = P(t)R

Stoch. Systems Analysis Continuous time Markov chains 62
63 Kolmogorov's backward equations in matrix form

Similarly, Kolmogorov's backward eqs. (process starts a little later)
P'_ij(t) = ∑_{k=1,k≠i}^N q_ik P_kj(t) − ν_i P_ij(t) = ∑_{k=1}^N r_ik P_kj(t)
The right-hand side also defines a matrix product:
the i-th row of R times the j-th column of P(t)
⇒ P'(t) = RP(t)

Stoch. Systems Analysis Continuous time Markov chains 63
64 Kolmogorov's equations in matrix form

Matrix form of Kolmogorov's forward equation ⇒ P'(t) = P(t)R
Matrix form of Kolmogorov's backward equation ⇒ P'(t) = RP(t)
More similar than they appear
But not equivalent, because the matrix product is not commutative
Notwithstanding, both equations have to admit the same solution

Stoch. Systems Analysis Continuous time Markov chains 64
65 Matrix exponential

Kolmogorov's equations are first order linear differential equations
They are coupled, though: P'_ij(t) depends on P_kj(t) for all k
They still admit an exponential solution. Need to define the matrix exponential

Definition: The matrix exponential e^{At} of the matrix At is defined as the series
e^{At} = ∑_{n=0}^∞ (At)^n / n! = I + At + (At)^2/2 + (At)^3/3! + ...

Derivative of the matrix exponential with respect to t
∂e^{At}/∂t = 0 + A + A^2 t + A^3 t^2/2 + ... = A (I + At + (At)^2/2 + ...) = A e^{At}
Pulling A out on the right side of each product shows that ∂e^{At}/∂t = e^{At} A

Stoch. Systems Analysis Continuous time Markov chains 65
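The series definition can be implemented directly for small matrices. This is a minimal sketch: a truncated series, adequate only when ‖At‖ is moderate (numerical libraries use scaling-and-squaring instead), applied to the generator of the two-state chain with the same hypothetical rates q01 = 2, q10 = 3 used above.

```python
import math

def mat_mul(A, B):
    """Multiply two square matrices given as lists of lists."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)] for i in range(n)]

def expm(A, terms=40):
    """Truncated series e^A = I + A + A^2/2! + ... (first `terms` terms)."""
    n = len(A)
    result = [[1.0 if i == j else 0.0 for j in range(n)] for i in range(n)]  # I
    term = [row[:] for row in result]                                        # A^0 / 0!
    for k in range(1, terms):
        term = mat_mul(term, A)                         # numerator now A^k
        term = [[x / k for x in row] for row in term]   # divide by k -> A^k / k!
        result = [[r + s for r, s in zip(rr, tt)] for rr, tt in zip(result, term)]
    return result

# Generator R of the two-state chain with q01 = 2, q10 = 3, evaluated at t = 0.5
t = 0.5
P = expm([[-2.0 * t, 2.0 * t], [3.0 * t, -3.0 * t]])   # e^{Rt}
p00_exact = 3 / 5 + (2 / 5) * math.exp(-2.5)           # closed-form P_00(0.5)
```

Since the rows of R sum to zero, each row of e^{Rt} sums to one, as a transition probability matrix must.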
66 Solution of Kolmogorov's equations

Propose a solution of the form P(t) = e^{Rt}
P(t) solves the backward equations. Its derivative with respect to time is
∂P(t)/∂t = ∂e^{Rt}/∂t = R e^{Rt} = RP(t)
It also solves the forward equations
∂P(t)/∂t = ∂e^{Rt}/∂t = e^{Rt} R = P(t)R
Also notice that P(0) = I, as it should (P_ii(0) = 1, and P_ij(0) = 0 for i ≠ j)

Stoch. Systems Analysis Continuous time Markov chains 66
67 Unconditional probabilities

P(t) contains the transition probs. from states at time 0 to states at time t
Define the unconditional probs. at time t, p_j(t) := P[X(t) = j]
Group them in the vector p(t) = [p_1(t), p_2(t), ..., p_j(t), ...]^T
Given the initial distribution p(0), find p_j(t) by conditioning on the initial state
p_j(t) = ∑_i P[X(t) = j | X(0) = i] P[X(0) = i] = ∑_i P_ij(t) p_i(0)
Using matrix-vector notation ⇒ p(t) = P^T(t) p(0)

Stoch. Systems Analysis Continuous time Markov chains 67
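The conditioning step above is just a matrix-vector product. A small sketch for the two-state chain, again with the hypothetical rates q01 = 2, q10 = 3 and the closed-form P(t) derived earlier:

```python
import math

# Hypothetical two-state rates (illustrative, not from the slides)
q01, q10 = 2.0, 3.0
s = q01 + q10

def P(t):
    """Transition probability matrix of the two-state CTMC at time t (closed form)."""
    e = math.exp(-s * t)
    p00 = q10 / s + (q01 / s) * e
    p11 = q01 / s + (q10 / s) * e
    return [[p00, 1.0 - p00], [1.0 - p11, p11]]

def unconditional(p0, t):
    """p_j(t) = sum_i P_ij(t) p_i(0), i.e. p(t) = P(t)^T p(0)."""
    Pt = P(t)
    return [sum(Pt[i][j] * p0[i] for i in range(2)) for j in range(2)]
```

Whatever the initial distribution p(0), p(t) approaches [q_10/s, q_01/s] = [0.6, 0.4] as t grows, consistent with the limit probabilities of the previous slides.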
68 Limit probabilities

Exponential random variables
Counting processes and definition of Poisson processes
Properties of Poisson processes
Continuous time Markov chains
Transition probability function
Determination of transition probability function
Limit probabilities

Stoch. Systems Analysis Continuous time Markov chains 68
69 Recurrent and transient states

Recall the embedded discrete time MC associated with any CTMC
The transition probs. of the embedded MC form the matrix P of the CTMC
States do not transition into themselves (P_ii = 0, P's diagonal is null)
States i and j communicate in the CTMC if i and j communicate in the embedded MC
Communication partitions the MC into classes ⇒ induces a CTMC partition
The CTMC is irreducible if the embedded MC contains a single class
State i is recurrent if i is recurrent in the embedded MC
State i is transient if i is transient in the embedded MC
State i is positive recurrent if i is so in the embedded MC
Transience and recurrence are shared by the elements of a MC class
⇒ Transience and recurrence are class properties of the CTMC
Periodicity is not possible in CTMCs

Stoch. Systems Analysis Continuous time Markov chains 69
70 Limit probabilities

Theorem: Consider an irreducible, positive recurrent CTMC with transition rates ν_i and q_ij. Then lim_{t→∞} P_ij(t) exists and is independent of the initial state i, i.e.,
P_j = lim_{t→∞} P_ij(t)  exists for all i
Furthermore, the steady state probabilities P_j ≥ 0 are the unique nonnegative solution of the system of linear equations
ν_j P_j = ∑_{k≠j} q_kj P_k,    ∑_j P_j = 1

The limit distribution exists and is independent of the initial condition
It is obtained as the solution of a system of linear equations
Analogous to discrete time MCs. The algebraic equations are slightly different

Stoch. Systems Analysis Continuous time Markov chains 70
71 Algebraic relations to determine limit probabilities

As with MCs, the difficult part is to prove that P_j = lim_{t→∞} P_ij(t) exists
The algebraic relations are obtained from Kolmogorov's forward equations
∂P_ij(t)/∂t = ∑_{k≠j} q_kj P_ik(t) − ν_j P_ij(t)
If the limit distribution exists we have, independent of the initial state i,
lim_{t→∞} ∂P_ij(t)/∂t = 0,    lim_{t→∞} P_ij(t) = P_j
Taking the limit of Kolmogorov's forward equation then yields
0 = ∑_{k≠j} q_kj P_k − ν_j P_j
Reordering terms, the limit distribution equations follow

Stoch. Systems Analysis Continuous time Markov chains 71
72 Example: Two state CTMC

Simplest CTMC with two states, 0 and 1
Transition rates are q_01 and q_10
From the transition rates find ν_0 = q_01, ν_1 = q_10
Stationary distribution equations
ν_0 P_0 = q_10 P_1,    ν_1 P_1 = q_01 P_0,    P_0 + P_1 = 1
Substituting ν_0 = q_01 and ν_1 = q_10, the first two equations coincide
q_01 P_0 = q_10 P_1
Solution yields
P_0 = q_10/(q_10 + q_01),    P_1 = q_01/(q_10 + q_01)
Larger rate q_10 of entering 0 ⇒ larger prob. P_0 of being at 0
Larger rate q_01 of entering 1 ⇒ larger prob. P_1 of being at 1

Stoch. Systems Analysis Continuous time Markov chains 72
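The limit probabilities can also be seen empirically. The sketch below simulates the two-state chain by drawing exponential holding times (rates q01 = 2, q10 = 3 are hypothetical values picked so that P_0 = 0.6) and measures the fraction of time spent in state 0, which should approach P_0.

```python
import random

random.seed(1)
q01, q10 = 2.0, 3.0   # hypothetical rates, chosen so that P_0 = q10/(q10+q01) = 0.6

def fraction_in_state0(n_holds=200000):
    """Simulate the two-state CTMC; return the fraction of time spent in state 0.

    The holding time in state i is Exp(nu_i); here nu_0 = q01 and nu_1 = q10,
    and from either state the only possible jump is to the other state."""
    state, time_in = 0, [0.0, 0.0]
    for _ in range(n_holds):
        rate = q01 if state == 0 else q10
        time_in[state] += random.expovariate(rate)
        state = 1 - state
    return time_in[0] / (time_in[0] + time_in[1])

frac0 = fraction_in_state0()
# frac0 should be close to the theoretical limit probability P_0 = 0.6
```

This anticipates the ergodicity result of the next slides: the time average converges to the limit probability almost surely.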
73 Ergodicity

Consider the fraction T_i(t) of time spent in state i up to time t
T_i(t) := (1/t) ∫_0^t I{X(s) = i} ds
T_i(t) is a time/ergodic average. Its limit lim_{t→∞} T_i(t) is the ergodic limit
If the CTMC is irreducible and positive recurrent, ergodicity holds
P_i = lim_{t→∞} T_i(t) = lim_{t→∞} (1/t) ∫_0^t I{X(s) = i} ds   a.s.
The ergodic limit coincides with the limit probabilities (almost surely)

Stoch. Systems Analysis Continuous time Markov chains 73
74 Ergodicity (continued)

Consider a function f(i) associated with state i. Can write f(X(t)) as
f(X(t)) = ∑_i f(i) I{X(t) = i}
Consider the time average of the function of the state f(X(t))
lim_{t→∞} (1/t) ∫_0^t f(X(s)) ds = lim_{t→∞} (1/t) ∫_0^t ∑_i f(i) I{X(s) = i} ds
Interchange the summation with the integral and the limit to obtain
lim_{t→∞} (1/t) ∫_0^t f(X(s)) ds = ∑_i f(i) lim_{t→∞} (1/t) ∫_0^t I{X(s) = i} ds = ∑_i f(i) P_i
The function's ergodic limit = the function's average value under the limit distribution

Stoch. Systems Analysis Continuous time Markov chains 74
75 Limit distribution equations as balance equations

Reconsider the limit distribution equations
ν_j P_j = ∑_{k≠j} q_kj P_k
P_j = fraction of time spent in state j
ν_j = rate of transition out of state j given the CTMC is in state j
ν_j P_j = rate of transition out of state j (unconditional)
q_kj = rate of transition from k to j given the CTMC is in state k
q_kj P_k = rate of transition from k to j (unconditional)
∑_{k≠j} q_kj P_k = rate of transition into j, from all states
Rate of transition out of state j = rate of transition into state j
⇒ Balance equations ⇒ balance the number of transitions in and out of state j

Stoch. Systems Analysis Continuous time Markov chains 75
76 Limit distribution for birth and death process

Births/deaths occur at state-dependent rates. When X(t) = i
Births ⇒ individuals added at exponential times with mean 1/λ_i
⇒ birth rate = upward transition rate = q_{i,i+1} = λ_i
Deaths ⇒ individuals removed at exponential times with mean 1/µ_i
⇒ death rate = downward transition rate = q_{i,i−1} = µ_i
Transition rates ν_i = λ_i + µ_i

[Transition diagram: states 0, 1, ..., i−1, i, i+1, ... with birth rates λ_0, ..., λ_i, ... pointing right and death rates µ_1, ..., µ_{i+1}, ... pointing left]

Limit distribution/balance equations: rate out of i = rate into i
(λ_i + µ_i) P_i = λ_{i−1} P_{i−1} + µ_{i+1} P_{i+1},  for i ≥ 1
λ_0 P_0 = µ_1 P_1

Stoch. Systems Analysis Continuous time Markov chains 76
77 Finding solution of balance equations

Start by expressing all probabilities in terms of P_0
Equation for P_0:  λ_0 P_0 = µ_1 P_1
Sum the latter and the eq. for P_1, (λ_1 + µ_1) P_1 = λ_0 P_0 + µ_2 P_2:
λ_0 P_0 + (λ_1 + µ_1) P_1 = µ_1 P_1 + λ_0 P_0 + µ_2 P_2   ⇒   λ_1 P_1 = µ_2 P_2
Sum the result and the eq. for P_2, (λ_2 + µ_2) P_2 = λ_1 P_1 + µ_3 P_3:
λ_1 P_1 + (λ_2 + µ_2) P_2 = µ_2 P_2 + λ_1 P_1 + µ_3 P_3   ⇒   λ_2 P_2 = µ_3 P_3
In general, sum the result λ_{i−1} P_{i−1} = µ_i P_i and the eq. for P_i:
λ_{i−1} P_{i−1} + (λ_i + µ_i) P_i = µ_i P_i + λ_{i−1} P_{i−1} + µ_{i+1} P_{i+1}   ⇒   λ_i P_i = µ_{i+1} P_{i+1}

Stoch. Systems Analysis Continuous time Markov chains 77
78 Finding solution of balance equations (continued)

Recursive substitution on the equations on the left
P_1 = (λ_0/µ_1) P_0
P_2 = (λ_1/µ_2) P_1 = (λ_1 λ_0)/(µ_2 µ_1) P_0
P_3 = (λ_2/µ_3) P_2 = (λ_2 λ_1 λ_0)/(µ_3 µ_2 µ_1) P_0
...
P_{i+1} = (λ_i/µ_{i+1}) P_i = (λ_i λ_{i−1} ... λ_0)/(µ_{i+1} µ_i ... µ_1) P_0
To find P_0 use ∑_{i=0}^∞ P_i = 1
1 = P_0 + ∑_{i=0}^∞ (λ_i λ_{i−1} ... λ_0)/(µ_{i+1} µ_i ... µ_1) P_0

Stoch. Systems Analysis Continuous time Markov chains 78
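The product formula can be evaluated numerically. The sketch below computes the limit distribution of a birth-death chain by the recursion P_{i+1} = (λ_i/µ_{i+1}) P_i and then normalizes; truncating the state space at n_max is an assumption made here for chains with infinitely many states. For constant rates λ_i = λ and µ_i = µ with ρ = λ/µ < 1 (the M/M/1-type special case), the formula reduces to the geometric distribution P_i = (1 − ρ) ρ^i.

```python
def bd_limit_probs(birth, death, n_max):
    """Limit probabilities of a birth-death CTMC via P_{i+1} = (lambda_i / mu_{i+1}) P_i.

    birth(i) = lambda_i, death(i) = mu_i. The state space is truncated at n_max,
    an approximation for chains with infinitely many states."""
    unnorm = [1.0]                                   # P_0 up to normalization
    for i in range(n_max):
        unnorm.append(unnorm[-1] * birth(i) / death(i + 1))
    total = sum(unnorm)                              # enforce sum_i P_i = 1
    return [p / total for p in unnorm]

# Constant rates lambda = 1, mu = 2 (rho = 1/2): geometric limit distribution
probs = bd_limit_probs(lambda i: 1.0, lambda i: 2.0, n_max=60)
# theory: P_i = (1 - rho) * rho**i = 0.5 * 0.5**i, so P_0 = 0.5, P_1 = 0.25
```

With ρ = 1/2 the truncation error at n_max = 60 is on the order of 2^{-60}, so the computed values match the geometric formula essentially exactly.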
More informationMidterm 2 Review. CS70 Summer Lecture 6D. David Dinh 28 July UC Berkeley
Midterm 2 Review CS70 Summer 2016 - Lecture 6D David Dinh 28 July 2016 UC Berkeley Midterm 2: Format 8 questions, 190 points, 110 minutes (same as MT1). Two pages (one double-sided sheet) of handwritten
More informationA review of Continuous Time MC STA 624, Spring 2015
A review of Continuous Time MC STA 624, Spring 2015 Ruriko Yoshida Dept. of Statistics University of Kentucky polytopes.net STA 624 1 Continuous Time Markov chains Definition A continuous time stochastic
More informationReading: Karlin and Taylor Ch. 5 Resnick Ch. 3. A renewal process is a generalization of the Poisson point process.
Renewal Processes Wednesday, December 16, 2015 1:02 PM Reading: Karlin and Taylor Ch. 5 Resnick Ch. 3 A renewal process is a generalization of the Poisson point process. The Poisson point process is completely
More informationIEOR 3106: Second Midterm Exam, Chapters 5-6, November 7, 2013
IEOR 316: Second Midterm Exam, Chapters 5-6, November 7, 13 SOLUTIONS Honor Code: Students are expected to behave honorably, following the accepted code of academic honesty. You may keep the exam itself.
More informationEE126: Probability and Random Processes
EE126: Probability and Random Processes Lecture 19: Poisson Process Abhay Parekh UC Berkeley March 31, 2011 1 1 Logistics 2 Review 3 Poisson Processes 2 Logistics 3 Poisson Process A continuous version
More informationPage 0 of 5 Final Examination Name. Closed book. 120 minutes. Cover page plus five pages of exam.
Final Examination Closed book. 120 minutes. Cover page plus five pages of exam. To receive full credit, show enough work to indicate your logic. Do not spend time calculating. You will receive full credit
More informationReadings: Finish Section 5.2
LECTURE 19 Readings: Finish Section 5.2 Lecture outline Markov Processes I Checkout counter example. Markov process: definition. -step transition probabilities. Classification of states. Example: Checkout
More informationDisjointness and Additivity
Midterm 2: Format Midterm 2 Review CS70 Summer 2016 - Lecture 6D David Dinh 28 July 2016 UC Berkeley 8 questions, 190 points, 110 minutes (same as MT1). Two pages (one double-sided sheet) of handwritten
More informationThe story of the film so far... Mathematics for Informatics 4a. Continuous-time Markov processes. Counting processes
The story of the film so far... Mathematics for Informatics 4a José Figueroa-O Farrill Lecture 19 28 March 2012 We have been studying stochastic processes; i.e., systems whose time evolution has an element
More informationNotes on Continuous Random Variables
Notes on Continuous Random Variables Continuous random variables are random quantities that are measured on a continuous scale. They can usually take on any value over some interval, which distinguishes
More informationACM 116 Problem Set 4 Solutions
ACM 6 Problem Set 4 Solutions Lei Zhang Problem Answer (a) is correct. Suppose I arrive at time t, and the first arrival after t is bus N i + and occurs at time T Ni+. Let W t = T Ni+ t, which is the waiting
More informationOptional Stopping Theorem Let X be a martingale and T be a stopping time such
Plan Counting, Renewal, and Point Processes 0. Finish FDR Example 1. The Basic Renewal Process 2. The Poisson Process Revisited 3. Variants and Extensions 4. Point Processes Reading: G&S: 7.1 7.3, 7.10
More informationMarkov Chain Model for ALOHA protocol
Markov Chain Model for ALOHA protocol Laila Daniel and Krishnan Narayanan April 22, 2012 Outline of the talk A Markov chain (MC) model for Slotted ALOHA Basic properties of Discrete-time Markov Chain Stability
More informationPart II: continuous time Markov chain (CTMC)
Part II: continuous time Markov chain (CTMC) Continuous time discrete state Markov process Definition (Markovian property) X(t) is a CTMC, if for any n and any sequence t 1
More informationP i [B k ] = lim. n=1 p(n) ii <. n=1. V i :=
2.7. Recurrence and transience Consider a Markov chain {X n : n N 0 } on state space E with transition matrix P. Definition 2.7.1. A state i E is called recurrent if P i [X n = i for infinitely many n]
More information