Lecture 10: Semi-Markov Type Processes
- Marshall Fletcher
- 6 years ago
1. Semi-Markov processes (SMP)
   1.1 Definition of SMP
   1.2 Transition probabilities for SMP
   1.3 Hitting times and semi-Markov renewal equations
2. Processes with semi-Markov modulation (PSMM)
   2.1 M/G type queuing systems
   2.2 Definition of PSMM
   2.3 Regeneration properties of PSMM
3. Miscellaneous results
   3.1 Ergodic theorems
   3.2 Coupling and ergodic theorems
1. Semi-Markov processes (SMP)

1.1 Definition of SMP

Let (J_n, T_n), n = 0, 1, ..., be a homogeneous Markov chain with the phase space X × [0, ∞), where X = {1, 2, ..., m}, initial distribution p_i = P{J_0 = i, T_0 = 0}, i ∈ X, and transition probabilities, for i, j ∈ X, s, t ≥ 0,

  Q_ij(t) = P{J_{n+1} = j, T_{n+1} ≤ t | J_n = i, T_n = s}.

Definition 1.1. The process (J_n, S_n = T_0 + ··· + T_n), n = 0, 1, ..., is called a Markov renewal process.

We assume that the following condition, excluding instant renewals, holds:

A: Q_ij(0) = 0, i, j ∈ X.

Let N(t) = max(n : S_n ≤ t), t ≥ 0, be the Markov renewal counting process (N(t) counts the number of renewals in the interval [0, t]). Condition A guarantees that N(t) < ∞ with probability 1 for every t ≥ 0.

Definition 1.2. The process J(t) = J_{N(t)}, t ≥ 0, is called a semi-Markov process.

(1) Semi-Markov processes were introduced independently by Takács, Lévy, and Smith in the mid-1950s, as a natural generalization of Markov chains.
(2) If Q_ij(t) = p_ij (1 - e^{-λ_i t}), i, j ∈ X, t ≥ 0, then the SMP J(t) is a continuous time Markov chain.

(3) If Q_ij(t) = p_ij I(t ≥ 1), i, j ∈ X, t ≥ 0, then the SMP J(t), t = 0, 1, ..., is a discrete time Markov chain.

(4) By the definition, the random variables J_n = J(S_n), n = 0, 1, ..., are the states of the SMP J(t) at the moments of sequential jumps; S_n, n = 0, 1, ..., are the moments of sequential jumps; and T_n, n = 1, 2, ..., are the inter-jump times (also referred to as sojourn times).

Lemma 1.1. The random sequence J_n, n = 0, 1, ..., is a discrete time homogeneous Markov chain (the imbedded MC) with the phase space X, initial distribution p_i = P{J_0 = i}, i ∈ X, and transition probabilities

  p_ij = Q_ij(∞) = P{J_{n+1} = j | J_n = i}, i, j ∈ X. (1)

(5) The transition probabilities Q_ij(t) can always be represented in the form Q_ij(t) = p_ij F_ij(t), where F_ij(t) = Q_ij(t)/p_ij if p_ij > 0, while F_ij(t) can be an arbitrary distribution function such that F_ij(0) = 0 if p_ij = 0.

Lemma 1.2. The random sequence T_n, n = 0, 1, ..., is a sequence of conditionally independent random variables with respect to the MC J_n, n = 0, 1, ..., that is, for every t_1, t_2, ... ≥ 0, i_0, i_1, ... ∈ X and n = 1, 2, ...,

  P{T_k ≤ t_k, k = 1, ..., n | J_k = i_k, k = 0, ..., n} = F_{i_0 i_1}(t_1) ··· F_{i_{n-1} i_n}(t_n). (2)
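The representation Q_ij(t) = p_ij F_ij(t) together with Lemmas 1.1 and 1.2 suggests a direct way to simulate a semi-Markov path: draw the next state of the imbedded chain from the row p_i·, then draw the sojourn time from the distribution attached to that transition. The sketch below is illustrative only; it takes exponential F_ij (so that, by remark (2), J(t) is in fact a continuous time Markov chain), and all names and parameter values are hypothetical.

```python
import random

def simulate_smp(p, rate, j0, t_max, rng):
    """Simulate one path of a semi-Markov process on X = {0, ..., m-1}.

    p[i][j]    : transition probabilities of the imbedded chain J_n
    rate[i][j] : rate of the exponential sojourn law F_ij (illustrative choice)
    Returns the jump moments S_n and the states J_n = J(S_n) up to time t_max.
    """
    jumps, states = [0.0], [j0]
    t, i = 0.0, j0
    while t < t_max:
        # sample J_{n+1} from the row p[i] of the imbedded Markov chain
        u, acc, j = rng.random(), 0.0, 0
        for k, pk in enumerate(p[i]):
            acc += pk
            if u <= acc:
                j = k
                break
        # the sojourn time T_{n+1} has the distribution F_ij (here exponential)
        t += rng.expovariate(rate[i][j])
        jumps.append(t)
        states.append(j)
        i = j
    return jumps, states

def J(t, jumps, states):
    """Value J(t) = J_{N(t)} of the semi-Markov process at time t."""
    n = max(k for k in range(len(jumps)) if jumps[k] <= t)
    return states[n]

rng = random.Random(42)
jumps, states = simulate_smp([[0.2, 0.8], [0.5, 0.5]],
                             [[1.0, 2.0], [3.0, 1.5]], 0, 10.0, rng)
```

The same skeleton works for any family F_ij for which sampling is available; only the `rng.expovariate` line changes.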
(6) Let us introduce the process T(t) = t - S_{N(t)}, t ≥ 0 (T(t) measures the time between the last jump of the SMP J(t) before the moment t and the moment t itself).

Theorem 1.2. The process (J(t), T(t)), t ≥ 0, is a continuous time homogeneous Markov process with the phase space X × [0, ∞).

1.2 Transition probabilities for SMP

(7) Let us introduce the transition probabilities p_ij(t) = P{J(t) = j | J(0) = i}, i, j ∈ X, t ≥ 0. Denote also F_i(t) = ∑_{l ∈ X} Q_il(t), i ∈ X.

Theorem 1.1. The transition probabilities p_ij(t) satisfy the following system of Markov renewal equations: for every j ∈ X,

  p_ij(t) = (1 - F_i(t)) I(i = j) + ∑_{l ∈ X} ∫_0^t p_lj(t - s) Q_il(ds), t ≥ 0, i ∈ X. (3)

This system has a unique solution in the class of all measurable, non-negative, bounded functions, if condition A holds.

(a) One should write down the formula of total probability using as hypotheses the state of the SMP J(t) at the moment S_1 of the first jump.

(b) Take into account that {J(t) = j, S_1 > t} = {J(0) = j, S_1 > t}.
(c) The proof of uniqueness of the solution is analogous to that for the renewal equation.

(8) Denote

  π_ij(s) = ∫_0^∞ e^{-st} p_ij(t) dt,  φ_ij(s) = ∫_0^∞ e^{-st} Q_ij(dt),  ψ_i(s) = ∫_0^∞ e^{-st} (1 - F_i(t)) dt,  s ≥ 0.

Then the system of equations (3) is equivalent to the following family of systems of linear equations: for every j ∈ X,

  π_ij(s) = ψ_i(s) I(j = i) + ∑_{l ∈ X} φ_il(s) π_lj(s), i ∈ X, s ≥ 0. (4)

1.3 Hitting times and semi-Markov renewal equations

(7) Let us introduce a random functional, the first hitting time of a state j ∈ X by the SMP J(t),

  V_j = inf(t > S_1 : J(t) = j). (5)

We assume that the following condition holds:

B: X is one class of communicating states for the imbedded Markov chain J_n.

(8) Let us introduce the conditional distribution functions G_ij(t) = P{V_j ≤ t | J(0) = i}, i, j ∈ X, t ≥ 0.

Theorem 1.3. The distribution functions G_ij(t) satisfy the following system of Markov renewal equations: for every j ∈ X,

  G_ij(t) = Q_ij(t) + ∑_{l ≠ j} ∫_0^t G_lj(t - s) Q_il(ds), t ≥ 0, i ∈ X. (6)
(a) Analogously to the proof of Theorem 1.1, one should write down the formula of total probability using as hypotheses the state of the SMP J(t) at the moment S_1 of the first jump.

(b) Take into consideration that {J(t) = j, S_1 > t} = {J(0) = j, S_1 > t}.

(c) The proof of uniqueness of the solution is analogous to that for the renewal equation.

(8) The following stochastic relation, which follows from the Markov property of the Markov renewal process (J_n, S_n), is very useful in semi-Markov computations:

  V_ij =_d  T_ij,         with prob. p_ij,
            T_il + V_lj,  with prob. p_il, l ≠ j,    i ∈ X, (7)

where:

(a) V_ij is a random variable with the distribution function G_ij(t) for every i ∈ X;

(b) T_il is a random variable with the distribution function F_il(t) for every i, l ∈ X;

(c) the random variables T_il and V_lj are independent for every l ∈ X.

(9) Computation of the distribution functions of the left and right hand sides in (7) yields the equations (6).

(10) Denote m_il = ∫_0^∞ t F_il(dt), m_i = ∑_{l ∈ X} p_il m_il = E_i S_1, i, l ∈ X. The following theorem can be obtained with the use of relation (7).
Theorem 1.4. If m_i < ∞, i ∈ X, then the expectations M_ij = ∫_0^∞ t G_ij(dt) < ∞, i, j ∈ X, and they are the unique solution of the following system of linear equations: for every j ∈ X,

  M_ij = m_ij p_ij + ∑_{l ≠ j} p_il m_il + ∑_{l ≠ j} p_il M_lj = m_i + ∑_{l ≠ j} p_il M_lj, i ∈ X. (8)

Theorem 1.5. The transition probabilities p_ij(t) satisfy the following system of Markov renewal equations: for every j ∈ X,

  p_ij(t) = (1 - F_i(t)) I(i = j) + ∫_0^t p_jj(t - s) G_ij(ds), t ≥ 0, i ∈ X. (9)

This system has a unique solution if condition A holds.

(a) One should write down the formula of total probability using as hypotheses the possible states of the SMP J(t) at the moment V_j.

(b) Take into consideration that {J(t) = j, V_j > t} = {J(0) = j, V_j > t}.

(c) The proof of uniqueness of the solution is analogous to that for the renewal equation.

(11) Note that for i = j, equation (9) is a renewal equation for the probabilities p_jj(t), while for i ≠ j, equation (9) is just a relation which expresses the probability p_ij(t) via the probabilities p_jj(s), s ≤ t.
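For a fixed target state j, the system (8) is the finite linear system (I - P^(j)) M_·j = m, where P^(j) is the imbedded transition matrix with its j-th column set to zero. A minimal pure-Python sketch follows; the two-state data and the toy Gaussian elimination are illustrative, and any linear solver would do.

```python
def hitting_time_means(p, m, j):
    """Solve M_ij = m_i + sum_{l != j} p_il M_lj  (system (8)) for fixed j.

    p : imbedded chain transition matrix, m : mean sojourn times m_i.
    Returns the vector (M_ij, i in X).
    """
    n = len(p)
    # build (I - P^(j)), where column j of P is replaced by zeros
    a = [[(1.0 if r == c else 0.0) - (p[r][c] if c != j else 0.0)
          for c in range(n)] for r in range(n)]
    b = list(m)
    # Gaussian elimination with partial pivoting (small systems only)
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(a[r][col]))
        a[col], a[piv] = a[piv], a[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, n):
            f = a[r][col] / a[col][col]
            for c in range(col, n):
                a[r][c] -= f * a[col][c]
            b[r] -= f * b[col]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        s = sum(a[r][c] * x[c] for c in range(r + 1, n))
        x[r] = (b[r] - s) / a[r][r]
    return x

# hypothetical two-state example
p = [[0.3, 0.7], [0.6, 0.4]]
m = [2.0, 1.0]
M_to_1 = hitting_time_means(p, m, 1)
```

For this two-state example, (8) collapses to M_01 = m_0/(1 - p_00) and M_11 = m_1 + p_10 M_01, which the solver reproduces.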
2. Processes with semi-Markov modulation (PSMM)

2.1 M/G type queuing systems

(12) Let us consider a standard M/G queuing system with a bounded queue buffer. This system has one server and a bounded queue buffer of size N ≥ 2 (including the place in the server). It functions in the following way:

(a) the input flow of customers is a standard Poisson flow with a parameter λ > 0;

(b) customers are placed in the queue buffer in the order of their arrival in the system;

(c) a new customer arriving in the system immediately starts to be served if the queue buffer is empty, is placed in the queue buffer if the number of customers in the system is less than N, or leaves the system if the queue buffer is full, i.e. the number of customers in the system equals N;

(d) the service times for different customers are independent random variables with the distribution function G(t);

(e) the arrival process and the service processes are independent.

(13) Let us define S_n as the sequential moments of the beginning of service for customers, or of waiting time for an empty system; J_n as the number of customers in the system at the moments S_n; and T_n = S_n - S_{n-1}. Due to the above assumptions (a)-(e), (J_n, S_n) is a Markov renewal process with the phase space {0, 1, ..., N - 1} × [0, ∞) and transition probabilities given by the following relation,
  Q_ij(t) = P{J_{n+1} = j, T_{n+1} ≤ t | J_n = i} (10)

    = 1 - e^{-λt},                                               if i = 0, j = 1,
    = 0,                                                         if i = 0, 1 < j ≤ N - 1,
    = ∫_0^t ((λs)^{j-i+1}/(j-i+1)!) e^{-λs} G(ds),               if 1 ≤ i ≤ N - 1, i - 1 ≤ j ≤ N - 2,
    = ∫_0^t (λ^{N-i} s^{N-i-1}/(N-i-1)!) e^{-λs} (1 - G(s)) ds,  if 1 ≤ i ≤ N - 1, j = N - 1.

(14) The corresponding SMP J(t) represents the number of customers in the system at the last moment before t of the beginning of service, or of waiting time for an empty system.

(15) However, the SMP J(t) does not give the number of customers in the system at the moment t. This can be achieved by introducing the corresponding process with semi-Markov modulation, which describes the dynamics of the queues in the system between the moments S_n.

2.2 Definition of PSMM

Let <Y_{n,i}(t), t ≥ 0, J_{n,i}, T_{n,i}>, n = 1, 2, ..., i ∈ X = {1, ..., m}, and J_0 be a family of stochastic functions defined on a probability space <Ω, F, P> such that:

(a) Y_{n,i}(t), t ≥ 0, are measurable processes with a measurable phase space Y (with the σ-algebra of measurable sets B_Y);

(b) J_{n,i} are random variables taking values in the set X;

(c) T_{n,i} are positive random variables;

(d) J_0 is a random variable taking values in X.
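To make (10) concrete, take exponential service, G(t) = 1 - e^{-µt} (an assumption made purely for illustration; it is not required by the model). The integrals in (10) then become elementary: with β = λ/(λ + µ), the middle case evaluates to (1 - β) β^{j-i+1} and the last case to β^{N-i}, so the imbedded matrix p_ij = Q_ij(∞) can be tabulated directly.

```python
def embedded_matrix(lam, mu, N):
    """Imbedded chain p_ij = Q_ij(infinity) for the M/G queue of (13),
    specialized to exponential service so that the integrals in (10)
    are elementary.  States are {0, 1, ..., N-1}."""
    beta = lam / (lam + mu)   # P{an arrival precedes the service end}
    alpha = 1.0 - beta
    p = [[0.0] * N for _ in range(N)]
    p[0][1] = 1.0             # empty system: wait for one arrival
    for i in range(1, N):
        for j in range(i - 1, N - 1):
            # j - i + 1 Poisson arrivals during a completed service
            p[i][j] = alpha * beta ** (j - i + 1)
        # the buffer fills up (N - i arrivals) before the service ends
        p[i][N - 1] = beta ** (N - i)
    return p

P = embedded_matrix(lam=1.0, mu=2.0, N=4)
```

Each row sums to one, as it must for the imbedded Markov chain of Lemma 1.1.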
We assume that the following condition holds:

C: (a) <Y_{n,i}(t), t ≥ 0, J_{n,i}, T_{n,i}>, n = 1, 2, ..., i ∈ X, and J_0 form a family of independent random functions;

(b) the joint finite dimensional distributions

  P{Y_{n,i}(t_r) ∈ A_{r,i}, J_{n,i} = j_i, T_{n,i} ≤ s_i, r = 1, ..., k, i ∈ X}, A_{r,i} ∈ B_Y, t_r ≥ 0, s_i ≥ 0, r = 1, ..., k, i ∈ X, k ≥ 1,

do not depend on n = 1, 2, ... (but can depend on i ∈ X).

Let us define recursively, for n = 1, 2, ...,

  J_n = J_{n, J_{n-1}}, T_n = T_{n, J_{n-1}}, Y_n(t) = Y_{n, J_{n-1}}(t), t ≥ 0. (11)

Let also S_n = T_1 + ··· + T_n, n = 0, 1, ... (S_0 = 0), N(t) = max(n : S_n ≤ t), t ≥ 0, and T(t) = t - S_{N(t)}, t ≥ 0, be, respectively, the sequential moments of renewals, the number of renewals in the interval [0, t], and the waiting time (time between the last renewal before t and the moment t) for the Markov renewal process (J_n, S_n).

Definition 1.3. The process Y(t) = Y_{N(t)+1}(T(t)), t ≥ 0, is called a process with semi-Markov modulation (modulated by the SMP J(t) = J_{N(t)}, t ≥ 0).

(16) The SMP J(t) is a particular case of the PSMM Y(t), with Y_{n,i}(t) ≡ i, t ≥ 0. In this case, Y(t) ≡ J(t), t ≥ 0.

Example. Let us return to the M/G queuing system with a bounded queue buffer. In this case, the independent random functions <Y_{n,i}(t), t ≥ 0, J_{n,i}, T_{n,i}>, n =
1, 2, ..., i ∈ X = {0, ..., N - 1}, and J_0 can be defined in the following way for every n = 1, 2, ...:

(e) If i = 0 then: Y_{n,0}(t) ≡ 0, t ≥ 0; T_{n,0} is an exponentially distributed r.v. with parameter λ; J_{n,0} = 1.

(f) If 1 ≤ i ≤ N - 1 then: Y_{n,i}(t) = min(i + N_{n,i}(t), N), t ≥ 0, where N_{n,i}(t), t ≥ 0, is a Poisson process with parameter λ; T_{n,i} is a r.v. with the distribution function G(t); J_{n,i} = Y_{n,i}(T_{n,i}) - 1 (the served customer departs at the service completion).

2.3 Regeneration properties of PSMM

(17) Let us introduce the transition probabilities p_{ij,A}(t) = P{Y(t) ∈ A, J(t) = j | J_0 = i}, i, j ∈ X, A ∈ B_Y, t ≥ 0. Let us also denote q_{ij,A}(t) = P{Y_{1,i}(t) ∈ A, T_{1,i} > t} I(i = j) and Q_ij(t) = P{J_{1,i} = j, T_{1,i} ≤ t}.

Theorem 1.6. The transition probabilities p_{ij,A}(t) satisfy the following system of Markov renewal equations: for every j ∈ X and A ∈ B_Y,

  p_{ij,A}(t) = q_{ij,A}(t) + ∑_{l ∈ X} ∫_0^t p_{lj,A}(t - s) Q_il(ds), t ≥ 0, i ∈ X. (12)
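The example (e)-(f) can be turned into a simulation sketch of the construction (11). Two caveats: the service law G is taken exponential purely for illustration, and the served customer is subtracted at the service completion so that the modulating chain stays in {0, ..., N - 1}, the reading consistent with the transition probabilities (10). All names are hypothetical.

```python
import random

def simulate_queue_psmm(lam, mu, N, j0, n_steps, rng):
    """Sketch of the PSMM construction (11) for the bounded-buffer queue.

    Service is taken exponential(mu) purely for illustration.  Returns the
    renewal moments S_n, the modulating states J_n, and, for each cycle,
    the arrival epochs that drive t -> Y_n(t) within that cycle.
    """
    S, J = [0.0], [j0]
    cycles = []   # per-cycle arrival epochs (relative to the cycle start)
    for _ in range(n_steps):
        i = J[-1]
        if i == 0:
            # empty system: Y ≡ 0 until the first arrival, then state 1
            T = rng.expovariate(lam)
            arrivals, nxt = [], 1
        else:
            T = rng.expovariate(mu)   # service time, distribution G
            # Poisson(lam) arrival epochs during the service
            arrivals, t = [], rng.expovariate(lam)
            while t < T:
                arrivals.append(t)
                t += rng.expovariate(lam)
            y_end = min(i + len(arrivals), N)
            # the served customer departs at T: J_{n+1} = Y_n(T) - 1
            nxt = y_end - 1
        S.append(S[-1] + T)
        J.append(nxt)
        cycles.append(arrivals)
    return S, J, cycles

rng = random.Random(7)
S, J, cycles = simulate_queue_psmm(1.0, 2.0, 4, 0, 200, rng)
```

Within a cycle starting in state i ≥ 1, Y(t) is recovered as min(i + #{arrival epochs ≤ t}, N), exactly the process in (f).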
(a) One should write down the formula of total probability using as hypotheses the state of the SMP J(t) at the moment S_1.

(b) Take into account condition C and the corresponding Markov regeneration property determined by this condition and relation (11).

(c) The proof of uniqueness of the solution is analogous to that for the renewal equation.

Theorem 1.7. The PSMM Y(t) is a regenerative process, with regeneration times the times of return of the modulating SMP J(t) to some fixed state j ∈ X.

Theorem 1.8. The transition probabilities p_{ij,A}(t) satisfy the following system of Markov renewal equations: for every j ∈ X and A ∈ B_Y,

  p_{ij,A}(t) = q_{ij,A}(t) + ∫_0^t p_{jj,A}(t - s) G_ij(ds), t ≥ 0, i ∈ X. (13)

3. Miscellaneous results

3.1 Ergodic theorems

Theorem 1.9. Let us assume that (a) X is one class of communicating states for the MC J_n; (b) m_i < ∞, i ∈ X; (c) at least one distribution function F_i(t) is non-arithmetic; (d) q_{ij,A}(t) is almost everywhere continuous with respect to Lebesgue measure
on [0, ∞). Then, for i, j ∈ X, A ∈ B_Y,

  p_{ij,A}(t) → π_{j,A} = (∫_0^∞ q_{jj,A}(s) ds) / M_jj as t → ∞. (14)

(18) If A = Y then p_{ij,Y}(t) = p_ij(t) = P{J(t) = j | J(0) = i}, i, j ∈ X. Let also ρ_l, l ∈ X, be the stationary distribution of the imbedded Markov chain J_n.

Theorem 1.10. Let us assume that (a) X is one class of communicating states for the MC J_n; (b) m_i < ∞, i ∈ X; (c) at least one distribution function F_i(t) is non-arithmetic. Then, for i, j ∈ X,

  p_ij(t) → π_j = m_j / M_jj = ρ_j m_j / ∑_{l ∈ X} ρ_l m_l as t → ∞. (15)

(a) In this case, ∫_0^∞ q_{jj,Y}(s) ds = ∫_0^∞ (1 - F_j(s)) ds = m_j.

(b) Also, it is known that M_jj = (1/ρ_j) ∑_{l ∈ X} ρ_l m_l.

(c) Thus, relation (14) takes the form of relation (15).

3.2 Coupling and ergodic theorems

(18) The variational distance between two probability measures P_1(A) and P_2(A) defined on the σ-algebra B of measurable subsets of some space Z is defined as

  d(P_1(·), P_2(·)) = sup_{A ∈ B} |P_1(A) - P_2(A)|. (16)
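Relation (15) reduces the stationary distribution of the SMP to two finite computations: the stationary distribution ρ of the imbedded chain and the mean sojourn times m_i. A sketch follows; power iteration is used for ρ, which is adequate for small irreducible aperiodic examples, and the two-state data are hypothetical.

```python
def smp_stationary(p, m, iters=2000):
    """pi_j = rho_j m_j / sum_l rho_l m_l  (relation (15)).

    rho is the stationary distribution of the imbedded chain J_n,
    obtained here by power iteration.
    """
    n = len(p)
    rho = [1.0 / n] * n
    for _ in range(iters):
        rho = [sum(rho[i] * p[i][j] for i in range(n)) for j in range(n)]
    w = [rho[j] * m[j] for j in range(n)]   # weights rho_j m_j
    z = sum(w)
    return [x / z for x in w]

# hypothetical two-state SMP: rho = (p_10, p_01) / (p_01 + p_10)
p = [[0.0, 1.0], [0.5, 0.5]]
m = [2.0, 1.0]
pi = smp_stationary(p, m)
```

In the example ρ = (1/3, 2/3) and m = (2, 1), so both weights ρ_j m_j equal 2/3 and π = (1/2, 1/2): the SMP splits its time evenly between the two states even though the imbedded chain visits them unevenly.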
Let L(P_1, P_2) be the class of all random vectors (Z_1, Z_2) defined on some probability space <Ω, F, P> and taking values in the space Z × Z, such that P{Z_i ∈ A} = P_i(A), A ∈ B, i = 1, 2.

Lemma 1.3. The maximal coincidence probability is

  P* = sup_{(Z_1, Z_2) ∈ L(P_1, P_2)} P{Z_1 = Z_2} = 1 - d(P_1(·), P_2(·)). (17)

(19) Let us assume that P_i(A) = ∫_A p_i(z) ν(dz), where p_i(z) is a non-negative measurable function, and ν is a σ-finite measure on B.

Lemma 1.4. Under the above assumption,

  P* = 1 - d(P_1(·), P_2(·)) = ∫_Z min(p_1(z), p_2(z)) ν(dz), (18)

and the vector (Z_1*, Z_2*) for which the coincidence probability takes its maximal value has the following structure:

  (Z_1*, Z_2*) = (Z, Z) with prob. P*, or (Z_1', Z_2') with prob. 1 - P*, (19)

where Z is a random variable with the distribution P(A) = (1/P*) ∫_A min(p_1(z), p_2(z)) ν(dz), while (Z_1', Z_2') is a random vector with independent components, which have the distributions P_i'(A) = (1/(1 - P*)) ∫_A (p_i(z) - min(p_1(z), p_2(z))) ν(dz), i = 1, 2.

(20) Let Z_n, n = 0, 1, ..., be a discrete time homogeneous Markov chain with a phase space Z, initial distribution µ(A) and transition probabilities P(z, A).
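For discrete distributions (densities with respect to counting measure), Lemma 1.4 reads P* = ∑_z min(p_1(z), p_2(z)) = 1 - d(P_1, P_2), and the optimal pair (19) can be sampled directly: with probability P* draw once from the normalized common part, otherwise draw the two components independently from the normalized residuals. A sketch with hypothetical distributions:

```python
import random

def total_variation(p1, p2):
    """d(P1, P2) = sup_A |P1(A) - P2(A)| = 1 - sum_z min(p1, p2)
    for densities w.r.t. counting measure (Lemma 1.4)."""
    return 1.0 - sum(min(a, b) for a, b in zip(p1, p2))

def maximal_coupling_sampler(p1, p2, rng):
    """Draw (Z1, Z2) with the structure (19): equal with probability
    P* = 1 - d, otherwise independent draws from the residual densities."""
    common = [min(a, b) for a, b in zip(p1, p2)]
    p_star = sum(common)

    def draw(weights):
        u, acc = rng.random() * sum(weights), 0.0
        for z, w in enumerate(weights):
            acc += w
            if u <= acc:
                return z
        return len(weights) - 1

    if rng.random() <= p_star:
        z = draw(common)          # common part: components coincide
        return z, z
    r1 = [a - c for a, c in zip(p1, common)]
    r2 = [b - c for b, c in zip(p2, common)]
    return draw(r1), draw(r2)     # independent residual draws

p1, p2 = [0.5, 0.3, 0.2], [0.2, 0.3, 0.5]
d = total_variation(p1, p2)
```

Here d = 0.3, so any coupling of p1 and p2 coincides with probability at most 0.7, and this sampler attains that bound because the two residual densities have disjoint supports.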
The following condition is known as the condition of exponential ergodicity:

D: d(P(x, ·), P(y, ·)) ≤ 1 - ρ < 1, x, y ∈ Z, for some ρ > 0.

Lemma 1.5. Let condition D hold. Then, for any initial distributions µ_1 and µ_2, it is possible to construct a Markov chain (Z_n^1, Z_n^2), n = 0, 1, ..., with the phase space Z × Z, initial distribution µ_1 × µ_2, and transition probabilities P((x, y), A × B) such that:

(a) P((x, y), A × Z) = P(x, A) and P((x, y), Z × B) = P(y, B);

(b) Z_n^1 and Z_n^2 are themselves homogeneous Markov chains with initial distributions, respectively, µ_1 and µ_2, and transition probabilities P(x, A);

(c) P{Z_1^1 = Z_1^2 | Z_0^1 = x, Z_0^2 = y} = 1 - d(P(x, ·), P(y, ·)), x, y ∈ Z.

(21) Let us now define the coupling time

  T = min(n : Z_n^1 = Z_n^2), (20)

and define the coupling operation, which defines a new Markov chain

  (Z̃_n^1, Z̃_n^2) = (Z_n^1, Z_n^2) for n < T, and (Z_n^1, Z_n^1) for n ≥ T. (21)

Lemma 1.6. (Z̃_n^1, Z̃_n^2) is a homogeneous Markov chain with the phase space Z × Z, the initial distribution µ_1 × µ_2, and transition probabilities P̃((x, y), A × B),

  P̃((x, y), A × B) = P((x, y), A × B) for x ≠ y, and P(x, A ∩ B) for x = y. (22)
(22) Lemmas 1.5 and 1.6 imply that Z̃_n^1 and Z̃_n^2 are themselves homogeneous Markov chains with the phase space Z, initial distributions, respectively, µ_1 and µ_2, and transition probabilities P(x, A).

(23) Let us introduce the probabilities, for A ∈ B, n = 0, 1, ..., i = 1, 2,

  P_{µ_i}^{(n)}(A) = P{Z_n^i ∈ A} = P{Z̃_n^i ∈ A}.

Theorem 1.11. Let condition D hold. Then the following coupling estimate takes place, for every A ∈ B and n = 0, 1, ...:

  |P_{µ_1}^{(n)}(A) - P_{µ_2}^{(n)}(A)| ≤ P{T > n} ≤ (1 - ρ)^n. (23)

(a) The following relation for random events holds, for n = 0, 1, ...:

  {Z̃_n^1 ∈ A} Δ {Z̃_n^2 ∈ A} ⊆ {T > n}. (24)

(b) This relation implies that, for A ∈ B, n = 0, 1, ...,

  |P{Z̃_n^1 ∈ A} - P{Z̃_n^2 ∈ A}| ≤ P{{Z̃_n^1 ∈ A} Δ {Z̃_n^2 ∈ A}} ≤ P{T > n}. (25)

(c) Also, for n = 0, 1, ...,

  P{T > n} = P{Z̃_k^1 ≠ Z̃_k^2, k ≤ n}
           = ∫∫_{Z×Z} P{Z̃_n^1 ≠ Z̃_n^2 | Z̃_{n-1}^1 = x, Z̃_{n-1}^2 = y} P{Z̃_{n-1}^1 ∈ dx, Z̃_{n-1}^2 ∈ dy, Z̃_k^1 ≠ Z̃_k^2, k ≤ n - 1}
           ≤ (1 - ρ) P{Z̃_k^1 ≠ Z̃_k^2, k ≤ n - 1} ≤ (1 - ρ)^n. (26)
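The estimate (23) can be checked exactly on a small chain: compute 1 - ρ = max_{x,y} d(P(x, ·), P(y, ·)) as in condition D, iterate the two marginal distributions, and compare their variational distance with (1 - ρ)^n. The two-state matrix below is hypothetical; for it the bound is in fact attained, since the distance contracts by exactly the factor 1 - ρ at each step.

```python
def step(mu, P):
    """One step of the distribution: (mu P)(j) = sum_i mu_i P(i, j)."""
    n = len(P)
    return [sum(mu[i] * P[i][j] for i in range(n)) for j in range(n)]

def tv(p, q):
    """Variational distance (16) for distributions on a finite set."""
    return 0.5 * sum(abs(a - b) for a, b in zip(p, q))

P = [[0.7, 0.3], [0.4, 0.6]]
# condition D: d(P(x, .), P(y, .)) <= 1 - rho for all x, y
one_minus_rho = max(tv(P[x], P[y]) for x in range(2) for y in range(2))
mu1, mu2 = [1.0, 0.0], [0.0, 1.0]
bounds_hold = []
for n in range(1, 11):
    mu1, mu2 = step(mu1, P), step(mu2, P)
    bounds_hold.append(tv(mu1, mu2) <= one_minus_rho ** n + 1e-12)
```

Here 1 - ρ = 0.3, and the distance between the two marginal laws after n steps equals 0.3^n exactly (for a two-state chain the contraction factor is the second eigenvalue of P).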
(24) Under condition D there exists a stationary distribution π(A) for the Markov chain Z_n, which is the unique solution of the following integral equation:

  π(A) = ∫_Z π(dx) P(x, A), A ∈ B. (27)

Lemma 1.7. If we choose the initial distribution µ(A) ≡ π(A), then the Markov chain Z_n is a stationary sequence, and for any A ∈ B and n = 0, 1, ...,

  P{Z_n ∈ A} = π(A). (28)

Theorem 1.12. Let the MC Z_n have an initial distribution µ and let condition D hold. Then the following coupling estimate takes place, for any A ∈ B and n = 0, 1, ...:

  |P{Z_n ∈ A} - π(A)| ≤ (1 - ρ)^n. (29)

(a) Taking into account Theorem 1.11 and Lemma 1.7, we get

  |P_µ^{(n)}(A) - P_π^{(n)}(A)| = |P{Z_n ∈ A} - π(A)| ≤ P{T > n} ≤ (1 - ρ)^n. (30)

(25) The coupling method can be applied to discrete and continuous time Markov chains, semi-Markov processes, processes with semi-Markov modulation, etc.
Some references are:

[1] Lindvall, T. Lectures on the Coupling Method. Wiley.

[2] Silvestrov, D. S. Coupling for Markov renewal processes and the rate of convergence in ergodic theorems for processes with semi-Markov switchings. Acta Applicandae Mathematicae, 34, 1994.

LN problems

1.1. Try to prove Lemmas 1.1 and 1.2. (Hint: use the Markov property of the Markov renewal process (J_n, S_n).)

1.2. Prove that the solution of the system (3) is unique in the class of all measurable, non-negative, bounded functions, if condition A holds.

1.3. Compute Laplace transforms in the equation (3) and get the systems of linear equations (4).

1.4. Using the stochastic relations (7), write down the system of linear equations for the moments E V_i^n, i ∈ X.

1.5. Compute the functions q_{ij,{l}}(t) for the PSMM Y(t) describing the functioning of the M/G queuing system with a bounded queue buffer.

1.6. Try to prove Lemma 1.5.

1.7. Prove Lemma 1.6.
More informationContents Preface The Exponential Distribution and the Poisson Process Introduction to Renewal Theory
Contents Preface... v 1 The Exponential Distribution and the Poisson Process... 1 1.1 Introduction... 1 1.2 The Density, the Distribution, the Tail, and the Hazard Functions... 2 1.2.1 The Hazard Function
More informationPart II: continuous time Markov chain (CTMC)
Part II: continuous time Markov chain (CTMC) Continuous time discrete state Markov process Definition (Markovian property) X(t) is a CTMC, if for any n and any sequence t 1
More informationConvergence Rates for Renewal Sequences
Convergence Rates for Renewal Sequences M. C. Spruill School of Mathematics Georgia Institute of Technology Atlanta, Ga. USA January 2002 ABSTRACT The precise rate of geometric convergence of nonhomogeneous
More informationUNIVERSITY OF YORK. MSc Examinations 2004 MATHEMATICS Networks. Time Allowed: 3 hours.
UNIVERSITY OF YORK MSc Examinations 2004 MATHEMATICS Networks Time Allowed: 3 hours. Answer 4 questions. Standard calculators will be provided but should be unnecessary. 1 Turn over 2 continued on next
More informationTHIELE CENTRE. The M/M/1 queue with inventory, lost sale and general lead times. Mohammad Saffari, Søren Asmussen and Rasoul Haji
THIELE CENTRE for applied mathematics in natural science The M/M/1 queue with inventory, lost sale and general lead times Mohammad Saffari, Søren Asmussen and Rasoul Haji Research Report No. 11 September
More informationChapter 3: Markov Processes First hitting times
Chapter 3: Markov Processes First hitting times L. Breuer University of Kent, UK November 3, 2010 Example: M/M/c/c queue Arrivals: Poisson process with rate λ > 0 Example: M/M/c/c queue Arrivals: Poisson
More informationTCOM 501: Networking Theory & Fundamentals. Lecture 6 February 19, 2003 Prof. Yannis A. Korilis
TCOM 50: Networking Theory & Fundamentals Lecture 6 February 9, 003 Prof. Yannis A. Korilis 6- Topics Time-Reversal of Markov Chains Reversibility Truncating a Reversible Markov Chain Burke s Theorem Queues
More informationExercises Solutions. Automation IEA, LTH. Chapter 2 Manufacturing and process systems. Chapter 5 Discrete manufacturing problems
Exercises Solutions Note, that we have not formulated the answers for all the review questions. You will find the answers for many questions by reading and reflecting about the text in the book. Chapter
More informationLIMITS FOR QUEUES AS THE WAITING ROOM GROWS. Bell Communications Research AT&T Bell Laboratories Red Bank, NJ Murray Hill, NJ 07974
LIMITS FOR QUEUES AS THE WAITING ROOM GROWS by Daniel P. Heyman Ward Whitt Bell Communications Research AT&T Bell Laboratories Red Bank, NJ 07701 Murray Hill, NJ 07974 May 11, 1988 ABSTRACT We study the
More informationContinuous time Markov chains
Continuous time Markov chains Alejandro Ribeiro Dept. of Electrical and Systems Engineering University of Pennsylvania aribeiro@seas.upenn.edu http://www.seas.upenn.edu/users/~aribeiro/ October 16, 2017
More informationRecap. Probability, stochastic processes, Markov chains. ELEC-C7210 Modeling and analysis of communication networks
Recap Probability, stochastic processes, Markov chains ELEC-C7210 Modeling and analysis of communication networks 1 Recap: Probability theory important distributions Discrete distributions Geometric distribution
More information1 Delayed Renewal Processes: Exploiting Laplace Transforms
IEOR 6711: Stochastic Models I Professor Whitt, Tuesday, October 22, 213 Renewal Theory: Proof of Blackwell s theorem 1 Delayed Renewal Processes: Exploiting Laplace Transforms The proof of Blackwell s
More informationOverload Analysis of the PH/PH/1/K Queue and the Queue of M/G/1/K Type with Very Large K
Overload Analysis of the PH/PH/1/K Queue and the Queue of M/G/1/K Type with Very Large K Attahiru Sule Alfa Department of Mechanical and Industrial Engineering University of Manitoba Winnipeg, Manitoba
More informationExercises Stochastic Performance Modelling. Hamilton Institute, Summer 2010
Exercises Stochastic Performance Modelling Hamilton Institute, Summer Instruction Exercise Let X be a non-negative random variable with E[X ]
More informationMean-field dual of cooperative reproduction
The mean-field dual of systems with cooperative reproduction joint with Tibor Mach (Prague) A. Sturm (Göttingen) Friday, July 6th, 2018 Poisson construction of Markov processes Let (X t ) t 0 be a continuous-time
More informationPerformance Modelling of Computer Systems
Performance Modelling of Computer Systems Mirco Tribastone Institut für Informatik Ludwig-Maximilians-Universität München Fundamentals of Queueing Theory Tribastone (IFI LMU) Performance Modelling of Computer
More informationSMSTC (2007/08) Probability.
SMSTC (27/8) Probability www.smstc.ac.uk Contents 12 Markov chains in continuous time 12 1 12.1 Markov property and the Kolmogorov equations.................... 12 2 12.1.1 Finite state space.................................
More informationStability and Rare Events in Stochastic Models Sergey Foss Heriot-Watt University, Edinburgh and Institute of Mathematics, Novosibirsk
Stability and Rare Events in Stochastic Models Sergey Foss Heriot-Watt University, Edinburgh and Institute of Mathematics, Novosibirsk ANSAPW University of Queensland 8-11 July, 2013 1 Outline (I) Fluid
More informationNONLINEARLY PERTURBED STOCHASTIC PROCESSES
Theory of Stochastic Processes Vol. 14 (30), no. 3-4, 2008, pp.129-164 DMITRII SILVESTROV NONLINEARLY PERTURBED STOCHASTIC PROCESSES This paper is a survey of results presented in the recent book [25]
More informationTMA4265 Stochastic processes ST2101 Stochastic simulation and modelling
Norwegian University of Science and Technology Department of Mathematical Sciences Page of 7 English Contact during examination: Øyvind Bakke Telephone: 73 9 8 26, 99 4 673 TMA426 Stochastic processes
More information6 Continuous-Time Birth and Death Chains
6 Continuous-Time Birth and Death Chains Angela Peace Biomathematics II MATH 5355 Spring 2017 Lecture notes follow: Allen, Linda JS. An introduction to stochastic processes with applications to biology.
More informationRELATING TIME AND CUSTOMER AVERAGES FOR QUEUES USING FORWARD COUPLING FROM THE PAST
J. Appl. Prob. 45, 568 574 (28) Printed in England Applied Probability Trust 28 RELATING TIME AND CUSTOMER AVERAGES FOR QUEUES USING FORWARD COUPLING FROM THE PAST EROL A. PEKÖZ, Boston University SHELDON
More informationreversed chain is ergodic and has the same equilibrium probabilities (check that π j =
Lecture 10 Networks of queues In this lecture we shall finally get around to consider what happens when queues are part of networks (which, after all, is the topic of the course). Firstly we shall need
More information1 Poisson processes, and Compound (batch) Poisson processes
Copyright c 2007 by Karl Sigman 1 Poisson processes, and Compound (batch) Poisson processes 1.1 Point Processes Definition 1.1 A simple point process ψ = {t n : n 1} is a sequence of strictly increasing
More informationQueuing Networks: Burke s Theorem, Kleinrock s Approximation, and Jackson s Theorem. Wade Trappe
Queuing Networks: Burke s Theorem, Kleinrock s Approximation, and Jackson s Theorem Wade Trappe Lecture Overview Network of Queues Introduction Queues in Tandem roduct Form Solutions Burke s Theorem What
More informationQueueing systems. Renato Lo Cigno. Simulation and Performance Evaluation Queueing systems - Renato Lo Cigno 1
Queueing systems Renato Lo Cigno Simulation and Performance Evaluation 2014-15 Queueing systems - Renato Lo Cigno 1 Queues A Birth-Death process is well modeled by a queue Indeed queues can be used to
More informationEXAM IN COURSE TMA4265 STOCHASTIC PROCESSES Wednesday 7. August, 2013 Time: 9:00 13:00
Norges teknisk naturvitenskapelige universitet Institutt for matematiske fag Page 1 of 7 English Contact: Håkon Tjelmeland 48 22 18 96 EXAM IN COURSE TMA4265 STOCHASTIC PROCESSES Wednesday 7. August, 2013
More informationContinuous Time Markov Chains
Continuous Time Markov Chains Stochastic Processes - Lecture Notes Fatih Cavdur to accompany Introduction to Probability Models by Sheldon M. Ross Fall 2015 Outline Introduction Continuous-Time Markov
More informationClass 11 Non-Parametric Models of a Service System; GI/GI/1, GI/GI/n: Exact & Approximate Analysis.
Service Engineering Class 11 Non-Parametric Models of a Service System; GI/GI/1, GI/GI/n: Exact & Approximate Analysis. G/G/1 Queue: Virtual Waiting Time (Unfinished Work). GI/GI/1: Lindley s Equations
More informationECE 3511: Communications Networks Theory and Analysis. Fall Quarter Instructor: Prof. A. Bruce McDonald. Lecture Topic
ECE 3511: Communications Networks Theory and Analysis Fall Quarter 2002 Instructor: Prof. A. Bruce McDonald Lecture Topic Introductory Analysis of M/G/1 Queueing Systems Module Number One Steady-State
More informationA Simple Solution for the M/D/c Waiting Time Distribution
A Simple Solution for the M/D/c Waiting Time Distribution G.J.Franx, Universiteit van Amsterdam November 6, 998 Abstract A surprisingly simple and explicit expression for the waiting time distribution
More informationModelling Complex Queuing Situations with Markov Processes
Modelling Complex Queuing Situations with Markov Processes Jason Randal Thorne, School of IT, Charles Sturt Uni, NSW 2795, Australia Abstract This article comments upon some new developments in the field
More informationThe Distributions of Stopping Times For Ordinary And Compound Poisson Processes With Non-Linear Boundaries: Applications to Sequential Estimation.
The Distributions of Stopping Times For Ordinary And Compound Poisson Processes With Non-Linear Boundaries: Applications to Sequential Estimation. Binghamton University Department of Mathematical Sciences
More informationChapter 5. Continuous-Time Markov Chains. Prof. Shun-Ren Yang Department of Computer Science, National Tsing Hua University, Taiwan
Chapter 5. Continuous-Time Markov Chains Prof. Shun-Ren Yang Department of Computer Science, National Tsing Hua University, Taiwan Continuous-Time Markov Chains Consider a continuous-time stochastic process
More informationMarkov chains. Randomness and Computation. Markov chains. Markov processes
Markov chains Randomness and Computation or, Randomized Algorithms Mary Cryan School of Informatics University of Edinburgh Definition (Definition 7) A discrete-time stochastic process on the state space
More informationChapter 2. Poisson Processes
Chapter 2. Poisson Processes Prof. Ai-Chun Pang Graduate Institute of Networking and Multimedia, epartment of Computer Science and Information Engineering, National Taiwan University, Taiwan utline Introduction
More informationReading: Karlin and Taylor Ch. 5 Resnick Ch. 3. A renewal process is a generalization of the Poisson point process.
Renewal Processes Wednesday, December 16, 2015 1:02 PM Reading: Karlin and Taylor Ch. 5 Resnick Ch. 3 A renewal process is a generalization of the Poisson point process. The Poisson point process is completely
More informationMODELING WEBCHAT SERVICE CENTER WITH MANY LPS SERVERS
MODELING WEBCHAT SERVICE CENTER WITH MANY LPS SERVERS Jiheng Zhang Oct 26, 211 Model and Motivation Server Pool with multiple LPS servers LPS Server K Arrival Buffer. Model and Motivation Server Pool with
More informationE-Companion to Fully Sequential Procedures for Large-Scale Ranking-and-Selection Problems in Parallel Computing Environments
E-Companion to Fully Sequential Procedures for Large-Scale Ranking-and-Selection Problems in Parallel Computing Environments Jun Luo Antai College of Economics and Management Shanghai Jiao Tong University
More informationQueueing. Chapter Continuous Time Markov Chains 2 CHAPTER 5. QUEUEING
2 CHAPTER 5. QUEUEING Chapter 5 Queueing Systems are often modeled by automata, and discrete events are transitions from one state to another. In this chapter we want to analyze such discrete events systems.
More informationExercises. T 2T. e ita φ(t)dt.
Exercises. Set #. Construct an example of a sequence of probability measures P n on R which converge weakly to a probability measure P but so that the first moments m,n = xdp n do not converge to m = xdp.
More informationA Robust Queueing Network Analyzer Based on Indices of Dispersion
A Robust Queueing Network Analyzer Based on Indices of Dispersion Wei You (joint work with Ward Whitt) Columbia University INFORMS 2018, Phoenix November 6, 2018 1/20 Motivation Many complex service systems
More informationRandom point patterns and counting processes
Chapter 7 Random point patterns and counting processes 7.1 Random point pattern A random point pattern satunnainen pistekuvio) on an interval S R is a locally finite 1 random subset of S, defined on some
More information