Integrals for Continuous-time Markov chains


Integrals for Continuous-time Markov chains

P.K. Pollett

Abstract

This paper presents a method of evaluating the expected value of a path integral for a general Markov chain on a countable state space. We illustrate the method with reference to several models, including birth-death processes and the birth, death and catastrophe process.

1 Introduction

Let $(X(t),\, t \geq 0)$ be a continuous-time Markov chain taking values in the non-negative integers $S = \{0, 1, \dots\}$, and consider the path integral

$\Gamma_0(f) = \int_0^{\tau_0} f(X(t))\, dt,$  (1)

where $\tau_0 = \inf\{t > 0 : X(t) = 0\}$ is the first hitting time of state 0, and $f$ is a given function that maps $S$ to $[0, \infty)$. Stefanov and Wang [10] derived an explicit expression for $E_i \Gamma_0(f) := E(\Gamma_0(f) \mid X(0) = i)$, $i \geq 1$, in the case of a birth-death process, under the condition that $f$ is eventually non-decreasing. This built on earlier work of Hernández-Suárez and Castillo-Chavez [4], who considered the case $i = 1$ and $f(x) = x$. However, results from potential theory subsume these results and allow one to extend substantially the work of these authors, thus allowing the expectation of $\Gamma_0(f)$ to be evaluated explicitly for a much wider variety of models. We shall provide several illustrations, giving special attention to birth-death processes and the birth, death and catastrophe process. We shall see that, in the case of a birth-death process, the condition on $f$ imposed by Stefanov and Wang can be relaxed completely.

We will consider the more general problem of evaluating $E_i \Gamma(f)$ for the path integral

$\Gamma(f) = \int_0^{\infty} f(X(t))\, dt.$  (2)

The solution to the original problem can then be obtained by making 0 an absorbing state and setting $f(0) = 0$. As pointed out in [4], it is useful, particularly in biological contexts, to think of $f(x)$ as being the cost (or reward) per unit time of staying in state $x$. The problem, then, is to evaluate the expected total cost over the life of the process. For convenience, we shall write $f_i = f(i)$ whenever this simplifies notation.
Department of Mathematics, University of Queensland, Queensland 4072, Australia.

2 Expectation of the path integral

Let $Q = (q_{ij},\, i, j \in S)$ be the q-matrix of transition rates of the chain, assumed to be stable and conservative, so that $q_{ij}$ represents the rate of transition from state $i$ to state $j$, for $j \neq i$, and $q_{ii} = -q_i$, where $q_i := \sum_{j \neq i} q_{ij} < \infty$ represents the total rate out of state $i$. For example, in the case of a birth-death process we would have $q_{i,i+1} = \lambda_i$ and $q_{i,i-1} = \mu_i$ (with $\mu_0 = 0$), where $\lambda_i$ and $\mu_i$ are the birth and death rates of the process. It will not be necessary to assume that $Q$ is regular, so there may actually be many processes with the given set of rates. However, we will take $(X(t),\, t \geq 0)$ to be the minimal process associated with $Q$. Its transition function $P(t) = (p_{ij}(t),\, i, j \in S)$ is the minimal solution to the Kolmogorov backward equations, and has the following interpretation: $p_{ij}(t) = \Pr(X(t) = j,\, N(t) < \infty \mid X(0) = i)$, where $N(t)$ is the number of jumps of the process up to time $t$. For further technical details, see [1]. Whilst it is commonly assumed that $Q$ is regular, which is equivalent to $N(t)$ being almost surely finite for all $t$, it is often useful in biological applications, and particularly when dealing with birth-death processes, to allow for the possibility that the process might explode by performing infinitely many transitions in a finite time.

The following result can be found in any text on Markov chains that deals with potential theory (see, for example, [8]).

Proposition 1. If $y_i = E_i \Gamma(f)$, where $\Gamma(f)$ is given by (2), then $y = (y_i,\, i \in S)$ is the minimal non-negative solution to the system of equations

$\sum_{j \in S} q_{ij} z_j + f_i = 0, \qquad i \in S,$  (3)

in the sense that $y$ satisfies these equations and, if $z = (z_i,\, i \in S)$ is any non-negative solution, then $y_i \leq z_i$ for all $i \in S$.
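On a finite state space, Proposition 1 reduces to a plain linear solve. The following minimal numerical sketch (not from the paper; the rates and the cost function are hypothetical) solves the system $\sum_j q_{ij} z_j + f_i = 0$ over the transient states of a small three-state birth-death chain:

```python
# Numerical illustration of Proposition 1 on a FINITE chain (a sketch):
# restrict the q-matrix to the non-absorbing states and solve Q z = -f.
# Rates and cost function below are hypothetical.
import numpy as np

# States {0, 1, 2}, with state 0 absorbing.  Birth-death rates:
lam1, mu1, mu2 = 1.0, 1.0, 2.0

# q-matrix restricted to the transient states {1, 2}
Q = np.array([[-(lam1 + mu1), lam1],
              [mu2,          -mu2]])
f = np.array([1.0, 1.0])      # cost 1 per unit time => expected absorption time

z = np.linalg.solve(Q, -f)    # solves sum_j q_ij z_j + f_i = 0
print(z)                      # z[0] = E_1(tau_0), z[1] = E_2(tau_0)
```

With these rates, first-step analysis gives $E_1 \tau_0 = 1.5$ and $E_2 \tau_0 = 2$, which the solve reproduces.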
Returning to the original problem formulated in the introduction, where we sought to evaluate expectations of the path integral $\Gamma_0(f) = \int_0^{\tau_0} f(X(t))\, dt$, with $\tau_0$ being the first hitting time of state 0, we may simply apply Proposition 1 to the chain modified so that 0 is an absorbing state (that is, $q_0 = 0$ and $q_{0j} = 0$ for $j \neq 0$), and set $f(0) = 0$; when $\tau_0 < \infty$, there is then no contribution to the path integral (2) for $t > \tau_0$. Of course, this is the context in which we would most likely apply the result: where the expected value of the path integral is evaluated up to absorption.

Corollary 1. If $e_i = E_i \Gamma_0(f)$, where $\Gamma_0(f)$ is given by (1), then $e = (e_i,\, i \geq 1)$ is the minimal non-negative solution to the system of equations

$\sum_{j \geq 1} q_{ij} z_j + f_i = 0, \qquad i \geq 1.$  (4)

Remarks. (i) If we set $f(x) = 1$, then $\Gamma_0(f)$ records the time of first leaving the set $E = \{1, 2, \dots\}$, that is, $\Gamma_0(f) = \min\{\tau_0, \tau_\infty\}$, where $\tau_\infty = \sup\{t > 0 : N(t) < \infty\}$. If $Q$ is assumed to be regular, so that $\tau_\infty$ is almost surely infinite, then Corollary 1

reduces to a well-known and widely used result on expected hitting times that can be found in any text on Markov chains (see, for example, [8]). Despite its obvious use in evaluating expected extinction times, it is apparently not widely known to biologists; in their paper "Four facts every conservation biologist should know about persistence", Mangel and Tier [6] implore their readers to use it: "Fact 2: There is a simple and direct method for the computation of persistence times that virtually all biologists can use."

(ii) If, more generally, $\tau_0$ is replaced by the hitting time of a set $A$, that is, $\tau_A = \inf\{t > 0 : X(t) \in A\}$, then the expected value of the resulting path integral $\Gamma_A(f)$ can be evaluated. If $e_i = E_i \Gamma_A(f)$ for $i \in A^c$, then $e = (e_i,\, i \in A^c)$ will be the minimal non-negative solution to $\sum_{j \in A^c} q_{ij} z_j + f_i = 0$, $i \in A^c$.

(iii) On dividing equation (4) by $f_i$ (assuming here that $f_j > 0$ for all $j$), we see that $E_i \Gamma_0(f)$ is the same as the expected hitting time of state 0, starting in state $i$, for the Markov chain with transition rates $Q^* = (q^*_{ij},\, i, j \in S)$ given by $q^*_{ij} = q_{ij}/f_i$, for $i \geq 1$, and $q^*_{0j} = q_{0j}$. This was observed for birth-death processes by McNeil [7]. Indeed, he observed that, conditional on $X(0) = j$, the distribution of $\Gamma_0(f)$ is the same as that of $\tau_0$ for the Markov chain with transition rates $Q^*$. A similar observation can be made in respect of equation (3): $E_i \Gamma(f)$ is the same as the expected time to explosion, starting in state $i$, for the Markov chain with transition rates $Q^*$ given by $q^*_{ij} = q_{ij}/f_i$, for $i \in S$.

3 Birth-death processes

In the case of birth-death processes, explicit formulae can be obtained. This was done for $E_i \Gamma_0(f)$ by Stefanov and Wang [10], assuming that there exists a positive number $x_0$ such that $f$ is non-decreasing for $x \geq x_0$, but, as we shall see, this condition can be relaxed. Let $q_{i,i+1} = \lambda_i$ for $i \geq 0$ and $q_{i,i-1} = \mu_i$ for $i \geq 1$, where $(\lambda_i,\, i \geq 0)$ and $(\mu_i,\, i \geq 1)$ are sets of strictly positive birth and death rates, with all other transition rates being 0.
Under these conditions, $S$ is irreducible. For simplicity, set $\mu_0 = 0$, so that $q_i = \lambda_i + \mu_i$, $i \in S$.

First consider the system of equations (3). For the birth-death process, they can be written $\lambda_i \Delta_i - \mu_i \Delta_{i-1} + f_i = 0$, $i \geq 1$, and $\lambda_0 \Delta_0 + f_0 = 0$, where $\Delta_i = z_{i+1} - z_i$, $i \geq 0$, and can be solved iteratively to obtain

$\Delta_i = -\frac{1}{\lambda_i \pi_i} \sum_{j=0}^{i} f_j \pi_j, \qquad i \geq 0,$  (5)

where the potential coefficients $(\pi_j,\, j \in S)$ are given by $\pi_0 = 1$ and

$\pi_i = \prod_{j=1}^{i} \frac{\lambda_{j-1}}{\mu_j}, \qquad i \geq 1.$

Summing (5) from $i = 0$ to $j - 1$ gives $z_j = z_0 - C_j(f)$, where

$C_j(f) = \sum_{i=0}^{j-1} \frac{1}{\lambda_i \pi_i} \sum_{k=0}^{i} f_k \pi_k.$

Notice that $C_j(f) \uparrow C(f)$ as $j \to \infty$, where

$C(f) = \sum_{i=0}^{\infty} \frac{1}{\lambda_i \pi_i} \sum_{k=0}^{i} f_k \pi_k.$

Thus, for a finite non-negative solution, we require $C(f) < \infty$ and, then, $z_0 \geq C(f)$. The minimal non-negative solution is obtained on setting $z_0 = C(f)$. We have proved the following result.

Proposition 2. For the birth-death process,

$E_j \Gamma(f) = \sum_{i=j}^{\infty} \frac{1}{\lambda_i \pi_i} \sum_{k=0}^{i} f_k \pi_k, \qquad j \in S.$

Remark. On setting $f(x) = 1$ we obtain the well-known result that the expected time to explosion starting in state $j$ is given by

$E_j \tau_\infty = \sum_{i=j}^{\infty} \frac{1}{\lambda_i \pi_i} \sum_{k=0}^{i} \pi_k, \qquad j \in S.$

Indeed, for the birth-death process, $\tau_\infty$ is almost surely infinite if and only if $E_j \tau_\infty = \infty$ for some (and then all) $j$; see [1].

Next let us consider equations (4). We seek the minimal non-negative solution $(z_i,\, i \geq 1)$ to the system

$\lambda_i z_{i+1} - (\lambda_i + \mu_i) z_i + \mu_i z_{i-1} + f_i = 0, \quad i \geq 2, \qquad \lambda_1 z_2 - (\lambda_1 + \mu_1) z_1 + f_1 = 0.$  (6)

If we set $z_0 = 0$ and $\Delta_i = z_{i+1} - z_i$, $i \geq 0$, these can be written $\lambda_i \Delta_i - \mu_i \Delta_{i-1} + f_i = 0$, $i \geq 1$, and can be solved iteratively to obtain

$\Delta_i = \frac{1}{\lambda_i \pi_i} \Big( \mu_1 z_1 - \sum_{j=1}^{i} f_j \pi_j \Big), \qquad i \geq 1,$  (7)

where the potential coefficients $(\pi_j,\, j \geq 1)$ are now given by $\pi_1 = 1$ and

$\pi_i = \prod_{k=2}^{i} \frac{\lambda_{k-1}}{\mu_k}, \qquad i \geq 2.$

Summing (7) from $i = 1$ to $j - 1$ gives, for $j \geq 2$,

$z_j = z_1 + \sum_{i=1}^{j-1} \frac{1}{\lambda_i \pi_i} \Big( \mu_1 z_1 - \sum_{k=1}^{i} f_k \pi_k \Big).$

Using the fact that $\lambda_i \pi_i = \mu_{i+1} \pi_{i+1}$, we arrive at

$z_j = \sum_{i=1}^{j} \frac{1}{\mu_i \pi_i} \big( \mu_1 z_1 - B_i(f) \big), \quad \text{where} \quad B_i(f) = \sum_{k=1}^{i-1} f_k \pi_k,$  (8)

interpreting an empty sum as being 0. This expression is valid also for $j = 1$. Now let

$A_j = \sum_{i=1}^{j} \frac{1}{\mu_i \pi_i}$

and observe that, as $j \to \infty$, $A_j \uparrow A$ and $B_j(f) \uparrow B(f)$, where

$A = \sum_{i=1}^{\infty} \frac{1}{\mu_i \pi_i} \qquad \text{and} \qquad B(f) = \sum_{k=1}^{\infty} f_k \pi_k.$

Both of these sums may converge or diverge. We shall consider the cases $A = \infty$ and $A < \infty$ separately.

The case $A = \infty$. Under this condition, the process is non-explosive and $\tau_0$ is almost surely finite (see [1]). Since $A = \infty$, the sequence $\{z_j\}$ will eventually become negative unless $B(f) \leq \mu_1 z_1$. So, if $B(f) = \infty$, the solution to (6) will be infinite. Otherwise, the minimal solution is obtained on setting $\mu_1 z_1 = B(f)$. Thus, we have proved the following result, which generalizes Proposition 1 of Stefanov and Wang [10].

Proposition 3. For the birth-death process with $A = \infty$,

$E_j \Gamma_0(f) = \sum_{i=1}^{j} \frac{1}{\mu_i \pi_i} \sum_{k=i}^{\infty} f_k \pi_k, \qquad j \geq 1,$

and this is finite if and only if $\sum_{k=1}^{\infty} f_k \pi_k < \infty$.

Remark. On setting $f(x) = 1$ we obtain the well-known result that, if the process hits 0 with probability 1, the expected first-passage time to 0 starting in state $j$ is given by

$E_j \tau_0 = \sum_{i=1}^{j} \frac{1}{\mu_i \pi_i} \sum_{k=i}^{\infty} \pi_k, \qquad j \geq 1,$

and this is finite if and only if $\sum_k \pi_k < \infty$.

The case $A < \infty$. In view of (8), we may write $z_j = \mu_1 z_1 A_j - C_j(f)$, where, now, $C_1(f) = 0$ and, for $j \geq 2$,

$C_j(f) = \sum_{i=2}^{j} \frac{1}{\mu_i \pi_i} \sum_{k=1}^{i-1} f_k \pi_k = \sum_{k=1}^{j-1} f_k \pi_k \, (A_j - A_k).$

As $j \to \infty$, $A_j \uparrow A < \infty$ and $C_j(f) \uparrow C(f)$, where

$C(f) = \sum_{i=2}^{\infty} \frac{1}{\mu_i \pi_i} \sum_{k=1}^{i-1} f_k \pi_k = \sum_{k=1}^{\infty} f_k \pi_k \, (A - A_k).$

If $C(f) = \infty$, then certainly the sequence $\{z_j\}$ will eventually become negative. Otherwise, we require $\mu_1 z_1 \geq M(f)$, where $M(f) = \sup_j C_j(f)/A_j$, in order to avoid this happening. The minimal solution is then obtained on setting $\mu_1 z_1 = M(f)$. Thus, we have proved the following result.

Proposition 4. For the birth-death process with $A < \infty$,

$E_j \Gamma_0(f) = \sum_{i=1}^{j} \frac{1}{\mu_i \pi_i} \Big( M(f) - \sum_{k=1}^{i-1} f_k \pi_k \Big), \qquad j \geq 1,$

and this is finite if and only if $C(f) < \infty$.

Remark. This result is not covered by Proposition 1 of Stefanov and Wang [10], for they have effectively assumed that $A = \infty$. Their method of proof relies on a state-space truncation argument, whereby the process is approximated by a finite-state birth-death process on $S_n = \{0, 1, \dots, n\}$ with a reflecting barrier at $n$. This latter stipulation means that the resulting process may not be the minimal process, though it will be when $C := \sum_{j=1}^{\infty} \frac{1}{\lambda_j \pi_j} \sum_{k=1}^{j} \pi_k = \infty$. However, the limit process will always have $A = \infty$; see [5].

Example 1. In order to illustrate the results of this section, we will consider the birth-death process on $S = \{0, 1, \dots\}$ with $\lambda_i = \lambda$ and $\mu_i = \mu$, both strictly positive. This is a simple model for a population that allows immigration (at rate $\lambda$) and removal (at rate $\mu$). It is also known as the M/M/1 queue, and is referred to as such in both [4] and [10].

Let us first apply Proposition 2. It is easy to see that $\pi_i = \rho^i$, $i \geq 0$, where $\rho = \lambda/\mu$, and so

$E_j \Gamma(f) = \frac{1}{\lambda} \sum_{i=j}^{\infty} \sum_{k=0}^{i} f_k r^{i-k}, \qquad j \geq 0,$

where $r = 1/\rho$. To illustrate this further, take $f_k = \alpha^k$, where $\alpha > 0$. If $\alpha = r$, then the expectation is finite if and only if $r < 1$ (that is, $\lambda > \mu$), in which case

$E_j \Gamma(f) = \frac{(1 + j(1-r))\, r^j}{\lambda (1-r)^2}, \qquad j \geq 0.$

If $\alpha \neq r$, then the expectation is finite if and only if $\max\{\alpha, r\} < 1$, which requires $\lambda > \mu$, in which case

$E_j \Gamma(f) = \frac{(1-\alpha)\, r^{j+1} - (1-r)\, \alpha^{j+1}}{\lambda (1-r)(1-\alpha)(r-\alpha)}, \qquad j \geq 0.$

Next, we shall evaluate $E_j \Gamma_0(f)$ using Propositions 3 and 4. First observe that

$A_j = \frac{1 - r^j}{\mu (1-r)},$

and so $A < \infty$ if and only if $r < 1$ (that is, $\lambda > \mu$), in which case $A = 1/(\mu(1-r))$.

Consider first the case $\lambda < \mu$. We now have $\pi_i = \rho^{i-1}$, $i \geq 1$. Using Proposition 3, we find that

$E_j \Gamma_0(f) = \frac{1}{\mu} \sum_{i=0}^{j-1} \sum_{k=i}^{\infty} f_{k+1}\, \rho^{k-i}, \qquad j \geq 1.$
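Before specialising $f$, the double sum above is easy to check numerically. The following minimal sketch (not from the paper; parameter values are hypothetical) takes $f \equiv 1$ and $\lambda < \mu$, in which case Proposition 3 should return the familiar mean busy period of the M/M/1 queue, $1/(\mu - \lambda)$:

```python
# Sanity check (a sketch) of Proposition 3 with f = 1 for the M/M/1-type chain:
# lambda_i = lam, mu_i = mu with lam < mu, and pi_i = rho**(i-1).  Starting in
# state 1, the expected hitting time of 0 should be 1/(mu - lam).
lam, mu = 1.0, 3.0
rho = lam / mu
j = 1

# Double sum of Proposition 3, with the inner (geometric) tail truncated:
# sum_{i=1}^{j} (1/(mu_i pi_i)) * sum_{k >= i} pi_k
approx = sum((1.0 / (mu * rho ** (i - 1))) *
             sum(rho ** (k - 1) for k in range(i, 400))
             for i in range(1, j + 1))
print(approx, 1.0 / (mu - lam))
```

With $\lambda = 1$ and $\mu = 3$, both quantities equal $0.5$ up to the (geometrically small) truncation error.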

To illustrate this further, take $f_k = \alpha^k$, where $\alpha > 0$. The expectation is finite if and only if $\alpha < r$, in which case

$E_j \Gamma_0(f) = \frac{r \alpha (1 - \alpha^j)}{\mu (r-\alpha)(1-\alpha)}, \qquad j \geq 1.$

Notice that $\{f_j\}$ is a decreasing sequence when $\alpha < 1$, and so Proposition 1 of Stefanov and Wang [10] cannot be used to establish this formula.

Finally, we shall consider the case $\lambda > \mu$, that is, $r < 1$. A simple calculation gives

$C_j(f) = \frac{r}{\mu(1-r)} \sum_{k=1}^{j-1} (1 - r^{j-k}) f_k \qquad \text{and} \qquad C(f) = \frac{r}{\mu(1-r)} \sum_{k=1}^{\infty} f_k,$

and so

$\frac{C_j(f)}{A_j} = \frac{r}{1-r^j} \sum_{k=1}^{j-1} (1 - r^{j-k}) f_k.$

The expectation in Proposition 4 will therefore be finite if and only if $\sum_k f_k < \infty$, in which case

$E_j \Gamma_0(f) = \frac{1-r^j}{\mu(1-r)} \Big( M - \frac{r}{1-r^j} \sum_{k=1}^{j-1} (1 - r^{j-k}) f_k \Big), \qquad j \geq 1,$

where

$M = \sup_j \frac{r}{1-r^j} \sum_{k=1}^{j-1} (1 - r^{j-k}) f_k.$

As before, take $f_k = \alpha^k$, but assume that $\alpha < 1$, so as to ensure that the expectation is finite. We find that

$\frac{C_j(f)}{A_j} = \begin{cases} \dfrac{r^2 \big( 1 - j r^{j-1} + (j-1) r^j \big)}{(1-r)(1-r^j)}, & \text{if } \alpha = r, \\[1ex] \dfrac{r\alpha}{1-r^j} \Big( \dfrac{1-\alpha^{j-1}}{1-\alpha} - \dfrac{r\,(r^{j-1}-\alpha^{j-1})}{r-\alpha} \Big), & \text{if } \alpha \neq r. \end{cases}$

In both cases this defines an increasing sequence with limit $r\alpha/(1-\alpha)$. Therefore, if $\alpha = r$,

$E_j \Gamma_0(f) = \frac{j\, r^{j+1}}{\mu(1-r)}, \qquad j \geq 1,$

while if $\alpha \neq r$, then

$E_j \Gamma_0(f) = \frac{r \alpha (r^j - \alpha^j)}{\mu (1-\alpha)(r-\alpha)}, \qquad j \geq 1.$

Example 2. Next we shall consider an example of a birth-death process that exhibits explosive behaviour. We begin by examining the pure birth process. This has transition rates $q_{i,i+1} = q_i = \lambda_i$, $i \geq 0$, where the birth rates $(\lambda_i,\, i \geq 0)$ are all

strictly positive. Equations (3) are $\lambda_i (z_{i+1} - z_i) + f_i = 0$, $i \geq 0$. On summing from $i = 0$ to $j - 1$ we get $z_j = z_0 - C_j(f)$, where $C_j(f) = \sum_{i=0}^{j-1} f_i/\lambda_i$. Since $C_j(f) \uparrow C(f) := \sum_{i=0}^{\infty} f_i/\lambda_i$, a finite non-negative solution is obtained whenever $C(f) < \infty$ and $z_0 \geq C(f)$, in which case setting $z_0 = C(f)$ gives the minimal solution. We deduce that $E_i \Gamma(f) = \sum_{j=i}^{\infty} f_j/\lambda_j$. Setting $f_i = 1$ shows that $E_i \tau_\infty = \sum_{j=i}^{\infty} 1/\lambda_j$. So, the expected time to explosion is infinite if and only if $C := \sum_{j=0}^{\infty} 1/\lambda_j = \infty$. Indeed, the process is regular, that is, $\tau_\infty$ is almost surely infinite for all starting states, if and only if this latter condition holds (see [1]).

Explosive behaviour is easily exhibited. For example, suppose that $X(t)$ describes the number in a population of individuals, pairs of whom produce offspring at per-interaction rate $\gamma > 0$. Then, $\gamma \binom{i}{2}$ will be the birth rate when the population size is $i$. Since at least two individuals are needed to get things going, it is convenient to relabel the state space so that $X(t) = i$ indicates that $i+2$ individuals are present at time $t$. So, we have a pure birth process of the kind described, with $\lambda_i = \lambda (i+1)(i+2)$, where $\lambda = \gamma/2$. The process is explosive because

$C = \sum_{i=0}^{\infty} \frac{1}{\lambda (i+1)(i+2)} = \frac{1}{\lambda} < \infty,$

and the path integral $\Gamma(f)$ has expectation

$E_i \Gamma(f) = \frac{1}{\lambda} \sum_{j=i}^{\infty} \frac{f_j}{(j+1)(j+2)}.$

For example, if $f_i = \alpha^i$, where $0 < \alpha < 1$, then

$E_0 \Gamma(f) = \frac{\alpha + (1-\alpha) \log(1-\alpha)}{\lambda \alpha^2},$

with the limiting value $1/\lambda$ as $\alpha \uparrow 1$.

The introduction of a linear death component presents no particular difficulties. Let $\lambda_i = \lambda i (i+1)$ and define $\mu_i$ by $\mu_i = \mu i$, where $\mu > 0$. In this way, $E = \{1, 2, \dots\}$ is an irreducible class and 0 is the sole absorbing state. Let $\rho = \lambda/\mu$. Then, in the notation established in connection with Propositions 3 and 4, we have $\pi_1 = 1$ and, for $j \geq 2$,

$\pi_j = \prod_{k=2}^{j} \frac{\lambda_{k-1}}{\mu_k} = \rho^{j-1} \prod_{k=2}^{j} \frac{k(k-1)}{k} = \rho^{j-1} (j-1)!.$

This formula is valid also for $j = 1$. The $j$-th term of $A$ is

$a_j := \frac{1}{\mu_j \pi_j} = \frac{1}{\mu \rho^{j-1} j!},$

and so $A < \infty$ because $a_{j+1}/a_j = 1/(\rho(j+1)) \to 0$. Thus, exit from $E$ occurs with probability less than 1 no matter what the value of $\rho$. Also, since $\lambda_i/\mu_i = \rho(i+1)$, there is very strong positive drift.
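As an aside, the pure birth calculations above are easy to check numerically. The following minimal sketch (not from the paper; the values of $\lambda$ and $\alpha$ are hypothetical) confirms both the telescoping identity $E_0 \tau_\infty = 1/\lambda$ and the closed form for $E_0 \Gamma(f)$ with $f_i = \alpha^i$:

```python
# Numerical check (a sketch) for the explosive pure birth process with
# lambda_i = lam*(i+1)*(i+2):
#   E_0(tau_infty) = sum_i 1/(lam*(i+1)*(i+2)) = 1/lam   (telescoping sum)
#   E_0 Gamma(f)   = (1/lam) * sum_j alpha^j/((j+1)(j+2))
#                  = (alpha + (1-alpha)*log(1-alpha)) / (lam*alpha**2)
import math

lam, alpha = 1.0, 0.5
tau = sum(1.0 / (lam * (i + 1) * (i + 2)) for i in range(10**6))  # ~ 1/lam
series = sum(alpha ** j / ((j + 1) * (j + 2)) for j in range(200)) / lam
closed = (alpha + (1 - alpha) * math.log(1 - alpha)) / (lam * alpha ** 2)
print(tau, series, closed)
```

The truncated series agrees with the closed form to machine precision, and the explosion-time sum converges to $1/\lambda$.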
So, we might expect the process to be explosive. Indeed, this can be established. The $j$-th term of $C$ is

$c_j := \frac{1}{\lambda_j \pi_j} \sum_{k=1}^{j} \pi_k = \frac{1}{\lambda \rho^{j-1} (j+1)!} \sum_{k=1}^{j} \rho^{k-1} (k-1)! = \frac{d_j}{\lambda j (j+1)},$

where

$d_j = \frac{1}{\rho^{j-1} (j-1)!} \sum_{k=1}^{j} \rho^{k-1} (k-1)! = 1 + \sum_{k=1}^{j-1} \frac{(k-1)!}{(j-1)!}\, \rho^{-(j-k)}.$

Since, for each $k$, the summand converges to 0 monotonically as $j \to \infty$, monotone convergence implies that $d_j \to 1$. Hence, because $\sum_{j=1}^{\infty} \frac{1}{j(j+1)} = 1$, we may deduce that $C < \infty$.

To illustrate the evaluation of the path integral, suppose that the states in $E$ have Poisson weights: $f_k = e^{-\alpha} \alpha^k / k!$, $k \geq 1$, where $\alpha > 0$. Then $f_k \pi_k = e^{-\alpha} (\alpha\rho)^k / (\rho k)$, and

$A_j = \frac{\rho}{\mu} \sum_{i=1}^{j} \frac{\rho^{-i}}{i!}, \qquad A = \frac{\rho}{\mu} \big( e^{1/\rho} - 1 \big), \qquad C_j(f) = \frac{e^{-\alpha}}{\mu} \sum_{i=2}^{j} \frac{\rho^{-i}}{i!} \sum_{k=1}^{i-1} \frac{(\alpha\rho)^k}{k},$

the sums simplifying differently according as $\alpha\rho \neq 1$ or $\alpha\rho = 1$. It is easy to show that $C_j(f)/A_j \uparrow M$, where $M = C(f)/A$, and that $M < \infty$ for every $\alpha > 0$. Using Proposition 4, we get

$E_j \Gamma_0(f) = A_j M - C_j(f), \qquad j \geq 1.$

4 The birth, death and catastrophe process

We shall briefly examine a model [9] that is equivalent to one described by Brockwell [2], which has been used to describe the behaviour of populations that are subject to catastrophic mortality or emigration events. In addition to simple birth and death transitions, it incorporates jumps down of arbitrary size (catastrophes). The process has state space $S = \{0, 1, \dots\}$ and q-matrix given by

$q_{ij} = \begin{cases} i\rho \sum_{k \geq i} d_k, & j = 0,\ i \geq 1, \\ i\rho\, d_{i-j}, & j = 1, 2, \dots, i-1,\ i \geq 2, \\ -i\rho, & j = i,\ i \geq 0, \\ i\rho a, & j = i+1,\ i \geq 0, \\ 0, & \text{otherwise,} \end{cases}$

where $\rho > 0$ is the rate per capita at which the population size changes, $a > 0$ is the probability of a birth, and $d_i$, the probability of a catastrophe of size $i$, is positive for at least one $i$ ($a + \sum_i d_i = 1$). Notice that when $d_i = 0$ for all $i \geq 2$, we recover the simple linear birth-death process, with $\lambda_i = i\rho a$ and $\mu_i = i\rho d_1$. Clearly $E = \{1, 2, \dots\}$ is an irreducible class and 0 is an absorbing state that is accessible from $E$. It is easily shown that $Q$ is regular. Let $d$ be the probability generating function given by

$d(s) = a + \sum_{i=1}^{\infty} d_i s^{i+1}, \qquad |s| \leq 1,$

and assume that $d'(1) < \infty$. Brockwell showed that the process is absorbed with probability 1 if and only if the drift $D$, given by $D = 1 - d'(1)$, is less than or equal to 0. We shall evaluate $E_i \Gamma_0(f)$ for the path integral (1), where $\tau_0$ is the time to absorption. Considering equations (4), we seek the minimal non-negative solution to the system

$i\rho a z_{i+1} - i\rho z_i + i\rho \sum_{j=1}^{i-1} d_{i-j} z_j + f_i = 0, \qquad i \geq 1.$

This can be written

$\rho a z_{i+1} - \rho z_i + \rho \sum_{j=1}^{i-1} d_{i-j} z_j + g_i = 0, \qquad i \geq 1,$  (9)

where $g_i = f_i/i$. On multiplying by $s^i$ and then summing over $i$, we find that the generating function $H(s) = \sum_{i \geq 1} z_i s^i$ of any solution $(z_i,\, i \geq 1)$ to (9) must satisfy

$\frac{b(s)}{s}\, H(s) - a\rho z_1 + g(s) = 0,$

where $b(s) = \rho (d(s) - s)$ and $g(s) = \sum_{i \geq 1} g_i s^i$. It is easy to see that $b$ is convex on $(0,1)$, with $b(0) = a\rho$ and $b(1) = 0$. Furthermore, as Brockwell showed, $\sigma$, the smallest zero of $b$ in $(0, 1]$, satisfies $\sigma = 1$ if $D \geq 0$ and $\sigma < 1$ if $D < 0$. We know, from Lemma V.2 of Harris [3], that $1/b(s)$ has a power series expansion with positive coefficients in a neighbourhood of 0. So, let us write $1/b(s) = \sum_{j=0}^{\infty} e_j s^j$, $s < \sigma$, where $e_j > 0$, noting that $a\rho = b(0) = 1/e_0$. Letting $\kappa = a\rho z_1$, we obtain

$H(s) = z_1 s + \sum_{i=2}^{\infty} \Big( \kappa e_{i-1} - \sum_{j=1}^{i-1} g_j e_{i-1-j} \Big) s^i,$

and hence, for $i \geq 2$,

$z_i = \kappa e_{i-1} - \sum_{j=1}^{i-1} g_j e_{i-1-j} = \kappa e_{i-1} - \sum_{j=0}^{i-2} g_{i-j-1} e_j.$

Now, $\{e_i\}$ is an increasing sequence. This is because $e_0 = 1/(a\rho)$ and, since $z_i = \kappa e_{i-1}$ when $g \equiv 0$, we have from (9) that

$\rho a e_i - \rho e_{i-1} + \rho \sum_{j=1}^{i-1} d_{i-j} e_{j-1} = 0, \qquad i \geq 1,$

and hence that

$a (e_i - e_{i-1}) = \sum_{j=1}^{i-1} d_{i-j} (e_{i-1} - e_{j-1}) + e_{i-1} \sum_{j=i}^{\infty} d_j, \qquad i \geq 1.$

Therefore, $\{z_i\}$ is the difference of two non-negative increasing sequences, and so, in order to ensure that $\{z_i\}$ itself is non-negative, we require $\kappa \geq \sup_i h_i$, where

$h_i = \frac{1}{e_{i-1}} \sum_{j=1}^{i-1} g_j e_{i-1-j}.$

The minimal solution is obtained with equality here. Since $\{e_i\}$ is increasing, we have that $0 < e_{i-j}/e_i \leq 1$ for all $i \geq j \geq 0$. Moreover, because $\sigma$ is the radius of convergence of $\sum_{j=0}^{\infty} e_j s^j$, we have $\sigma = \lim_i e_{i-1}/e_i$, whenever this limit exists, implying that $e_{i-j}/e_i \to \sigma^j$ for each $j$. Hence, formally, $h_i \to g(\sigma)$. If this can be justified then, provided $g(\sigma) < \infty$, we may set $\kappa = g(\sigma)$ to obtain the minimal non-negative solution to (9). By Fatou's lemma, we always have

$\liminf_i h_i = \liminf_i \frac{1}{e_{i-1}} \sum_{j=1}^{i-1} g_j e_{i-1-j} \geq \sum_{j=1}^{\infty} g_j \sigma^j = g(\sigma),$

but, in order that $\lim_i h_i$ exists and equals $g(\sigma)$, further mild conditions must be imposed. For example, if $\sum_j g_j b_j < \infty$ and $e_{i-j}/e_i \leq b_j$ for all $i \geq j$, or if $\{e_{i-j}/e_i\}$ is monotonic in $i$, then dominated and monotone convergence, respectively, guarantee that $h_i \to g(\sigma)$. If, for example, $e_{2i} = 9^i$ ($= 3^{2i}$) and $e_{2i+1} = 2 \cdot 9^i$ ($= 2 \cdot 3^{2i}$), for $i = 0, 1, \dots$, then it can be shown that $\lim_i h_i$ does not exist.

Finally, if $g(\sigma) < \infty$ and $h_i \to g(\sigma)$, we get

$E_i \Gamma_0(f) = g(\sigma)\, e_{i-1} - \sum_{j=0}^{i-2} g_{i-j-1} e_j, \qquad i \geq 1,$  (10)

or, if preferred,

$\sum_{i=1}^{\infty} E_i \Gamma_0(f)\, s^i = \frac{s \big( g(\sigma) - g(s) \big)}{b(s)}, \qquad s < \sigma.$

To illustrate this, take $f_i = \alpha^i$, where $\alpha > 0$, so that $g(s) = -\log(1 - \alpha s)$, $s < 1/\alpha$, and hence $g(\sigma) < \infty$ provided $\alpha < 1/\sigma$. For example, if $\alpha = 1$, so that $f_i = 1$ for all $i$, then $g(\sigma) < \infty$ if $D < 0$ and $g(\sigma) = g(1) = \infty$ if $D \geq 0$. We may therefore deduce that, when $D < 0$, the expected time to extinction is given by

$E_i \tau_0 = -\log(1-\sigma)\, e_{i-1} - \sum_{j=0}^{i-2} \frac{e_j}{i-j-1}, \qquad i \geq 1,$

or, if preferred,

$\sum_{i=1}^{\infty} E_i \tau_0\, s^i = \frac{s}{b(s)} \log \frac{1-s}{1-\sigma}, \qquad s < \sigma.$

This is equation 3. of [2].

[Figure 1: The effect of varying $m$, the mean catastrophe size, on $E_i \Gamma_0(f)$, with Poisson weights $f_k = e^{-\alpha} \alpha^k / k!$ ($a = 0.55$ and $\alpha = 20$). The figure plots the expected reward $E_i \Gamma_0(f)$ against the initial population size $i$, with one curve for each of several values of $m$, including $m = 4$ and $m = 6$.]

Explicit results can be obtained in the case where the catastrophe size follows a geometric law. Suppose that $d_i = b(1-q) q^{i-1}$, $i \geq 1$, where $b > 0$ satisfies $a + b = 1$, and $0 \leq q < 1$. The simple linear birth-death process is recovered on setting $q = 0$. It is easy to see that $D = a - b/(1-q)$, and so $D < 0$ or $D \geq 0$ according as $c > 1$ or $c \leq 1$, where $c = q + b/a$. We also have

$b(s) = \rho\, \frac{(1-a+aq)\, s^2 - (1+aq)\, s + a}{1-qs} = \frac{a\rho\, (1-s)(1-cs)}{1-qs},$

and hence, if $D < 0$, then $\sigma = 1/c < 1$. The coefficients of the power series $1/b(s) = \sum_{j=0}^{\infty} e_j s^j$ are easily evaluated using partial fractions. If $D = 0$, then

$e_j = \frac{1 + (1-q) j}{a\rho}, \qquad j \geq 0.$

Otherwise, if $D > 0$ or $D < 0$, then

$e_j = \frac{(c-q)\, c^j - (1-q)}{a\rho\, (c-1)}, \qquad j \geq 0.$

Note that, in all cases, $e_{j-1}/e_j \to \sigma$ and hence $e_{j-k}/e_j \to \sigma^k$. Thus, $E_i \Gamma_0(f)$ may be evaluated explicitly by substituting these expressions into (10), remembering that the expectation will be finite whenever $g(\sigma) < \infty$. For example, when $f_i = \alpha^i$, where $\alpha > 0$, the expectation is finite if $\alpha < \max\{1, q + b/a\}$. In contrast, $g(\sigma)$ is always finite in the case of Poisson weights: $f_k = e^{-\alpha} \alpha^k / k!$, where $\alpha > 0$.
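The partial-fraction formula for the coefficients $e_j$ can be cross-checked against a direct power-series expansion of $1/b(s)$. A minimal numerical sketch (not from the paper; the values of $a$, $q$ and $\rho$ are hypothetical), using $1/((1-s)(1-cs)) = \sum_j (\sum_{k=0}^{j} c^k) s^j$:

```python
# Cross-check (a sketch) of the geometric-catastrophe coefficients: expand
# 1/b(s) = (1 - q*s) / (a*rho*(1-s)*(1-c*s)) as a power series and compare
# with the partial-fraction formula (case D != 0, i.e. c != 1):
#   e_j = ((c - q)*c**j - (1 - q)) / (a*rho*(c - 1))
a, q, rho = 0.55, 0.4, 1.0
b_ = 1 - a                      # probability of a catastrophe event
c = q + b_ / a                  # here c > 1, so D < 0 and sigma = 1/c < 1

n = 20
S = [sum(c ** k for k in range(j + 1)) for j in range(n)]   # partial sums of c^k
series = [(S[j] - (q * S[j - 1] if j > 0 else 0.0)) / (rho * a) for j in range(n)]
formula = [((c - q) * c ** j - (1 - q)) / (a * rho * (c - 1)) for j in range(n)]
print(series[:4], formula[:4])
```

The two computations agree term by term; in particular $e_0 = 1/(a\rho)$, as required by $b(0) = a\rho$.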

For this latter case, Figure 1 illustrates the effect of varying the mean catastrophe size on the expected value of the path integral.

Acknowledgements. I would like to thank the referees and the Editor for very helpful suggestions. I would also like to thank Valery Stefanov for useful conversations on this work. The support of the Australian Research Council Grant No. A is gratefully acknowledged.

References

[1] W.J. Anderson, Continuous-Time Markov Chains: An Applications-Oriented Approach, Springer-Verlag, New York, 1991.

[2] P.J. Brockwell, The extinction time of a birth, death and catastrophe process and of a related diffusion model, Adv. Appl. Probab.

[3] T.E. Harris, The Theory of Branching Processes, Springer-Verlag, Berlin, 1963.

[4] C.M. Hernández-Suárez and C. Castillo-Chavez, A basic result on the integral for birth-death Markov processes, Math. Biosci.

[5] W. Ledermann and G.E.H. Reuter, Spectral theory for the differential equations of simple birth and death processes, Phil. Trans. R. Soc. London.

[6] M. Mangel and C. Tier, Four facts every conservation biologist should know about persistence, Ecology.

[7] D.R. McNeil, Integral functionals of birth and death processes and related limiting distributions, Ann. Math. Statist.

[8] J.R. Norris, Markov Chains, Cambridge University Press, Cambridge, 1997.

[9] P.K. Pollett, Quasi-stationarity in populations that are subject to large-scale mortality or emigration, Environment International.

[10] V.T. Stefanov and S. Wang, A note on integrals for birth-death processes, Math. Biosci.


More information

IEOR 6711, HMWK 5, Professor Sigman

IEOR 6711, HMWK 5, Professor Sigman IEOR 6711, HMWK 5, Professor Sigman 1. Semi-Markov processes: Consider an irreducible positive recurrent discrete-time Markov chain {X n } with transition matrix P (P i,j ), i, j S, and finite state space.

More information

P i [B k ] = lim. n=1 p(n) ii <. n=1. V i :=

P i [B k ] = lim. n=1 p(n) ii <. n=1. V i := 2.7. Recurrence and transience Consider a Markov chain {X n : n N 0 } on state space E with transition matrix P. Definition 2.7.1. A state i E is called recurrent if P i [X n = i for infinitely many n]

More information

Solutions to Homework Discrete Stochastic Processes MIT, Spring 2011

Solutions to Homework Discrete Stochastic Processes MIT, Spring 2011 Exercise 6.5: Solutions to Homework 0 6.262 Discrete Stochastic Processes MIT, Spring 20 Consider the Markov process illustrated below. The transitions are labelled by the rate q ij at which those transitions

More information

RECENT RESULTS FOR SUPERCRITICAL CONTROLLED BRANCHING PROCESSES WITH CONTROL RANDOM FUNCTIONS

RECENT RESULTS FOR SUPERCRITICAL CONTROLLED BRANCHING PROCESSES WITH CONTROL RANDOM FUNCTIONS Pliska Stud. Math. Bulgar. 16 (2004), 43-54 STUDIA MATHEMATICA BULGARICA RECENT RESULTS FOR SUPERCRITICAL CONTROLLED BRANCHING PROCESSES WITH CONTROL RANDOM FUNCTIONS Miguel González, Manuel Molina, Inés

More information

Lecture 20 : Markov Chains

Lecture 20 : Markov Chains CSCI 3560 Probability and Computing Instructor: Bogdan Chlebus Lecture 0 : Markov Chains We consider stochastic processes. A process represents a system that evolves through incremental changes called

More information

Irreducibility. Irreducible. every state can be reached from every other state For any i,j, exist an m 0, such that. Absorbing state: p jj =1

Irreducibility. Irreducible. every state can be reached from every other state For any i,j, exist an m 0, such that. Absorbing state: p jj =1 Irreducibility Irreducible every state can be reached from every other state For any i,j, exist an m 0, such that i,j are communicate, if the above condition is valid Irreducible: all states are communicate

More information

The SIS and SIR stochastic epidemic models revisited

The SIS and SIR stochastic epidemic models revisited The SIS and SIR stochastic epidemic models revisited Jesús Artalejo Faculty of Mathematics, University Complutense of Madrid Madrid, Spain jesus_artalejomat.ucm.es BCAM Talk, June 16, 2011 Talk Schedule

More information

Question Points Score Total: 70

Question Points Score Total: 70 The University of British Columbia Final Examination - April 204 Mathematics 303 Dr. D. Brydges Time: 2.5 hours Last Name First Signature Student Number Special Instructions: Closed book exam, no calculators.

More information

Censoring Technique in Studying Block-Structured Markov Chains

Censoring Technique in Studying Block-Structured Markov Chains Censoring Technique in Studying Block-Structured Markov Chains Yiqiang Q. Zhao 1 Abstract: Markov chains with block-structured transition matrices find many applications in various areas. Such Markov chains

More information

IEOR 6711: Stochastic Models I Professor Whitt, Thursday, November 29, Weirdness in CTMC s

IEOR 6711: Stochastic Models I Professor Whitt, Thursday, November 29, Weirdness in CTMC s IEOR 6711: Stochastic Models I Professor Whitt, Thursday, November 29, 2012 Weirdness in CTMC s Where s your will to be weird? Jim Morrison, The Doors We are all a little weird. And life is a little weird.

More information

STOCHASTIC PROCESSES Basic notions

STOCHASTIC PROCESSES Basic notions J. Virtamo 38.3143 Queueing Theory / Stochastic processes 1 STOCHASTIC PROCESSES Basic notions Often the systems we consider evolve in time and we are interested in their dynamic behaviour, usually involving

More information

Stochastic process. X, a series of random variables indexed by t

Stochastic process. X, a series of random variables indexed by t Stochastic process X, a series of random variables indexed by t X={X(t), t 0} is a continuous time stochastic process X={X(t), t=0,1, } is a discrete time stochastic process X(t) is the state at time t,

More information

Multi Stage Queuing Model in Level Dependent Quasi Birth Death Process

Multi Stage Queuing Model in Level Dependent Quasi Birth Death Process International Journal of Statistics and Systems ISSN 973-2675 Volume 12, Number 2 (217, pp. 293-31 Research India Publications http://www.ripublication.com Multi Stage Queuing Model in Level Dependent

More information

From Calculus II: An infinite series is an expression of the form

From Calculus II: An infinite series is an expression of the form MATH 3333 INTERMEDIATE ANALYSIS BLECHER NOTES 75 8. Infinite series of numbers From Calculus II: An infinite series is an expression of the form = a m + a m+ + a m+2 + ( ) Let us call this expression (*).

More information

Intertwining of Markov processes

Intertwining of Markov processes January 4, 2011 Outline Outline First passage times of birth and death processes Outline First passage times of birth and death processes The contact process on the hierarchical group 1 0.5 0-0.5-1 0 0.2

More information

2905 Queueing Theory and Simulation PART III: HIGHER DIMENSIONAL AND NON-MARKOVIAN QUEUES

2905 Queueing Theory and Simulation PART III: HIGHER DIMENSIONAL AND NON-MARKOVIAN QUEUES 295 Queueing Theory and Simulation PART III: HIGHER DIMENSIONAL AND NON-MARKOVIAN QUEUES 16 Queueing Systems with Two Types of Customers In this section, we discuss queueing systems with two types of customers.

More information

HEAVY-TRAFFIC EXTREME-VALUE LIMITS FOR QUEUES

HEAVY-TRAFFIC EXTREME-VALUE LIMITS FOR QUEUES HEAVY-TRAFFIC EXTREME-VALUE LIMITS FOR QUEUES by Peter W. Glynn Department of Operations Research Stanford University Stanford, CA 94305-4022 and Ward Whitt AT&T Bell Laboratories Murray Hill, NJ 07974-0636

More information

The Multi-Agent Rendezvous Problem - The Asynchronous Case

The Multi-Agent Rendezvous Problem - The Asynchronous Case 43rd IEEE Conference on Decision and Control December 14-17, 2004 Atlantis, Paradise Island, Bahamas WeB03.3 The Multi-Agent Rendezvous Problem - The Asynchronous Case J. Lin and A.S. Morse Yale University

More information

1 + lim. n n+1. f(x) = x + 1, x 1. and we check that f is increasing, instead. Using the quotient rule, we easily find that. 1 (x + 1) 1 x (x + 1) 2 =

1 + lim. n n+1. f(x) = x + 1, x 1. and we check that f is increasing, instead. Using the quotient rule, we easily find that. 1 (x + 1) 1 x (x + 1) 2 = Chapter 5 Sequences and series 5. Sequences Definition 5. (Sequence). A sequence is a function which is defined on the set N of natural numbers. Since such a function is uniquely determined by its values

More information

The small ball property in Banach spaces (quantitative results)

The small ball property in Banach spaces (quantitative results) The small ball property in Banach spaces (quantitative results) Ehrhard Behrends Abstract A metric space (M, d) is said to have the small ball property (sbp) if for every ε 0 > 0 there exists a sequence

More information

Lecture 12. F o s, (1.1) F t := s>t

Lecture 12. F o s, (1.1) F t := s>t Lecture 12 1 Brownian motion: the Markov property Let C := C(0, ), R) be the space of continuous functions mapping from 0, ) to R, in which a Brownian motion (B t ) t 0 almost surely takes its value. Let

More information

AARMS Homework Exercises

AARMS Homework Exercises 1 For the gamma distribution, AARMS Homework Exercises (a) Show that the mgf is M(t) = (1 βt) α for t < 1/β (b) Use the mgf to find the mean and variance of the gamma distribution 2 A well-known inequality

More information

Part I Stochastic variables and Markov chains

Part I Stochastic variables and Markov chains Part I Stochastic variables and Markov chains Random variables describe the behaviour of a phenomenon independent of any specific sample space Distribution function (cdf, cumulative distribution function)

More information

Stochastic modelling of epidemic spread

Stochastic modelling of epidemic spread Stochastic modelling of epidemic spread Julien Arino Centre for Research on Inner City Health St Michael s Hospital Toronto On leave from Department of Mathematics University of Manitoba Julien Arino@umanitoba.ca

More information

THE CLOSED-POINT ZARISKI TOPOLOGY FOR IRREDUCIBLE REPRESENTATIONS. K. R. Goodearl and E. S. Letzter

THE CLOSED-POINT ZARISKI TOPOLOGY FOR IRREDUCIBLE REPRESENTATIONS. K. R. Goodearl and E. S. Letzter THE CLOSED-POINT ZARISKI TOPOLOGY FOR IRREDUCIBLE REPRESENTATIONS K. R. Goodearl and E. S. Letzter Abstract. In previous work, the second author introduced a topology, for spaces of irreducible representations,

More information

Markov Chains CK eqns Classes Hitting times Rec./trans. Strong Markov Stat. distr. Reversibility * Markov Chains

Markov Chains CK eqns Classes Hitting times Rec./trans. Strong Markov Stat. distr. Reversibility * Markov Chains Markov Chains A random process X is a family {X t : t T } of random variables indexed by some set T. When T = {0, 1, 2,... } one speaks about a discrete-time process, for T = R or T = [0, ) one has a continuous-time

More information

Continuous time Markov chains

Continuous time Markov chains Chapter 2 Continuous time Markov chains As before we assume that we have a finite or countable statespace I, but now the Markov chains X {X(t) : t } have a continuous time parameter t [, ). In some cases,

More information

A probabilistic proof of Perron s theorem arxiv: v1 [math.pr] 16 Jan 2018

A probabilistic proof of Perron s theorem arxiv: v1 [math.pr] 16 Jan 2018 A probabilistic proof of Perron s theorem arxiv:80.05252v [math.pr] 6 Jan 208 Raphaël Cerf DMA, École Normale Supérieure January 7, 208 Abstract Joseba Dalmau CMAP, Ecole Polytechnique We present an alternative

More information

PCMI LECTURE NOTES ON PROPERTY (T ), EXPANDER GRAPHS AND APPROXIMATE GROUPS (PRELIMINARY VERSION)

PCMI LECTURE NOTES ON PROPERTY (T ), EXPANDER GRAPHS AND APPROXIMATE GROUPS (PRELIMINARY VERSION) PCMI LECTURE NOTES ON PROPERTY (T ), EXPANDER GRAPHS AND APPROXIMATE GROUPS (PRELIMINARY VERSION) EMMANUEL BREUILLARD 1. Lecture 1, Spectral gaps for infinite groups and non-amenability The final aim of

More information

Non-Essential Uses of Probability in Analysis Part IV Efficient Markovian Couplings. Krzysztof Burdzy University of Washington

Non-Essential Uses of Probability in Analysis Part IV Efficient Markovian Couplings. Krzysztof Burdzy University of Washington Non-Essential Uses of Probability in Analysis Part IV Efficient Markovian Couplings Krzysztof Burdzy University of Washington 1 Review See B and Kendall (2000) for more details. See also the unpublished

More information

Solving the Poisson Disorder Problem

Solving the Poisson Disorder Problem Advances in Finance and Stochastics: Essays in Honour of Dieter Sondermann, Springer-Verlag, 22, (295-32) Research Report No. 49, 2, Dept. Theoret. Statist. Aarhus Solving the Poisson Disorder Problem

More information

Latent voter model on random regular graphs

Latent voter model on random regular graphs Latent voter model on random regular graphs Shirshendu Chatterjee Cornell University (visiting Duke U.) Work in progress with Rick Durrett April 25, 2011 Outline Definition of voter model and duality with

More information

Lecture 11: Introduction to Markov Chains. Copyright G. Caire (Sample Lectures) 321

Lecture 11: Introduction to Markov Chains. Copyright G. Caire (Sample Lectures) 321 Lecture 11: Introduction to Markov Chains Copyright G. Caire (Sample Lectures) 321 Discrete-time random processes A sequence of RVs indexed by a variable n 2 {0, 1, 2,...} forms a discretetime random process

More information

Almost sure asymptotics for the random binary search tree

Almost sure asymptotics for the random binary search tree AofA 10 DMTCS proc. AM, 2010, 565 576 Almost sure asymptotics for the rom binary search tree Matthew I. Roberts Laboratoire de Probabilités et Modèles Aléatoires, Université Paris VI Case courrier 188,

More information

Math 416 Lecture 11. Math 416 Lecture 16 Exam 2 next time

Math 416 Lecture 11. Math 416 Lecture 16 Exam 2 next time Math 416 Lecture 11 Math 416 Lecture 16 Exam 2 next time Birth and death processes, queueing theory In arrival processes, the state only jumps up. In a birth-death process, it can either jump up or down

More information

Properties of an infinite dimensional EDS system : the Muller s ratchet

Properties of an infinite dimensional EDS system : the Muller s ratchet Properties of an infinite dimensional EDS system : the Muller s ratchet LATP June 5, 2011 A ratchet source : wikipedia Plan 1 Introduction : The model of Haigh 2 3 Hypothesis (Biological) : The population

More information

n [ F (b j ) F (a j ) ], n j=1(a j, b j ] E (4.1)

n [ F (b j ) F (a j ) ], n j=1(a j, b j ] E (4.1) 1.4. CONSTRUCTION OF LEBESGUE-STIELTJES MEASURES In this section we shall put to use the Carathéodory-Hahn theory, in order to construct measures with certain desirable properties first on the real line

More information

Limiting distribution for subcritical controlled branching processes with random control function

Limiting distribution for subcritical controlled branching processes with random control function Available online at www.sciencedirect.com Statistics & Probability Letters 67 (2004 277 284 Limiting distribution for subcritical controlled branching processes with random control function M. Gonzalez,

More information

1 Continuous-time chains, finite state space

1 Continuous-time chains, finite state space Université Paris Diderot 208 Markov chains Exercises 3 Continuous-time chains, finite state space Exercise Consider a continuous-time taking values in {, 2, 3}, with generator 2 2. 2 2 0. Draw the diagramm

More information

In particular, if A is a square matrix and λ is one of its eigenvalues, then we can find a non-zero column vector X with

In particular, if A is a square matrix and λ is one of its eigenvalues, then we can find a non-zero column vector X with Appendix: Matrix Estimates and the Perron-Frobenius Theorem. This Appendix will first present some well known estimates. For any m n matrix A = [a ij ] over the real or complex numbers, it will be convenient

More information

Regular finite Markov chains with interval probabilities

Regular finite Markov chains with interval probabilities 5th International Symposium on Imprecise Probability: Theories and Applications, Prague, Czech Republic, 2007 Regular finite Markov chains with interval probabilities Damjan Škulj Faculty of Social Sciences

More information

CHUN-HUA GUO. Key words. matrix equations, minimal nonnegative solution, Markov chains, cyclic reduction, iterative methods, convergence rate

CHUN-HUA GUO. Key words. matrix equations, minimal nonnegative solution, Markov chains, cyclic reduction, iterative methods, convergence rate CONVERGENCE ANALYSIS OF THE LATOUCHE-RAMASWAMI ALGORITHM FOR NULL RECURRENT QUASI-BIRTH-DEATH PROCESSES CHUN-HUA GUO Abstract The minimal nonnegative solution G of the matrix equation G = A 0 + A 1 G +

More information

Example: physical systems. If the state space. Example: speech recognition. Context can be. Example: epidemics. Suppose each infected

Example: physical systems. If the state space. Example: speech recognition. Context can be. Example: epidemics. Suppose each infected 4. Markov Chains A discrete time process {X n,n = 0,1,2,...} with discrete state space X n {0,1,2,...} is a Markov chain if it has the Markov property: P[X n+1 =j X n =i,x n 1 =i n 1,...,X 0 =i 0 ] = P[X

More information

Probability Distributions

Probability Distributions Lecture : Background in Probability Theory Probability Distributions The probability mass function (pmf) or probability density functions (pdf), mean, µ, variance, σ 2, and moment generating function (mgf)

More information

EXAMINATIONS OF THE ROYAL STATISTICAL SOCIETY

EXAMINATIONS OF THE ROYAL STATISTICAL SOCIETY EXAMINATIONS OF THE ROYAL STATISTICAL SOCIETY GRADUATE DIPLOMA, 011 MODULE 3 : Stochastic processes and time series Time allowed: Three Hours Candidates should answer FIVE questions. All questions carry

More information

CHAPTER 8: EXPLORING R

CHAPTER 8: EXPLORING R CHAPTER 8: EXPLORING R LECTURE NOTES FOR MATH 378 (CSUSM, SPRING 2009). WAYNE AITKEN In the previous chapter we discussed the need for a complete ordered field. The field Q is not complete, so we constructed

More information

Metric Spaces and Topology

Metric Spaces and Topology Chapter 2 Metric Spaces and Topology From an engineering perspective, the most important way to construct a topology on a set is to define the topology in terms of a metric on the set. This approach underlies

More information

Quasi-stationary distributions

Quasi-stationary distributions Quasi-stationary distributions Phil Pollett Discipline of Mathematics The University of Queensland http://www.maths.uq.edu.au/ pkp AUSTRALIAN RESEARCH COUNCIL Centre of Excellence for Mathematics and Statistics

More information

reversed chain is ergodic and has the same equilibrium probabilities (check that π j =

reversed chain is ergodic and has the same equilibrium probabilities (check that π j = Lecture 10 Networks of queues In this lecture we shall finally get around to consider what happens when queues are part of networks (which, after all, is the topic of the course). Firstly we shall need

More information

University of Twente. Faculty of Mathematical Sciences. The deviation matrix of a continuous-time Markov chain

University of Twente. Faculty of Mathematical Sciences. The deviation matrix of a continuous-time Markov chain Faculty of Mathematical Sciences University of Twente University for Technical and Social Sciences P.O. Box 217 75 AE Enschede The Netherlands Phone: +31-53-48934 Fax: +31-53-4893114 Email: memo@math.utwente.nl

More information

Modelling Complex Queuing Situations with Markov Processes

Modelling Complex Queuing Situations with Markov Processes Modelling Complex Queuing Situations with Markov Processes Jason Randal Thorne, School of IT, Charles Sturt Uni, NSW 2795, Australia Abstract This article comments upon some new developments in the field

More information

12 Markov chains The Markov property

12 Markov chains The Markov property 12 Markov chains Summary. The chapter begins with an introduction to discrete-time Markov chains, and to the use of matrix products and linear algebra in their study. The concepts of recurrence and transience

More information

Potentials of stable processes

Potentials of stable processes Potentials of stable processes A. E. Kyprianou A. R. Watson 5th December 23 arxiv:32.222v [math.pr] 4 Dec 23 Abstract. For a stable process, we give an explicit formula for the potential measure of the

More information

CDA6530: Performance Models of Computers and Networks. Chapter 3: Review of Practical Stochastic Processes

CDA6530: Performance Models of Computers and Networks. Chapter 3: Review of Practical Stochastic Processes CDA6530: Performance Models of Computers and Networks Chapter 3: Review of Practical Stochastic Processes Definition Stochastic process X = {X(t), t2 T} is a collection of random variables (rvs); one rv

More information

First passage time for Brownian motion and piecewise linear boundaries

First passage time for Brownian motion and piecewise linear boundaries To appear in Methodology and Computing in Applied Probability, (2017) 19: 237-253. doi 10.1007/s11009-015-9475-2 First passage time for Brownian motion and piecewise linear boundaries Zhiyong Jin 1 and

More information

Stochastic modelling of epidemic spread

Stochastic modelling of epidemic spread Stochastic modelling of epidemic spread Julien Arino Department of Mathematics University of Manitoba Winnipeg Julien Arino@umanitoba.ca 19 May 2012 1 Introduction 2 Stochastic processes 3 The SIS model

More information

Convergence Rates for Renewal Sequences

Convergence Rates for Renewal Sequences Convergence Rates for Renewal Sequences M. C. Spruill School of Mathematics Georgia Institute of Technology Atlanta, Ga. USA January 2002 ABSTRACT The precise rate of geometric convergence of nonhomogeneous

More information

Lebesgue Measure on R n

Lebesgue Measure on R n CHAPTER 2 Lebesgue Measure on R n Our goal is to construct a notion of the volume, or Lebesgue measure, of rather general subsets of R n that reduces to the usual volume of elementary geometrical sets

More information

ONLINE APPENDIX TO: NONPARAMETRIC IDENTIFICATION OF THE MIXED HAZARD MODEL USING MARTINGALE-BASED MOMENTS

ONLINE APPENDIX TO: NONPARAMETRIC IDENTIFICATION OF THE MIXED HAZARD MODEL USING MARTINGALE-BASED MOMENTS ONLINE APPENDIX TO: NONPARAMETRIC IDENTIFICATION OF THE MIXED HAZARD MODEL USING MARTINGALE-BASED MOMENTS JOHANNES RUF AND JAMES LEWIS WOLTER Appendix B. The Proofs of Theorem. and Proposition.3 The proof

More information

Figure 10.1: Recording when the event E occurs

Figure 10.1: Recording when the event E occurs 10 Poisson Processes Let T R be an interval. A family of random variables {X(t) ; t T} is called a continuous time stochastic process. We often consider T = [0, 1] and T = [0, ). As X(t) is a random variable

More information

The Distribution of Mixing Times in Markov Chains

The Distribution of Mixing Times in Markov Chains The Distribution of Mixing Times in Markov Chains Jeffrey J. Hunter School of Computing & Mathematical Sciences, Auckland University of Technology, Auckland, New Zealand December 2010 Abstract The distribution

More information

(U) =, if 0 U, 1 U, (U) = X, if 0 U, and 1 U. (U) = E, if 0 U, but 1 U. (U) = X \ E if 0 U, but 1 U. n=1 A n, then A M.

(U) =, if 0 U, 1 U, (U) = X, if 0 U, and 1 U. (U) = E, if 0 U, but 1 U. (U) = X \ E if 0 U, but 1 U. n=1 A n, then A M. 1. Abstract Integration The main reference for this section is Rudin s Real and Complex Analysis. The purpose of developing an abstract theory of integration is to emphasize the difference between the

More information

At the boundary states, we take the same rules except we forbid leaving the state space, so,.

At the boundary states, we take the same rules except we forbid leaving the state space, so,. Birth-death chains Monday, October 19, 2015 2:22 PM Example: Birth-Death Chain State space From any state we allow the following transitions: with probability (birth) with probability (death) with probability

More information

Birth and Death Processes. Birth and Death Processes. Linear Growth with Immigration. Limiting Behaviour for Birth and Death Processes

Birth and Death Processes. Birth and Death Processes. Linear Growth with Immigration. Limiting Behaviour for Birth and Death Processes DTU Informatics 247 Stochastic Processes 6, October 27 Today: Limiting behaviour of birth and death processes Birth and death processes with absorbing states Finite state continuous time Markov chains

More information

Modern Discrete Probability Branching processes

Modern Discrete Probability Branching processes Modern Discrete Probability IV - Branching processes Review Sébastien Roch UW Madison Mathematics November 15, 2014 1 Basic definitions 2 3 4 Galton-Watson branching processes I Definition A Galton-Watson

More information

Introductory Analysis 2 Spring 2010 Exam 1 February 11, 2015

Introductory Analysis 2 Spring 2010 Exam 1 February 11, 2015 Introductory Analysis 2 Spring 21 Exam 1 February 11, 215 Instructions: You may use any result from Chapter 2 of Royden s textbook, or from the first four chapters of Pugh s textbook, or anything seen

More information

( f ^ M _ M 0 )dµ (5.1)

( f ^ M _ M 0 )dµ (5.1) 47 5. LEBESGUE INTEGRAL: GENERAL CASE Although the Lebesgue integral defined in the previous chapter is in many ways much better behaved than the Riemann integral, it shares its restriction to bounded

More information

The Skorokhod problem in a time-dependent interval

The Skorokhod problem in a time-dependent interval The Skorokhod problem in a time-dependent interval Krzysztof Burdzy, Weining Kang and Kavita Ramanan University of Washington and Carnegie Mellon University Abstract: We consider the Skorokhod problem

More information