Chapter 2

A MODIFIED MARKOV BRANCHING PROCESS WITH IMMIGRATION

2.1 Introduction

Several physical and biological populations exhibit a branching character in their evolutionary behaviour, in the sense that each individual (also called a "particle") of one type lives for some amount of time and then splits into several individuals of the same type or of different types. These processes are called cascade processes, reproductive processes or branching processes. Two important factors which characterize these processes are the random life-time of an individual and the random number of its off-springs. By considering various assumptions on the life-time and the number of off-springs, several models have been proposed and analysed very extensively in the past (see, for example, Harris [34], Srinivasan [83], Mode [57], Athreya and Ney [ ] and Asmussen and Hering [ ]).

Note: The content of the first model of the chapter is published in the Proceedings of the International Conference on Mathematical Computer Engineering (ICMCE23), pp. -8, ISBN , and the content of the second model of the chapter has been communicated for publication in the international journal Far East Journal of Mathematical Sciences.

One class of models is the class of Markov branching processes. A Markov branching process is a continuous-time branching process in which all particles behave independently and identically; all particles have a common exponential life-time and a common off-spring probability generating function. One important aspect inherent in biological growth is the immigration of particles of the same type from outside the population. Adke [2] has considered birth, death and migration processes. Foster [3] and Pakes [64] have separately studied a discrete branching process with immigration occurring only when the population size is 0. Aksland [3] has studied a simple birth-and-death process with state-independent immigration. Yamazato [2] has analysed a continuous-time branching process with state-dependent immigration in which, once the population size reaches the state 0, it sojourns in that state for an exponentially distributed time, at the end of which the population instantaneously resurrects to some positive state according to some probability distribution. Bartoszynski et al. [4] have considered Markov branching processes under the influence of disasters that arrive independently of the present population size and derived an integral equation for the probability of extinction. Chen and Renshaw [23] have studied Markov branching processes with instantaneous immigration. Pakes [66] has studied the problem of constructing a Markov branching process with instantaneous resurrection. Swift [89] has obtained transient probabilities for a simple birth-death-immigration process subject to catastrophes that occur at a constant rate. Li and Chen [54] have studied a Markov branching process with state-dependent immigration and resurrection. Di Crescenzo et al. [27] have studied birth-death processes subject to catastrophes and analysed the probability distribution of the first effective catastrophe occurrence time. Recently, Li and Liu [56] have analysed Markov branching processes with immigration-migration and resurrection. Li et al. [55] have studied the asymptotic properties of the Markov branching process with immigration. In the above models the particles have no age restriction on branching. However, branching requires a specified amount of time. Parthasarathy [67] considered a Markov branching process in which each particle surviving at least up to a definite time T splits into several particles at a later time t (t > T), while all particles whose life-time ends before the time T leave no descendants.

He calls T the maturity age of the particles. This type of maturity dependence in Markov branching processes has not been studied so far with immigration and catastrophes. To fill this gap, we investigate in the present chapter two models of Markov branching processes which incorporate immigration and age-dependent branching. In the first model the immigration is assumed to be state-independent, and in the second model it is assumed to be state-dependent.

The organization of the chapter is as follows. In section 2.2 we consider model I of the present chapter. Subsection 2.2.1 describes the first model, a modified Markov branching process with state-independent immigration. Inter-connected integral equations for conditional probability generating functions are formulated in subsection 2.2.2. We obtain explicit transient expressions for some of the characteristics of the population in subsection 2.2.3. We consider the second model of the present chapter in section 2.3. The model is described in subsection 2.3.1. Inter-connected integral equations are derived in subsection 2.3.2. An explicit expression for the mean number of particles at any time t is found in subsection 2.3.3. In subsection 2.3.4, we provide the analysis of the point process of resurrection. For analysing the behaviour of the two models, we present the model of Yamazato [2] in section 2.4 and obtain an explicit expression for the mean number of particles at any time t. We also obtain an explicit expression for the transient probability P_{00}(t) that the population size is 0 at time t, given that the population size at time t = 0 is 0. A numerical illustration is provided in section 2.5 to compare the two models of the present chapter with the model of Yamazato [2] and to highlight the impact of maturity-dependent branching with immigration and disasters.

2.2 Model I

2.2.1 Description of the model

We consider a modified Markov branching process with state-independent immigration. To be specific, a biological population is considered in which each individual lives for a random length L of time and, at the end of its life-time, splits into a random number of identical off-springs if L > T, where T is a positive constant. The individual leaves no descendants if L < T. The descendants behave independently and identically to the ancestor. We assume that the life-time of each individual is governed by the probability density function f(t) given by

f(t) = \lambda e^{-\lambda t}, \quad \lambda > 0, \ t > 0.  (2.2.1)

Let h(s) be the off-spring probability generating function,

h(s) = p_0 + \sum_{j=2}^{\infty} p_j s^j.  (2.2.2)

We note that

h'(1) = \sum_{j=2}^{\infty} j p_j,  (2.2.3)

and h'(1) represents the mean number of off-springs of an individual. The branching process is called sub-critical, critical or super-critical according as h'(1) < 1, h'(1) = 1 or h'(1) > 1. The growth of the population is supplemented by immigration of particles in batches from outside the population, occurring independently of the population according to a Poisson process with rate \gamma > 0. The number of particles in each batch is a random variable \xi having state space {1, 2, 3, ...} and a common probability generating function defined by

q(s) = \sum_{k=1}^{\infty} q_k s^k.  (2.2.4)
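The off-spring and batch-size laws enter the analysis only through their probability generating functions and the derivatives h'(1) and q'(1). The following minimal Python sketch uses an offspring law and a batch law chosen purely for illustration (they are assumptions, not values from this chapter) and shows how these quantities and the criticality regime are computed.

```python
# Illustrative offspring law p_j (with p_1 = 0) and batch-size law q_k.
p = {0: 0.3, 2: 0.5, 3: 0.2}
q = {1: 0.6, 2: 0.3, 3: 0.1}

def pgf(dist, s):
    """Probability generating function: sum_j dist[j] * s**j."""
    return sum(prob * s ** j for j, prob in dist.items())

def pgf_mean(dist):
    """Derivative of the pgf at s = 1, i.e. the mean of the distribution."""
    return sum(j * prob for j, prob in dist.items())

h1, q1 = pgf_mean(p), pgf_mean(q)       # h'(1) and q'(1)
regime = "sub-critical" if h1 < 1 else ("critical" if h1 == 1 else "super-critical")
print(f"h'(1) = {h1:.2f} ({regime}), q'(1) = {q1:.2f}, h(0.5) = {pgf(p, 0.5):.3f}")
```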

Let X(t) be the number of individuals of the population present at time t. Then the stochastic process {X(t), t \ge 0} is the Markov branching process with state-independent immigration and age-dependent branching.

2.2.2 The Probability Generating Functions

To study the process X(t), we define the following conditional probability generating functions:

G_0(s,t) = E\{s^{X(t)} \mid X(0) = 0\},  (2.2.5)
G_1(s,t) = E\{s^{X(t)} \mid X(0) = 1\}.  (2.2.6)

Case (i): t \le T. If X(0) = 0, then either an immigration occurs or no immigration takes place before t. Then we have

G_0(s,t) = e^{-\gamma t} + \gamma \int_0^t e^{-\gamma u}\, q(G_1(s, t-u))\, du.  (2.2.7)

If X(0) = 1, then three mutually exclusive and exhaustive events are possible:

e_1: No event takes place up to time t.
e_2: The first event that occurs after time t = 0 and before time t is that the life of the particle which existed at time t = 0 ends.
e_3: The first event that occurs after time t = 0 and before time t is that a batch immigration occurs.

Then we get

G_1(s,t) = s e^{-(\lambda+\gamma)t} + \lambda \int_0^t e^{-(\lambda+\gamma)u}\, G_0(s, t-u)\, du + \gamma \int_0^t e^{-(\lambda+\gamma)u} \sum_{k=1}^{\infty} q_k \{G_1(s, t-u)\}^{k+1}\, du.  (2.2.8)

Case (ii): t > T. If X(0) = 0, then either an immigration occurs or no immigration takes place before t. Then we have

G_0(s,t) = e^{-\gamma t} + \gamma \int_0^t e^{-\gamma u}\, q(G_1(s, t-u))\, du.  (2.2.9)

If X(0) = 1, then three mutually exclusive and exhaustive events are possible:

e_1: No event takes place up to time t.
e_2: The first event that occurs after time t = 0 and before time t is that the life of the particle which existed at time t = 0 ends. The time of occurrence of this event may be either before the maturity time T or after the maturity time T. If it is before the time T, then no off-spring is produced. If it is after the time T, then a random number of off-springs is produced.
e_3: The first event that occurs after time t = 0 and before time t is that a batch immigration occurs.

Then we get

G_1(s,t) = s e^{-(\lambda+\gamma)t} + \lambda \int_0^T e^{-(\lambda+\gamma)u}\, G_0(s, t-u)\, du + \lambda \int_T^t e^{-(\lambda+\gamma)u} \Big[ p_0 G_0(s, t-u) + \sum_{j=2}^{\infty} p_j \{G_1(s, t-u)\}^j \Big] du + \gamma \int_0^t e^{-(\lambda+\gamma)u} \sum_{k=1}^{\infty} q_k \{G_1(s, t-u)\}^{k+1}\, du.  (2.2.10)
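Before turning to the moment analysis, the dynamics encoded in (2.2.7)-(2.2.10) can be mimicked directly by an event-driven Monte Carlo simulation. The following Python sketch is illustrative only: the offspring law, batch law and parameter values are assumptions, and the function name simulate_model_I is ours, not part of the chapter.

```python
import heapq, random

P_OFFSPRING = {0: 0.3, 2: 0.5, 3: 0.2}   # offspring law p_j (illustrative)
Q_BATCH = {1: 0.6, 2: 0.3, 3: 0.1}       # immigration batch law q_k (illustrative)

def _sample(dist, rng):
    """Draw from a finite discrete distribution given as {value: probability}."""
    u, c = rng.random(), 0.0
    for k, prob in dist.items():
        c += prob
        if u <= c:
            return k
    return max(dist)  # guard against floating-point rounding

def simulate_model_I(lam, gamma, T, p, q, horizon, x0=1, rng=random):
    """One sample path of model I; returns X(horizon).

    Each individual lives an Exp(lam) time; if its realised lifetime exceeds T it
    splits according to p, otherwise it leaves no descendants.  Immigrant batches
    (size law q) arrive in a Poisson stream of rate gamma, independently of the
    current population size.
    """
    events = []  # min-heap of (event_time, kind, lifetime-or-None)

    def schedule_death(now):
        life = rng.expovariate(lam)
        heapq.heappush(events, (now + life, 'death', life))

    heapq.heappush(events, (rng.expovariate(gamma), 'immigration', None))
    for _ in range(x0):
        schedule_death(0.0)
    alive = x0

    while events and events[0][0] <= horizon:
        t, kind, life = heapq.heappop(events)
        if kind == 'immigration':
            batch = _sample(q, rng)
            alive += batch
            for _ in range(batch):
                schedule_death(t)
            heapq.heappush(events, (t + rng.expovariate(gamma), 'immigration', None))
        else:
            alive -= 1
            if life > T:                       # matured: splits per the offspring law
                kids = _sample(p, rng)
                alive += kids
                for _ in range(kids):
                    schedule_death(t)
    return alive

# Crude Monte Carlo estimate of E[X(t) | X(0) = 1] at t = 2.0 (assumed parameters).
est = sum(simulate_model_I(0.3, 0.1, 0.3, P_OFFSPRING, Q_BATCH, 2.0)
          for _ in range(5000)) / 5000
print("Monte Carlo estimate of the mean population size at t = 2:", est)
```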

2.2.3 The mean number of individuals

We define the following conditional means:

m_0(t) = E\{X(t) \mid X(0) = 0\},  (2.2.11)
m_1(t) = E\{X(t) \mid X(0) = 1\}.  (2.2.12)

If t \le T, then differentiating (2.2.7) and (2.2.8) with respect to s and putting s = 1, we get respectively

m_0(t) = \gamma q'(1) \int_0^t e^{-\gamma u}\, m_1(t-u)\, du,  (2.2.13)

m_1(t) = e^{-(\lambda+\gamma)t} + \lambda \int_0^t e^{-(\lambda+\gamma)u}\, m_0(t-u)\, du + \gamma[q'(1)+1] \int_0^t e^{-(\lambda+\gamma)u}\, m_1(t-u)\, du.  (2.2.14)

If t > T, then differentiating (2.2.9) and (2.2.10) with respect to s and putting s = 1, we get respectively

m_0(t) = \gamma q'(1) \int_0^t e^{-\gamma u}\, m_1(t-u)\, du,  (2.2.15)

m_1(t) = e^{-(\lambda+\gamma)t} + \lambda \int_0^T e^{-(\lambda+\gamma)u}\, m_0(t-u)\, du + \lambda p_0 \int_T^t e^{-(\lambda+\gamma)u}\, m_0(t-u)\, du + \lambda h'(1) \int_T^t e^{-(\lambda+\gamma)u}\, m_1(t-u)\, du + \gamma[q'(1)+1] \int_0^t e^{-(\lambda+\gamma)u}\, m_1(t-u)\, du.  (2.2.16)

Using the Heaviside function, (2.2.14) and (2.2.16) yield

m_1(t) = e^{-(\lambda+\gamma)t} + \lambda \int_0^t [1 - H(u-T)] e^{-(\lambda+\gamma)u}\, m_0(t-u)\, du + \lambda p_0 \int_0^t H(u-T) e^{-(\lambda+\gamma)u}\, m_0(t-u)\, du + a \int_0^t H(u-T) e^{-(\lambda+\gamma)u}\, m_1(t-u)\, du + (b+\gamma) \int_0^t e^{-(\lambda+\gamma)u}\, m_1(t-u)\, du, \quad t > 0,  (2.2.17)

where a = \lambda h'(1) and b = \gamma q'(1). From (2.2.13) and (2.2.15), we get

m_0(t) = b \int_0^t e^{-\gamma u}\, m_1(t-u)\, du, \quad t > 0.  (2.2.18)

Using Laplace transforms, (2.2.17) and (2.2.18) give

m_1(\theta) = \frac{1}{\theta+\lambda+\gamma} + \lambda m_0(\theta)\,\frac{1 - e^{-(\theta+\lambda+\gamma)T}}{\theta+\lambda+\gamma} + \lambda p_0 m_0(\theta)\,\frac{e^{-(\theta+\lambda+\gamma)T}}{\theta+\lambda+\gamma} + \frac{a\, e^{-(\theta+\lambda+\gamma)T}}{\theta+\lambda+\gamma}\, m_1(\theta) + \frac{b+\gamma}{\theta+\lambda+\gamma}\, m_1(\theta);  (2.2.19)

m_0(\theta) = \frac{b}{\theta+\gamma}\, m_1(\theta).  (2.2.20)

Solving (2.2.19) and (2.2.20), we get

m_0(\theta) = \frac{b}{(\theta+\lambda+\gamma)(\theta+\gamma) - b\lambda - (b+\gamma)(\theta+\gamma) - [a(\theta+\gamma) - b\lambda(1-p_0)]\, e^{-(\theta+\lambda+\gamma)T}},  (2.2.21)

m_1(\theta) = \frac{\theta+\gamma}{(\theta+\lambda+\gamma)(\theta+\gamma) - b\lambda - (b+\gamma)(\theta+\gamma) - [a(\theta+\gamma) - b\lambda(1-p_0)]\, e^{-(\theta+\lambda+\gamma)T}}.  (2.2.22)

From (2.2.21), expanding by the binomial series and then by the multinomial theorem, we get

m_0(\theta) = b \sum_{n=0}^{\infty} \frac{\big[b\lambda + (b+\gamma)(\theta+\gamma) + \{a(\theta+\gamma) - b\lambda(1-p_0)\} e^{-(\theta+\lambda+\gamma)T}\big]^n}{(\theta+\lambda+\gamma)^{n+1}(\theta+\gamma)^{n+1}}
= \sum_{n=0}^{\infty}\sum_{j=0}^{n}\sum_{k=0}^{n-j}\sum_{l=0}^{j} \binom{n}{j}\binom{n-j}{k}\binom{j}{l}(-1)^l\, b^{n+1-j-k+l}\, \lambda^{n-j-k+l}\, (b+\gamma)^k\, a^{j-l}\, (1-p_0)^l\, e^{-j(\theta+\lambda+\gamma)T} \left[\frac{\delta_{n+1+l-j-k,\,0}}{(\theta+\lambda+\gamma)^{n+1}} + (1-\delta_{n+1+l-j-k,\,0})\sum_{r=0}^{\infty}\binom{n+l-j-k+r}{r}\frac{\lambda^r}{(\theta+\lambda+\gamma)^{2n+2+l-j-k+r}}\right].  (2.2.23)

On inversion, (2.2.23) gives

m_0(t) = e^{-(\lambda+\gamma)t}\sum_{n=0}^{\infty}\sum_{j=0}^{n}\sum_{k=0}^{n-j}\sum_{l=0}^{j}\binom{n}{j}\binom{n-j}{k}\binom{j}{l}(-1)^l\, b^{n+1-j-k+l}\,\lambda^{n-j-k+l}\,(b+\gamma)^k\, a^{j-l}\,(1-p_0)^l\, H(t-jT) \left[\delta_{n+1+l-j-k,\,0}\,\frac{(t-jT)^n}{n!} + (1-\delta_{n+1+l-j-k,\,0})\sum_{r=0}^{\infty}\binom{n+l-j-k+r}{r}\lambda^r\,\frac{(t-jT)^{2n+1+l-j-k+r}}{(2n+1+l-j-k+r)!}\right].  (2.2.24)

Similarly, using (2.2.22), we get

m_1(\theta) = \sum_{n=0}^{\infty} \frac{\big[b\lambda + (b+\gamma)(\theta+\gamma) + \{a(\theta+\gamma) - b\lambda(1-p_0)\} e^{-(\theta+\lambda+\gamma)T}\big]^n}{(\theta+\lambda+\gamma)^{n+1}(\theta+\gamma)^{n}}
= \sum_{n=0}^{\infty}\sum_{j=0}^{n}\sum_{k=0}^{n-j}\sum_{l=0}^{j}\binom{n}{j}\binom{n-j}{k}\binom{j}{l}(-1)^{j-l}\,(b\lambda)^{n-k-l}\,(b+\gamma)^k\, a^{l}\,(1-p_0)^{j-l}\, e^{-j(\theta+\lambda+\gamma)T} \left[\frac{\delta_{n-k-l,\,0}}{(\theta+\lambda+\gamma)^{n+1}} + (1-\delta_{n-k-l,\,0})\sum_{r=0}^{\infty}\binom{n-k-l+r-1}{r}\frac{\lambda^r}{(\theta+\lambda+\gamma)^{2n+1-k-l+r}}\right].  (2.2.25)

On inversion, (2.2.25) yields

m_1(t) = e^{-(\lambda+\gamma)t}\sum_{n=0}^{\infty}\sum_{j=0}^{n}\sum_{k=0}^{n-j}\sum_{l=0}^{j}\binom{n}{j}\binom{n-j}{k}\binom{j}{l}(-1)^{j-l}\,(b\lambda)^{n-k-l}\,(b+\gamma)^k\, a^{l}\,(1-p_0)^{j-l}\, H(t-jT) \left[\delta_{n-k-l,\,0}\,\frac{(t-jT)^n}{n!} + (1-\delta_{n-k-l,\,0})\sum_{r=0}^{\infty}\binom{n-k-l+r-1}{r}\lambda^r\,\frac{(t-jT)^{2n-k-l+r}}{(2n-k-l+r)!}\right].  (2.2.26)

By setting \gamma = 0 in (2.2.26), we get back the result of Parthasarathy [67]:

m_1(t) = e^{-\lambda t}\sum_{n=0}^{\infty} H(t-nT)\,\frac{\{\lambda h'(1)(t-nT)\}^n}{n!}.  (2.2.27)

Further, by setting T = 0, we get back the classical result:

m_1(t) = e^{\lambda(h'(1)-1)t}.  (2.2.28)
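As an alternative to evaluating the series (2.2.24) and (2.2.26), the convolution equations (2.2.17) and (2.2.18) can be solved numerically by discretizing the integrals on a uniform grid. The sketch below is an illustration under assumed parameter values (an explicit right-endpoint rule, so only first-order accurate); it also checks the \gamma = 0 limit against the Parthasarathy series (2.2.27).

```python
import math

def model_I_means(lam, gamma, T, p0, h1, q1, horizon, dt=2e-3):
    """Grid solution of the convolution equations (2.2.17)-(2.2.18) for m_0(t), m_1(t).

    h1 = h'(1), q1 = q'(1).  An explicit right-endpoint rule is used; this is a
    sketch, not a production solver.
    """
    a, b = lam * h1, gamma * q1
    n = int(round(horizon / dt))
    m0, m1 = [0.0] * (n + 1), [1.0] * (n + 1)
    for i in range(1, n + 1):
        acc1 = acc0 = 0.0
        for j in range(1, i + 1):
            u = j * dt
            w = math.exp(-(lam + gamma) * u)
            k0 = lam if u <= T else lam * p0            # kernel acting on m_0 in (2.2.17)
            k1 = (b + gamma) + (a if u > T else 0.0)    # kernel acting on m_1 in (2.2.17)
            acc1 += w * (k0 * m0[i - j] + k1 * m1[i - j])
            acc0 += math.exp(-gamma * u) * m1[i - j]
        m1[i] = math.exp(-(lam + gamma) * i * dt) + dt * acc1
        m0[i] = b * dt * acc0
    return m0, m1

# Illustrative run (assumed parameter values), plus a check of the gamma = 0 limit
# against the Parthasarathy series (2.2.27).
lam, T, p0, h1, q1, t = 0.3, 0.3, 0.4, 1.2, 2.3, 2.0
print("m_1(2) with immigration:", model_I_means(lam, 0.1, T, p0, h1, q1, t)[1][-1])
series = math.exp(-lam * t) * sum((lam * h1 * (t - n * T)) ** n / math.factorial(n)
                                  for n in range(int(t / T) + 1))
print("gamma = 0:", model_I_means(lam, 0.0, T, p0, h1, q1, t)[1][-1], "series (2.2.27):", series)
```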

2.3 Model II

2.3.1 Description of the model

We consider a maturity-dependent Markov branching process with state-dependent immigration. As in model I, each individual of the population lives for a random length L of time and, at the end of its life-time, splits into a random number of identical off-springs if L > T, where T is a positive constant. The individual leaves no descendants if L < T. The descendants behave independently and identically to the ancestor. We assume that the life-time of each individual is governed by the probability density function f(t) given by

f(t) = \lambda e^{-\lambda t}, \quad \lambda > 0, \ t > 0.  (2.3.1)

Let h(s) be the off-spring probability generating function, which is given by

h(s) = p_0 + \sum_{j=2}^{\infty} p_j s^j.

Suppose that m is the mean number of off-springs of an individual. Then we obtain

m = h'(1) = \sum_{j=2}^{\infty} j p_j.  (2.3.2)

The branching process is called sub-critical, critical or super-critical according as m < 1, m = 1 or m > 1. We assume that catastrophes arrive in the population according to a Poisson process with rate \eta and that each catastrophe wipes out the population instantaneously.

The growth of the population is supplemented by immigration of particles from outside the population, which occurs whenever the population size is zero. We assume that the sojourn time of the population in the state zero is exponentially distributed with mean 1/\gamma. We assume that at the expiry of this sojourn time a batch of k immigrants joins the population with probability q_k, and we define the probability generating function of the batch size by

q(s) = \sum_{k=1}^{\infty} q_k s^k.  (2.3.3)

Let X(t) be the number of individuals of the population present at time t. Then the stochastic process {X(t), t \ge 0} is the maturity-dependent Markov branching process with state-dependent immigration subject to disasters.

2.3.2 The Probability Generating Function of X(t)

We define the following conditional probability generating functions:

G_0(s,t) = E\{s^{X(t)} \mid X(0) = 0\},  (2.3.4)
G_1(s,t) = E\{s^{X(t)} \mid X(0) = 1\}.  (2.3.5)

Using the total probability theorem, we obtain

G_0(s,t) = e^{-\gamma t} + \gamma \int_0^t e^{-\gamma u}\, q(G_1(s, t-u))\, du.  (2.3.6)

To find the integral equation for G_1(s,t), we have two cases.

Case t \le T: In this case, we note that (i) no event occurs up to time t, or (ii) the first event that occurs in (0, t) is that the life-time of the particle which existed at time t = 0 ends before the time t without producing any off-spring, or (iii) the first event that occurs in (0, t) is that a disaster occurs at some time in (0, t). Then, by the total probability law, we get

G_1(s,t) = e^{-(\lambda+\eta)t} s + (\lambda+\eta) \int_0^t e^{-(\lambda+\eta)u}\, G_0(s, t-u)\, du.  (2.3.7)

Case t > T: In this case, we note that (i) no event occurs up to time t, so that there is exactly one individual existing at time t, or (ii) the first event that occurs in (0, t) is that the life-time of the particle which existed at time t = 0 ends before the time T without producing any off-spring, or (iii) the first event that occurs in (0, t) is that a disaster occurs at some time in (0, t), wiping out the population, or (iv) the first event that occurs in (0, t) is that the life-time of the particle which existed at time t = 0 ends in (T, t), producing a random number of individuals governed by the probability generating function h(s). Then, by applying the total probability law, we obtain

G_1(s,t) = e^{-(\lambda+\eta)t} s + \lambda \int_0^T e^{-(\lambda+\eta)u}\, G_0(s, t-u)\, du + \eta \int_0^t e^{-(\lambda+\eta)u}\, G_0(s, t-u)\, du + \lambda \int_T^t e^{-(\lambda+\eta)u}\Big[p_0 G_0(s, t-u) + \sum_{j=2}^{\infty} p_j (G_1(s, t-u))^j\Big] du.  (2.3.8)
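Model II can likewise be simulated directly from its verbal description: exponential life-times with maturity threshold T, disasters at rate \eta that empty the population, and resurrection by an immigrant batch after an exponential sojourn in state 0. The following event-driven Python sketch is an illustration under assumed distributions and parameter values; simulate_model_II is our naming, not the chapter's.

```python
import heapq, random

P_OFFSPRING = {0: 0.3, 2: 0.5, 3: 0.2}   # offspring law p_j (illustrative)
Q_BATCH = {1: 0.6, 2: 0.3, 3: 0.1}       # resurrection batch law q_k (illustrative)

def _sample(dist, rng):
    u, c = rng.random(), 0.0
    for k, prob in dist.items():
        c += prob
        if u <= c:
            return k
    return max(dist)

def simulate_model_II(lam, gamma, eta, T, p, q, horizon, x0=1, rng=random):
    """One sample path of model II; returns X(horizon).

    Individuals live Exp(lam) and split (law p) only if their lifetime exceeds T.
    Disasters arrive at rate eta and instantly wipe out the whole population.  An
    empty population stays empty for an Exp(gamma) time and is then resurrected by
    an immigrant batch with size law q.
    """
    deaths = []                                            # heap of (death_time, lifetime)
    for _ in range(x0):
        life = rng.expovariate(lam)
        heapq.heappush(deaths, (life, life))
    alive = x0
    next_disaster = rng.expovariate(eta) if eta > 0 else float('inf')
    next_resurrection = float('inf') if alive > 0 else rng.expovariate(gamma)

    while True:
        t_death = deaths[0][0] if deaths else float('inf')
        t = min(t_death, next_disaster, next_resurrection)
        if t > horizon:
            return alive
        if t == next_disaster:                             # disaster: population wiped out
            deaths.clear()
            alive = 0
            next_disaster = t + rng.expovariate(eta)
            next_resurrection = t + rng.expovariate(gamma)
        elif t == next_resurrection:                       # resurrection by an immigrant batch
            batch = _sample(q, rng)
            alive = batch
            for _ in range(batch):
                life = rng.expovariate(lam)
                heapq.heappush(deaths, (t + life, life))
            next_resurrection = float('inf')
        else:                                              # an individual's life ends
            _, life = heapq.heappop(deaths)
            alive -= 1
            if life > T:                                   # matured: splits per the offspring law
                kids = _sample(p, rng)
                alive += kids
                for _ in range(kids):
                    l = rng.expovariate(lam)
                    heapq.heappush(deaths, (t + l, l))
            if alive == 0:
                next_resurrection = t + rng.expovariate(gamma)

# Crude estimates of the mean at t = 2 with and without disasters (assumed parameters).
for eta in (0.0, 0.4):
    est = sum(simulate_model_II(0.3, 0.1, eta, 0.3, P_OFFSPRING, Q_BATCH, 2.0)
              for _ in range(5000)) / 5000
    print(f"eta = {eta}: estimated mean population size at t = 2 is {est:.3f}")
```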

2.3.3 The mean number of individuals

We define the following conditional means:

m_0(t) = E\{X(t) \mid X(0) = 0\},  (2.3.9)
m_1(t) = E\{X(t) \mid X(0) = 1\}.  (2.3.10)

Differentiating (2.3.6) with respect to s and putting s = 1, we get

m_0(t) = b \int_0^t e^{-\gamma u}\, m_1(t-u)\, du, \quad t > 0.  (2.3.11)

Differentiating (2.3.7) with respect to s and putting s = 1, we get

m_1(t) = e^{-(\lambda+\eta)t} + (\lambda+\eta) \int_0^t e^{-(\lambda+\eta)u}\, m_0(t-u)\, du, \quad t \le T.  (2.3.12)

Differentiating (2.3.8) with respect to s and putting s = 1, we get

m_1(t) = e^{-(\lambda+\eta)t} + \lambda \int_0^T e^{-(\lambda+\eta)u}\, m_0(t-u)\, du + \eta \int_0^t e^{-(\lambda+\eta)u}\, m_0(t-u)\, du + \lambda \int_T^t e^{-(\lambda+\eta)u}\big[p_0 m_0(t-u) + h'(1) m_1(t-u)\big]\, du, \quad t > T.  (2.3.13)

Using the Heaviside function, we write (2.3.12) and (2.3.13) together as

m_1(t) = e^{-(\lambda+\eta)t} + \lambda \int_0^t [1 - H(u-T)] e^{-(\lambda+\eta)u}\, m_0(t-u)\, du + \eta \int_0^t e^{-(\lambda+\eta)u}\, m_0(t-u)\, du + \lambda \int_0^t H(u-T) e^{-(\lambda+\eta)u}\big[p_0 m_0(t-u) + h'(1) m_1(t-u)\big]\, du.  (2.3.14)

Using Laplace transforms, (2.3.11) gives

m_0(\theta) = \frac{b\, m_1(\theta)}{\theta+\gamma}.  (2.3.15)

Similarly, (2.3.14) yields

m_1(\theta) = \frac{1}{\theta+\lambda+\eta} + \lambda m_0(\theta)\,\frac{1 - e^{-(\theta+\lambda+\eta)T}}{\theta+\lambda+\eta} + \frac{\eta\, m_0(\theta)}{\theta+\lambda+\eta} + \frac{\lambda e^{-(\theta+\lambda+\eta)T}\big(p_0 m_0(\theta) + h'(1) m_1(\theta)\big)}{\theta+\lambda+\eta}.  (2.3.16)

Substituting (2.3.15) in (2.3.16) and simplifying, we obtain

m_1(\theta) = \frac{\theta+\gamma}{(\theta+\gamma)(\theta+\lambda+\eta) + \{\lambda b(1-p_0) - a(\theta+\gamma)\}\, e^{-(\theta+\lambda+\eta)T} - b(\lambda+\eta)},  (2.3.17)

where a = \lambda h'(1) and b = \gamma q'(1). Then, substituting (2.3.17) in (2.3.15), we get

m_0(\theta) = \frac{b}{(\theta+\gamma)(\theta+\lambda+\eta) + \{\lambda b(1-p_0) - a(\theta+\gamma)\}\, e^{-(\theta+\lambda+\eta)T} - b(\lambda+\eta)}.  (2.3.18)

Assuming that the binomial expansion is valid in a suitable domain, (2.3.18) yields

m_0(\theta) = b \sum_{n=0}^{\infty} \frac{\big[b(\lambda+\eta) + \{a(\theta+\gamma) - b\lambda(1-p_0)\}\, e^{-(\theta+\lambda+\eta)T}\big]^n}{(\theta+\lambda+\eta)^{n+1}(\theta+\gamma)^{n+1}}
= \sum_{n=0}^{\infty}\sum_{j=0}^{n}\sum_{k=0}^{j}\binom{n}{j}\binom{j}{k}(-1)^{j-k}\, a^k\, \lambda^{j-k}\, (1-p_0)^{j-k}\, b^{n+1-k}\, (\lambda+\eta)^{n-j}\, e^{-j(\theta+\lambda+\eta)T} \left[\frac{\delta_{n+1-k,\,0}}{(\theta+\lambda+\eta)^{n+1}} + (1-\delta_{n+1-k,\,0})\sum_{r=0}^{\infty}\binom{n-k+r}{r}\frac{(\lambda+\eta-\gamma)^r}{(\theta+\lambda+\eta)^{2n+2-k+r}}\right].  (2.3.19)

Taking the inverse transform on both sides of (2.3.19), we get

m_0(t) = e^{-(\lambda+\eta)t}\sum_{n=0}^{\infty}\sum_{j=0}^{n}\sum_{k=0}^{j}\binom{n}{j}\binom{j}{k}(-1)^{j-k}\, a^k\, \lambda^{j-k}\,(1-p_0)^{j-k}\, b^{n+1-k}\,(\lambda+\eta)^{n-j}\, H(t-jT)\left[\delta_{n+1-k,\,0}\,\frac{(t-jT)^n}{n!} + (1-\delta_{n+1-k,\,0})\sum_{r=0}^{\infty}\binom{n-k+r}{r}(\lambda+\eta-\gamma)^r\,\frac{(t-jT)^{2n+1-k+r}}{(2n+1-k+r)!}\right].  (2.3.20)

Considering (2.3.17) and expanding by the binomial series in negative powers of (\theta+\lambda+\eta), we obtain

m_1(\theta) = \sum_{n=0}^{\infty} \frac{\big[b(\lambda+\eta) + \{a(\theta+\gamma) - b\lambda(1-p_0)\}\, e^{-(\theta+\lambda+\eta)T}\big]^n}{(\theta+\lambda+\eta)^{n+1}(\theta+\gamma)^{n}}
= \sum_{n=0}^{\infty}\sum_{j=0}^{n}\sum_{k=0}^{j}\binom{n}{j}\binom{j}{k}(-1)^{j-k}\, a^k\, \lambda^{j-k}\,(1-p_0)^{j-k}\, b^{n-k}\,(\lambda+\eta)^{n-j}\, e^{-j(\theta+\lambda+\eta)T}\left[\frac{\delta_{n-k,\,0}}{(\theta+\lambda+\eta)^{n+1}} + (1-\delta_{n-k,\,0})\sum_{r=0}^{\infty}\binom{n-k+r-1}{r}\frac{(\lambda+\eta-\gamma)^r}{(\theta+\lambda+\eta)^{2n+1-k+r}}\right].  (2.3.21)

Inverting (2.3.21), we obtain

m_1(t) = e^{-(\lambda+\eta)t}\sum_{n=0}^{\infty}\sum_{j=0}^{n}\sum_{k=0}^{j}\binom{n}{j}\binom{j}{k}(-1)^{j-k}\, a^k\, \lambda^{j-k}\,(1-p_0)^{j-k}\, b^{n-k}\,(\lambda+\eta)^{n-j}\, H(t-jT)\left[\delta_{n-k,\,0}\,\frac{(t-jT)^n}{n!} + (1-\delta_{n-k,\,0})\sum_{r=0}^{\infty}\binom{n-k+r-1}{r}(\lambda+\eta-\gamma)^r\,\frac{(t-jT)^{2n-k+r}}{(2n-k+r)!}\right].  (2.3.22)

Setting \gamma = 0 and \eta = 0 in (2.3.22), we get back the result of Parthasarathy [67]:

m_1(t) = e^{-\lambda t}\sum_{n=0}^{\infty} H(t-nT)\,\frac{\{\lambda h'(1)(t-nT)\}^n}{n!}.  (2.3.23)

Setting further T = 0, we get back the classical result for the Markov branching process:

m_1(t) = e^{\lambda(h'(1)-1)t}.  (2.3.24)
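As in model I, the convolution equations (2.3.11) and (2.3.14) can be solved numerically on a grid instead of evaluating the series (2.3.20) and (2.3.22). The sketch below uses illustrative (assumed) parameter values only and prints the effect of increasing the catastrophe rate \eta on the mean at t = 2.

```python
import math

def model_II_means(lam, gamma, eta, T, p0, h1, q1, horizon, dt=2e-3):
    """Grid solution of the model II convolution equations (2.3.11) and (2.3.14)."""
    a, b = lam * h1, gamma * q1
    n = int(round(horizon / dt))
    m0, m1 = [0.0] * (n + 1), [1.0] * (n + 1)
    for i in range(1, n + 1):
        acc1 = acc0 = 0.0
        for j in range(1, i + 1):
            u = j * dt
            w = math.exp(-(lam + eta) * u)
            k0 = (lam if u <= T else lam * p0) + eta    # kernel acting on m_0 in (2.3.14)
            k1 = a if u > T else 0.0                    # kernel acting on m_1 in (2.3.14)
            acc1 += w * (k0 * m0[i - j] + k1 * m1[i - j])
            acc0 += math.exp(-gamma * u) * m1[i - j]
        m1[i] = math.exp(-(lam + eta) * i * dt) + dt * acc1
        m0[i] = b * dt * acc0
    return m0, m1

# Effect of the catastrophe rate eta on the mean at t = 2 (assumed parameter values).
for eta in (0.0, 0.2, 0.4):
    m1_at_2 = model_II_means(0.3, 0.1, eta, 0.3, 0.4, 1.2, 2.3, 2.0)[1][-1]
    print(f"eta = {eta}: m_1(2) = {m1_at_2:.4f}")
```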

2.3.4 The point process of resurrection

Consider the stochastic process X(t). The maturity-dependent Markov branching process X(t) is subject to disasters, and the growth of the population is supplemented by immigration of particles from outside the population, which occurs whenever the population size is zero. Yamazato [2] has termed this type of immigration state-dependent immigration, since it has the property that once the population size reaches the state 0 it sojourns in that state for an exponentially distributed time, at the end of which the population instantaneously resurrects to some positive state according to some probability distribution. Pakes [66] has termed the above type of immigration resurrection, since the process regenerates from state 0 with a random number of immigrants. Li and Liu [56] have considered a modified Markov branching process incorporating both state-dependent immigration-migration and resurrection and obtained explicit expressions for the extinction probabilities and mean extinction times. Chen et al. [22] have obtained extinction probabilities for two interacting branching collision processes, where an ordinary Markov branching process strongly interacts with a collision branching process.

We now consider the stochastic point process Z(t) generated by the sequence of random time points at which resurrection takes place. We note that Z(t) is in fact a marked stochastic point process. We proceed to obtain the stationary first-order product density of the point process Z(t). We define

\xi_0 = \lim_{t \to \infty} P\{X(t) = 0 \mid X(0) = 0\}, \qquad \xi_1 = \lim_{t \to \infty} P\{X(t) = 0 \mid X(0) = 1\}.

Then, from (2.3.6), we get

\xi_0 = q(\xi_1).  (2.3.25)

Similarly, from (2.3.8), we get

\xi_1 = \frac{\lambda \xi_0 \big(1 - e^{-(\lambda+\eta)T}\big)}{\lambda+\eta} + \frac{\eta\, \xi_0}{\lambda+\eta} + \frac{\lambda\, h(\xi_1)}{\lambda+\eta}\, e^{-(\lambda+\eta)T}.  (2.3.26)

Suppose that

q(s) = \frac{s}{2-s}, \qquad h(s) = \frac{1}{2}(1 + s^2).

Then we obtain

\xi_1 = (1-c)\,\frac{\xi_1}{2-\xi_1} + \frac{c\,(1+\xi_1^2)}{2},  (2.3.27)

where

c = \frac{\lambda}{\lambda+\eta}\, e^{-(\lambda+\eta)T}.

Simplifying (2.3.27), we get

c\xi_1^3 - 2(c+1)\xi_1^2 + (3c+2)\xi_1 - 2c = 0.  (2.3.28)

Factorizing (2.3.28), we get

(\xi_1 - 1)\big[c\xi_1^2 - (c+2)\xi_1 + 2c\big] = 0.  (2.3.29)

Therefore the roots are

\xi_1 = 1, \qquad \xi_1 = \frac{c+2 \pm \sqrt{(c+2)^2 - 8c^2}}{2c}.

The smallest root lying in [0, 1) is

\xi_1 = \frac{c+2 - \sqrt{(c+2)^2 - 8c^2}}{2c}.

Hence the stationary first-order product density of the point process Z(t) is given by

\lim_{t \to \infty} h_1(t) = \xi_0 \gamma = \frac{\gamma}{2c}\big[c+2 - \sqrt{(c+2)^2 - 8c^2}\big].  (2.3.30)
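For the illustrative choices q(s) = s/(2-s) and h(s) = (1+s^2)/2, the constant c, the root in (2.3.29) lying in [0, 1) and the limiting product density as displayed in (2.3.30) are straightforward to evaluate numerically. The short Python sketch below does this for assumed parameter values.

```python
import math

def resurrection_product_density(lam, eta, T, gamma):
    """Evaluate c, the root of (2.3.29) lying in [0, 1), and the limit (2.3.30)
    for q(s) = s/(2 - s) and h(s) = (1 + s^2)/2."""
    c = lam * math.exp(-(lam + eta) * T) / (lam + eta)
    xi1 = (c + 2 - math.sqrt((c + 2) ** 2 - 8 * c ** 2)) / (2 * c)
    return c, xi1, gamma * xi1

# Assumed parameter values, for illustration only.
c, xi1, rate = resurrection_product_density(lam=0.3, eta=0.2, T=0.3, gamma=0.1)
print(f"c = {c:.4f}, smallest root = {xi1:.4f}, stationary product density = {rate:.4f}")
```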

2.4 Yamazato Model

Yamazato [2] has considered a continuous-time branching process which allows random immigration whenever the population size is zero. This model is a particular case of our model II when we remove the age-dependency and the disasters from model II. To be specific, we put T = 0 and \eta = 0 in (2.3.8) and get

G_1(s,t) = e^{-\lambda t} s + \lambda \int_0^t e^{-\lambda u}\Big[p_0 G_0(s, t-u) + \sum_{j=2}^{\infty} p_j (G_1(s, t-u))^j\Big]\, du,  (2.4.1)

where G_0(s,t) is given by (2.3.6). Differentiating (2.4.1) with respect to s and putting s = 1, we get

m_1(t) = e^{-\lambda t} + \lambda \int_0^t e^{-\lambda u}\big[p_0 m_0(t-u) + h'(1) m_1(t-u)\big]\, du.  (2.4.2)

We can solve (2.4.2) together with (2.3.11). Taking Laplace transforms on both sides of (2.4.2), we get

m_1(\theta) = \frac{1}{\theta+\lambda} + \frac{\lambda\big[p_0 m_0(\theta) + h'(1) m_1(\theta)\big]}{\theta+\lambda}.  (2.4.3)

Solving (2.3.15) and (2.4.3), we get

m_0(\theta) = \frac{b}{(\theta+\lambda-a)(\theta+\gamma) - \lambda p_0 b},  (2.4.4)

m_1(\theta) = \frac{\theta+\gamma}{(\theta+\lambda-a)(\theta+\gamma) - \lambda p_0 b},  (2.4.5)

where a = \lambda h'(1) and b = \gamma q'(1). These equations can also be obtained from (2.3.17) and (2.3.18) by putting T = 0 and \eta = 0. Using the binomial series, (2.4.4) gives

m_0(\theta) = \sum_{n=0}^{\infty}\sum_{r=0}^{\infty}\binom{n+r}{r}\lambda^n p_0^n\, b^{n+1}\,(\lambda-a-\gamma)^r\,\frac{1}{(\theta+\lambda-a)^{2n+r+2}}.  (2.4.6)

Inverting (2.4.6), we obtain

m_0(t) = e^{-(\lambda-a)t}\sum_{n=0}^{\infty}\sum_{r=0}^{\infty}\binom{n+r}{r}\lambda^n p_0^n\, b^{n+1}\,(\lambda-a-\gamma)^r\,\frac{t^{2n+r+1}}{(2n+r+1)!}.  (2.4.7)

Similarly, (2.4.5) gives

m_1(\theta) = \frac{1}{\theta+\lambda-a} + \sum_{n=1}^{\infty}\sum_{r=0}^{\infty}\binom{n+r-1}{r}\lambda^n p_0^n\, b^{n}\,(\lambda-a-\gamma)^r\,\frac{1}{(\theta+\lambda-a)^{2n+r+1}}.  (2.4.8)

Inverting (2.4.8), we obtain

m_1(t) = e^{-(\lambda-a)t} + e^{-(\lambda-a)t}\sum_{n=1}^{\infty}\sum_{r=0}^{\infty}\binom{n+r-1}{r}\lambda^n p_0^n\, b^{n}\,(\lambda-a-\gamma)^r\,\frac{t^{2n+r}}{(2n+r)!}.  (2.4.9)

We can also write (2.4.7) and (2.4.9) in the following form:

m_0(t) = \frac{2b\, e^{-(\lambda-a+\gamma)t/2}}{\sqrt{(\lambda-a-\gamma)^2 + 4\lambda p_0 b}}\,\sinh\!\Big(\frac{t}{2}\sqrt{(\lambda-a-\gamma)^2 + 4\lambda p_0 b}\Big),  (2.4.10)

m_1(t) = e^{-(\lambda-a+\gamma)t/2}\left[\cosh\!\Big(\frac{t}{2}\sqrt{(\lambda-a-\gamma)^2 + 4\lambda p_0 b}\Big) - \frac{\lambda-a-\gamma}{\sqrt{(\lambda-a-\gamma)^2 + 4\lambda p_0 b}}\,\sinh\!\Big(\frac{t}{2}\sqrt{(\lambda-a-\gamma)^2 + 4\lambda p_0 b}\Big)\right].  (2.4.11)

Resolving m_0(\theta) and m_1(\theta) into partial fractions, we can also obtain m_0(t) and m_1(t) in the following form:

m_0(t) = \frac{b\, e^{-(\lambda-a)t}}{\alpha-\beta}\big(e^{\alpha t} - e^{\beta t}\big),  (2.4.12)

m_1(t) = \frac{e^{-(\lambda-a)t}}{\alpha-\beta}\big(\alpha e^{\beta t} - \beta e^{\alpha t}\big),  (2.4.13)

where

\alpha = \frac{1}{2}\Big[\lambda-a-\gamma + \sqrt{(\lambda-a-\gamma)^2 + 4\lambda p_0 b}\Big],  (2.4.14)

\beta = \frac{1}{2}\Big[\lambda-a-\gamma - \sqrt{(\lambda-a-\gamma)^2 + 4\lambda p_0 b}\Big].  (2.4.15)

Suppose that X(0) = i and P_{ij}(t) = P[X(t) = j \mid X(0) = i]. Then we obtain

P_{i0}(t+\Delta t) = P_{i0}(t)\,[1 - \gamma \Delta t] + P_{i1}(t)\,\lambda p_0 \Delta t + o(\Delta t),  (2.4.16)

P_{ij}(t+\Delta t) = P_{ij}(t)\,[1 - j\lambda \Delta t] + \sum_{r=0}^{j} P_{i,j-r+1}(t)\,(j-r+1)\lambda p_r \Delta t + P_{i0}(t)\,\gamma q_j \Delta t + o(\Delta t), \quad j \ge 1.  (2.4.17)

Writing (2.4.16) and (2.4.17) in the form of differential equations, we get

P'_{i0}(t) = -\gamma P_{i0}(t) + P_{i1}(t)\,\lambda p_0,  (2.4.18)

P'_{ij}(t) = -j\lambda P_{ij}(t) + \sum_{r=0}^{j} P_{i,j-r+1}(t)\,(j-r+1)\lambda p_r + P_{i0}(t)\,\gamma q_j, \quad j \ge 1.  (2.4.19)

Let us define the generating function

G_i(s,t) = \sum_{j=0}^{\infty} P_{ij}(t)\, s^j.  (2.4.20)

Then, by using (2.4.18) and (2.4.19), we obtain

\frac{\partial G_i(s,t)}{\partial t} = \sum_{j=0}^{\infty} P'_{ij}(t)\, s^j
= -\gamma P_{i0}(t) + P_{i1}(t)\lambda p_0 + \sum_{j=1}^{\infty}\Big[-j\lambda P_{ij}(t) + \sum_{r=0}^{j} P_{i,j-r+1}(t)(j-r+1)\lambda p_r + P_{i0}(t)\gamma q_j\Big] s^j
= -\gamma P_{i0}(t) - \lambda s\,\frac{\partial}{\partial s}\sum_{j=0}^{\infty} P_{ij}(t)s^j + \lambda \sum_{j=0}^{\infty}\sum_{l=0}^{j}(l+1)P_{i,l+1}(t)\,p_{j-l}\, s^j + \gamma P_{i0}(t)\sum_{j=1}^{\infty} q_j s^j
= -\gamma P_{i0}(t) - \lambda s\,\frac{\partial G_i(s,t)}{\partial s} + \lambda h(s)\sum_{l=0}^{\infty}(l+1)P_{i,l+1}(t)\,s^l + \gamma P_{i0}(t)\, q(s)
= \lambda[h(s)-s]\,\frac{\partial G_i(s,t)}{\partial s} + \gamma P_{i0}(t)\,[q(s)-1].  (2.4.21)

Defining the mean m_i(t) = E[X(t) \mid X(0) = i], (2.4.21) yields

m'_i(t) = (a-\lambda)\, m_i(t) + b\, P_{i0}(t).  (2.4.22)

Equation (2.4.22) agrees with Yamazato [2]. Equation (2.4.22) is subject to the initial condition m_i(0) = i. Taking Laplace transforms on both sides of (2.4.22), we get

[\theta - (a-\lambda)]\, m_i(\theta) = i + b\, P_{i0}(\theta).

Therefore, we get

m_i(\theta) = \frac{i + b\, P_{i0}(\theta)}{\theta - (a-\lambda)}.  (2.4.23)

Putting i = 0 in (2.4.23) and equating with (2.4.4), we get

P_{00}(\theta) = \frac{\theta+\lambda-a}{(\theta+\lambda-a)(\theta+\gamma) - \lambda p_0 b}
= \sum_{n=0}^{\infty} \frac{\lambda^n p_0^n b^n}{(\theta+\lambda-a)^n(\theta+\gamma)^{n+1}}
= \sum_{n=0}^{\infty}\sum_{r=0}^{\infty}\binom{n+r}{r}\lambda^n p_0^n\, b^n\,(\lambda-a-\gamma)^r\,\frac{1}{(\theta+\lambda-a)^{2n+r+1}}.  (2.4.24)

Inverting (2.4.24), we get explicitly

P_{00}(t) = e^{-(\lambda-a)t}\sum_{n=0}^{\infty}\sum_{r=0}^{\infty}\binom{n+r}{r}\lambda^n p_0^n\, b^n\,(\lambda-a-\gamma)^r\,\frac{t^{2n+r}}{(2n+r)!}.  (2.4.25)

Resolving P_{00}(\theta) into partial fractions, we can also obtain

P_{00}(t) = \frac{e^{-(\lambda-a)t}}{\alpha-\beta}\big(\alpha e^{\alpha t} - \beta e^{\beta t}\big),  (2.4.26)

where \alpha and \beta are given by (2.4.14) and (2.4.15).
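The closed forms (2.4.12), (2.4.13) and (2.4.26) are convenient for computation. The following Python sketch evaluates them for assumed parameter values (it presumes a non-zero discriminant, so that \alpha \ne \beta).

```python
import math

def yamazato_means(lam, gamma, p0, h1, q1, t):
    """Closed forms (2.4.12), (2.4.13) and (2.4.26): m_0(t), m_1(t) and P_00(t).
    Assumes the discriminant is non-zero so that alpha != beta."""
    a, b = lam * h1, gamma * q1
    disc = math.sqrt((lam - a - gamma) ** 2 + 4 * lam * p0 * b)
    alpha = 0.5 * ((lam - a - gamma) + disc)
    beta = 0.5 * ((lam - a - gamma) - disc)
    decay = math.exp(-(lam - a) * t)
    m0 = b * decay * (math.exp(alpha * t) - math.exp(beta * t)) / (alpha - beta)
    m1 = decay * (alpha * math.exp(beta * t) - beta * math.exp(alpha * t)) / (alpha - beta)
    p00 = decay * (alpha * math.exp(alpha * t) - beta * math.exp(beta * t)) / (alpha - beta)
    return m0, m1, p00

# Assumed parameter values, for illustration only.
m0, m1, p00 = yamazato_means(lam=0.3, gamma=0.1, p0=0.4, h1=1.2, q1=2.3, t=2.0)
print(f"m_0(2) = {m0:.4f}, m_1(2) = {m1:.4f}, P_00(2) = {p00:.4f}")
```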

2.5 A Numerical Illustration

In this section, we present a numerical illustration to highlight the impact of maturity-dependent branching and state-dependent immigration under the influence of catastrophes on the growth of the population. We consider the models of Yamazato [2] and Parthasarathy [67] and compare the mean values of the present models with their mean values. To accomplish this, we first take the model of Yamazato [2] and our models, compute numerically the mean value m(t) as t varies from .1 to 2., and make a comparative picture. For this, we have chosen the parameter values as \lambda = .3, \gamma = ., \eta = ., T = .3, p_0 = .4, h'(1) = .2, q'(1) = 2.3.

In Table 2.1, we have presented the mean number of particles for the Yamazato model and for the two models of the present chapter. The mean value for model II decreases and that of the Yamazato model increases as t increases. This is as expected, since model II has the occurrence of disasters added to the Yamazato model. The rate of change of the mean value for the Yamazato model is very much higher in comparison with that of model II, as is expected, since model II is maturity-dependent. The rate of change of the mean value for model I is higher than that of Yamazato, since model I has independent immigration.

Next, to study the impact of catastrophes on the growth of the population, we fix the time as t = 2. and vary the catastrophe parameter \eta from . to .4, choosing the same values for all the other parameters as for Table 2.1; we obtain Table 2.2. In this table, we highlight the performance of model II in comparison with that of Yamazato [2]. We find in Table 2.2 that, as \eta increases from . to .4, equivalently, as the mean of the inter-occurrence interval of catastrophes decreases, the mean value at the time point t = 2 decreases, as expected. We compute m_Y(2) = .
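A Table 2.1-style comparison can be reproduced in outline by combining the numerical sketches given earlier in the chapter. The snippet below assumes that model_I_means, model_II_means and yamazato_means from those sketches are in scope, uses illustrative parameter values (assumptions, not the chapter's values) and tabulates the mean started from a single individual; it is not a reproduction of the thesis's exact figures.

```python
# Outline of a Table 2.1-style comparison (all parameter values are assumptions).
lam, gamma, eta, T, p0, h1, q1 = 0.3, 0.1, 0.1, 0.3, 0.4, 1.2, 2.3
dt, horizon = 2e-3, 2.0

mI = model_I_means(lam, gamma, T, p0, h1, q1, horizon, dt)[1]         # m_1 grid, model I
mII = model_II_means(lam, gamma, eta, T, p0, h1, q1, horizon, dt)[1]  # m_1 grid, model II

print(f"{'t':>4} {'m_Y(t)':>10} {'m^(I)(t)':>10} {'m^(II)(t)':>10}")
for k in range(1, 21):
    t = 0.1 * k
    i = int(round(t / dt))
    mY = yamazato_means(lam, gamma, p0, h1, q1, t)[1]                 # m_1, Yamazato model
    print(f"{t:4.1f} {mY:10.4f} {mI[i]:10.4f} {mII[i]:10.4f}")
```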

Next, we fix the catastrophe rate \eta = .6, vary the maturity T from . to .4, compute the mean value m(2) for model II, and compare it with that of the Yamazato [2] model at the time point t = 2 in Table 2.3. We find in Table 2.3 that, as the maturity level increases, the mean value decreases, as expected, and that the occurrence of catastrophes has decreased the mean value relative to the Yamazato model. This result is as expected.

Table 2.1: Comparison of m(t)
t   m_Y(t)   m^{(I)}(t)   m^{(II)}(t)

Table 2.2: Comparison of m(2) of model II with that of Yamazato [2]
\lambda = .3, T = .3, \gamma = ., p_0 = .4, h'(1) = .2, q'(1) = 2.3
\eta   m^{(II)}(2)

Table 2.3: Comparison of m(2) of model II with that of Yamazato [2] with the occurrence of catastrophes
\lambda = .3, \gamma = ., \eta = .6, p_0 = .4, h'(1) = .2, q'(1) = 2.3
T   m^{(II)}(2)

Table 2.4: Comparison of m(t) of model II with that of Yamazato [2] without the occurrence of catastrophes
\lambda = .3, \gamma = ., \eta = ., p_0 = .4, h'(1) = .2, q'(1) = 2.3
T   m^{(II)}(2)

We also study the effect of maturity alone on the population growth by taking \eta = 0 in m(t) of model II and obtaining the mean value at t = 2 for maturity levels ranging from . to .4. We present the mean values in Table 2.4. We find in Table 2.4 that, as the maturity level increases, the mean value decreases, as expected. We have computed m_Y(2) = .

To study the impact of immigration and catastrophes on the population growth, we next consider a numerical comparison of our models with that of Parthasarathy [67]. In Table 2.5, we tabulate the mean values of the present models I and II together with that of Parthasarathy [67] for t increasing up to 6. We observe that the rate of change in the mean value of model II is very much lower than that of Parthasarathy [67]. This is due to the effect of the occurrence of catastrophes. The rate of change of the mean value for model I is higher than that of Parthasarathy [67]. This is as expected, since model I includes immigration.

Next we proceed to compare further the effect of the maturity T and of immigration on the population mean of our models. We take the same parameter values as in Parthasarathy [67] and observe that the population mean value is higher in our models as compared with that of Parthasarathy [67] (see Table 2.6, Table 2.7 and Table 2.8). This is because the immigration takes care of the population growth; we also observe that the maturity slows down the growth in both models. Further, the growth is greater in the super-critical case.

Table 2.5: Comparison of m(t) with that of Parthasarathy [67]
\lambda = .3, T = .3, h'(1) = .2
t   m_P(t)   m^{(I)}(t)   m^{(II)}(t)

Table 2.6: Effect of the maturity time T on the mean m(t) in the sub-critical case
\lambda = 2., T = ., .25, .5, .75
h'(1)   T   m^{(P)}(2)   m^{(I)}(2)   m^{(II)}(2)

Table 2.7: Effect of the maturity time T on the mean m(t) in the critical case
\lambda = 2., T = ., .25, .5, .75
h'(1)   T   m^{(P)}(2)   m^{(I)}(2)   m^{(II)}(2)

Table 2.8: Effect of the maturity time T on the mean m(t) in the super-critical case
\lambda = 2., T = ., .25, .5, .75
h'(1)   T   m^{(P)}(2)   m^{(I)}(2)   m^{(II)}(2)

Finally, we depict the performance of our models as compared with the model of Yamazato [2] in Figure 2.1. In Figure 2.2, we highlight the performance of our models as compared with the model of Parthasarathy [67].

Figure 2.1: Performance of the models compared to the model of Yamazato [2]

Figure 2.2: Performance of the models compared to the model of Parthasarathy [67]

We observe that the mean values m^{(I)}(t) and m^{(II)}(t) of our models I and II are both much lower than that of Parthasarathy [67].


More information

1 Informal definition of a C-M-J process

1 Informal definition of a C-M-J process (Very rough) 1 notes on C-M-J processes Andreas E. Kyprianou, Department of Mathematical Sciences, University of Bath, Claverton Down, Bath, BA2 7AY. C-M-J processes are short for Crump-Mode-Jagers processes

More information

Almost sure asymptotics for the random binary search tree

Almost sure asymptotics for the random binary search tree AofA 10 DMTCS proc. AM, 2010, 565 576 Almost sure asymptotics for the rom binary search tree Matthew I. Roberts Laboratoire de Probabilités et Modèles Aléatoires, Université Paris VI Case courrier 188,

More information

MA 266 Review Topics - Exam # 2 (updated)

MA 266 Review Topics - Exam # 2 (updated) MA 66 Reiew Topics - Exam # updated Spring First Order Differential Equations Separable, st Order Linear, Homogeneous, Exact Second Order Linear Homogeneous with Equations Constant Coefficients The differential

More information

Short-time expansions for close-to-the-money options under a Lévy jump model with stochastic volatility

Short-time expansions for close-to-the-money options under a Lévy jump model with stochastic volatility Short-time expansions for close-to-the-money options under a Lévy jump model with stochastic volatility José Enrique Figueroa-López 1 1 Department of Statistics Purdue University Statistics, Jump Processes,

More information

An Introduction to Stochastic Modeling

An Introduction to Stochastic Modeling F An Introduction to Stochastic Modeling Fourth Edition Mark A. Pinsky Department of Mathematics Northwestern University Evanston, Illinois Samuel Karlin Department of Mathematics Stanford University Stanford,

More information

Lecture 10: Semi-Markov Type Processes

Lecture 10: Semi-Markov Type Processes Lecture 1: Semi-Markov Type Processes 1. Semi-Markov processes (SMP) 1.1 Definition of SMP 1.2 Transition probabilities for SMP 1.3 Hitting times and semi-markov renewal equations 2. Processes with semi-markov

More information

The Transition Probability Function P ij (t)

The Transition Probability Function P ij (t) The Transition Probability Function P ij (t) Consider a continuous time Markov chain {X(t), t 0}. We are interested in the probability that in t time units the process will be in state j, given that it

More information

AARMS Homework Exercises

AARMS Homework Exercises 1 For the gamma distribution, AARMS Homework Exercises (a) Show that the mgf is M(t) = (1 βt) α for t < 1/β (b) Use the mgf to find the mean and variance of the gamma distribution 2 A well-known inequality

More information

Putzer s Algorithm. Norman Lebovitz. September 8, 2016

Putzer s Algorithm. Norman Lebovitz. September 8, 2016 Putzer s Algorithm Norman Lebovitz September 8, 2016 1 Putzer s algorithm The differential equation dx = Ax, (1) dt where A is an n n matrix of constants, possesses the fundamental matrix solution exp(at),

More information

LIMIT THEOREMS FOR NON-CRITICAL BRANCHING PROCESSES WITH CONTINUOUS STATE SPACE. S. Kurbanov

LIMIT THEOREMS FOR NON-CRITICAL BRANCHING PROCESSES WITH CONTINUOUS STATE SPACE. S. Kurbanov Serdica Math. J. 34 (2008), 483 488 LIMIT THEOREMS FOR NON-CRITICAL BRANCHING PROCESSES WITH CONTINUOUS STATE SPACE S. Kurbanov Communicated by N. Yanev Abstract. In the paper a modification of the branching

More information

STA 624 Practice Exam 2 Applied Stochastic Processes Spring, 2008

STA 624 Practice Exam 2 Applied Stochastic Processes Spring, 2008 Name STA 624 Practice Exam 2 Applied Stochastic Processes Spring, 2008 There are five questions on this test. DO use calculators if you need them. And then a miracle occurs is not a valid answer. There

More information

Linear Differential Equations. Problems

Linear Differential Equations. Problems Chapter 1 Linear Differential Equations. Problems 1.1 Introduction 1.1.1 Show that the function ϕ : R R, given by the expression ϕ(t) = 2e 3t for all t R, is a solution of the Initial Value Problem x =

More information

{σ x >t}p x. (σ x >t)=e at.

{σ x >t}p x. (σ x >t)=e at. 3.11. EXERCISES 121 3.11 Exercises Exercise 3.1 Consider the Ornstein Uhlenbeck process in example 3.1.7(B). Show that the defined process is a Markov process which converges in distribution to an N(0,σ

More information

Let (Ω, F) be a measureable space. A filtration in discrete time is a sequence of. F s F t

Let (Ω, F) be a measureable space. A filtration in discrete time is a sequence of. F s F t 2.2 Filtrations Let (Ω, F) be a measureable space. A filtration in discrete time is a sequence of σ algebras {F t } such that F t F and F t F t+1 for all t = 0, 1,.... In continuous time, the second condition

More information

A. Bovier () Branching Brownian motion: extremal process and ergodic theorems

A. Bovier () Branching Brownian motion: extremal process and ergodic theorems Branching Brownian motion: extremal process and ergodic theorems Anton Bovier with Louis-Pierre Arguin and Nicola Kistler RCS&SM, Venezia, 06.05.2013 Plan 1 BBM 2 Maximum of BBM 3 The Lalley-Sellke conjecture

More information

Nahm s conjecture about modularity of q-series

Nahm s conjecture about modularity of q-series (joint work with S. Zwegers) Oberwolfach July, 0 Let r and F A,B,C (q) = q nt An+n T B+C n (Z 0 ) r (q) n... (q) nr, q < where A M r (Q) positive definite, symmetric n B Q n, C Q, (q) n = ( q k ) k= Let

More information

Multiserver Queueing Model subject to Single Exponential Vacation

Multiserver Queueing Model subject to Single Exponential Vacation Journal of Physics: Conference Series PAPER OPEN ACCESS Multiserver Queueing Model subject to Single Exponential Vacation To cite this article: K V Vijayashree B Janani 2018 J. Phys.: Conf. Ser. 1000 012129

More information

ON THE COMPLETE LIFE CAREER OF POPULATIONS IN ENVIRONMENTS WITH A FINITE CARRYING CAPACITY. P. Jagers

ON THE COMPLETE LIFE CAREER OF POPULATIONS IN ENVIRONMENTS WITH A FINITE CARRYING CAPACITY. P. Jagers Pliska Stud. Math. 24 (2015), 55 60 STUDIA MATHEMATICA ON THE COMPLETE LIFE CAREER OF POPULATIONS IN ENVIRONMENTS WITH A FINITE CARRYING CAPACITY P. Jagers If a general branching process evolves in a habitat

More information