Nonlinear Branching Processes with Immigration

Acta Mathematica Sinica, English Series, Aug. 2017, Vol. 33, No. 8. Published online: April 24, 2017. © Springer-Verlag Berlin Heidelberg & The Editorial Office of AMS 2017

Pei-Sen LI
School of Mathematical Sciences, Beijing Normal University, Beijing 100875, P. R. China
E-mail: peisenli@mail.bnu.edu.cn

Abstract  The nonlinear branching process with immigration is constructed as the pathwise unique solution of a stochastic integral equation driven by Poisson random measures. Some criteria for the regularity, recurrence, ergodicity and strong ergodicity of the process are then established.

Keywords  Nonlinear branching process, immigration, stochastic integral equation, regularity, recurrence, ergodicity, strong ergodicity

MR(2010) Subject Classification  60J27, 60J80

Received October 8, 2016, revised December 22, 2016, accepted January 17, 2017. Supported by NSFC Grants.

1  Introduction

Markov branching processes are models for the evolution of populations of particles. They constitute one of the most important subclasses of continuous-time Markov chains; standard references are [9] and [2]. The basic property of an ordinary linear branching process is that different particles act independently when giving birth or dying. In most realistic situations, however, this property is unlikely to be appropriate. In particular, when the number of particles becomes large or the particles move with high speed, the particles may interact and, as a result, the birth and death rates can either increase or decrease. These considerations have motivated the study of nonlinear branching processes. On the other hand, a branching process describes a population evolving randomly in an isolated environment. A useful and realistic modification of the model is the addition of new particles from outside sources. This consideration has stimulated the study of branching models with immigration and/or resurrection.

Let $\{r_i : i \ge 0\}$ be a sequence of nonnegative constants with $r_0 = 0$ and let $\{b_i : i \ge 0\}$ be a discrete probability distribution on $\mathbb{N} := \{0, 1, \ldots\}$ with $b_1 = 0$. A continuous-time Markov chain is called a nonlinear branching process if it has density matrix $R = (r_{ij})$ given by
$$
r_{ij} = \begin{cases} r_i b_{j-i+1}, & j \ge i+1,\ i \ge 1,\\ -r_i, & j = i \ge 1,\\ r_i b_0, & j = i-1,\ i \ge 1,\\ 0, & \text{otherwise}. \end{cases} \qquad (1.1)
$$
A typical special case is $r_i = \alpha i^\theta$ for $\alpha \ge 0$ and $\theta > 0$, which reduces to the ordinary linear branching process when $r_i = \alpha i$. Let $\gamma \ge 0$ and let $\{a_i : i \ge 0\}$ be another discrete probability distribution on $\mathbb{N}$ satisfying $a_0 = 0$. A continuous-time Markov chain is called a nonlinear branching process with resurrection if its density matrix is given by
$$
\rho_{ij} = \begin{cases} r_i b_{j-i+1}, & j \ge i+1,\ i \ge 1,\\ -r_i, & j = i \ge 1,\\ r_i b_0, & j = i-1,\ i \ge 1,\\ \gamma a_j, & j > i = 0,\\ -\gamma, & j = i = 0,\\ 0, & \text{otherwise}. \end{cases} \qquad (1.2)
$$
Here the resurrection means that at each time when the process gets extinct, some immigrants come into the population at rate $\gamma$ according to the distribution $\{a_i\}$. By a nonlinear branching process with immigration we mean a Markov chain with density matrix $Q = (q_{ij})$ given by
$$
q_{ij} = \begin{cases} r_i b_{j-i+1} + \gamma a_{j-i}, & j \ge i+1,\ i \ge 0,\\ -(r_i + \gamma), & j = i,\\ r_i b_0, & j = i-1,\ i \ge 1,\\ 0, & \text{otherwise}. \end{cases} \qquad (1.3)
$$
In this model, the immigrants come at rate $\gamma$ according to the distribution $\{a_i\}$, independently of the inner population. The purpose of this paper is to investigate the construction and basic properties of the nonlinear branching process with immigration defined by (1.3). Let
$$
m = \sum_{j=0}^\infty j a_j, \qquad M = \sum_{j=0}^\infty j b_j,
$$
which represent the immigration mean and the offspring mean of the process, respectively. Moreover, we introduce the functions
$$
F(s) = \sum_{i=0}^\infty a_i s^i, \quad A(s) = \gamma(1 - F(s)), \quad G(s) = \sum_{i=0}^\infty b_i s^i, \quad B(s) = G(s) - s, \qquad s \in [0, 1].
$$
Let $q_0$ be the smaller root of the equation $G(s) = s$ in $[0, 1]$. We sometimes denote $r_i$ by $r(i)$ for notational convenience. Suppose that $(\Omega, \mathscr{F}, \mathscr{F}_t, P)$ is a probability space satisfying the usual hypotheses. Denote $m(\{i\}) = b_i$ and $n(\{i\}) = a_i$ for each $i \in \mathbb{N}$. Let $\{p(t)\}$ and $\{q(t)\}$ be $\mathscr{F}_t$-Poisson point processes with characteristic measures $du\, m(dz)$ and $\gamma n(dz)$, respectively. We assume $\{p(t)\}$ and $\{q(t)\}$ are independent of each other. Let $p(ds, du, dz)$ and $q(ds, dz)$ be the Poisson random measures associated with $\{p(t)\}$ and $\{q(t)\}$, respectively. Given an $\mathbb{N}$-valued $\mathscr{F}_0$-measurable random variable $X_0$, let us consider the stochastic integral equation
$$
X_t = X_0 + \int_0^t\!\int_0^{r(X_{s-})}\!\int_{\mathbb{N}} (z - 1)\, p(ds, du, dz) + \int_0^t\!\int_{\mathbb{N}} z\, q(ds, dz). \qquad (1.4)
$$
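As a concrete, purely illustrative companion to the definitions above, the following Python sketch evaluates $m$, $M$ and the functions $F$, $G$, $A$, $B$ for one assumed pair of finite-support offspring and immigration distributions. The particular numbers, and the use of Python itself, are assumptions made only for illustration and are not part of the paper.

```python
import numpy as np

# Assumed example data (not from the paper): finite-support distributions
# with b_1 = 0 and a_0 = 0, as required in the introduction.
b = np.array([0.6, 0.0, 0.3, 0.1])   # offspring distribution {b_k}
a = np.array([0.0, 0.6, 0.3, 0.1])   # immigration distribution {a_k}
gamma = 1.0                          # immigration rate

m = sum(k * a[k] for k in range(len(a)))   # immigration mean m
M = sum(k * b[k] for k in range(len(b)))   # offspring mean M (here M < 1)

def F(s): return sum(a[k] * s**k for k in range(len(a)))
def G(s): return sum(b[k] * s**k for k in range(len(b)))
def A(s): return gamma * (1.0 - F(s))      # A(s) = gamma (1 - F(s))
def B(s): return G(s) - s                  # B(s) = G(s) - s

print(m, M, A(0.5), B(0.5))
```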

Let $\zeta = \lim_{k\to\infty} \tau_k$, where $\tau_k = \inf\{t \ge 0 : X_t \ge k\}$. The above equation only makes sense for $t < \zeta$. We call $\zeta$ the explosion time of $\{X_t\}$ and make the convention $X_t = \infty$ for $t \ge \zeta$. We say the solution is non-explosive if $\zeta = \infty$. As a special case of (1.4) we also consider the equation
$$
X_t = X_0 + \int_0^t\!\int_0^{r(X_{s-})}\!\int_{\mathbb{N}} (z - 1)\, p(ds, du, dz). \qquad (1.5)
$$
We now state the main results of the paper.

Theorem 1.1  There exists a pathwise unique solution to (1.4). Moreover, if the solution to (1.5) is non-explosive, then so is the solution to (1.4).

Theorem 1.2  Let $\{X_t\}$ be the solution to (1.4) and let $Q_{ij}(t) = P(X_t = j \mid X_0 = i)$. Then $(Q_{ij}(t))$ solves the Kolmogorov forward equation of $Q$.

Theorem 1.3  The solution to (1.4) is the minimal process of $Q$ and the solution to (1.5) is the minimal process of $R$.

Theorem 1.4  The density matrix $R$ is regular if and only if $Q$ is regular.

Theorem 1.5  (1) If $M \le 1$, then $Q$ is regular. (2) Suppose that $\sum_{i=1}^\infty r_i^{-1} < \infty$. Then $Q$ is regular if and only if $M \le 1$. (3) Suppose that $1 < M \le \infty$ and $r_i = \alpha i^\theta$ for $\alpha > 0$ and $\theta > 0$. Then $Q$ is regular if and only if for some $\varepsilon \in (q_0, 1)$ we have
$$
\int_\varepsilon^1 \frac{1}{|B(s)|}\Big(\ln\frac{1}{s}\Big)^{\theta-1} ds = \infty.
$$

In the following three theorems, we assume $\gamma r_i b_0 > 0$ for every $i \ge 1$, so the matrix $Q$ is irreducible.

Theorem 1.6  (1) Suppose that $m < \infty$, $M < 1$ and $\lim_{i\to\infty} r_i = \infty$. Then the nonlinear branching process with immigration is recurrent. (2) Suppose that $r_i$ is increasing and there exist constants $\alpha > 0$ and $N > 0$ such that $r_i/i \ge \alpha$ holds for each $i > N$. Then the nonlinear branching process with immigration is recurrent if $M \le 1$ and
$$
J := \int_0^1 \frac{1}{\alpha B(y)}\exp\Big[-\int_0^y \frac{A(x)}{\alpha B(x)}\,dx\Big] dy = \infty.
$$
(3) Suppose that $M > 1$. Then the nonlinear branching process with immigration is transient. (4) Suppose that $r_i$ is increasing and there exist constants $\alpha > 0$ and $N > 0$ such that $r_i/i \le \alpha$ holds for each $i > N$. Then the nonlinear branching process with immigration is transient if $M \le 1$ and
$$
J := \int_0^1 \frac{1}{\alpha B(y)}\exp\Big[-\int_0^y \frac{A(x)}{\alpha B(x)}\,dx\Big] dy < \infty.
$$

Theorem 1.7  (1) If $m < \infty$, $M \le 1$, $r_i$ is increasing and $\sum_{i=1}^\infty r_i^{-1} < \infty$, then the nonlinear branching process with immigration is ergodic. (2) Suppose that $r_i = \alpha i^\theta$ for $\alpha > 0$ and $\theta \ge 1$. Then the recurrent nonlinear branching process with immigration is ergodic if and only if
$$
\int_0^1 \frac{A(s)}{\alpha B(s)}\Big(\ln\frac{1}{s}\Big)^{\theta-1} ds < \infty. \qquad (1.6)
$$
(3) If $m < \infty$, $M < 1$ and $\liminf_{i\to\infty} r_i/i > 0$, then the nonlinear branching process with immigration is exponentially ergodic.

Theorem 1.8  (1) If $m < \infty$, $M < 1$, $r_i$ is increasing and $\sum_{i=1}^\infty r_i^{-1} < \infty$, then the process is strongly ergodic. (2) Suppose that $r_i = \alpha i^\theta$ for $\alpha > 0$ and $\theta > 1$. Then the nonlinear branching process with immigration is strongly ergodic if and only if
$$
\int_0^1 \frac{1}{\alpha B(s)}\Big(\ln\frac{1}{s}\Big)^{\theta-1} ds < \infty. \qquad (1.7)
$$
(3) If $\sum_{i=1}^\infty r_i^{-1} = \infty$, then the nonlinear branching process with immigration is not strongly ergodic.

The nonlinear branching process with resurrection defined above was introduced in [8], where the problems of uniqueness, recurrence and ergodicity of the process were studied. The model has attracted the attention of a number of authors. In particular, [16] gave criteria for strong ergodicity of the process; [4] and [14] established some criteria for regularity and uniqueness; [3] studied some interesting differential-integral equations associated with a special class of nonlinear branching processes and gave some characterizations of their mean extinction times; [5] established a Harris regularity criterion for such processes. The existence and uniqueness of linear branching processes with instantaneous resurrection were studied in [6]. However, most studies of models with immigration have focused on linear branching structures. The branching process with immigration was studied in [11], which gave a characterization of the one-dimensional marginal distributions of the process starting from zero. An ergodicity criterion for the process was given in [12]; some recurrence criteria for linear branching processes with immigration and resurrection were established in [15].

The first three theorems above give constructions of nonlinear branching processes with and without immigration. These provide convenient formulations of the processes. In particular, the result of Theorem 1.4 is derived as an immediate consequence of (1.4) and (1.5). We hope the equations can also be useful in other similar situations. The proof of Theorem 1.5 is based on Theorem 1.4 and the results of [8] and [5]. The study of recurrence for the immigration model is more delicate, since the problem cannot be reduced to the extinction problem of the original nonlinear branching process as in the case of a resurrection model. Theorem 1.6 is proved by using the theory of minimal nonnegative solutions developed in [7] and by comparing the process with the linear branching processes with immigration studied in [12]. The proofs of the ergodicity results in Theorems 1.7 and 1.8 are based on comparisons of the process with suitably designed birth-death processes and on estimates of the mean extinction time.
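The density matrix (1.3) is the central object of the theorems above. The following sketch, an illustration only and not part of the paper, assembles a truncated version of $Q$ on the states $\{0,\ldots,n\}$ for the special rate $r_i=\alpha i^\theta$ and the finite-support example distributions used in the previous sketch; rows near the truncation level are only approximate because jumps out of the window are dropped.

```python
import numpy as np

def q_matrix(r, b, a, gamma, n):
    """Truncated density matrix Q = (q_ij) of (1.3) on {0, ..., n}.

    r    : callable giving the branching rate i -> r_i
    b, a : offspring and immigration distributions (b_1 = 0, a_0 = 0)
    Jumps leaving {0, ..., n} are simply discarded, so the last few rows
    do not sum to zero and are only an approximation.
    """
    Q = np.zeros((n + 1, n + 1))
    for i in range(n + 1):
        if i >= 1:
            Q[i, i - 1] = r(i) * b[0]                      # death: j = i - 1
        for k in range(2, len(b)):                         # branching: i -> i - 1 + k
            if i - 1 + k <= n:
                Q[i, i - 1 + k] += r(i) * b[k]
        for k in range(1, len(a)):                         # immigration: i -> i + k
            if i + k <= n:
                Q[i, i + k] += gamma * a[k]
        Q[i, i] = -(r(i) + gamma)                          # diagonal entry
    return Q

alpha, theta, gamma = 1.0, 1.5, 1.0
b = np.array([0.6, 0.0, 0.3, 0.1])
a = np.array([0.0, 0.6, 0.3, 0.1])
Q = q_matrix(lambda i: alpha * i**theta, b, a, gamma, n=50)
```

On such a truncation one can check directly, for instance, that every row far from the truncation level sums to zero, as required of a density matrix.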

2  Stochastic Integral Equations

Stochastic integral equations with jumps have been playing increasingly important roles in the study of Markov processes. In this section, we give a construction of the solution to (1.4) and prove that the solution is a minimal nonlinear branching process with immigration. This result is then used to study the regularity of the density matrix $Q$. We refer to [10] for the general theory of stochastic equations with jumps.

Proposition 2.1  The pathwise uniqueness of solutions holds for the equation (1.4).

Proof  Let $\{X_t\}$ and $\{X'_t\}$ be any two solutions of equation (1.4) with $X_0 = X'_0$. By passing to the conditional probability $P(\cdot\mid\mathscr F_0)$, we may and do assume $X_0 = X'_0$ is deterministic. Let $\tau_m = \inf\{t \ge 0 : X_t \ge m\}$, $\tau'_m = \inf\{t \ge 0 : X'_t \ge m\}$ and $\sigma_m = \tau_m \wedge \tau'_m$. It suffices to show that $\tau_m = \tau'_m = \sigma_m$ and $X_t = X'_t$ for all $t \le \sigma_m$, $m = 1, 2, \ldots$. We have
$$
X_{t\wedge\sigma_m} - X'_{t\wedge\sigma_m} = \int_0^{t\wedge\sigma_m}\!\int_0^\infty\!\int_{\mathbb N_{m+1}} (z-1)\big[\mathbf 1_{\{0<u\le r(X_{s-})\}} - \mathbf 1_{\{0<u\le r(X'_{s-})\}}\big]\,p(ds,du,dz),
$$
where $\mathbb N_m = \{0, 1, 2, \ldots, m\}$. Taking the expectation, we get
$$
E\big[|X_{t\wedge\sigma_m} - X'_{t\wedge\sigma_m}|\big] \le E\Big\{\int_0^{t\wedge\sigma_m}\!\int_0^\infty\!\int_{\mathbb N_{m+1}}(z+1)\big|\mathbf 1_{\{0<u\le r(X_{s-})\}} - \mathbf 1_{\{0<u\le r(X'_{s-})\}}\big|\,p(ds,du,dz)\Big\}
$$
$$
= E\Big\{\int_0^{t\wedge\sigma_m}\!\int_0^\infty\!\int_{\mathbb N_{m+1}}(z+1)\big|\mathbf 1_{\{0<u\le r(X_{s-})\}} - \mathbf 1_{\{0<u\le r(X'_{s-})\}}\big|\,ds\,du\,m(dz)\Big\}
$$
$$
\le (M_{m+1}+1)\,E\Big\{\int_0^{t\wedge\sigma_m}|r(X_{s-}) - r(X'_{s-})|\,ds\Big\} \le (M_{m+1}+1)\int_0^t E\big[|r(X_{s\wedge\sigma_m-}) - r(X'_{s\wedge\sigma_m-})|\big]\,ds,
$$
where $M_{m+1} := \int_{\mathbb N_{m+1}} z\,m(dz)$. By taking $m \ge X_0$, we have $X_{s-}, X'_{s-} \le m$ for $0 < s \le \sigma_m$. Denote $d_m = \sup\{|r(i) - r(j)|/|i-j| : i \ne j,\ 0 \le i, j \le m\}$. Then we have
$$
E\big[|X_{t\wedge\sigma_m} - X'_{t\wedge\sigma_m}|\big] \le (M_{m+1}+1)\,d_m\int_0^t E\big[|X_{s\wedge\sigma_m-} - X'_{s\wedge\sigma_m-}|\big]\,ds. \qquad (2.1)
$$
Since $s\mapsto X_{s\wedge\sigma_m}$ and $s\mapsto X'_{s\wedge\sigma_m}$ have only countably many discontinuity points, we can also use $X_{s\wedge\sigma_m}$ and $X'_{s\wedge\sigma_m}$ instead of $X_{s\wedge\sigma_m-}$ and $X'_{s\wedge\sigma_m-}$ on the right-hand side of (2.1). Using Gronwall's inequality we get $E[|X_{t\wedge\sigma_m} - X'_{t\wedge\sigma_m}|] = 0$. Thus we conclude that $X_t = X'_t$ for all $t \in [0, \sigma_m)$ a.s. This clearly implies $\tau_m = \tau'_m = \sigma_m$ a.s., and the pathwise uniqueness of solutions of (1.4) is proven.

Theorem 2.2  For any $\mathbb N$-valued $\mathscr F_0$-measurable random variable $X_0$, there is a pathwise unique solution to (1.5).

Proof  Without loss of generality, we assume $X_0$ is deterministic. Let $D_1 = \{s > 0 : p(s) \in (0, r(X_0)]\times\mathbb N\}$. Since
$$
E\big[p\big((0,t]\times(0, r(X_0)]\times\mathbb N\big)\big] = \int_0^t ds\int_0^{r(X_0)} du\int_{\mathbb N} m(dz) = t\,r(X_0) < \infty,
$$
the set $D_1$ is discrete in $(0, \infty)$. Let $\sigma_1$ be the minimal element of $D_1$ and $p(\sigma_1) = (u_1, z_1)$. Then set
$$
X_t = X_0,\quad t \in [0, \sigma_1), \qquad X_{\sigma_1} = X_0 + z_1 - 1.
$$
The process $\{X_t : 0 \le t \le \sigma_1\}$ is clearly the solution of (1.5) on $[0, \sigma_1]$. Set $D_2 = \{s > 0 : p(s+\sigma_1) \in (0, r(X_{\sigma_1})]\times\mathbb N\}$, let $\sigma_2$ be the minimal element of $D_2$ and $p(\sigma_1+\sigma_2) = (u_2, z_2)$. Define $\{X_t : \sigma_1 < t \le \sigma_1+\sigma_2\}$ by
$$
X_t = X_{\sigma_1},\quad t \in (\sigma_1, \sigma_1+\sigma_2), \qquad X_{\sigma_1+\sigma_2} = X_{\sigma_1} + z_2 - 1.
$$
It is easy to see that $\{X_t : 0 \le t \le \sigma_1+\sigma_2\}$ is the unique solution of (1.5) on $[0, \sigma_1+\sigma_2]$. Continuing this procedure successively, we get a process $\{X_t : 0 \le t < \tau\}$, where $\tau = \sum_{i=1}^\infty \sigma_i$. Next, we show $\tau = \zeta := \lim_{k\to\infty}\tau_k$, where $\tau_k = \inf\{t \ge 0 : X_t \ge k\}$. Clearly, for each $n$ we have $X_t < \infty$ for $t \in [0, \sum_{i=0}^n\sigma_i]$. Then $\sum_{i=0}^n\sigma_i < \zeta$ holds for each $n$, and so $\tau \le \zeta$. On the other hand, since
$$
E\Big[\int_0^{t\wedge\tau_m}\!\int_0^{r(X_{s-})}\!\int_{\mathbb N} p(ds, du, dz)\Big] \le t\max_{0\le k\le m} r(k) < \infty,
$$
the process $\{X_t\}$ has only finitely many jumps before $t\wedge\tau_m$; therefore $t\wedge\tau_m \le \tau$. Since $t \ge 0$ and $m \ge 1$ can be arbitrary, we get $\zeta \le \tau$. Then $\tau = \zeta$. Hence $X_t$ is determined on the time interval $[0, \zeta)$; the uniqueness is clear from Proposition 2.1.

Proof of Theorem 1.1  Let $\{X^0_t\}$ denote the solution to (1.5). Let $\{v_k : k = 1, 2, \ldots\}$ be the set of jump times of the Poisson process $t\mapsto q((0,t]\times\mathbb N)$. Clearly $v_k \to \infty$ as $k \to \infty$. For $t < v_1$ set $X_t = X^0_t$. Suppose that $X_t$ has been defined for $t < v_k$ and let
$$
\xi = X_{v_k-} + \int_{\{v_k\}}\!\int_{\mathbb N} z\,q(ds, dz).
$$
Here and in the sequel we make the convention $\infty + a = \infty$ for any $a$. By the assumption there is also a solution $\{X^k_t\}$ to
$$
X_t = \xi + \int_0^t\!\int_0^{r(X_{s-})}\!\int_{\mathbb N}(z-1)\,p(v_k + ds, du, dz).
$$
Let $\eta_k$ be the explosion time of $\{X^k_t\}$. If $v_k + \eta_k > v_{k+1}$, we define $X_t = X^k_{t-v_k}$ for $v_k \le t < v_{k+1}$. If $v_k + \eta_k \le v_{k+1}$, we set $X_t = X^k_{t-v_k}$ for $v_k \le t < v_k+\eta_k$ and $X_t = \infty$ for $v_k+\eta_k \le t < v_{k+1}$. By induction this defines a process $\{X_t\}$, which is clearly the pathwise unique solution to (1.4). Obviously, if the solution of (1.5) is non-explosive for each deterministic initial state $X_0 = i$, we have $\eta_k = \infty$ for all $k$, and so $\{X_t\}$ is non-explosive.

Proof of Theorem 1.2  Let $\tilde p(ds, du, dz) = p(ds, du, dz) - ds\,du\,m(dz)$ and $\tilde q(ds, dz) = q(ds, dz) - \gamma\,ds\,n(dz)$. For any bounded function $f$ on $\mathbb N$, we have
$$
f(X_{t\wedge\tau_m}) = f(X_0) + \int_0^{t\wedge\tau_m}\!\int_0^{r(X_{s-})}\!\int_{\mathbb N}[f(X_{s-}+z-1) - f(X_{s-})]\,p(ds,du,dz) + \int_0^{t\wedge\tau_m}\!\int_{\mathbb N}[f(X_{s-}+z) - f(X_{s-})]\,q(ds,dz)
$$
$$
= f(X_0) + \int_0^{t\wedge\tau_m}\!\int_0^{r(X_{s-})}\!\int_{\mathbb N}[f(X_{s-}+z-1) - f(X_{s-})]\,ds\,du\,m(dz) + \int_0^{t\wedge\tau_m}\!\int_{\mathbb N}[f(X_{s-}+z) - f(X_{s-})]\,\gamma\,ds\,n(dz) + M_t(f), \qquad (2.2)
$$
where
$$
M_t(f) := \int_0^{t\wedge\tau_m}\!\int_0^{r(X_{s-})}\!\int_{\mathbb N}[f(X_{s-}+z-1) - f(X_{s-})]\,\tilde p(ds,du,dz) + \int_0^{t\wedge\tau_m}\!\int_{\mathbb N}[f(X_{s-}+z) - f(X_{s-})]\,\tilde q(ds,dz)
$$
is a martingale. Since $X_{s-} \ne X_s$ for at most countably many $s \ge 0$, we can also use $X_s$ instead of $X_{s-}$ on the right-hand side of (2.2). In particular, for $f = \mathbf 1_{\{j\}}$ we have
$$
\mathbf 1_{\{X_{t\wedge\tau_m}=j\}} = \mathbf 1_{\{X_0=j\}} + \int_0^{t\wedge\tau_m}\sum_{k=0}^\infty b_k\,r(X_s)\big[\mathbf 1_{\{X_s+k-1=j\}} - \mathbf 1_{\{X_s=j\}}\big]\,ds + \int_0^{t\wedge\tau_m}\sum_{k=0}^\infty \gamma a_k\big[\mathbf 1_{\{X_s+k=j\}} - \mathbf 1_{\{X_s=j\}}\big]\,ds + M_t(\mathbf 1_{\{j\}}).
$$
Write $E_i = E[\cdot\mid X_0 = i]$ for $i \in \mathbb N$. Taking the expectation on both sides of the above equation and letting $m \to \infty$, we get
$$
E_i\big[\mathbf 1_{\{X_{t\wedge\zeta}=j\}}\big] = E_i\big[\mathbf 1_{\{X_0=j\}}\big] + \sum_{k=0}^\infty b_k\,E_i\!\int_0^{t\wedge\zeta}\! r(X_s)\big[\mathbf 1_{\{X_s+k-1=j\}} - \mathbf 1_{\{X_s=j\}}\big]\,ds + \sum_{k=0}^\infty \gamma a_k\,E_i\!\int_0^{t\wedge\zeta}\!\big[\mathbf 1_{\{X_s+k=j\}} - \mathbf 1_{\{X_s=j\}}\big]\,ds.
$$
Obviously, here we can remove the truncation by $\zeta$ and obtain
$$
Q_{ij}(t) = \delta_{ij} + \sum_{k=0}^\infty b_k\int_0^t\big[r_{j-k+1}Q_{i,j-k+1}(s) - r_jQ_{ij}(s)\big]\,ds + \sum_{k=1}^\infty \gamma a_k\int_0^t\big[Q_{i,j-k}(s) - Q_{ij}(s)\big]\,ds
$$
$$
= \delta_{ij} + \int_0^t\Big[\sum_{k=0}^{j+1}Q_{ik}(s)r_kb_{j-k+1} - Q_{ij}(s)r_j\Big]\,ds + \int_0^t\Big[\sum_{k=0}^{j-1}Q_{ik}(s)\gamma a_{j-k} - \gamma Q_{ij}(s)\Big]\,ds.
$$
Differentiating both sides we get
$$
Q'_{ij}(t) = \sum_{k=0}^{j+1}Q_{ik}(t)r_kb_{j-k+1} - Q_{ij}(t)r_j + \sum_{k=0}^{j-1}Q_{ik}(t)\gamma a_{j-k} - \gamma Q_{ij}(t) = \sum_{k=0}^\infty Q_{ik}(t)q_{kj}.
$$
This is just the Kolmogorov forward equation of $Q$.

Proof of Theorem 1.3  By Theorem 1.1, the solution $\{X_t\}$ to (1.4) is a time-homogeneous Markov process with state space $\overline{\mathbb N} := \{0, 1, 2, \ldots, \infty\}$. Suppose that $\sigma_1$ and $z_1$ are given as in the proof of Theorem 2.2. Let $q(v_1) = y_1$. By the properties of Poisson point processes, we see that
$$
P(\sigma_1 > t) = e^{-r(X_0)t},\quad P(z_1 = i) = m(\{i\}) = b_i,\quad P(v_1 > t) = e^{-\gamma t},\quad P(y_1 = i) = n(\{i\}) = a_i,
$$
and $\sigma_1, z_1, v_1, y_1$ are mutually independent. Write $P_i = P(\cdot\mid X_0 = i)$ for $i \in \mathbb N$. Let $\xi_t$ denote the number of jumps of $\{X_s\}$ on $(0, t]$. Obviously, we have $P_i[X_t = j, \xi_t = 0] = e^{-(r_i+\gamma)t}\delta_{ij}$. By the Markov property of $\{X_t\}$,
$$
P_i[X_t = j, \xi_t = m+1] = P_i\big\{\mathbf 1_{\{\sigma_1\wedge v_1 < t\}}P_{X_{\sigma_1\wedge v_1}}[X_{t-(\sigma_1\wedge v_1)} = j,\ \xi_{t-(\sigma_1\wedge v_1)} = m]\big\}
$$
$$
= P_i\big\{\mathbf 1_{\{\sigma_1<t\}}\mathbf 1_{\{\sigma_1\le v_1\}}P_{X_{\sigma_1}}[X_{t-\sigma_1}=j,\ \xi_{t-\sigma_1}=m]\big\} + P_i\big\{\mathbf 1_{\{v_1<t\}}\mathbf 1_{\{v_1<\sigma_1\}}P_{X_{v_1}}[X_{t-v_1}=j,\ \xi_{t-v_1}=m]\big\}
$$
$$
= \int_0^t r_i\,e^{-(r_i+\gamma)(t-s)}\sum_{k\ge i-1}P(z_1 = k-i+1)\,P_k[X_s=j,\ \xi_s=m]\,ds + \int_0^t \gamma\,e^{-(r_i+\gamma)(t-s)}\sum_{k\ge i+1}P(y_1 = k-i)\,P_k[X_s=j,\ \xi_s=m]\,ds
$$
$$
= \int_0^t e^{-(r_i+\gamma)(t-s)}\sum_{k\ne i}q_{ik}\,P_k[X_s = j,\ \xi_s = m]\,ds.
$$
Notice that $P_i[X_t = j] = \sum_{m=0}^\infty P_i[X_t = j,\ \xi_t = m]$. From the theory of Markov chains we know that $P_{ij}(t) := P_i[X_t = j]$ is then the minimal solution of the Kolmogorov equations associated with the density matrix $Q$; see Chen [7, p. 78]. Hence $\{X_t\}$ is the minimal process of $Q$. The same argument applied to (1.5) shows that its solution is the minimal process of $R$.

Proof of Theorem 1.4  Suppose that $R$ is regular. Then the minimal solution of its Kolmogorov backward equation is honest, i.e., the minimal process of $R$ is non-explosive. Applying Theorems 1.1 and 1.3, we know the minimal process of $Q$ is non-explosive. Thus $Q$ is regular. Conversely, suppose that $R$ is not regular. Then by [1, Theorem 2.7(3)], there exists a non-trivial solution $(u_i)$ to
$$
u_i \le \sum_{k\ne i}\frac{r_{ik}}{2\gamma + r_i}\,u_k,\qquad 0 \le u_i \le 1.
$$
Since $r_{ik} \le q_{ik}$ for $k \ne i$ and $\gamma + q_i = 2\gamma + r_i$, we see $(u_i)$ is also a solution to
$$
u_i \le \sum_{k\ne i}\frac{q_{ik}}{\gamma + q_i}\,u_k,\qquad 0 \le u_i \le 1.
$$
Using [1, Theorem 2.7(3)] again, we see $Q$ is not regular.
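The proof of Theorem 2.2 builds the solution of (1.5) jump by jump, and Theorem 1.3 identifies the solution of (1.4) with the minimal $Q$-process. The sketch below, an illustration under assumed parameters and not the paper's construction, simulates that minimal process by drawing exponential holding times and jump sizes directly from the rates in (1.3); this has the same law as the successive-jump construction but does not literally discretize the Poisson random measures. If the chosen parameters violate the regularity criteria of Theorem 1.5, the loop may need a very large number of jumps to reach `t_max`.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate(x0, r, b, a, gamma, t_max):
    """One path of the minimal process with density matrix (1.3) up to t_max.

    At state i the total jump rate is r(i) + gamma.  With probability
    r(i)/(r(i) + gamma) a branching event i -> i - 1 + k occurs, k ~ {b_k};
    otherwise an immigration event i -> i + k occurs, k ~ {a_k}.
    """
    t, x, path = 0.0, x0, [(0.0, x0)]
    while True:
        rate = r(x) + gamma
        t += rng.exponential(1.0 / rate)        # holding time at state x
        if t >= t_max:
            return path
        if rng.random() < r(x) / rate:          # branching jump
            x += rng.choice(len(b), p=b) - 1
        else:                                   # immigration jump
            x += rng.choice(len(a), p=a)
        path.append((t, x))

alpha, theta, gamma = 1.0, 1.5, 1.0
b = np.array([0.6, 0.0, 0.3, 0.1])
a = np.array([0.0, 0.6, 0.3, 0.1])
path = simulate(5, lambda i: alpha * i**theta, b, a, gamma, t_max=10.0)
```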

Proof of Theorem 1.5  By Theorem 1.4, we derive the results from Theorem 1.2 of [8] and Theorem 2.3 of [5].

3  Recurrence

Proof of Theorem 1.6  (1) Under the assumption, there exists a constant $N \ge 1$ such that $r_i \ge \gamma m/(1-M)$ holds for each $i \ge N$. Take $x_i = i$ for $i \ge 0$. For $i \ge N$, we have
$$
\sum_{j\ne i}q_{ij}x_j = r_ib_0(i-1) + \sum_{j\ge i+1}\big[r_ib_{j-i+1} + \gamma a_{j-i}\big]j = (r_i+\gamma)i + r_i(M-1) + \gamma m \le (r_i+\gamma)i = -q_{ii}x_i.
$$
Let $(\pi_{ij})$ be the embedded chain of $(q_{ij})$. The above calculation implies that $(x_i)$ is a finite solution of
$$
\sum_{j=0}^\infty \pi_{ij}x_j \le x_i,\qquad i \ge N.
$$
Then $Q$ is recurrent by Theorem 4.24 in [7].

(2) Suppose that $M \le 1$ and $J = \infty$. We shall prove the process is recurrent by a comparison argument. Let $Q^\circ = (q^\circ_{ij})$ be the density matrix defined by
$$
q^\circ_{ij} = \begin{cases} \alpha i\,b_{j-i+1} + \gamma a_{j-i}, & j \ge i+1,\\ -(\alpha i + \gamma), & j = i,\\ \alpha i\,b_0, & j = i-1,\\ 0, & \text{otherwise}, \end{cases}
$$
which corresponds to a linear branching process with immigration. It was proved in [12] that this process is recurrent. Next, we define the density matrix $Q^* = (q^*_{ij})$ by
$$
q^*_{ij} = \begin{cases} r_ib_{j-i+1} + \gamma a_{j-i}\,r_i/(\alpha i), & j \ge i+1,\ i > N,\\ -\big(r_i + \gamma r_i/(\alpha i)\big), & j = i,\ i > N,\\ r_ib_0, & j = i-1,\ i > N,\\ q^\circ_{ij}, & i \le N,\\ 0, & \text{otherwise}. \end{cases}
$$
Let $(\pi^\circ_{ij})$ and $(\pi^*_{ij})$ denote the embedded chains of $(q^\circ_{ij})$ and $(q^*_{ij})$, respectively. It is easy to see that $\pi^\circ_{ij} = \pi^*_{ij}$ for all $i, j \in \mathbb N$. Then $Q^*$ is also recurrent. For $l \ge i > N$, we have $q_{i,i-1} = r_ib_0 \le r_lb_0 = q^*_{l,l-1}$. Moreover, we have
$$
\sum_{j=k}^\infty q_{ij} = \sum_{j=k}^\infty q^*_{lj} = 0,\qquad k \le i-1,
$$
and
$$
\sum_{j=k}^\infty q_{ij} \le \sum_{j=k}^\infty q^*_{ij} \le \sum_{j=k}^\infty q^*_{lj},\qquad k \ge l+1.
$$
Then $Q$ and $Q^*$ are stochastically comparable, so we can construct a $Q$-process $(X_t)$ and a $Q^*$-process $(X^*_t)$ on some probability space in such a way that $X_0 = X^*_0$ and $X_t \le X^*_t$ for all $t \ge 0$; see Example 5.51 in [7]. Now the recurrence of $(X_t)$ follows from that of $(X^*_t)$.

(3) Since $M > 1$, there exists an $s_0 \in (0, 1)$ such that $B(s_0) < 0$, i.e., $\sum_{i=0}^\infty b_is_0^{i-1} < 1$. Take $H = \{0\}$ and $x_i = 1 - s_0^i$. For $i \ge 1$, we have
$$
\sum_{k=0}^\infty \pi_{ik}x_k = \pi_{i,i-1}x_{i-1} + \sum_{k=0}^\infty \pi_{i,i+k}x_{i+k} = \frac{r_ib_0}{r_i+\gamma}x_{i-1} + \sum_{k=0}^\infty\frac{r_ib_{k+1}+\gamma a_k}{r_i+\gamma}x_{i+k}
$$
$$
= 1 - \frac{s_0^i}{r_i+\gamma}\Big[r_i\Big(b_0s_0^{-1} + \sum_{k=0}^\infty b_{k+1}s_0^{k}\Big) + \gamma\sum_{k=0}^\infty a_ks_0^{k}\Big] \ge 1 - s_0^i = x_i.
$$
Then the process is transient by Theorem 8.0.2 in [13].

(4) Since the proof is similar to that of (2), we omit it.
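The drift computation in the proof of Theorem 1.6(1) reduces to the identity $\sum_{j\ne i}q_{ij}\,j + q_{ii}\,i = r_i(M-1) + \gamma m$, which is nonpositive once $r_i \ge \gamma m/(1-M)$. The sketch below, an illustration only with assumed finite-support distributions and parameters, recomputes this drift directly from the jumps encoded in (1.3) and checks the inequality beyond the threshold.

```python
import numpy as np

alpha, theta, gamma = 1.0, 1.5, 1.0
b = np.array([0.6, 0.0, 0.3, 0.1]); M = sum(k * b[k] for k in range(len(b)))
a = np.array([0.0, 0.6, 0.3, 0.1]); m = sum(k * a[k] for k in range(len(a)))
r = lambda i: alpha * i**theta

threshold = gamma * m / (1.0 - M)        # need r_i >= gamma*m/(1 - M)

def drift(i):
    """sum_j q_ij * j, computed from the jump decomposition of (1.3)."""
    s = r(i) * b[0] * (i - 1)                                      # i -> i - 1
    s += sum(r(i) * b[k] * (i - 1 + k) for k in range(2, len(b)))  # branching
    s += sum(gamma * a[k] * (i + k) for k in range(1, len(a)))     # immigration
    return s - (r(i) + gamma) * i                                  # diagonal term

for i in range(1, 40):
    if r(i) >= threshold:
        # equals r_i (M - 1) + gamma * m up to rounding, hence <= 0
        assert drift(i) <= 1e-9, i
```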

4  Mean Extinction Time

In this section, we assume $r_i = \alpha i^\theta$ for $\alpha > 0$ and $\theta \ge 1$. Let $X_t$ be a realization of the nonlinear branching process with immigration. Its jump times are given successively by $\tau_0 = 0$ and $\tau_n = \inf\{t : t > \tau_{n-1},\ X_t \ne X_{\tau_{n-1}}\}$. We also define $\sigma_k = \inf\{t \ge \tau_1 : X_t = k\}$. In order to prove the criterion for the ergodicity of $X_t$, let us consider the absorbed process $\widetilde X_t := X_{t\wedge\sigma_0}$. The density matrix of this process is given by
$$
\widetilde q_{ij} = \begin{cases} q_{ij}, & i \ge 1,\\ 0, & i = 0. \end{cases}
$$
For this process, we define $\widetilde\tau_0 = 0$, $\widetilde\tau_n = \inf\{t : t > \widetilde\tau_{n-1},\ \widetilde X_t \ne \widetilde X_{\widetilde\tau_{n-1}}\}$ and $\widetilde\sigma_k = \inf\{t \ge \widetilde\tau_1 : \widetilde X_t = k\}$. It is easy to see that
$$
E_i\sigma_0 = E_i\widetilde\sigma_0. \qquad (4.1)
$$
Let $p_{ij}(t)$ and $\varphi_{ij}(\lambda)$ denote the transition function and the resolvent of $\widetilde X_t$, respectively.

Lemma 4.1  For any $i \ge 1$ and $s \in [0, 1)$, we have
$$
\frac{d}{dt}\sum_{j=0}^\infty p_{ij}(t)s^j = \alpha B(s)\sum_{j=1}^\infty p_{ij}(t)j^\theta s^{j-1} - A(s)\sum_{j=1}^\infty p_{ij}(t)s^j,\qquad t \ge 0, \qquad (4.2)
$$
and
$$
\lambda\sum_{j=0}^\infty \varphi_{ij}(\lambda)s^j - s^i = \alpha B(s)\sum_{j=1}^\infty \varphi_{ij}(\lambda)j^\theta s^{j-1} - A(s)\sum_{j=1}^\infty \varphi_{ij}(\lambda)s^j,\qquad \lambda > 0. \qquad (4.3)
$$

Proof  From the Kolmogorov forward equation of the transition function we obtain, for $j \ge 1$,
$$
p'_{ij}(t) = \sum_{k=1}^{j-1}p_{ik}(t)\big[r_kb_{j-k+1} + \gamma a_{j-k}\big] - p_{ij}(t)(r_j+\gamma) + p_{i,j+1}(t)r_{j+1}b_0,
$$
together with $p'_{i0}(t) = p_{i1}(t)r_1b_0$. Multiplying both sides by $s^j$ and summing over $j$, we have
$$
\sum_{j=0}^\infty p'_{ij}(t)s^j = \sum_{j=1}^\infty\sum_{k=1}^{j-1}p_{ik}(t)r_kb_{j-k+1}s^j + \gamma\sum_{j=1}^\infty\sum_{k=1}^{j-1}p_{ik}(t)a_{j-k}s^j + b_0\sum_{j=0}^\infty p_{i,j+1}(t)r_{j+1}s^j - \sum_{j=1}^\infty p_{ij}(t)r_js^j - \gamma\sum_{j=1}^\infty p_{ij}(t)s^j.
$$
Then we can interchange the order of summation to see
$$
\sum_{j=1}^\infty\sum_{k=1}^{j-1}p_{ik}(t)r_kb_{j-k+1}s^j = \sum_{k=1}^\infty p_{ik}(t)r_ks^{k-1}\sum_{j=k+1}^\infty b_{j-k+1}s^{j-k+1}
\quad\text{and}\quad
\gamma\sum_{j=1}^\infty\sum_{k=1}^{j-1}p_{ik}(t)a_{j-k}s^j = \gamma\sum_{k=1}^\infty p_{ik}(t)s^k\sum_{j=k+1}^\infty a_{j-k}s^{j-k}.
$$
It follows that
$$
\sum_{j=0}^\infty p'_{ij}(t)s^j = \alpha B(s)\sum_{j=1}^\infty p_{ij}(t)j^\theta s^{j-1} - A(s)\sum_{j=1}^\infty p_{ij}(t)s^j.
$$
That proves (4.2), and (4.3) is just the Laplace transform of (4.2).

Lemma 4.2  For any $i, k \ge 1$, we have $\int_0^\infty p_{ik}(t)\,dt < \infty$ and $\lim_{t\to\infty}p_{ik}(t) = 0$. Furthermore, if $M \le 1$, then for $i \ge 1$ and $s \in [0, 1)$ we have
$$
\sum_{k=1}^\infty\Big(\int_0^\infty p_{ik}(t)\,dt\Big)s^k < \infty. \qquad (4.4)
$$

Proof  Fix $i \ge 1$. We can use the Kolmogorov forward equation to see
$$
p_{i0}(t) = b_0\alpha\int_0^t p_{i1}(u)\,du,
$$
which implies that $\int_0^\infty p_{i1}(t)\,dt \le b_0^{-1}\alpha^{-1} < \infty$. Suppose that $\int_0^\infty p_{ik}(t)\,dt < \infty$ for $k \le j$. By the Kolmogorov forward equation, we see that for $j \ge 1$,
$$
p_{ij}(t) - \delta_{ij} = \sum_{k=1}^{j-1}\big[\alpha k^\theta b_{j-k+1} + \gamma a_{j-k}\big]\int_0^t p_{ik}(u)\,du - (\alpha j^\theta + \gamma)\int_0^t p_{ij}(u)\,du + \alpha(j+1)^\theta b_0\int_0^t p_{i,j+1}(u)\,du.
$$
Letting $t \to \infty$, we get $\int_0^\infty p_{i,j+1}(t)\,dt < \infty$. Then $\int_0^\infty p_{ik}(t)\,dt < \infty$ for all $k \ge 1$ by induction. Since the limit $\lim_{t\to\infty}p_{ik}(t)$ always exists, we see $\lim_{t\to\infty}p_{ik}(t) = 0$ immediately.

We next turn to the proof of (4.4). Since $M \le 1$, we have $B(s) > 0$ for each fixed $s \in [0, 1)$. Then there exists a $k \ge 1$ so that $k\alpha B(s) - sA(s) > 0$. Using (4.2), we have
$$
\sum_{j=0}^\infty p'_{ij}(u)s^j = \alpha B(s)\sum_{j=1}^\infty p_{ij}(u)j^\theta s^{j-1} - A(s)\sum_{j=1}^\infty p_{ij}(u)s^j
$$
$$
\ge \alpha B(s)\sum_{j=k+1}^\infty p_{ij}(u)j^\theta s^{j-1} - A(s)\sum_{j=k+1}^\infty p_{ij}(u)s^j - A(s)\sum_{j=1}^{k}p_{ij}(u)s^j
$$
$$
\ge \big[k\alpha B(s) - sA(s)\big]\sum_{j=k+1}^\infty p_{ij}(u)s^{j-1} - A(s)\sum_{j=1}^{k}p_{ij}(u)s^j. \qquad (4.5)
$$
Let $A_0 = \max_{s\in[0,1]}A(s)$ and $B_0 = \max_{s\in[0,1]}\alpha B(s)$. Then for each $s \in [0, 1)$,
$$
\int_0^t\sum_{j=0}^\infty|p'_{ij}(u)|s^j\,du \le B_0\int_0^t\sum_{j=1}^\infty p_{ij}(u)j^\theta s^{j-1}\,du + A_0\int_0^t\sum_{j=1}^\infty p_{ij}(u)s^j\,du \le tB_0\sum_{j=1}^\infty j^\theta s^{j-1} + tA_0\sum_{j=1}^\infty s^j < \infty.
$$
Then we can use Fubini's theorem to see
$$
\int_0^t\sum_{j=0}^\infty p'_{ij}(u)s^j\,du = \sum_{j=0}^\infty p_{ij}(t)s^j - s^i.
$$
Integrating both sides of (4.5) over $[0, t]$, we obtain
$$
\sum_{j=0}^\infty p_{ij}(t)s^j - s^i \ge \big[k\alpha B(s) - sA(s)\big]\sum_{j=k+1}^\infty\Big(\int_0^t p_{ij}(u)\,du\Big)s^{j-1} - A(s)\sum_{j=1}^{k}\Big(\int_0^t p_{ij}(u)\,du\Big)s^j.
$$
Letting $t \to \infty$ and using the fact that $\int_0^\infty p_{ik}(t)\,dt < \infty$, we have
$$
\sum_{j=k+1}^\infty\Big(\int_0^\infty p_{ij}(u)\,du\Big)s^{j-1} < \infty,
$$
which implies (4.4).

Proposition 4.3  Suppose that the nonlinear branching process with immigration is recurrent and (1.6) holds. Then for $i \ge 1$ we have
$$
E_i\sigma_0 \le \frac{1}{\Gamma(\theta)}\int_0^1\frac{1-y^i}{\alpha B(y)}\Big(\ln\frac1y\Big)^{\theta-1}dy\,\exp\Big[\frac{1}{\Gamma(\theta)}\int_0^1\frac{A(y)}{\alpha B(y)}\Big(\ln\frac1y\Big)^{\theta-1}dy\Big] \qquad (4.6)
$$
and
$$
E_i\sigma_0 \ge \frac{1}{\Gamma(\theta)}\int_0^1\frac{1-y^i}{\alpha B(y)}\Big(\ln\frac1y\Big)^{\theta-1}dy. \qquad (4.7)
$$

Proof  Multiplying (4.3), written at the argument $y$, by $(\ln\frac sy)^{\theta-1}$, dividing by $\alpha B(y)$ and integrating both sides over $y \in (0, s)$, we have
$$
\int_0^s\sum_{j=1}^\infty\varphi_{ij}(\lambda)j^\theta y^{j-1}\Big(\ln\frac sy\Big)^{\theta-1}dy = \int_0^s\frac{(\lambda + A(y))\sum_{j=1}^\infty\varphi_{ij}(\lambda)y^j - y^i + \lambda\varphi_{i0}(\lambda)}{\alpha B(y)}\Big(\ln\frac sy\Big)^{\theta-1}dy.
$$
Letting $y = se^{-x/j}$ in the left-hand side of the above equation, we get
$$
\int_0^s\sum_{j=1}^\infty\varphi_{ij}(\lambda)j^\theta y^{j-1}\Big(\ln\frac sy\Big)^{\theta-1}dy = \sum_{j=1}^\infty\varphi_{ij}(\lambda)s^j\int_0^\infty x^{\theta-1}e^{-x}\,dx = \Gamma(\theta)\sum_{j=1}^\infty\varphi_{ij}(\lambda)s^j.
$$
Using the above two equations, we obtain
$$
\sum_{j=1}^\infty\varphi_{ij}(\lambda)s^j = \frac{1}{\Gamma(\theta)}\int_0^s\frac{(\lambda + A(y))\sum_{j=1}^\infty\varphi_{ij}(\lambda)y^j - y^i + \lambda\varphi_{i0}(\lambda)}{\alpha B(y)}\Big(\ln\frac sy\Big)^{\theta-1}dy. \qquad (4.8)
$$
For $i \ge 1$, $\lambda > 0$ and $s \in [0, 1]$, let $\psi_i(\lambda, s) = \sum_{j=1}^\infty\varphi_{ij}(\lambda)s^j$. Note that $\lambda\varphi_{i0}(\lambda) = \int_0^\infty e^{-t}p_{i0}(t/\lambda)\,dt \le 1$. Then, by (4.8) and $\ln\frac sy \le \ln\frac1y$,
$$
\psi_i(\lambda, s) \le \frac{1}{\Gamma(\theta)}\int_0^1\frac{1-y^i}{\alpha B(y)}\Big(\ln\frac1y\Big)^{\theta-1}dy + \frac{1}{\Gamma(\theta)}\int_0^s\frac{(\lambda+A(y))\psi_i(\lambda, y)}{\alpha B(y)}\Big(\ln\frac1y\Big)^{\theta-1}dy. \qquad (4.9)
$$
By Lemma 4.2,
$$
\lim_{\lambda\to0}\lambda\psi_i(\lambda, s) \le \lim_{\lambda\to0}\lambda\sum_{j=1}^\infty\varphi_{ij}(\lambda) = 0 \qquad (4.10)
$$
and, for each $s < 1$,
$$
\psi_i(0, s) := \sum_{k=1}^\infty\Big(\int_0^\infty p_{ik}(t)\,dt\Big)s^k < \infty.
$$
Denote
$$
C_i := \frac{1}{\Gamma(\theta)}\int_0^1\frac{1-y^i}{\alpha B(y)}\Big(\ln\frac1y\Big)^{\theta-1}dy \le \frac{i}{\Gamma(\theta)}\int_0^1\frac{1-y}{\alpha B(y)}\Big(\ln\frac1y\Big)^{\theta-1}dy \le \frac{Ci}{\Gamma(\theta)}\int_0^1\frac{A(y)}{\alpha B(y)}\Big(\ln\frac1y\Big)^{\theta-1}dy < \infty,
$$
where $C > 0$ is a constant with $1-y \le CA(y)$ on $[0,1]$ (such a constant exists since $A(y) \ge \gamma a_k(1-y)$ whenever $a_k > 0$), and the last integral is finite by (1.6). Letting $\lambda \to 0$ in (4.9) and using (4.10), we have
$$
\psi_i(0, s) \le C_i + \frac{1}{\Gamma(\theta)}\int_0^s\frac{A(y)\psi_i(0, y)}{\alpha B(y)}\Big(\ln\frac1y\Big)^{\theta-1}dy.
$$
Using Gronwall's inequality, we have
$$
\psi_i(0, s) \le C_i\exp\Big[\frac{1}{\Gamma(\theta)}\int_0^s\frac{A(y)}{\alpha B(y)}\Big(\ln\frac1y\Big)^{\theta-1}dy\Big].
$$
Letting $s \uparrow 1$ and using (4.1), we see
$$
\lim_{s\to1}\psi_i(0, s) = \lim_{s\to1}\sum_{j=1}^\infty\Big(\int_0^\infty p_{ij}(t)\,dt\Big)s^j = \int_0^\infty\big(1 - p_{i0}(t)\big)\,dt = E_i\widetilde\sigma_0 = E_i\sigma_0.
$$
Hence (4.6) follows. Similarly, by (4.8), we have
$$
\psi_i(\lambda, s) \ge \frac{1}{\Gamma(\theta)}\int_0^s\frac{\lambda\varphi_{i0}(\lambda) - y^i}{\alpha B(y)}\Big(\ln\frac sy\Big)^{\theta-1}dy.
$$
Letting $\lambda \to 0$ and then letting $s \to 1$, we obtain (4.7).
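Condition (1.6), which by Proposition 4.3 and Theorem 1.7(2) governs ergodicity, is an explicit integral and can be evaluated numerically. The sketch below is an illustration under assumed parameters (and it relies on SciPy quadrature): it computes $\int_0^1 \frac{A(s)}{\alpha B(s)}(\ln\frac1s)^{\theta-1}\,ds$ for finite-support distributions with $M < 1$, where the integrand stays bounded near $s = 1$ because $A(s)\sim\gamma m(1-s)$ and $B(s)\sim(1-M)(1-s)$.

```python
import numpy as np
from scipy.integrate import quad

alpha, theta, gamma = 1.0, 1.5, 1.0
b = np.array([0.6, 0.0, 0.3, 0.1])   # offspring distribution, M = 0.9 < 1
a = np.array([0.0, 0.6, 0.3, 0.1])   # immigration distribution

F = lambda s: sum(a[k] * s**k for k in range(len(a)))
G = lambda s: sum(b[k] * s**k for k in range(len(b)))
A = lambda s: gamma * (1.0 - F(s))
B = lambda s: G(s) - s

# Integrand of (1.6); the endpoints are never evaluated exactly by quad.
integrand = lambda s: A(s) / (alpha * B(s)) * np.log(1.0 / s) ** (theta - 1.0)

value, err = quad(integrand, 0.0, 1.0)
print(value, err)    # a finite value: (1.6) holds for this example
```

Replacing $A(s)$ by the constant $\gamma$ in the integrand gives, in the same way, a numerical check of condition (1.7).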

5  Ergodicity and Strong Ergodicity

One of the main steps in the proofs of Theorems 1.7 and 1.8 is to compare our nonlinear branching process with immigration with a suitably designed birth-death process, which we now introduce. A similar birth-death process was used by [8] in her study of the regularity of the nonlinear branching process with resurrection. Let $L = M + b_0 - 1 = \sum_{k=0}^\infty kb_{k+1}$, and let $\widehat X_t$ be a birth-death process with birth rates $d_i = r_iL + \gamma m$ and death rates $c_i = r_ib_0$. We denote the density matrix of $\widehat X_t$ by $(\widehat q_{ij})$. Let $T := \inf\{t \ge 0 : \widehat X_t = 0\}$.

Lemma 5.1  (1) Suppose that $m < \infty$, $M < 1$, $r_i$ is increasing and $\sum_{i=1}^\infty r_i^{-1} < \infty$. Then the birth-death process $\widehat X_t$ is strongly ergodic. (2) Suppose that $m < \infty$, $M \le 1$, $r_i$ is non-decreasing and $\sum_{i=1}^\infty r_i^{-1} < \infty$. Then the birth-death process $\widehat X_t$ is ergodic.

Proof  (1) It is easy to check that the birth-death process is regular. Fix an $\varepsilon > 0$ satisfying $L + \varepsilon < b_0$. Then there exists an $N$ such that $d_i = r_iL + \gamma m \le r_i(L+\varepsilon)$ for each $i > N$. Let
$$
S = \sum_{n=1}^\infty\Big(\frac{1}{c_{n+1}} + \sum_{k=1}^n\frac{d_k\cdots d_n}{c_k\cdots c_{n+1}}\Big). \qquad (5.1)
$$
It is obvious that $\sum_{n=1}^\infty c_{n+1}^{-1} < \infty$. Notice that $d_j/c_j \le \rho := b_0^{-1}(L+\varepsilon) < 1$ for every $j > N$, so for each $n > N$ we have
$$
\sum_{k=1}^n\frac{d_k\cdots d_n}{c_k\cdots c_{n+1}} \le \frac{N\rho^{\,n-N}}{c_{n+1}}\max_{1\le k\le N}\frac{d_k\cdots d_N}{c_k\cdots c_N} + \frac{1}{c_{n+1}}\sum_{k=N+1}^n\rho^{\,n-k+1}.
$$
Since $\sum_n c_{n+1}^{-1} < \infty$ and $\rho < 1$, it follows that $S < \infty$. By Corollary 2.4 of [16], we conclude that $\widehat X_t$ is strongly ergodic.

(2) Since $L \le b_0$, we have
$$
R := \sum_{n=1}^\infty\frac{d_0d_1\cdots d_{n-1}}{c_1c_2\cdots c_n} \le d_0\sum_{n=1}^\infty\frac{(c_1+\gamma m)(c_2+\gamma m)\cdots(c_{n-1}+\gamma m)}{c_1c_2\cdots c_n}.
$$
Taking logarithms on the right-hand side, we get
$$
\ln\frac{(c_1+\gamma m)(c_2+\gamma m)\cdots(c_{n-1}+\gamma m)}{c_1c_2\cdots c_n} = \sum_{i=1}^{n-1}\ln\Big(1 + \frac{\gamma m}{r_ib_0}\Big) + \ln\frac{1}{c_n}.
$$
Since $\lim_{i\to\infty}r_i = \infty$, we have $\ln(1 + \gamma m/(r_ib_0)) \sim \gamma m/(r_ib_0)$ as $i \to \infty$. Then there exists a constant $C$ such that for sufficiently large $n$,
$$
\ln\frac{(c_1+\gamma m)(c_2+\gamma m)\cdots(c_{n-1}+\gamma m)}{c_1c_2\cdots c_n} \le C\sum_{i=1}^{n-1}\frac{1}{r_i} + \ln\frac{1}{c_n},
$$
and hence
$$
\frac{(c_1+\gamma m)(c_2+\gamma m)\cdots(c_{n-1}+\gamma m)}{c_1c_2\cdots c_n} \le \frac{T_0}{c_n}
$$
for another constant $T_0$. Since $\sum_{n=1}^\infty c_n^{-1} < \infty$, this implies $R < \infty$. By Theorem 4.55 in [7] the birth-death process is ergodic.

Lemma 5.2  If the nonlinear branching process with immigration has a stationary distribution $\mu = (\mu_j)$, then the generating function $f(s) := \sum_{j=0}^\infty\mu_js^j$ satisfies the following equation:
$$
\Gamma(\theta)f(s) = \Gamma(\theta)\mu_0 + \int_0^s\frac{A(y)}{\alpha B(y)}\Big(\ln\frac sy\Big)^{\theta-1}f(y)\,dy,\qquad s \in [0, 1]. \qquad (5.2)
$$

Proof  The stationary distribution $(\mu_j)$ satisfies $\mu Q = 0$. In view of (1.3), we have for $j \ge 1$
$$
\mu_j(\gamma + \alpha j^\theta) = \sum_{i=0}^{j-1}\mu_i\gamma a_{j-i} + \sum_{i=1}^{j+1}\mu_i\alpha i^\theta b_{j-i+1}. \qquad (5.3)
$$
Multiplying both sides by $s^j$ and summing over $j \ge 1$, we have
$$
\gamma\big(f(s) - \mu_0\big) + \alpha s\sum_{j=1}^\infty\mu_jj^\theta s^{j-1} = \gamma\sum_{j=1}^\infty\sum_{i=0}^{j-1}\mu_ia_{j-i}s^j + \alpha\sum_{j=1}^\infty\sum_{i=1}^{j+1}\mu_ii^\theta b_{j-i+1}s^j. \qquad (5.4)
$$
Interchanging the order of summation,
$$
\text{r.h.s. of (5.4)} = \gamma\sum_{i=0}^\infty\mu_is^i\sum_{j=i+1}^\infty a_{j-i}s^{j-i} + \alpha\sum_{i=1}^\infty\mu_ii^\theta s^{i-1}\sum_{j\ge i-1}b_{j-i+1}s^{j-i+1} - \alpha\mu_1b_0
 = \gamma f(s)F(s) + \alpha G(s)\sum_{i=1}^\infty\mu_ii^\theta s^{i-1} - \alpha\mu_1b_0. \qquad (5.5)
$$
Letting $j = 0$ in the stationarity equation, we see $\mu_0\gamma = \alpha\mu_1b_0$. Therefore, from (5.4) and (5.5) it follows that
$$
\sum_{j=1}^\infty\mu_jj^\theta s^{j-1} = \frac{f(s)A(s)}{\alpha B(s)}.
$$
Multiplying the above equation by $(\ln\frac sy)^{\theta-1}$ and integrating both sides over $y \in (0, s)$, we have
$$
\int_0^s\sum_{j=1}^\infty\mu_jj^\theta y^{j-1}\Big(\ln\frac sy\Big)^{\theta-1}dy = \int_0^s\frac{A(y)}{\alpha B(y)}\Big(\ln\frac sy\Big)^{\theta-1}f(y)\,dy. \qquad (5.6)
$$
Letting $y = se^{-x/j}$, we get
$$
\text{l.h.s. of (5.6)} = \sum_{j=1}^\infty\mu_js^j\int_0^\infty x^{\theta-1}e^{-x}\,dx = \Gamma(\theta)\big[f(s) - \mu_0\big].
$$
Then $f(s)$ is a solution of the equation (5.2).

Proof of Theorem 1.7  (1) By Lemma 5.1, the birth-death process $\widehat X_t$ is ergodic. Thus by Theorem 4.45 in [7], the equation
$$
u_0 = 0,\qquad d_i(u_{i+1} - u_i) + c_i(u_{i-1} - u_i) + 1 = 0,\quad i \ge 1, \qquad (5.7)
$$
has a finite nonnegative solution $(u_i)$. By Remark 2.5 of [16], we have
$$
u_0 = 0,\qquad u_i = \sum_{k=0}^{i-1}\Big(\frac{1}{c_{k+1}} + \sum_{j=k+1}^\infty\frac{d_{k+1}\cdots d_j}{c_{k+1}\cdots c_{j+1}}\Big). \qquad (5.8)
$$
It is apparent that $u_i \le u_{i+1}$. Moreover, we have
$$
u_{i+1} - u_i = \frac{1}{c_{i+1}} + \sum_{j=i+1}^\infty\frac{d_{i+1}\cdots d_j}{c_{i+1}\cdots c_{j+1}},\qquad u_i - u_{i-1} = \frac{1}{c_i} + \sum_{j=i}^\infty\frac{d_i\cdots d_j}{c_i\cdots c_{j+1}}.
$$
Since $d_{i+1}/c_{i+1} \le d_i/c_i$ and $1/c_{i+1} \le 1/c_i$, it is not hard to show that $u_{i+1} - u_i$ is non-increasing in $i$. Coming back to the matrix $Q$, for $i \ge 1$ we have
$$
\sum_{j=0}^\infty q_{ij}u_j = \sum_{j=0}^\infty q_{ij}(u_j - u_i) = -c_i(u_i - u_{i-1}) + r_i\sum_{k=2}^\infty b_k\sum_{l=1}^{k-1}(u_{i+l} - u_{i+l-1}) + \gamma\sum_{k=1}^\infty a_k\sum_{l=1}^{k}(u_{i+l} - u_{i+l-1})
$$
$$
\le -c_i(u_i - u_{i-1}) + \Big(r_i\sum_{k=2}^\infty(k-1)b_k + \gamma\sum_{k=1}^\infty ka_k\Big)(u_{i+1} - u_i) = -c_i(u_i - u_{i-1}) + d_i(u_{i+1} - u_i) = -1, \qquad (5.9)
$$
and
$$
\sum_{j=0}^\infty q_{0j}u_j = \sum_{j=1}^\infty\gamma a_j(u_j - u_0) = \gamma\sum_{j=1}^\infty a_j\sum_{l=1}^{j}(u_l - u_{l-1}) \le \gamma\sum_{j=1}^\infty ja_ju_1 = \gamma mu_1 < \infty. \qquad (5.10)
$$
Then $(u_i)$ is a nonnegative finite solution to the system
$$
\sum_{j=0}^\infty q_{0j}u_j < \infty,\qquad \sum_{j=0}^\infty q_{ij}u_j \le -1,\quad i \ge 1.
$$
By Theorem 4.45 in [7], we know the process is positive recurrent.

(2) Suppose that the process is ergodic. Then letting $s = 1$ in (5.2), we get
$$
\Gamma(\theta) > \Gamma(\theta)\big(f(1) - \mu_0\big) = \int_0^1\frac{A(y)}{\alpha B(y)}\Big(\ln\frac1y\Big)^{\theta-1}f(y)\,dy \ge \mu_0\int_0^1\frac{A(y)}{\alpha B(y)}\Big(\ln\frac1y\Big)^{\theta-1}dy.
$$
Since $\mu_0 > 0$, we have (1.6). Conversely, suppose that (1.6) holds. By the strong Markov property, we have
$$
E_0\sigma_0 = E_0\tau_1 + E_0\big[E_{X_{\tau_1}}\sigma_0\big] = \frac{1}{q_0} + \sum_{i=1}^\infty\frac{q_{0i}}{q_0}E_i\sigma_0 = \frac{1}{\gamma} + \sum_{i=1}^\infty a_iE_i\sigma_0.
$$
Using (4.6) and $\sum_i a_i(1-y^i) = 1 - F(y) = A(y)/\gamma$, we have
$$
E_0\sigma_0 \le \frac1\gamma + \frac{1}{\gamma\Gamma(\theta)}\Big[\int_0^1\frac{A(y)}{\alpha B(y)}\Big(\ln\frac1y\Big)^{\theta-1}dy\Big]\exp\Big[\frac{1}{\Gamma(\theta)}\int_0^1\frac{A(y)}{\alpha B(y)}\Big(\ln\frac1y\Big)^{\theta-1}dy\Big].
$$
By (1.6), the right-hand side is finite. Thus the process is ergodic.

(3) By the assumption, there exists $C > 0$ such that $r_i \ge \frac{C}{b_0 - L}\,i$ for large enough $i$. Therefore,
$$
\sum_{j=0}^\infty jq_{ij} = \sum_{k\ge2}(i-1+k)r_ib_k + \sum_{k\ge1}(i+k)\gamma a_k + (i-1)r_ib_0 - (\gamma + r_i)i = \gamma m - r_i(b_0 - L) \le \gamma m - Ci.
$$
Applying Corollary 4.49 in [7], we know the process is exponentially ergodic.
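When the process is ergodic (for instance under Theorem 1.7(1)), its stationary distribution $\mu$ satisfies $\mu Q = 0$, which is the starting point of Lemma 5.2. The sketch below is illustrative only; the truncation to $\{0,\ldots,n\}$, the least-squares solve, and the parameter values are all assumptions, so the result is merely a truncation approximation of $\mu$.

```python
import numpy as np

alpha, theta, gamma, n = 1.0, 1.5, 1.0, 200
b = np.array([0.6, 0.0, 0.3, 0.1])
a = np.array([0.0, 0.6, 0.3, 0.1])
r = lambda i: alpha * i**theta

# Truncated density matrix (1.3); rows near the truncation level leak mass,
# so the computed distribution is only an approximation of mu.
Q = np.zeros((n + 1, n + 1))
for i in range(n + 1):
    if i >= 1:
        Q[i, i - 1] = r(i) * b[0]
    for k in range(2, len(b)):
        if i - 1 + k <= n:
            Q[i, i - 1 + k] += r(i) * b[k]
    for k in range(1, len(a)):
        if i + k <= n:
            Q[i, i + k] += gamma * a[k]
    Q[i, i] = -(r(i) + gamma)

# Solve mu Q = 0 with sum(mu) = 1: drop one balance equation and append
# the normalisation row, then solve the square system by least squares.
A_sys = np.vstack([Q.T[:-1], np.ones(n + 1)])
rhs = np.zeros(n + 1); rhs[-1] = 1.0
mu, *_ = np.linalg.lstsq(A_sys, rhs, rcond=None)
print(mu[:5], mu.sum())
```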

Proof of Theorem 1.8  (1) Using Lemma 5.1, we see that the birth-death process $\widehat X_t$ is strongly ergodic. Let $u_i := E_iT$ for $i \ge 0$. Applying Theorem 4.44 and Lemma 4.48 in [7], we find that $(u_i)$ is a bounded nonnegative solution to equation (5.7). By (5.9) and (5.10), $(u_i)$ is also a nonnegative bounded solution to the system
$$
\sum_{j=0}^\infty q_{0j}u_j < \infty,\qquad \sum_{j=0}^\infty q_{ij}u_j \le -1,\quad i \ge 1.
$$
By Theorem 4.45 in [7], we know the process is strongly ergodic.

(2) Suppose that (1.7) holds. Then
$$
\int_0^1\frac{A(y)}{\alpha B(y)}\Big(\ln\frac1y\Big)^{\theta-1}dy \le \gamma\int_0^1\frac{1}{\alpha B(y)}\Big(\ln\frac1y\Big)^{\theta-1}dy < \infty.
$$
Letting $i \to \infty$ in (4.6), we get
$$
\sup_{i\ge1}E_i\sigma_0 \le \frac{1}{\Gamma(\theta)}\Big[\int_0^1\frac{1}{\alpha B(y)}\Big(\ln\frac1y\Big)^{\theta-1}dy\Big]\exp\Big[\frac{1}{\Gamma(\theta)}\int_0^1\frac{A(y)}{\alpha B(y)}\Big(\ln\frac1y\Big)^{\theta-1}dy\Big] < \infty.
$$
Then by Theorem 4.44 in [7], the process is strongly ergodic. Conversely, suppose that $X_t$ is strongly ergodic. By Theorem 4.44 in [7] and (4.7), we know that (1.7) holds.

(3) By the strong Markov property, for $i \ge 1$ we have $E_i\sigma_0 = \sum_{k=1}^{i}E_k\sigma_{k-1}$. Notice that
$$
E_k\sigma_{k-1} \ge E_k[\text{time spent at $k$ until the next jump}] = \frac{1}{r_k+\gamma}.
$$
Thus $E_i\sigma_0 \ge \sum_{k=1}^{i}(r_k+\gamma)^{-1}$. By the assumption $\sum_{i=1}^\infty r_i^{-1} = \infty$, we have $\sup_{i\ge1}E_i\sigma_0 = \infty$. Applying Theorem 4.44 in [7], we know the process is not strongly ergodic.

Acknowledgements  The author would like to thank Professors Mu-Fa Chen, Yong-Hua Mao and Yu-Hui Zhang for their advice and encouragement. I am grateful to the two referees for pointing out a number of typos in the first version of the paper.

References
[1] Anderson, W. J.: Continuous-Time Markov Chains: An Applications-Oriented Approach, Springer, New York, 1991
[2] Athreya, K. B., Ney, P. E.: Branching Processes, Springer, Berlin, 1972
[3] Chen, A. Y.: Ergodicity and stability of generalised Markov branching processes with resurrection. J. Appl. Probab., 39, 2002
[4] Chen, A. Y., Li, J. P., Ramesh, N. I.: Uniqueness and extinction of weighted Markov branching processes. Methodol. Comput. Appl. Probab., 7, 2005
[5] Chen, A. Y., Li, J. P., Ramesh, N. I.: General Harris regularity criterion for non-linear Markov branching processes. Statist. Probab. Lett., 76, 2006
[6] Chen, A. Y., Renshaw, E.: Markov branching processes with instantaneous immigration. Probab. Theory Relat. Fields, 87, 1990
[7] Chen, M. F.: From Markov Chains to Non-Equilibrium Particle Systems, Second edition, World Scientific, Singapore, 2004
[8] Chen, R. R.: An extended class of time-continuous branching processes. J. Appl. Probab., 34, 1997
[9] Harris, T. E.: The Theory of Branching Processes, Springer, Berlin, 1963
[10] Ikeda, N., Watanabe, S.: Stochastic Differential Equations and Diffusion Processes, Second edition, North-Holland/Kodansha, Amsterdam/Tokyo, 1989
[11] Karlin, S., Taylor, H. M.: A First Course in Stochastic Processes, Second edition, Academic Press, New York, 1975
[12] Li, J. P., Chen, A. Y.: Markov branching processes with immigration and resurrection. Markov Processes Relat. Fields, 12, 2006
[13] Meyn, S. P., Tweedie, R. L.: Markov Chains and Stochastic Stability, Second edition, Cambridge Univ. Press, Cambridge, 2009
[14] Pakes, A. G.: Extinction and explosion of nonlinear Markov branching processes. J. Austral. Math. Soc., 82, 2007
[15] Yang, Y. S.: On branching processes allowing immigration. J. Appl. Probab., 9, 1972
[16] Zhang, Y. H.: Strong ergodicity for single-birth processes. J. Appl. Probab., 38, 2001


More information

EXAM IN COURSE TMA4265 STOCHASTIC PROCESSES Wednesday 7. August, 2013 Time: 9:00 13:00

EXAM IN COURSE TMA4265 STOCHASTIC PROCESSES Wednesday 7. August, 2013 Time: 9:00 13:00 Norges teknisk naturvitenskapelige universitet Institutt for matematiske fag Page 1 of 7 English Contact: Håkon Tjelmeland 48 22 18 96 EXAM IN COURSE TMA4265 STOCHASTIC PROCESSES Wednesday 7. August, 2013

More information

6 Continuous-Time Birth and Death Chains

6 Continuous-Time Birth and Death Chains 6 Continuous-Time Birth and Death Chains Angela Peace Biomathematics II MATH 5355 Spring 2017 Lecture notes follow: Allen, Linda JS. An introduction to stochastic processes with applications to biology.

More information

Spatial Ergodicity of the Harris Flows

Spatial Ergodicity of the Harris Flows Communications on Stochastic Analysis Volume 11 Number 2 Article 6 6-217 Spatial Ergodicity of the Harris Flows E.V. Glinyanaya Institute of Mathematics NAS of Ukraine, glinkate@gmail.com Follow this and

More information

Research Article Mean Square Stability of Impulsive Stochastic Differential Systems

Research Article Mean Square Stability of Impulsive Stochastic Differential Systems International Differential Equations Volume 011, Article ID 613695, 13 pages doi:10.1155/011/613695 Research Article Mean Square Stability of Impulsive Stochastic Differential Systems Shujie Yang, Bao

More information

A New Wavelet-based Expansion of a Random Process

A New Wavelet-based Expansion of a Random Process Columbia International Publishing Journal of Applied Mathematics and Statistics doi:10.7726/jams.2016.1011 esearch Article A New Wavelet-based Expansion of a andom Process Ievgen Turchyn 1* eceived: 28

More information

The Mabinogion Sheep Problem

The Mabinogion Sheep Problem The Mabinogion Sheep Problem Kun Dong Cornell University April 22, 2015 K. Dong (Cornell University) The Mabinogion Sheep Problem April 22, 2015 1 / 18 Introduction (Williams 1991) we are given a herd

More information

Ergodic Theorems. Samy Tindel. Purdue University. Probability Theory 2 - MA 539. Taken from Probability: Theory and examples by R.

Ergodic Theorems. Samy Tindel. Purdue University. Probability Theory 2 - MA 539. Taken from Probability: Theory and examples by R. Ergodic Theorems Samy Tindel Purdue University Probability Theory 2 - MA 539 Taken from Probability: Theory and examples by R. Durrett Samy T. Ergodic theorems Probability Theory 1 / 92 Outline 1 Definitions

More information

Analysis of the Dynamic Travelling Salesman Problem with Different Policies

Analysis of the Dynamic Travelling Salesman Problem with Different Policies Analysis of the Dynamic Travelling Salesman Problem with Different Policies Santiago Ravassi A Thesis in The Department of Mathematics and Statistics Presented in Partial Fulfillment of the Requirements

More information

Research Article Exponential Inequalities for Positively Associated Random Variables and Applications

Research Article Exponential Inequalities for Positively Associated Random Variables and Applications Hindawi Publishing Corporation Journal of Inequalities and Applications Volume 008, Article ID 38536, 11 pages doi:10.1155/008/38536 Research Article Exponential Inequalities for Positively Associated

More information

All Good (Bad) Words Consisting of 5 Blocks

All Good (Bad) Words Consisting of 5 Blocks Acta Mathematica Sinica, English Series Jun, 2017, Vol 33, No 6, pp 851 860 Published online: January 25, 2017 DOI: 101007/s10114-017-6134-2 Http://wwwActaMathcom Acta Mathematica Sinica, English Series

More information

GENERALIZED RAY KNIGHT THEORY AND LIMIT THEOREMS FOR SELF-INTERACTING RANDOM WALKS ON Z 1. Hungarian Academy of Sciences

GENERALIZED RAY KNIGHT THEORY AND LIMIT THEOREMS FOR SELF-INTERACTING RANDOM WALKS ON Z 1. Hungarian Academy of Sciences The Annals of Probability 1996, Vol. 24, No. 3, 1324 1367 GENERALIZED RAY KNIGHT THEORY AND LIMIT THEOREMS FOR SELF-INTERACTING RANDOM WALKS ON Z 1 By Bálint Tóth Hungarian Academy of Sciences We consider

More information

c 2004 Society for Industrial and Applied Mathematics

c 2004 Society for Industrial and Applied Mathematics SIAM J. COTROL OPTIM. Vol. 43, o. 4, pp. 1222 1233 c 2004 Society for Industrial and Applied Mathematics OZERO-SUM STOCHASTIC DIFFERETIAL GAMES WITH DISCOTIUOUS FEEDBACK PAOLA MAUCCI Abstract. The existence

More information

Stochastic Processes

Stochastic Processes Stochastic Processes 8.445 MIT, fall 20 Mid Term Exam Solutions October 27, 20 Your Name: Alberto De Sole Exercise Max Grade Grade 5 5 2 5 5 3 5 5 4 5 5 5 5 5 6 5 5 Total 30 30 Problem :. True / False

More information