
Lévy processes and continuous-state branching processes: part III

Andreas E. Kyprianou, Department of Mathematical Sciences, University of Bath, Claverton Down, Bath, BA2 7AY.

9 Continuous-State Branching Processes

Originating in part from the concerns of the Victorian British upper classes that aristocratic surnames were becoming extinct, the theory of branching processes now forms a cornerstone of classical applied probability. Some of the earliest work on branching processes dates back to Galton and Watson in 1874, [25]. However, approximately one hundred years later it was discovered in [13] that the less well-exposed work of I. J. Bienaymé, dated around 1845, contained many aspects of the later work of Galton and Watson. The Bienaymé–Galton–Watson process, as it is now known, is a discrete-time Markov chain with state space $\{0,1,2,\ldots\}$ described by the sequence $\{Z_n : n = 0,1,2,\ldots\}$ satisfying the recursion $Z_0 > 0$ and
$Z_n = \sum_{i=1}^{Z_{n-1}} \xi^{(n)}_i$   (9.1)
for $n = 1,2,\ldots$, where $\{\xi^{(n)}_i : i = 1,2,\ldots\}$ are independent and identically distributed on $\{0,1,2,\ldots\}$. We use the usual convention that $\sum_{i=1}^{0}$ represents the empty sum. The basic idea behind this model is that $Z_n$ is the population count in the $n$th generation; from an initial population $Z_0$ (which may be randomly distributed), individuals reproduce asexually and independently with the same distribution of numbers of offspring. The latter reproductive properties are referred to as the branching property. Note that as soon as $Z_n = 0$ it follows from the given construction that $Z_{n+k} = 0$ for all $k = 1,2,\ldots$. A particular consequence of the branching property is that if $Z_0 = a + b$ then $Z_n$ is equal in distribution to $Z^{(1)}_n + Z^{(2)}_n$, where $Z^{(1)}_n$ and $Z^{(2)}_n$ are independent with the same distribution as an $n$th generation Bienaymé–Galton–Watson process initiated from population sizes $a$ and $b$, respectively.

A mild modification of the Bienaymé–Galton–Watson process is to set it into continuous time by assigning life lengths to each individual which are independent and exponentially distributed with parameter $\lambda > 0$. Individuals reproduce at their moment of death in the same way as described previously for the Bienaymé–Galton–Watson process. If $Y = \{Y_t : t \geq 0\}$ is the $\{0,1,2,\ldots\}$-valued process describing the population size, then it is straightforward to see that the lack of memory property of the exponential distribution implies that

for all $0 \leq s \leq t$,
$Y_t = \sum_{i=1}^{Y_s} Y^{(i)}_{t-s}$,
where, given $\{Y_u : u \leq s\}$, the variables $\{Y^{(i)}_{t-s} : i = 1,\ldots,Y_s\}$ are independent with the same distribution as $Y_{t-s}$ conditional on $Y_0 = 1$. In that case, we may talk of $Y$ as a continuous-time Markov chain on $\{0,1,2,\ldots\}$ with probabilities, say, $\{P_y : y = 0,1,2,\ldots\}$, where $P_y$ is the law of $Y$ under the assumption that $Y_0 = y$. As before, the state $0$ is absorbing in the sense that if $Y_t = 0$ then $Y_{t+u} = 0$ for all $u > 0$. The process $Y$ is called the continuous-time Markov branching process. The branching property for $Y$ may now be formulated as follows.

Definition 9.1 (Branching property) For any $t \geq 0$ and $y_1, y_2$ in the state space of $Y$, $Y_t$ under $P_{y_1+y_2}$ is equal in law to the independent sum $Y^{(1)}_t + Y^{(2)}_t$, where the distribution of $Y^{(i)}_t$ is equal to that of $Y_t$ under $P_{y_i}$ for $i = 1,2$.

So far there appears to be little connection with Lévy processes. However, a remarkable time transformation shows that the path of $Y$ is intimately linked to the path of a compound Poisson process with jumps whose distribution is supported in $\{-1,0,1,2,\ldots\}$, stopped at the first instant that it hits zero. To explain this in more detail, let us introduce the probabilities $\{\pi_i : i = -1,0,1,2,\ldots\}$, where $\pi_i = P(\xi = i+1)$ and $\xi$ has the same distribution as the typical family size in the Bienaymé–Galton–Watson process. To avoid complications, let us assume that $\pi_0 = 0$, so that a transition in the state of $Y$ always occurs when an individual dies. When jumps of $Y$ occur, they are independent and always distributed according to $\{\pi_i : i = -1,0,1,\ldots\}$. The idea now is to adjust time in accordance with the evolution of $Y$ in such a way that these jumps are spaced out with inter-arrival times that are independent and exponentially distributed. Crucial to the following exposition is the simple and well-known fact that the minimum of $n \in \{1,2,\ldots\}$ independent and exponentially distributed random variables, each with parameter $\lambda$, is exponentially distributed with parameter $\lambda n$. Further, if $\mathbf{e}_\alpha$ is exponentially distributed with parameter $\alpha > 0$, then for $\beta > 0$, $\beta \mathbf{e}_\alpha$ is equal in distribution to $\mathbf{e}_{\alpha/\beta}$.

Write, for $t \geq 0$,
$J_t = \int_0^t Y_u \, du$,
set
$\varphi_t = \inf\{s \geq 0 : J_s > t\}$
with the usual convention that $\inf\emptyset = \infty$, and define
$X_t = Y_{\varphi_t}$   (9.2)
with the understanding that when $\varphi_t = \infty$ we set $X_t = 0$. Now observe that when $Y_0 = y \in \{1,2,\ldots\}$, the first jump of $Y$ occurs at a time, say $T_1$ (the minimum of $y$ independent exponential random variables, each with parameter $\lambda > 0$), which is exponentially distributed with parameter $\lambda y$, and the size of the jump is distributed according to $\{\pi_i : i = -1,0,1,2,\ldots\}$. However, note that, on account of the fact that $\varphi_{J_{T_1}} = T_1$, the quantity

$J_{T_1} = \int_0^{T_1} y\, du = yT_1$ is the first time that the process $X = \{X_t : t \geq 0\}$ jumps. The latter time is exponentially distributed with parameter $\lambda$. The jump at this time is independent and distributed according to $\{\pi_i : i = -1,0,1,2,\ldots\}$. Given the information $\mathcal{G}_1 = \sigma(Y_t : t \leq T_1)$, the lack of memory property implies that the continuation $\{Y_{T_1+t} : t \geq 0\}$ has the same law as $Y$ under $P_y$ with $y = Y_{T_1}$. Hence, if $T_2$ is the time of the second jump of $Y$, then conditional on $\mathcal{G}_1$ we have that $T_2 - T_1$ is exponentially distributed with parameter $\lambda Y_{T_1}$ and $J_{T_2} - J_{T_1} = Y_{T_1}(T_2 - T_1)$, which is again exponentially distributed with parameter $\lambda$ and, further, is independent of $\mathcal{G}_1$. Note that $J_{T_2}$ is the time of the second jump of $X$ and the size of the second jump is again independent and distributed according to $\{\pi_i : i = -1,0,1,\ldots\}$. Iterating in this way, it becomes clear that $X$ is nothing more than a compound Poisson process with arrival rate $\lambda$ and jump distribution
$F(dx) = \sum_{i=-1}^{\infty} \pi_i\, \delta_i(dx)$   (9.3)
stopped on first hitting the origin.

A converse to this construction is also possible. Suppose now that $X = \{X_t : t \geq 0\}$ is a compound Poisson process with arrival rate $\lambda > 0$ and jump distribution $F(dx) = \sum_{i=-1}^{\infty}\pi_i\delta_i(dx)$. Write
$I_t = \int_0^t X_u^{-1}\, du$
and set
$\theta_t = \inf\{s \geq 0 : I_s > t\},$   (9.4)
again with the understanding that $\inf\emptyset = \infty$. Define
$Y_t = X_{\theta_t \wedge \tau_0^-}$,
where $\tau_0^- = \inf\{t > 0 : X_t < 0\}$. By analysing the behaviour of $Y = \{Y_t : t \geq 0\}$ at the jump times of $X$ in a similar way to the above, one readily shows that the process $Y$ is a continuous-time Markov branching process. The details are left as an exercise to the reader.

The relationship between compound Poisson processes and continuous-time Markov branching processes described above turns out to have a much more general setting. In the foundational work of Lamperti [20, 21] it is shown that there exists a correspondence between a class of branching processes called continuous-state branching processes and Lévy processes with no negative jumps (that is to say, spectrally positive Lévy processes or subordinators). We investigate this relation in more detail in the remainder of this text.
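The correspondence just described is easy to see in simulation. The following sketch is not part of the original notes; the function names, the random seed and the particular offspring distribution are illustrative choices. It builds the continuous-time Markov branching process twice with the same ingredients $\lambda$ and $\{\pi_i\}$: once directly from exponential lifetimes, and once by running the compound Poisson process of (9.3) and inverting the time change as in (9.4).

import numpy as np

rng = np.random.default_rng(1)

def simulate_markov_branching(y0, lam, offspring_probs, t_max):
    """Direct construction: each of the Y_t individuals carries an exponential(lam)
    lifetime and is replaced at death by k offspring, k ~ offspring_probs."""
    t, y = 0.0, y0
    times, states = [0.0], [y0]
    while t < t_max and y > 0:
        t += rng.exponential(1.0 / (lam * y))   # next death: min of y exp(lam) clocks
        k = rng.choice(len(offspring_probs), p=offspring_probs)
        y += k - 1                              # one individual dies, k are born
        times.append(t); states.append(y)
    return np.array(times), np.array(states)

def branching_from_compound_poisson(y0, lam, offspring_probs, t_max):
    """Converse construction: run the compound Poisson process X (rate lam, jumps
    in {-1,0,1,...}), stop it at 0, and recover Y_t = X_{theta_t} where
    theta_t inverts I_s = int_0^s du / X_u, as in (9.4)."""
    x, clock = y0, 0.0                          # clock = time on the Y scale
    times, states = [0.0], [y0]
    while clock < t_max and x > 0:
        gap = rng.exponential(1.0 / lam)        # inter-jump time of X
        clock += gap / x                        # int du/X_u over a flat stretch of X
        k = rng.choice(len(offspring_probs), p=offspring_probs)
        x += k - 1
        times.append(clock); states.append(x)
    return np.array(times), np.array(states)

# Critical offspring distribution (p_0, p_1, p_2) = (1/2, 0, 1/2); note p_1 = 0,
# matching the assumption pi_0 = 0 made in the text.
p = [0.5, 0.0, 0.5]
print(simulate_markov_branching(5, 1.0, p, 10.0)[1][:10])
print(branching_from_compound_poisson(5, 1.0, p, 10.0)[1][:10])

The two constructions produce processes with the same law; comparing, say, empirical extinction frequencies over many runs gives a quick numerical check of the time-change argument.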

10 The Lamperti Transform

A $[0,\infty)$-valued strong Markov process $Y = \{Y_t : t \geq 0\}$ with probabilities $\{P_x : x \geq 0\}$ is called a continuous-state branching process if it has paths that are right continuous with left limits and its law observes the branching property given in Definition 9.1. Another way of phrasing the branching property is that for all $\theta \geq 0$ and $x, y \geq 0$,
$E_{x+y}(e^{-\theta Y_t}) = E_x(e^{-\theta Y_t})\, E_y(e^{-\theta Y_t})$.   (10.1)
Note from the above equality that, after an iteration, we may always write for each $x > 0$,
$E_x(e^{-\theta Y_t}) = E_{x/n}(e^{-\theta Y_t})^n,$   (10.2)
showing that $Y_t$ is infinitely divisible for each $t > 0$. If we define, for $\theta, t \geq 0$,
$g(t,\theta,x) = -\log E_x(e^{-\theta Y_t})$,
then (10.2) implies that for any positive integers $m, n$, $g(t,\theta,m) = n\, g(t,\theta,m/n)$ and $g(t,\theta,m) = m\, g(t,\theta,1)$, showing that for $x \in \mathbb{Q}\cap[0,\infty)$,
$g(t,\theta,x) = x\, u_t(\theta),$   (10.3)
where $u_t(\theta) = g(t,\theta,1)$. From (10.1) we also see that for $0 \leq z \leq y$, $g(t,\theta,z) \leq g(t,\theta,y)$, which implies that $g(t,\theta,x-)$ exists as a left limit and is less than or equal to $g(t,\theta,x+)$, which exists as a right limit. Thanks to (10.3), both left and right limits are the same, so that for all $x > 0$,
$E_x(e^{-\theta Y_t}) = e^{-x u_t(\theta)}.$   (10.4)
The Markov property in conjunction with (10.4) implies that for all $x > 0$ and $t, s, \theta \geq 0$,
$e^{-x u_{t+s}(\theta)} = E_x\big(E(e^{-\theta Y_{t+s}} \mid Y_t)\big) = E_x\big(e^{-Y_t u_s(\theta)}\big) = e^{-x u_t(u_s(\theta))}$.
In other words, the Laplace exponent of $Y$ obeys the semigroup property
$u_{t+s}(\theta) = u_t(u_s(\theta))$.
The first significant glimpse one gets of Lévy processes in relation to the above definition of a continuous-state branching process comes with the following result, for which we offer no proof on account of technicalities (see however Exercise 9 for intuitive motivation, and Chap. II of Le Gall (1999) or Silverstein [24] for a proof).
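As a quick, admittedly degenerate, sanity check on this formalism (this example is not part of the original notes), consider the deterministic process $Y_t = x e^{-bt}$ under $P_x$ for a fixed $b \in \mathbb{R}$. The branching property (10.1) holds trivially, and
$E_x\big(e^{-\theta Y_t}\big) = e^{-x\,\theta e^{-bt}}, \qquad u_t(\theta) = \theta e^{-bt},$
$u_t(u_s(\theta)) = \theta e^{-bs} e^{-bt} = u_{t+s}(\theta), \qquad \frac{\partial u_t}{\partial t}(\theta) = -b\, u_t(\theta),$
so the semigroup property is satisfied and, anticipating the theorem below, this degenerate continuous-state branching process has branching mechanism $\psi(\lambda) = b\lambda$.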

Theorem 10.1 For $t, \theta \geq 0$, suppose that $u_t(\theta)$ is the Laplace functional given by (10.4) of some continuous-state branching process. Then it is necessarily differentiable in $t$ and satisfies
$\frac{\partial u_t}{\partial t}(\theta) + \psi(u_t(\theta)) = 0$   (10.5)
with initial condition $u_0(\theta) = \theta$, where for $\lambda \geq 0$,
$\psi(\lambda) = -q - a\lambda + \tfrac{1}{2}\sigma^2\lambda^2 + \int_{(0,\infty)}\big(e^{-\lambda x} - 1 + \lambda x\,\mathbf{1}_{(x<1)}\big)\,\Pi(dx)$   (10.6)
and, in the above expression, $q \geq 0$, $a \in \mathbb{R}$, $\sigma \geq 0$ and $\Pi$ is a measure supported in $(0,\infty)$ satisfying $\int_{(0,\infty)}(1\wedge x^2)\,\Pi(dx) < \infty$.

Note that for $\lambda \geq 0$, $\psi(\lambda) = \log E(e^{-\lambda X_1})$, where $X$ is either a spectrally positive Lévy process or a subordinator, killed independently at rate $q$. (Here, a spectrally positive process is, by definition, the negative of a spectrally negative Lévy process and thus excludes subordinators. As usual, we understand the process $X$ killed at rate $q$ to mean that it is killed after an independent and exponentially distributed time with parameter $q$; $q = 0$ means there is no killing.) Otherwise said, $\psi$ is the Laplace exponent of a killed spectrally negative Lévy process, or the negative of the Laplace exponent of a killed subordinator. From Section 7.2 we know, for example, that $\psi$ is convex, infinitely differentiable on $(0,\infty)$, $\psi(0) = -q$ and $\psi'(0+) \in [-\infty,\infty)$. Further, if $X$ is a (killed) subordinator, then $\psi(\infty) \leq \psi(0) \leq 0$, and otherwise we have that $\psi(\infty) = \infty$.

For each $\theta > 0$, the solution to (10.5) can be uniquely identified by the relation
$\int_{u_t(\theta)}^{\theta} \frac{d\xi}{\psi(\xi)} = t.$   (10.7)
(This is easily confirmed by elementary differentiation; note also that the lower delimiter implies that $u_0(\theta) = \theta$ by letting $t \downarrow 0$.)

From the discussion earlier we may deduce that, if a continuous-state branching process exists, then it is associated with a particular function $\psi : [0,\infty) \to \mathbb{R}$ given by (10.6). Formally speaking, we shall refer to all such $\psi$ as branching mechanisms. We will now state without proof the Lamperti transform which, amongst other things, shows that for every branching mechanism $\psi$ there exists an associated continuous-state branching process.

Theorem 10.2 Let $\psi$ be any given branching mechanism.
(i) Suppose that $X = \{X_t : t \geq 0\}$ is a Lévy process with no negative jumps, initial position $X_0 = x$, killed at an independent exponentially distributed time with parameter $q \geq 0$, and such that $\psi(\lambda) = \log E_x(e^{-\lambda(X_1 - x)})$. Define, for $t \geq 0$,
$Y_t = X_{\theta_t \wedge \tau_0^-}$,

where $\tau_0^- = \inf\{t > 0 : X_t < 0\}$ and
$\theta_t = \inf\Big\{s > 0 : \int_0^s \frac{du}{X_u} > t\Big\}$.
Then $Y = \{Y_t : t \geq 0\}$ is a continuous-state branching process with branching mechanism $\psi$ and initial value $Y_0 = x$.
(ii) Conversely, suppose that $Y = \{Y_t : t \geq 0\}$ is a continuous-state branching process with branching mechanism $\psi$, such that $Y_0 = x > 0$. Define, for $t \geq 0$,
$X_t = Y_{\varphi_t}$,
where
$\varphi_t = \inf\Big\{s > 0 : \int_0^s Y_u\, du > t\Big\}$.
Then $X = \{X_t : t \geq 0\}$ is a Lévy process with no negative jumps, killed at the minimum of the time of first entry into $(-\infty,0)$ and an independent and exponentially distributed time with parameter $q$, with initial position $X_0 = x$ and satisfying $\psi(\lambda) = \log E_x(e^{-\lambda(X_1-x)})$.

It can be shown that a general continuous-state branching process appears as the result of an asymptotic re-scaling (in time and space) of the continuous-time Bienaymé–Galton–Watson process discussed in Sect. 9; see [14]. Roughly speaking, the Lamperti transform for continuous-state branching processes then follows as a consequence of the analogous construction being valid for the continuous-time Bienaymé–Galton–Watson process; recall the discussion in Sect. 9.

11 Long-term Behaviour

Recalling the definition of $Z = \{Z_n : n = 0,1,2,\ldots\}$, the Bienaymé–Galton–Watson process, without specifying anything further about the common distribution of the offspring, there are two events which are of immediate concern for the Markov chain $Z$: explosion and absorption. In the first case it is not clear whether or not the event $\{Z_n = \infty\}$ has positive probability for some $n$ (the latter could happen if, for example, the offspring distribution has no moments). When $P_x(Z_n < \infty) = 1$ for all $n \geq 1$ we say the process is conservative (in other words, there is no explosion). In the second case, we note from the definition of $Z$ that if $Z_n = 0$ for some $n$, then $Z_{n+m} = 0$ for all $m \geq 0$, which makes $0$ an absorbing state. As $Z_n$ is to be thought of as the size of the $n$th generation of some asexually reproducing population, the event $\{Z_n = 0 \text{ for some } n > 0\}$ is referred to as extinction.

In this section we consider the analogues of conservative behaviour and extinction within the setting of continuous-state branching processes. In addition, we shall examine the laws of the supremum and total progeny process of

continuous-state branching processes. These are the analogues of
$\sup_{n\geq 0} Z_n$ and $\Big\{\sum_{k=0}^{n} Z_k : n \geq 0\Big\}$
for the Bienaymé–Galton–Watson process. Note, in the latter case, that total progeny is interpreted as the total number of offspring to date.

11.1 Conservative Processes

A continuous-state branching process $Y = \{Y_t : t \geq 0\}$ is said to be conservative if for all $t > 0$, $P(Y_t < \infty) = 1$. The following result is taken from Grey [12].

Theorem 11.1 A continuous-state branching process with branching mechanism $\psi$ is conservative if and only if
$\int_{0+} \frac{d\xi}{|\psi(\xi)|} = \infty$.
A necessary condition is, therefore, $\psi(0) = 0$, and a sufficient condition is $\psi(0) = 0$ and $|\psi'(0+)| < \infty$ (equivalently $q = 0$ and $E|X_1| < \infty$).

Proof. From the definition of $u_t(\theta)$, a continuous-state branching process is conservative if and only if $\lim_{\theta\downarrow 0} u_t(\theta) = 0$ since, for each $x > 0$,
$P_x(Y_t < \infty) = \lim_{\theta\downarrow 0} E_x(e^{-\theta Y_t}) = \exp\big\{-x\lim_{\theta\downarrow 0} u_t(\theta)\big\}$,
where the limits are justified by monotonicity. However, note from (10.7) that as $\theta \downarrow 0$,
$t = \int_{u_t(\theta)}^{\delta} \frac{d\xi}{\psi(\xi)} + \int_{\delta}^{\theta} \frac{d\xi}{\psi(\xi)}$,
where $\delta > 0$ is sufficiently small. However, as the left-hand side is independent of $\theta$, we are forced to conclude that $\lim_{\theta\downarrow 0} u_t(\theta) = 0$ if and only if
$\int_{0+} \frac{d\xi}{|\psi(\xi)|} = \infty$.
Note that $\psi(\theta)$ may be negative in a neighbourhood of the origin, and hence the absolute value is taken in the integral. From this condition and the fact that $\psi$ is a smooth function, one sees immediately that a necessary condition for a continuous-state branching process to be conservative is that $\psi(0) = 0$; in other words, the killing rate $q = 0$. It is also apparent that a sufficient condition is that $q = 0$ and that $|\psi'(0+)| < \infty$ (so that $\psi$ is locally linear, passing through the origin). Due to the fact that $\psi(\lambda) = \log E(e^{-\lambda X_1})$, where $X$ is a Lévy process with no negative jumps, it follows that the latter condition is equivalent to $E|X_1| < \infty$, where $X$ is the Lévy process with no negative jumps associated with $\psi$.

Henceforth we shall assume that all continuous-state branching processes under consideration are conservative.
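As a worked illustration, which is not part of the original notes, consider the quadratic branching mechanism $\psi(\lambda) = c\lambda^2$ with $c > 0$ (the mechanism associated with Feller's branching diffusion). Then Grey's condition above holds,
$\int_{0+} \frac{d\xi}{|\psi(\xi)|} = \int_{0+} \frac{d\xi}{c\xi^2} = \infty,$
so the process is conservative, and (10.7) can be solved in closed form:
$\int_{u_t(\theta)}^{\theta} \frac{d\xi}{c\xi^2} = \frac{1}{c}\Big(\frac{1}{u_t(\theta)} - \frac{1}{\theta}\Big) = t \quad\Longrightarrow\quad u_t(\theta) = \frac{\theta}{1 + c\theta t}.$
One checks directly that $u_t(u_s(\theta)) = u_{t+s}(\theta)$. Note also that $u_t(\infty) := \lim_{\theta\uparrow\infty} u_t(\theta) = 1/(ct) < \infty$, which anticipates the extinction discussion of the next subsection, where it gives $P_x(Y_t = 0) = e^{-x/(ct)} \uparrow 1$ as $t\uparrow\infty$.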

11.2 Extinction Probabilities

Thanks to the representation of continuous-state branching processes given in Theorem 10.2 (i), it is clear that the latter processes observe the fundamental property that if $Y_t = 0$ for some $t > 0$, then $Y_{t+s} = 0$ for all $s \geq 0$. Let
$\zeta = \inf\{t > 0 : Y_t = 0\}$.
The event $\{\zeta < \infty\} = \{Y_t = 0 \text{ for some } t > 0\}$ is thus referred to as extinction, in line with the terminology used for the Bienaymé–Galton–Watson process. This can also be seen from the branching property (10.1). By taking $y = 0$ there, we see that $P_0$ must be the measure that assigns probability one to the process which is identically zero. Hence, by the Markov property, once in state zero the process remains in state zero.

Note from (10.4) that $u_t(\theta)$ is continuously differentiable in $\theta > 0$ (since, by dominated convergence, the same is true of the left-hand side of the aforementioned equality). Differentiating (10.4) in $\theta > 0$, we find that for each $x, t > 0$,
$E_x(Y_t e^{-\theta Y_t}) = x\,\frac{\partial u_t}{\partial\theta}(\theta)\, e^{-x u_t(\theta)}$   (11.1)
and hence, taking limits as $\theta \downarrow 0$, we obtain
$E_x(Y_t) = x\,\frac{\partial u_t}{\partial\theta}(0+),$   (11.2)
so that both sides of the equality are infinite at the same time. Differentiating (10.5) in $\theta > 0$, we also find that
$\frac{\partial}{\partial t}\frac{\partial u_t}{\partial\theta}(\theta) + \psi'(u_t(\theta))\,\frac{\partial u_t}{\partial\theta}(\theta) = 0$.
Standard techniques for first-order differential equations then imply that
$\frac{\partial u_t}{\partial\theta}(\theta) = c\, e^{-\int_0^t \psi'(u_s(\theta))\, ds},$   (11.3)
where $c > 0$ is a constant. Inspecting (11.1) as $t \downarrow 0$, we see that $c = 1$. Now taking limits as $\theta \downarrow 0$ and recalling that, for each fixed $s > 0$, $u_s(\theta) \downarrow 0$ (thanks to the assumption of conservativeness), it is straightforward to deduce from (11.2)

and (11.3) that
$E_x(Y_t) = x e^{-\psi'(0+)t},$   (11.4)
where we understand the left-hand side to be infinite whenever $\psi'(0+) = -\infty$. Note that, from the definition $\psi(\theta) = \log E(e^{-\theta X_1})$, where $X$ is a Lévy process with no negative jumps, we know that $\psi$ is convex and $\psi'(0+) \in [-\infty,\infty)$ (cf. Section 7.2). Hence, in particular, to obtain (11.4) we have used dominated convergence in the integral in (11.3) when $|\psi'(0+)| < \infty$ and monotone convergence when $\psi'(0+) = -\infty$. This leads to the following classification of continuous-state branching processes.

Definition 11.1 A continuous-state branching process with branching mechanism $\psi$ is called
(i) subcritical, if $\psi'(0+) > 0$,
(ii) critical, if $\psi'(0+) = 0$, and
(iii) supercritical, if $\psi'(0+) < 0$.

The use of the terminology criticality refers then to whether the process will, on average, decrease, remain constant or increase. The same terminology is employed for Bienaymé–Galton–Watson processes, where now the three cases in Definition 11.1 correspond to the mean of the offspring distribution being strictly less than, equal to, and strictly greater than unity, respectively. The classic result due to the scientists after which the latter process is named states that there is extinction with probability one if and only if the mean offspring size is less than or equal to unity (see Chap. I of Athreya and Ney (1972), for example). The analogous result for continuous-state branching processes might therefore read that there is extinction with probability one if and only if $\psi'(0+) \geq 0$. However, here we encounter a subtle difference for continuous-state branching processes, as the following simple example shows. In the representation given by Theorem 10.2, take $X_t = 1 - t$, corresponding to $Y_t = e^{-t}$. Clearly $\psi(\lambda) = \lambda$, so that $\psi'(0+) = 1 > 0$, and yet $Y_t > 0$ for all $t > 0$. Extinction is characterised by the following result due to Grey [12]; see also Bingham [4].

Theorem 11.2 Suppose that $Y$ is a continuous-state branching process with branching mechanism $\psi$. Let $p(x) = P_x(\zeta < \infty)$.
(i) If $\psi(\infty) < 0$, then for all $x > 0$, $p(x) = 0$.
(ii) Otherwise, when $\psi(\infty) = \infty$, $p(x) > 0$ for some (and then for all) $x > 0$ if and only if
$\int^{\infty} \frac{d\xi}{\psi(\xi)} < \infty,$
in which case $p(x) = e^{-\Phi(0)x}$, where $\Phi(0) = \sup\{\lambda \geq 0 : \psi(\lambda) = 0\}$.

Proof. (i) If $\psi(\lambda) = \log E(e^{-\lambda X_1})$, where $X$ is a subordinator, then clearly, from the path representation given in Theorem 10.2 (i), extinction occurs with probability zero. From the discussion following Theorem 10.1, the case that $X$ is a subordinator is equivalent to $\psi(\lambda) < 0$ for all $\lambda > 0$.

(ii) Since for $s, t > 0$, $\{Y_t = 0\} \subseteq \{Y_{t+s} = 0\}$, we have by monotonicity that for each $x > 0$,
$P_x(Y_t = 0) \uparrow p(x)$   (11.5)
as $t \uparrow \infty$. Hence $p(x) > 0$ if and only if $P_x(Y_t = 0) > 0$ for some $t > 0$. Since
$P_x(Y_t = 0) = e^{-x u_t(\infty)}$,
we see that $p(x) > 0$ for some (and then all) $x > 0$ if and only if $u_t(\infty) < \infty$ for some $t > 0$. Fix $t > 0$. Taking limits in (10.7) as $\theta\uparrow\infty$, we see that if $u_t(\infty) < \infty$, then it follows that
$\int^{\infty} \frac{d\xi}{\psi(\xi)} < \infty.$   (11.6)
Conversely, if the above integral holds, then, again taking limits in (10.7) as $\theta\uparrow\infty$, it must necessarily hold that $u_t(\infty) < \infty$. Finally, assuming (11.6), we have learnt that
$\int_{u_t(\infty)}^{\infty} \frac{d\xi}{\psi(\xi)} = t.$   (11.7)
From (11.5) and the fact that $u_t(\infty) = -x^{-1}\log P_x(Y_t = 0)$, we see that $u_t(\infty)$ decreases as $t\uparrow\infty$ to the largest constant $c \geq 0$ such that $\int_c^{\infty} d\xi/\psi(\xi)$ becomes infinite. Appealing to the convexity and smoothness of $\psi$, the constant $c$ must necessarily correspond to a root of $\psi$ in $[0,\infty)$, at which point $\psi$ behaves at most linearly and thus causes the integral to blow up. There are at most two such points, and the largest of these is described precisely by $c = \Phi(0) \in [0,\infty)$ (see Section 7.2). In conclusion,
$p(x) = \lim_{t\uparrow\infty} e^{-x u_t(\infty)} = e^{-\Phi(0)x},$
as required.

On account of the convexity of $\psi$, we also recover the following corollary to part (ii) of the above theorem.

Corollary 11.1 For a continuous-state branching process with branching mechanism $\psi$ satisfying $\psi(\infty) = \infty$ and
$\int^{\infty} \frac{d\xi}{\psi(\xi)} < \infty,$
we have $p(x) < 1$ for some (and then for all) $x > 0$ if and only if $\psi'(0+) < 0$.

To summarise the conclusions of Theorem 11.2 and Corollary 11.1, we have the following cases for the extinction probability $p(x)$:

Condition | p(x)
$\psi(\infty) < 0$ | $0$
$\psi(\infty) = \infty$, $\int^{\infty} d\xi/\psi(\xi) = \infty$ | $0$
$\psi(\infty) = \infty$, $\psi'(0+) < 0$, $\int^{\infty} d\xi/\psi(\xi) < \infty$ | $e^{-\Phi(0)x} \in (0,1)$
$\psi(\infty) = \infty$, $\psi'(0+) \geq 0$, $\int^{\infty} d\xi/\psi(\xi) < \infty$ | $1$

Note that, rather interestingly, assuming further that $0 < \psi'(0+) < \infty$ in the second case above, we have an instance where the path of the underlying Lévy process can be made to have a very strong tendency to move towards the origin and yet, despite this, after the Lamperti transformation has been applied, the corresponding continuous-state branching process never becomes extinct. What is happening in such cases is that, even though $\zeta = \infty$ almost surely, it is also the case that $Y_t \to 0$ almost surely as $t\uparrow\infty$.

11.3 Total Progeny and the Supremum

Thinking of a continuous-state branching process $\{Y_t : t \geq 0\}$ as the continuous-time, continuous-state analogue of the Bienaymé–Galton–Watson process, it is reasonable to refer to
$J_t := \int_0^t Y_u\, du$
as the total progeny until time $t$. In this section our main goal, given in the lemma below, is to provide distributional identities for
$J_\zeta = \int_0^\zeta Y_u\, du$ and $\sup_{s\leq\zeta} Y_s$.
Let us first recall the following notation. As noted above, for any branching mechanism $\psi$, when $\psi(\infty) = \infty$ (in other words, when the Lévy process associated with $\psi$ is not a subordinator) we have that $\psi$ is the Laplace exponent of a spectrally negative Lévy process. Let
$\Phi(q) = \sup\{\theta \geq 0 : \psi(\theta) = q\}$
(cf. Section 7.2). The following lemma is due to Bingham [4].

Lemma 11.1 Suppose that $Y$ is a continuous-state branching process with branching mechanism $\psi$ which satisfies $\psi(\infty) = \infty$. Then, for $x, q > 0$,
$E_x\Big(e^{-q\int_0^\zeta Y_s\, ds}\Big) = e^{-\Phi(q)x}$
and, in particular, for $x > 0$,
$P_x\Big(\sup_{s\leq\zeta} Y_s < \infty\Big) = e^{-\Phi(0)x}$.
The right-hand side is equal to unity if and only if $Y$ is not supercritical.

Proof. Suppose now that $X$ is the Lévy process mentioned in Theorem 10.2 (ii). Write, in the usual way, $\tau_0^- = \inf\{t > 0 : X_t < 0\}$. Then a little thought shows that
$\tau_0^- = \int_0^\zeta Y_s\, ds$.
The proof of the first part is now completed by invoking Theorem 8.1. Note that $X$ is a spectrally positive Lévy process, and hence, to implement the aforementioned result, which applies to spectrally negative processes, one must consider the problem of first passage to level $x$ of $-X$ when $-X_0 = 0$.

For the proof of the second part, note that $\sup_{s\leq\zeta} Y_s < \infty$ if and only if $\int_0^\zeta Y_s\, ds < \infty$. The result follows by taking limits as $q \downarrow 0$ in the first part, noting that this gives the probability of the latter event.

12 Conditioned Processes and Immigration

In the classical theory of Bienaymé–Galton–Watson processes where the offspring distribution is assumed to have finite mean, it is well understood that taking a critical or subcritical process (for which extinction occurs with probability one) and conditioning it in the long term to remain positive uncovers a beautiful relationship between a martingale change of measure and processes with immigration; cf. Athreya and Ney [1] and Lyons et al. [23]. Let us be a little more specific.

A Bienaymé–Galton–Watson process with immigration is defined as the Markov chain $Z^* = \{Z^*_n : n = 0,1,\ldots\}$, where $Z^*_0 = z \in \{0,1,2,\ldots\}$ and, for $n = 1,2,\ldots$,
$Z^*_n = Z_n + \sum_{k=1}^{n} Z^{(k)}_{n-k},$   (12.1)
where now $Z = \{Z_n : n \geq 0\}$ has law $P_z$ and, for each $k = 1,2,\ldots,n$, $Z^{(k)}_{n-k}$ is independent and equal in distribution to the numbers in the $(n-k)$th generation of $(Z, P_{\eta_k})$, where it is assumed that the initial numbers, $\eta_k$, are, independently of everything else, randomly distributed according to the probabilities $\{p^*_k : k = 0,1,2,\ldots\}$. Intuitively speaking, one may see the process $Z^*$ as a variant of the Bienaymé–Galton–Watson process, $Z$, in which, from the first and subsequent generations, there is a generational stream of immigrants $\{\eta_1, \eta_2, \ldots\}$, each of whom initiates an independent copy of $(Z, P_1)$.

Suppose now that $Z$ is a Bienaymé–Galton–Watson process with probabilities $\{P_x : x = 0,1,2,\ldots\}$ as described above. For any event $A$ which belongs to the sigma-algebra generated by the first $n$ generations, it turns out that, for each $x = 1,2,\ldots$,
$P^*_x(A) := \lim_{m\uparrow\infty} P_x(A \mid Z_k > 0 \text{ for } k = 0,1,\ldots,n+m)$
is well defined and, further,
$P^*_x(A) = E_x(\mathbf{1}_A M_n),$

where
$M_n = m^{-n}\, Z_n/Z_0$
and $m = E_1(Z_1)$, which is assumed to be finite. It is not difficult to show that $E_x(Z_n) = x m^n$ and that $\{M_n : n \geq 0\}$ is a martingale, using the iteration (9.1). What is perhaps more intriguing is that the new process $(Z, P^*_x)$ can be identified in two different ways:

1. The process $\{Z_n : n \geq 0\}$ under $P^*_x$ can be shown to have the same law as a Bienaymé–Galton–Watson process with immigration having $x$ initial ancestors. The immigration probabilities satisfy $p^*_k = (k+1)p_{k+1}/m$ for $k = 0,1,2,\ldots$, where $\{p_k : k = 0,1,2,\ldots\}$ is the offspring distribution of the original Bienaymé–Galton–Watson process, and immigrants initiate independent copies of $Z$.

2. The process $Z$ under $P^*_x$ has the same law as $x$ initial individuals, each independently initiating a Bienaymé–Galton–Watson process under $P_1$, together with one individual initiating an independent immortal genealogical line of descent, the spine, along which individuals reproduce with the tilted distribution $\{k p_k/m : k = 1,2,\ldots\}$. The offspring of individuals on the spine who are not themselves part of the spine initiate copies of a Bienaymé–Galton–Watson process under $P_1$. By subtracting off individuals on the spine from the aggregate population, one observes a Bienaymé–Galton–Watson process with immigration as described in (1).

Effectively, taking the second interpretation above to hand, the change of measure has adjusted the statistics on just one genealogical line of descent to ensure that it, and hence the whole process itself, is immortal. See Fig. 1.

Our aim in this section is to establish the analogue of these ideas for critical or subcritical continuous-state branching processes. This is done in Sect. 12.2. However, we first address the issue of how to condition a spectrally positive Lévy process to stay positive. Apart from being a useful comparison for the case of conditioning a continuous-state branching process, there are reasons to believe that the two classes of conditioned processes might be connected through a Lamperti-type transform, on account of the relationship given in Theorem 10.2. This is the very last point we address, in Sect. 12.3.

12.1 Conditioning a Spectrally Positive Lévy Process to Stay Positive

It is possible to talk of conditioning any Lévy process to stay positive, and this is now a well understood and well documented phenomenon, also for the case of random walks. See [2, 3, 5, 6, 7], to name but some of the most recent additions to the literature; see also [17], which considers conditioning a spectrally negative Lévy process to stay in a strip. We restrict our attention to the case of spectrally positive Lévy processes, in part because this is what is required for the forthcoming discussion and in part because this facilitates the mathematics.

Suppose that $X = \{X_t : t \geq 0\}$ is a spectrally positive Lévy process with $\psi(\lambda) = \log E(e^{-\lambda X_1})$ for all $\lambda \geq 0$. (So, as before, $\psi$ is the Laplace exponent of

the spectrally negative process $-X$.)

[Figure 1: Nodes shaded in black initiate Bienaymé–Galton–Watson processes under $P_1$. Nodes in white are individuals belonging to the immortal genealogical line of descent known as the spine. Nodes shaded in grey represent the offspring of individuals on the spine who are not themselves members of the spine. These individuals may also be considered as immigrants.]

First recall from Theorem 8.1 that for all $x > 0$,
$E_x\big(e^{-q\tau_0^-}\mathbf{1}_{(\tau_0^- < \infty)}\big) = e^{-\Phi(q)x},$   (12.2)
where, as usual, $\tau_0^- = \inf\{t > 0 : X_t < 0\}$ and $\Phi$ is the right inverse of $\psi$. In particular, when $\psi'(0+) < 0$, so that $\lim_{t\uparrow\infty} X_t = \infty$, we have that $\Phi(0) > 0$ and
$P_x(\tau_0^- = \infty) = 1 - e^{-\Phi(0)x}$.
In that case, for any $A \in \mathcal{F}_t$, we may simply apply Bayes' formula and the Markov property, respectively, to deduce that for all $x > 0$,
$P^{\uparrow}_x(A) := P_x(A \mid \tau_0^- = \infty) = \frac{E_x\big(\mathbf{1}_{(A,\, t<\tau_0^-)}\, P(\tau_0^- = \infty \mid \mathcal{F}_t)\big)}{P_x(\tau_0^- = \infty)} = E_x\Big(\mathbf{1}_{(A,\, t<\tau_0^-)}\, \frac{1 - e^{-\Phi(0)X_t}}{1 - e^{-\Phi(0)x}}\Big),$

thus giving sense to conditioning $X$ to stay positive. If, however, $\psi'(0+) \geq 0$, in other words $\liminf_{t\uparrow\infty} X_t = -\infty$, then the above calculation is not possible as $\Phi(0) = 0$, and it is less clear what it means to condition the process to stay positive. The sense in which this may be understood is given in [5].

Theorem 12.1 Suppose that $\mathbf{e}_q$ is an exponentially distributed random variable with parameter $q$, independent of $X$. Suppose that $\psi'(0+) \geq 0$. For all $x, t > 0$ and $A \in \mathcal{F}_t$,
$P^{\uparrow}_x(A) := \lim_{q\downarrow 0} P_x(A,\ t < \mathbf{e}_q \mid \tau_0^- > \mathbf{e}_q)$
exists and satisfies
$P^{\uparrow}_x(A) = E_x\Big(\mathbf{1}_{(A,\, t<\tau_0^-)}\, \frac{X_t}{x}\Big)$.

Proof. Again appealing to Bayes' formula followed by the Markov property, in conjunction with the lack of memory property and (12.2), we have
$P_x(A,\ t < \mathbf{e}_q \mid \tau_0^- > \mathbf{e}_q) = \frac{P_x(A,\ t < \mathbf{e}_q,\ \tau_0^- > \mathbf{e}_q)}{P_x(\tau_0^- > \mathbf{e}_q)} = \frac{E_x\big(\mathbf{1}_{(A,\, t < \mathbf{e}_q \wedge \tau_0^-)}\, P(\tau_0^- > \mathbf{e}_q \mid \mathcal{F}_t)\big)}{E_x(1 - e^{-q\tau_0^-})} = E_x\Big(\mathbf{1}_{(A,\, t<\tau_0^-)}\, e^{-qt}\, \frac{1 - e^{-\Phi(q)X_t}}{1 - e^{-\Phi(q)x}}\Big).$   (12.3)
Under the assumption $\psi'(0+) \geq 0$, we know that $\Phi(0) = 0$ and hence, by l'Hôpital's rule,
$\lim_{q\downarrow 0} \frac{1 - e^{-\Phi(q)X_t}}{1 - e^{-\Phi(q)x}} = \frac{X_t}{x}.$   (12.4)
Note also that, for all $q$ sufficiently small,
$\frac{1 - e^{-\Phi(q)X_t}}{1 - e^{-\Phi(q)x}} \leq \frac{\Phi(q)X_t}{1 - e^{-\Phi(q)x}} \leq C\, \frac{X_t}{x},$
where $C > 0$ is a constant. The condition $\psi'(0+) \geq 0$ also implies that, for all $t > 0$, $E(|X_t|) < \infty$, and hence by dominated convergence we may take limits in (12.3) as $q \downarrow 0$ and apply (12.4) to deduce the result.

It is interesting to note that, whilst $P^{\uparrow}_x$ is a probability measure for each $x > 0$ when $\psi'(0+) < 0$, this is not necessarily the case when $\psi'(0+) \geq 0$. The following lemma gives a precise account.

Lemma 12.1 Fix $x > 0$. When $\psi'(0+) = 0$, $P^{\uparrow}_x$ is a probability measure, and when $\psi'(0+) > 0$, $P^{\uparrow}_x$ is a sub-probability measure.

Proof. All that is required to be shown is that, for each $t > 0$, $E_x(\mathbf{1}_{(t<\tau_0^-)}X_t) = x$ for $P^{\uparrow}_x$ to be a probability measure, and $E_x(\mathbf{1}_{(t<\tau_0^-)}X_t) < x$ for a sub-probability measure. To this end, recall from the proof of Theorem 12.1 that
$E_x\Big(\mathbf{1}_{(t<\tau_0^-)}\frac{X_t}{x}\Big) = \lim_{q\downarrow 0} P_x(t < \mathbf{e}_q \mid \tau_0^- > \mathbf{e}_q) = \lim_{q\downarrow 0} \frac{\int_t^{\infty} q e^{-qu}\, P_x(\tau_0^- > u)\, du}{1 - e^{-\Phi(q)x}} = \lim_{q\downarrow 0} \frac{q}{\Phi(q)x}\int_t^{\infty} e^{-qu}\, P_x(\tau_0^- > u)\, du.$
Since $q/\Phi(q) \to \psi'(0+)$ as $q \downarrow 0$, it is now clear that when $\psi'(0+) = 0$ the right-hand side above is equal to unity and otherwise it is strictly less than unity, thus distinguishing the case of a probability measure from a sub-probability measure.

Note that when $E_x(\mathbf{1}_{(t<\tau_0^-)}X_t) = x$, an easy application of the Markov property implies that $\{\mathbf{1}_{(t<\tau_0^-)}X_t/x : t \geq 0\}$ is a unit-mean $P_x$-martingale, so that $P^{\uparrow}_x$ is obtained by a martingale change of measure. Similarly, when $E_x(\mathbf{1}_{(t<\tau_0^-)}X_t) \leq x$, the latter process is a supermartingale.

On a final note, the reader may be curious as to how one characterises spectrally positive Lévy processes, and indeed a general Lévy process, conditioned to stay positive when the initial value is $x = 0$. In general this is a non-trivial issue, but it is possible by considering the weak limit of the measures $P^{\uparrow}_x$, as $x \downarrow 0$, as a measure on the space of paths that are right continuous with left limits. The interested reader should consult [7] for the most recent and up-to-date account.

12.2 Conditioning a (Sub)critical Continuous-State Branching Process to Stay Positive

Let us now progress to the conditioning of continuous-state branching processes to stay positive, following closely Chap. 3 of [18]. We continue to adopt the notation of Sect. 11. Our interest is restricted to the case that there is extinction with probability one for all initial values $x > 0$. According to Corollary 11.1, this corresponds to
$\psi(\infty) = \infty$, $\psi'(0+) \geq 0$ and $\int^{\infty}\frac{d\xi}{\psi(\xi)} < \infty$,
and henceforth we assume that these conditions are in force. For notational convenience we also set $\rho = \psi'(0+)$.
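Before moving on, here is a quick check, not part of the original text, that the standing assumptions are easily satisfied. Take $\psi(\lambda) = b\lambda + c\lambda^2$ with $b \geq 0$ and $c > 0$ (a Feller branching diffusion with an added linear death rate $b$). Then
$\psi(\infty) = \infty, \qquad \rho = \psi'(0+) = b \geq 0, \qquad \int^{\infty}\frac{d\xi}{\psi(\xi)} = \int^{\infty}\frac{d\xi}{b\xi + c\xi^{2}} < \infty,$
so the conditions above hold. In particular, Theorem 12.2 below applies with $\rho = b$, and $\{e^{bt}Y_t : t \geq 0\}$ is the martingale driving the change of measure.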

Theorem 12.2 Suppose that $Y = \{Y_t : t \geq 0\}$ is a continuous-state branching process with branching mechanism $\psi$ satisfying the above conditions. For each event $A \in \sigma(Y_s : s \leq t)$ and $x > 0$,
$P^{\uparrow}_x(A) := \lim_{s\uparrow\infty} P_x(A \mid \zeta > t + s)$
is well defined as a probability measure and satisfies
$P^{\uparrow}_x(A) = E_x\Big(\mathbf{1}_A\, e^{\rho t}\, \frac{Y_t}{x}\Big).$
In particular, $P^{\uparrow}_x(\zeta < \infty) = 0$ and $\{e^{\rho t}Y_t : t \geq 0\}$ is a martingale.

Proof. From the proof of Theorem 11.2 we have seen that for $x > 0$,
$P_x(\zeta \leq t) = P_x(Y_t = 0) = e^{-x u_t(\infty)}$,
where $u_t(\theta)$ satisfies (10.7). Crucial to the proof will be the convergence
$\lim_{s\uparrow\infty} \frac{u_s(\infty)}{u_{t+s}(\infty)} = e^{\rho t}$   (12.5)
for each $t > 0$, and hence we first show that this result holds. To this end, note from (11.7) that
$\int_{u_{t+s}(\infty)}^{u_s(\infty)} \frac{d\xi}{\psi(\xi)} = t.$
On the other hand, recall from the proof of Theorem 11.2 that $u_t(\infty)$ is decreasing to $\Phi(0) = 0$ as $t \uparrow \infty$. Hence, since $\lim_{\xi\downarrow 0}\psi(\xi)/\xi = \psi'(0+) = \rho$, it follows that
$\log\frac{u_s(\infty)}{u_{t+s}(\infty)} = \int_{u_{t+s}(\infty)}^{u_s(\infty)} \frac{d\xi}{\xi} = \int_{u_{t+s}(\infty)}^{u_s(\infty)} \frac{\psi(\xi)}{\xi}\,\frac{d\xi}{\psi(\xi)} \to \rho t$
as $s\uparrow\infty$, thus proving the claim.

With (12.5) in hand, we may now proceed to note that
$\lim_{s\uparrow\infty} \frac{1 - e^{-Y_t u_s(\infty)}}{1 - e^{-x u_{t+s}(\infty)}} = \frac{Y_t}{x}\, e^{\rho t}.$
In addition, for $s$ sufficiently large,
$\frac{1 - e^{-Y_t u_s(\infty)}}{1 - e^{-x u_{t+s}(\infty)}} \leq \frac{Y_t u_s(\infty)}{1 - e^{-x u_{t+s}(\infty)}} \leq C\, \frac{Y_t}{x}\, e^{\rho t}$
for some $C > 0$. Hence we may now apply the Markov property and then the dominated convergence theorem to deduce that
$\lim_{s\uparrow\infty} P_x(A \mid \zeta > t + s) = \lim_{s\uparrow\infty} E_x\Big(\mathbf{1}_{(A,\,\zeta>t)}\, \frac{P_{Y_t}(\zeta > s)}{P_x(\zeta > t+s)}\Big) = \lim_{s\uparrow\infty} E_x\Big(\mathbf{1}_{(A,\,\zeta>t)}\, \frac{1 - e^{-Y_t u_s(\infty)}}{1 - e^{-x u_{t+s}(\infty)}}\Big) = E_x\Big(\mathbf{1}_{(A,\,\zeta>t)}\, \frac{Y_t}{x}\, e^{\rho t}\Big).$

Note that we may remove the qualification $\{t < \zeta\}$ from the indicator on the right-hand side above, as $Y_t = 0$ on $\{t \geq \zeta\}$. To show that $P^{\uparrow}_x$ is a probability measure, it suffices to show that for each $x, t > 0$, $E_x(Y_t) = e^{-\rho t}x$. However, the latter was already proved in (11.4). A direct consequence of this is that $P^{\uparrow}_x(\zeta > t) = 1$ for all $t \geq 0$, which implies that $P^{\uparrow}_x(\zeta < \infty) = 0$. The fact that $\{e^{\rho t}Y_t : t \geq 0\}$ is a martingale follows in the usual way from the consistency of Radon–Nikodym densities. Alternatively, it follows directly from (11.4) by applying the Markov property as follows. For $0 \leq s \leq t$,
$E_x\big(e^{\rho t}Y_t \mid \sigma(Y_u : u \leq s)\big) = e^{\rho s}\, E_{Y_s}\big(e^{\rho(t-s)}Y_{t-s}\big) = e^{\rho s}\, Y_s,$
which establishes the martingale property.

Note that in older literature the process $(Y, P^{\uparrow}_x)$ is called the Q-process; see for example [1]. We have thus far seen that conditioning a (sub)critical continuous-state branching process to stay positive can be performed mathematically in a similar way to conditioning a spectrally positive Lévy process to stay positive. Our next objective is to show that, in an analogous sense to what has been discussed for Bienaymé–Galton–Watson processes, the conditioned process has the same law as a continuous-state branching process with immigration. Let us spend a little time giving a mathematical description of the latter.

12.3 Continuous-state branching processes with immigration

In general, we define a Markov process $Y^* = \{Y^*_t : t \geq 0\}$ with probabilities $\{P^*_x : x \geq 0\}$ to be a continuous-state branching process with branching mechanism $\psi$ and immigration mechanism $\phi$ if it is $[0,\infty)$-valued, has paths that are right continuous with left limits and, for all $x, t > 0$ and $\theta \geq 0$,
$E^*_x(e^{-\theta Y^*_t}) = \exp\Big\{-x u_t(\theta) - \int_0^t \phi(u_{t-s}(\theta))\, ds\Big\},$   (12.6)
where $u_t(\theta)$ is the unique solution to (10.5) and $\phi$ is the Laplace exponent of any subordinator. Specifically, for $\theta \geq 0$,
$\phi(\theta) = \mathrm{d}\theta + \int_{(0,\infty)} (1 - e^{-\theta x})\, \Lambda(dx),$
where $\mathrm{d} \geq 0$ and $\Lambda$ is a measure concentrated on $(0,\infty)$ satisfying $\int_{(0,\infty)}(1\wedge x)\,\Lambda(dx) < \infty$.

It is possible to see how the above definition plays an analogous role to (12.1) by considering the following sample calculations (which also show the existence of continuous-state branching processes with immigration). Suppose that $S = \{S_t : t \geq 0\}$ under $P$ is a pure jump subordinator with Laplace exponent $\phi(\theta)$ (hence $\mathrm{d} = 0$). Now define a process
$Y^*_t = Y_t + \sum_{u\leq t} Y^{(\Delta S_u)}_{t-u},$

where $\Delta S_u = S_u - S_{u-}$, so that $\Delta S_u = 0$ at all but a countable number of $u \in [0,t]$, $Y_t$ is a continuous-state branching process and, for each $(u, \Delta S_u)$, the quantity $Y^{(\Delta S_u)}_{t-u}$ is an independent copy of the process $(Y, P_x)$ at time $t-u$ with $x = \Delta S_u$. We immediately see that $Y^* = \{Y^*_t : t \geq 0\}$ is a natural analogue of (12.1), where now the subordinator $S_t$ plays the role of $\sum_{i=1}^{n}\eta_i$, the total number of immigrants in $Z^*$ up to and including generation $n$. It is not difficult to see that it is also a Markov process. Let us proceed further to compute its Laplace exponent. To this end, let us establish briefly the following result.

Lemma 12.2 Consider a compound Poisson process with rate $\lambda$ and positive jumps $\{\xi_i : i \geq 1\}$ with common distribution $F$. Then, for any continuous $\alpha : [0,\infty) \to [0,\infty)$, we have
$E\Big(e^{-\sum_{i=1}^{N_t}\xi_i\alpha(t-\tau_i)}\Big) = \exp\Big\{-\lambda\int_0^t\int_{(0,\infty)}\big(1 - e^{-x\alpha(t-s)}\big)\,F(dx)\,ds\Big\},$
where $\{\tau_i : i = 1,2,\ldots\}$ are the arrival times in the underlying Poisson process $N = \{N_t : t \geq 0\}$.

Proof. We have
$E\Big(e^{-\sum_{i=1}^{N_t}\xi_i\alpha(t-\tau_i)}\Big) = E\Big(\prod_{i=1}^{N_t} E\big(e^{-\alpha(t-\tau_i)\xi_1} \,\big|\, \tau_i\big)\Big) = E\Big(\prod_{i=1}^{N_t}\int_{(0,\infty)} e^{-\alpha(t-\tau_i)x}\,F(dx)\Big)$
$= \sum_{n\geq 0} e^{-\lambda t}\frac{(\lambda t)^n}{n!}\, E\Big(\prod_{i=1}^{n}\int_{(0,\infty)} e^{-\alpha(t-\tau_i)x}\,F(dx) \,\Big|\, N_t = n\Big)$
$= \sum_{n\geq 0} e^{-\lambda t}\frac{(\lambda t)^n}{n!}\,\Big(\frac{1}{t}\int_0^t\int_{(0,\infty)} e^{-\alpha(t-s)x}\,F(dx)\,ds\Big)^n$
$= \exp\Big\{-\lambda\int_0^t\int_{(0,\infty)}\big(1 - e^{-x\alpha(t-s)}\big)\,F(dx)\,ds\Big\},$
where in the fourth equality we have used the classical result that, given $N_t = n$, the arrival times have the joint distribution of $n$ ranked i.i.d. random variables, each uniformly distributed on $[0,t]$.

Returning to the computation of the Laplace exponent of $Y^*_t$, suppose that $P^*_x$ is the law of $Y^*$ when $Y_0 = Y^*_0 = x$. Then, with $E^*_x$ as the associated expectation operator, for all $\theta \geq 0$,
$E^*_x(e^{-\theta Y^*_t}) = E_x\Big(e^{-\theta Y_t}\prod_{v\leq t} E\big(e^{-\theta Y^{(\Delta S_v)}_{t-v}} \,\big|\, S\big)\Big),$

where the interchange of the product and the conditional expectation is a consequence of monotone convergence. (Indeed, for each $\varepsilon > 0$, the Lévy–Itô decomposition tells us that
$E\Big(\prod_{u\leq t:\,\Delta S_u>\varepsilon} e^{-\theta Y^{(\Delta S_u)}_{t-u}} \,\Big|\, S\Big) = \prod_{u\leq t:\,\Delta S_u>\varepsilon} E\big(e^{-\theta Y^{(\Delta S_u)}_{t-u}} \,\big|\, S\big)$
due to there being a finite number of independent jumps greater than $\varepsilon$; now take limits as $\varepsilon \downarrow 0$ and apply monotone convergence.) Continuing this calculation we have
$E^*_x(e^{-\theta Y^*_t}) = E_x(e^{-\theta Y_t})\, E\Big(\prod_{v\leq t} E_{\Delta S_v}\big(e^{-\theta Y_{t-v}}\big)\Big) = e^{-x u_t(\theta)}\, E\Big(\prod_{v\leq t} e^{-\Delta S_v u_{t-v}(\theta)}\Big)$
$= e^{-x u_t(\theta)}\, \lim_{\varepsilon\downarrow 0} E\Big(e^{-\sum_{v\leq t}\Delta S_v\mathbf{1}_{\{\Delta S_v>\varepsilon\}} u_{t-v}(\theta)}\Big)$
$= e^{-x u_t(\theta)}\, \lim_{\varepsilon\downarrow 0}\exp\Big\{-\Lambda(\varepsilon,\infty)\int_0^t\int_{(\varepsilon,\infty)}\big(1 - e^{-x u_{t-s}(\theta)}\big)\,\frac{\Lambda(dx)}{\Lambda(\varepsilon,\infty)}\,ds\Big\}$
$= \exp\Big\{-x u_t(\theta) - \int_0^t \phi(u_{t-s}(\theta))\, ds\Big\},$
where the penultimate equality follows from Lemma 12.2, on account of the fact that $\sum_{v\leq t}\Delta S_v\mathbf{1}_{\{\Delta S_v>\varepsilon\}}$ is a compound Poisson process with arrival rate $\Lambda(\varepsilon,\infty)$ and jump distribution $\Lambda(\varepsilon,\infty)^{-1}\Lambda(dx)$ on $(\varepsilon,\infty)$.

Allowing a drift component in $\phi$ introduces some lack of clarity with regard to a path-wise construction of $Y^*$ in the manner shown above (and hence its existence). Intuitively speaking, if $\mathrm{d}$ is the drift of the underlying subordinator, then the term $\mathrm{d}\int_0^t u_{t-s}(\theta)\,ds$, which appears in the Laplace exponent of (12.6), may be thought of as due to a continuum immigration where, with rate $\mathrm{d}$, in each $dt$ an independent copy of the process $Y$ immigrates with an infinitesimally small initial value. The problem with this intuitive picture is that there is an uncountable number of immigrating processes, which creates measurability problems when trying to construct the aggregate integrated mass that has immigrated up to time $t$. Nonetheless, [19] gives a path-wise construction with the help of excursion theory and Itô synthesis; a technique which goes beyond the scope of this text.

Returning to the relationship between processes with immigration and conditioned processes, we see that the existence of a process $Y^*$ with an immigration mechanism containing drift can otherwise be seen from the following lemma.

Lemma 12.3 Fix $x > 0$. Suppose that $(Y, P_x)$ is a continuous-state branching process with branching mechanism $\psi$. Then $(Y, P^{\uparrow}_x)$ has the same law as a continuous-state branching process with branching mechanism $\psi$ and immigration mechanism $\phi$, where for $\theta \geq 0$,
$\phi(\theta) = \psi'(\theta) - \rho.$

Proof. Fix $x > 0$. Clearly $(Y, P^{\uparrow}_x)$ has paths that are right continuous with left limits since, for each $t > 0$, when restricted to $\sigma(Y_s : s \leq t)$ we have $P^{\uparrow}_x \ll P_x$. Next we compute the Laplace exponent of $Y_t$ under $P^{\uparrow}_x$, making use of (10.4):
$E^{\uparrow}_x(e^{-\theta Y_t}) = E_x\Big(e^{\rho t}\frac{Y_t}{x}\, e^{-\theta Y_t}\Big) = -\frac{e^{\rho t}}{x}\frac{\partial}{\partial\theta} E_x(e^{-\theta Y_t}) = -\frac{e^{\rho t}}{x}\frac{\partial}{\partial\theta} e^{-x u_t(\theta)} = e^{\rho t}\, e^{-x u_t(\theta)}\, \frac{\partial u_t}{\partial\theta}(\theta).$   (12.7)
Recall from (11.3) that
$\frac{\partial u_t}{\partial\theta}(\theta) = e^{-\int_0^t \psi'(u_s(\theta))\, ds} = e^{-\int_0^t \psi'(u_{t-s}(\theta))\, ds},$
in which case we may identify, with the help of (10.6),
$\phi(\theta) = \psi'(\theta) - \rho = \sigma^2\theta + \int_{(0,\infty)}(1 - e^{-\theta x})\, x\,\Pi(dx).$
The latter is the Laplace exponent of a subordinator with drift $\sigma^2$ and Lévy measure $x\Pi(dx)$.

Looking again to the analogy with conditioned Bienaymé–Galton–Watson processes, it is natural to ask if there is any way to decompose the conditioned process so as to identify the analogue of the genealogical line of descent, earlier referred to as the spine, along which copies of the original process immigrate. This is possible, but again somewhat beyond the scope of this text. We refer the reader instead to [9] and [19].

Finally, as promised earlier, we show the connection between $(X, P^{\uparrow}_x)$ and $(Y, P^{\uparrow}_x)$ for each $x > 0$. We are only able to make a statement for the case that $\psi'(0+) = 0$.

Lemma 12.4 Suppose that $Y = \{Y_t : t \geq 0\}$ is a continuous-state branching process with branching mechanism $\psi$. Suppose further that $X = \{X_t : t \geq 0\}$ is a spectrally positive Lévy process with Laplace exponent $\psi(\theta) = \log E(e^{-\theta X_1})$ for $\theta \geq 0$. Fix $x > 0$. If $\psi'(0+) = 0$ and
$\int^{\infty}\frac{d\xi}{\psi(\xi)} < \infty,$
then
(i) the process $\{X_{\theta_t} : t \geq 0\}$ under $P^{\uparrow}_x$ has the same law as $(Y, P^{\uparrow}_x)$, where
$\theta_t = \inf\Big\{s > 0 : \int_0^s \frac{du}{X_u} > t\Big\},$

(ii) the process $\{Y_{\varphi_t} : t \geq 0\}$ under $P^{\uparrow}_x$ has the same law as $(X, P^{\uparrow}_x)$, where
$\varphi_t = \inf\Big\{s > 0 : \int_0^s Y_u\, du > t\Big\}.$

Proof. (i) It is easy to show that $\theta_t$ is a stopping time with respect to the filtration $\{\mathcal{F}_t : t \geq 0\}$ of $X$. Using Theorem 12.1 and the Lamperti transform, we have that if $F(X_{\theta_s} : s \leq t)$ is a non-negative measurable functional of $X$, then for each $x > 0$,
$E^{\uparrow}_x\big(F(X_{\theta_s} : s \leq t)\,\mathbf{1}_{(\theta_t<\infty)}\big) = E_x\Big(\frac{X_{\theta_t}}{x}\, F(X_{\theta_s} : s \leq t)\,\mathbf{1}_{(\theta_t<\tau_0^-)}\Big) = E_x\Big(\frac{Y_t}{x}\, F(Y_s : s \leq t)\,\mathbf{1}_{(t<\zeta)}\Big) = E^{\uparrow}_x\big(F(Y_s : s \leq t)\big).$
(ii) The proof of the second part is a very similar argument and is left to the reader.

13 Concluding Remarks

It would be impossible to complete this chapter without mentioning that the material presented above is but the tip of the iceberg of a much grander theory of continuous-time branching processes. Suppose that in the continuous-time Bienaymé–Galton–Watson process we allowed individuals to move around independently according to some Markov process; then we would have an example of a spatial Markov branching particle process. If continuous-state branching processes are the continuous-state analogue of continuous-time Bienaymé–Galton–Watson processes, then what is the analogue of a spatial Markov branching particle process? The answer to this question opens the door to the world of measure-valued diffusions (or superprocesses) which, apart from its implicit probabilistic and mathematical interest, has many consequences from the point of view of mathematical biology, genetics and statistical physics. The interested reader is referred to the excellent monographs of Etheridge [11], Le Gall [22] and Duquesne and Le Gall [10] for an introduction.

Exercises

Exercise 9 In this exercise, we characterise the Laplace exponent of the continuous-time Markov branching process $Y$ described in Sect. 9.
(i) Show that for $\phi > 0$ and $t \geq 0$ there exists some function $u_t(\phi) > 0$ satisfying
$E_y(e^{-\phi Y_t}) = e^{-y u_t(\phi)},$
where $y \in \{1,2,\ldots\}$.

(ii) Show that for $s, t \geq 0$, $u_{t+s}(\phi) = u_s(u_t(\phi))$.
(iii) Appealing to the infinitesimal behaviour of the Markov chain $Y$, show that
$\frac{\partial u_t(\phi)}{\partial t} = \psi(u_t(\phi))$ and $u_0(\phi) = \phi$,
where
$\psi(q) = \lambda\int_{[-1,\infty)}(1 - e^{-qx})\,F(dx)$
and $F$ is given in (9.3).

Exercise 10 This exercise is due to Prof. A.G. Pakes. Suppose that $Y = \{Y_t : t \geq 0\}$ is a continuous-state branching process with branching mechanism
$\psi(\theta) = c\theta - \int_{(0,\infty)}(1 - e^{-\theta x})\,\lambda F(dx),$
where $c, \lambda > 0$ and $F$ is a probability distribution concentrated on $(0,\infty)$. Assume further that $\psi'(0+) > 0$ (hence $Y$ is subcritical).
(i) Show that $Y$ survives with probability one.
(ii) Show that for all $t$ sufficiently large, $Y_t = \Delta e^{-ct}$, where $\Delta$ is a strictly positive random variable.

Exercise 11 This exercise is taken from [18]. Suppose that $Y$ is a conservative continuous-state branching process with branching mechanism $\psi$ (we shall adopt the same notation as in the main text of this chapter). Suppose that $\psi(\infty) = \infty$ (so that the underlying Lévy process is not a subordinator),
$\int^{\infty}\frac{d\xi}{\psi(\xi)} < \infty$ and $\rho := \psi'(0+) \geq 0$.
(i) Using (10.7), show that one may write, for each $t, x > 0$ and $\theta > 0$,
$E^{\uparrow}_x(e^{-\theta Y_t}) = e^{-x u_t(\theta) + \rho t}\,\frac{\psi(u_t(\theta))}{\psi(\theta)},$
which is a slightly different representation to (12.6) used in the text.
(ii) Assume that $\rho = 0$. Show that for each $x > 0$,
$P^{\uparrow}_x\big(\lim_{t\uparrow\infty} Y_t = \infty\big) = 1.$
(Hint: you may use the conclusion of Exercise ?? (iii).)

(iii) Now assume that $\rho > 0$. Recalling that the convexity of $\psi$ implies that $\int_{(1,\infty)} x\,\Pi(dx) < \infty$ (cf. Sect. ??), show that
$\int_0^{\theta}\frac{\psi(\xi) - \rho\xi}{\xi^2}\,d\xi = \tfrac{1}{2}\sigma^2\theta + \int_{(0,\infty)} x\,\Pi(dx)\int_0^{\theta x}\frac{e^{-\lambda} - 1 + \lambda}{\lambda^2}\,d\lambda.$
Hence, using the fact that $\psi(\xi) \sim \rho\xi$ as $\xi \downarrow 0$, show that
$\int_{(1,\infty)} x\log x\,\Pi(dx) < \infty$ if and only if $\int_{0+}\Big(\frac{1}{\rho\xi} - \frac{1}{\psi(\xi)}\Big)\,d\xi < \infty.$
(iv) Keeping with the assumption that $\rho > 0$ and $x > 0$, show that $Y_t \to \infty$ in $P^{\uparrow}_x$-probability if $\int_{(1,\infty)} x\log x\,\Pi(dx) = \infty$, and otherwise that $Y_t$ converges in distribution under $P^{\uparrow}_x$, as $t\uparrow\infty$, to a non-negative random variable $Y_\infty$ with Laplace transform
$E^{\uparrow}_x(e^{-\theta Y_\infty}) = \frac{\rho\theta}{\psi(\theta)}\exp\Big\{-\rho\int_0^{\theta}\Big(\frac{1}{\rho\xi} - \frac{1}{\psi(\xi)}\Big)\,d\xi\Big\}.$

References

[1] Athreya, K.B. and Ney, P. (1972) Branching Processes. Springer, Berlin Heidelberg New York.
[2] Bertoin, J. (1993) Splitting at the infimum and excursions in half-lines for random walks and Lévy processes. Stochast. Process. Appl. 47.
[3] Bertoin, J. and Doney, R.A. (1994) On conditioning a random walk to stay nonnegative. Ann. Probab. 22.
[4] Bingham, N.H. (1976) Continuous branching processes and spectral positivity. Stochast. Process. Appl. 4.
[5] Chaumont, L. (1994) Sur certains processus de Lévy conditionnés à rester positifs. Stoch. Stoch. Rep. 47.
[6] Chaumont, L. (1996) Conditionings and path decompositions for Lévy processes. Stochast. Process. Appl. 64.
[7] Chaumont, L. and Doney, R.A. (2005) On Lévy processes conditioned to stay positive. Electron. J. Probab. 10.
[8] Duquesne, T. (2003) Path decompositions for real Lévy processes. Ann. Inst. H. Poincaré 39.
[9] Duquesne, T. (2006) Continuum random trees and branching processes with immigration. To appear in Stochast. Process. Appl.
[10] Duquesne, T. and Le Gall, J.-F. (2002) Random Trees, Lévy Processes and Spatial Branching Processes. Astérisque no. 281.
[11] Etheridge, A.M. (2000) An Introduction to Superprocesses. American Mathematical Society, Providence, RI.
[12] Grey, D.R. (1974) Asymptotic behaviour of continuous time, continuous state-space branching processes. J. Appl. Probab. 11.
[13] Heyde, C.C. and Seneta, E. (1977) I.J. Bienaymé: Statistical Theory Anticipated. Studies in the History of Mathematics and Physical Sciences, No. 3. Springer, Berlin Heidelberg New York.
[14] Jirina, M. (1958) Stochastic branching processes with continuous state space. Czech. Math. J. 8.
[15] Kingman, J.F.C. (1993) Poisson Processes. Oxford University Press, Oxford.
[16] Konstantopoulos, T., Last, G. and Lin, S.J. (2004) Non-product form and tail asymptotics for a class of Lévy stochastic networks. Queueing Syst. 46.

[17] Lambert, A. (2000) Completely asymmetric Lévy processes confined in a finite interval. Ann. Inst. H. Poincaré 36.
[18] Lambert, A. (2000) Arbres, excursions, et processus de Lévy complètement asymétriques. Thèse de doctorat de l'Université Pierre et Marie Curie, Paris.
[19] Lambert, A. (2002) The genealogy of continuous-state branching processes with immigration. Probab. Theory Relat. Fields 122.
[20] Lamperti, J. (1967a) Continuous-state branching processes. Bull. Am. Math. Soc. 73.
[21] Lamperti, J. (1967b) The limit of a sequence of branching processes. Zeit. Wahrsch. Verw. Gebiete 7.
[22] Le Gall, J.-F. (1999) Spatial Branching Processes, Random Snakes and Partial Differential Equations. Lectures in Mathematics, ETH Zürich. Birkhäuser.
[23] Lyons, R., Pemantle, R. and Peres, Y. (1995) Conceptual proofs of L log L criteria for mean behaviour of branching processes. Ann. Probab. 23.
[24] Silverstein, M.L. (1968) A new approach to local time. J. Math. Mech. 17.
[25] Watson, H.W. and Galton, F. (1874) On the probability of the extinction of families. J. Anthropol. Inst. Great Britain Ireland 4.


More information

RECENT RESULTS FOR SUPERCRITICAL CONTROLLED BRANCHING PROCESSES WITH CONTROL RANDOM FUNCTIONS

RECENT RESULTS FOR SUPERCRITICAL CONTROLLED BRANCHING PROCESSES WITH CONTROL RANDOM FUNCTIONS Pliska Stud. Math. Bulgar. 16 (2004), 43-54 STUDIA MATHEMATICA BULGARICA RECENT RESULTS FOR SUPERCRITICAL CONTROLLED BRANCHING PROCESSES WITH CONTROL RANDOM FUNCTIONS Miguel González, Manuel Molina, Inés

More information

process on the hierarchical group

process on the hierarchical group Intertwining of Markov processes and the contact process on the hierarchical group April 27, 2010 Outline Intertwining of Markov processes Outline Intertwining of Markov processes First passage times of

More information

DR.RUPNATHJI( DR.RUPAK NATH )

DR.RUPNATHJI( DR.RUPAK NATH ) Contents 1 Sets 1 2 The Real Numbers 9 3 Sequences 29 4 Series 59 5 Functions 81 6 Power Series 105 7 The elementary functions 111 Chapter 1 Sets It is very convenient to introduce some notation and terminology

More information

Reflected Brownian Motion

Reflected Brownian Motion Chapter 6 Reflected Brownian Motion Often we encounter Diffusions in regions with boundary. If the process can reach the boundary from the interior in finite time with positive probability we need to decide

More information

Poisson random measure: motivation

Poisson random measure: motivation : motivation The Lévy measure provides the expected number of jumps by time unit, i.e. in a time interval of the form: [t, t + 1], and of a certain size Example: ν([1, )) is the expected number of jumps

More information

Logarithmic scaling of planar random walk s local times

Logarithmic scaling of planar random walk s local times Logarithmic scaling of planar random walk s local times Péter Nándori * and Zeyu Shen ** * Department of Mathematics, University of Maryland ** Courant Institute, New York University October 9, 2015 Abstract

More information

EXTINCTION TIMES FOR A GENERAL BIRTH, DEATH AND CATASTROPHE PROCESS

EXTINCTION TIMES FOR A GENERAL BIRTH, DEATH AND CATASTROPHE PROCESS (February 25, 2004) EXTINCTION TIMES FOR A GENERAL BIRTH, DEATH AND CATASTROPHE PROCESS BEN CAIRNS, University of Queensland PHIL POLLETT, University of Queensland Abstract The birth, death and catastrophe

More information

Introduction and Preliminaries

Introduction and Preliminaries Chapter 1 Introduction and Preliminaries This chapter serves two purposes. The first purpose is to prepare the readers for the more systematic development in later chapters of methods of real analysis

More information

arxiv: v1 [math.pr] 20 May 2018

arxiv: v1 [math.pr] 20 May 2018 arxiv:180507700v1 mathr 20 May 2018 A DOOB-TYE MAXIMAL INEQUALITY AND ITS ALICATIONS TO VARIOUS STOCHASTIC ROCESSES Abstract We show how an improvement on Doob s inequality leads to new inequalities for

More information

ONLINE APPENDIX TO: NONPARAMETRIC IDENTIFICATION OF THE MIXED HAZARD MODEL USING MARTINGALE-BASED MOMENTS

ONLINE APPENDIX TO: NONPARAMETRIC IDENTIFICATION OF THE MIXED HAZARD MODEL USING MARTINGALE-BASED MOMENTS ONLINE APPENDIX TO: NONPARAMETRIC IDENTIFICATION OF THE MIXED HAZARD MODEL USING MARTINGALE-BASED MOMENTS JOHANNES RUF AND JAMES LEWIS WOLTER Appendix B. The Proofs of Theorem. and Proposition.3 The proof

More information

Metric spaces and metrizability

Metric spaces and metrizability 1 Motivation Metric spaces and metrizability By this point in the course, this section should not need much in the way of motivation. From the very beginning, we have talked about R n usual and how relatively

More information

Department of Applied Mathematics Faculty of EEMCS. University of Twente. Memorandum No Birth-death processes with killing

Department of Applied Mathematics Faculty of EEMCS. University of Twente. Memorandum No Birth-death processes with killing Department of Applied Mathematics Faculty of EEMCS t University of Twente The Netherlands P.O. Box 27 75 AE Enschede The Netherlands Phone: +3-53-48934 Fax: +3-53-48934 Email: memo@math.utwente.nl www.math.utwente.nl/publications

More information

A REFINED FACTORIZATION OF THE EXPONENTIAL LAW P. PATIE

A REFINED FACTORIZATION OF THE EXPONENTIAL LAW P. PATIE A REFINED FACTORIZATION OF THE EXPONENTIAL LAW P. PATIE Abstract. Let ξ be a (possibly killed) subordinator with Laplace exponent φ and denote by I φ e ξs ds, the so-called exponential functional. Consider

More information

Shifting processes with cyclically exchangeable increments at random

Shifting processes with cyclically exchangeable increments at random Shifting processes with cyclically exchangeable increments at random Gerónimo URIBE BRAVO (Collaboration with Loïc CHAUMONT) Instituto de Matemáticas UNAM Universidad Nacional Autónoma de México www.matem.unam.mx/geronimo

More information

Stochastic flows associated to coalescent processes

Stochastic flows associated to coalescent processes Stochastic flows associated to coalescent processes Jean Bertoin (1) and Jean-François Le Gall (2) (1) Laboratoire de Probabilités et Modèles Aléatoires and Institut universitaire de France, Université

More information

Introduction to Random Diffusions

Introduction to Random Diffusions Introduction to Random Diffusions The main reason to study random diffusions is that this class of processes combines two key features of modern probability theory. On the one hand they are semi-martingales

More information

Random trees and applications

Random trees and applications Probability Surveys Vol. 2 25) 245 311 ISSN: 1549-5787 DOI: 1.1214/15495785114 Random trees and applications Jean-François Le Gall DMA-ENS, 45 rue d Ulm, 755 PARIS, FRANCE e-mail: legall@dma.ens.fr Abstract:

More information

Other properties of M M 1

Other properties of M M 1 Other properties of M M 1 Přemysl Bejda premyslbejda@gmail.com 2012 Contents 1 Reflected Lévy Process 2 Time dependent properties of M M 1 3 Waiting times and queue disciplines in M M 1 Contents 1 Reflected

More information

PROBABILITY: LIMIT THEOREMS II, SPRING HOMEWORK PROBLEMS

PROBABILITY: LIMIT THEOREMS II, SPRING HOMEWORK PROBLEMS PROBABILITY: LIMIT THEOREMS II, SPRING 218. HOMEWORK PROBLEMS PROF. YURI BAKHTIN Instructions. You are allowed to work on solutions in groups, but you are required to write up solutions on your own. Please

More information

Erik J. Baurdoux Some excursion calculations for reflected Lévy processes

Erik J. Baurdoux Some excursion calculations for reflected Lévy processes Erik J. Baurdoux Some excursion calculations for reflected Lévy processes Article (Accepted version) (Refereed) Original citation: Baurdoux, Erik J. (29) Some excursion calculations for reflected Lévy

More information

SMSTC (2007/08) Probability.

SMSTC (2007/08) Probability. SMSTC (27/8) Probability www.smstc.ac.uk Contents 12 Markov chains in continuous time 12 1 12.1 Markov property and the Kolmogorov equations.................... 12 2 12.1.1 Finite state space.................................

More information

CASE STUDY: EXTINCTION OF FAMILY NAMES

CASE STUDY: EXTINCTION OF FAMILY NAMES CASE STUDY: EXTINCTION OF FAMILY NAMES The idea that families die out originated in antiquity, particilarly since the establishment of patrilineality (a common kinship system in which an individual s family

More information

Filtrations, Markov Processes and Martingales. Lectures on Lévy Processes and Stochastic Calculus, Braunschweig, Lecture 3: The Lévy-Itô Decomposition

Filtrations, Markov Processes and Martingales. Lectures on Lévy Processes and Stochastic Calculus, Braunschweig, Lecture 3: The Lévy-Itô Decomposition Filtrations, Markov Processes and Martingales Lectures on Lévy Processes and Stochastic Calculus, Braunschweig, Lecture 3: The Lévy-Itô Decomposition David pplebaum Probability and Statistics Department,

More information

Lecture 6 Basic Probability

Lecture 6 Basic Probability Lecture 6: Basic Probability 1 of 17 Course: Theory of Probability I Term: Fall 2013 Instructor: Gordan Zitkovic Lecture 6 Basic Probability Probability spaces A mathematical setup behind a probabilistic

More information

Lecture 2. We now introduce some fundamental tools in martingale theory, which are useful in controlling the fluctuation of martingales.

Lecture 2. We now introduce some fundamental tools in martingale theory, which are useful in controlling the fluctuation of martingales. Lecture 2 1 Martingales We now introduce some fundamental tools in martingale theory, which are useful in controlling the fluctuation of martingales. 1.1 Doob s inequality We have the following maximal

More information

Lecture 17 Brownian motion as a Markov process

Lecture 17 Brownian motion as a Markov process Lecture 17: Brownian motion as a Markov process 1 of 14 Course: Theory of Probability II Term: Spring 2015 Instructor: Gordan Zitkovic Lecture 17 Brownian motion as a Markov process Brownian motion is

More information

LECTURE 2: LOCAL TIME FOR BROWNIAN MOTION

LECTURE 2: LOCAL TIME FOR BROWNIAN MOTION LECTURE 2: LOCAL TIME FOR BROWNIAN MOTION We will define local time for one-dimensional Brownian motion, and deduce some of its properties. We will then use the generalized Ray-Knight theorem proved in

More information

On the martingales obtained by an extension due to Saisho, Tanemura and Yor of Pitman s theorem

On the martingales obtained by an extension due to Saisho, Tanemura and Yor of Pitman s theorem On the martingales obtained by an extension due to Saisho, Tanemura and Yor of Pitman s theorem Koichiro TAKAOKA Dept of Applied Physics, Tokyo Institute of Technology Abstract M Yor constructed a family

More information

SIMILAR MARKOV CHAINS

SIMILAR MARKOV CHAINS SIMILAR MARKOV CHAINS by Phil Pollett The University of Queensland MAIN REFERENCES Convergence of Markov transition probabilities and their spectral properties 1. Vere-Jones, D. Geometric ergodicity in

More information

Lecture 5. If we interpret the index n 0 as time, then a Markov chain simply requires that the future depends only on the present and not on the past.

Lecture 5. If we interpret the index n 0 as time, then a Markov chain simply requires that the future depends only on the present and not on the past. 1 Markov chain: definition Lecture 5 Definition 1.1 Markov chain] A sequence of random variables (X n ) n 0 taking values in a measurable state space (S, S) is called a (discrete time) Markov chain, if

More information

A NEW PROOF OF THE WIENER HOPF FACTORIZATION VIA BASU S THEOREM

A NEW PROOF OF THE WIENER HOPF FACTORIZATION VIA BASU S THEOREM J. Appl. Prob. 49, 876 882 (2012 Printed in England Applied Probability Trust 2012 A NEW PROOF OF THE WIENER HOPF FACTORIZATION VIA BASU S THEOREM BRIAN FRALIX and COLIN GALLAGHER, Clemson University Abstract

More information

ITÔ S ONE POINT EXTENSIONS OF MARKOV PROCESSES. Masatoshi Fukushima

ITÔ S ONE POINT EXTENSIONS OF MARKOV PROCESSES. Masatoshi Fukushima ON ITÔ S ONE POINT EXTENSIONS OF MARKOV PROCESSES Masatoshi Fukushima Symposium in Honor of Kiyosi Itô: Stocastic Analysis and Its Impact in Mathematics and Science, IMS, NUS July 10, 2008 1 1. Itô s point

More information

Stability of Stochastic Differential Equations

Stability of Stochastic Differential Equations Lyapunov stability theory for ODEs s Stability of Stochastic Differential Equations Part 1: Introduction Department of Mathematics and Statistics University of Strathclyde Glasgow, G1 1XH December 2010

More information

Gaussian processes. Chuong B. Do (updated by Honglak Lee) November 22, 2008

Gaussian processes. Chuong B. Do (updated by Honglak Lee) November 22, 2008 Gaussian processes Chuong B Do (updated by Honglak Lee) November 22, 2008 Many of the classical machine learning algorithms that we talked about during the first half of this course fit the following pattern:

More information

LIMITS FOR QUEUES AS THE WAITING ROOM GROWS. Bell Communications Research AT&T Bell Laboratories Red Bank, NJ Murray Hill, NJ 07974

LIMITS FOR QUEUES AS THE WAITING ROOM GROWS. Bell Communications Research AT&T Bell Laboratories Red Bank, NJ Murray Hill, NJ 07974 LIMITS FOR QUEUES AS THE WAITING ROOM GROWS by Daniel P. Heyman Ward Whitt Bell Communications Research AT&T Bell Laboratories Red Bank, NJ 07701 Murray Hill, NJ 07974 May 11, 1988 ABSTRACT We study the

More information

4 Sums of Independent Random Variables

4 Sums of Independent Random Variables 4 Sums of Independent Random Variables Standing Assumptions: Assume throughout this section that (,F,P) is a fixed probability space and that X 1, X 2, X 3,... are independent real-valued random variables

More information

Brownian Motion and Stochastic Calculus

Brownian Motion and Stochastic Calculus ETHZ, Spring 17 D-MATH Prof Dr Martin Larsson Coordinator A Sepúlveda Brownian Motion and Stochastic Calculus Exercise sheet 6 Please hand in your solutions during exercise class or in your assistant s

More information

Itô s excursion theory and random trees

Itô s excursion theory and random trees Itô s excursion theory and random trees Jean-François Le Gall January 3, 200 Abstract We explain how Itô s excursion theory can be used to understand the asymptotic behavior of large random trees. We provide

More information

The exact asymptotics for hitting probability of a remote orthant by a multivariate Lévy process: the Cramér case

The exact asymptotics for hitting probability of a remote orthant by a multivariate Lévy process: the Cramér case The exact asymptotics for hitting probability of a remote orthant by a multivariate Lévy process: the Cramér case Konstantin Borovkov and Zbigniew Palmowski Abstract For a multivariate Lévy process satisfying

More information

Figure 10.1: Recording when the event E occurs

Figure 10.1: Recording when the event E occurs 10 Poisson Processes Let T R be an interval. A family of random variables {X(t) ; t T} is called a continuous time stochastic process. We often consider T = [0, 1] and T = [0, ). As X(t) is a random variable

More information

arxiv: v1 [math.pr] 1 Jan 2013

arxiv: v1 [math.pr] 1 Jan 2013 The role of dispersal in interacting patches subject to an Allee effect arxiv:1301.0125v1 [math.pr] 1 Jan 2013 1. Introduction N. Lanchier Abstract This article is concerned with a stochastic multi-patch

More information

Spectral asymptotics for stable trees and the critical random graph

Spectral asymptotics for stable trees and the critical random graph Spectral asymptotics for stable trees and the critical random graph EPSRC SYMPOSIUM WORKSHOP DISORDERED MEDIA UNIVERSITY OF WARWICK, 5-9 SEPTEMBER 2011 David Croydon (University of Warwick) Based on joint

More information

Mandelbrot s cascade in a Random Environment

Mandelbrot s cascade in a Random Environment Mandelbrot s cascade in a Random Environment A joint work with Chunmao Huang (Ecole Polytechnique) and Xingang Liang (Beijing Business and Technology Univ) Université de Bretagne-Sud (Univ South Brittany)

More information

Regularity of the density for the stochastic heat equation

Regularity of the density for the stochastic heat equation Regularity of the density for the stochastic heat equation Carl Mueller 1 Department of Mathematics University of Rochester Rochester, NY 15627 USA email: cmlr@math.rochester.edu David Nualart 2 Department

More information

Brownian motion. Samy Tindel. Purdue University. Probability Theory 2 - MA 539

Brownian motion. Samy Tindel. Purdue University. Probability Theory 2 - MA 539 Brownian motion Samy Tindel Purdue University Probability Theory 2 - MA 539 Mostly taken from Brownian Motion and Stochastic Calculus by I. Karatzas and S. Shreve Samy T. Brownian motion Probability Theory

More information

A Note on the Central Limit Theorem for a Class of Linear Systems 1

A Note on the Central Limit Theorem for a Class of Linear Systems 1 A Note on the Central Limit Theorem for a Class of Linear Systems 1 Contents Yukio Nagahata Department of Mathematics, Graduate School of Engineering Science Osaka University, Toyonaka 560-8531, Japan.

More information

Boolean Algebras. Chapter 2

Boolean Algebras. Chapter 2 Chapter 2 Boolean Algebras Let X be an arbitrary set and let P(X) be the class of all subsets of X (the power set of X). Three natural set-theoretic operations on P(X) are the binary operations of union

More information

MA103 Introduction to Abstract Mathematics Second part, Analysis and Algebra

MA103 Introduction to Abstract Mathematics Second part, Analysis and Algebra 206/7 MA03 Introduction to Abstract Mathematics Second part, Analysis and Algebra Amol Sasane Revised by Jozef Skokan, Konrad Swanepoel, and Graham Brightwell Copyright c London School of Economics 206

More information

Solving the Poisson Disorder Problem

Solving the Poisson Disorder Problem Advances in Finance and Stochastics: Essays in Honour of Dieter Sondermann, Springer-Verlag, 22, (295-32) Research Report No. 49, 2, Dept. Theoret. Statist. Aarhus Solving the Poisson Disorder Problem

More information

On Reflecting Brownian Motion with Drift

On Reflecting Brownian Motion with Drift Proc. Symp. Stoch. Syst. Osaka, 25), ISCIE Kyoto, 26, 1-5) On Reflecting Brownian Motion with Drift Goran Peskir This version: 12 June 26 First version: 1 September 25 Research Report No. 3, 25, Probability

More information

ERRATA: Probabilistic Techniques in Analysis

ERRATA: Probabilistic Techniques in Analysis ERRATA: Probabilistic Techniques in Analysis ERRATA 1 Updated April 25, 26 Page 3, line 13. A 1,..., A n are independent if P(A i1 A ij ) = P(A 1 ) P(A ij ) for every subset {i 1,..., i j } of {1,...,

More information

WXML Final Report: Chinese Restaurant Process

WXML Final Report: Chinese Restaurant Process WXML Final Report: Chinese Restaurant Process Dr. Noah Forman, Gerandy Brito, Alex Forney, Yiruey Chou, Chengning Li Spring 2017 1 Introduction The Chinese Restaurant Process (CRP) generates random partitions

More information

arxiv: v1 [math.pr] 11 Jan 2013

arxiv: v1 [math.pr] 11 Jan 2013 Last-Hitting Times and Williams Decomposition of the Bessel Process of Dimension 3 at its Ultimate Minimum arxiv:131.2527v1 [math.pr] 11 Jan 213 F. Thomas Bruss and Marc Yor Université Libre de Bruxelles

More information

SOLUTIONS OF SEMILINEAR WAVE EQUATION VIA STOCHASTIC CASCADES

SOLUTIONS OF SEMILINEAR WAVE EQUATION VIA STOCHASTIC CASCADES Communications on Stochastic Analysis Vol. 4, No. 3 010) 45-431 Serials Publications www.serialspublications.com SOLUTIONS OF SEMILINEAR WAVE EQUATION VIA STOCHASTIC CASCADES YURI BAKHTIN* AND CARL MUELLER

More information

A slow transient diusion in a drifted stable potential

A slow transient diusion in a drifted stable potential A slow transient diusion in a drifted stable potential Arvind Singh Université Paris VI Abstract We consider a diusion process X in a random potential V of the form V x = S x δx, where δ is a positive

More information

Stability and Rare Events in Stochastic Models Sergey Foss Heriot-Watt University, Edinburgh and Institute of Mathematics, Novosibirsk

Stability and Rare Events in Stochastic Models Sergey Foss Heriot-Watt University, Edinburgh and Institute of Mathematics, Novosibirsk Stability and Rare Events in Stochastic Models Sergey Foss Heriot-Watt University, Edinburgh and Institute of Mathematics, Novosibirsk ANSAPW University of Queensland 8-11 July, 2013 1 Outline (I) Fluid

More information

The Wright-Fisher Model and Genetic Drift

The Wright-Fisher Model and Genetic Drift The Wright-Fisher Model and Genetic Drift January 22, 2015 1 1 Hardy-Weinberg Equilibrium Our goal is to understand the dynamics of allele and genotype frequencies in an infinite, randomlymating population

More information

Sample Spaces, Random Variables

Sample Spaces, Random Variables Sample Spaces, Random Variables Moulinath Banerjee University of Michigan August 3, 22 Probabilities In talking about probabilities, the fundamental object is Ω, the sample space. (elements) in Ω are denoted

More information

Metric Spaces and Topology

Metric Spaces and Topology Chapter 2 Metric Spaces and Topology From an engineering perspective, the most important way to construct a topology on a set is to define the topology in terms of a metric on the set. This approach underlies

More information

Lecture 22 Girsanov s Theorem

Lecture 22 Girsanov s Theorem Lecture 22: Girsanov s Theorem of 8 Course: Theory of Probability II Term: Spring 25 Instructor: Gordan Zitkovic Lecture 22 Girsanov s Theorem An example Consider a finite Gaussian random walk X n = n

More information

Crump Mode Jagers processes with neutral Poissonian mutations

Crump Mode Jagers processes with neutral Poissonian mutations Crump Mode Jagers processes with neutral Poissonian mutations Nicolas Champagnat 1 Amaury Lambert 2 1 INRIA Nancy, équipe TOSCA 2 UPMC Univ Paris 06, Laboratoire de Probabilités et Modèles Aléatoires Paris,

More information

Resistance Growth of Branching Random Networks

Resistance Growth of Branching Random Networks Peking University Oct.25, 2018, Chengdu Joint work with Yueyun Hu (U. Paris 13) and Shen Lin (U. Paris 6), supported by NSFC Grant No. 11528101 (2016-2017) for Research Cooperation with Oversea Investigators

More information