GAUSSIAN PROCESSES AND THE LOCAL TIMES OF SYMMETRIC LÉVY PROCESSES

MICHAEL B. MARCUS AND JAY ROSEN, CITY COLLEGE AND COLLEGE OF STATEN ISLAND

Abstract. We give a relatively simple proof of the necessary and sufficient condition for the joint continuity of the local times of symmetric Lévy processes. This result was obtained in 1988 by M. Barlow and J. Hawkes without requiring that the Lévy processes be symmetric. In 1992 the authors used a very different approach to obtain a necessary and sufficient condition for the joint continuity of the local times of strongly symmetric Markov processes, a class which includes symmetric Lévy processes. Both the 1988 proof and the 1992 proof are long and difficult. In this paper the 1992 proof is significantly simplified. This is accomplished by using two recent isomorphism theorems, which relate the local times of strongly symmetric Markov processes to certain Gaussian processes, one due to N. Eisenbaum alone and the other to N. Eisenbaum, H. Kaspi, M. B. Marcus, J. Rosen and Z. Shi. Simple proofs of these isomorphism theorems are given in this paper.

1. Introduction

We give a relatively simple proof of the necessary and sufficient condition for the joint continuity of the local times of symmetric Lévy processes. Let $X = \{X_t, t \in R_+\}$ be a symmetric Lévy process with values in $R$ and characteristic function
(1.1)  $E e^{i\xi X_t} = e^{-t\psi(\xi)}$.
Let $L^x_t$ denote the local time of $X$ at $x \in R$. Heuristically, $L^x_t$ is the amount of time that the process spends at $x$, up to time $t$. A necessary and sufficient condition for the existence of the local time of $X$ is that
(1.2)  $\int_0^\infty \frac{1}{1+\psi(\lambda)}\,d\lambda < \infty$.
When (1.2) holds we define
(1.3)  $L^x_t = \lim_{\epsilon \to 0} \int_0^t f_{\epsilon,x}(X_s)\,ds$
where $f_{\epsilon,x}$ is an approximate delta function at $x$. Specifically, we assume that $f_{\epsilon,x}$ is supported in $[x-\epsilon, x+\epsilon]$ and satisfies $\int f_{\epsilon,x}(y)\,dy = 1$. Convergence in (1.3) is locally uniform in $t$ almost surely; see Theorem 2. When (1.2) is satisfied the process $X$ has a transition probability density which we denote by $p_t(x,y)$.
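The defining limit (1.3) can be made concrete with a small simulation. The sketch below is our own illustration, not taken from the paper: it uses Brownian motion, $\psi(\lambda) = \lambda^2/2$, the simplest approximate delta function $f_{\epsilon,0} = \frac{1}{2\epsilon}\mathbf 1_{[-\epsilon,\epsilon]}$, and a fixed small $\epsilon$ rather than a limit; all names and parameter values are assumptions of the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

def local_time_at_zero(t=1.0, n_steps=2000, eps=0.05):
    """Approximate L^0_t for one Brownian path (psi(lam) = lam^2/2) by the
    occupation-time ratio (1/(2*eps)) * int_0^t 1{|X_s| <= eps} ds,
    i.e. the choice f_{eps,0} = uniform density on [-eps, eps] in (1.3)."""
    dt = t / n_steps
    x = np.concatenate([[0.0], np.cumsum(rng.normal(0.0, np.sqrt(dt), n_steps))])
    return (np.abs(x) <= eps).sum() * dt / (2 * eps)

# sanity check: for Brownian motion L^0_1 has the law of |B_1|, so E[L^0_1] = sqrt(2/pi)
est = np.mean([local_time_at_zero() for _ in range(2000)])
assert abs(est - np.sqrt(2 / np.pi)) < 0.1
```

Shrinking `eps` together with the step size reproduces the limit in (1.3); the tolerance above is loose because both the Monte Carlo error and the discretization bias are of order a few percent.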
The $\alpha$-potential density of $X$ is defined as
(1.4)  $u^\alpha(x,y) = \int_0^\infty p_t(x,y)\,e^{-\alpha t}\,dt = \frac{1}{\pi}\int_0^\infty \frac{\cos\lambda(x-y)}{\alpha+\psi(\lambda)}\,d\lambda.$
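Formula (1.4) is easy to check numerically in a case where $u^\alpha$ is known in closed form. For Brownian motion, $\psi(\lambda) = \lambda^2/2$ and $u^\alpha(x,y) = e^{-\sqrt{2\alpha}|x-y|}/\sqrt{2\alpha}$. The sketch below (our own illustration; the truncation point and grid are arbitrary choices) compares the closed form with a truncated version of the cosine integral in (1.4):

```python
import numpy as np

def u_alpha(x, alpha, psi, lam_max=2000.0, n=1_000_001):
    """(1/pi) * int_0^lam_max cos(lam*x)/(alpha + psi(lam)) dlam, trapezoidal rule."""
    lam = np.linspace(0.0, lam_max, n)
    return np.trapz(np.cos(lam * x) / (alpha + psi(lam)), lam) / np.pi

psi_bm = lambda lam: lam**2 / 2.0                  # Brownian motion exponent
x, alpha = 0.7, 1.0
closed_form = np.exp(-np.sqrt(2 * alpha) * abs(x)) / np.sqrt(2 * alpha)
assert abs(u_alpha(x, alpha, psi_bm) - closed_form) < 1e-3
```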

Since $X$ is a symmetric Lévy process, $p_t(x,y)$ and $u^\alpha(x,y)$ are actually functions of $x-y$. We occasionally use the notation $p_t(x-y) \stackrel{def}{=} p_t(x,y)$ and similarly for $u^\alpha$. Note that $u^\alpha$ is the Fourier transform of an $L^1$ function. Consequently it is uniformly continuous on $R$. The transition probability density $p_t(x,y)$ is positive definite. This is a simple consequence of the Chapman-Kolmogorov equation and uses the fact that $p_t(x,y)$ is symmetric. Therefore $u^\alpha$ is positive definite and hence is the covariance of a stationary Gaussian process. It is also easy to show that $u^\alpha$ is positive definite directly from (1.4). Let $G = \{G(x), x \in R\}$ be a mean zero Gaussian process with
(1.5)  $E\,G(x)G(y) = u^1(x,y)$.
The next theorem relates the continuity of the local times of $X$ with that of $G$.

Theorem 1 (Barlow-Hawkes, Barlow 1988). Let $L = \{L^x_t, (x,t) \in R \times R_+\}$ be the local time process of a symmetric Lévy process $X$ with 1-potential density $u^1(x,y)$. Then $L$ is continuous almost surely if and only if the mean zero stationary Gaussian process $\{G(x), x \in R\}$, with covariance $u^1(x,y)$, is continuous almost surely.

In the course of proving Theorem 1 we also show that if $L$ is not continuous almost surely then for all $t > 0$ and $x \in R$, $L^x_t$ is unbounded in all neighborhoods of $x$, $P^x$ almost surely. For the historical background of Theorem 1 see the end of Section 1 of Marcus and Rosen 1992. In Barlow 1988 necessary and sufficient conditions are given for the continuity of the local time process for all Lévy processes, not only for symmetric Lévy processes. For processes that are not symmetric these conditions cannot be described in terms of Gaussian processes, since $u^1$ in (1.5), as the covariance of a Gaussian process, must be symmetric. The results in Barlow 1988 are not expressed the way they are in Theorem 1. Theorem 1 above is the way Theorem 1 of our paper Marcus and Rosen 1992 is stated.
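The claim that $u^\alpha$ is positive definite, and hence a legitimate Gaussian covariance, can be sanity-checked numerically: for any finite set of points the matrix $u^1(x_i,x_j)$ should have nonnegative eigenvalues. A sketch in the Brownian case, where $u^1(x,y) = e^{-\sqrt 2|x-y|}/\sqrt 2$ (the sample points and seed are our own arbitrary choices):

```python
import numpy as np

# Gram matrix of u^1 for Brownian motion at arbitrary sample points
x = np.array([-1.3, -0.4, 0.0, 0.2, 0.9, 2.5])
U1 = np.exp(-np.sqrt(2) * np.abs(x[:, None] - x[None, :])) / np.sqrt(2)

eig = np.linalg.eigvalsh(U1)
assert eig.min() > 0   # positive definite, so a valid covariance for G in (1.5)

# consequently we can sample the Gaussian process G of (1.5) at these points
G = np.linalg.cholesky(U1) @ np.random.default_rng(0).normal(size=len(x))
```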
The contribution of the latter theorem is that it holds for Markov processes with symmetric potential densities, not just for symmetric Lévy processes. There is another important difference between the work in Barlow 1988 and the work in Marcus and Rosen 1992. In Barlow 1988 concrete conditions for continuity are obtained which imply Theorem 1 as stated. In Marcus and Rosen 1992 the comparison between local times of Lévy processes and Gaussian processes is obtained abstractly, without obtaining any conditions to verify when either class of processes is continuous. However, since necessary and sufficient conditions for the continuity of Gaussian processes are known, we have them also for the local time processes. Let
(1.6)  $\sigma^2(u) = \frac{4}{\pi}\int_0^\infty \frac{\sin^2(\lambda u/2)}{1+\psi(\lambda)}\,d\lambda$
and denote by $\bar\sigma(u)$ the non-decreasing rearrangement of $\sigma(u)$ for $u \in [0,1]$. That is, $\bar\sigma(u)$, $u \in [0,1]$, is a non-decreasing function satisfying $|\{u : \sigma(u) \le x, u \in [0,1]\}| = |\{u : \bar\sigma(u) \le x, u \in [0,1]\}|$.
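The non-decreasing rearrangement is simple to compute numerically: sample the function on a grid of $[0,1]$ and sort, which preserves the measure of every level set. A sketch with a toy function (our own choice, not derived from any Lévy exponent):

```python
import numpy as np

def nondecreasing_rearrangement(sigma, n=10_000):
    """Approximate the non-decreasing rearrangement of sigma on [0,1]:
    the sorted grid sample has the same level-set measures and is monotone."""
    u = (np.arange(n) + 0.5) / n
    return u, np.sort(sigma(u))

# toy sigma: sigma(u) = |sin(2*pi*u)|
u, sbar = nondecreasing_rearrangement(lambda u: np.abs(np.sin(2 * np.pi * u)))
assert np.all(np.diff(sbar) >= 0)                                   # monotone
assert np.isclose(sbar.mean(), np.abs(np.sin(2 * np.pi * u)).mean())  # same values
```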

Corollary 1. Let $L = \{L^x_t, (x,t) \in R \times R_+\}$ be the local time process of a symmetric Lévy process $X$ with characteristic function given by (1.1) and let $\sigma$ and $\bar\sigma$ be as defined in (1.6). Then $L$ is continuous almost surely if and only if
(1.7)  $\int_0^{1/2} \frac{\bar\sigma(u)}{u(\log 1/u)^{1/2}}\,du < \infty$.
In particular (1.7) holds when
(1.8)  $\int_0^{1/2} \frac{\left(\int_{1/s}^\infty \frac{1}{1+\psi(\lambda)}\,d\lambda\right)^{1/2}}{s(\log 1/s)^{1/2}}\,ds < \infty$
and (1.7) and (1.8) are equivalent when $\psi(\lambda)$ is increasing.

The proofs of Theorem 1 in Barlow 1988 and Marcus and Rosen 1992 are long and difficult. So much so that in his recent book on Lévy processes, Bertoin 1996, Bertoin only gives the proof of sufficiency. The proof of necessity in Barlow 1988 is very technical. The proofs in Marcus and Rosen 1992 depend on an isomorphism theorem of Dynkin. The form of this isomorphism makes it difficult to apply. We have devoted a lot of effort over the past ten years trying to simplify the proof of Theorem 1. In this paper we present such a proof. Using a new Dynkin type isomorphism theorem, obtained recently in Eisenbaum, Kaspi, Marcus, Rosen and Shi 1999, which has a relatively short proof, we greatly simplify the proof of necessity. We also use an earlier isomorphism theorem of N. Eisenbaum, Eisenbaum 19??, to significantly shorten the proof of sufficiency given in Marcus and Rosen 1992. Furthermore, using ideas developed in Eisenbaum, Kaspi, Marcus, Rosen and Shi 1999, we give a simple proof of Eisenbaum's isomorphism theorem. Her original proof followed the more complicated line of the proof of Dynkin's theorem given in Marcus and Rosen 1992. Another factor that enables us to simplify the presentation in this paper is that we restrict our attention to Lévy processes. Actually, the same proofs given here extend to prove Theorem 1 of Marcus and Rosen 1992 in the generality in which it is given there. The new isomorphism theorem of Eisenbaum, Kaspi, Marcus, Rosen and Shi 1999 has other applications to Lévy processes.
In Bass, Eisenbaum and Shi 1999 and Marcus 1999 it is used to show that the most visited site of a large class of Lévy processes is transient. Section 2 provides background material on local times. In Section 3 we state the two isomorphism theorems that are at the heart of this work. The necessity part of Theorem 1 and Corollary 1 are proved in Section 4. In Section 5 the sufficiency part of Theorem 1 is given. Section 6 presents Kac's formula in a form which is convenient for the proofs of the isomorphism theorems. In Section 7 we give new and simple proofs of the isomorphism theorems, subject only to a lemma which is proved in Section 8. We have tried to make this paper accessible to readers whose primary interests are in Markov processes or Gaussian processes, and consequently may have included more details than some specialists might think necessary. We request your indulgence. We are grateful to Jean Bertoin for helpful discussions.
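To see the continuity criterion of Corollary 1 in action, one can evaluate (1.7) numerically. The sketch below is an illustration under stated assumptions: it takes a stable-like exponent $\psi(\lambda) = \lambda^{3/2}$, reads (1.6) as $\sigma^2(u) = \frac{4}{\pi}\int_0^\infty \sin^2(\lambda u/2)/(1+\psi(\lambda))\,d\lambda$ (the constant $4/\pi$ is our assumption), and truncates all integrals at finite cutoffs of our own choosing.

```python
import numpy as np

def sigma(u, alpha=1.5, lam_max=1e4, n=200_000):
    """sigma(u) from (1.6) with psi(lam) = lam**alpha, integral truncated at lam_max."""
    lam = np.linspace(1e-9, lam_max, n)
    val = np.trapz(np.sin(lam * u / 2.0)**2 / (1.0 + lam**alpha), lam)
    return np.sqrt(4.0 * val / np.pi)

u = np.linspace(1e-4, 1.0, 300)
sig = np.array([sigma(t) for t in u])
sig_bar = np.sort(sig)                  # non-decreasing rearrangement on the grid

# entropy-type integral (1.7), restricted to the grid portion of [0, 1/2]
mask = u <= 0.5
val = np.trapz(sig_bar[mask] / (u[mask] * np.log(1.0 / u[mask])**0.5), u[mask])
assert np.isfinite(val) and val > 0     # the truncated integral stays bounded
```

For this $\psi$ the integrand behaves roughly like $u^{-3/4}(\log 1/u)^{-1/2}$ near zero, so the integral converges, consistent with the known continuity of stable local times for index greater than one.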

2. Local times of Lévy processes

The material in this section is provided for background. It is fairly standard; see e.g. Bertoin 1996, V.1 and Blumenthal and Getoor 1968, V.3. A functional $A_t$ of the Lévy process $X$ is called a continuous additive functional if it is continuous in $t \in R_+$, $\mathcal F_t$ measurable and satisfies the additivity condition
(2.1)  $A_{t+s} = A_t + A_s \circ \theta_t$
for all $s, t \in R_+$. Let $A_t$ be a continuous additive functional of $X$ with $A_0 = 0$ and let
(2.2)  $S_A(\omega) = \inf\{t \mid A_t(\omega) > 0\}$.
We call $A_t$ a local time of $X$ at $y \in R$ if $P^y(S_A = 0) = 1$ and, for all $x \ne y$, $P^x(S_A = 0) = 0$.

Theorem 2. Let $X$ be a Lévy process as defined in (1.1) and assume that (1.2) holds. Then for each $y \in R$ we can find a local time of $X$ at $y$, denoted by $L^y_t$, such that
(2.3)  $E^x \int_0^\infty e^{-\alpha t}\,dL^y_t = u^\alpha(x,y)$
where $u^\alpha(x,y)$ is defined in (1.4). Furthermore, there exists a sequence $\{\epsilon_n\}$ tending to zero such that for any finite time $T$, which may be random,
(2.4)  $L^y_t = \lim_{\epsilon_n \to 0} \int_0^t f_{\epsilon_n,y}(X_s)\,ds$
uniformly for $t \in [0,T]$.

Proof. Let $\theta$ be an independent exponential time with mean $1/\alpha$, and let $W_t$ be the Markov process obtained by killing $X_t$ at $\theta$. A simple calculation shows that
(2.5)  $E^x\left(\int_0^\infty f_{\epsilon,y}(W_s)\,ds \int_0^\infty f_{\epsilon',y}(W_t)\,dt\right) = \iint u^\alpha(x,z_1)\,u^\alpha(z_1,z_2)\,f_{\epsilon,y}(z_1)f_{\epsilon',y}(z_2)\,dz_1\,dz_2 + \iint u^\alpha(x,z_1)\,u^\alpha(z_1,z_2)\,f_{\epsilon',y}(z_1)f_{\epsilon,y}(z_2)\,dz_1\,dz_2.$
Since $u^\alpha(x,y)$ is continuous on $R \times R$ for all $\alpha > 0$, we see that $\int_0^\infty f_{\epsilon,y}(W_s)\,ds$ converges in $L^2$ as $\epsilon \to 0$. Define the right continuous $W$-martingale
(2.6)  $M^\epsilon_t = E^x\left(\int_0^\infty f_{\epsilon,y}(W_s)\,ds \,\Big|\, \bar{\mathcal F}_t\right) = \int_0^t f_{\epsilon,y}(W_s)\,ds + \int u^\alpha(W_t,z) f_{\epsilon,y}(z)\,dz$
where $\bar{\mathcal F}_t = \mathcal F_t \vee \sigma(\theta \wedge t)$. Doob's maximal inequality shows that $M^\epsilon_t$ converges in $L^2$ uniformly in $t \in R_+$. Using the uniform continuity of $u^\alpha(x,y)$ we see that the last term in (2.6) also converges in $L^2$ uniformly in $t \in R_+$. Consequently, we can find a sequence $\epsilon_n \to 0$ such that
(2.7)  $\int_0^{t\wedge\theta} f_{\epsilon_n,y}(X_s)\,ds$ converges almost surely, uniformly in $t \in R_+$.

Since $P(\theta > v) = e^{-\alpha v} > 0$, it follows from Fubini's theorem that the right hand side of (2.4) converges almost surely, uniformly on $[0,v]$. We define $L^y_t$ by (2.4). It is easy to verify that $L^y_t$ is a continuous additive functional of $X$ with $L^y_0 = 0$. It follows from (2.5) that $\{\int_0^\infty f_{\epsilon_n,y}(W_s)\,ds\}$ is uniformly integrable. Therefore
(2.8)  $E^x L^y_\theta = u^\alpha(x,y)$.
Also, clearly
(2.9)  $E^x L^y_\theta = \alpha E^x \int_0^\infty e^{-\alpha t} L^y_t\,dt$.
(2.3) follows from (2.8), (2.9) and integration by parts. To complete the proof we need only show that $S = S_{L^y}$, see (2.2), satisfies the conditions given below (2.2). It follows immediately from (2.4) and the right continuity of $X$ that $P^x(S = 0) = 0$ for all $x \ne y$. We now show that $P^y(S = 0) = 1$. Suppose that $P^y(S = 0) = 0$. Since, for any $z \in R$,
(2.10)  $P^z(L^y_t > 0) \le P^z(S < t)$
we would have $\lim_{t \to 0} P^y(L^y_t > 0) = 0$. In fact, since $P^x(S = 0) = 0$ for all $x \ne y$, we would actually have
(2.11)  $\lim_{t \to 0} P^z(L^y_t > 0) = 0 \qquad \forall z \in R$.
This is not possible. To see this note that it follows from the definition of $S$ that for any $x$ and $t > 0$
(2.12)  $P^x(S < \infty) = P^x(L^y_{S+t} > 0,\, S < \infty)$.
It is easy to see that $S$ is a stopping time. Therefore, using the additivity of $L^y$ and the Markov property, we have
(2.13)  $P^x(S < \infty) = E^x\left(P^{X_S}(L^y_t > 0),\, S < \infty\right)$.
Using (2.11) in (2.13) gives us $P^x(S < \infty) = 0$ for all $x$. That is, $L^y \equiv 0$ almost surely, which contradicts (2.3). Thus $P^y(S = 0) > 0$. The Blumenthal 0-1 law then shows that $P^y(S = 0) = 1$.

3. Isomorphism theorems for Lévy processes

In this section we present the two isomorphism theorems that play a critical role in the proof of Theorem 1. We first consider an unconditioned isomorphism theorem due to N. Eisenbaum. It is stated below; see Théorème 1.3, Eisenbaum 19??. In Eisenbaum 19?? results like Theorem 3 are also obtained in more general settings. We state it for a symmetric Lévy process $X$ satisfying (1.2), with local time process $L = \{L^x_t, (x,t) \in R \times R_+\}$. Let $G = \{G(x), x \in R\}$ be a mean zero Gaussian process satisfying
(3.1)  $E\,G(x)G(y) = u^1(x,y)$.
Let $\varphi$ denote an exponential random variable with mean one which is independent of $X$.
The next theorem is called unconditioned because, in contrast to Dynkin's original isomorphism theorem, it does not depend explicitly on $X_\varphi$.

Theorem 3. Let $X$, $L$, $\varphi$ and $G$ be as given immediately above. For any sequence $x_j \in R$, $j = 1,\ldots$, consider $\{L^{x_j}_\varphi, j = 1,\ldots\}$ and $\{G(x_j), j = 1,\ldots\}$ and let $y \in R$. For all measurable functions $F$
(3.2)  $E^y E_G\, F\!\left(L^{x_j}_\varphi + \frac{(G(x_j)+s)^2}{2}\right) = E_G\left(\left(1 + \frac{G(y)}{s}\right) F\!\left(\frac{(G(x_j)+s)^2}{2}\right)\right)$
for all $s > 0$.

We defer the proof until Section 7. The second isomorphism theorem used in this paper is a new result which is given in Eisenbaum, Kaspi, Marcus, Rosen and Shi 1999. It is a generalization of the second Ray-Knight theorem for Brownian motion. Let
(3.3)  $u_{\{0\}}(x,y) = \phi(x) + \phi(y) - \phi(x-y)$
where
(3.4)  $\phi(x) \stackrel{def}{=} \frac{1}{\pi}\int_0^\infty \frac{1-\cos\lambda x}{\psi(\lambda)}\,d\lambda, \qquad x \in R.$
In Eisenbaum, Kaspi, Marcus, Rosen and Shi 1999 it is shown that when $X$ is recurrent, $u_{\{0\}}(x,y)$ is the 0-potential density of $X$ killed at the first time it hits 0. It is clear that when (1.2) holds, $u_{\{0\}}(x,y)$ is continuous and symmetric. Furthermore, it follows from the identity
(3.5)  $\cos\lambda(x-y) - \cos\lambda x - \cos\lambda y + 1 = \mathrm{Re}\left[(e^{i\lambda x} - 1)(e^{-i\lambda y} - 1)\right]$
that $u_{\{0\}}(x,y)$ is positive definite. Let $\eta = \{\eta_x, x \in R\}$ be a mean zero Gaussian process with covariance
(3.6)  $E_\eta(\eta_x \eta_y) = u_{\{0\}}(x,y)$
where $E_\eta$ is the expectation operator for $\eta$. Also, we take $P_\eta$ to be the probability measure for $\eta$. An important process in the proof of necessity in Theorem 1 is the inverse local time at 0 of $X$, that is
(3.7)  $\tau(t) \stackrel{def}{=} \inf\{s : L^0_s > t\}$.

Theorem 4. Assume that the Lévy process $X$ is recurrent and let $\eta$ be the mean zero Gaussian process defined in (3.6). For any $t > 0$, under the measure $P \times P_\eta$,
(3.8)  $\left\{L^x_{\tau(t)} + \frac{\eta_x^2}{2},\; x \in R\right\} \stackrel{law}{=} \left\{\frac{(\eta_x + \sqrt{2t})^2}{2},\; x \in R\right\}.$
The proof of this theorem is given in Section 7.

4. Necessary condition for continuity

We give some properties of Gaussian processes which are used in the proof of the necessity part of Theorem 1.

Lemma 1. Let $G$ be the Gaussian process defined in (1.5) and $\eta$ the Gaussian process defined in (3.6). If $G$ is not continuous almost surely then $\eta$ is unbounded almost surely, on all intervals of $R$.

Proof. Since (3.6) implies that $\eta_0 = 0$,
(4.1)  $E(\eta_x - \eta_y)^2 = 2\phi(x-y)$.
By stationarity
(4.2)  $E(G(x) - G(y))^2 = E(G(x-y) - G(0))^2$.
By (3.4) and (1.4)
(4.3)  $E(G(x) - G(y))^2 = \frac{2}{\pi}\int_0^\infty \frac{1-\cos\lambda(x-y)}{1+\psi(\lambda)}\,d\lambda, \qquad E(\eta_x - \eta_y)^2 = \frac{2}{\pi}\int_0^\infty \frac{1-\cos\lambda(x-y)}{\psi(\lambda)}\,d\lambda.$
Consequently
(4.4)  $\left(E(G(x) - G(y))^2\right)^{1/2} \le \left(E(\eta_x - \eta_y)^2\right)^{1/2}.$
$G$ is a stationary Gaussian process. Therefore, by Theorem 4.9, Chapter III, Jain and Marcus 1978, if $G$ is not continuous almost surely then it is unbounded almost surely, on all intervals of $R$. The conclusion about $\eta$ now follows from (4.4); see 5.5, Marcus and Shepp 1972.

Lemma 2. Let $\{G(x), x \in R\}$ be a mean zero Gaussian process on $R$ and $T \subset R$ a finite set. Let $a$ be the median of $\sup_{x \in T} G(x)$ and $\sigma \stackrel{def}{=} \sup_{x \in T}(E\,G^2(x))^{1/2}$. Then
(4.5)  $P\left(\sup_{x \in T} G(x) \ge a - \sigma s\right) \ge 1 - \bar\Phi(s)$
and
(4.6)  $P\left(\sup_{x \in T} G(x) \le a + \sigma s\right) \ge 1 - \bar\Phi(s)$
where
(4.7)  $\bar\Phi(s) = \sqrt{\frac{2}{\pi}}\int_s^\infty e^{-u^2/2}\,du.$

Proof. These statements are consequences of Borell's lemma. For more details see 2.3 and 2.18, Marcus and Rosen 1992.

We now use the isomorphism theorem, Theorem 4, to get a sufficient condition for the local time process of the Lévy process $X$ to be unbounded in a neighborhood of a point in $R$. Without loss of generality we can take this point to be 0. Theorem 4, as written, applies only to recurrent processes. $X$ is recurrent if and only if $\int_0^1 1/\psi(\lambda)\,d\lambda = \infty$. This condition, which depends on $\psi$ at zero, is completely separate from the condition for the existence of the local time of $X$ in (1.2), which depends on $\psi$ at infinity. To begin we consider recurrent Lévy processes.

Lemma 3. Let $X$ be recurrent and let $\{L^x_t, (t,x) \in R_+ \times R\}$ be the local time process of $X$. Let $u_{\{0\}}(x,y)$ be as defined in (3.3) and let $\{\eta_x, x \in R\}$ be a real valued Gaussian process with mean zero and covariance $u_{\{0\}}(x,y)$. Suppose that there exists a countable dense subset $C \subset R$ for which
(4.8)  $\lim_{\delta \to 0} \sup_{x \in C \cap [0,\delta]} \eta_x = \infty \qquad a.s.\ P_\eta.$

Then
(4.9)  $\lim_{\delta \to 0} \sup_{x \in C \cap [0,\delta]} L^x_t = \infty \qquad \forall t > 0 \quad a.s.\ P^0.$

Proof. Fix $t, \delta > 0$ and let $T \subset C \cap [0,\delta]$ be a finite set. We first note that it follows from (4.5) that
(4.10)  $P_\eta\left(\sup_{x \in T} \frac{(\eta_x + \sqrt{2t})^2}{2} \ge \frac{(a - \sigma s + \sqrt{2t})^2}{2}\right) \ge 1 - \bar\Phi(s)$
where $a$ is the median of $\sup_{x \in T} \eta_x$ and $\sigma \stackrel{def}{=} \sup_{x \in T}(E\eta_x^2)^{1/2} = \sup_{x \in T} u^{1/2}_{\{0\}}(x,x)$. $\bar\Phi(s)$ is given in (4.7). By Theorem 4, under the measure $P = P^0 \times P_\eta$,
(4.11)  $\left\{L^x_{\tau(t)} + \frac{\eta_x^2}{2},\; x \in R\right\} \stackrel{law}{=} \left\{\frac{(\eta_x + \sqrt{2t})^2}{2},\; x \in R\right\}.$
Combining (4.10) and (4.11) we see that
(4.12)  $P\left(\sup_{x \in T}\left(L^x_{\tau(t)} + \frac{\eta_x^2}{2}\right) \ge \frac{(a - \sigma s + \sqrt{2t})^2}{2}\right) \ge 1 - \bar\Phi(s).$
By the triangle inequality
(4.13)  $P\left(\sup_{x \in T} L^x_{\tau(t)} \ge \frac{(a - \sigma s + \sqrt{2t})^2}{2} - \sup_{x \in T}\frac{\eta_x^2}{2}\right) \ge 1 - \bar\Phi(s).$
Also, by (4.6) applied to $\eta$ and $-\eta$,
(4.14)  $P_\eta\left(\sup_{x \in T} |\eta_x| \le a + \sigma s\right) \ge 1 - 2\bar\Phi(s).$
Therefore
(4.15)  $P\left(\sup_{x \in T} L^x_{\tau(t)} \ge \sqrt{2t}\,a - \sigma s(\sqrt{2t} + 2a) + t\right) \ge 1 - 3\bar\Phi(s).$
We can take $s$ arbitrarily large so that $\bar\Phi(s)$ is arbitrarily close to zero. We can next take $\delta$ arbitrarily small so that $\sigma$, and hence $\sigma s$, is as small as we like, in particular so that it is less than, say, $(a \wedge \sqrt{2t})/10$. Finally, we note that because of (4.8) we can take $T$ to be a large enough set so that $\sqrt{2t}\,a > M$ for any number $M$. Thus we see that
(4.16)  $\lim_{\delta \to 0} \sup_{x \in C \cap [0,\delta]} L^x_{\tau(t)} = \infty \qquad \forall t > 0 \quad a.s.\ P.$
$\tau(t)$ is right continuous and, by the definition of local time, $\tau(0) = 0$, $P^0$ a.s. Therefore, for any $t > 0$ and $\epsilon > 0$ we can find a $0 < t' < t$ so that
(4.17)  $P(\tau(t') < t) > 1 - \epsilon.$
Since the local time is increasing in $t$, it follows from (4.16) that
(4.18)  $\lim_{\delta \to 0} \sup_{x \in C \cap [0,\delta]} L^x_t = \infty$
on a set of $P$ measure greater than $1 - \epsilon$. This gives us (4.9).

The next theorem gives the necessity part of Theorem 1.

Theorem 5. Let $X$ be a symmetric Lévy process with 1-potential density $u^1(x,y)$. Let $G = \{G(y), y \in R\}$ be a mean zero Gaussian process with covariance $u^1(x,y)$. If $G$ is not continuous almost surely, then the local time of $X$ is unbounded on all intervals of $R_+$.

Proof. When $X$ is recurrent this follows immediately from Lemmas 1 and 3. Suppose that $X$ is transient. For $s > 0$ let $\nu[s,\infty)$ denote the Lévy measure of $X$. It is enough to consider $\nu$ on the half line because $X$ is symmetric. Set $\nu[s,\infty) = \nu_1[s,\infty) + \nu_2[s,\infty)$, where $\nu_1[s,\infty) = \nu[s,\infty) - \nu[1,\infty)$ for $0 < s < 1$ and $\nu_2[s,\infty) = \nu[s,\infty)$ for $1 \le s < \infty$. Let $X_1$ and $X_2$ be independent symmetric Lévy processes with Lévy measures $\nu_1$ and $\nu_2$. Clearly, $X \stackrel{law}{=} X_1 + X_2$. Consider these processes on $[0,T]$. $X_1$ is recurrent and $X_2$ is a pure jump process with the absolute value of all its jumps greater than or equal to one. $X_2$ is a process of bounded variation, see e.g. Lemma 3.2.3, Stroock. Hence it only has a finite number of jumps on $[0,T]$ almost surely. Conditioned on the number of jumps of $X_2$ being equal to $k$, the positions of these jumps are given by the values of $k$ independent uniform random variables on $[0,T]$. This shows that the time of the first jump of $X_2$ with absolute value greater than or equal to one is greater than zero with probability one and that $X = X_1$ up to this time. Let $\psi_1$ be the Lévy exponent corresponding to $\nu_1$. Let
(4.19)  $v_{\{0\}}(x,y) = \phi_1(x) + \phi_1(y) - \phi_1(x-y)$
where
(4.20)  $\phi_1(x) \stackrel{def}{=} \frac{1}{\pi}\int_0^\infty \frac{1-\cos\lambda x}{\psi_1(\lambda)}\,d\lambda, \qquad x \in R$
and let $\eta_1 = \{\eta_{1,x}, x \in R\}$ be a mean zero Gaussian process with covariance $v_{\{0\}}(x,y)$. When $G$ is not continuous almost surely, $\eta_1$ is unbounded almost surely on all intervals of $R$. This follows immediately from the proof of Lemma 1, since $\psi_1 \le \psi$, so that (4.4) holds with $\eta$ replaced by $\eta_1$. Given this, it follows from Lemma 3 that (4.9) holds for the local times of $X_1$. But then it also holds for $X$, since $X = X_1$ for a strictly positive amount of time, almost surely.

Proof of Corollary 1.
The Gaussian process $G$ in Theorem 1 is stationary. (1.7) is a necessary and sufficient condition for $G$ to be continuous; see Theorem 7.6 and Corollary 6.3, Chapter IV, Jain and Marcus 1978. It is clear from (1.4) that $G$ has spectral density $1/(1+\psi(\lambda))$. (1.8) and the statement following it are criteria for the continuity of a stationary Gaussian process in terms of its spectrum. See Theorems 3.1 and 3.2, Chapter IV, Jain and Marcus 1978.

Remark 1. In Proposition 1.7, Barlow 1988, Barlow reduces the proof of Theorem 1 to the recurrent case. Our proof of sufficiency works for transient as well as recurrent processes, so we only need to do this when considering the necessity portion of Theorem 1. This is the easier direction because (4.4) follows easily in this direction. Actually, in Eisenbaum, Kaspi, Marcus, Rosen and Shi 1999, Theorem 4 is given so that it holds for both transient and recurrent processes. Using this, Lemma 3, with essentially the same proof, works in both cases. When considering Lévy processes it seems simpler to consider only recurrent processes because the extension to transient processes is simple.
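The content of Lemma 2, which drives the argument of this section, is that $\sup_{x \in T} G(x)$ concentrates around its median at scale $\sigma$. A Monte Carlo sketch with a toy covariance (the covariance, grid, and seed are our own arbitrary choices; $\bar\Phi$ as in (4.7)):

```python
import numpy as np
from math import erfc

def borell_tail(s):
    """bar-Phi(s) = sqrt(2/pi) * int_s^inf e^{-u^2/2} du = erfc(s/sqrt(2))."""
    return erfc(s / np.sqrt(2))

rng = np.random.default_rng(1)
x = np.linspace(0, 1, 20)
cov = np.exp(-np.abs(x[:, None] - x[None, :]))   # toy stationary covariance, unit variances
L = np.linalg.cholesky(cov + 1e-12 * np.eye(20))
sup = (L @ rng.normal(size=(20, 20000))).max(axis=0)

a, sigma = np.median(sup), 1.0                   # sigma = sup_x (E G^2(x))^{1/2}
for s in (1.0, 2.0):
    # empirical upper-tail beyond a + sigma*s stays below the bound in (4.6)
    assert np.mean(sup > a + sigma * s) <= borell_tail(s)
```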

5. Sufficient condition for continuity

We are considering a symmetric Lévy process $X$ for which (1.2) is satisfied. Consequently we can associate with $X$ a local time process $L = \{L^x_t, (x,t) \in R \times R_+\}$. $X$ also has a 1-potential density which is denoted by $u^1(x,y)$. Let $\varphi$ be an exponential random variable, with mean one, that is independent of $X$. We begin the proof of the sufficiency part of Theorem 1 by showing that if the mean zero Gaussian process $G$, with covariance $u^1(x,y)$, defined in (1.5), is continuous almost surely, then $\{L^x_\varphi, x \in R\}$ is continuous in a very strong sense.

Lemma 4. Let $X$ be a symmetric Lévy process satisfying (1.2), with local time process $\{L^x_t, (x,t) \in R \times R_+\}$ and 1-potential density $u^1(x,y)$. Let $\varphi$ be an exponential random variable, with mean one, that is independent of $X$. Let $D \subset R$ be a countable dense set. When $u^1(x,y)$ is the covariance of a mean zero continuous Gaussian process,
(5.1)  $\lim_{\delta \to 0} E^y\left(\sup_{\substack{|x-z| \le \delta \\ x,z \in D \cap K}} |L^x_\varphi - L^z_\varphi|\right) = 0$
for any compact subset $K$ of $R$.

Proof. Let $\sup_\delta \stackrel{def}{=} \sup_{|x-z| \le \delta,\, x,z \in D \cap K}$ and $\sup_* \stackrel{def}{=} \sup_{x \in D \cap (K \cup (K+1))}$. It follows from (3.2) with $s = 1$ that
(5.2)  $E^y \sup_\delta |L^x_\varphi - L^z_\varphi| \le 2E\left(\sup_\delta \left|\frac{(G(x)+1)^2}{2} - \frac{(G(z)+1)^2}{2}\right|\right) + E\left(|G(y)| \sup_\delta \left|\frac{(G(x)+1)^2}{2} - \frac{(G(z)+1)^2}{2}\right|\right) \le C\left(1 + \left(E \sup_* |G(x)|^2\right)^{1/2} + \left(E \sup_* |G(x)|^4\right)^{1/2}\right)\left(E \sup_\delta |G(x) - G(z)|^2\right)^{1/2}.$
Because $G$ is continuous on $R$, all moments of its supremum norm over a compact subset of $R$ are finite, see e.g. Corollary 3.2, Ledoux and Talagrand 1991. Thus, by the dominated convergence theorem applied to the last line of (5.2), we obtain (5.1).

A local time process is a family of continuous additive functionals in time. When we say that a stochastic process $\hat L = \{\hat L^y_t, (t,y) \in R_+ \times R\}$ is a version of the local time of a Markov process $X$ we mean more than the traditional statement that one stochastic process is a version of the other. Besides this we also require that the version is itself a local time for $X$. That is, that for each $y \in R$, $\hat L^y$ is a local time for $X$ at $y$. Let us be even more precise. Let $L = \{L^y_t, (t,y) \in R_+ \times R\}$ be a local time process for $X$.
When we say that $\hat L = \{\hat L^y_t, (t,y) \in R_+ \times R\}$ is a jointly continuous version of $L$ we mean that for all compact sets $T \subset R_+$, $\hat L$ is continuous on $T \times R$ almost surely with respect to $P^x$, for all $x \in R$, and satisfies
(5.3)  $\hat L^y_t = L^y_t \quad \forall t \in R_+ \quad a.s.\ P^x$
for each $x, y \in R$. Following convention, we often say that a Markov process has a continuous local time when we mean that we can find a continuous version for the local time.

The next theorem gives the sufficiency part of Theorem 1.

Theorem 6. Let $X$ be a symmetric Lévy process satisfying (1.2), with local time process $L = \{L^y_t, (y,t) \in R \times R_+\}$ and 1-potential density $u^1(x,y)$. Let $G = \{G(y), y \in R\}$ be a mean zero Gaussian process with covariance $u^1(x,y)$. If $G$ is continuous almost surely, there is a version of $L$ which is jointly continuous on $R \times R_+$.

Proof. Recall that $\varphi$ is an exponential random variable with mean one. Let $W$ be the symmetric Markov process obtained by killing $X$ at time $\varphi$ and let $\bar L = \{\bar L^y_t, (t,y) \in R_+ \times R\}$ denote the local time process of $W$. By (2.7) and the material immediately following it we see that $\bar L^y_t = L^y_{t \wedge \varphi}$. Let $(\Omega, \mathcal F_t, P^x)$ denote the probability space of $W$. Consider the martingale
(5.4)  $A^y_t = E^x(\bar L^y_\infty \mid \mathcal F_t)$
and note that
(5.5)  $\bar L^y_\infty = \bar L^y_t + \bar L^y_\infty \circ \theta_t$
where $\theta_t$ is the shift operator on $(\Omega, \mathcal F_t, P^x)$. Therefore, by the Markov property,
$A^y_t = \bar L^y_t + E^x(\bar L^y_\infty \circ \theta_t \mid \mathcal F_t) = \bar L^y_t + E^{X_t}(\bar L^y_\infty)$.
Since $\bar L^y_\infty$ is just the local time of $X$ at $y$ evaluated at time $\varphi$, by (2.8) we have $E^x \bar L^y_\infty = u^1(x,y)$. Therefore we can write (5.4) as
(5.6)  $A^y_t = \bar L^y_t + u^1(X_t,y)$.
Note that $A^y_t$ is right continuous. Let $K$ be a compact subset of $R$, $D$ a countable dense subset of $R$ and $F$ a finite subset of $D$. We have
(5.7)  $P^x\left(\sup_t \sup_{\substack{|y-z| \le \delta \\ y,z \in F \cap K}} |\bar L^y_t - \bar L^z_t| \ge \epsilon\right) \le P^x\left(\sup_t \sup_{\substack{|y-z| \le \delta \\ y,z \in F \cap K}} |A^y_t - A^z_t| \ge \frac{\epsilon}{2}\right) + P^x\left(\sup_t \sup_{\substack{|y-z| \le \delta \\ y,z \in D \cap K}} |u^1(X_t,y) - u^1(X_t,z)| \ge \frac{\epsilon}{2}\right).$
Furthermore, since $\sup_{|y-z| \le \delta,\, y,z \in F \cap K} |A^y_t - A^z_t|$ is a right continuous, non-negative submartingale, we have that for any $\epsilon > 0$
(5.8)  $P^x\left(\sup_t \sup_{\substack{|y-z| \le \delta \\ y,z \in F \cap K}} |A^y_t - A^z_t| \ge \epsilon\right) \le \frac{1}{\epsilon}\,E^x\left(\sup_{\substack{|y-z| \le \delta \\ y,z \in F \cap K}} |\bar L^y_\infty - \bar L^z_\infty|\right) \le \frac{1}{\epsilon}\,E^x\left(\sup_{\substack{|y-z| \le \delta \\ y,z \in D \cap K}} |\bar L^y_\infty - \bar L^z_\infty|\right).$
Since, as mentioned above, $\bar L^y_\infty$ is just the local time of $X$ at $y$ evaluated at time $\varphi$, we see from (5.1) that this last term goes to zero as $\delta$ goes to zero. Consequently, for any $\epsilon, \epsilon' > 0$, we can choose a $\delta > 0$ such that
(5.9)  $P^x\left(\sup_t \sup_{\substack{|y-z| \le \delta \\ y,z \in F \cap K}} |A^y_t - A^z_t| \ge \epsilon\right) \le \epsilon'.$

Using this in (5.7) we see that
(5.10)  $P^x\left(\sup_t \sup_{\substack{|y-z| \le \delta \\ y,z \in F \cap K}} |\bar L^y_t - \bar L^z_t| \ge \epsilon\right) \le \epsilon' + P^x\left(\sup_t \sup_{\substack{|y-z| \le \delta \\ y,z \in D \cap K}} |u^1(X_t,y) - u^1(X_t,z)| \ge \frac{\epsilon}{2}\right).$
$u^1$ is uniformly continuous on $R$. Therefore, we can take $\delta$ small enough so that the last term in (5.10) is equal to zero. Then, taking the limit over a sequence of finite sets increasing to $D$, we see that for any $\epsilon$ and $\epsilon' > 0$ we can find a $\delta > 0$ such that
(5.11)  $P^x\left(\sup_t \sup_{\substack{|y-z| \le \delta \\ y,z \in D \cap K}} |\bar L^y_t - \bar L^z_t| \ge \epsilon\right) \le \epsilon'.$
It now follows by the Borel-Cantelli lemma that we can find a sequence $\{\delta_i\}$, $\delta_i > 0$, such that $\lim_{i \to \infty} \delta_i = 0$ and
(5.12)  $\sup_t \sup_{\substack{|y-z| \le \delta_i \\ y,z \in D \cap K}} |\bar L^y_t - \bar L^z_t| \le \frac{1}{2^i}$
for all $i \ge I(\omega)$, almost surely with respect to $P^x$. Fix $T < \infty$. We now show that $\bar L^y_t$ is uniformly continuous on $[0,T] \times (K \cap D)$, almost surely with respect to $P^x$. That is, we show that for each $\omega$ in a set of measure one, with respect to $P^x$, we can find an $I(\omega)$ such that for $i \ge I(\omega)$
(5.13)  $\sup_{\substack{|s-t| \le \bar\delta_i \\ s,t \in [0,T]}} \sup_{\substack{|y-z| \le \bar\delta_i \\ y,z \in K \cap D}} |\bar L^y_s - \bar L^z_t| \le \frac{1}{2^i}$
where $\{\bar\delta_i\}$ is a sequence of real numbers such that $\bar\delta_i > 0$ and $\lim_{i \to \infty} \bar\delta_i = 0$. To obtain (5.13), fix $\omega$ and assume that $i \ge I(\omega)$, so that (5.12) holds. Let $\{y_1,\ldots,y_n\}$ be a finite subset of $K \cap D$ such that
(5.14)  $K \subset \bigcup_{j=1}^n B(y_j, \delta_{i+2})$
where $B(y,\delta)$ is a ball of radius $\delta$ in the Euclidean metric with center $y$. For each $y_j$, $j = 1,\ldots,n$, $\bar L^{y_j}_t(\omega)$ is the local time of $W(\omega)$ at $y_j$. Hence it is continuous in $t$ and consequently uniformly continuous on $[0,T]$. Therefore, we can find a finite increasing sequence $t_1 = 0, t_2, \ldots, t_{k-1} < T, t_k \ge T$ such that $t_m - t_{m-1} = \delta'_{i+2}$ for all $m = 2,\ldots,k$, where $\delta'_{i+2}$ is chosen so that
(5.15)  $|\bar L^{y_j}_{t_{m+1}}(\omega) - \bar L^{y_j}_{t_{m-1}}(\omega)| \le \frac{1}{2^{i+2}} \qquad j = 1,\ldots,n;\ m = 2,\ldots,k-1.$
Let $s_1, s_2 \in [0,T]$ and assume that $s_1 \le s_2$ and that $s_2 - s_1 \le \delta'_{i+2}$. There exists a $2 \le m \le k-1$ such that $t_{m-1} \le s_1 \le s_2 \le t_{m+1}$. Assume also that $y, z \in K \cap D$ satisfy $|y-z| \le \delta_{i+2}$. We can find a $y_j$ such that $y \in B(y_j, \delta_{i+2})$. If $\bar L^y_{s_2}(\omega) \ge \bar L^z_{s_1}(\omega)$ we have
(5.16)  $|\bar L^y_{s_2}(\omega) - \bar L^z_{s_1}(\omega)| \le |\bar L^y_{t_{m+1}}(\omega) - \bar L^z_{t_{m-1}}(\omega)| \le |\bar L^y_{t_{m+1}}(\omega) - \bar L^{y_j}_{t_{m+1}}(\omega)| + |\bar L^{y_j}_{t_{m+1}}(\omega) - \bar L^{y_j}_{t_{m-1}}(\omega)| + |\bar L^{y_j}_{t_{m-1}}(\omega) - \bar L^y_{t_{m-1}}(\omega)| + |\bar L^y_{t_{m-1}}(\omega) - \bar L^z_{t_{m-1}}(\omega)|$

where we use the fact that $\bar L^y_t$ is non-decreasing in $t$. The second term to the right of the last inequality in (5.16) is less than or equal to $2^{-(i+2)}$ by (5.15). It follows from (5.12) that the other three terms are also less than or equal to $2^{-(i+2)}$, since $|y - y_j| \le \delta_{i+2}$ and $|y - z| \le \delta_{i+2}$. Taking $\bar\delta_i = \delta'_{i+2} \wedge \delta_{i+2}$ we get (5.13) on the larger set $[0,T'] \times (K \cap D)$ for some $T' \ge T$. Obviously this implies (5.13) as stated in the case when $\bar L^y_{s_2}(\omega) \ge \bar L^z_{s_1}(\omega)$. A similar argument gives (5.13) when $\bar L^y_{s_2}(\omega) \le \bar L^z_{s_1}(\omega)$. Thus (5.13) is established. Recall that $\bar L^y_t = L^y_{t \wedge \varphi}$. Consequently $L^y_t$ is uniformly continuous on $[0, T \wedge \varphi] \times (K \cap D)$, almost surely with respect to $P^x$. Therefore by Fubini's theorem we see that
(5.17)  $L^y_t$ is uniformly continuous on $[0,T] \times (K \cap D)$, $P^x$ a.s.
In what follows we say that a function is locally uniformly continuous on a measurable set in a locally compact metric space if it is uniformly continuous on all compact subsets of the set. Let $K_n$ be a sequence of compact subsets of $R$ such that $R = \bigcup_{n=1}^\infty K_n$. Let
(5.18)  $\hat\Omega = \{\omega \mid L^y_t(\omega)$ is locally uniformly continuous on $R_+ \times D\}$.
Let $\mathcal R$ denote the rational numbers. Then
(5.19)  $\hat\Omega^c = \bigcup_{s \in \mathcal R} \bigcup_{1 \le n < \infty} \{\omega \mid L^y_t(\omega)$ is not uniformly continuous on $[0,s] \times (K_n \cap D)\}$.
It follows from (5.17) that $P^x(\hat\Omega^c) = 0$ for all $x \in R$. Consequently
(5.20)  $P^x(\hat\Omega) = 1 \qquad \forall x \in R$.
We now construct a stochastic process $\hat L = \{\hat L^y_t, (t,y) \in R_+ \times R\}$ which is continuous and which is a version of $L$. For $\omega \in \hat\Omega$, let $\{\hat L^y_t(\omega), (t,y) \in R_+ \times R\}$ be the continuous extension of $\{L^y_t(\omega), (t,y) \in R_+ \times D\}$ to $R_+ \times R$. For $\omega \in \hat\Omega^c$ set
(5.21)  $\hat L^y_t \equiv 0 \qquad (t,y) \in R_+ \times R$.
$\{\hat L^y_t, (t,y) \in R_+ \times R\}$ is a well defined stochastic process which, clearly, is jointly continuous on $R_+ \times R$. We now show that $\hat L$ satisfies (5.3). To begin, note that we could just as well have obtained (5.17) with $D$ replaced by $D \cup \{y\}$, and hence obtained (5.20) with $D$ replaced by $D \cup \{y\}$ in the definition of $\hat\Omega$. Therefore if we take a sequence $\{y_i\}$ with $y_i \in D$ such that $\lim_{i \to \infty} y_i = y$ we have that
(5.22)  $\lim_{i \to \infty} L^{y_i}_t = L^y_t$ locally uniformly on $R_+$ a.s. $P^x$.
By the definition of $\hat L$ we also have
(5.23)  $\lim_{i \to \infty} L^{y_i}_t = \hat L^y_t$ locally uniformly on $R_+$ a.s. $P^x$.
This shows that
(5.24)  $\hat L^y_t = L^y_t \quad \forall t \quad a.s.\ P^x$
which is (5.3). This completes the proof of Theorem 6.

6. Kac's formula

We give a version of Kac's formula for the moment generating function of the local time process evaluated at certain random times $\xi$. The formula is used in Section 7, with $\xi$ taken to be an independent exponential time, in the proof of the first isomorphism theorem, and with $\xi$ taken to be $\tau(T)$, in the proof of the second isomorphism theorem.

Lemma 5. Let $X$ be a Lévy process with finite $\alpha$-potential density $u^\alpha(x,y)$. Let $\xi$ be a finite random time such that $V_t$, the process $X_t$ killed at $\xi$, is a Markov process with continuous zero potential density $v(x,y)$. Let $\Sigma$ be the matrix with elements $\Sigma_{i,j} = v(x_i,x_j)$, $i,j = 1,\ldots,n$, and let $x_1 = y$. Let $\Lambda$ be the matrix with elements $\Lambda_{i,j} = \lambda_i \delta_{i,j}$. For all $\lambda_1,\ldots,\lambda_n$ sufficiently small we have
$E^y \exp\left(\sum_{i=1}^n \lambda_i L^{x_i}_\xi\right) = \sum_{k=0}^\infty \sum_{i=1}^n \{(\Sigma\Lambda)^k\}_{1,i} = 1 + \sum_{i=1}^n \{(I - \Sigma\Lambda)^{-1}\Sigma\Lambda\}_{1,i}.$

Proof. Let $q_t(x,dy)$ denote the transition probabilities for $V$ and let $f$ be a real valued function. We have
(6.1)  $E^y\left(\int_0^\xi f(V_s)\,ds\right)^k = k!\,E^y \int_{0 \le s_1 \le \cdots \le s_k < \xi} \prod_{i=1}^k f(V_{s_i})\,ds_i = k! \int\cdots\int q_{s_1}(y,dy_1)\,q_{s_2-s_1}(y_1,dy_2)\cdots q_{s_k-s_{k-1}}(y_{k-1},dy_k)\prod_{i=1}^k f(y_i)\prod_{i=1}^k ds_i = k! \int\cdots\int v(y,y_1)\,v(y_1,y_2)\cdots v(y_{k-1},y_k)\prod_{i=1}^k f(y_i)\,dm(y_i).$
For $f$ in (6.1) take $f_\epsilon = \sum_{j=1}^n \lambda_j f_{\epsilon,x_j}$, where $f_{\epsilon,x}$ is an approximate $\delta$-function at $x$. By Theorem 2 we have that $\int_0^\xi f_\epsilon(X_s)\,ds \to \sum_{i=1}^n \lambda_i L^{x_i}_\xi$ a.s. The continuity of $v(x,y)$ together with (6.1) shows that $\left(\int_0^\xi f_\epsilon(X_s)\,ds\right)^k$ is uniformly integrable for any $k$. Hence
(6.2)  $E^y\left(\sum_{j=1}^n \lambda_j L^{x_j}_\xi\right)^k = k! \sum_{j_1,\ldots,j_k=1}^n v(y,x_{j_1})\lambda_{j_1}\,v(x_{j_1},x_{j_2})\lambda_{j_2}\cdots v(x_{j_{k-1}},x_{j_k})\lambda_{j_k}$
for all $k$. Actually, (6.2) holds even if $v$ is not continuous.
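Before completing the proof, note that the closed form in Lemma 5 is just the Neumann series $\sum_{k \ge 0}(\Sigma\Lambda)^k = (I - \Sigma\Lambda)^{-1}$, valid when the spectral radius of $\Sigma\Lambda$ is below one, which is the meaning of "$\lambda_i$ sufficiently small". A quick numerical check with toy matrices (random data, not from any particular process):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 4
A = rng.normal(size=(n, n))
Sigma = A @ A.T                      # symmetric positive definite, stands in for v(x_i, x_j)
lam = 0.01 * rng.uniform(size=n)     # lambda_i small enough that the series converges
M = Sigma @ np.diag(lam)

# Neumann series sum_{k>=0} M^k versus the closed form (I - M)^{-1}
series = sum(np.linalg.matrix_power(M, k) for k in range(60))
closed = np.linalg.inv(np.eye(n) - M)
assert np.allclose(series, closed, atol=1e-10)

# the row-sum identity used in the lemma: sum_i {sum_k M^k}_{1,i} = 1 + sum_i {(I-M)^{-1} M}_{1,i}
assert np.isclose(series[0].sum(), 1.0 + (closed @ M)[0].sum())
```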

Let $\beta = (v(y,x_1)\lambda_1,\ldots,v(y,x_n)\lambda_n)$ and let $\mathbf 1^t$ be the transpose of an $n$-dimensional vector with all of its elements equal to one. Note that $\sum_{j_k=1}^n v(x_{j_{k-1}},x_{j_k})\lambda_{j_k}$ is an $n \times 1$ matrix with entries $\{\Sigma\Lambda \mathbf 1^t\}_{j_{k-1}}$, $j_{k-1} = 1,\ldots,n$. Note also that $\Sigma\Lambda\mathbf 1^t$ is an $n \times 1$ matrix and
(6.3)  $\sum_{j_{k-1}=1}^n v(x_{j_{k-2}},x_{j_{k-1}})\lambda_{j_{k-1}}\{\Sigma\Lambda\mathbf 1^t\}_{j_{k-1}} = \{(\Sigma\Lambda)^2\mathbf 1^t\}_{j_{k-2}}.$
Iterating this relationship we get
(6.4)  $E^y\left(\sum_{j=1}^n \lambda_j L^{x_j}_\xi\right)^k = k!\,\beta(\Sigma\Lambda)^{k-1}\mathbf 1^t = k!\sum_{i=1}^n \{(\Sigma\Lambda)^k\}_{1,i}$
where we use the facts that $x_1 = y$ and that $\beta(\Sigma\Lambda)^{k-1}$ is an $n$-dimensional vector which is the same as the first row of $(\Sigma\Lambda)^k$. It follows from this that
(6.5)  $E^y \exp\left(\sum_{i=1}^n \lambda_i L^{x_i}_\xi\right) = \sum_{k=0}^\infty \sum_{i=1}^n \{(\Sigma\Lambda)^k\}_{1,i} = \sum_{i=1}^n \{I\}_{1,i} + \sum_{k=1}^\infty \sum_{i=1}^n \{(\Sigma\Lambda)^k\}_{1,i} = 1 + \sum_{i=1}^n \{(I - \Sigma\Lambda)^{-1}\Sigma\Lambda\}_{1,i}.$
This gives us the equations in the statement of the lemma.

7. Proofs of the Isomorphism Theorems

We begin with a routine calculation which we provide for the convenience of the reader.

Lemma 6. Let $\zeta = (\zeta_1,\ldots,\zeta_n)$ be a mean zero $n$-dimensional Gaussian random variable with covariance matrix $\Sigma$. Assume that $\Sigma$ is invertible. Let $\lambda = (\lambda_1,\ldots,\lambda_n)$ be an $n$-dimensional vector and $\Lambda$ an $n \times n$ diagonal matrix with $\lambda_j$ as its $j$-th diagonal entry. Let $u = (u_1,\ldots,u_n)$ be an $n$-dimensional vector. We can choose $\lambda_i > 0$, $i = 1,\ldots,n$, sufficiently small so that $\Sigma^{-1} - \Lambda$ is invertible and
(7.1)  $E \exp\left(\sum_{i=1}^n \lambda_i \frac{(\zeta_i + u_i)^2}{2}\right) = \frac{1}{\left(\det(I - \Sigma\Lambda)\right)^{1/2}} \exp\left(\frac{u\Lambda u^t}{2} + \frac{u\Lambda\tilde\Sigma\Lambda u^t}{2}\right)$
where $\tilde\Sigma \stackrel{def}{=} (\Sigma^{-1} - \Lambda)^{-1}$.

Proof. We write
(7.2)  $E \exp\left(\sum_{i=1}^n \lambda_i \frac{(\zeta_i + u_i)^2}{2}\right) = \exp\left(\frac{u\Lambda u^t}{2}\right) E \exp\left(\sum_{i=1}^n \lambda_i \zeta_i^2/2 + \lambda_i u_i \zeta_i\right)$
and
(7.3)  $E \exp\left(\sum_{i=1}^n \lambda_i \zeta_i^2/2 + \lambda_i u_i \zeta_i\right) = \frac{1}{(2\pi)^{n/2}(\det\Sigma)^{1/2}} \int \exp\left(u\Lambda\zeta^t - \frac{\zeta(\Sigma^{-1} - \Lambda)\zeta^t}{2}\right)d\zeta = \frac{(\det\tilde\Sigma)^{1/2}}{(\det\Sigma)^{1/2}}\,\tilde E\,e^{u\Lambda\xi^t}$
where $\xi$ is an $n$-dimensional Gaussian random variable with mean zero and covariance matrix $\tilde\Sigma$ and $\tilde E$ is expectation with respect to the probability measure of $\xi$. Clearly
(7.4)  $\tilde E\,e^{u\Lambda\xi^t} = \exp\left(\frac{u\Lambda\tilde\Sigma\Lambda u^t}{2}\right).$
Putting these together, and noting that $\det\tilde\Sigma/\det\Sigma = 1/\det(I - \Sigma\Lambda)$, gives us (7.1).

Proof of Theorem 3. To prove this theorem it suffices to show that
(7.5)  $E^{x_1} E_G \exp\left(\sum_{i=1}^n \lambda_i\left(L^{x_i}_\varphi + \frac{(G(x_i)+s)^2}{2}\right)\right) = E\left(\left(1 + \frac{G(x_1)}{s}\right)\exp\left(\sum_{i=1}^n \lambda_i\frac{(G(x_i)+s)^2}{2}\right)\right)$
for all $x_1,\ldots,x_n$, all $s > 0$ and all $\lambda_1,\ldots,\lambda_n$ sufficiently small. We write this as
(7.6)  $E^{x_1} \exp\left(\sum_{i=1}^n \lambda_i L^{x_i}_\varphi\right) = \frac{E\left(\left(1 + \frac{G(x_1)}{s}\right)\exp\left(\sum_{i=1}^n \lambda_i(G(x_i)+s)^2/2\right)\right)}{E \exp\left(\sum_{i=1}^n \lambda_i(G(x_i)+s)^2/2\right)}.$
As in Lemma 6, we consider the matrices $\Sigma$, $\Lambda$ and $\tilde\Sigma = (\Sigma^{-1} - \Lambda)^{-1}$, where $\Sigma_{i,j} = u^1(x_i,x_j)$ and $\Lambda_{i,j} = \lambda_i\delta_{i,j}$. Using Lemma 6 we note that
(7.7)  $\frac{\partial}{\partial s_1} E_G \exp\left(\sum_{i=1}^n \lambda_i\frac{(G(x_i)+s_i)^2}{2}\right) = \left(\lambda_1 s_1 + \lambda_1\sum_{j=1}^n \{\tilde\Sigma\}_{1,j}\lambda_j s_j\right) E_G \exp\left(\sum_{i=1}^n \lambda_i\frac{(G(x_i)+s_i)^2}{2}\right).$

Also, clearly

(7.8)   $\frac{\partial}{\partial s_1}\, E_G \exp\Big(\sum_{i=1}^n \lambda_i(G(x_i) + s_i)^2/2\Big) = s_1 \lambda_1\, E\Big(\Big(1 + \frac{G(x_1)}{s_1}\Big) \exp\Big(\sum_{i=1}^n \lambda_i(G(x_i) + s_i)^2/2\Big)\Big)$.

Thus we see that

(7.9)   $\frac{E\big(\big(1 + G(x_1)/s_1\big) \exp\big(\sum_{i=1}^n \lambda_i(G(x_i) + s_i)^2/2\big)\big)}{E\big(\exp\big(\sum_{i=1}^n \lambda_i(G(x_i) + s_i)^2/2\big)\big)} = 1 + \sum_{j=1}^n \widetilde{\Sigma}_{1,j}\lambda_j s_j/s_1$.

Consequently, taking $s_i = s$ for all $i$, the right hand side of 7.6 is equal to

$1 + \sum_{j=1}^n \widetilde{\Sigma}_{1,j}\lambda_j = 1 + \sum_{j=1}^n \{\widetilde{\Sigma}\Lambda\}_{1,j} = 1 + \sum_{j=1}^n \{(\Sigma^{-1} - \Lambda)^{-1}\Lambda\}_{1,j} = 1 + \sum_{j=1}^n \{(I - \Sigma\Lambda)^{-1}\Sigma\Lambda\}_{1,j}$.

By Lemma 5 applied with $\xi = \varphi$, so that $v(x, y) = u^1(x, y)$, the left hand side of 7.6 is also equal to this last expression. Thus the theorem is proved.

We now turn our attention to Theorem 4. It is proved in Eisenbaum, Kaspi, Marcus, Rosen and Shi 1999 in a more general framework; we give a simple direct proof here. The immediate approach to the proof of 3.8, in analogy with the proof of Theorem 3, would be to apply Kac's formula to the process which is $X$ stopped at the random time $\tau(t)$. However, this process is not a Markov process. To get a Markov process we consider $X$ stopped at $\tau(T)$, where $T = T_q$ is an independent exponential random variable with mean $1/q$. To be more specific, we consider

(7.11)   $Z_t = \begin{cases} X_t & \text{if } t < \tau(T) \\ \Delta & \text{otherwise,} \end{cases}$

where $\Delta$ is a cemetery state. To use Kac's formula we require that $Z$ has a potential density. This is given in the next lemma, which is proved in Section 8.

Lemma 7. $Z$ is a Markov process. When $X$ is recurrent, $Z$ has a 0-potential density $\widetilde{u}(x, y)$, given by

(7.12)   $\widetilde{u}(x, y) = u_{\{0\}}(x, y) + 1/q$,

where $u_{\{0\}}(x, y)$ is defined in 3.3.

Proof of Theorem 4. It suffices to show that

(7.13)   $E\, E_\eta \exp\Big(\sum_{i=1}^n \lambda_i\big(L^{x_i}_{\tau(t)} + \eta_{x_i}^2/2\big)\Big) = E_\eta \exp\Big(\sum_{i=1}^n \lambda_i\big(\eta_{x_i} + \sqrt{2t}\,\big)^2/2\Big)$

for all $x_1, \ldots, x_n$ and all $\lambda_1, \ldots, \lambda_n$ small. We write this as

(7.14)   $E \exp\Big(\sum_{i=1}^n \lambda_i L^{x_i}_{\tau(t)}\Big) = \frac{E_\eta \exp\big(\sum_{i=1}^n \lambda_i(\eta_{x_i} + \sqrt{2t})^2/2\big)}{E_\eta \exp\big(\sum_{i=1}^n \lambda_i \eta_{x_i}^2/2\big)}$.

We define the matrix $\Sigma$ with $\Sigma_{i,j} = u_{\{0\}}(x_i, x_j)$. As in Lemma 6, we use the notation $\Lambda$ and $\widetilde{\Sigma} = (\Sigma^{-1} - \Lambda)^{-1}$, where $\Lambda_{i,j} = \lambda_i\,\delta(i, j)$. It follows from Lemma 6 that

(7.15)   $\frac{E_\eta \exp\big(\sum_{i=1}^n \lambda_i(\eta_{x_i} + \sqrt{2t})^2/2\big)}{E_\eta \exp\big(\sum_{i=1}^n \lambda_i \eta_{x_i}^2/2\big)} = \exp\big(t\, 1\Lambda 1^t + t\, 1\Lambda\widetilde{\Sigma}\Lambda 1^t\big)$,

where $1 = (1, 1, \ldots, 1)$. Note that

(7.16)   $\Lambda + \Lambda\widetilde{\Sigma}\Lambda = \Lambda + \Lambda(\Sigma^{-1} - \Lambda)^{-1}\Lambda = \Lambda + \Lambda(I - \Sigma\Lambda)^{-1}\Sigma\Lambda = \Lambda\big(I + (I - \Sigma\Lambda)^{-1}\Sigma\Lambda\big) = \Lambda(I - \Sigma\Lambda)^{-1}$.

Let $K = (I - \Sigma\Lambda)^{-1}$. Then for $q > 1\Lambda K 1^t$ we have that

(7.17)   $\int_0^\infty q e^{-qt}\, \frac{E_\eta \exp\big(\sum_{i=1}^n \lambda_i(\eta_{x_i} + \sqrt{2t})^2/2\big)}{E_\eta \exp\big(\sum_{i=1}^n \lambda_i \eta_{x_i}^2/2\big)}\, dt = \int_0^\infty q e^{-qt}\, e^{(1\Lambda K 1^t)t}\, dt = \frac{q}{q - 1\Lambda K 1^t} = 1 + \sum_{j=1}^\infty \Big(\frac{1\Lambda K 1^t}{q}\Big)^j$.

On the other hand

(7.18)   $\int_0^\infty q e^{-qt}\, E \exp\Big(\sum_{i=1}^n \lambda_i L^{x_i}_{\tau(t)}\Big)\, dt = E \exp\Big(\sum_{i=1}^n \lambda_i L^{x_i}_{\tau(T)}\Big)$,

where $T$ is an independent exponential random variable with mean $1/q$. We now show that the right hand sides of 7.17 and 7.18 are equal. This shows that the Laplace transforms of the two sides of equation 7.14 are equal, which completes the proof of this theorem.

We use Lemma 5 with $\xi = \tau(T)$, so that $V_t$ is the Markov process $Z$ defined in 7.11. Without loss of generality we can assume that $x_1 = 0$. We have

(7.19)   $E \exp\Big(\sum_{i=1}^n \lambda_i L^{x_i}_{\tau(T)}\Big) = \sum_{k=0}^\infty \sum_{i=1}^n \{(C\Lambda)^k\}_{1,i}$,

where, by Lemma 7, $C_{i,j} = \Sigma_{i,j} + 1/q$ for all $i, j$. Thus we have $C = \Sigma + (1/q)\, 1^t 1$. Since $x_1 = 0$, $\Sigma_{1,j} = 0$ for all $j$. Consequently we have $C_{1,j} = 1/q$ for all $j$. Therefore we can write 7.19 as

(7.20)   $E \exp\Big(\sum_{i=1}^n \lambda_i L^{x_i}_{\tau(T)}\Big) = 1 + \frac{1}{q}\, 1\Lambda \sum_{k=1}^\infty \Big(\Big(\Sigma + \frac{1}{q}\, 1^t 1\Big)\Lambda\Big)^{k-1} 1^t = 1 + \frac{1}{q}\, 1\Lambda\Big(I - \Big(\Sigma + \frac{1}{q}\, 1^t 1\Big)\Lambda\Big)^{-1} 1^t$.
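Two of the matrix manipulations above are easy to test numerically: the identity 7.16, and the fact that the closed form in 7.20 equals the right hand side of 7.17. The sketch below does both with arbitrary test values; both are pure matrix identities, valid whenever the inverses involved exist.

```python
import numpy as np

# (7.16):  Lam + Lam SigTil Lam = Lam K,  with K = (I - Sigma Lam)^{-1}.
# (7.20) vs (7.17):
#   1 + (1/q) 1 Lam (I - C Lam)^{-1} 1^t = q / (q - 1 Lam K 1^t),
#   where C = Sigma + (1/q) 1^t 1.
rng = np.random.default_rng(3)
n, q = 4, 10.0
A = rng.standard_normal((n, n))
Sigma = A @ A.T + 3 * np.eye(n)               # arbitrary positive definite matrix
Lam = np.diag(rng.uniform(0.01, 0.04, n))     # small positive lambda_i
one = np.ones(n)
I = np.eye(n)

K = np.linalg.inv(I - Sigma @ Lam)
SigTil = np.linalg.inv(np.linalg.inv(Sigma) - Lam)
err_716 = np.abs(Lam + Lam @ SigTil @ Lam - Lam @ K).max()

C = Sigma + np.outer(one, one) / q
lhs = 1 + (one @ Lam @ np.linalg.inv(I - C @ Lam) @ one) / q
rhs = q / (q - one @ Lam @ K @ one)
```

Both `err_716` and `lhs - rhs` are zero up to rounding error.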

Note that

(7.21)   $\Big(I - \Big(\Sigma + \frac{1}{q}\, 1^t 1\Big)\Lambda\Big)^{-1} = \Big((I - \Sigma\Lambda)\Big(I - \frac{1}{q}\, K 1^t 1\Lambda\Big)\Big)^{-1} = \Big(I - \frac{1}{q}\, K 1^t 1\Lambda\Big)^{-1} K$.

Using this in 7.20 we see that

$E \exp\Big(\sum_{i=1}^n \lambda_i L^{x_i}_{\tau(T)}\Big) = 1 + \frac{1}{q}\, 1\Lambda \sum_{j=0}^\infty \Big(\frac{1}{q}\, K 1^t 1\Lambda\Big)^j K 1^t = 1 + \sum_{j=1}^\infty \Big(\frac{1\Lambda K 1^t}{q}\Big)^j$,

which is the same as the right hand side of 7.17.

8. Proof of Lemma 7

Let

(8.1)   $Q_t f(x) = E^x\big(f(X_t);\; t < \tau(T)\big)$.

The Markov property for $Z_t$ follows from the equations

(8.2)
$Q_{t+s} f(x) = E^x\big(f(X_{t+s});\; t + s < \tau(T)\big)$
$= E^x\big(f(X_{t+s})\, E_T(t + s < \tau(T))\big)$
$= E^x\big(f(X_{t+s})\, E_T(T > L^0_{t+s})\big)$
$= E^x\big(f(X_{t+s})\, e^{-q L^0_{t+s}}\big)$
$= E^x\big(f(X_{t+s})\, e^{-q L^0_s \circ \theta_t}\, e^{-q L^0_t}\big)$
$= E^x\big(E^{X_t}\big(f(X_s)\, e^{-q L^0_s}\big)\, e^{-q L^0_t}\big)$
$= E^x\big(E^{X_t}\big(f(X_s);\; s < \tau(T)\big);\; t < \tau(T)\big) = Q_t Q_s f(x)$,

where, clearly, $E_T$ denotes expectation with respect to $T$.

Using the Markov property at the stopping time $\tau(T)$ we see that for any bounded continuous function $f$ we have

(8.3)
$\int u^\alpha(x, y) f(y)\, dy = E^x \int_0^\infty e^{-\alpha t} f(X_t)\, dt$
$= E^x \int_0^{\tau(T)} e^{-\alpha t} f(X_t)\, dt + E^x \int_{\tau(T)}^\infty e^{-\alpha t} f(X_t)\, dt$
$= E^x \int_0^\infty e^{-\alpha t} f(Z_t)\, dt + E^x\big(e^{-\alpha \tau(T)}\big)\, E^0 \int_0^\infty e^{-\alpha t} f(X_t)\, dt$
$= E^x \int_0^\infty e^{-\alpha t} f(Z_t)\, dt + E^x\big(e^{-\alpha \tau(T)}\big) \int u^\alpha(0, y) f(y)\, dy$.

The first term on the right hand side of 8.3 is, by definition, $\widetilde{U}^\alpha f(x)$, where $\widetilde{U}^\alpha$ is the $\alpha$-potential of $Z$. To say that $Z$ has an $\alpha$-potential density means that we can find a function $\widetilde{u}^\alpha(x, y)$ such that $\widetilde{U}^\alpha f(x) = \int \widetilde{u}^\alpha(x, y) f(y)\, dy$. Thus we see from 8.3 that $Z$ has the $\alpha$-potential density

(8.4)   $\widetilde{u}^\alpha(x, y) = u^\alpha(x, y) - E^x\big(e^{-\alpha \tau(T)}\big)\, u^\alpha(0, y) = u^\alpha(x, y) - E^x\big(e^{-\alpha T_0}\big)\, E^0\big(e^{-\alpha \tau(T)}\big)\, u^\alpha(0, y)$

20  Local times of symmetric Lévy processes

where the second equality comes by writing $\tau(T) = T_0 + \tau(T) \circ \theta_{T_0}$, with $T_0 = \inf\{t > 0 : X_t = 0\}$, and applying the Markov property at $T_0$. We can rewrite the second line of 8.4 as

(8.5)
$\widetilde{u}^\alpha(x, y) = u^\alpha(x, y) - E^x\big(e^{-\alpha T_0}\big)\, u^\alpha(0, y) + \big(1 - E^0\big(e^{-\alpha \tau(T)}\big)\big)\, E^x\big(e^{-\alpha T_0}\big)\, u^\alpha(0, y)$
$= u^\alpha(x, y) - E^x\big(e^{-\alpha T_0}\big)\, u^\alpha(0, y) + \big(1 - E^0\big(e^{-\alpha \tau(T)}\big)\big)\, E^x\big(e^{-\alpha T_0}\big)\, E^y\big(e^{-\alpha T_0}\big)\, u^\alpha(0, 0)$,

where the last equality comes from using the identity

(8.6)   $E^y\big(e^{-\alpha T_0}\big) = u^\alpha(y, 0)/u^\alpha(0, 0)$.

To make the proof more complete, the simple proof of this identity is given at the end of this section.

Evaluating 8.4 with $x = y = 0$ gives us an expression for $E^0\big(e^{-\alpha \tau(T)}\big)$. Using it in 8.5 shows that

(8.7)   $\widetilde{u}^\alpha(x, y) = u^\alpha(x, y) - E^x\big(e^{-\alpha T_0}\big)\, u^\alpha(0, y) + E^x\big(e^{-\alpha T_0}\big)\, E^y\big(e^{-\alpha T_0}\big)\, \widetilde{u}^\alpha(0, 0)$.

Using 8.6, see also 1.4 and 3.4, we note that

(8.8)
$\lim_{\alpha \to 0}\Big(u^\alpha(x, y) - E^x\big(e^{-\alpha T_0}\big)\, u^\alpha(0, y)\Big) = \lim_{\alpha \to 0}\Big(u^\alpha(x - y) - \frac{u^\alpha(x)\, u^\alpha(y)}{u^\alpha(0)}\Big)$
$= \lim_{\alpha \to 0}\Big(\big(u^\alpha(0) - u^\alpha(x)\big) + \big(u^\alpha(0) - u^\alpha(y)\big)\, E^x\big(e^{-\alpha T_0}\big) - \big(u^\alpha(0) - u^\alpha(x - y)\big)\Big)$
$= \phi(x) + \phi(y) - \phi(x - y) = u_{\{0\}}(x, y)$.

We now prove that

(8.9)   $\lim_{\alpha \to 0} \widetilde{u}^\alpha(0, 0) = 1/q$.

This implies that $\lim_{\alpha \to 0} \widetilde{u}^\alpha(x, y) = \widetilde{u}(x, y)$ exists and that 7.12 holds. To obtain 8.9 we first note that 2.3 together with the Markov property imply that

$u^\alpha(x, y) = E^x \int_0^{\tau(T)} e^{-\alpha t}\, dL^y_t + E^x \int_{\tau(T)}^\infty e^{-\alpha t}\, dL^y_t$
$= E^x \int_0^{\tau(T)} e^{-\alpha t}\, dL^y_t + E^x\big(e^{-\alpha \tau(T)}\big)\, E^{X_{\tau(T)}} \int_0^\infty e^{-\alpha t}\, dL^y_t$
$= E^x \int_0^{\tau(T)} e^{-\alpha t}\, dL^y_t + E^x\big(e^{-\alpha \tau(T)}\big)\, u^\alpha(0, y)$.

Comparing this with 8.4 shows that

(8.10)   $E^x \int_0^{\tau(T)} e^{-\alpha s}\, dL^y_s = \widetilde{u}^\alpha(x, y)$.

Set $x = y = 0$ in 8.10 and take the limit as $\alpha$ goes to zero. Since $E^0\big(L^0_{\tau(T)}\big) = E(T) = 1/q$ we get 8.9. This completes the proof of Lemma 7.
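The regrouping in the middle step of 8.8 is elementary but easy to misread, so here is a quick scalar check. Write $a = u^\alpha(0)$, $b = u^\alpha(x)$, $c = u^\alpha(y)$, $d = u^\alpha(x - y)$, so that $E^x(e^{-\alpha T_0}) = b/a$ by 8.6; the numerical values below are arbitrary.

```python
import numpy as np

# Check:  d - b*c/a  ==  (a - b) + (a - c)*(b/a) - (a - d)
rng = np.random.default_rng(4)
a, b, c, d = rng.uniform(1.0, 2.0, 4)
lhs = d - b * c / a
rhs = (a - b) + (a - c) * (b / a) - (a - d)
```

Expanding the right hand side, the $a$ terms and the $b$ terms cancel, leaving exactly $d - bc/a$.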

We now prove 8.6. Let $T_n$ be the first hitting time of $[-1/n, 1/n]$ and let $f_{1/n, 0}$ be as defined in 1.3. We have

(8.11)
$\int u^\alpha(y, v)\, f_{1/n, 0}(v)\, dv = E^y \int_0^\infty e^{-\alpha t} f_{1/n, 0}(X_t)\, dt$
$= E^y \int_{T_n}^\infty e^{-\alpha t} f_{1/n, 0}(X_t)\, dt$
$= E^y\Big(e^{-\alpha T_n}\, E^{X_{T_n}} \int_0^\infty e^{-\alpha t} f_{1/n, 0}(X_t)\, dt\Big)$
$= E^y\Big(e^{-\alpha T_n} \int u^\alpha(X_{T_n}, v)\, f_{1/n, 0}(v)\, dv\Big)$.

8.6 follows from the continuity of $u^\alpha(y, v)$ and the fact that $T_n$ increases to $T_0$ as $n$ goes to infinity, almost surely.

References

M. Barlow (1988). Necessary and sufficient conditions for the continuity of local time of Lévy processes. Annals of Probability, volume 16.
R. F. Bass, N. Eisenbaum and Z. Shi (1999). The most visited sites of symmetric stable processes. Preprint.
J. Bertoin (1996). Lévy Processes. Cambridge University Press, Cambridge.
R. Blumenthal and R. Getoor (1968). Markov Processes and Potential Theory. Academic Press, New York.
N. Eisenbaum (19??). Une version sans conditionnement du théorème d'isomorphisme de Dynkin. ??
N. Eisenbaum, H. Kaspi, M. B. Marcus, J. Rosen and Z. Shi (1999). A Ray-Knight theorem for symmetric Markov processes. Preprint.
N. C. Jain and M. B. Marcus (1978). Continuity of subgaussian processes. In Probability in Banach Spaces, volume 4 of Advances in Probability. Marcel Dekker, New York.
M. Ledoux and M. Talagrand (1991). Probability in Banach Spaces. Springer-Verlag, New York.
M. B. Marcus (1999). The most visited sites of certain Lévy processes. Preprint.
M. B. Marcus and J. Rosen (1992). Sample path properties of the local times of strongly symmetric Markov processes via Gaussian processes. Annals of Probability, volume 20.
M. B. Marcus and L. A. Shepp (1972). Sample behavior of Gaussian processes. In Proc. of the Sixth Berkeley Symposium on Math. Statist. and Probab., volume 2. Univ. California Press, Berkeley, CA.
D. Stroock (1993). Probability Theory. Cambridge University Press, Cambridge.

Department of Mathematics, The City College of CUNY, New York, NY 10031
Department of Mathematics, College of Staten Island, CUNY, Staten Island, NY 10314


More information

WEYL S LEMMA, ONE OF MANY. Daniel W. Stroock

WEYL S LEMMA, ONE OF MANY. Daniel W. Stroock WEYL S LEMMA, ONE OF MANY Daniel W Stroock Abstract This note is a brief, and somewhat biased, account of the evolution of what people working in PDE s call Weyl s Lemma about the regularity of solutions

More information

Ergodic Theorems. Samy Tindel. Purdue University. Probability Theory 2 - MA 539. Taken from Probability: Theory and examples by R.

Ergodic Theorems. Samy Tindel. Purdue University. Probability Theory 2 - MA 539. Taken from Probability: Theory and examples by R. Ergodic Theorems Samy Tindel Purdue University Probability Theory 2 - MA 539 Taken from Probability: Theory and examples by R. Durrett Samy T. Ergodic theorems Probability Theory 1 / 92 Outline 1 Definitions

More information

HOMEWORK 2 - RIEMANNIAN GEOMETRY. 1. Problems In what follows (M, g) will always denote a Riemannian manifold with a Levi-Civita connection.

HOMEWORK 2 - RIEMANNIAN GEOMETRY. 1. Problems In what follows (M, g) will always denote a Riemannian manifold with a Levi-Civita connection. HOMEWORK 2 - RIEMANNIAN GEOMETRY ANDRÉ NEVES 1. Problems In what follows (M, g will always denote a Riemannian manifold with a Levi-Civita connection. 1 Let X, Y, Z be vector fields on M so that X(p Z(p

More information

Average theorem, Restriction theorem and Strichartz estimates

Average theorem, Restriction theorem and Strichartz estimates Average theorem, Restriction theorem and trichartz estimates 2 August 27 Abstract We provide the details of the proof of the average theorem and the restriction theorem. Emphasis has been placed on the

More information

Markov processes Course note 2. Martingale problems, recurrence properties of discrete time chains.

Markov processes Course note 2. Martingale problems, recurrence properties of discrete time chains. Institute for Applied Mathematics WS17/18 Massimiliano Gubinelli Markov processes Course note 2. Martingale problems, recurrence properties of discrete time chains. [version 1, 2017.11.1] We introduce

More information

The main results about probability measures are the following two facts:

The main results about probability measures are the following two facts: Chapter 2 Probability measures The main results about probability measures are the following two facts: Theorem 2.1 (extension). If P is a (continuous) probability measure on a field F 0 then it has a

More information

Elements of Positive Definite Kernel and Reproducing Kernel Hilbert Space

Elements of Positive Definite Kernel and Reproducing Kernel Hilbert Space Elements of Positive Definite Kernel and Reproducing Kernel Hilbert Space Statistical Inference with Reproducing Kernel Hilbert Space Kenji Fukumizu Institute of Statistical Mathematics, ROIS Department

More information

Loop measures without transition probabilities

Loop measures without transition probabilities Loop measures without transition probabilities Pat Fitzsimmons, Yves Le Jan, Jay Rosen To cite this version: Pat Fitzsimmons, Yves Le Jan, Jay Rosen. Loop measures without transition probabilities. 215.

More information

Division of the Humanities and Social Sciences. Supergradients. KC Border Fall 2001 v ::15.45

Division of the Humanities and Social Sciences. Supergradients. KC Border Fall 2001 v ::15.45 Division of the Humanities and Social Sciences Supergradients KC Border Fall 2001 1 The supergradient of a concave function There is a useful way to characterize the concavity of differentiable functions.

More information

is a Borel subset of S Θ for each c R (Bertsekas and Shreve, 1978, Proposition 7.36) This always holds in practical applications.

is a Borel subset of S Θ for each c R (Bertsekas and Shreve, 1978, Proposition 7.36) This always holds in practical applications. Stat 811 Lecture Notes The Wald Consistency Theorem Charles J. Geyer April 9, 01 1 Analyticity Assumptions Let { f θ : θ Θ } be a family of subprobability densities 1 with respect to a measure µ on a measurable

More information

Spectral Gap and Concentration for Some Spherically Symmetric Probability Measures

Spectral Gap and Concentration for Some Spherically Symmetric Probability Measures Spectral Gap and Concentration for Some Spherically Symmetric Probability Measures S.G. Bobkov School of Mathematics, University of Minnesota, 127 Vincent Hall, 26 Church St. S.E., Minneapolis, MN 55455,

More information

SMSTC (2007/08) Probability.

SMSTC (2007/08) Probability. SMSTC (27/8) Probability www.smstc.ac.uk Contents 12 Markov chains in continuous time 12 1 12.1 Markov property and the Kolmogorov equations.................... 12 2 12.1.1 Finite state space.................................

More information

Poisson random measure: motivation

Poisson random measure: motivation : motivation The Lévy measure provides the expected number of jumps by time unit, i.e. in a time interval of the form: [t, t + 1], and of a certain size Example: ν([1, )) is the expected number of jumps

More information

25.1 Ergodicity and Metric Transitivity

25.1 Ergodicity and Metric Transitivity Chapter 25 Ergodicity This lecture explains what it means for a process to be ergodic or metrically transitive, gives a few characterizes of these properties (especially for AMS processes), and deduces

More information

n E(X t T n = lim X s Tn = X s

n E(X t T n = lim X s Tn = X s Stochastic Calculus Example sheet - Lent 15 Michael Tehranchi Problem 1. Let X be a local martingale. Prove that X is a uniformly integrable martingale if and only X is of class D. Solution 1. If If direction:

More information

ON THE SHAPE OF THE GROUND STATE EIGENFUNCTION FOR STABLE PROCESSES

ON THE SHAPE OF THE GROUND STATE EIGENFUNCTION FOR STABLE PROCESSES ON THE SHAPE OF THE GROUND STATE EIGENFUNCTION FOR STABLE PROCESSES RODRIGO BAÑUELOS, TADEUSZ KULCZYCKI, AND PEDRO J. MÉNDEZ-HERNÁNDEZ Abstract. We prove that the ground state eigenfunction for symmetric

More information

Gaussian Random Fields: Geometric Properties and Extremes

Gaussian Random Fields: Geometric Properties and Extremes Gaussian Random Fields: Geometric Properties and Extremes Yimin Xiao Michigan State University Outline Lecture 1: Gaussian random fields and their regularity Lecture 2: Hausdorff dimension results and

More information

Universal examples. Chapter The Bernoulli process

Universal examples. Chapter The Bernoulli process Chapter 1 Universal examples 1.1 The Bernoulli process First description: Bernoulli random variables Y i for i = 1, 2, 3,... independent with P [Y i = 1] = p and P [Y i = ] = 1 p. Second description: Binomial

More information

9. Banach algebras and C -algebras

9. Banach algebras and C -algebras matkt@imf.au.dk Institut for Matematiske Fag Det Naturvidenskabelige Fakultet Aarhus Universitet September 2005 We read in W. Rudin: Functional Analysis Based on parts of Chapter 10 and parts of Chapter

More information

ON MARKOV AND KOLMOGOROV MATRICES AND THEIR RELATIONSHIP WITH ANALYTIC OPERATORS. N. Katilova (Received February 2004)

ON MARKOV AND KOLMOGOROV MATRICES AND THEIR RELATIONSHIP WITH ANALYTIC OPERATORS. N. Katilova (Received February 2004) NEW ZEALAND JOURNAL OF MATHEMATICS Volume 34 (2005), 43 60 ON MARKOV AND KOLMOGOROV MATRICES AND THEIR RELATIONSHIP WITH ANALYTIC OPERATORS N. Katilova (Received February 2004) Abstract. In this article,

More information

Introduction to self-similar growth-fragmentations

Introduction to self-similar growth-fragmentations Introduction to self-similar growth-fragmentations Quan Shi CIMAT, 11-15 December, 2017 Quan Shi Growth-Fragmentations CIMAT, 11-15 December, 2017 1 / 34 Literature Jean Bertoin, Compensated fragmentation

More information