Weak convergence of stochastic integrals and differential equations II: Infinite dimensional case [1]


Thomas G. Kurtz [2]
Departments of Mathematics and Statistics
University of Wisconsin - Madison
Madison, WI

Philip E. Protter [3]
Departments of Mathematics and Statistics
Purdue University
West Lafayette, IN

February 6, 1996
Minor corrections, March 13, 2004

[1] Lecture notes for the CIME school in probability. These notes first appeared in Probabilistic models for nonlinear partial differential equations (Montecatini Terme, 1995), Lecture Notes in Math., 1627, Springer, Berlin, 1996.
[2] Research supported in part by NSF grants DMS and DMS.
[3] Research supported in part by NSF grant INT and NSA grant MDR H-249.

Contents

1 Introduction
2 Semimartingale random measures
  2.1 Moment estimates for martingale random measures
  2.2 A convergence theorem for counting measures
3 H#-semimartingales
  3.1 Finite dimensional approximations
  3.2 Integral estimates
  3.3 H#-semimartingale integrals
  3.4 Predictable integrands
  3.5 Examples
4 Convergence of stochastic integrals
5 Convergence in infinite dimensional spaces
  5.1 Integrals with infinite-dimensional range
  5.2 Convergence theorem
  5.3 Verification of standardness
  5.4 Equicontinuity of stochastic integrals
6 Consequences of the uniform tightness condition
7 Stochastic differential equations
  7.1 Uniqueness for stochastic differential equations
  7.2 Sequences of stochastic differential equations

8 Markov processes
9 Infinite systems
  9.1 Systems driven by Poisson random measures
  9.2 Uniqueness for general systems
  9.3 Convergence of sequences of systems
10 McKean-Vlasov limits
11 Stochastic partial differential equations
  11.1 Estimates for stochastic convolutions
  11.2 Eigenvector expansions
  11.3 Particle representations
12 Examples
  12.1 Averaging
  12.2 Diffusion approximations for Markov chains
  12.3 Feller diffusion approximation for Wright-Fisher model
  12.4 Limit theorems for jump processes
  12.5 An Euler scheme
References

1 Introduction

In Part I, we discussed weak limit theorems for stochastic integrals, with the principal result being the following (cf. Part I, Section 7):

Theorem 1.1 For each $n = 1, 2, \ldots$, let $(X_n, Y_n)$ be an $\{\mathcal{F}^n_t\}$-adapted process with sample paths in $D_{\mathbb{M}^{km} \times \mathbb{R}^m}[0,\infty)$ such that $Y_n$ is an $\{\mathcal{F}^n_t\}$-semimartingale. Let $Y_n = M_n + A_n$ be a decomposition of $Y_n$ into an $\{\mathcal{F}^n_t\}$-martingale $M_n$ and a finite variation process $A_n$. Suppose that one of the following two conditions holds:

UT (Uniform tightness.) For $\mathcal{S}_n$ the collection of piecewise constant, $\{\mathcal{F}^n_t\}$-adapted processes,
$$\mathcal{H}_t = \bigcup_{n=1}^{\infty}\Big\{ \int_0^t Z(s-)\,dY_n(s) : Z \in \mathcal{S}_n, \ \sup_{s \le t}|Z(s)| \le 1 \Big\}$$
is stochastically bounded.

UCV (Uniformly controlled variations.) $\{T_t(A_n)\}$ is stochastically bounded for each $t > 0$, and for each $\alpha > 0$ there exist stopping times $\{\tau^{\alpha}_n\}$ such that $P\{\tau^{\alpha}_n \le \alpha\} \le \alpha^{-1}$ and
$$\sup_n E\big[[M_n]_{t \wedge \tau^{\alpha}_n}\big] < \infty$$
for each $t > 0$.

If $(X_n, Y_n) \Rightarrow (X, Y)$ in the Skorohod topology on $D_{\mathbb{M}^{km}\times\mathbb{R}^m}[0,\infty)$, then $Y$ is an $\{\mathcal{F}_t\}$-semimartingale for a filtration $\{\mathcal{F}_t\}$ with respect to which $X$ is adapted, and $(X_n, Y_n, X_n \cdot Y_n) \Rightarrow (X, Y, X \cdot Y)$ in $D_{\mathbb{M}^{km}\times\mathbb{R}^m\times\mathbb{R}^k}[0,\infty)$. If $(X_n, Y_n) \to (X, Y)$ in probability, then $(X_n, Y_n, X_n \cdot Y_n) \to (X, Y, X \cdot Y)$ in probability.

In this part, we consider the analogous results for stochastic integrals with respect to infinite dimensional semimartingales. We are primarily concerned with integrals with respect to semimartingale random measures, in particular, worthy martingale measures as developed by Walsh (1986). We discover, however, that the class of semimartingale random measures is not closed under the natural notion of weak limit, unlike the class of finite dimensional semimartingales (Part I, Theorem 7.3). Consequently, we work with a larger class of infinite dimensional semimartingales which we call $H^\#$-semimartingales. This class includes semimartingale random measures, Banach space-valued martingales, and cylindrical Brownian motion.

A summary of results on semimartingale random measures is given in Section 2. The definitions and results come primarily from Walsh (1986). $H^\#$-semimartingales are introduced in Section 3. The stochastic integral is defined through approximation by finite dimensional integrands. The basic assumption on the semimartingale is essentially the good integrator condition that defines a semimartingale in the sense of Section 1 of Part I. This approach allows us to obtain the basic stochastic integral convergence theorems in Sections 4 and 5 as an application of Theorem 1.1.

Previous general results on convergence of infinite dimensional stochastic integrals include work of Walsh (1986), Chapter 7, Cho (1994, 1995), and Jakubowski (1995). Walsh and Cho consider martingale random measures as distribution-valued processes converging weakly in $D_{\mathcal{S}'(\mathbb{R}^d)}[0,\infty)$. Walsh assumes all processes are defined on the same sample space (the canonical sample space) and requires a strong form of convergence for the integrands. Cho requires $(X_n, M_n) \Rightarrow (X, M)$ in $D_{L\times\mathcal{S}'(\mathbb{R}^d)}[0,\infty)$ where $L$ is an appropriate function space. Both Walsh and Cho work under assumptions analogous to the UCV assumption. Jakubowski gives results for Hilbert space-valued semimartingales under the analogue of the UT condition. Our results are given under the UT condition, although estimates of the type used by Walsh and Cho are needed to verify that particular sequences satisfy the UT condition. Section 6 contains a variety of technical results on the uniform tightness condition.

Section 7 includes a uniqueness result for stochastic differential equations satisfying a Lipschitz condition, with a proof that seems to be new even in the finite dimensional setting. As an example, a spin-flip model is obtained as a solution of a stochastic differential equation in sequence space. Convergence results for stochastic differential equations are given based on the results of Sections 4 and 5. Section 8 briefly discusses stochastic differential equations for Markov processes and introduces $L^1$-estimates of Graham that are useful in proving existence and uniqueness, particularly for infinite systems. Infinite systems are the topic of Section 9. Existence and uniqueness results, similar to results of Shiga and Shimizu (1980) for systems of diffusions, are given for very general kinds of equations. Results of McKean-Vlasov type are given in Section 10 using the results of Section 9.
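Before turning to the infinite dimensional theory, it may help to see Theorem 1.1 in the simplest scalar case $k = m = 1$. The following sketch is our own numerical illustration, not part of the original notes: $Y_n$ is a scaled $\pm 1$ random walk (which satisfies both UT and UCV), $X_n = Y_n$, and the left-endpoint sums defining $X_n \cdot Y_n(1)$ converge in distribution to $\int_0^1 W\,dW = (W(1)^2 - 1)/2$.

```python
import numpy as np

rng = np.random.default_rng(0)

def integral_sample(n, T=1.0):
    """One sample of X_n . Y_n(T), where Y_n is a scaled +/-1 random walk
    (which satisfies UT and UCV) and X_n = Y_n evaluated at left endpoints."""
    steps = rng.choice([-1.0, 1.0], size=int(n * T)) / np.sqrt(n)
    y = np.concatenate(([0.0], np.cumsum(steps)))
    return np.sum(y[:-1] * np.diff(y))     # left-endpoint (Ito-type) sums

n, reps = 2000, 5000
samples = np.array([integral_sample(n) for _ in range(reps)])
# Limit: X . Y(1) = int_0^1 W dW = (W(1)^2 - 1)/2, with mean 0 and variance 1/2.
print("empirical mean, variance:", samples.mean(), samples.var())
print("limit     mean, variance:", 0.0, 0.5)
```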

Stochastic partial differential equations are a natural area of application for the results discussed here. We have not yet developed these applications, but Section 11 summarizes some of the ideas that seem most useful in obtaining convergence theorems. Section 12 includes several simple examples illustrating the methods of the paper. Diffusion approximations, an averaging theorem, limit theorems for jump processes, and error analysis for a simulation scheme are described.

Acknowledgements. Many people provided insight and important references during the writing of this paper. In particular, we would like to thank Don Dawson, Carl Graham, Peter Kotelenez, Jim Kuelbs, Luciano Tubaro, and John Walsh.

2 Semimartingale random measures.

Let $(U, r_U)$ be a complete, separable metric space, let $U_1 \subset U_2 \subset \cdots$ be a sequence of sets $\{U_m\} \subset \mathcal{B}(U)$ satisfying $\cup_m U_m = U$, and let $\mathcal{A} = \{B \in \mathcal{B}(U) : B \subset U_m, \text{ some } m\}$. Frequently, $U_m = U$ and $\mathcal{A} = \mathcal{B}(U)$. Let $(\Omega, \mathcal{F}, P)$ be a complete probability space and $\{\mathcal{F}_t\}$ a complete, right continuous filtration, and let $Y$ be a stochastic process indexed by $\mathcal{A} \times [0,\infty)$ such that

- For each $B \in \mathcal{A}$, $Y(B, \cdot)$ is an $\{\mathcal{F}_t\}$-semimartingale with $Y(B, 0) = 0$.
- For each $t \ge 0$, each $m$, and each disjoint sequence $\{B_i\} \subset \mathcal{B}(U)$, $Y(U_m \cap \cup_{i=1}^{\infty} B_i, t) = \sum_{i=1}^{\infty} Y(U_m \cap B_i, t)$ a.s.

Then $Y$ is an $\{\mathcal{F}_t\}$-semimartingale random measure. We will say that $Y$ is standard if $Y = M + V$ where $V(B, t) = \widetilde{V}(B \times [0, t])$ for a random $\sigma$-finite signed measure $\widetilde{V}$ on $U \times [0,\infty)$ satisfying $|\widetilde{V}|(U_m \times [0, t]) < \infty$ a.s. for each $m = 1, 2, \ldots$ and $t \ge 0$, and $M$ is a worthy martingale random measure in the sense of Walsh (1986), that is, $M(A, \cdot)$ is locally square integrable for each $A \in \mathcal{A}$, and there exists a (positive) random measure $K$ on $U \times U \times [0,\infty)$ such that
$$|\langle M(A), M(B)\rangle_{t+s} - \langle M(A), M(B)\rangle_t| \le K(A \times B \times (t, t+s]), \qquad A, B \in \mathcal{A},$$
and $K(U_m \times U_m \times [0, t]) < \infty$ a.s. for each $t > 0$. $K$ is called the dominating measure. (Merzbach and Zakai (1993) define a slightly more general notion of quasi-worthy martingale measure which could be employed here. See also the definition of conditionally worthy in Example 12.5.) Note that if $U$ is finite, then every semimartingale random measure is standard.

$M$ is orthogonal if $\langle M(A), M(B)\rangle_t = 0$ for $A \cap B = \emptyset$. If $M$ is orthogonal, then $\pi(A \times (0, t]) \equiv \langle M(A)\rangle_t$ extends to a random measure on $U \times [0,\infty)$, and if we define $K(\Gamma) = \pi(f^{-1}(\Gamma))$ for $f(u, t) = (u, u, t)$, then $K$ is a dominating measure for $M$. In particular, if $M$ is orthogonal, then $M$ is worthy.

If $\varphi$ is a simple function on $U$, that is, $\varphi = \sum_{i=1}^m c_i I_{B_i}$ for disjoint $\{B_i\} \subset \mathcal{A}$ and $\{c_i\} \subset \mathbb{R}$, then we can define
$$Y(\varphi, t) = \sum_{i=1}^m c_i Y(B_i, t).$$

5 If Y is standard and E[K(U U [, t])] < for each t >, then E[M(ϕ, t) 2 ] E[ ϕ(u)ϕ(v) K(du dv (, t])] ϕ 2 E[K(U U (, t])] (2.1) U U so Y (ϕ, t) can be extended uniquely at least to all ϕ B(U) for which the integral against Ṽ is defined, that is Y (ϕ, t) = M(ϕ, t) + ϕ(u)ṽ (du [, t]) where M(ϕ, t) is defined as the limit of M(ϕ n, t) for simple ϕ n using (2.1). More generally, for each j =, 1,..., let {B j i } A be disjoint, let = t < t 1 < t 2 <, and let C j i be an F tj -measurable random variable. Define U X(u, t) = i,j C j i I B j i (u)i (tj,t j+1 ](t) (2.2) and define X Y by X Y (t) = C j i (Y (Bj i, t j+1 t) Y (B j i, t j t)). Again, we can estimate the martingale part [ ] E[(X M(t)) 2 ] = E C j i 1 C j i 2 ( M(B j i 1 ), M(B j i 2 ) tj+1 t M(B j i 1 ), M(B j i 2 ) tj t) j i 1,i 2 [ ] E X(u, s)x(v, s) K(du dv ds), (2.3) U U [,t] and if E[K(U U [, t])] < for each t >, we can extend the definition of X M (and hence of X Y ) to all bounded, Ṽ -integrable processes that can be approximated by simple processes of the form (2.2). An alternative approach to defining X M is to first consider X(t, u) = m ξ i (t)i Bi (u), (2.4) i=1 for disjoint {B i } A and cadlag, adapted ξ i. Set X Y (t) = m i=1 t ξ i (s )dy (B i, s), where the integrals are ordinary semimartingale integrals. We then have E[(X M(t)) 2 ] = E[ i,j E[ t ξ i (s )ξ j (s )d M(B i, ), M(B j, ) s ] (2.5) U U (,t] X(s, u)x(s, v) K(du dv ds)], 5

which, of course, is the same as (2.3). By Doob's inequality, we have
$$E[\sup_{s\le t}(X \cdot M(s))^2] \le 4E\Big[\int_{U\times U\times(0,t]} |X(s,u)X(s,v)|\,K(du\times dv\times ds)\Big]. \qquad (2.6)$$
For future reference, we also note the simple corresponding inequalities for $V$,
$$E[\sup_{s\le t}|X \cdot V(s)|] \le E\Big[\int_{U\times(0,t]} |X(s,u)|\,|\widetilde{V}|(du\times ds)\Big] \qquad (2.7)$$
and
$$E[\sup_{s\le t}(X \cdot V(s))^2] \le E\Big[|\widetilde{V}|(U\times[0,t])\int_{U\times(0,t]} |X(s,u)|^2\,|\widetilde{V}|(du\times ds)\Big]. \qquad (2.8)$$

Let $\mathcal{P}$ be the $\sigma$-algebra of subsets of $\Omega \times U \times [0,\infty)$ generated by sets of the form $A \times B \times (t, t+s]$ for $t, s \ge 0$, $A \in \mathcal{F}_t$, and $B \in \mathcal{B}(U)$. $\mathcal{P}$ is the $\sigma$-algebra of predictable sets. If $E[K(U\times U\times[0,t])] < \infty$ and $|\widetilde{V}|(U\times[0,t]) < \infty$ a.s. for each $t > 0$, then the bounded $\mathcal{P}$-measurable functions give the class of bounded processes $X$ for which $X \cdot Y$ is defined. Of course, the estimate (2.3) also allows extension to unbounded $X$ for which the right side is finite, provided $X$ is also almost surely integrable with respect to $\widetilde{V}$.

Note that if $K$ satisfies
$$K(A\times B\times[0,t]) = K_1(A\times B\times[0,t]) + K_2(A\times B\times[0,t]) \qquad (2.9)$$
where $K_1$ is (identified with) a random measure on $U\times[0,\infty)$ and we define
$$\hat{K}(A\times[0,t]) = K_1(A\times[0,t]) + \tfrac{1}{2}\big(K_2(A\times U\times[0,t]) + K_2(U\times A\times[0,t])\big), \qquad (2.10)$$
then
$$E[\sup_{s\le t}(X \cdot M(s))^2] \le 4E\Big[\int_{U\times[0,t]} |X(u,s)|^2\,\hat{K}(du\times ds)\Big]. \qquad (2.11)$$

For future reference, if $|\widetilde{V}|(U_m\times[0,\cdot])$ is locally in $L^1$ for each $m$, that is, there is a sequence of stopping times $\tau_n \to \infty$ such that $E[|\widetilde{V}|(U_m\times[0,t\wedge\tau_n])] < \infty$ for each $t > 0$, then we can define $\hat{V}(A\times[0,\cdot])$ to be the predictable projection of $|\widetilde{V}|(A\times[0,\cdot])$, and we have
$$E\Big[\int_{U\times[0,t]} X(s,u)\,|\widetilde{V}|(du\times ds)\Big] = E\Big[\int_{U\times[0,t]} X(s,u)\,\hat{V}(du\times ds)\Big] \qquad (2.12)$$
for all positive, cadlag, adapted $X$ (allowing $\infty = \infty$).

Example 2.1 Gaussian white noise.

The canonical example of a martingale random measure is given by the Gaussian process $W$ indexed by $\mathcal{A} = \{A \in \mathcal{B}(U)\times\mathcal{B}([0,\infty)) : \mu\times m(A) < \infty\}$ and satisfying $E[W(A)] = 0$ and $E[W(A)W(B)] = \mu\times m(A\cap B)$, where $m$ denotes Lebesgue measure and $\mu$ is a $\sigma$-finite measure on $U$. If we define $M(A, t) = W(A\times[0,t])$ for $A \in \mathcal{B}(U)$, $\mu(A) < \infty$, and $t \ge 0$, then $M$ is an orthogonal martingale random measure with $K(A\times B\times[0,t]) = t\,\mu(A\cap B)$, and for fixed $A$, $M(A,\cdot)$ is a Brownian motion.
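A quick way to see Example 2.1 concretely is to discretize. The sketch below is our illustration (not from the text): it takes $U = [0,1]$, $\mu =$ Lebesgue measure, places independent $N(0, du\,dt)$ masses on grid cells, and checks the dominating-measure identity $E[M(A,t)M(B,t)] = t\,\mu(A\cap B)$ by Monte Carlo.

```python
import numpy as np

rng = np.random.default_rng(1)

# Discretized Gaussian white noise on U x [0,T], U = [0,1], mu = Lebesgue measure:
# independent N(0, du*dt) masses on each grid cell.
nu_cells, nt_cells, T = 50, 100, 1.0
du, dt = 1.0 / nu_cells, T / nt_cells

def sample_M(A, B, t):
    """One sample of (M(A,t), M(B,t)) = (W(A x [0,t]), W(B x [0,t])) for intervals A, B."""
    masses = rng.normal(0.0, np.sqrt(du * dt), size=(nu_cells, nt_cells))
    j = int(t / dt)
    M = lambda lo, hi: masses[int(lo * nu_cells):int(hi * nu_cells), :j].sum()
    return M(*A), M(*B)

A, B, t, reps = (0.0, 0.5), (0.25, 1.0), 0.8, 3000
vals = np.array([sample_M(A, B, t) for _ in range(reps)])
print("E[M(A,t)M(B,t)] ~", np.mean(vals[:, 0] * vals[:, 1]), "  t*mu(A∩B) =", t * 0.25)
print("Var[M(A,t)]     ~", np.var(vals[:, 0]),               "  t*mu(A)   =", t * 0.5)
```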

Example 2.2 Poisson random measures.

Let $\nu$ be a $\sigma$-finite measure on $U$ and let $h(u)$ be in $L^2(\nu)$. Let $N$ be a Poisson random measure on $U\times[0,\infty)$ with mean measure $\nu\times m$, that is, for $A \in \mathcal{B}(U)\times\mathcal{B}([0,\infty))$, $N(A)$ has a Poisson distribution with expectation $\nu\times m(A)$, and $N(A)$ and $N(B)$ are independent if $A\cap B = \emptyset$. For $A \in \mathcal{B}(U)$ satisfying $\nu(A) < \infty$, define
$$M(A,t) = \int_A h(u)\big(N(du\times[0,t]) - \nu(du)t\big).$$
Noting that $E[M(A,t)^2] = t\int_A h(u)^2\,\nu(du)$ and that $\{M(A_i,t)\}$ are independent for disjoint $\{A_i\}$, we can extend $M$ to all of $\mathcal{B}(U)$ by addition.

Suppose $Z$ is a process with independent increments with generator
$$Af(z) = \int_{\mathbb{R}}\big(f(z+u) - f(z) - uI_{\{|u|\le 1\}}f'(z)\big)\,\nu(du).$$
Then $\nu$ must satisfy $\int_{\mathbb{R}} u^2\wedge 1\,\nu(du) < \infty$. (See, for example, Feller (1971).) Let $U = \mathbb{R}$, and let $N$ be the Poisson random measure with mean measure $\nu\times m$. Define
$$M(A,t) = \int_A uI_{\{|u|\le 1\}}\big(N(du\times[0,t]) - \nu(du)t\big), \qquad V(A,t) = \int_A uI_{\{|u|>1\}}N(du\times[0,t]),$$
and $Y(A,t) = M(A,t) + V(A,t)$. Then we can represent $Z$ by $Z(t) = Y(\mathbb{R},t)$.

Consider a sequence of Poisson random measures $N_n$ with mean measures $n\nu\times m$. Define
$$M_n(A,t) = \frac{1}{\sqrt{n}}\int_A h(u)\big(N_n(du\times[0,t]) - nt\,\nu(du)\big). \qquad (2.13)$$
Then $M_n$ is an orthogonal martingale random measure with
$$\langle M_n(A), M_n(B)\rangle_t = t\int_{A\cap B} h(u)^2\,\nu(du) = K(A\times B\times[0,t]).$$
By the central limit theorem, $M_n$ converges (in the sense of finite dimensional distributions) to the Gaussian white noise martingale random measure outlined in Example 2.1 with $\mu(A) = \int_A h(u)^2\,\nu(du)$.

Example 2.3 Empirical measures.

Let $\xi_1, \xi_2, \ldots$ be iid $U$-valued random variables with distribution $\mu$, and define
$$M_n(A,t) = \frac{1}{\sqrt{n}}\sum_{i=1}^{[nt]}\big(I_A(\xi_i) - \mu(A)\big). \qquad (2.14)$$
Then $\langle M_n(A), M_n(B)\rangle_t = \frac{[nt]}{n}\big(\mu(A\cap B) - \mu(A)\mu(B)\big)$. Note that $\bar{K}(A\times B) = \mu(A\cap B) + \mu(A)\mu(B)$ extends to a measure on $U\times U$ and $K_n(A\times B\times(0,t]) = \frac{[nt]}{n}\bar{K}(A\times B)$ extends to a measure on $U\times U\times[0,\infty)$ which will be a dominating measure for $M_n$. Of course, $M_n$ converges to a Gaussian martingale random measure with conditional covariation $\langle M(A), M(B)\rangle_t = t\big(\mu(A\cap B) - \mu(A)\mu(B)\big)$ and dominating measure $\bar{K}\times m$.
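The central limit behaviour of (2.13) can be checked numerically. The sketch below is our illustration with $U = [0,1]$, $\nu =$ Lebesgue measure, and $h(u) = u$ (all choices ours); it samples $M_n(A,t)$ directly from a Poisson random measure with mean measure $n\nu\times m$ and compares its variance with the white-noise limit $t\,\mu(A) = t\int_A h^2\,d\nu$.

```python
import numpy as np

rng = np.random.default_rng(2)

# U = [0,1], nu = Lebesgue measure, h(u) = u; N_n has mean measure n * (nu x m).
h = lambda u: u
t, A = 1.0, (0.2, 0.7)
int_h_A  = (A[1]**2 - A[0]**2) / 2      # int_A h   d(nu)
int_h2_A = (A[1]**3 - A[0]**3) / 3      # int_A h^2 d(nu)

def Mn_sample(n):
    """One sample of M_n(A,t) as in (2.13)."""
    num = rng.poisson(n * t)            # number of points of N_n in U x [0,t]
    u = rng.uniform(0.0, 1.0, size=num) # their spatial coordinates
    keep = (u >= A[0]) & (u < A[1])
    return (h(u[keep]).sum() - n * t * int_h_A) / np.sqrt(n)

n, reps = 500, 4000
samples = np.array([Mn_sample(n) for _ in range(reps)])
# Limit (Example 2.1): mean-zero Gaussian with variance t * mu(A) = t * int_A h^2 d(nu).
print("empirical mean, variance:", samples.mean(), samples.var())
print("limit     mean, variance:", 0.0, t * int_h2_A)
```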

8 2.1 Moment estimates for martingale random measures. Suppose that M is an orthogonal martingale measure. If A, B A are disjoint, then [M(A), M(B)] t = and in particular, M(A, ) and M(B, ) have a.s. no simultaneous discontinuities. It follows that Π(A [, t]) = [M(A)] t determines a random measure on U [, ) as does Π k (A [, t]) = (M(A, s) M(A, s )) k (2.15) for even k > 2. For odd k > 2, (2.15) determines a random signed measure. For X of the form (2.4), it is easy to check that [X M] t = X 2 (s, u)π(du ds), U [,t] and setting Z = X M and letting Z(s) = Z(s) Z(s ), we have Z k (t) = = t U [,t] t kz k 1 k(k 1) (s )dz(s) + Z k 2 (s)d[z] s 2 + ( ( ) k Z k (s) Z k (s ) kz k 1 (s ) Z(s) )Z k 2 (s ) Z(s) 2 2 ( ) k kz k 1 (s )X(s, u)m(du ds) + Z k 2 (s)x 2 (s, u)π(du ds) U [,t] 2 k ( ) k + Z k j (s )X j (s, u)π j (du ds) (2.16) j U [,t] j=3 and can be extended to more general X under appropriate conditions. Since M(A, ) is locally square integrable, [M(A)] t is locally in L 1, that is, there exists a sequence of stopping times {τ n } such that τ n and E[[M(A)] t τn ] < for each t > and each n. In addition, [M(A)] t M(A) t = Π(A [, t]) π(a [, t]) is a local martingale. It follows from (2.16) and L 2 approximation that E[(X M(t)) 2 ] = E[ X 2 (s, u)π(du ds)] = E[ X 2 (s, u)π(du ds)] (2.17) U [,t] U [,t] whenever either the second or third expression is finite. (Note that the left side may be finite with the other two expressions infinite.) We would like to obtain similar expressions for higher moments. A discrete time version of the following lemma can be found in Burkholder (1971), Theorem 2.2. The continuous time version was given by Lenglart, Lepingle, and Pratelli (198) (see Dellacherie, Maisonneuve and Meyer (1992), page 326). The proof we give here is from Ichikawa (1986), Theorem 1. 8

9 Lemma 2.4 For < p 2 there exists a constant C p such that for any locally square integrable martingale M with Meyer process M and any stopping time τ E[sup M(s) p ] C p E[ M p/2 τ ] s τ Proof. For p = 2 the result is an immediate consequence of Doob s inequality. Let < p < 2. For x >, let σ x = inf{t : M t > x 2 }. Since σ x is predictable there exists an increasing sequence of stopping times σx n σ x. Noting that M σ n x x 2, we have and letting n, we have Using the identity P {sup M(s) p > x} P {σx n τ} + E[ M τ σ n] x s τ x 2 P {σx n τ} + E[x2 M τ ], x 2 P {sup M(s) p > x} P { M τ x 2 } + E[x2 M τ ]. (2.18) s τ x 2 E[x 2 X 2 ]px p 3 dx = 2 2 p E[ X p ], the lemma follows by multiplying both sides of (2.18) by px p 1 and integrating. Assume that for 2 < k k and A A, Π k (A [, ]) is locally in L 1 and there exist predictable random measures π k and ˆπ k such that and Π k (A [, t]) π k (A [, t]) (2.19) Π k (A [, t]) ˆπ k (A [, t]) (2.2) are local martingales. Of course, for k even, π k = ˆπ k. We define π 2 = π If M is Gaussian white noise as in Example 2.1, then Π k = π k = for k > 2. If M is as in Example 2.2, then Π k (A [, t]) = h k (u)n(du [, t]), and A π k (A [, t]) = t A h k (u)ν(du), ˆπ k (A [, t]) = t h k (u)ν(du). A Theorem 2.5 Let k 2, and suppose that for 2 j k [ ( ) ] k H k,j E X(s, u) j j ˆπ j (du ds) <. (2.21) U [,t] 9

10 Then E[sup Z(s) k ] < and E[Z k (t)] = k j=2 ( ) [ ] k E Z k j (s )X j (s, u)π j (du ds) j U [,t] (2.22) Proof. For k = 2, the result follows by (2.17). Note that if (2.21) holds, then it holds with k replaced by k < k. Consequently, proceeding by induction, suppose that E[sup Z(s) k 1 ] <. Since M k (t) = t is a local square integrable martingale with M k t = by Lemma 2.4, for any stopping time τ E[ sup τ s t kz k 1 (r )dz(r) k 2 Z 2k 2 (s )d Z s, kz k 1 (r )dz(r) ] C 1 ke[ sup Z(s) 2k k Z t τ ], s<t τ and letting τ c = inf{t : Z(t) > c}, it follows that E[ sup Z(s) k ] C 1 ke[ sup Z(s) 2k k Z t ] (2.23) τ c s<t τ c + k ( ) k E[ sup Z(s) k j j s<t τ c j=2 which by the Hölder inequality implies E[ sup τ c Z(s) k ] U [,t] X(s, u) j ˆπ j (du ds)] ( ) k C 1 ke[ sup Z(s) k ] k 1 k E[ X(s, u) π 2 (du ds) ] k (2.24) s<t τ c U [,t] k + j=2 ( k j ) E[ sup s<t τ c Z(s) k ] k j j=2 k E[ ( U [,t] ) k X(s, u) j j j ˆπ j (du ds) ] k where the right side is finite by (2.21) and the fact that E[sup s<t τc Z(s) k ] c k. The inequality then implies that E[sup s<t τc Z(s) k ] K k, where K k is the largest number satisfying K C 1 kk k 1 1 k ( ) k H k k k,2 + K k j j k H k k,j j. (2.25) (2.22) then follows from (2.16). 1

11 2.2 A convergence theorem for counting measures. For n = 1, 2,..., let N n be a random counting measure on U [, ) with the property that N n (A {t}) 1 for all A B(U) and t. Let ν be a σ-finite measure on U, and let F 1 F 2 be closed sets such that ν(f k ) <, ν( F k ) =, and ν(a) = lim k ν(a F k ) for each A B(U). Let Λ n be a random measure also satisfying Λ n (A {t}) 1. Suppose that Λ n and N n are adapted to {F n t } in the sense that N n (A [, t]) and Λ n (A [, t]) are F n t -measurable for all A B(U) and t, and suppose that N n (A F k [, t]) Λ n (A F k [, t]) is a local {F n t }-martingale for each A B(U) and k = 1, 2,.... Theorem 2.6 Let N be the Poisson random measure on U [, ) with mean measure ν m. Suppose that for each k = 1, 2,..., f C(U), and t lim n f(u)λ n (du [, t]) = t F k f(u)ν(du) F k in probability. Then N n N in the sense that for any A 1,..., A m such that for each i, A i F k for some k and ν( A i ) =, (N n (A 1 [, ]),..., N n (A m [, ])) (N(A 1 [, ]),..., N(A m [, ])). It also follows that for f C(U), f(u)n n (du [, ]) f(u)n(du [, ]). F k F k Proof. The result is essentially a theorem of Brown (1978). Alternatively, assuming m i=1a i F k, let τ n = inf{t : Λ n (F k [, t]) > tν(f k ) + 1}. Note that τ n and that N n (A i [, t τ n ]) Λ n (A i [, t τ n ]) is an {F n t }-martingale. For T > and δ >, let γt n (δ) = sup Λ n (F k (t τ n, (t + δ) τ n ]) t T and observe that lim δ lim sup E[γT n (δ)] =. It follows that for t T E[N n (A i (t τ n, (t + δ) τ n ] F n t ] E[γ n T (δ) F n t ] and the relative compactness of {(N n (A 1 [, ]),..., N n (A m [, ]))} follows from Theorem of Ethier and Kurtz (1986). The theorem then follows from Theorem of Ethier and Kurtz (1986). 11
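A toy instance of the hypotheses of Theorem 2.6 (our own, with hypothetical choices): $U = [0,\infty)$, $\nu =$ Lebesgue measure, and $N_n$ placing one point at $(\xi_i, i/n)$ with $\xi_i$ uniform on $[0,n]$, so that $\Lambda_n(A\times[0,t]) = \lfloor nt\rfloor\,|A\cap[0,n]|/n \to t\,|A|$ deterministically. The counts $N_n(A\times[0,t])$ should then be approximately Poisson, as the sketch below checks.

```python
import math
import numpy as np

rng = np.random.default_rng(3)

# U = [0, inf), nu = Lebesgue measure.  N_n puts one point at (xi_i, i/n) with
# xi_i ~ Uniform[0, n], so Lambda_n(A x [0,t]) = floor(nt) |A ∩ [0,n]| / n -> t|A|.
def Nn_count(n, A, t):
    """N_n(A x [0,t]) for a spatial interval A = [a, b)."""
    xi = rng.uniform(0.0, n, size=int(n * t))
    return int(np.sum((xi >= A[0]) & (xi < A[1])))

A, t, n, reps = (0.0, 2.0), 3.0, 400, 5000
counts = np.array([Nn_count(n, A, t) for _ in range(reps)])
lam = t * (A[1] - A[0])   # limiting Poisson mean t * |A| = 6
print("empirical mean, variance:", counts.mean(), counts.var(), "  Poisson mean:", lam)
print("P{N_n(A x [0,t]) = 6} ~", np.mean(counts == 6),
      "  Poisson pmf at 6:", math.exp(-lam) * lam**6 / math.factorial(6))
```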

12 In addition to the conditions of Theorem 2.6, we assume that there exists h C(U) with h 1 such that (1 h(u))ν(du) < and for f C(U), U f(u)(1 h(u))λ n (du [, t]) t f(u)(1 h(u))ν(du) U U in probability. Let D be a linear space of functions on U such that for each k and each ϕ D Mn(ϕ, k t) = ϕ(u)h(u)(n n (du [, t]) Λ n (du [, t])) F k ˆM n(ϕ, k t) = ϕ(u)h(u)(n n (du [, t]) Λ n (du [, t])) and M n (ϕ, t) = are local {F n t }-martingales and F c k U ϕ(u)h(u)(n n (du [, t]) Λ n (du [, t])) U ϕ 2 (u)h 2 (u)ν(du) <. (2.26) Theorem 2.7 Suppose that there exists α : D [, ) and a sequence m n such that for every sequence k n with k n m n and each t, lim E[sup n kn ˆM n (ϕ, s) ˆM kn n (ϕ, s ) ] = (2.27) and in probability. Then for ϕ 1,..., ϕ m D, for [ ˆM kn n (ϕ)] t α(ϕ)t (2.28) (M n (ϕ 1, t),..., M n (ϕ m, t)) (M(ϕ 1, t),..., M(ϕ m, t)) M(ϕ, t) = W (ϕ, t) + U ϕ(u)h(u)ñ(du [, t]) where W is a continuous (in t), mean zero, Gaussian processes satisfying E[W (ϕ 1, s)w (ϕ 2, t)] = s t 1 2 (α(ϕ 1 + ϕ 2 ) α(ϕ 1 ) α(ϕ 2 )), Ñ(A [, t]) = N(A [, t]) tν(a), and W is independent of N. Remark 2.8 Note that the linearity of D and (2.28) implies [ kn ˆM n (ϕ 1 ), ˆM kn n (ϕ 2 )] t t 2 (α(ϕ 1 + ϕ 2 ) α(ϕ 1 ) α(ϕ 2 )). (2.29) 12

13 (2.27) and (2.29) verify the conditions of the martingale central limit theorem (see, for example, Ethier and Kurtz (1986), Theorem 7.1.4) and it follows that ( kn ˆM n (ϕ 1, ),..., ˆM kn n (ϕ m, )) (W (ϕ 1, ),..., W (ϕ m, )). Suppose that A k n(ϕ, t) has the property that ( ˆM k n(ϕ, t)) 2 A k n(ϕ, t) is a local {F n t }-martingale for each ϕ D and that for m n and k n as above, we replace (2.27) and (2.28) by the requirements that lim E[sup n kn ˆM n (ϕ, s) ˆM kn n (ϕ, s ) 2 ] = (2.3) and lim E[sup A kn n (ϕ, s) A kn n (ϕ, s ) ] = n A kn n (ϕ, t) α(ϕ)t (2.31) in probability. Then the conclusion of the theorem remains valid. In particular, (2.3) and (2.31) verify alternative conditions for the martingale central limit theorem. Note that if Λ n (A F k [, ]) is continuous for each A B(U) and k = 1, 2,..., then we can take A k n(ϕ, t) = ϕ 2 (u)h 2 (u)λ n (du [, t]). (2.32) Proof. For simplicity, let m = 1. For each fixed k, Theorem 2.6 implies Mn(ϕ, k ) ϕ(u)h(u)ñ(du [, ]) and it follows from (2.26) that ϕ(u)h(u)ñ(du [, t]) = lim k F k F c k F k U ϕ(u)h(u)ñ(du [, t]). Consequently, for k n sufficiently slowly, Mn kn (ϕ, ) ϕ(u)h(u)ñ(du [, t]), U and since we can assume that k n m n, for the same sequence, the martingale central limit k theorem implies ˆM n n (ϕ, ) W (ϕ, ). The convergence in D R [, ) of each component implies the relative compactness of {(Mn kn kn (ϕ, ), ˆM n (ϕ, ))} in D R {, ) D R [, ). The fact that the second component is asymptotically continuous implies relative compactness in D R 2[, ). Consequently, at least along a subsequence (Mn kn kn (ϕ, ), ˆM n (ϕ, )) converges in D R 2[, ). To see that there is a unique possible limit and hence that there is convergence along the original sequence it is enough to check that W and N are independent. To verify this assertion, check that W (ϕ, ), ϕ D, and Ñ(A F k [, ]), A B(U), k = 1, 2,... are 13

14 all martingales with respect to the same filtration. Since trivially, [W (ϕ), Ñ(A F k)] t =, an application of Itô s formula verifies that W and N give a solution of the martingale problem that uniquely determines their joint distribution and implies their independence. It follows that and hence (Mn kn (ϕ, ), kn ˆM n (ϕ, )) ( 3 H # -semimartingales. U ϕ(u)h(u)ñ(du [, ]), W (ϕ, )) M n (ϕ, ) M(ϕ, ). We will, in fact, consider more general stochastic integrals than those corresponding to semimartingale random measures. As in most definitions of an integral, the first step is to define the integral for a simple class of integrands and then to extend the integral to a larger class by approximation. Since we already know how to define the semimartingale integral in finite dimensions, a reasonable approach is to approximate arbitrary integrands by finite-dimensional integrands. 3.1 Finite dimensional approximations. We will need the following lemma giving a partition of unity. bounded continuous functions on S with the sup norm. C(S) denotes the space of Lemma 3.1 Let (S, d) be a complete, separable metric space, and let {x k } be a countable dense subset of S. Then for each ɛ >, there exists a sequence {ψk ɛ} C(S) such that supp{ψk ɛ} B ɛ(x k ), ψk ɛ 1, ψɛ k (x) ψɛ k (y) 4 d(x, y), and for each compact K S, ɛ there exists N K < such that N K k=1 ψɛ k (x) = 1, x K. In particular, k=1 ψɛ k (x) = 1 for all x S. Proof. Fix ɛ >. Let ψ k (x) = (1 2d(x, B ɛ ɛ/2(x k )). Then ψ k 1, ψk (x) = 1, x B ɛ/2 (x k ), and ψ k (x) =, x / B ɛ (x k ). Note also that ψ k (x) ψ k (y) 2 d(x, y). ɛ Define ψ1 ɛ = ψ 1, and for k > 1, ψk ɛ = max ψ i k i max i k 1 ψi. Clearly, ψk ɛ ψ k and k i=1 ψɛ i = max i k ψi. In particular, for compact K S, there exists N K < such that K N K k=1 B ɛ/2(x k ) and hence N K k=1 ψɛ k (x) = 1 for x K. Finally, ψ ɛ k (x) ψɛ k (y) 2 max i k ψ i (x) ψ i (y) 4 ɛ d(x, y) Let U be a complete, separable metric space, and let H be a Banach space of functions on U. Let {ϕ k } be a dense subset of H. Fix ɛ >, and let {ψk ɛ } be as in Lemma 3.1 with S = H and {x k } = {ϕ k }. The role of the ψk ɛ is quite simple. Let x D H[, ), and define x ɛ (t) = k ψɛ k (x(t))ϕ k. Then x(t) x ɛ (t) H k ψ ɛ k(x(t)) x(t) ϕ k H ɛ. (3.1) 14

15 Since x is cadlag, for each T >, there exists a compact K T H such that x(t) K T, t T. Consequently, for each T >, there exists N T < such that x ɛ (t) = NT k=1 ψɛ k (x(t))ϕ k for t T. This construction gives a natural way of approximating any cadlag H-valued function (or process) by cadlag functions (processes) that are essentially finite dimensional. Let Y be an {F t }-semimartingale random measure, and suppose Y (ϕ, ) is defined for all ϕ H (or at least for a dense subset of ϕ). Let X be a cadlag, H-valued, {F t }-adapted process, and let X ɛ (t) = k ψ ɛ k(x(t))ϕ k. (3.2) Then X X ɛ H ɛ, and the integral X ɛ Y is naturally (and consistently with the previous section) defined to be X ɛ Y (t) = k t ψ ɛ k(x(s ))dy (ϕ k, s). We can then extend the integral to all cadlag, adapted processes by taking the limit provided we can make the necessary estimates. This approach to the definition of the stochastic integral is similar to that taken by Mikulevicius and Rozovskii (1994). 3.2 Integral estimates. Definition 3.2 Let S be the collection of H-valued processes of the form Z(t) = m ξ k (t)ϕ k k=1 where the ξ k are R-valued, cadlag, and adapted. Suppose that Y = M + V is standard and M has dominating measure K. Then for Z S, we define m t Z Y (t) = ξ k (s )dy (ϕ k, s). (3.3) As in the previous section, we have k=1 E[sup Z M(s) 2 ] (3.4) [ ] 4E Z(u, s ) Z(v, s )) K(du dv ds), U U [,t] and, letting Ṽ denote the total variation measure for the signed measure Ṽ, E[sup Z V (s) ] E[ Z(s, u) Ṽ (du ds)]. (3.5) U [,t] 15

16 and and If, for example, the norm on H is the sup norm and Z(s) H ɛ for all s, then E[sup Z M(s) 2 ] 4ɛ 2 E[K(U U [, t])] (3.6) E[sup Z V (s) ] ɛe[ Ṽ (U [, t])]. (3.7) If H = L p (µ), for some p 2, and ˆK defined as in (2.1), has the representation ˆK(du dt) = h(u, t)µ(du)dt (3.8) Ṽ (du dt) = g(u, t)µ(du)dt, (3.9) then for r satisfying 2 p + 1 r = 1 and q satisfying 1 p + 1 q = 1, we have for Z H ɛ E[sup Z M(s) 2 ] [ t ( 4E [ t 4ɛ 2 E h(, s) L r (µ)ds U ) 2 ( ) ] 1 Z(s, u) p p µ(du) h(u, s) r r µ(du) ds U ] (3.1) and E[sup Z V (s) ] [ t ( E U ) 1 ( ) ] 1 Z(s, u) p p µ(du) g(u, s) q q µ(du) ds U [ t ] ɛe g(, s) L q (µ)ds. (3.11) Note that either (3.6) and (3.7) or (3.1) and (3.11) give an inequality of the form which in turn implies H t = {sup E[sup Z Y (s) ] ɛc(t) (3.12) Z Y (s) : Z S, sup Z(s) H 1} (3.13) is stochastically bounded. The following lemma summarizes the estimates made above in a form that will be needed later. 16

17 Lemma 3.3 a) Let H be the sup norm, and suppose E[K(U U [, t])] < and E[ Ṽ (U [, t])] < for all t >. Then if sup s Z(s) H 1 and τ is a stopping time bounded by a constant c, E[sup Z Y (τ + s) Z Y (τ) ] 2 E[K(U U (τ, τ + t])] + E[ Ṽ (U [τ, τ + t])] and lim E[K(U U (τ, τ + t])] + E[ Ṽ (U [τ, τ + t])] =. t b) Let H = L p (µ), for some p 2, and for h and g as in (3.1) and (3.11), suppose E[ t h(, s) L r (µ)ds] < and E[ t g(, s) L q (µ)ds] < for all t >. Then if sup s Z(s) H 1 and τ is a stopping time bounded by a constant c, E[sup Z Y (τ + s) Z Y (τ) ] 2 and lim t E[ τ+t τ E[ τ+t h(, s) L r (µ)ds] + E[ τ h(, s) L r (µ)ds] + E[ τ+t τ τ+t g(, s) L q (µ)ds]] =. τ g(, s) L q (µ)ds]] Proof. The probability estimates follow from the moment estimates (3.6) - (3.11), and the limits follow by the dominated convergence theorem, using the fact that τ c. We will see that for many purposes we really do not need the moment estimates of Lemma 3.3. Consequently, it suffices to assume stochastic boundedness for Ṽ and to localize the estimate on K. Lemma 3.4 a) Let H be the sup norm. Let τ be a stopping time, and let σ be a random variable such that P {σ > } = 1, τ + σ is a stopping time, E[K(U U (τ, τ + σ]] <, and P { Ṽ (U (τ, τ + σ]) < } = 1. Then if sup s Z(s) H 1 and α >, P {sup Z Y (τ + s) Z Y (τ) 2α} E[K(U U (τ, τ + t σ])] + P { Ṽ (U [τ, τ + t σ]) α} + P {σ < t} α and the right side goes to zero as t. b) Let H = L p (µ), for some p 2, and let h and g be as in (3.8) and (3.9). Let τ be a stopping time, and let σ be a random variable such that P {σ > } = 1, τ + σ is a stopping time, E[ τ+σ h(, s) τ L r (µ)ds] < and P { τ+σ g(, s) τ L q (µ)ds < } = 1. Then if sup s Z(s) H 1 P {sup Z Y (τ + s) Z Y (τ) 2α} E[ τ+t σ h(, s) τ L r (µ)ds] τ+t σ + P { g(, s) L α q (µ)ds α} + P {σ < t} τ and the right side goes to zero as t. 17

18 Proof. Observe that P {sup Z Y (τ + s) Z Y (τ) 2α} P { sup Z M(τ + s) Z M(τ) α} σ +P { sup Z V (τ + s) Z V (τ) α} + P {σ < t}, σ and note that the first two terms on the right are bounded by the corresponding terms in the desired inequalities. 3.3 H # -semimartingale integrals. Now let H be an arbitrary, separable Banach space. With the above development in mind, we make the following definition. Definition 3.5 Y is an {F t }-adapted, H # -semimartingale, if Y is an R-valued stochastic process indexed by H [, ) such that For each ϕ H, Y (ϕ, ) is a cadlag {F t }-semimartingale with Y (ϕ, ) =. For each t, ϕ 1,..., ϕ m H, and a 1,..., a m R, Y ( m i=1 a iϕ i, t) = m i=1 a iy (ϕ i, t) a.s. The definition of the integral in (3.3) extends immediately to this more general setting. Noting (3.12), (3.13), and their relationship to the assumption that the semimartingale measure is standard, we define: Definition 3.6 Y is a standard H # -semimartingale if H t defined in (3.13) is stochastically bounded for each t. This stochastic boundedness is implied by an apparently weaker condition. Definition 3.7 Let S S be the collection of processes Z(t) = m ξ k (t)ϕ k k=1 in which the ξ k are piecewise constant, that is, ξ k (t) = j ηi k I [τ k i,τi+1 k ) (t) i= where = τ k τ k j are {F t }-stopping times and η k i is F τ k i -measurable. 18
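Definition 3.7 and the integral (3.3) reduce, for piecewise constant $\xi_k$, to finite sums of increments of the scalar semimartingales $Y(\varphi_k, \cdot)$. The following sketch is our illustration of that reduction; the labels 'phi1', 'phi2' and the Brownian paths standing in for $Y(\varphi_k, \cdot)$ are our assumptions, not part of the text.

```python
import numpy as np

rng = np.random.default_rng(4)

# A toy H#-semimartingale: for each element phi of a finite test set, Y(phi, .)
# is an independent Brownian path sampled on a fine grid (illustration only).
T, ngrid = 1.0, 10_000
tgrid = np.linspace(0.0, T, ngrid + 1)
paths = {}
for phi in ("phi1", "phi2"):
    increments = rng.normal(0.0, np.sqrt(T / ngrid), ngrid)
    paths[phi] = np.concatenate(([0.0], np.cumsum(increments)))

def Y(phi, t):
    """Value of the scalar semimartingale Y(phi, .) at time t (grid lookup)."""
    return paths[phi][np.searchsorted(tgrid, t, side="right") - 1]

def simple_integral(Z, t):
    """Z.Y(t) as in (3.3) for Z(s) = sum_k xi_k(s) phi_k with piecewise constant xi_k.

    Z is a list of triples (phi, taus, etas): xi_k(s) = etas[i] on [taus[i], taus[i+1])."""
    total = 0.0
    for phi, taus, etas in Z:
        for i, eta in enumerate(etas):
            lo, hi = min(taus[i], t), min(taus[i + 1], t)
            total += eta * (Y(phi, hi) - Y(phi, lo))
    return total

Z = [("phi1", [0.0, 0.3, 0.6, 1.0], [1.0, -0.5, 2.0]),
     ("phi2", [0.0, 0.5, 1.0],      [0.7, 0.7])]
print("Z.Y(0.8) =", simple_integral(Z, 0.8))
```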

19 Lemma 3.8 If Ht = { Z Y (t) : Z S, sup Z(s) H 1} is stochastically bounded, then H t defined in (3.13) is stochastically bounded. Remark 3.9 If Y is real-valued, that is H = R, then the definition of standard H # - semimartingale is equivalent to the definition of semimartingale given in Section II.1 of Protter (199), that is, the process satisfies the good integrator condition. Proof. For each δ >, there exists K(t, δ) such that P { Z Y (t) K(t, δ)} δ (3.14) for all Z S satisfying sup Z(s) H 1. We can assume, without loss of generality, that K(t, δ) is right continuous and strictly increasing in δ (so that the collection of random variables satisfying P {U K(t, δ)} δ is closed under convergence in probability). Let τ = inf{s : Z Y K(t, δ)} and Z τ = I [,τ) Z. Then P {sup Z Y (s) K(t, δ)} = P { Z Y (t τ) K(t, δ)} = P { Z τ Y (t) K(t, δ)} δ. For Z S with sup Z(s) H 1, there exists a sequence {Z n } S with sup Z n (s) 1 such that sup Z(s) Z n (s) H. This convergence implies Z n Y (t) Z Y in probability, and the lemma follows. The assumption of Lemma 3.8 holds if there exists a constant C(t) such that for all Z S satisfying sup Z(s) H 1, E[ Z Y (t) ] C(t). The following lemma is essentially immediate. The observation it contains is useful in treating semimartingale random measures which can frequently be decomposed into a part (usually a martingale random measure) that determines an H # -semimartingale on an L 2 space and another part that determines an H # -semimartingale on an L 1 -space. Note that if H 1 is a space of functions on U 1 and H 2 is a space of functions on U 2, then H 1 H 2 can be interpreted as a space of functions on U = U 1 U 2 where, for example, R R is interpreted as the set consisting of two copies of R. Lemma 3.1 Let Y 1 be a standard H # 1 -semimartingale and Y 2 a standard H # 2 -semimartingale with respect to the same filtration {F t }. Define H = H 1 H 2, with ϕ H = ϕ 1 H1 + ϕ 2 H2 and Y (ϕ, t) = Y 1 (ϕ 1, t) + Y 2 (ϕ 2, t) for ϕ = (ϕ 1, ϕ 2 ). Then Y is a standard H # - semimartingale. If Y is standard, then the definition of Z Y can be extended to all cadlag, H-valued processes. 19

20 Theorem 3.11 Let Y be a standard H # -semimartingale, and let X be a cadlag, adapted, H-valued process. Define X ɛ by (3.2). Then X Y lim ɛ X ɛ Y exists in the sense that for each t > lim ɛ P {sup X Y (s) X ɛ Y (s) > η} = for all η >, and X Y is cadlag. Proof. Let K(δ, t) > be as in Lemma 3.8. Since X ɛ 1 (s) X ɛ 2 (s) H ɛ 1 + ɛ 2, we have that P {sup X ɛ 1 Y (s) X ɛ 2 Y (s) (ɛ 1 + ɛ 2 )K(δ, t)} δ, and it follows that {X ɛ Y } is Cauchy in probability and that the desired limit exists. Since X ɛ Y is cadlag and the convergence is uniform on bounded intervals, it follows that X Y is cadlag. The following corollary is immediate from the definition of the integral. Corollary 3.12 Let Y be a standard H # -semimartingale, and let X be a cadlag, adapted, H-valued process. Let τ be an {F t }-stopping time and define X τ = I [,τ) X. Then X Y (t τ) = X τ Y (t). For finite dimensional semimartingale integrals, the stochastic integral for cadlag, adapted integrands can be defined as a Riemann-type limit of approximating sums X Y (t) = lim X(t i t)(y (t i+1 t) Y (t i t)) (3.15) where the limit is as max(t i+1 t i ) for the partition of [, ), = t < t 1 < t 2 <. Formally, the analogue for H # -semimartingale integrals would be X Y (t) = lim (Y (X(t i t), t i+1 t) Y (X(t i t), t i t)); however, Y (X, t) is not, in general, defined for random variables X. We can define an analog of the summands in (3.15) by first defining X [t i,t i+1 ) = I [ti,t i+1 )X(t i ) and then defining [ti.t i+1 )Y (X(t i ), t) = X [t i,t i+1 ) Y (t). Similarly, we can define [τi,τ i+1 )Y (X(τ i ), t) for stopping times τ i τ i+1. Proposition 3.13 For each n let {τi n } be an increasing sequence of stopping times = τ n τ1 n τ2 n, and suppose that for each t > lim max{τ i+1 n τi n : τi n < t} =. n If X is a cadlag, adapted H-valued process and Y is a standard H # -semimartingale, then X Y (t) = lim n [τ n i,τ n i+1 ) Y (X(τ n i ), t) (3.16) where the convergence is uniform on bounded time intervals in probability. 2
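Proposition 3.13 can be visualized in the scalar case $H = \mathbb{R}$. The sketch below is our own numerical illustration: it fixes one fine Brownian path, takes $Y = X = W$, and shows the partition sums in (3.16) settling down as the mesh of a deterministic partition shrinks, with the closed form $(W(T)^2 - T)/2$ as a cross-check.

```python
import numpy as np

rng = np.random.default_rng(5)

# Scalar case of Proposition 3.13: partition sums along refining (deterministic)
# partitions approach the stochastic integral.  Here H = R and Y = X = W.
T, nfine = 1.0, 2**18
dt = T / nfine
w = np.concatenate(([0.0], np.cumsum(rng.normal(0.0, np.sqrt(dt), nfine))))

def partition_sum(npart):
    """sum_i X(tau_i) (Y(tau_{i+1} ∧ T) - Y(tau_i ∧ T)) on a partition with npart cells."""
    idx = np.linspace(0, nfine, npart + 1).astype(int)
    wp = w[idx]
    return np.sum(wp[:-1] * np.diff(wp))

reference = partition_sum(nfine)       # finest available approximation of int_0^T W dW
for npart in [2**k for k in range(4, 15, 2)]:
    s = partition_sum(npart)
    print(f"mesh={T/npart:.2e}   sum={s:+.5f}   |sum - finest|={abs(s - reference):.2e}")
print("closed form (W(T)^2 - T)/2 =", 0.5 * (w[-1]**2 - T))
```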

21 Proof. Let X n = I [τ n i,τ n i+1 ) X(τ n i ). Then the sum on the right of (3.16) is just X n Y (t) and, with Xn ɛ defined as above, Xn ɛ Y (t) = ψk(x(τ ɛ i n ))(Y (ϕ k, τi+1 n t) Y (ϕ k, τi n t)). k i By the finite dimensional result, for each η >, and by standardness P {sup lim P {sup Xn ɛ Y (s) X ɛ Y (s) > η} = n Xn ɛ Y (s) X n Y (s) ɛk(δ, t)} + P {sup X ɛ Y (s) X Y (s) ɛk(δ, t)} 2δ and the result follows. 3.4 Predictable integrands. Definition 3.14 Let S be the collection of H-valued processes of the form Z(t) = m ξ k (t)ϕ k k=1 where the ξ k are {F t }-predictable and bounded. Z Y for Z S, can be extended to Z S by setting Z Y (t) = m t k=1 ξ k (s)dy (ϕ k, s). We will show that the condition that H t is stochastically bounded implies that H t = {sup Z Y (s) : Z S, sup Z(s) H 1} is also stochastically bounded, and hence Z Y can be extended to all predictable, H-valued processes X that satisfy a compact range condition by essentially the same argument as in the proof of Theorem Lemma 3.15 If H t is stochastically bounded, then H t is stochastically bounded. 21

22 Proof. Let K(t, δ) be as in (3.14). Fix ϕ 1,..., ϕ m H and let C ϕ = {x R m : m i=1 x iϕ i H 1}. To simplify notation, let Y i (t) = Y (ϕ i, t). We need to show that if (ξ 1,..., ξ m ) is predictable and takes values in C ϕ, then P {sup m i=1 s ξ i (s)dy i (s) K(t, δ)} δ. (3.17) Consequently, it is enough to show that there exists cadlag, adapted ξi n such that (ξ1 n,..., ξm) n C ϕ and lim n sup s (ξ i(u) ξi n (u ))dy i (u) = in probability for each i. Assume for the moment that Y i = M i + A i where M i is a square integrable martingale and E[T t (A i )] <, T t (A i ) denoting the total variation up to time t. Let Γ(t) = m i=1 [M i] t and Λ(t) = m i=1 T t(a i ). Then (see Protter (199), Theorem IV.2) there exists a sequence of cadlag, adapted R m -valued processes ξ n such that lim n E[ t ξ(s) ξ n (s ) 2 dγ(s)] + E[ t ξ(s) ξ n (s ) dλ(s)] =. (3.18) Letting π denote the projection onto the convex set C ϕ, since π(x) π(y) x y and ξ C ϕ, if we define ξ n = π( ξ n ), ξ n C ϕ and the limit in (3.18) still holds. Finally, E[sup 2 2 s (ξ i (u) ξi n (u ))dy i (u) ] E[ E[ t t ξ i (s) ξ n i (s ) 2 d[m i ] s ] + E[ ξ(s) ξ n (s ) 2 dγ(s)] + E[ t t ξ i (s) ξ n i (s ) dt s (A i )] ξ(s) ξ n (s ) dλ(s)] so the stochastic integrals converge and the limit must satisfy (3.17). For general Y i, fix ɛ >, and let Y i = M i + A i where M i is a local martingale with discontinuities bounded by ɛ, that is, sup t M i (t) M i (t ) ɛ, and A i is a finite variation process. (Such a decomposition always exists. See Protter (199), Theorem III.13.) For c >, let τ c = inf{t : m i=1 [M i] t + m i=1 T t(a i ) c}, and let Y τc i = Y i ( τ c ). Then for cadlag, adapted ξ with values in C ϕ, it still holds that P {sup m i=1 s ξ i (s )dy τc i (s) K(t, δ)} δ. (3.19) (replace ξ by I [,τc)ξ). Define Ỹ τc i = M τc i + A τc i where A τc i (t) = A i (t), for t < τ c and (t) = A i (τ c ) for t τ c. It follows from (3.19) and the fact that A τc i m i=1 ξ i (τ c )(M i (τ c ) M i (τ c )) ɛ sup x C ϕ 22 m x i ɛk ϕ i=1

23 that P {sup m i=1 s ξ i (s )dỹ τc i (s) K(t, δ) + ɛk ϕ } δ. (3.2) for all cadlag, adapted processes with values in C ϕ. But M τc i is a square integrable martingale and T t (A τc i ) c, so it follows that (3.2) holds with ξ(s ) replaced by an arbitrary predictable process with values in C ϕ. Letting c and observing that ɛ is arbitrary, we see that (3.17) holds and the lemma follows. Proposition 3.16 Let Y be a standard H # -semimartingale, and let X be an H-valued predictable process such that for t, η > there exists compact K t,η H satisfying P {X(s) K t,η, s t} 1 η. Then defining X ɛ as in (3.2), X Y lim ɛ X ɛ Y exists. Remark 3.17 If estimates of the form (3.4) hold, then the definition of X Y can be extended to locally bounded X, that is, the compact range condition can be dropped. (Approximate X be processes of the form XI K (X) where K is compact.) We do not know whether or not the compact range condition can be dropped for every standard H # -semimartingale. Proof. The proof is the same as for Theorem Examples. The idea of an H # -semimartingale is intended to suggest, but not be equivalent to, the idea of an H -semimartingale, that is, a semimartingale with values in H. In deed, any H -semimartingale will be an H # -semimartingale; however, there are a variety of examples of H # -semimartingales that are not H -semimartingales. Example 3.18 Poisson process integrals in L p spaces. Let µ be a finite measure on U, and let H = L p (µ) for some 1 p <. Let N be a Poisson point process on U [, ), and for ϕ H, define Y (ϕ, t) = ϕ(u)n(du ds). Of U [,t] course Y (ϕ, ) is just a compound Poisson process whose jumps have distribution given by ν(a) = I A (ϕ(u))µ(du). Since point evaluation is not a continous linear functional on L p, Y is an H # -semimartingale, but not an H -semimartingale. Example 3.19 Cylindrical Brownian motion. Let H be a Hilbert space and let Q be a bounded, self-adjoint, nonnegative operator on H. Then there exists a Gaussian process W with covariance E[W (ϕ 1, t)w (ϕ 2, s)] = t s Qϕ 1, ϕ 2. 23
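The covariance $E[W(\varphi_1,t)W(\varphi_2,s)] = (t\wedge s)\langle Q\varphi_1, \varphi_2\rangle$ in Example 3.19 can be checked by truncating the cylindrical expansion. The sketch below is our illustration with $H = L^2[0,1]$, $Q = I$ (so $W$ is genuinely cylindrical), the sine basis, and a truncation at $d$ modes; all of these choices, and the test functions, are ours.

```python
import numpy as np

rng = np.random.default_rng(6)

# Truncated cylindrical Brownian motion on H = L^2[0,1] with Q = I:
# W(phi, t) ~ sum_{i<=d} <phi, e_i> B_i(t), e_i(x) = sqrt(2) sin(i pi x).
d, t, s, reps = 200, 0.7, 1.0, 4000
x = np.linspace(0.0, 1.0, 2001)
dx = x[1] - x[0]
inner = lambda f, g: np.sum(f * g) * dx          # Riemann approximation of <f, g>

e = [np.sqrt(2.0) * np.sin((i + 1) * np.pi * x) for i in range(d)]
phi1, phi2 = x * (1.0 - x), np.sin(np.pi * x)
c1 = np.array([inner(phi1, ei) for ei in e])
c2 = np.array([inner(phi2, ei) for ei in e])

Bt = rng.normal(0.0, np.sqrt(t), size=(reps, d))            # B_i(t)
Bs = Bt + rng.normal(0.0, np.sqrt(s - t), size=(reps, d))   # B_i(s), s >= t
W1t, W2s = Bt @ c1, Bs @ c2
print("E[W(phi1,t) W(phi2,s)] ~", np.mean(W1t * W2s))
print("(t ∧ s) <Q phi1, phi2>  =", min(t, s) * inner(phi1, phi2))
```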

24 If Q is nuclear, then W will be an H (= H)-valued martingale; however, in general, W will only be an H # -semimartingale. (See, for example, Da Prato and Zabczyk (1992), Section 4.3.) Note that if X(t) = m i=1 ξ i(t)ϕ i is cadlag and adapted to the filtration generated by W, then E[ X W (t) 2 ] = t E[ i,j and it follows that W is standard. ξ i (s)ξ j (s) Qϕ i, ϕ j ]ds Q 4 Convergence of stochastic integrals. t E[ X(s) 2 H]ds Let H be a separable Banach space, and for each n 1, let Y n be an {Ft n }-H # -semimartingale. Note that the Y n need not all be adapted to the same filtration nor even defined on the same probablity space. The minimal convergence assumption that we will consider is that for ϕ 1,..., ϕ m H, (Y n (ϕ 1, ),..., Y n (ϕ m, )) (Y (ϕ 1, ),..., Y (ϕ m, )) in D R m[, ) with the Skorohod topology. Let {X n } be cadlag, H-valued processes. We will say that (X n, Y n ) (X, Y ), if (X n, Y n (ϕ 1, ),..., Y n (ϕ m, )) (X, Y (ϕ 1, ),..., Y (ϕ m, )) in D H R m[, ) for each choice of ϕ 1,..., ϕ m H. We are interested in conditions on {(X n, Y n )}, under which X n Y n X Y. In the finite dimensional setting of Kurtz and Protter (1991a), convergence was obtained by first approximating by piecewise constant processes. This approach was also taken by Cho (1994, 1995) for integrals driven by martingale random measures. Here we take a slightly different approach, approximating the H-valued processes by finite dimensional H-valued processes in a way that allows us to apply the results of Kurtz and Protter (1991a) and Jakubowski, Mémin, and Pages, G. (1989). Lemma 4.1 Suppose that for each ϕ H, the sequence {Y n (ϕ, )} of R-valued semimartingales satisfies the conditions of Theorem 1.1. Let X ɛ n(t) k ψɛ k (X n(t))ϕ k. If (X n, Y n ) (X, Y ), then (X n, Y n, X ɛ n Y n ) (X, Y, X ɛ Y ). If (X n, Y n ) (X, Y ) in probability, then (X n, Y n, X ɛ n Y n ) (X, Y, X ɛ Y ) in probability. Proof. By tightness, there exists a sequence of compact K m H such that P {X n (t) K m, t m} 1 1. Let τ m m n = inf{t : X n (t) / K m }. Then P {τn m m} 1 1, m and Xn ɛ Y n (t) = Zn m (t) N Km t k=1 ψɛ k (X n(s ))dy n (ϕ k, s) for t < τn m. Theorem 1.1 implies (X n, Y n, Zn m ) (X, Y, Z m ) for each m, where Z m (t) N Km t k=1 ψɛ k (X(s ))dy (ϕ k, s). Since Zn m (t) = Xn ɛ Y n (t) for t τn m, using the metric of Ethier and Kurtz (1986), Chapter 3, Formula (5.2), we have d(xn ɛ Y n, Zn m ) e τ n m, and the convergence of {Z m n } for each m implies the desired convergence for Xn ɛ Y n. In order to prove the convergence for X n Y n, by Lemma 4.1, it is enough to show that X ɛ n Y n is a good approximation of X n Y n, that is, we need to estimate (X n X ɛ n ) Y n. If the Y n correspond to semimartingale random measures, then (3.6) and (3.7) or (3.1) and (3.11) give estimates of the form E[sup (X n Xn ) ɛ Y (s) ] ɛc n (t). 24

25 If sup n C n (t) < for each t, then defining H n,t = {sup Z Y n (s) : Z S n, sup Z(s) H 1}, (4.1) Ĥ t = n H n,t is stochastically bounded for each t. This last assertion is essentially the uniform tightness (UT) condition of Jakubowski, Mémin, and Pages (1989). As in Lemma 3.8, the condition that n H n,t be stochastically bounded is equivalent to the condition that Ĥt = n Hn,t = n { Z Y n (t) : Z S n, sup Z(s) H } (4.2) be stochastically bounded. For finite dimensional semimartingales, the uniform tightness condition, Condition UT of Theorem 1.1, implies uniformly controlled variations, Condition UCV of Theorem 1.1. In the present setting, the relationship of the UT condition and some sort of uniform worthiness is not clear and certainly not so simple. Consequently, the following theorem is really an extension of the convergence theorem of Jakubowski, Mémin, and Pages (1989) rather than the results of Kurtz and Protter (1991a), although in the finite dimensional setting of those results, the conditions of the two theorems are equivalent. Theorem 4.2 For each n = 1, 2,..., let Y n be a standard {Ft n }-adapted, H # -semimartingale. Let Hn,t and Ĥ t be defined as in (4.2), and suppose that for each t >, Ĥt is stochastically bounded. If (X n, Y n ) (X, Y ), then there is a filtration {F t }, such that Y is an {F t }-adapted, standard, H # -semimartingale, X is {F t }-adapted and (X n, Y n, X n Y n ) (X, Y, X Y ). If (X n, Y n ) (X, Y ) in probability, then (X n, Y n, X n Y n ) (X, Y, X Y ) in probability. Remark 4.3 a) One of the motivations for introducing H # -semimartingales rather than simply posing the above result in terms of semimartingale random measures is that the Y n may be given by standard semimartingale random measures while the limiting Y is not. b) Jakubowski (1995) proves the above theorem in the case of Hilbert space-valued semimartingales. Proof. The stochastic boundedness condition implies that for each t, δ > there exists K(δ, t) such that for all R Ĥt, P { R K(δ, t)} δ. Without loss of generality, we can assume that K(δ, t) is a nondecreasing, right continuous function of t. Note that this inequality will hold for R = sup Z n Y n (s) for any cadlag, {Ft n }-adapted Z n satisfying sup Z n (s) H 1. Let F t = σ(x(s), Y (ϕ, s) : s t, ϕ H). Define Z n (t) = m f i (X n, Y n ( ˆϕ 1, ),..., Y n ( ˆϕ d, ), t)ϕ i i=1 where (f 1,..., f m ) is a continous function mapping D H R d[, ) into C A(ϕ1,...,ϕ m)[, ), A(ϕ 1,..., ϕ m ) = {α R m : i α iϕ i H 1}, in such a way that f i (x, y 1,..., y d, t) depends only on (x(s), y 1 (s),..., y d (s)) for s t (ensuring that Z n is {Ft n }-adapted and Z = 25

26 m i=1 f i(x, Y ( ˆϕ 1, ),..., Y ( ˆϕ d, ))ϕ i is {F t }-adapted). Theorem 1.1 implies Z n Y n Z Y, and it follows (using the right continuity of K(δ, t)) that P {sup Z Y K(δ, t)} δ. (4.3) By approximation, one can see that (4.3) holds for any process Z of the form m Z(t) = ξ i (t)ϕ i i=1 where ξ = (ξ 1,..., ξ m ) is {F t }-adapted, cadlag and has values in A(ϕ 1,..., ϕ m ). By (4.3), it follows that Y (ϕ, ) is an {F t }-semimartingale for each ϕ and hence that Y is an H # - semimartingale. It also follows from (4.3) that Y is standard. Finally, observing that X n (s) X ɛ n(s) H /ɛ is bounded by 1, we have that P {sup X n Y n Xn ɛ Y n ɛ(k(δ, t)} δ and similarly for X and Y. Consequently, the Theorem follows from Lemma 4.1. Example 4.4 Many particle random walk. For each n let Xk n, k = 1,..., n, be independent, continous-time, reflecting random walks on E n = { i : i =,..., n} with generator n n 2 (f(x + 1 ) + f(x 1 ) 2f(x)), < x < 1 2 n n B n f(x) = n 2 (f( 1 ) f()), x = n n 2 (f(1 1 ) f(1), x = 1 n and Xk n() uniformly distributed on E n. Let H = C 1 ([, 1]) with ϕ H = sup x 1 ( ϕ(x) + ϕ (x) ), and define Y n (ϕ, t) = 1 n ( t ) ϕ(n 1 X n n k (t)) ϕ(n 1 Xk n ()) B n ϕ(xk n (s))ds. k=1 Note that Y n corresponds to a martingale random measure and that E[(Y n (ϕ, t 2 ) Y n (ϕ, t 1 )) 2 Ft n 1 ] = 1 n t2 ( ϕ(x n E[ k (s) + 1 ) n ϕ(xn k (s))) 2 ( + ϕ(x n k (s) 1 ) n ϕ(xn k (s))) 2 Ft n ]. n 2 k=1 t 1 n 2 It follows that for Z S n, E[(Z Y n (t)) 2 ] t sup sup x 1 Z (s, x) 2 t sup Z(s) 2 H, and hence {Y n } is uniformly tight. The martingale central limit theorem gives Y n Y where Y is Gaussian and satisfies Y (ϕ 1, ), Y (ϕ 2, ) t = t 1 ϕ 1(x)ϕ 2(x)dx. It follows that Y does not correspond to a standard martingale random measure. 26

27 5 Convergence in infinite dimensional spaces. Theorem 4.2 extends easily to integrals with range in R d. The interest in semimartingales in infinite dimensional spaces, however, is frequently in relation to stochastic partial differential equations. Consequently, extension to function-valued integrals is desirable. For semimartingale random measures, this extension would be to integrals of the form Z(t, x) = X(s, x, u)y (du ds) U [,t] where X is a process with values in a function space on E U. We will take (E, r) to be a complete, separable metric space. More generally, let X be an H-valued process indexed by [, ) E. If X is {F t }-adapted and X(, x) is cadlag for each x, then Z(t, x) = X(, x) Y (t) is defined for each x; however, the properties of Z as a function of x (even the measurability) are not immediately clear. Consequently, we construct the desired integral more carefully. 5.1 Integrals with infinite-dimensional range. Let (E, r E ) and (U, r U ) be complete, separable metric spaces, let L be a separable Banach space of R-valued functions on E, and let H be a separable Banach space of R-valued functions on U. (We restrict our attention to function spaces so that for f L and ϕ H, f ϕ has the simple interpretation as a pointwise product. The restriction to function spaces could be dropped with the introduction of an appropriate definition of product.) Let G L = {f i } L be a sequence such that the finite linear combinations of the f i are dense in L, and let G H = {ϕ j } be a sequence such that the finite linear combinations of the ϕ j are dense in H. Definition 5.1 Let Ĥ be the completion of the linear space { l i=1 m j=1 a ijf i ϕ j : f i G L, ϕ i G H } with respect to some norm Ĥ. For example, if l m a ij f i ϕ j Ĥ i=1 j=1 = sup{ m a ij λ, f i η, ϕ i : λ L, η H, λ L 1, η H 1} i=1 then we can interpret Ĥ as a subspace of bounded linear operators mapping H into L. Metivier and Pellaumail (198) develop the stochastic integral in this setting. We say that a norm G on a function space G is monotone, if g G implies g G and g 1 g 2 implies g 1 G g 2 G. If L and H are both monotone, we may take l i=1 m a ij f i ϕ j Ĥ = j=1 27 l i=1 m a ij f i ϕ j L H. (5.1) j=1

1 Brownian Local Time 1 Brownian Local Time We first begin by defining the space and variables for Brownian local time. Let W t be a standard 1-D Wiener process. We know that for the set, {t : W t = } P (µ{t : W t = } = ) =

More information

Random Process Lecture 1. Fundamentals of Probability

Random Process Lecture 1. Fundamentals of Probability Random Process Lecture 1. Fundamentals of Probability Husheng Li Min Kao Department of Electrical Engineering and Computer Science University of Tennessee, Knoxville Spring, 2016 1/43 Outline 2/43 1 Syllabus

More information

Stochastic Differential Equations.

Stochastic Differential Equations. Chapter 3 Stochastic Differential Equations. 3.1 Existence and Uniqueness. One of the ways of constructing a Diffusion process is to solve the stochastic differential equation dx(t) = σ(t, x(t)) dβ(t)

More information

Continuous Functions on Metric Spaces

Continuous Functions on Metric Spaces Continuous Functions on Metric Spaces Math 201A, Fall 2016 1 Continuous functions Definition 1. Let (X, d X ) and (Y, d Y ) be metric spaces. A function f : X Y is continuous at a X if for every ɛ > 0

More information

Lecture 21 Representations of Martingales

Lecture 21 Representations of Martingales Lecture 21: Representations of Martingales 1 of 11 Course: Theory of Probability II Term: Spring 215 Instructor: Gordan Zitkovic Lecture 21 Representations of Martingales Right-continuous inverses Let

More information

Brownian motion. Samy Tindel. Purdue University. Probability Theory 2 - MA 539

Brownian motion. Samy Tindel. Purdue University. Probability Theory 2 - MA 539 Brownian motion Samy Tindel Purdue University Probability Theory 2 - MA 539 Mostly taken from Brownian Motion and Stochastic Calculus by I. Karatzas and S. Shreve Samy T. Brownian motion Probability Theory

More information

3 (Due ). Let A X consist of points (x, y) such that either x or y is a rational number. Is A measurable? What is its Lebesgue measure?

3 (Due ). Let A X consist of points (x, y) such that either x or y is a rational number. Is A measurable? What is its Lebesgue measure? MA 645-4A (Real Analysis), Dr. Chernov Homework assignment 1 (Due ). Show that the open disk x 2 + y 2 < 1 is a countable union of planar elementary sets. Show that the closed disk x 2 + y 2 1 is a countable

More information

Tools from Lebesgue integration

Tools from Lebesgue integration Tools from Lebesgue integration E.P. van den Ban Fall 2005 Introduction In these notes we describe some of the basic tools from the theory of Lebesgue integration. Definitions and results will be given

More information

MAT 570 REAL ANALYSIS LECTURE NOTES. Contents. 1. Sets Functions Countability Axiom of choice Equivalence relations 9

MAT 570 REAL ANALYSIS LECTURE NOTES. Contents. 1. Sets Functions Countability Axiom of choice Equivalence relations 9 MAT 570 REAL ANALYSIS LECTURE NOTES PROFESSOR: JOHN QUIGG SEMESTER: FALL 204 Contents. Sets 2 2. Functions 5 3. Countability 7 4. Axiom of choice 8 5. Equivalence relations 9 6. Real numbers 9 7. Extended

More information

A D VA N C E D P R O B A B I L - I T Y

A D VA N C E D P R O B A B I L - I T Y A N D R E W T U L L O C H A D VA N C E D P R O B A B I L - I T Y T R I N I T Y C O L L E G E T H E U N I V E R S I T Y O F C A M B R I D G E Contents 1 Conditional Expectation 5 1.1 Discrete Case 6 1.2

More information

STOCHASTIC DIFFERENTIAL EQUATIONS DRIVEN BY PROCESSES WITH INDEPENDENT INCREMENTS

STOCHASTIC DIFFERENTIAL EQUATIONS DRIVEN BY PROCESSES WITH INDEPENDENT INCREMENTS STOCHASTIC DIFFERENTIAL EQUATIONS DRIVEN BY PROCESSES WITH INDEPENDENT INCREMENTS DAMIR FILIPOVIĆ AND STEFAN TAPPE Abstract. This article considers infinite dimensional stochastic differential equations

More information

STOCHASTIC CALCULUS JASON MILLER AND VITTORIA SILVESTRI

STOCHASTIC CALCULUS JASON MILLER AND VITTORIA SILVESTRI STOCHASTIC CALCULUS JASON MILLER AND VITTORIA SILVESTRI Contents Preface 1 1. Introduction 1 2. Preliminaries 4 3. Local martingales 1 4. The stochastic integral 16 5. Stochastic calculus 36 6. Applications

More information

An essay on the general theory of stochastic processes

An essay on the general theory of stochastic processes Probability Surveys Vol. 3 (26) 345 412 ISSN: 1549-5787 DOI: 1.1214/1549578614 An essay on the general theory of stochastic processes Ashkan Nikeghbali ETHZ Departement Mathematik, Rämistrasse 11, HG G16

More information

Jump Processes. Richard F. Bass

Jump Processes. Richard F. Bass Jump Processes Richard F. Bass ii c Copyright 214 Richard F. Bass Contents 1 Poisson processes 1 1.1 Definitions............................. 1 1.2 Stopping times.......................... 3 1.3 Markov

More information

PROBABILITY: LIMIT THEOREMS II, SPRING HOMEWORK PROBLEMS

PROBABILITY: LIMIT THEOREMS II, SPRING HOMEWORK PROBLEMS PROBABILITY: LIMIT THEOREMS II, SPRING 15. HOMEWORK PROBLEMS PROF. YURI BAKHTIN Instructions. You are allowed to work on solutions in groups, but you are required to write up solutions on your own. Please

More information

ELEMENTS OF PROBABILITY THEORY

ELEMENTS OF PROBABILITY THEORY ELEMENTS OF PROBABILITY THEORY Elements of Probability Theory A collection of subsets of a set Ω is called a σ algebra if it contains Ω and is closed under the operations of taking complements and countable

More information

On the martingales obtained by an extension due to Saisho, Tanemura and Yor of Pitman s theorem

On the martingales obtained by an extension due to Saisho, Tanemura and Yor of Pitman s theorem On the martingales obtained by an extension due to Saisho, Tanemura and Yor of Pitman s theorem Koichiro TAKAOKA Dept of Applied Physics, Tokyo Institute of Technology Abstract M Yor constructed a family

More information

{σ x >t}p x. (σ x >t)=e at.

{σ x >t}p x. (σ x >t)=e at. 3.11. EXERCISES 121 3.11 Exercises Exercise 3.1 Consider the Ornstein Uhlenbeck process in example 3.1.7(B). Show that the defined process is a Markov process which converges in distribution to an N(0,σ

More information

Probability and Measure

Probability and Measure Part II Year 2018 2017 2016 2015 2014 2013 2012 2011 2010 2009 2008 2007 2006 2005 2018 84 Paper 4, Section II 26J Let (X, A) be a measurable space. Let T : X X be a measurable map, and µ a probability

More information

On a class of stochastic differential equations in a financial network model

On a class of stochastic differential equations in a financial network model 1 On a class of stochastic differential equations in a financial network model Tomoyuki Ichiba Department of Statistics & Applied Probability, Center for Financial Mathematics and Actuarial Research, University

More information

2 Lebesgue integration

2 Lebesgue integration 2 Lebesgue integration 1. Let (, A, µ) be a measure space. We will always assume that µ is complete, otherwise we first take its completion. The example to have in mind is the Lebesgue measure on R n,

More information

Real Analysis Problems

Real Analysis Problems Real Analysis Problems Cristian E. Gutiérrez September 14, 29 1 1 CONTINUITY 1 Continuity Problem 1.1 Let r n be the sequence of rational numbers and Prove that f(x) = 1. f is continuous on the irrationals.

More information

GAUSSIAN PROCESSES; KOLMOGOROV-CHENTSOV THEOREM

GAUSSIAN PROCESSES; KOLMOGOROV-CHENTSOV THEOREM GAUSSIAN PROCESSES; KOLMOGOROV-CHENTSOV THEOREM STEVEN P. LALLEY 1. GAUSSIAN PROCESSES: DEFINITIONS AND EXAMPLES Definition 1.1. A standard (one-dimensional) Wiener process (also called Brownian motion)

More information

Analysis Comprehensive Exam Questions Fall 2008

Analysis Comprehensive Exam Questions Fall 2008 Analysis Comprehensive xam Questions Fall 28. (a) Let R be measurable with finite Lebesgue measure. Suppose that {f n } n N is a bounded sequence in L 2 () and there exists a function f such that f n (x)

More information

Nonlinear representation, backward SDEs, and application to the Principal-Agent problem

Nonlinear representation, backward SDEs, and application to the Principal-Agent problem Nonlinear representation, backward SDEs, and application to the Principal-Agent problem Ecole Polytechnique, France April 4, 218 Outline The Principal-Agent problem Formulation 1 The Principal-Agent problem

More information

PROBLEMS. (b) (Polarization Identity) Show that in any inner product space

PROBLEMS. (b) (Polarization Identity) Show that in any inner product space 1 Professor Carl Cowen Math 54600 Fall 09 PROBLEMS 1. (Geometry in Inner Product Spaces) (a) (Parallelogram Law) Show that in any inner product space x + y 2 + x y 2 = 2( x 2 + y 2 ). (b) (Polarization

More information

Convergence of Markov Processes. Amanda Turner University of Cambridge

Convergence of Markov Processes. Amanda Turner University of Cambridge Convergence of Markov Processes Amanda Turner University of Cambridge 1 Contents 1 Introduction 2 2 The Space D E [, 3 2.1 The Skorohod Topology................................ 3 3 Convergence of Probability

More information

MAT 578 FUNCTIONAL ANALYSIS EXERCISES

MAT 578 FUNCTIONAL ANALYSIS EXERCISES MAT 578 FUNCTIONAL ANALYSIS EXERCISES JOHN QUIGG Exercise 1. Prove that if A is bounded in a topological vector space, then for every neighborhood V of 0 there exists c > 0 such that tv A for all t > c.

More information

Selected Exercises on Expectations and Some Probability Inequalities

Selected Exercises on Expectations and Some Probability Inequalities Selected Exercises on Expectations and Some Probability Inequalities # If E(X 2 ) = and E X a > 0, then P( X λa) ( λ) 2 a 2 for 0 < λ

More information

(B(t i+1 ) B(t i )) 2

(B(t i+1 ) B(t i )) 2 ltcc5.tex Week 5 29 October 213 Ch. V. ITÔ (STOCHASTIC) CALCULUS. WEAK CONVERGENCE. 1. Quadratic Variation. A partition π n of [, t] is a finite set of points t ni such that = t n < t n1

More information

MATH MEASURE THEORY AND FOURIER ANALYSIS. Contents

MATH MEASURE THEORY AND FOURIER ANALYSIS. Contents MATH 3969 - MEASURE THEORY AND FOURIER ANALYSIS ANDREW TULLOCH Contents 1. Measure Theory 2 1.1. Properties of Measures 3 1.2. Constructing σ-algebras and measures 3 1.3. Properties of the Lebesgue measure

More information

Analysis Qualifying Exam

Analysis Qualifying Exam Analysis Qualifying Exam Spring 2017 Problem 1: Let f be differentiable on R. Suppose that there exists M > 0 such that f(k) M for each integer k, and f (x) M for all x R. Show that f is bounded, i.e.,

More information

Kernel Method: Data Analysis with Positive Definite Kernels

Kernel Method: Data Analysis with Positive Definite Kernels Kernel Method: Data Analysis with Positive Definite Kernels 2. Positive Definite Kernel and Reproducing Kernel Hilbert Space Kenji Fukumizu The Institute of Statistical Mathematics. Graduate University

More information

2. Dual space is essential for the concept of gradient which, in turn, leads to the variational analysis of Lagrange multipliers.

2. Dual space is essential for the concept of gradient which, in turn, leads to the variational analysis of Lagrange multipliers. Chapter 3 Duality in Banach Space Modern optimization theory largely centers around the interplay of a normed vector space and its corresponding dual. The notion of duality is important for the following

More information

Lévy Processes and Infinitely Divisible Measures in the Dual of afebruary Nuclear2017 Space 1 / 32

Lévy Processes and Infinitely Divisible Measures in the Dual of afebruary Nuclear2017 Space 1 / 32 Lévy Processes and Infinitely Divisible Measures in the Dual of a Nuclear Space David Applebaum School of Mathematics and Statistics, University of Sheffield, UK Talk at "Workshop on Infinite Dimensional

More information

OPTIMAL SOLUTIONS TO STOCHASTIC DIFFERENTIAL INCLUSIONS

OPTIMAL SOLUTIONS TO STOCHASTIC DIFFERENTIAL INCLUSIONS APPLICATIONES MATHEMATICAE 29,4 (22), pp. 387 398 Mariusz Michta (Zielona Góra) OPTIMAL SOLUTIONS TO STOCHASTIC DIFFERENTIAL INCLUSIONS Abstract. A martingale problem approach is used first to analyze

More information

Solution for Problem 7.1. We argue by contradiction. If the limit were not infinite, then since τ M (ω) is nondecreasing we would have

Solution for Problem 7.1. We argue by contradiction. If the limit were not infinite, then since τ M (ω) is nondecreasing we would have 362 Problem Hints and Solutions sup g n (ω, t) g(ω, t) sup g(ω, s) g(ω, t) µ n (ω). t T s,t: s t 1/n By the uniform continuity of t g(ω, t) on [, T], one has for each ω that µ n (ω) as n. Two applications

More information

MATH 426, TOPOLOGY. p 1.

MATH 426, TOPOLOGY. p 1. MATH 426, TOPOLOGY THE p-norms In this document we assume an extended real line, where is an element greater than all real numbers; the interval notation [1, ] will be used to mean [1, ) { }. 1. THE p

More information

A Short Introduction to Diffusion Processes and Ito Calculus

A Short Introduction to Diffusion Processes and Ito Calculus A Short Introduction to Diffusion Processes and Ito Calculus Cédric Archambeau University College, London Center for Computational Statistics and Machine Learning c.archambeau@cs.ucl.ac.uk January 24,

More information

Martingale Problems. Abhay G. Bhatt Theoretical Statistics and Mathematics Unit Indian Statistical Institute, Delhi

Martingale Problems. Abhay G. Bhatt Theoretical Statistics and Mathematics Unit Indian Statistical Institute, Delhi s Abhay G. Bhatt Theoretical Statistics and Mathematics Unit Indian Statistical Institute, Delhi Lectures on Probability and Stochastic Processes III Indian Statistical Institute, Kolkata 20 24 November

More information

ELEMENTS OF STOCHASTIC CALCULUS VIA REGULARISATION. A la mémoire de Paul-André Meyer

ELEMENTS OF STOCHASTIC CALCULUS VIA REGULARISATION. A la mémoire de Paul-André Meyer ELEMENTS OF STOCHASTIC CALCULUS VIA REGULARISATION A la mémoire de Paul-André Meyer Francesco Russo (1 and Pierre Vallois (2 (1 Université Paris 13 Institut Galilée, Mathématiques 99 avenue J.B. Clément

More information

16.1. Signal Process Observation Process The Filtering Problem Change of Measure

16.1. Signal Process Observation Process The Filtering Problem Change of Measure 1. Introduction The following notes aim to provide a very informal introduction to Stochastic Calculus, and especially to the Itô integral. They owe a great deal to Dan Crisan s Stochastic Calculus and

More information

Exercises in stochastic analysis

Exercises in stochastic analysis Exercises in stochastic analysis Franco Flandoli, Mario Maurelli, Dario Trevisan The exercises with a P are those which have been done totally or partially) in the previous lectures; the exercises with

More information

MS&E 321 Spring Stochastic Systems June 1, 2013 Prof. Peter W. Glynn Page 1 of 10

MS&E 321 Spring Stochastic Systems June 1, 2013 Prof. Peter W. Glynn Page 1 of 10 MS&E 321 Spring 12-13 Stochastic Systems June 1, 2013 Prof. Peter W. Glynn Page 1 of 10 Section 3: Regenerative Processes Contents 3.1 Regeneration: The Basic Idea............................... 1 3.2

More information

1 Math 241A-B Homework Problem List for F2015 and W2016

1 Math 241A-B Homework Problem List for F2015 and W2016 1 Math 241A-B Homework Problem List for F2015 W2016 1.1 Homework 1. Due Wednesday, October 7, 2015 Notation 1.1 Let U be any set, g be a positive function on U, Y be a normed space. For any f : U Y let

More information

Wiener Measure and Brownian Motion

Wiener Measure and Brownian Motion Chapter 16 Wiener Measure and Brownian Motion Diffusion of particles is a product of their apparently random motion. The density u(t, x) of diffusing particles satisfies the diffusion equation (16.1) u

More information

Proofs of the martingale FCLT

Proofs of the martingale FCLT Probability Surveys Vol. 4 (2007) 268 302 ISSN: 1549-5787 DOI: 10.1214/07-PS122 Proofs of the martingale FCLT Ward Whitt Department of Industrial Engineering and Operations Research Columbia University,

More information

6. Brownian Motion. Q(A) = P [ ω : x(, ω) A )

6. Brownian Motion. Q(A) = P [ ω : x(, ω) A ) 6. Brownian Motion. stochastic process can be thought of in one of many equivalent ways. We can begin with an underlying probability space (Ω, Σ, P) and a real valued stochastic process can be defined

More information

Convergence of Feller Processes

Convergence of Feller Processes Chapter 15 Convergence of Feller Processes This chapter looks at the convergence of sequences of Feller processes to a iting process. Section 15.1 lays some ground work concerning weak convergence of processes

More information

Doléans measures. Appendix C. C.1 Introduction

Doléans measures. Appendix C. C.1 Introduction Appendix C Doléans measures C.1 Introduction Once again all random processes will live on a fixed probability space (Ω, F, P equipped with a filtration {F t : 0 t 1}. We should probably assume the filtration

More information

Representations of Gaussian measures that are equivalent to Wiener measure

Representations of Gaussian measures that are equivalent to Wiener measure Representations of Gaussian measures that are equivalent to Wiener measure Patrick Cheridito Departement für Mathematik, ETHZ, 89 Zürich, Switzerland. E-mail: dito@math.ethz.ch Summary. We summarize results

More information

ERRATA: Probabilistic Techniques in Analysis

ERRATA: Probabilistic Techniques in Analysis ERRATA: Probabilistic Techniques in Analysis ERRATA 1 Updated April 25, 26 Page 3, line 13. A 1,..., A n are independent if P(A i1 A ij ) = P(A 1 ) P(A ij ) for every subset {i 1,..., i j } of {1,...,

More information

Exercises Measure Theoretic Probability

Exercises Measure Theoretic Probability Exercises Measure Theoretic Probability 2002-2003 Week 1 1. Prove the folloing statements. (a) The intersection of an arbitrary family of d-systems is again a d- system. (b) The intersection of an arbitrary

More information

I forgot to mention last time: in the Ito formula for two standard processes, putting

I forgot to mention last time: in the Ito formula for two standard processes, putting I forgot to mention last time: in the Ito formula for two standard processes, putting dx t = a t dt + b t db t dy t = α t dt + β t db t, and taking f(x, y = xy, one has f x = y, f y = x, and f xx = f yy

More information

9.2 Branching random walk and branching Brownian motions

9.2 Branching random walk and branching Brownian motions 168 CHAPTER 9. SPATIALLY STRUCTURED MODELS 9.2 Branching random walk and branching Brownian motions Branching random walks and branching diffusions have a long history. A general theory of branching Markov

More information

FOUNDATIONS OF MARTINGALE THEORY AND STOCHASTIC CALCULUS FROM A FINANCE PERSPECTIVE

FOUNDATIONS OF MARTINGALE THEORY AND STOCHASTIC CALCULUS FROM A FINANCE PERSPECTIVE FOUNDATIONS OF MARTINGALE THEORY AND STOCHASTIC CALCULUS FROM A FINANCE PERSPECTIVE JOSEF TEICHMANN 1. Introduction The language of mathematical Finance allows to express many results of martingale theory

More information

EULER MARUYAMA APPROXIMATION FOR SDES WITH JUMPS AND NON-LIPSCHITZ COEFFICIENTS

EULER MARUYAMA APPROXIMATION FOR SDES WITH JUMPS AND NON-LIPSCHITZ COEFFICIENTS Qiao, H. Osaka J. Math. 51 (14), 47 66 EULER MARUYAMA APPROXIMATION FOR SDES WITH JUMPS AND NON-LIPSCHITZ COEFFICIENTS HUIJIE QIAO (Received May 6, 11, revised May 1, 1) Abstract In this paper we show

More information

Stochastic calculus without probability: Pathwise integration and functional calculus for functionals of paths with arbitrary Hölder regularity

Stochastic calculus without probability: Pathwise integration and functional calculus for functionals of paths with arbitrary Hölder regularity Stochastic calculus without probability: Pathwise integration and functional calculus for functionals of paths with arbitrary Hölder regularity Rama Cont Joint work with: Anna ANANOVA (Imperial) Nicolas

More information

Poisson random measure: motivation

Poisson random measure: motivation : motivation The Lévy measure provides the expected number of jumps by time unit, i.e. in a time interval of the form: [t, t + 1], and of a certain size Example: ν([1, )) is the expected number of jumps

More information

A Central Limit Theorem for Fleming-Viot Particle Systems Application to the Adaptive Multilevel Splitting Algorithm

A Central Limit Theorem for Fleming-Viot Particle Systems Application to the Adaptive Multilevel Splitting Algorithm A Central Limit Theorem for Fleming-Viot Particle Systems Application to the Algorithm F. Cérou 1,2 B. Delyon 2 A. Guyader 3 M. Rousset 1,2 1 Inria Rennes Bretagne Atlantique 2 IRMAR, Université de Rennes

More information

f(x)dx = lim f n (x)dx.

f(x)dx = lim f n (x)dx. Chapter 3 Lebesgue Integration 3.1 Introduction The concept of integration as a technique that both acts as an inverse to the operation of differentiation and also computes areas under curves goes back

More information

arxiv: v2 [math.pr] 27 Oct 2015

arxiv: v2 [math.pr] 27 Oct 2015 A brief note on the Karhunen-Loève expansion Alen Alexanderian arxiv:1509.07526v2 [math.pr] 27 Oct 2015 October 28, 2015 Abstract We provide a detailed derivation of the Karhunen Loève expansion of a stochastic

More information

3. (a) What is a simple function? What is an integrable function? How is f dµ defined? Define it first

3. (a) What is a simple function? What is an integrable function? How is f dµ defined? Define it first Math 632/6321: Theory of Functions of a Real Variable Sample Preinary Exam Questions 1. Let (, M, µ) be a measure space. (a) Prove that if µ() < and if 1 p < q

More information

Topics in fractional Brownian motion

Topics in fractional Brownian motion Topics in fractional Brownian motion Esko Valkeila Spring School, Jena 25.3. 2011 We plan to discuss the following items during these lectures: Fractional Brownian motion and its properties. Topics in

More information

Some SDEs with distributional drift Part I : General calculus. Flandoli, Franco; Russo, Francesco; Wolf, Jochen

Some SDEs with distributional drift Part I : General calculus. Flandoli, Franco; Russo, Francesco; Wolf, Jochen Title Author(s) Some SDEs with distributional drift Part I : General calculus Flandoli, Franco; Russo, Francesco; Wolf, Jochen Citation Osaka Journal of Mathematics. 4() P.493-P.54 Issue Date 3-6 Text

More information

2 (Bonus). Let A X consist of points (x, y) such that either x or y is a rational number. Is A measurable? What is its Lebesgue measure?

2 (Bonus). Let A X consist of points (x, y) such that either x or y is a rational number. Is A measurable? What is its Lebesgue measure? MA 645-4A (Real Analysis), Dr. Chernov Homework assignment 1 (Due 9/5). Prove that every countable set A is measurable and µ(a) = 0. 2 (Bonus). Let A consist of points (x, y) such that either x or y is

More information

Real Analysis Notes. Thomas Goller

Real Analysis Notes. Thomas Goller Real Analysis Notes Thomas Goller September 4, 2011 Contents 1 Abstract Measure Spaces 2 1.1 Basic Definitions........................... 2 1.2 Measurable Functions........................ 2 1.3 Integration..............................

More information

Mathematical Methods for Neurosciences. ENS - Master MVA Paris 6 - Master Maths-Bio ( )

Mathematical Methods for Neurosciences. ENS - Master MVA Paris 6 - Master Maths-Bio ( ) Mathematical Methods for Neurosciences. ENS - Master MVA Paris 6 - Master Maths-Bio (2014-2015) Etienne Tanré - Olivier Faugeras INRIA - Team Tosca November 26th, 2014 E. Tanré (INRIA - Team Tosca) Mathematical

More information

4 Sums of Independent Random Variables

4 Sums of Independent Random Variables 4 Sums of Independent Random Variables Standing Assumptions: Assume throughout this section that (,F,P) is a fixed probability space and that X 1, X 2, X 3,... are independent real-valued random variables

More information