Stochastic Processes III/ Wahrscheinlichkeitstheorie IV. Lecture Notes


BMS Advanced Course
Stochastic Processes III / Wahrscheinlichkeitstheorie IV
Michael Scheutzow
Lecture Notes
Technische Universität Berlin
Winter semester 2018/19
Preliminary version, November 28th, 2018


Contents

1 Girsanov's Theorem
  1.1 Preparation for Girsanov's Theorem
  1.2 Novikov's Condition
  1.3 Application: Brownian Motion with Drift

2 Local Time

3 Weak Solutions and Martingale Problems
  3.1 Weak solutions of stochastic differential equations
  3.2 Weak solutions via Girsanov's theorem
  3.3 Martingale Problems
  3.4 Martingale Problems: Existence of solutions

Bibliography


Chapter 1
Girsanov's Theorem

1.1 Preparation for Girsanov's Theorem

Before stating the important Girsanov theorem, we recall Lévy's characterization of Brownian motion and provide a proof of the statement, which we skipped in WT3. We start with a lemma (see [KS91]). Here, $i$ is the complex number $\sqrt{-1}$.

Lemma 1.1. Let $X, Y$ be $\mathbb{R}^d$-valued random variables on $(\Omega, \mathcal{F}, P)$ and let $\mathcal{G}$ be a sub-$\sigma$-algebra of $\mathcal{F}$. Assume that for each $u \in \mathbb{R}^d$ we have
$$E\big[\exp\{i\langle u, Y\rangle\} \,\big|\, \mathcal{G}\big] = E\exp\{i\langle u, X\rangle\} \quad a.s.$$
Then the laws of $X$ and $Y$ coincide, and $Y$ and $\mathcal{G}$ are independent.

Proof. The first statement is clear (take expectations on both sides), since the characteristic function determines a probability measure uniquely. To see the second statement, we represent the conditional expectation via a regular conditional probability (see the chapter "Bedingte Erwartungen und Wahrscheinlichkeiten" in WT1):
$$E\big[\exp\{i\langle u, Y\rangle\}\,\big|\,\mathcal{G}\big](\omega) = \int_{\mathbb{R}^d} \exp\{i\langle u, y\rangle\}\, Q(\omega, dy) \quad a.s.$$
Since a characteristic function is continuous, we can assume that the exceptional set does not depend on $u \in \mathbb{R}^d$. The assumption of the lemma now shows that for almost all $\omega \in \Omega$ the characteristic functions of $Q(\omega, \cdot)$ and of the law of $X$ coincide; therefore the conditional law of $Y$ given $\mathcal{G}$ and the law of $X$ coincide for almost all $\omega \in \Omega$. Hence, for each Borel set $B \subseteq \mathbb{R}^d$, we have $P(Y \in B \mid \mathcal{G}) = P(X \in B)$ a.s., which immediately implies that $Y$ and $\mathcal{G}$ are independent.

Now we are ready to state Lévy's characterization of Brownian motion.

Theorem 1.2. Let $M_t = (M_t^1, \dots, M_t^d)$ be a continuous $\mathbb{R}^d$-valued local martingale starting at $0$ on the filtered probability space $(\Omega, \mathcal{F}, \mathbb{F}, P)$. If
$$\langle M^j, M^k\rangle_t = \delta_{jk}\, t, \qquad j, k \in \{1, \dots, d\},$$
then $M$ is a $d$-dimensional $\mathbb{F}$-Brownian motion.

Proof. We basically follow [KS91]. We need to show that for $s < t$, the random vector $M_t - M_s$ is independent of $\mathcal{F}_s$ and has distribution $N(0, (t-s)I)$. Thanks to the previous lemma it is enough to prove that for any $u = (u_1, \dots, u_d) \in \mathbb{R}^d$ and $s \le t$, we have
$$E\big[\exp\{i\langle u, M_t - M_s\rangle\}\,\big|\,\mathcal{F}_s\big] = \exp\big\{-\tfrac12 |u|^2 (t-s)\big\}.$$
Fix $u \in \mathbb{R}^d$ and $s \ge 0$ and let $f(x) := e^{i\langle u, x\rangle}$, $x \in \mathbb{R}^d$. Applying Itô's formula to the real and imaginary parts of $f$, we get
$$e^{i\langle u, M_t\rangle} = e^{i\langle u, M_s\rangle} + i\sum_{j=1}^d u_j \int_s^t e^{i\langle u, M_v\rangle}\,dM_v^j - \frac12\sum_{j=1}^d u_j^2 \int_s^t e^{i\langle u, M_v\rangle}\,dv.$$
By Burkholder's inequality (Theorem 2.19 in WT3) with $p = 2$, we have, using $\langle M^j\rangle_t = t$,
$$E\sup_{s\le t}|M_s^j| \;\le\; 1 + E\sup_{s\le t}|M_s^j|^2 \;\le\; C_2\, t + 1 < \infty,$$
showing that $M^j$ is a martingale (Chapter 1 in WT3), which is even square integrable. Further, the real and imaginary parts of $\int_0^\cdot e^{i\langle u, M_v\rangle}\,dM_v^j$ are square integrable martingales, since the integrand is bounded, and therefore
$$E\Big[\int_s^t e^{i\langle u, M_v\rangle}\,dM_v^j \,\Big|\, \mathcal{F}_s\Big] = 0 \quad a.s.$$
For any $A \in \mathcal{F}_s$, we therefore get from the Itô formula above
$$E\big[e^{i\langle u, M_t - M_s\rangle}\mathbf{1}_A\big] = P(A) - \frac12 |u|^2 \int_s^t E\big[e^{i\langle u, M_v - M_s\rangle}\mathbf{1}_A\big]\,dv.$$
This integral equation for the function $t \mapsto E[e^{i\langle u, M_t - M_s\rangle}\mathbf{1}_A]$ (with $s$ fixed and $t \ge s$) has a unique solution, namely
$$E\big[e^{i\langle u, M_t - M_s\rangle}\mathbf{1}_A\big] = P(A)\exp\big\{-\tfrac12 |u|^2 (t-s)\big\},$$
so the claim follows.

Remark 1.3. In WT3 we defined the stochastic integral with respect to a continuous local martingale $M$ for integrands $f$ for which there exists a progressive process $g$, square integrable with respect to the Doléans measure $\mu_M$, such that $f = g$ $\mu_M$-almost everywhere, i.e. $f$ and $g$ are in the same equivalence class in that space. We mention without proof (see e.g. [KS91]) the fact that if the process $t \mapsto \langle M\rangle_t$ is almost surely absolutely continuous, then this holds for every adapted process $f$ in $L^2(\mu_M)$. Note that absolute continuity of $t \mapsto \langle M\rangle_t$ holds in particular if $M$ is Brownian motion.

Let $(\Omega, \mathcal{F}, \mathbb{F}, P)$ be a filtered probability space satisfying the usual conditions and let $W = \{W_t = (W_t^1, \dots, W_t^d),\ t \ge 0\}$ be a $d$-dimensional $\mathbb{F}$-Brownian motion on that space. Let $X = \{X_t = (X_t^1, \dots, X_t^d),\ t \ge 0\}$ be progressive (or just jointly measurable and adapted) and satisfy
$$P\Big(\int_0^T (X_t^i)^2\,dt < \infty\Big) = 1, \qquad 1 \le i \le d,\ T < \infty.$$

Then the process
$$Z_t := \exp\Big\{\sum_{i=1}^d \int_0^t X_s^i\,dW_s^i - \frac12\int_0^t |X_s|^2\,ds\Big\}, \qquad t \ge 0, \tag{1.1.3}$$
is well-defined (here $|x|$ denotes the Euclidean norm of $x \in \mathbb{R}^d$). Writing $N_t := \sum_{i=1}^d \int_0^t X_s^i\,dW_s^i$, the formula becomes
$$Z_t = \exp\Big\{N_t - \frac12\langle N\rangle_t\Big\},$$
and $Z$ is called the stochastic exponential of $N$ (even in cases where $N$ is an arbitrary continuous local martingale starting at $0$), in short: $Z = \mathcal{E}(N)$. Further,
$$Z_t = 1 + \sum_{i=1}^d \int_0^t Z_s X_s^i\,dW_s^i = 1 + \int_0^t Z_s\,dN_s.$$
This follows from Itô's formula applied to (1.1.3). On the other hand, the equation $Z_t = 1 + \int_0^t Z_s\,dN_s$ has only one solution $Z$ (we will show this in class; hint: let $Z$ and $\tilde Z$ be two solutions, define $Y_t := Z_{t\wedge\tau} - \tilde Z_{t\wedge\tau}$ for a suitable stopping time $\tau$ and apply the Burkholder–Davis–Gundy inequality with $p = 2$).

Note that this equation shows that the process $Z$ is a continuous local martingale with $Z_0 = 1$. We will see later that it is of great interest to find conditions under which $Z$ is even a martingale, in which case $E Z_t = 1$ for all $t \ge 0$. It is not hard to see that a rather strong sufficient condition for $Z$ to be a martingale is that there exists a deterministic function $L(t)$, $t \ge 0$, such that $P(\sup_{s\le t}|X_s| \le L(t)) = 1$ for every $t \ge 0$. A weaker condition is Novikov's condition, which we will provide in Theorem 1.7. We will see in class that for $Z$ as defined in (1.1.3), the condition $E Z_t = 1$ for all $t \ge 0$ is not only necessary but also sufficient for $Z$ to be a martingale (hint: a nonnegative local martingale is a supermartingale [use Fatou's lemma for conditional expectations], and a supermartingale is a martingale iff its expected value is constant).

If $Z$ is a martingale, then we define, for each $T \in [0,\infty)$, a probability measure $\tilde P_T$ on $(\Omega, \mathcal{F}_T)$ by
$$\tilde P_T(A) := E[\mathbf{1}_A Z_T], \qquad A \in \mathcal{F}_T.$$
Note that the probability measures $\tilde P_T$ and $P|_{\mathcal{F}_T}$ are mutually absolutely continuous with density $d\tilde P_T / dP|_{\mathcal{F}_T} = Z_T$. In the following Girsanov theorem, we use the symbol $\mathbb{F}^T$ to denote the restricted filtration $(\mathcal{F}_t)_{t\le T}$. It should be clear what we mean by an $\mathbb{F}^T$-Brownian motion on $(\Omega, \mathcal{F}_T, \mathbb{F}^T, \tilde P_T)$, so we refrain from providing a precise definition.
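The martingale property $E Z_t = 1$ can be illustrated numerically. The following is a minimal sketch (not part of the notes): we take $d = 1$ and the bounded deterministic integrand $X_s = \sin(s)$, which satisfies the strong sufficient condition above with $L(t) \equiv 1$, and check $E Z_1 = 1$ by Monte Carlo; the seed and sample sizes are arbitrary choices.

```python
import numpy as np

# Minimal sketch of the stochastic exponential (1.1.3) with d = 1 and the
# bounded integrand X_s = sin(s) (an illustrative assumption, not from the
# notes). For deterministic X, N_1 = int_0^1 X_s dW_s is Gaussian, and
# Z_1 = exp(N_1 - <N>_1 / 2) has expectation exactly 1.
rng = np.random.default_rng(1)
n_paths, n_steps, T = 100_000, 400, 1.0
dt = T / n_steps
t_left = np.arange(n_steps) * dt                # left endpoints of the time grid

dW = rng.normal(0.0, np.sqrt(dt), size=(n_paths, n_steps))
X = np.sin(t_left)                              # bounded integrand, |X| <= 1
N1 = dW @ X                                     # N_1 = int_0^1 X_s dW_s (Ito sum)
qv = np.sum(X**2) * dt                          # <N>_1 = int_0^1 X_s^2 ds
Z1 = np.exp(N1 - 0.5 * qv)                      # stochastic exponential at t = 1

print(Z1.mean())                                # should be close to 1
```

Replacing $\sin(s)$ by an unbounded, path-dependent integrand is exactly the situation where the martingale property can fail and Novikov's condition below becomes relevant.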
Theorem 1.4 (Girsanov 1960). Assume that $Z$ defined by (1.1.3) is a martingale. Define
$$\tilde W_t^i := W_t^i - \int_0^t X_s^i\,ds, \qquad i \in \{1, \dots, d\},\ t \ge 0.$$
Then, for each fixed $T \ge 0$, the process $\{\tilde W_t,\ t \in [0,T]\}$ is an $\mathbb{F}^T$-Brownian motion on $(\Omega, \mathcal{F}_T, \mathbb{F}^T, \tilde P_T)$.

For the proof we require two lemmas. We denote by $\tilde E_T$ the expectation with respect to $\tilde P_T$.

Lemma 1.5. Let $T > 0$ and assume that $Z$ is a martingale. If $0 \le s \le t \le T$ and $Y$ is real-valued, $\mathcal{F}_t$-measurable and satisfies $\tilde E_T|Y| < \infty$, then
$$\tilde E_T[Y \mid \mathcal{F}_s] = \frac1{Z_s}\, E[Y Z_t \mid \mathcal{F}_s], \qquad a.s.\ \text{w.r.t. } P \text{ and } \tilde P_T.$$

Proof. We have, for $A \in \mathcal{F}_s$,
$$\tilde E_T[\mathbf{1}_A Y] = E[\mathbf{1}_A Y Z_T] = E\big[\mathbf{1}_A Y\, E(Z_T \mid \mathcal{F}_t)\big] = E[\mathbf{1}_A Y Z_t] = E\big[\mathbf{1}_A\, E(Y Z_t \mid \mathcal{F}_s)\big]$$
and
$$\tilde E_T\Big[\mathbf{1}_A \frac1{Z_s} E(Y Z_t \mid \mathcal{F}_s)\Big] = E\Big[\mathbf{1}_A \frac1{Z_s} E(Y Z_t \mid \mathcal{F}_s)\, Z_T\Big] = E\Big[\mathbf{1}_A \frac{Z_s}{Z_s} E(Y Z_t \mid \mathcal{F}_s)\Big] = E\big[\mathbf{1}_A\, E(Y Z_t \mid \mathcal{F}_s)\big].$$

In the following proposition, we denote by $\mathcal{M}_{loc,T}$ the set of continuous local $\mathbb{F}$-martingales on $[0,T]$ with initial condition $M_0 = 0$ with respect to the original measure $P$, and by $\tilde{\mathcal{M}}_{loc,T}$ the corresponding set with $P$ replaced by $\tilde P_T$.

Proposition 1.6. Fix $T > 0$, assume that $Z$ defined as in (1.1.3) is a martingale, and let $M \in \mathcal{M}_{loc,T}$. Then
$$\tilde M_t := M_t - \sum_{i=1}^d \int_0^t X_s^i\,d\langle M, W^i\rangle_s, \qquad 0 \le t \le T,$$
is in $\tilde{\mathcal{M}}_{loc,T}$. If $N \in \mathcal{M}_{loc,T}$ and
$$\tilde N_t := N_t - \sum_{i=1}^d \int_0^t X_s^i\,d\langle N, W^i\rangle_s, \qquad 0 \le t \le T,$$
then
$$\langle \tilde M, \tilde N\rangle_t = \langle M, N\rangle_t, \qquad 0 \le t \le T,\ a.s.$$

Proof. We first assume that $M$ and $N$ are bounded martingales with bounded quadratic variation and that $Z_t$ and $\int_0^t |X_s|^2\,ds$ are bounded in $t$ and $\omega$ (the general case can then be shown by stopping). The Kunita–Watanabe inequality (WT3, Proposition 2.8) shows that
$$\Big(\int_0^t X_s^i\,d\langle M, W^i\rangle_s\Big)^2 \le \langle M\rangle_t \int_0^t (X_s^i)^2\,ds,$$
so $\tilde M$ is also bounded. The integration-by-parts formula (WT3, Proposition 2.12) implies
$$Z_t \tilde M_t = \int_0^t Z_s\,dM_s + \sum_{i=1}^d \int_0^t \tilde M_s X_s^i Z_s\,dW_s^i,$$
which is a martingale under $P$. Now Lemma 1.5 immediately implies that $\tilde M \in \tilde{\mathcal{M}}_{loc,T}$. The last claim in the statement of the proposition follows since $M$ and $\tilde M$ (resp. $N$ and $\tilde N$) differ only by a process of locally finite variation.

Proof of Theorem 1.4. We show that the process $\tilde W$ in the statement of the theorem satisfies the assumptions of Theorem 1.2 under $\tilde P_T$. Setting $M = W^j$ in the previous proposition, we see that $\tilde M = \tilde W^j$ is in $\tilde{\mathcal{M}}_{loc,T}$. Setting $N = W^k$, we see that
$$\langle \tilde W^j, \tilde W^k\rangle_t = \langle W^j, W^k\rangle_t = \delta_{jk}\, t, \qquad 0 \le t \le T,\ a.s.,$$
so the claim follows.
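Girsanov's theorem can be checked numerically in the simplest case of a constant drift. The sketch below (an illustration under assumed parameters, not part of the notes) uses $d = 1$, $X_s \equiv \mu$ and $T = 1$: then $\tilde E_T[\mathbf{1}_{\{W_1 > 1\}}] = E[\mathbf{1}_{\{W_1 > 1\}} Z_1]$, and under $\tilde P_1$ the variable $W_1 = \tilde W_1 + \mu$ is $N(\mu, 1)$-distributed, so both sides equal $P(N(\mu,1) > 1)$.

```python
import math
import numpy as np

# Minimal sketch of Girsanov reweighting with constant drift X_s = mu
# (parameters mu = 0.5, T = 1 are illustrative assumptions). The density is
# Z_1 = exp(mu*W_1 - mu^2/2); under P~_1 the endpoint W_1 is N(mu, 1).
rng = np.random.default_rng(2)
mu, n = 0.5, 200_000
W1 = rng.normal(0.0, 1.0, size=n)               # W_1 under P (only the endpoint matters here)

Z1 = np.exp(mu * W1 - 0.5 * mu**2)              # Girsanov density at T = 1
reweighted = np.mean((W1 > 1.0) * Z1)           # E[1_{W_1 > 1} Z_1] = P~_1(W_1 > 1)
exact = 1.0 - 0.5 * (1.0 + math.erf((1.0 - mu) / math.sqrt(2.0)))  # P(N(mu,1) > 1)
print(reweighted, exact)
```

This change-of-measure trick is what is exploited analytically in the application of the next section.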

1.2 Novikov's Condition

Theorem 1.7. If $N$ is a continuous local martingale with $N_0 = 0$ such that
$$E\exp\Big\{\frac12\langle N\rangle_T\Big\} < \infty, \qquad T < \infty,$$
then the process $Z = \mathcal{E}(N)$ is a martingale. In particular, if $X$, $W$ and $Z$ are defined as in (1.1.3) and
$$E\exp\Big\{\frac12\int_0^T |X_s|^2\,ds\Big\} < \infty, \qquad T < \infty,$$
then $Z$ is a martingale.

Proof. Following [Eb16] we prove the result only under the slightly stronger condition
$$E\exp\Big\{\frac p2\langle N\rangle_T\Big\} < \infty, \qquad T < \infty,$$
for some $p > 1$. This simplifies the proof considerably. For the general case see [KS91] or [Bo18]. Since $Z$ is a continuous local martingale, there exists a localizing sequence of stopping times $T_n$, $n \in \mathbb{N}$, such that $T_n \uparrow \infty$ almost surely and $t \mapsto Z_{t\wedge T_n}$ is a continuous martingale for each $n \in \mathbb{N}$. If, for each $t \ge 0$, the sequence $(Z_{t\wedge T_n})_n$ is uniformly integrable, then by Satz 1.39 in WT2 we have $\lim_n Z_{t\wedge T_n} = Z_t$ in $L^1$, and therefore the martingale property of $Z_{t\wedge T_n}$ is inherited by $Z$ by letting $n \to \infty$.

It remains to show that the sequence $(Z_{t\wedge T_n})_n$ is uniformly integrable for each fixed $t > 0$. Let $c > 0$ and define $q > 1$ such that $\frac1p + \frac1q = 1$. Then, using Hölder's inequality, we get
$$E\big[Z_{t\wedge T_n}\mathbf{1}_{\{Z_{t\wedge T_n}\ge c\}}\big] = E\Big[\exp\Big\{N_{t\wedge T_n} - \frac p2\langle N\rangle_{t\wedge T_n}\Big\}\exp\Big\{\frac{p-1}2\langle N\rangle_{t\wedge T_n}\Big\}\mathbf{1}_{\{Z_{t\wedge T_n}\ge c\}}\Big]$$
$$\le \Big(E\exp\Big\{pN_{t\wedge T_n} - \frac{p^2}2\langle N\rangle_{t\wedge T_n}\Big\}\Big)^{1/p}\Big(E\exp\Big\{\frac{q(p-1)}2\langle N\rangle_{t\wedge T_n}\Big\}\mathbf{1}_{\{Z_{t\wedge T_n}\ge c\}}\Big)^{1/q} \le \Big(E\exp\Big\{\frac p2\langle N\rangle_t\Big\}\mathbf{1}_{\{Z_{t\wedge T_n}\ge c\}}\Big)^{1/q},$$
where we used that $\exp\{pN - \frac{p^2}2\langle N\rangle\} = \mathcal{E}(pN)$ is a nonnegative local martingale and hence a supermartingale with expectation at most $1$, and that $q(p-1) = p$. To show uniform integrability we have to show that if we first take the supremum over all $n \in \mathbb{N}$ and then the limit as $c \to \infty$, then this limit is $0$. Lemma 1.38 in WT2 shows that it is actually sufficient to take $\limsup_n$ instead of $\sup_{n\in\mathbb{N}}$, since $\mathbb{N}$ is countable. Observe that
$$\limsup_n E\big[Z_{t\wedge T_n}\mathbf{1}_{\{Z_{t\wedge T_n}\ge c\}}\big] \le \Big(E\exp\Big\{\frac p2\langle N\rangle_t\Big\}\mathbf{1}_{\{Z_t\ge c\}}\Big)^{1/q}$$
by dominated convergence. Since $\exp\{\frac p2\langle N\rangle_t\}$ is integrable, the limit as $c \to \infty$ is $0$, so the proof is complete.

Remark 1.8. The previous theorem is known to be wrong if $1/2$ is replaced by any number $p < 1/2$ (see Remark 5.17 in [KS91]), even in case $d = 1$.

1.3 Application: Brownian Motion with Drift

We show how Girsanov's theorem can be employed to compute the density of the first passage times for one-dimensional Brownian motion with drift. For a one-dimensional Brownian motion $W$ and $b > 0$, let
$$\tau_b := \inf\{t \ge 0 : W_t = b\}$$
denote the first passage time to level $b$. Note that $P(\tau_b \le t) = P(\sup_{s\le t} W_s \ge b)$, and that we computed the latter quantity in WT2 by applying Donsker's invariance principle. It is therefore easy to show that $\tau_b$ has the density
$$f_b(t) = \frac b{\sqrt{2\pi t^3}}\exp\Big\{-\frac{b^2}{2t}\Big\}, \qquad t > 0.$$
Can we also compute the density $f_b^\mu$ of $\tau_b$ for Brownian motion with drift $\mu$? Indeed we can, and Girsanov's theorem helps us do this.

Let $\mu \ne 0$, let $d = 1$, and define $\tilde W_t := W_t - t\mu$, $t \ge 0$. This corresponds to the choice $X_s \equiv \mu$ in Girsanov's theorem. Clearly, Novikov's condition is satisfied and hence
$$Z_t := \exp\big\{\mu W_t - \tfrac12\mu^2 t\big\}, \qquad t \ge 0,$$
is a martingale. Fix $T > 0$. By Girsanov's theorem, $\tilde W$ is a Brownian motion on $[0,T]$ under $\tilde P_T$, and hence $W_t = \tilde W_t + \mu t$, $0 \le t \le T$, is a Brownian motion with drift $\mu$ under $\tilde P_T$. Denoting by $\tau_b$ the first passage time of $W$ to $b$, we get, for $t \in [0,T]$,
$$\tilde P_T(\tau_b \le t) = \int \mathbf{1}_{\{\tau_b\le t\}}\,d\tilde P_T = \int \mathbf{1}_{\{\tau_b\le t\}} Z_T\,dP = \int \mathbf{1}_{\{\tau_b\le t\}} Z_{\tau_b}\,dP = E\big[\mathbf{1}_{\{\tau_b\le t\}}\exp\{\mu b - \tfrac12\mu^2\tau_b\}\big] = \int_0^t \exp\{\mu b - \tfrac12\mu^2 s\}\, f_b(s)\,ds,$$
where we applied the optional sampling theorem to the continuous martingale $Z$. It follows that the first passage time $\tau_b$ for Brownian motion with drift $\mu$ has a density given by
$$f_b^\mu(s) = \exp\{\mu b - \tfrac12\mu^2 s\}\, f_b(s) = \frac b{\sqrt{2\pi s^3}}\exp\Big\{-\frac{(b-\mu s)^2}{2s}\Big\}, \qquad s > 0.$$
Note that $f_b^\mu$ is a true density (i.e. has integral $1$) iff $\mu > 0$. If $\mu < 0$, then the integral is strictly smaller than $1$, which is due to the fact that Brownian motion with drift $\mu < 0$ has a positive probability of never reaching level $b > 0$.
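The defective mass in the case $\mu < 0$ can be checked numerically. The following minimal sketch (the parameters $b = 1$, $\mu = -1$ and the integration grid are illustrative assumptions) integrates $f_b^\mu$ and compares the total mass with the well-known hitting probability $P(\tau_b < \infty) = e^{2\mu b}$ for negative drift.

```python
import numpy as np

# Minimal sketch: total mass of the first-passage density f_b^mu for mu < 0.
# For b > 0 and mu < 0, P(tau_b < infinity) = exp(2*mu*b), so the integral
# of the density over (0, infinity) should equal exp(-2) for b = 1, mu = -1.
b, mu = 1.0, -1.0
s = np.linspace(1e-8, 200.0, 2_000_001)         # the integrand vanishes rapidly near 0 and at infinity
f = b / np.sqrt(2.0 * np.pi * s**3) * np.exp(-(b - mu * s) ** 2 / (2.0 * s))

ds = s[1] - s[0]                                # uniform grid: trapezoidal rule by hand
mass = ds * (f.sum() - 0.5 * (f[0] + f[-1]))
print(mass, np.exp(2.0 * mu * b))
```

Repeating the computation with $\mu > 0$ gives total mass $1$, in line with the dichotomy stated above.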

Chapter 2
Local Time

In this chapter we introduce the concept of the local time of a real-valued continuous semimartingale $X$. Is $|X|$ a semimartingale? Itô's formula can certainly not be applied, but we will see that the answer is nevertheless yes. To show this we define $f(x) = |x|$, $x \in \mathbb{R}$, approximate $f$ by smoother functions $f_n$, apply Itô's formula to $f_n(X)$ and then take the limit $n \to \infty$. Specifically, we choose $f_n \in C^2(\mathbb{R}, \mathbb{R})$ as follows:

(i) $f_n(x) = |x|$ for $|x| \ge 1/n$,
(ii) $f_n(x) = f_n(-x)$, $x \in \mathbb{R}$,
(iii) $f_n''(x) \ge 0$, $x \in \mathbb{R}$.

Hence $f_n$ converges to $f$ uniformly and $f_n'(x)$ converges to $\mathrm{sgn}(x)$ pointwise (we define $\mathrm{sgn}(x) = 1$ if $x > 0$, $\mathrm{sgn}(x) = -1$ if $x < 0$ and $\mathrm{sgn}(0) = 0$). In addition,
$$\int_{-1}^1 f_n''(x)\,dx = 2 \quad \text{for all } n.$$
Itô's formula applied to $f_n$ yields
$$f_n(X_t) - f_n(X_0) = \int_0^t f_n'(X_s)\,dX_s + \frac12\int_0^t f_n''(X_s)\,d\langle X\rangle_s. \tag{2.0.1}$$
In order to see that the stochastic integral converges as $n \to \infty$ we need the following lemma (note that Lemma 2.16 in WT3 about ucp-convergence does not apply here). We formulate the lemma in a way which meets our demands, but not more.

Lemma 2.1. Let $H^n$, $n \in \mathbb{N}$, be a sequence of uniformly bounded progressive processes, i.e. there exists a number $C \in \mathbb{R}$ such that $|H_t^n(\omega)| \le C$ for all $t \ge 0$, $n \in \mathbb{N}$ and $\omega \in \Omega$, and assume that $H_t(\omega) := \lim_n H_t^n(\omega)$ exists for all $t \ge 0$ and $\omega \in \Omega$. Let $Y$ be a continuous semimartingale such that $Y_0 = 0$. Then
$$\int_0^\cdot H_s^n\,dY_s \to \int_0^\cdot H_s\,dY_s \quad \text{ucp}.$$

Proof. We follow Section 8.3 of [WW9]. Let $Y = M + A$ be the unique decomposition of $Y$ such that $M \in \mathcal{M}_{loc}$ and $A \in \mathcal{A}$. Clearly we have
$$\int_0^\cdot H_s^n\,dA_s \to \int_0^\cdot H_s\,dA_s \quad \text{ucp}$$
by dominated convergence,

so it remains to show the claim in case $Y = M$. Let
$$T_k := \inf\{t \ge 0 : |M_t| + \langle M\rangle_t \ge k\}\wedge k, \qquad k \in \mathbb{N}.$$
The BDG inequality implies
$$E\sup_{t\le k}\Big|\int_0^t \mathbf{1}_{[0,T_k]}(s)\big(H_s^n - H_s\big)\,dM_s\Big|^2 \le C_2\, E\int_0^k \mathbf{1}_{[0,T_k]}(s)\big(H_s^n - H_s\big)^2\,d\langle M\rangle_s \to 0$$
as $n \to \infty$ by dominated convergence. In particular we have, for fixed $k \in \mathbb{N}$,
$$\int_0^\cdot \mathbf{1}_{[0,T_k]} H_s^n\,dM_s \to \int_0^\cdot \mathbf{1}_{[0,T_k]} H_s\,dM_s \quad \text{ucp}.$$
Letting $k \to \infty$, the claim follows since $T_k \uparrow \infty$ almost surely (check this!).

Now we are able to pass to the limit in (2.0.1). The previous lemma (taking $C = 1$) shows that
$$\int_0^\cdot f_n'(X_s)\,dX_s \to \int_0^\cdot \mathrm{sgn}(X_s)\,dX_s \quad \text{ucp}.$$
Therefore, the second integral in (2.0.1) converges ucp to a process
$$L_t := \lim_n \frac12\int_0^t f_n''(X_s)\,d\langle X\rangle_s = |X_t| - |X_0| - \int_0^t \mathrm{sgn}(X_s)\,dX_s. \tag{2.0.2}$$
The first equality shows that $L \in \mathcal{A}^+$. The second representation shows that the process $L$ does not depend on the particular choice of the $f_n$. $L$ is called the local time at $0$ of the semimartingale $X$ (the reason for this name will become clear in the next theorem). Note that we have shown the following formula for $f(x) = |x|$:
$$f(X_t) = f(X_0) + \int_0^t \mathrm{sgn}(X_s)\,dX_s + L_t,$$
showing that $f(X_t) = |X_t|$ is indeed a continuous semimartingale.

Theorem 2.2. Let $X$ be a continuous semimartingale with local time $L$ given by (2.0.2). Then
$$L_t = \lim_{\varepsilon\downarrow 0}\frac1{2\varepsilon}\int_0^t \mathbf{1}_{\{|X_s|\le\varepsilon\}}\,d\langle X\rangle_s \quad \text{in probability}.$$
In particular, for an $\mathbb{F}$-Brownian motion $X = W$,
$$L_t = \lim_{\varepsilon\downarrow 0}\frac1{2\varepsilon}\,\lambda\{s \le t : |W_s| \le \varepsilon\},$$
where $\lambda$ denotes Lebesgue measure on $[0,\infty)$.

Proof. The second claim clearly follows from the first since $\langle W\rangle_s = s$. To show the first claim we choose the $f_n$ in a particular way, namely such that, in addition to the previous assumptions,
$$n\,\mathbf{1}_{[-\frac1{n+1},\frac1{n+1}]} \;\le\; f_n'' \;\le\; (n+1)\,\mathbf{1}_{[-\frac1n,\frac1n]}.$$
Then
$$\frac n2\int_0^t \mathbf{1}_{\{|X_s|\le\frac1{n+1}\}}\,d\langle X\rangle_s \;\le\; \frac12\int_0^t f_n''(X_s)\,d\langle X\rangle_s \;\le\; \frac{n+1}2\int_0^t \mathbf{1}_{\{|X_s|\le\frac1n\}}\,d\langle X\rangle_s,$$
which implies the first claim (check this!). Note that $L_t = \lim_n \dots$ along the sequence $\varepsilon = 1/n$ implies $L_t = \lim_{\varepsilon\downarrow 0}\dots$ by monotonicity.
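The two representations of local time in this chapter can be compared on a simulated Brownian path. The sketch below (step count, $\varepsilon$ and the seed are illustrative assumptions) computes the Tanaka-type formula (2.0.2) and the occupation-time approximation of Theorem 2.2 for the same discretized path; both are rough approximations, so only crude agreement should be expected.

```python
import numpy as np

# Minimal sketch: local time at 0 of a Brownian path W on [0, 1], computed
# two ways. L_tanaka discretizes L_1 = |W_1| - int_0^1 sgn(W_s) dW_s from
# (2.0.2); L_occ discretizes (1/(2*eps)) * lambda{s <= 1 : |W_s| <= eps}
# from Theorem 2.2.
rng = np.random.default_rng(3)
n_steps, T, eps = 400_000, 1.0, 0.02
dt = T / n_steps

dW = rng.normal(0.0, np.sqrt(dt), size=n_steps)
W = np.cumsum(dW)
W_left = np.concatenate([[0.0], W[:-1]])        # left endpoints of the grid, W_0 = 0

L_tanaka = np.abs(W[-1]) - np.sum(np.sign(W_left) * dW)
L_occ = dt * np.sum(np.abs(W_left) <= eps) / (2.0 * eps)
print(L_tanaka, L_occ)
```

Both estimators carry discretization error of different kinds (sign errors at zero crossings versus the finite $\varepsilon$), which is why the agreement is only approximate for any fixed grid.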

Chapter 3
Weak Solutions and Martingale Problems

3.1 Weak solutions of stochastic differential equations

We will basically follow the approach presented in [KS91]. Consider the following stochastic differential equation:
$$dX_t = b(t, X_t)\,dt + \sum_{k=1}^m \sigma_k(t, X_t)\,dW_t^k, \tag{3.1.1}$$
where $W^1, \dots, W^m$ are independent Brownian motions and $b$ and $\sigma_1, \dots, \sigma_m$ are measurable functions mapping $[0,\infty)\times\mathbb{R}^d$ to $\mathbb{R}^d$. We start with the definition of a weak solution of (3.1.1).

Definition 3.1. A weak solution of (3.1.1) is a tuple $(X, W)$, $(\Omega, \mathcal{F}, \mathbb{F}, P)$, where

(i) $(\Omega, \mathcal{F}, \mathbb{F}, P)$ is a FPS (filtered probability space) which satisfies the usual conditions,
(ii) $X$ is a continuous, adapted $\mathbb{R}^d$-valued process and $W$ is an $m$-dimensional $\mathbb{F}$-Brownian motion,
(iii) $\int_0^t \big(|b(s, X_s)| + \sum_{k=1}^m |\sigma_k(s, X_s)|^2\big)\,ds < \infty$ almost surely for every $t \ge 0$,
(iv) $X_t = X_0 + \int_0^t b(s, X_s)\,ds + \sum_{k=1}^m \int_0^t \sigma_k(s, X_s)\,dW_s^k$, $0 \le t < \infty$, holds almost surely.

Remark 3.2. Occasionally, we fix $T > 0$ and consider weak solutions on the interval $[0,T]$ instead of $[0,\infty)$. It should be clear what we mean by this, so we do not provide a formal definition.

Note that it might well happen that a weak solution $(X, W)$ exists but that $X$ is not adapted to the augmented filtration generated by $W$ (here augmented means the smallest filtration which contains the filtration generated by $W$ and which satisfies the usual conditions). We will see an example of this kind soon. Note that in WT3 we established a stronger form of solution to (3.1.1): we showed in Theorem 2.27, under appropriate assumptions called (H), that the sde (3.1.1) has a unique solution with initial condition $x \in \mathbb{R}^d$ on any FPS carrying an $\mathbb{F}$-Brownian motion $W$. In particular, we can choose the filtration $\mathbb{F}$ to be the augmented filtration generated by $W$. Such a solution is called a strong solution.

We will often allow the initial condition $X_0 = \xi$ to be random. In this case $\xi$ is $\mathcal{F}_0$-measurable and therefore independent of the filtration generated by $W$. Then a process $X$ on a given FPS $(\Omega, \mathcal{F}, \mathbb{F}, P)$ with a given $\mathbb{F}$-Brownian motion $W$ and a given $\mathcal{F}_0$-measurable $\xi$ is called a strong solution if $(X, W)$, $(\Omega, \mathcal{F}, \mathbb{F}, P)$ is a weak solution which is adapted to the augmented filtration generated by $W$ and $\xi$ and which satisfies the initial condition $X_0 = \xi$ almost surely.

By definition, every strong solution is a weak solution. We will see that the converse is not true. One may ask which of the two concepts of a solution is the more natural one. There is no clear answer to this question; it depends on which kind of phenomenon one would like to model by an sde. If we think of $W$ as an input to a system and $X$ as the output, then one can argue that the output should be a function of the input, and in this case strong solutions seem more natural. On the other hand, one can take a pragmatic viewpoint by saying that one just wants to describe some random phenomenon via an sde and does not care whether the solution is a function of the input. In this case weak solutions are more appropriate.

Having defined the concept of a weak solution, we define what we mean by uniqueness of solutions.

Definition 3.3. We say that pathwise uniqueness holds for (3.1.1) if, whenever $(X, W)$, $(\Omega, \mathcal{F}, \mathbb{F}, P)$ and $(\tilde X, W)$, $(\Omega, \mathcal{F}, \tilde{\mathbb{F}}, P)$ are two weak solutions on the same probability space $(\Omega, \mathcal{F}, P)$ (with possibly different filtrations) such that $P(X_0 = \tilde X_0) = 1$, then the processes $X$ and $\tilde X$ are indistinguishable, i.e. $P(X_t = \tilde X_t,\ 0 \le t < \infty) = 1$.

Remark 3.4.
Some authors define pathwise uniqueness slightly differently, by assuming that $\mathbb{F}$ and $\tilde{\mathbb{F}}$ coincide. Formally, our definition is stronger (i.e. more restrictive). In fact, the two definitions can be shown to be equivalent; see [IW89].

Remark 3.5. The proof of Theorem 2.27 in WT3 shows that pathwise uniqueness holds under the assumptions (H) stated there.

Definition 3.6. We say that uniqueness in law (or weak uniqueness) holds for (3.1.1) if, for any two weak solutions $(X, W)$, $(\Omega, \mathcal{F}, \mathbb{F}, P)$ and $(\tilde X, \tilde W)$, $(\tilde\Omega, \tilde{\mathcal{F}}, \tilde{\mathbb{F}}, \tilde P)$ with the same initial distribution, i.e. $\mathcal{L}(X_0) = \mathcal{L}(\tilde X_0)$, the processes $X$ and $\tilde X$ have the same law.

The following classical example due to H. Tanaka shows that a weak solution need not be a strong solution and that uniqueness in law does not imply pathwise uniqueness.

Example 3.7. Consider the one-dimensional sde
$$dX_t = \mathrm{sign}(X_t)\,dW_t, \qquad 0 \le t < \infty, \tag{3.1.2}$$
where
$$\mathrm{sign}(x) = \begin{cases} 1, & x > 0,\\ -1, & x \le 0.\end{cases}$$
If $(X, W)$, $(\Omega, \mathcal{F}, \mathbb{F}, P)$ is a weak solution, then
$$B_t := \int_0^t \mathrm{sign}(X_s)\,dW_s$$

is a continuous local $\mathbb{F}$-martingale starting at $0$ with quadratic variation
$$\langle B\rangle_t = \int_0^t \mathrm{sign}^2(X_s)\,ds = t.$$
Therefore, $B$ is an $\mathbb{F}$-Brownian motion by Lévy's theorem. Note that $X_t = X_0 + B_t$, $t \ge 0$, and that the process $B$ is independent of $\mathcal{F}_0$. Since $X_0$ is $\mathcal{F}_0$-measurable, $B$ and $X_0$ are independent, so the law of $X$ is uniquely determined by that of $X_0$, and weak uniqueness holds for (3.1.2).

Let us now show existence of a weak solution. Let $X$ be an $\mathbb{F}$-Brownian motion on some FPS $(\Omega, \mathcal{F}, \mathbb{F}, P)$ with $\mathcal{F}_0$-measurable initial condition $X_0$ and define
$$W_t := \int_0^t \mathrm{sign}(X_s)\,dX_s.$$
Then $W$ is an $\mathbb{F}$-Brownian motion by Lévy's theorem and
$$\int_0^t \mathrm{sign}(X_s)\,dW_s = \int_0^t \mathrm{sign}^2(X_s)\,dX_s = X_t - X_0,$$
so $(X, W)$ is a weak solution!

Next we show that there is no pathwise uniqueness in case the initial condition is $X_0 = 0$. Let $(X, W)$, $(\Omega, \mathcal{F}, \mathbb{F}, P)$ be the weak solution which we just constructed, with $X_0 = 0$. Then $(-X, W)$, $(\Omega, \mathcal{F}, \mathbb{F}, P)$ is also a weak solution (in spite of the slight asymmetry in the definition of the function sign)! Therefore, pathwise uniqueness does not hold for (3.1.2).

If $(X, W)$ is any weak solution of (3.1.2) with initial condition $0$, then
$$\int_0^t \mathrm{sign}(X_s)\,dX_s = \int_0^t \mathrm{sign}^2(X_s)\,dW_s = W_t,$$
so $W$ is measurable with respect to the filtration generated by $X$. We will see in a few seconds that $X$ is not measurable with respect to the filtration generated by $W$, so $X$ is not a strong solution. Since any strong solution is a weak solution, this shows that no strong solution of (3.1.2) with initial condition $0$ exists.

Since the process $X$ is an $\mathbb{F}$-Brownian motion (for some filtration $\mathbb{F}$), it can be represented, by the results of the last chapter, as
$$|X_t| = \int_0^t \mathrm{sgn}(X_s)\,dX_s + L_t = \int_0^t \mathrm{sign}(X_s)\,dX_s + L_t = W_t + L_t,$$
where $L$ is the local time of the semimartingale $X$ at $0$ (sgn and sign differ only on the set $\{X_s = 0\}$, which has Lebesgue-null occupation time). The second part of Theorem 2.2 shows that $L$ is not only $\mathbb{F}$-adapted but even adapted to the filtration generated by $(|X_t|)_{t\ge 0}$. Therefore the same holds true for $W$.
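The construction of the weak solution above can be sketched numerically (the seed and grid are illustrative assumptions, not part of the notes): starting from a Brownian path $X$ with $X_0 = 0$, we build $W_t = \int_0^t \mathrm{sign}(X_s)\,dX_s$ and check that $W$ has quadratic variation close to $t$ (so Lévy's theorem makes it a Brownian motion) and that $\int_0^t \mathrm{sign}(X_s)\,dW_s$ recovers $X_t$ exactly.

```python
import numpy as np

# Minimal sketch of Tanaka's construction: X is Brownian motion with X_0 = 0,
# W is defined via dW = sign(X) dX with sign(0) = -1 as in (3.1.2).
rng = np.random.default_rng(4)
n_steps, T = 100_000, 1.0
dt = T / n_steps

dX = rng.normal(0.0, np.sqrt(dt), size=n_steps)
X = np.cumsum(dX)
X_left = np.concatenate([[0.0], X[:-1]])        # left endpoints, X_0 = 0
sgn = np.where(X_left > 0, 1.0, -1.0)           # sign(0) = -1

dW = sgn * dX                                   # increments of W = int sign(X) dX
qv_W = np.sum(dW**2)                            # discrete quadratic variation of W on [0, 1]
X_rec = np.sum(sgn * dW)                        # int_0^1 sign(X) dW; equals X_1 since sign^2 = 1
print(qv_W, X_rec, X[-1])
```

Note that the recovery $\int \mathrm{sign}(X)\,dW = X$ is exact even at the discrete level, because $\mathrm{sign}^2 \equiv 1$; what the simulation cannot show is the measurability obstruction, which is precisely the point of the example.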
Obviously, the filtration $(\mathcal{G}_t)_{t\ge 0}$ generated by $|X|$ is strictly smaller than that generated by $X$: the event $\{X_1 > 0\}$ is, for example, contained in $\sigma(X_s,\ s \le 1)$ but not in $\mathcal{G}_1$.

3.2 Weak solutions via Girsanov's theorem

The following proposition shows how Girsanov's theorem can be used to show existence of a weak solution to a certain class of stochastic differential equations.

Proposition 3.8. Fix $T > 0$ and consider the stochastic differential equation
$$dX_t = b(t, X_t)\,dt + dW_t, \qquad 0 \le t \le T, \tag{3.2.1}$$
where $W$ is a $d$-dimensional Brownian motion and $b: [0,T]\times\mathbb{R}^d \to \mathbb{R}^d$ is jointly measurable and bounded. Then, for any $x \in \mathbb{R}^d$, (3.2.1) has a weak solution with initial condition $X_0 = x$ almost surely.

Proof. Fix $x \in \mathbb{R}^d$ and let $(X_t)_{t\ge 0}$ be a $d$-dimensional $\mathbb{F}$-Brownian motion starting at $x$ on some FPS $(\Omega, \mathcal{F}, \mathbb{F}, P)$. By Theorem 1.7, the process
$$Z_t := \exp\Big\{\sum_{j=1}^d \int_0^t b_j(s, X_s)\,dX_s^j - \frac12\int_0^t |b(s, X_s)|^2\,ds\Big\}, \qquad t \ge 0,$$
is a continuous martingale, so Girsanov's theorem implies that under $Q$ given by $dQ/dP|_{\mathcal{F}_T} = Z_T$, the process
$$W_t := X_t - x - \int_0^t b(s, X_s)\,ds, \qquad 0 \le t \le T,$$
is an $\mathbb{F}^T$-Brownian motion starting at $0$. Therefore,
$$X_t = x + \int_0^t b(s, X_s)\,ds + W_t, \qquad 0 \le t \le T,$$
and hence $(X, W)$, $(\Omega, \mathcal{F}_T, \mathbb{F}^T, Q)$ is a weak solution of (3.2.1) with initial condition $Q(X_0 = x) = 1$.

Remark 3.9. The assumption that $b$ is bounded is stronger than necessary but simplifies the argument. For a more general result, which also allows the initial condition to be random, see [KS91] (the corresponding proposition and the remark following it). Uniqueness in law for (3.2.1) is also discussed in [KS91].

3.3 Martingale Problems

We continue to consider stochastic differential equations of the form
$$dX_t = b(t, X_t)\,dt + \sum_{k=1}^m \sigma_k(t, X_t)\,dW_t^k,$$
with measurable coefficients as before. The $d\times d$, symmetric and nonnegative definite matrix
$$a_{ij}(t, x) := \sum_{k=1}^m \sigma_k(t, x)_i\,\sigma_k(t, x)_j, \qquad i, j \in \{1, \dots, d\},$$
is called the diffusion matrix associated to (3.1.1). We denote by $C_c^k(\mathbb{R}^d)$ the space of real-valued $k$-times continuously differentiable functions on $\mathbb{R}^d$ with compact support, and $C_c^\infty(\mathbb{R}^d) := \bigcap_{k\in\mathbb{N}} C_c^k(\mathbb{R}^d)$. Then the following holds.

Proposition 3.10. Let $(X, W)$, $(\Omega, \mathcal{F}, \mathbb{F}, P)$ be a weak solution of (3.1.1). Then for all $f \in C^2(\mathbb{R}^d)$, the process
$$M_t^f := f(X_t) - f(X_0) - \int_0^t \Big(\sum_{i=1}^d b_i(s, X_s)\,\partial_i f(X_s) + \frac12\sum_{i,j=1}^d a_{ij}(s, X_s)\,\partial^2_{ij} f(X_s)\Big)\,ds$$
is a continuous local $\mathbb{F}$-martingale with $M_0^f = 0$. If $b$ and $\sigma_1, \dots, \sigma_m$ are locally bounded in both variables and $f \in C_c^2(\mathbb{R}^d)$, then $M^f$ is a continuous $L^2$-martingale.

Proof. This is a straightforward application of Itô's formula.

The idea of Stroock and Varadhan [SV79] was to characterize a weak solution as the solution of a local martingale problem. Let
$$S_d := \{A \in \mathbb{R}^{d\times d} : A \text{ is symmetric and nonnegative definite}\}$$
and let $b: [0,\infty)\times\mathbb{R}^d \to \mathbb{R}^d$ and $a: [0,\infty)\times\mathbb{R}^d \to S_d$ be measurable. For $t \ge 0$ we define the second order differential operator $A_t$ by
$$A_t f(x) := \sum_{i=1}^d b_i(t, x)\,\partial_i f(x) + \frac12\sum_{i,j=1}^d a_{ij}(t, x)\,\partial^2_{ij} f(x) = \langle b(t, x), \nabla f(x)\rangle + \frac12\,\mathrm{trace}\big(a(t, x)\,\mathrm{Hess}_f(x)\big), \qquad f \in C^2(\mathbb{R}^d).$$

Definition 3.11. Let $b$ and $a$ be as above. A continuous $\mathbb{R}^d$-valued stochastic process $X = (X_t)_{t\ge 0}$ on some probability space $(\Omega, \mathcal{F}, P)$ is called a solution to the local martingale problem for $(a, b)$ (or for the associated family of operators $(A_t)_{t\ge 0}$) if for each $f \in C_c^\infty(\mathbb{R}^d)$, the process
$$M_t^f := f(X_t) - f(X_0) - \int_0^t A_s f(X_s)\,ds$$
is a continuous local martingale with respect to the filtration generated by $X$. In short, we say that $X$ solves LMP$(a, b)$. If $X$ has initial distribution $\mu = P X_0^{-1}$, then we say that $X$ solves LMP$(a, b, \mu)$. If $M^f$ is a martingale for each $f \in C_c^\infty(\mathbb{R}^d)$, then we say that $X$ solves the associated martingale problem and write MP$(a, b)$ and MP$(a, b, \mu)$. Finally, we say that the solution to LMP$(a, b, \mu)$ (or MP$(a, b, \mu)$) is unique if for any two solutions $X$ and $\tilde X$ (on possibly different spaces) the laws of $X$ and $\tilde X$ coincide.

Remark 3.12. If $X$ solves LMP$(a, b, \mu)$, then we can transfer $X$ to the following canonical setup: consider the probability space
$$(\Omega, \mathcal{F}, P) := \big(C([0,\infty), \mathbb{R}^d),\ \mathcal{B}(C([0,\infty), \mathbb{R}^d)),\ \mathcal{L}(X)\big),$$
where $\mathcal{L}(X)$ denotes the law of $X$ and $C([0,\infty), \mathbb{R}^d)$ is equipped with the usual topology (one can easily find a complete metric which makes the space Polish). Define the canonical process $\pi_t(\omega) := \omega_t$, $t \ge 0$. Then $\pi$ has the same law as $X$ and therefore $\pi$ also solves LMP$(a, b, \mu)$. In particular, we can identify a solution $X$ to LMP$(a, b, \mu)$ with the probability measure $\mathcal{L}(X)$ on the measurable space $(C([0,\infty), \mathbb{R}^d), \mathcal{B}(C([0,\infty), \mathbb{R}^d)))$. Analogous statements hold for LMP$(a, b)$, MP$(a, b)$, and MP$(a, b, \mu)$.
Therefore, we will often use formulations like "Let $P$ be a solution of LMP$(a, b, \mu)$" if $P$ is the law of a solution of LMP$(a, b, \mu)$.

We know already from the previous proposition that weak solutions solve the corresponding local martingale problem. Theorem 3.15 shows that the converse is also true. Before we formulate that result, we provide a representation theorem (due to Doob) of an $\mathbb{R}^d$-valued local martingale as a stochastic integral with respect to a Brownian motion on a possibly enlarged probability space. We will need that result in the proof of Theorem 3.15, but it is also of interest otherwise.

Definition 3.13. If $(\Omega, \mathcal{F}, \mathbb{F}, P)$ is a FPS and $m \in \mathbb{N}$ is fixed, then we define an $m$-extension $(\bar\Omega, \bar{\mathcal{F}}, \bar{\mathbb{F}}, \bar P)$ of $(\Omega, \mathcal{F}, \mathbb{F}, P)$ as follows. Let $\hat W$ be an $m$-dimensional $\hat{\mathbb{F}}$-Brownian motion on a FPS $(\hat\Omega, \hat{\mathcal{F}}, \hat{\mathbb{F}}, \hat P)$ and define
$$\check\Omega := \Omega\times\hat\Omega, \qquad \check{\mathcal{F}} := \mathcal{F}\otimes\hat{\mathcal{F}}, \qquad \check{\mathcal{F}}_t := \mathcal{F}_t\otimes\hat{\mathcal{F}}_t, \qquad \check P := P\otimes\hat P.$$
Let $(\bar\Omega, \bar{\mathcal{F}}, \bar{\mathbb{F}}, \bar P)$ be the smallest FPS which contains (or extends) $(\check\Omega, \check{\mathcal{F}}, \check{\mathbb{F}}, \check P)$ and which satisfies the usual conditions. Define $W_t(\bar\omega) = W_t(\omega, \hat\omega) := \hat W_t(\hat\omega)$, $t \ge 0$. Note that $W$ is a $\bar{\mathbb{F}}$-Brownian motion. For any adapted process $A$ on $(\Omega, \mathcal{F}, \mathbb{F}, P)$ we define $\tilde A_t(\bar\omega) = \tilde A_t(\omega, \hat\omega) := A_t(\omega)$, $t \ge 0$. Note that $\tilde A$ is $\bar{\mathbb{F}}$-adapted, and $\tilde A$ is $\bar{\mathbb{F}}$-progressive if $A$ is $\mathbb{F}$-progressive. We will drop the tildes whenever there is no danger of confusion.

Theorem 3.14. Let $M$ be a continuous $\mathbb{R}^d$-valued local $\mathbb{F}$-martingale with $M_0 = 0$ on some FPS $(\Omega, \mathcal{F}, \mathbb{F}, P)$ such that
$$\langle M^i, M^j\rangle_t = \int_0^t \sum_{k=1}^m V_s^{ik} V_s^{jk}\,ds, \qquad t \ge 0,$$
for progressive processes $(V_t^{ik})_{t\ge 0}$, $1 \le i \le d$, $1 \le k \le m$. Then on any $m$-extension $(\bar\Omega, \bar{\mathcal{F}}, \bar{\mathbb{F}}, \bar P)$ there exists an $\mathbb{R}^m$-valued $\bar{\mathbb{F}}$-Brownian motion $B$ such that
$$M_t^i = \sum_{k=1}^m \int_0^t V_s^{ik}\,dB_s^k.$$

Proof. For each $t \ge 0$ we regard $V_t := (V_t^{ik})_{i=1,\dots,d;\,k=1,\dots,m}$ as a linear map from $\mathbb{R}^m$ to $\mathbb{R}^d$ (or a $d\times m$-matrix). Let $N_t \subseteq \mathbb{R}^m$ be its null space and $R_t \subseteq \mathbb{R}^d$ its range. Let $V_t^{-1}: R_t \to N_t^\perp$ be the inverse of the bijective restriction of $V_t$ to $N_t^\perp$, and denote the orthogonal projection from a Euclidean space onto a subspace $U$ by $\pi_U$. Let $W$ be an $m$-dimensional Brownian motion as in the previous definition (in particular, $W$ is independent of $M$) and define
$$B_t := \int_0^t V_s^{-1}\pi_{R_s}\,dM_s + \int_0^t \pi_{N_s}\,dW_s, \qquad t \ge 0,$$
where we interpret $V_s^{-1}\pi_{R_s}$ as an $m\times d$-matrix (or linear map from $\mathbb{R}^d$ to $\mathbb{R}^m$). Note that the integrands are progressive and the stochastic integrals are well-defined! The covariation matrix of the continuous local martingale $B$ satisfies
$$\frac d{dt}\langle B^k, B^l\rangle = \sum_{i,j=1}^d \big(V_t^{-1}\pi_{R_t}\big)_{ki}\big(V_t^{-1}\pi_{R_t}\big)_{lj}\,\frac d{dt}\langle M^i, M^j\rangle + \sum_{\nu=1}^m \big(\pi_{N_t}\big)_{k\nu}\big(\pi_{N_t}\big)_{l\nu}.$$
Inserting $\frac d{dt}\langle M^i, M^j\rangle = (V_t V_t^T)_{ij}$, we obtain, in matrix notation (with $\cdot^T$ denoting the transpose),
$$\frac d{dt}\langle B\rangle_t = V_t^{-1}\pi_{R_t}\,V_t V_t^T\,\big(V_t^{-1}\pi_{R_t}\big)^T + \pi_{N_t}\pi_{N_t}^T = \pi_{N_t^\perp} + \pi_{N_t} = I_m.$$
Therefore, Lévy's theorem shows that $B$ is an $m$-dimensional Brownian motion. Further, using the facts that $V_s\pi_{N_s} = 0$, that $V_s V_s^{-1}\pi_{R_s} = \pi_{R_s}$, and that for $N \in \mathcal{M}_{loc}$, $\langle N\rangle = 0$ implies $N = 0$ (applied to $N = M - \int_0^\cdot \pi_{R_s}\,dM_s$), we get
$$\int_0^t V_s\,dB_s = \int_0^t V_s V_s^{-1}\pi_{R_s}\,dM_s + \int_0^t V_s\pi_{N_s}\,dW_s = \int_0^t \pi_{R_s}\,dM_s = M_t,$$
so the proof is complete.
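The linear algebra in the proof of Theorem 3.14 can be checked directly. The sketch below is an illustration with a fixed rank-deficient matrix $V$ (my choice, $d = m = 3$, rank $2$): the Moore–Penrose pseudoinverse plays the role of $V^{-1}\pi_R$, and the key identity $V^{-1}\pi_R\, V V^T\, (V^{-1}\pi_R)^T + \pi_N\pi_N^T = I_m$ is verified numerically.

```python
import numpy as np

# Minimal sketch of the projection identities behind Theorem 3.14. For a
# d x m matrix V, pinv(V) coincides with V^{-1} pi_R as an m x d map:
# V @ pinv(V) is the orthogonal projection pi_R onto the range of V, and
# pinv(V) @ V projects onto N^perp (the orthogonal complement of the null space).
m = 3
V = np.array([[1.0, 2.0, 0.0],
              [0.0, 1.0, 1.0],
              [1.0, 3.0, 1.0]])                 # third row = first + second, so rank 2
Vp = np.linalg.pinv(V)                          # Moore-Penrose pseudoinverse = V^{-1} pi_R

pi_R = V @ Vp                                   # projection onto the range R of V
pi_Nperp = Vp @ V                               # projection onto N^perp
pi_N = np.eye(m) - pi_Nperp                     # projection onto the null space N

# d/dt <B> = (V^{-1} pi_R)(V V^T)(V^{-1} pi_R)^T + pi_N pi_N^T = I_m
cov_rate = Vp @ (V @ V.T) @ Vp.T + pi_N @ pi_N.T
print(np.round(cov_rate, 10))
```

The two facts used at the end of the proof also hold at the matrix level: $V\pi_N = 0$ and $V V^{-1}\pi_R\,V = \pi_R V = V$.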

Theorem 3.15. Let $b: [0,\infty)\times\mathbb{R}^d \to \mathbb{R}^d$ and $a: [0,\infty)\times\mathbb{R}^d \to S_d$ be measurable. Suppose that $X$ is a solution to LMP$(a, b)$ on some probability space $(\Omega, \mathcal{F}, P)$. Let $m \in \mathbb{N}$ and let $\sigma: [0,\infty)\times\mathbb{R}^d \to \mathbb{R}^{d\times m}$ be a measurable function such that $a = \sigma\sigma^T$ (note that such a function always exists when $m \ge d$). Let $\mathbb{F}$ be the complete filtration on $(\Omega, \mathcal{F}, P)$ generated by $X$. Then, on any $m$-extension $(\bar\Omega, \bar{\mathcal{F}}, \bar{\mathbb{F}}, \bar P)$ of the FPS $(\Omega, \mathcal{F}, \mathbb{F}, P)$, there exists an $\mathbb{R}^m$-valued $\bar{\mathbb{F}}$-Brownian motion $B$ such that $(X, B)$, $(\bar\Omega, \bar{\mathcal{F}}, \bar{\mathbb{F}}, \bar P)$ is a weak solution of (3.1.1).

Proof. Fix $i \in \{1, \dots, d\}$. For each $n \in \mathbb{N}$, choose $f_i^n \in C_c^\infty(\mathbb{R}^d)$ with $f_i^n(x) = x_i$ for all $|x| \le n$. For each $n \in \mathbb{N}$, choose a localizing sequence $(T_m^n)_{m\in\mathbb{N}}$ of stopping times for the continuous local martingale $M^{f_i^n}$; we can and will assume that $T_n^n \le T_{n+1}^{n+1}$ for all $n \in \mathbb{N}$, so that $M^{f_i^n}_{t\wedge T_m^n}$ is a martingale for every $m, n \in \mathbb{N}$. Define the stopping time $\tau_n := \inf\{t \ge 0 : |X_t| \ge n\}$. Then
$$M_t^{f_i^n} = X_t^i - X_0^i - \int_0^t b_i(s, X_s)\,ds, \qquad 0 \le t \le \tau_n.$$
Observe that $S_n := \tau_n\wedge T_n^n$ is an increasing sequence of stopping times with $S_n \to \infty$ almost surely and that
$$M^{f_i^n}_{t\wedge S_n} = X^i_{t\wedge S_n} - X_0^i - \int_0^{t\wedge S_n} b_i(s, X_s)\,ds$$
is a martingale for each $n$. Consequently,
$$M_t^i := X_t^i - X_0^i - \int_0^t b_i(s, X_s)\,ds \tag{3.3.4}$$
is a continuous local martingale. This formula also shows that $X$ is a continuous semimartingale.

Next, fix $i, j \in \{1, \dots, d\}$ and choose $f_{i,j}^n \in C_c^\infty(\mathbb{R}^d)$ such that $f_{i,j}^n(x) = x_i x_j$ whenever $|x| \le n$. As above, one can show that
$$M_t^{i,j} := X_t^i X_t^j - X_0^i X_0^j - \int_0^t \big(X_s^i\, b_j(s, X_s) + X_s^j\, b_i(s, X_s) + a_{ij}(s, X_s)\big)\,ds$$
is a continuous local martingale. Integrating by parts and using (3.3.4), we obtain
$$M_t^{i,j} = \int_0^t X_s^i\,dX_s^j + \int_0^t X_s^j\,dX_s^i + \langle X^i, X^j\rangle_t - \int_0^t \big(X_s^i\, b_j(s, X_s) + X_s^j\, b_i(s, X_s) + a_{ij}(s, X_s)\big)\,ds$$
$$= \int_0^t X_s^i\,dM_s^j + \int_0^t X_s^j\,dM_s^i + \langle M^i, M^j\rangle_t - \int_0^t a_{ij}(s, X_s)\,ds.$$
Therefore, the process $\langle M^i, M^j\rangle_t - \int_0^t a_{ij}(s, X_s)\,ds$ is a continuous local martingale starting at $0$ which is of locally finite variation, so Theorem 1.57 in WT3 implies that the process is $0$ almost surely.
We have shown that the processes $M^i$ are continuous local martingales starting at $0$ such that $\langle M^i, M^j\rangle_t = \int_0^t a_{ij}(s, X_s)\,ds$. Therefore, the claim follows from Theorem 3.14, applied with $V_s = \sigma(s, X_s)$.

Remark 3.16. Note that the previous theorem establishes a one-to-one correspondence between weak solutions and solutions of (local) martingale problems, up to the non-unique choice of the matrix $\sigma$ when $a$ is given. In particular: the sde (3.1.1) with coefficients $b$ and $\sigma$ has a weak solution if and only if LMP$(\sigma\sigma^T, b)$ has a solution, and we have weak uniqueness if and only if LMP$(\sigma\sigma^T, b, \mu)$ has at most one solution $P \in \mathcal{M}_1(C([0,\infty), \mathbb{R}^d))$ for every $\mu \in \mathcal{M}_1(\mathbb{R}^d)$. Note that the law of a weak solution depends on $\sigma$ only via $\sigma\sigma^T$. Note also that the local martingale formulation does not say much about pathwise uniqueness. Indeed, for equation (3.1.2), we have $\sigma(x) = \mathrm{sign}(x)$ and therefore $a(x) \equiv 1$. We showed that weak uniqueness holds and that a weak solution exists for any initial law $\mu$; therefore the associated LMP$(1, 0, \mu)$ has a unique solution, namely standard Brownian motion with initial distribution $\mu$. Note that for the particular square root $\sigma(x) = \mathrm{sign}(x)$ of $a \equiv 1$, pathwise uniqueness does not hold and we do not have a strong solution (at least not if the initial condition is $0$), while for the root $\sigma(x) \equiv 1$ pathwise uniqueness holds and we have a strong solution.

3.4 Martingale Problems: Existence of solutions

We already stated an existence result for the solution of a local martingale problem in case $a \equiv I_d$, using Girsanov's theorem. Our aim in this section is to prove a similar statement for more general functions $a$. Before we formulate and prove an existence result for a solution of LMP$(a, b)$, we present a useful result about weak convergence (see [K2], Theorem 4.27) and then a tightness criterion for probability measures on the space $C([0,\infty), \mathbb{R}^d)$.

Proposition 3.17. Let $E$ and $\tilde E$ be metric spaces, let $\mu, \mu_1, \mu_2, \dots \in \mathcal{M}_1(E)$ be such that $\mu_n \to \mu$ weakly, and let $f, f_1, f_2, \dots: E \to \tilde E$ be Borel-measurable mappings. Assume that there exists a set $C \in \mathcal{B}(E)$ such that $\mu(C) = 1$ and $f_n(x_n) \to f(x)$ whenever $x_n \to x \in C$. Then $\mu_n \circ f_n^{-1} \to \mu \circ f^{-1}$ weakly.
Remark. In WT2 we treated the special case in which all functions $f_n$ equal $f$ and $f$ is continuous at $\mu$-almost every point of $E$.

Proof of the Proposition. Fix an open set $G \subseteq \tilde E$ and let $x \in f^{-1}(G) \cap C$ (in case this set is nonempty). Then there exist an open neighborhood $N$ of $x$ and some $m \in \mathbb N$ such that $f_k(y) \in G$ for all $k \ge m$ and all $y \in N$. Thus $N \subseteq \bigcap_{k \ge m} f_k^{-1}(G)$, and so
$$f^{-1}(G) \cap C \subseteq \bigcup_{m \in \mathbb N} \Big( \bigcap_{k \ge m} f_k^{-1}(G) \Big)^{\circ},$$
where $A^\circ$ denotes the interior of the set $A$. Using the Portmanteau theorem (from WT2) and recalling that $\mu(C) = 1$, we get
$$\mu\big(f^{-1}(G)\big) \le \mu\Big(\bigcup_{m} \Big(\bigcap_{k \ge m} f_k^{-1}(G)\Big)^{\circ}\Big) = \sup_{m} \mu\Big(\Big(\bigcap_{k \ge m} f_k^{-1}(G)\Big)^{\circ}\Big) \le \sup_{m} \liminf_{n} \mu_n\Big(\bigcap_{k \ge m} f_k^{-1}(G)\Big) \le \liminf_{n} \mu_n\big(f_n^{-1}(G)\big).$$
Again using the Portmanteau theorem, the claim follows.

Consider the space $C^d := C([0,\infty), \mathbb R^d)$ equipped with the metric
$$\rho(f,g) := \sum_{k=0}^{\infty} 2^{-k} \Big( \max_{x \in [k,k+1]} |f(x) - g(x)| \wedge 1 \Big).$$
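A minimal numerical sketch of this metric (our own code, not from the notes) truncates the infinite series and samples each interval on a grid:

```python
def rho(f, g, n_terms=20, grid=201):
    """Approximate rho(f, g) = sum_k 2^{-k} ((sup_{[k,k+1]} |f-g|) ∧ 1):
    the series is truncated after n_terms and each interval [k, k+1] is
    sampled on `grid` equally spaced points. f and g are callables on [0, inf)."""
    total = 0.0
    for k in range(n_terms):
        sup = max(abs(f(k + i / (grid - 1)) - g(k + i / (grid - 1)))
                  for i in range(grid))
        total += 2.0 ** (-k) * min(sup, 1.0)
    return total

print(rho(lambda x: x, lambda x: x))          # 0.0: identical paths
print(rho(lambda x: 0.0, lambda x: 5.0) < 2)  # True: rho is bounded by 2
```

The truncation of each summand at $1$ is what makes $\rho$ metrize uniform convergence on compacts rather than uniform convergence on all of $[0,\infty)$.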

It is easy to see that $(C^d, \rho)$ is complete and separable (and hence Polish) and that a sequence of elements of $C^d$ converges with respect to $\rho$ iff the sequence converges uniformly on every compact subset of $[0,\infty)$. For $t \ge 0$, define the projection map $\pi_t : C^d \to C([0,t], \mathbb R^d)$ by $(\pi_t f)(s) := f(s)$, $s \in [0,t]$. Clearly, $\pi_t$ is continuous if $C([0,t], \mathbb R^d)$ is equipped with the supremum norm. We define the modulus of continuity of $f \in C^d$ on $[0,t]$ by
$$w_f^t(h) := \sup\big\{ |f(r) - f(s)| \,;\ r, s \in [0,t],\ |r-s| \le h \big\}, \qquad h > 0.$$

Proposition. The set $\Gamma \subseteq M_1(C^d)$ is relatively compact (and hence tight) iff the following two conditions hold:

(i) For each $\varepsilon > 0$ there exists a compact set $K_\varepsilon \subset \mathbb R^d$ such that $\mu\{f \in C^d : f(0) \in K_\varepsilon\} \ge 1 - \varepsilon$ for every $\mu \in \Gamma$.

(ii) For each $T > 0$, $\varepsilon > 0$ and $\delta > 0$ there exists some $h > 0$ such that
$$\mu\{f \in C^d : w_f^T(h) \ge \delta\} \le \varepsilon \quad \text{for all } \mu \in \Gamma.$$

Proof. We will show this in class.

We will use the following sufficient tightness criterion on $C^d$.

Proposition 3.2. Let $Y^i$, $i \in I$, be a family of $C^d$-valued random variables on (possibly different) spaces $(\Omega^i, \mathcal F^i, \mathbb P^i)$. A sufficient condition for tightness of $(\mathcal L(Y^i))_{i \in I}$ is that the following two conditions hold:

(i) For each $\varepsilon > 0$ there exists a compact set $K_\varepsilon \subset \mathbb R^d$ such that $\mathbb P^i\{Y_0^i \in K_\varepsilon\} \ge 1 - \varepsilon$ for every $i \in I$.

(ii) For each $T > 0$ there exist $a_T, b_T, c_T > 0$ such that
$$\mathbb E\,|Y_t^i - Y_s^i|^{a_T} \le c_T\, |t-s|^{1+b_T}; \qquad s, t \in [0,T],\ i \in I.$$

Proof. Fix $T > 0$ and define $Z_t^i := Y_{tT}^i$, $t \ge 0$. Then, for $s, t \in [0,1]$,
$$\mathbb E\,|Z_t^i - Z_s^i|^{a_T} = \mathbb E\,|Y_{tT}^i - Y_{sT}^i|^{a_T} \le c_T\, T^{1+b_T}\, |t-s|^{1+b_T}.$$
Choose $\kappa \in (0, b_T/a_T)$. Then Kolmogorov's continuity theorem (as e.g. in Theorem 5.2 in the WT3 lecture notes) implies
$$\mathbb P^i\big( w_{Y^i}^T(h) \ge \delta \big) = \mathbb P^i\big( w_{Z^i}^1(h/T) \ge \delta \big) \le \frac{1}{\delta^{a_T}}\, \mathbb E\big( w_{Z^i}^1(h/T) \big)^{a_T} \le \frac{1}{\delta^{a_T}}\, h^{\kappa a_T}\, \xi_T,$$
for some $\xi_T > 0$ which depends neither on $i \in I$ nor on $\delta$ and $h$, so the claim follows from the previous proposition.

The following is extremely preliminary. Please ignore it!

Theorem. Let $a : \mathbb R^d \to S_d$ and $b : \mathbb R^d \to \mathbb R^d$ be bounded and continuous and let $x \in \mathbb R^d$. Then LMP$(a, b, \delta_x)$ has a solution.
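Before turning to the proof, here is a quick Monte Carlo sanity check (not part of the notes) of condition (ii) in Proposition 3.2 for one-dimensional Brownian motion, for which $\mathbb E|B_t - B_s|^4 = 3|t-s|^2$, so the criterion holds with $a_T = 4$, $b_T = 1$, $c_T = 3$:

```python
import random
import math

def bm_fourth_moment(dt, n=200000, seed=1):
    """Monte Carlo estimate of E|B_{s+dt} - B_s|^4 for 1-d Brownian motion.
    The increment is N(0, dt), whose fourth moment is exactly 3*dt**2."""
    rng = random.Random(seed)
    sd = math.sqrt(dt)
    return sum(rng.gauss(0.0, sd) ** 4 for _ in range(n)) / n

for dt in (0.25, 0.5, 1.0):
    print(dt, round(bm_fourth_moment(dt), 3), 3 * dt ** 2)
```

The printed estimates should track the exact values $3\,dt^2$ up to Monte Carlo error of a few percent.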

Proof. The idea of the proof is to approximate the functions $a$ and $b$ by smoother functions $a_n$ and $b_n$ for which we know that LMP$(a_n, b_n, \delta_x)$ has a solution $P_n \in M_1(C^d)$, and to show that the sequence $(P_n)_{n \in \mathbb N}$ is tight. Since $C^d$ is a Polish space, we know from Prohorov's theorem (WT2) that there exists a subsequence of $(P_n)_{n \in \mathbb N}$ which converges weakly to some $P \in M_1(C^d)$. Then it remains to show that $P$ solves LMP$(a, b, \delta_x)$.

We choose $b_n : \mathbb R^d \to \mathbb R^d$ such that $b_n \in C^\infty(\mathbb R^d, \mathbb R^d)$ and $b_n \to b$ uniformly on $\mathbb R^d$ (it is clear that such a sequence exists). In addition we can and will assume that $\sup_{x \in \mathbb R^d} |b_n(x)| \le \sup_{x \in \mathbb R^d} |b(x)| =: B$. To define $a_n$, we first choose a continuous map $\sigma : \mathbb R^d \to \mathbb R^{d \times d}$ such that $a(x) = \sigma(x)\sigma^T(x)$ (note that $\sigma$ is automatically bounded). Denote the $k$-th column of $\sigma$ by $\sigma^k$, $k = 1, \ldots, d$, and choose $\sigma_n^k \in C^\infty(\mathbb R^d, \mathbb R^d)$ such that $\sigma_n^k \to \sigma^k$ uniformly on $\mathbb R^d$. Define $a_n^{ij} := \sigma$....

It remains to verify that $X$ solves LMP$(a, b, \delta_x)$, which is equivalent to showing that for every $0 \le s < t < \infty$, any $f \in C_c^\infty(\mathbb R^d)$ and any bounded continuous $g : C([0,s], \mathbb R^d) \to \mathbb R$ we have
$$\mathbb E\Big[ \Big( f(X_t) - f(X_s) - \int_s^t A_r f(X_r)\, dr \Big)\, g\big( X_{\tilde s},\ \tilde s \in [0,s] \big) \Big] = 0.$$
We write this equation as $\mathbb E\,\varphi(X) = 0$, where $\varphi : C([0,t], \mathbb R^d) \to \mathbb R$. We know that the corresponding equation $\mathbb E\,\varphi_n(X^n) = 0$ holds, where $\varphi_n$ is defined like $\varphi$ but with $A_r$ replaced by $A_r^n$....

Remark. The previous theorem can be generalized considerably (see for example [K2]): the functions $a$ and $b$ can be allowed to be time-dependent, random, and to depend also on the past values of the solution process. Further, boundedness can be weakened to a linear growth condition, and $\delta_x$ can be replaced by an arbitrary $\mu \in M_1(\mathbb R^d)$.
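The compactness argument above is about laws, not paths, but an Euler-Maruyama discretization gives a concrete way to produce approximating processes with bounded continuous coefficients. The following one-dimensional sketch is not part of the proof; the coefficients are chosen by us for illustration:

```python
import random
import math

def euler_path(b, sigma, x0, t_max=1.0, n_steps=500, seed=2):
    """Euler-Maruyama discretization of dX = b(X) dt + sigma(X) dB in one
    dimension. With bounded continuous b and sigma, moment bounds on the
    increments (as in Proposition 3.2) make the laws of such paths tight."""
    rng = random.Random(seed)
    dt = t_max / n_steps
    sd = math.sqrt(dt)
    x, path = x0, [x0]
    for _ in range(n_steps):
        x += b(x) * dt + sigma(x) * rng.gauss(0.0, sd)
        path.append(x)
    return path

# bounded continuous coefficients: b(x) = sin(x), a(x) = sigma(x)^2 = 1 + cos(x)^2
path = euler_path(math.sin, lambda x: math.sqrt(1.0 + math.cos(x) ** 2), x0=0.0)
print(len(path), path[0])  # 501 0.0
```

Here $a(x) = 1 + \cos^2(x)$ is bounded, continuous and uniformly elliptic, so it falls within the scope of the theorem.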

Bibliography

[Bo18] A. Bovier, Introduction to Stochastic Analysis, Course Notes, Bonn, 2018.

[DZ96] G. Da Prato and J. Zabczyk, Ergodicity for Infinite Dimensional Systems, Cambridge Univ. Press, Cambridge, 1996.

[Eb16] A. Eberle, Introduction to Stochastic Analysis, Course Notes, Bonn, 2016.

[EK86] S. N. Ethier and T. G. Kurtz, Markov Processes: Characterization and Convergence, Wiley, New York, 1986.

[IW89] N. Ikeda and S. Watanabe, Stochastic Differential Equations and Diffusion Processes, North-Holland, Amsterdam, 1989.

[K2] O. Kallenberg, Foundations of Modern Probability, second edition, Springer, New York, 2002.

[KS91] I. Karatzas and S. Shreve, Brownian Motion and Stochastic Calculus, second edition, Springer, New York, 1991.

[Pr92] P. Protter, Stochastic Integration and Differential Equations, second edition, Springer, New York, 1992.

[Re13] M. von Renesse, Stochastische Differentialgleichungen, Course Notes, Leipzig, 2013.

[RY94] D. Revuz and M. Yor, Continuous Martingales and Brownian Motion, second edition, Springer, 1994.

[RW194] L. C. G. Rogers and D. Williams, Diffusions, Markov Processes and Martingales, Volume 1: Foundations, second edition, Wiley, 1994.

[RW294] L. C. G. Rogers and D. Williams, Diffusions, Markov Processes and Martingales, Volume 2: Itô Calculus, second edition, Wiley, 1994.

[SV79] D. Stroock and S. Varadhan, Multidimensional Diffusion Processes, Springer, 1979.

[St13] T. Sturm, Stochastische Analysis, Course Notes, Bonn, 2013. sturm/skript/stochana.pdf

[WW9] H. von Weizsäcker and G. Winkler, Stochastic Integrals, Vieweg, 1990.


A Short Introduction to Diffusion Processes and Ito Calculus A Short Introduction to Diffusion Processes and Ito Calculus Cédric Archambeau University College, London Center for Computational Statistics and Machine Learning c.archambeau@cs.ucl.ac.uk January 24,

More information

Mathematical Methods for Neurosciences. ENS - Master MVA Paris 6 - Master Maths-Bio ( )

Mathematical Methods for Neurosciences. ENS - Master MVA Paris 6 - Master Maths-Bio ( ) Mathematical Methods for Neurosciences. ENS - Master MVA Paris 6 - Master Maths-Bio (2014-2015) Etienne Tanré - Olivier Faugeras INRIA - Team Tosca November 26th, 2014 E. Tanré (INRIA - Team Tosca) Mathematical

More information

Random Process Lecture 1. Fundamentals of Probability

Random Process Lecture 1. Fundamentals of Probability Random Process Lecture 1. Fundamentals of Probability Husheng Li Min Kao Department of Electrical Engineering and Computer Science University of Tennessee, Knoxville Spring, 2016 1/43 Outline 2/43 1 Syllabus

More information

Functional Analysis I

Functional Analysis I Functional Analysis I Course Notes by Stefan Richter Transcribed and Annotated by Gregory Zitelli Polar Decomposition Definition. An operator W B(H) is called a partial isometry if W x = X for all x (ker

More information

Martingale Problems. Abhay G. Bhatt Theoretical Statistics and Mathematics Unit Indian Statistical Institute, Delhi

Martingale Problems. Abhay G. Bhatt Theoretical Statistics and Mathematics Unit Indian Statistical Institute, Delhi s Abhay G. Bhatt Theoretical Statistics and Mathematics Unit Indian Statistical Institute, Delhi Lectures on Probability and Stochastic Processes III Indian Statistical Institute, Kolkata 20 24 November

More information

Part II Probability and Measure

Part II Probability and Measure Part II Probability and Measure Theorems Based on lectures by J. Miller Notes taken by Dexter Chua Michaelmas 2016 These notes are not endorsed by the lecturers, and I have modified them (often significantly)

More information

Chapter 1. Poisson processes. 1.1 Definitions

Chapter 1. Poisson processes. 1.1 Definitions Chapter 1 Poisson processes 1.1 Definitions Let (, F, P) be a probability space. A filtration is a collection of -fields F t contained in F such that F s F t whenever s

More information

Verona Course April Lecture 1. Review of probability

Verona Course April Lecture 1. Review of probability Verona Course April 215. Lecture 1. Review of probability Viorel Barbu Al.I. Cuza University of Iaşi and the Romanian Academy A probability space is a triple (Ω, F, P) where Ω is an abstract set, F is

More information

4 Sums of Independent Random Variables

4 Sums of Independent Random Variables 4 Sums of Independent Random Variables Standing Assumptions: Assume throughout this section that (,F,P) is a fixed probability space and that X 1, X 2, X 3,... are independent real-valued random variables

More information

Lecture 9. d N(0, 1). Now we fix n and think of a SRW on [0,1]. We take the k th step at time k n. and our increments are ± 1

Lecture 9. d N(0, 1). Now we fix n and think of a SRW on [0,1]. We take the k th step at time k n. and our increments are ± 1 Random Walks and Brownian Motion Tel Aviv University Spring 011 Lecture date: May 0, 011 Lecture 9 Instructor: Ron Peled Scribe: Jonathan Hermon In today s lecture we present the Brownian motion (BM).

More information

Lecture Notes in Advanced Calculus 1 (80315) Raz Kupferman Institute of Mathematics The Hebrew University

Lecture Notes in Advanced Calculus 1 (80315) Raz Kupferman Institute of Mathematics The Hebrew University Lecture Notes in Advanced Calculus 1 (80315) Raz Kupferman Institute of Mathematics The Hebrew University February 7, 2007 2 Contents 1 Metric Spaces 1 1.1 Basic definitions...........................

More information

Fast-slow systems with chaotic noise

Fast-slow systems with chaotic noise Fast-slow systems with chaotic noise David Kelly Ian Melbourne Courant Institute New York University New York NY www.dtbkelly.com May 1, 216 Statistical properties of dynamical systems, ESI Vienna. David

More information

The Wiener Itô Chaos Expansion

The Wiener Itô Chaos Expansion 1 The Wiener Itô Chaos Expansion The celebrated Wiener Itô chaos expansion is fundamental in stochastic analysis. In particular, it plays a crucial role in the Malliavin calculus as it is presented in

More information

Stochastic Analysis I S.Kotani April 2006

Stochastic Analysis I S.Kotani April 2006 Stochastic Analysis I S.Kotani April 6 To describe time evolution of randomly developing phenomena such as motion of particles in random media, variation of stock prices and so on, we have to treat stochastic

More information

LOCAL TIMES OF RANKED CONTINUOUS SEMIMARTINGALES

LOCAL TIMES OF RANKED CONTINUOUS SEMIMARTINGALES LOCAL TIMES OF RANKED CONTINUOUS SEMIMARTINGALES ADRIAN D. BANNER INTECH One Palmer Square Princeton, NJ 8542, USA adrian@enhanced.com RAOUF GHOMRASNI Fakultät II, Institut für Mathematik Sekr. MA 7-5,

More information

Probability Theory II. Spring 2016 Peter Orbanz

Probability Theory II. Spring 2016 Peter Orbanz Probability Theory II Spring 2016 Peter Orbanz Contents Chapter 1. Martingales 1 1.1. Martingales indexed by partially ordered sets 1 1.2. Martingales from adapted processes 4 1.3. Stopping times and

More information

Convergence of Markov Processes. Amanda Turner University of Cambridge

Convergence of Markov Processes. Amanda Turner University of Cambridge Convergence of Markov Processes Amanda Turner University of Cambridge 1 Contents 1 Introduction 2 2 The Space D E [, 3 2.1 The Skorohod Topology................................ 3 3 Convergence of Probability

More information