Stochastic Analysis

Prof. Dr. Nina Gantert

Lecture at TUM in WS 2011/2012

June 13, 2012

Produced by Leopold von Bonhorst and Nina Gantert

Contents

1 Definition and Construction of Brownian Motion 3
2 Some properties of Brownian Motion 7
3 The Cameron-Martin Theorem and the Paley-Wiener stochastic integral 13
4 Brownian Motion as a continuous martingale 17
5 Stochastic integrals with respect to Brownian Motion 19
6 Itô's formula and examples 24
7 Pathwise stochastic integration with respect to continuous semimartingales 27
8 Cross-variation and Itô's product rule 32
9 Stochastic Differential Equations 35
10 Girsanov transforms 38

1 Definition and Construction of Brownian Motion

1.1 Historic origin

Brown (1827): movement of a pollen grain in a liquid
Bachelier (1900): model for stock market fluctuations
Einstein (1905): motion of a particle

1.2 Heuristic description with symmetric random walks

Let Y_1, Y_2, ... be iid with P[Y_i = 1] = 1/2 = P[Y_i = -1], S_k = Σ_{i=1}^k Y_i, k = 1, 2, ..., S_0 = 0. Fix N ∈ N. Rescale the process to the time interval [0,1]:

X_{k/N} = (1/√N) S_k,  k = 0, 1, 2, ..., N.

Then,
(i) X_0 = 0.
(ii) For 0 ≤ t_0 < t_1 < t_2 < ... < t_m ≤ 1 with t_i = k_i/N, k_i ∈ {0, 1, ..., N}, the increments X_{t_i} - X_{t_{i-1}} are independent with E[X_{t_i} - X_{t_{i-1}}] = 0 and

Var(X_{t_i} - X_{t_{i-1}}) = Var( (1/√N) Σ_{j=k_{i-1}+1}^{k_i} Y_j ) = (1/N)(k_i - k_{i-1}) = t_i - t_{i-1}.

Due to the CLT, the laws of (X_{t_i} - X_{t_{i-1}}) converge to N(0, t_i - t_{i-1}) for N → ∞ (and k_i/N → t_i ∈ [0,1]). This motivates the following definition:

1.3 Basic Definitions

Definition 1.1 Brownian Motion (BM) is a stochastic process B_t(ω), 0 ≤ t ≤ 1, on a probability space (Ω, A, P) such that
(i) B_0 = 0 P-a.s.
(ii) For 0 ≤ t_0 < t_1 < t_2 < ... < t_n ≤ 1, the increments B_{t_i} - B_{t_{i-1}}, 1 ≤ i ≤ n, are independent with law N(0, t_i - t_{i-1}).
(iii) t ↦ B_t(ω) is continuous for P-a.a. ω.

Definition 1.2 Let (B_t)_{0≤t≤1} be a BM on the probability space (Ω, A, P). Then, the image measure of P under the map Ω → C[0,1],

4 ω (B t (ω)) t 1 is the Wiener measure. The Wiener measure is a prob. measure on (C[,1],F), with F = σ({g t : t 1}), where g t : C[,1] R,g t (x) = x(t) (x C[,1]). Interpretation: Def. 1.1: t B t (ω) is a stochastic evolution in time. Def. 1.2: (B t ) t 1 is a random variable with values in C[,1]. Theorem 1.3 Brownian motion exists. There are different proofs of Theorem 1.3. We give here a proof, which constructs linear interpolations on the sets D n = { k 2n : k 2 n }. We follow the proof of Theorem 1.3. in [4]. Proof: Let D = n D n and let (Ω,A,P) be a probability space such that {Z t,t D} are iid random variables on (Ω,A,P), with law N(,1). Let B := and B 1 := Z 1. For each n N, we define the random variables B s,s D n such that: (1) For r < s < t,r,s,t D n,b t B s is independent of B s B r, and B t B s has the law N(,t s) (2) The vectors (B s,s D n ) and (Z t,t D \D n ) are independent. For D = {,1}, we are done. Proceeding inductively, assume that we followed the construction for some n 1. We then define B s for s D n \D n 1 by B s = 1 2 (B s 1 2 n +B s+ 1 2 n)+ 1 2 n+1 2 The first term is the linear interpolation of B at the neighboring points of s in D n 1. Z s. Therefore, B s is independent of (Z t,t D \D n ) and (2) is satisfied. Moreover, since 1 2 (B s+ 1 2 n B s 1 2 n) 4

5 1 depends only on (Z t,t D n 1 ), it is independent of Z 2 n+1 s. By induction assumption, both terms have law N(, ). Hence, their sums B n+1 s B s 1 and their difference 2n B s+ 1 B 2 n s are iid with law N(, 1 ). 2 n Exercise: X and Y iid random variables with law N(,σ 2 ) X +Y,X Y are iid random variables with law N(,2σ 2 ). To see that all increments B s B s 1 2 n,s D n \{} are independent, it suffices to show that they are pairwise independent, since the vector of increments is Gaussian. We saw that B s B s 1 2 n, B s+ 1 2 n B s (with s D n \D n 1 ) are independent. The other possibility is that the increments are over intervals separated by some s D n 1. Choose s D j with this property and j minimal, so that the two intervals are contained in [s 1 2 j,s] and [s,s+ 1 2 j ]. By induction hypothesis, the increments over these two intervals of length 1 2 j are independent, and the increments over the intervals of lengths 1 are constructed from the 2 n independent increments B s B s 1 and B s+ 2 j 1 B s, respectively, using disjoint sets of 2 random variables (Z j t,t D n ). Hence they are independent (1) is satisfied. This completes the induction. Now, we interpolate between the dyadic points. More precisely, let Z 1 t = 1 f (t) = t = linear in between. and for each n 1, 2 n+1 2 Z t t D n \D n 1 f n (t) = t D n 1 linear between consecutive points in D n f,f 1,f 2,...are continuous functions and, n and s D n n B s := f j (s) = f j (s). (1.1) j= j= 5

6 We prove (1.1) by induction. (1.1) holds for n =. Suppose it holds for n 1. Let s D n \D n 1. Since for j n 1, the function f j is linear on [s 1 2 n,s+ 1 2 n ] we get n 1 f j (s) = j= n 1 j=1 f j (s 1 )+f 2 n j (s+ 1 ) 2 n = (B s 1 +B 2 n s+ 1 n). 2 Since f n (s) = 1 Z 2 n+1 s, this gives (1.1). 2 Since Z d s = N(,1), we have for c > 1, and n large enough, P[ Z 1 c n] e c2 n 2 (since x e u 2 2 du 1 x e x2 2, Proof: exercise). the series P[ s D n with Z s c n] n= n= s D n P[ Z s c n] n= (2 n +1)e c2 n 2 converges if c > 2log2. Fix c > 2log2. Apply the Borel-Cantelli lemma : N (ω) < s.t. for n N (ω), and s D n we have Z s < c n For n N (ω), f n < c n 1 2 n 2 For P-a.a. ω, the sequence B m t = m n= f n(t) converges uniformly in t [,1] for m B t := lim m Bm t is continuous in t. We check that the increments of B have the right finite-dimensional distributions: Assume t 1 < t 2 <... < t n 1. Then, we find t 1,k t 2,k... t n,k 1 with t i,k D and lim k t i,k = t i and since t B t is continuous for P-a.a. ω, B ti+1 B ti = lim k (B t i+1,k B ti,k ) P-a.s. Since lim E[B t i+1,k B ti,k ] = and k lim Cov(B t i+1,k B ti,k,b tj+1,k B tj,k ) = lim I {i=j} (t i+1,k t i,k ) = I {i=j} (t i+1 t i ), k k the increments B ti+1 B ti,i = 1,2,...,n, are independent Gaussian random variables with mean and the variance t i+1 t i, using Lemma 1.4 (s. below) Lemma 1.4 (X n ) n N sequence of Gaussian random vectors and lim X n = X P-a.s.. If b := lim E[X n ] and C := lim Cov(X n ) exist, then X is Gaussian with mean b and Covariance Matrix C. Proof: See [4], Prop
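The dyadic construction above can be turned into a short simulation: start from B_0 = 0 and B_1 = Z_1, and at level n assign to each new dyadic point the average of its two neighbours plus an independent N(0, 2^{-(n+1)}) perturbation. The following Python sketch (purely illustrative and not part of the notes; it assumes numpy is available, and the helper name levy_construction is ours) implements this refinement and checks two consequences of the construction: Var(B_{1/2}) = 1/2 and the independence of B_{1/2} and B_1 - B_{1/2}. Taking more refinement levels approximates the uniform limit used in the proof above.

```python
import numpy as np

rng = np.random.default_rng(1)

def levy_construction(levels, rng):
    """Brownian motion on the dyadic points D_levels = {k 2^(-levels) : 0 <= k <= 2^levels}."""
    B = np.array([0.0, rng.standard_normal()])          # values on D_0 = {0, 1}: B_0 = 0, B_1 = Z_1
    for n in range(1, levels + 1):
        new = np.empty(2 ** n + 1)
        new[::2] = B                                     # keep the values already fixed on D_{n-1}
        # new dyadic point = average of its two neighbours + independent N(0, 2^{-(n+1)}) noise
        noise = 2.0 ** (-(n + 1) / 2) * rng.standard_normal(2 ** (n - 1))
        new[1::2] = 0.5 * (B[:-1] + B[1:]) + noise
        B = new
    return B

paths = np.array([levy_construction(10, rng) for _ in range(2000)])
mid, end = paths[:, paths.shape[1] // 2], paths[:, -1]

print("Var(B_{1/2})               :", mid.var())                      # ~ 0.5
print("Cov(B_{1/2}, B_1 - B_{1/2}):", np.cov(mid, end - mid)[0, 1])   # ~ 0 (independent increments)
```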

Definition 1.5 A stochastic process (B_t)_{t≥0} on some prob. space (Ω, A, P) is a Brownian Motion if:
(i) B_0 = 0 P-a.s.
(ii) For 0 ≤ t_0 < t_1 < ... < t_n, the increments B_{t_i} - B_{t_{i-1}} are independent with law N(0, t_i - t_{i-1}).
(iii) t ↦ B_t(ω) is continuous for P-a.a. ω.

We obtain (B_t)_{t≥0} from a sequence of iid BMs (B^i_t)_{0≤t≤1}, i = 0, 1, 2, ..., as follows:

B_t = B^{⌊t⌋}_{t-⌊t⌋} + Σ_{i=0}^{⌊t⌋-1} B^i_1,

i.e. by glueing the paths (B^i_t)_{0≤t≤1} together. Then (B_t)_{t≥0} is a BM. Proof: exercise.

Definition 1.6 A stochastic process (V_t)_{t≥0} is a Gaussian process if for all t_1 < t_2 < ... < t_n, the vector (V_{t_1}, ..., V_{t_n}) is a Gaussian random vector.

(B_t)_{t≥0} is a Gaussian process. Proof: See exercises.

2 Some properties of Brownian Motion

The paths of BM are random fractals in the following sense:

Lemma 2.1 (Scaling invariance) Let (B_t)_{t≥0} be a BM and let a > 0. Then the process (X_t)_{t≥0} given by X_t = (1/a) B_{a²t} (t ≥ 0) is also a BM.

8 Proof: Independence and stationarity of the increments and continuity of the paths persist under the scaling. It remains to show that X t X s = 1 a (B a 2 t B a 2 s) has the law N(,t s). But X t X s is a Gaussian RV with the expectation and variance 1 E[(B a 2 a 2 t B a 2 s) 2 ] = 1 a 2 (t s) = t s. a 2 Example 2.2 (1) Let a < < b and consider T a,b with T a,b := inf{t : B t {a,b}} the first exit time of BM from the intervall (a,b). Then, with X t = 1 a B a 2 t, E[T a,b ] = a 2 E[inf{t : X t { 1, In particular E[T b,b ] = b 2 E[T 1,1 ] = const b 2. b a }}] = a2 E[T 1, b ] a (2) Ruin probabilities: P[(B t ) t exits (a,b) at a] = P[(X t ) t exits ( 1, depends only on the ratio b a. b ) at 1] a Theorem 2.3 (Time inversion) Let (B t ) t be a BM. Then the process (X t ) t given by t = X t = tb1 t > t is again a BM. Proof: Recall that (B t1,...,b tn ), t 1 < t 2 <... t n are Gaussian random vectors and are therefore characterized by their expectations and their Covariances Cov(B ti,b tj ) = t i t j (2.1) Proof of (2.1): Let t i < t j. Then E[B ti B tj ] = E[B ti (B tj B ti )]+E[B 2 t i ] = +t i = t i 8
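Example 2.2 can be checked by simulation. In the sketch below (illustrative only; numpy is assumed, and the helper exit_data as well as the parameterisation of the interval as (-a, b) with a, b > 0 are ours) a Brownian path is approximated on a fine grid; the estimates are consistent with E[T_{-b,b}] = b² E[T_{-1,1}] and with the fact that the exit probabilities depend only on the ratio b/a.

```python
import numpy as np

rng = np.random.default_rng(2)

def exit_data(a, b, dt=1e-3, t_max=40.0, n_paths=1000):
    """Mean exit time of a discretised BM from (-a, b) and the probability of exiting at -a."""
    times, at_minus_a = [], []
    for _ in range(n_paths):
        B = np.cumsum(np.sqrt(dt) * rng.standard_normal(int(t_max / dt)))
        hit = np.nonzero((B <= -a) | (B >= b))[0]
        if hit.size == 0:          # path did not exit before t_max (rare for these intervals)
            continue
        times.append((hit[0] + 1) * dt)
        at_minus_a.append(B[hit[0]] <= -a)
    return np.mean(times), np.mean(at_minus_a)

t1, _ = exit_data(1.0, 1.0)
t2, _ = exit_data(2.0, 2.0)
print("E[T_{-1,1}] approx        :", t1)            # ~ 1
print("E[T_{-2,2}] / E[T_{-1,1}] :", t2 / t1)       # ~ 4 = b^2, the scaling of Example 2.2 (1)

_, p1 = exit_data(1.0, 2.0)
_, p2 = exit_data(2.0, 4.0)
print("P[exit (-1,2) at -1]      :", p1)            # the ratio b/a is the same in both cases,
print("P[exit (-2,4) at -2]      :", p2)            # so the two probabilities agree (Example 2.2 (2))
```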

9 (X t ) t is also a Gaussian process (check!) and the Gaussian random vectors (X t1,...,x tn ) have expectations E[X ti ] =,1 i n. For t >,h, the Covariance of X t and X t+h is given by Cov(X t,x t+h ) = Cov(tB1,(t+h)B 1 ) = t(t+h)cov(b1,b 1 ) = t t t+h t t+h Hence, the laws of all the finite vectors (X t1,...,x tn ), t 1 < t 2 <... t n, are the same as for BM. The paths t X t are clearly continuous for all t > (for P-a.a. ω). For t =, we use the following two facts: (1) Since Q is countable, (X t,t,t Q) has the same law as (B t,t,t Q) lim tց,t Q X t = P-a.s. (2) Q (, ) is dense in (, ) and (X t ) t is continuous on (, ) (for P-a.a. ω) so that = lim tց,t Q X t = lim t X t = P-a.s. Example 2.4 (Ornstein-Uhlenbeck-process) Let (B t ) t be a BM, and set X t = e t B e 2t,(t R). Then X d t = N(,1) t Proof: X t is a Gaussian RV with E[X t ] =,Var(X t ) = e 2t e 2t = 1 Further (X t ) t R has the same law as (X t ) t R. Proof: Set X t = X t,(t R) and tb1 t > t B t = t =. Then X t = e t B e 2t = e t Be 2te 2t = e t Be 2t. Since ( B t ) t is a BM, ( X t ) t R d = (X t ) t R. (X t ) t R is a Gaussian process with E[X t ] =, t and Cov(X s,x t ) = E[X s,x t ] = e (s+t) E[B e 2sB e 2t] (2.1) = e (s+t) e 2(s t) = e t s. Later we will see that ( 1 2 X t ) t is a (weak) solution of the stochastic differential equation dx t = db t X t dt. Corollary 2.5 (Law of large numbers) (B t ) t BM. Then, 1 lim t t B t = P-a.s. 9
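Example 2.4 admits a quick Monte Carlo check: sampling the pair (B_{e^{2s}}, B_{e^{2t}}) directly from the Gaussian description of BM and setting X_t = e^{-t} B_{e^{2t}} should give Var(X_t) = 1 and Cov(X_s, X_t) = e^{-|t-s|}. A minimal sketch, assuming numpy (the chosen times s, t are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 200_000
s, t = 0.3, 1.0                          # two fixed times, s < t

u, v = np.exp(2 * s), np.exp(2 * t)      # time-changed instants e^{2s} < e^{2t}
B_u = B_u = np.sqrt(u) * rng.standard_normal(n)       # B_{e^{2s}} ~ N(0, e^{2s})
B_v = B_u + np.sqrt(v - u) * rng.standard_normal(n)   # add an independent increment

X_s = np.exp(-s) * B_u                   # X_s = e^{-s} B_{e^{2s}}
X_t = np.exp(-t) * B_v

print("Var(X_s)      :", X_s.var())                   # ~ 1
print("Var(X_t)      :", X_t.var())                   # ~ 1
print("Cov(X_s, X_t) :", np.cov(X_s, X_t)[0, 1])      # ~ e^{-(t-s)}
print("e^{-(t-s)}    :", np.exp(-(t - s)))
```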

10 Remark: 1 lim n B n = P-a.s. since B n = n i=1 (B i B i 1 ) = n i=1 Y i, (Y i ) i 1 iid with law N(,1). Proof: Let tb1 t > t X t = t =. 1 Then lim B t t t = lim X1 t t = X = P-a.s. Remark 2.6 The law of the iterated logarithm says that lim sup t B t 2tloglogt = 1 P-a.s. (2.2) In particular, lim sup t lim inf t B t 2tloglogt = 1 P-a.s. (2.3) B t t = +,liminf t B t t = P-a.s. Theorem 2.7 (Paley, Wiener, Zygmund) Let (B t ) t be a BM. Then P[{ω : t B t (ω)is nowhere differentiable}] = 1 Proof: Assume that there is t [,1] such that t B t (ω) is diff. in t. Then, there is a constant M < such that sup s [,1] B t +s B t s M (2.4) If t [ ] k 1, k 2 n 2 for some n > 2,k 2 n, then we have for 1 j n, n B k+j B k+j 1 2 n 2 n M(2j +1) 1 (2.5) 2 n Proof of (2.5): B k+j 1 2 n 2 n B k+j B 2 n t + B t B k+j 1 2 n B k+j M j +1 2 n +M j 2 n M(2j +1) 1 2 n. 1

11 Let A n,k C[,1] be the collection of functions satisfying (2.5) for j = 1,2,3. Claim: Proof of (2.6): and B k+j 2 n B k+j 1 Hence, P[A n,k ] P[ B 1 7M 2 n ]3 (2.6) B k+j 2 n B k+j 1 2 n d = N(, 1 2 n) d = 1 2 n B 1,j = 1,2,3 are independent. Further, 2 n [ P B 1 7M 3 2 n] P [ 2 n k=1 A n,k ] P n=2 Therefore, using the Borel-Cantelli lemma, ( 7M 2 n) 3. ( ) 3 7M 2 n 2 = (7M)3 n 2 n [ 2 n k=1 P[(2.4) holds for some t [,1]] P = P =. A n,k ] <. [ 2 n k=1 [ A n,k 2 n m=2 n=mk=1 happens for infinitely n A n,k ] ] Corollary 2.8 For P-a.a. ω, the function t B t (ω) is not of bounded variation on any interval. Recall that for g which is right-continuous on [a,b], we set { m } V g [a,b] = sup g(t k ) g(t k 1 ) : a t < t 1 <... < t m b,m N. k=1 V g [a,b] is the variation of g on [a,b]. We say that g is of bounded variation (BV) on [a,b] if V g [a,b] <. Example: If t g(t) is increasing on [a,b], g is BV on [a,b], and V g [a,b] = g(b) g(a). Lemma 2.9 Assume g is BV on [a,b] and right-continuous g 1,g 2 : [a,b] R,g 1,g 2 increasing and right-continuous such that g = g 1 g 2 (2.7) 11

12 Proof: See literature. Proof of Corollary 2.8:AtheorembyLebesguesaysthatanincreasingfunctiong : [a,b] R is differentiable for λ-a.a. s [a,b]. Remark 2.1 Let g be right-continuous and increasing on [a,b]. Then, g defines a measure ν g on [a,b] by ν g ((a 1,b 1 ]) = g(b 1 ) g(a 1 ), a a 1 < b 1 b. If g is BV on [a,b], and f C[a,b] we can define b a f(s)dg(s) := b a f(s)ν g1 (ds) b a f(s)ν g2 (ds) (withg 1,g 2 from(2.7)). ForaBM(B t ) t, thisprocedurecannotbeapplied-nevertheless, we will be able to define b a f(s)db s. Remark 2.11 A continuous function g which is BV on [,1] has quadratic variation, i.e. for E n [,1],E n = {,t 1,...,t n,1},( t 1 <... < t n 1) we have (g(t i+1 ) g(t i )) 2 max g(t i+1 ) g(t i ) g(t i+1 ) g(t i ) t i E n t i E n t i E n if s(e n ), where s(e n ) := sup ti E n t i+1 t i. is the mesh of E n. Theorem 2.12 (B t ) t BM. Let (E n ) be a sequence of partitions with s(e n ),E n E n+1 E n+2... Then, t >, Proof: (1) Convergence in L 2 : V n t := t i E n, (B ti+1 B ti ) 2 t P-a.s. and in L 2. E[V n t ] = t i E n, Using the independence of the increments, Var(V n t ) = t i E n, Var(Vt n), see Remark 2.11 Proof of (*): Y = d N(,σ 2 ) (t i+1 t i ) t. Var((B ti+1 B ti ) 2 ) = 2(t i+1 t i ) 2 ( ) t i E n, Var(Y 2 ) = E[Y 4 ] E[Y 2 ] 2 = 3σ 4 σ 4 = 2σ 4. 12

13 (2) We first show P-a.s. convergence along dyadic partitions E n = {, 1 2 n, 2 2 n, 3 2 n,...,1}. For these partitions, Var(V n t ) = 2 2n ( 1 2 n ) 2 = 2 2 n n=1 Var(V n t ) <. Using Lemma 2.13 below we conclude that V n t t P-a.s.. (3) For general sequence of partitions, use (2) and an approximation argument. Lemma 2.13 Y 1,Y 2,... RVs with n=1 Var(Y n) <. Then, (Y n E[Y n ]) P-a.s.. Proof: See exercises. 3 The Cameron-Martin Theorem and the Paley-Wiener stochastic integral We know several facts about paths of BM, for instance: P[ t [,1] s.t. t B t differentiable in t ] = Do these properties remain true for (B t +ct) t 1 or, more generally, for (B t +h(t)) t 1 where h C[,1]? We denote by { H = h C[,1] : there is f L 2 [,1] s.t. h(t) = the Cameron-Martin space (or Dirichlet space). } f(s)ds, t 1 Given h H, f is uniquely determined as an element of L 2 [,1] and we write Example: h = f h = f is true λ-a.s. (λ = Lebesgue measure on [,1]). 13

14 Recall that for two measures µ,ν on (Ω,A) we write µ ν and say µ and ν are singular if there is A A with µ(a) =,ν(a c ) =. We write ν µ and say ν is absolutely continuous w.r.t. µ if, A A, µ(a) = ν(a) =. We write ν µ and say ν and µ are equivalent if ν µ and µ ν. Let µ be Wiener measure on (C[,1],F) and µ h be the law of (B t +h(t)) t 1. Theorem 3.1 Assume h C[,1] and h() =. (i) If h / H then µ h µ. (ii) If h H then µ h µ. For the proof, we will need the following quantity: ( ( ) ( )) 2n 2 j j 1 Q n (h) := 2 n h h 2 n 2 n (n = 1,2,...) (3.1) j=1 Lemma 3.2 Q n (h),n = 1,2,... is an increasing sequence and Moreover, if h H, then h H supq n (h) <. n Q n (h) 1 h (s) 2 ds = 1 Proof: The general inequality (a+b) 2 2a 2 +2b 2 gives [ ( ) ( )] 2 ( ) ( j j 1 2j 1 j 1 h h 2[ h h 2 n 2 n 2 n+1 2 n f(s) 2 ds. )] 2 +2[ h ( ) j h 2 n ( )] 2 2j 1 Summing this inequality over j {1,2,...,n} gives Q n (h) Q n+1 (h) Q n (h) is increasing in n. For h H with h = f, we have, using Jensen s inequality 2 n+1 ( 2n j Q n (h) = 2 n 2 n j=1 f(s)ds j 1 2 n ) 2 2 n j=1 j 2 n j 1 2 n f(s) 2 ds = 1 f(s) 2 ds Hence, h H sup n Q n (h) <. Proof of : Let t U([,1]), then there exists a sequence of intervals I n (t) = [a n,b n ] = [ kn 1, kn 2 n ] 2 s.t. t n In (t), n. Given I 1 (t),...i n (t), the interval I n+1 (t) is, with probability 1 the left or the right half of I 2 n(t). Let M n = M n (t) = 2 n (h(b n )) h(a n )), then (M n ) n 1 is a martingale w.r.t. σ(i n (t) (on ([,1],B [,1],λ)). Furthermore: 2n ( ( ) ( )) 2 k k 1 E[Mn 2 1 ] = 22n h h 2 n 2 n 2 = Q n(h) n k=1 Ifsup n Q n (h) <,(M n ) n 1 isamartingalewhich isboundedin L 2, i.e. sup n E[M 2 n] <. We prove later: 14

15 Lemma 3.3 Assume (M n ) n 1 is a martingale on (Ω,A,P) and (M n ) n 1 is bounded in L 2. Then there exists a RV X s.t. M n X P-a.s. and in L 2 By Lemma 3.3, M n X a.s. and in L 2. Let For j,m fixed, we have n: h ( ) ( j 2 = g j m X(s) λ-a.s. h H and 2 m ) g(s) := s X(t)dt ( ) j j 2 h = m M 2 m n (t)dt j 2 m X(t)dt j,m and by continuity, g(s) = h(s) s [,1] and h (s) = Q n (h) = E[M 2 n ] E[X2 ] = 1 (h (t)) 2 dt. Proof of Lemma 3.3: M n is bounded in L 1 and by the martingale convergence theorem (Theorem in Probability Theory lecture notes) M n X P-a.s. and X L 1. We have for m > n: E[(M m M n ) 2 ] = E[M 2 m ] E[M2 n ] (3.2) since E[M m M n ] = E[E[M m M n A n ]] = E[M n E[M m A n ]] = E[M 2 n ] Fatou s Lemma implies from (3.2) [ ] E[(X M n ) 2 ] E lim inf (M m M n ) 2 liminf m m E[M2 m ] E[M2 n ] The last expression tends to a.s. for n, since E[Mn 2 ] is increasing in n: E[Mn 2 ] (3.2) = n E[(M k M k 1 ) 2 ]+E[M1 2 ]. k=2 Lemma 3.4 (The Paley-Wiener stochastic integral) Let (B t ) t be a BM and h H. Then ξ n := 2 n ( ) ( )) j j 1 2 (h n h (B 2 n 2 n j ) 2 n Bj 1 2 n j=1 converges a.s. and in L 2. The limit is denoted by 1 h db. 15

16 Proof: Recall from the construction of BM that B2j 1 = 1 ( ) 2 n B2j 2 +B 2j +σ 2 2n n 2 n Z2j 1 2 n where σ n = 2 (n+1)/2 and Z t are iid standard normal. Therefore: 2n 1 ξ n ξ n 1 = 2 n σ n j=1 ( 2h ( ) ( ) ( )) 2j 1 2j 2 2j h h 2 n which implies that (ξ n ) n 1 is a martingale. 2 n 2 n Z2j 1 2 n (E[ξ n A n 1 ] = E[ξ n ξ n 1 A n 1 ]+E[ξ n 1 A n 1 ] = E[ξ n 1 A n 1 ]) 2n ( ( ) ( )) 2 [ E[ξn] 2 2n j j 1 ( ) ] 2 = 2 h h E B 2 n 2 n j = Q 2 n Bj 1 n (h). 2 n j=1 Hence, for h H, the convergence follows from Lemma 3.3. Proof of the Cameron-Martin Theorem: Let µ n and µ h,n denote the finite-dimensional distributions on the set D n. Then the Radon-Nikodym derivative dµ h,n dµ n is the ratio of the two Lebesgue densities. For x C[,1] and j x = (n) j x = x ( ) ( j 2 x j 1 ) n 2, n dµ h,n dµ n (x) = 2 n j=1 exp ( ( ) ( ) jx j h) 2 ( j x) 2 exp = exp( H n (x)) 2 1 n with H n (x) = 2 n 1 n j=1 (( jh) 2 2 j x j h). By Theorem 14.5 in the Probability Theory lecture notes, exp( H n (x)) is a martingale 2 1 n under µ, since it is non-negative it converges a.s. to a finite RV X. We show later: µ h (A) = Xdµ+µ h (A {X = }) for A F. This implies: A µ(x = ) = 1 µ µ h µ(x > ) = 1 µ µ h WehaveE µ [H n ] = H n (x)µ(dx) = 1 Q 2 n(h)andvar µ (H n ) = 2 2n n j=1 ( jh) 2 Var µ ( j x) = Q n (h). By Chebyshev s inequality, we get P µ (H n 1 ) ( 1 4 Q n(h) = P µ 2 Q n(h) H n 1 ) 4 Q n(h) Q n(h) ( 1 Q 4 n(h) ) 2 = 16 Q n (h). Now, if h / H, then by Lemma 3.2, H n and x = µ-a.s.. For the converse, suppose h H. By Lemma 3.4 (and the second part of Lemma 3.2), 1 H(x) 2 h h db. Therefore x > µ-a.s. and µ µ h. Finally note that µ h µ µ µ h. 16
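For the special case h(t) = ct one has h ∈ H with ∫_0^1 h'(s) dB_s = c B_1 and ∫_0^1 h'(s)² ds = c², so the Cameron-Martin density dμ_h/dμ reduces to exp(cB_1 - c²/2). The Monte Carlo sketch below (an illustration, not part of the notes; numpy assumed, test function chosen arbitrarily) checks the resulting identity E[f(B_1 + c)] = E[f(B_1) exp(cB_1 - c²/2)].

```python
import numpy as np

rng = np.random.default_rng(5)
n = 1_000_000
c = 0.8
B1 = rng.standard_normal(n)                    # B_1 under the Wiener measure

def f(x):                                      # any test function of moderate growth
    return np.maximum(x, 0.0)

shifted    = f(B1 + c).mean()                                   # E[f(B_1 + h(1))] with h(t) = ct
reweighted = (f(B1) * np.exp(c * B1 - 0.5 * c ** 2)).mean()     # E[f(B_1) dmu_h/dmu]
print("expectation under mu_h      :", shifted)
print("reweighted expectation (mu) :", reweighted)              # agrees up to Monte Carlo error
```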

17 4 Brownian Motion as a continuous martingale Definition 4.1 Consider a probability space (Ω,F,P). (i) A filtration is a sequence of σ-fields (F t ) t with F s F t F, s < t. (ii) Astochasticprocess(X t ) t on(ω,f,p)isadaptedto(f t ) t ifx t isf t -measurable, t. Suppose (X t ) t is a stochastic process on (Ω,F,P). Then we can define a filtration (F t ) t by taking F t := σ({x s, s t}), i.e. F t is the σ-field generated by {X s, s t}. Then, (X t ) t is adapted to (F t ) t. Definition 4.2 A real-valued stochastic process (X t ) t is a martingale with respect to a filtration (F t ) t if it is adapted to (F t ) t and E[ X t ] < t (4.1) and, for s t E[X t F s ] = X s P-a.s. (4.2) Theprocess (X t ) t isasubmartingale withrespect toafiltration(f t ) t ifit isadapted to (F t ) t, (4.1) holds and for s t E[X t F s ] X s P-a.s. (4.3) and it is a supermartingale with respect to a filtration(f t ) t if it is adapted to (F t ) t, (4.1) holds and, for s t E[X t F s ] X s P-a.s. (4.4) Remark 4.3 If (X t ) t is a martingale with respect to (F t ) t, X t is in general not a martingale but a submartingale. More generally, if (X t ) t is a martingale with respect to (F t ) t and f : R R is a convex function such that E[ f(x t ) ] <, t, then f(x t ) t is a submartingale with respect to (F t ) t. Proof: E[f(X t ) F s ] Jensen f(e[x t F s ]) = f(x s ) P-a.s. hence (4.2) holds. Remark 4.4 Let (B t ) t be a BM and F t = σ({x s, s t}). Then (B t ) t is a martingale with respect to (F t ) t. Proof: E[B t F s ] = E[B t B s F s ]+E[B s F s ] = +B s P-a.s. (See exercise 3.3 (ii): B t B s is independent of F s, hence E[B t B s F s ] =.) 17

18 Definition 4.5 Let (Ω,F,P) be a probability space with a filtration (F t ) t. A RV T with values in [, ] is a stopping time with respect to (F t ) t if {T t} F t, t. Example 4.6 Let (B t ) t be a BM and F t = σ({x s, s t}), t. (i) Let y and T = inf{t : B t = y}. Then, T is a stopping time with respect to (F t ) t. Proof: {T t} = {B s U(y, 1 n )} F t, where n=1 s:s Q (,t) U(y, 1 n ) := {z R : z y < 1 n } (ii) Let I = (a,b), < a < b and T = inf{t : B t I}. Then, T is not a stopping time because {T t} / F t. Proof: See [4]. Definition 4.7 Assume T is a stopping time with respect to the filtration(f t ) t. Define F T := {A F : A {T t} F t, t }. F T is called the σ-field of events observable until time T. A martingale (X t ) t is a continuous martingale if P[t X t (ω) is continuous] = 1. Theorem 4.8 (Optional stopping) Suppose (X t ) t is a continuous martingale and S T stopping times. If the process (X t T ) t is dominated by an integrable RV X, i.e. X t T X, t a.s. and E[ X ] <, then E[X T F S ] = X S P-a.s. Proof: This can be derived from the result for martingales in discrete time (see Theorem 14.9 in the Probability Theory lecture notes for the case S = ). See [4] for details. Theorem 4.9 (Wald s Lemma for BM) Let (B t ) t be a BM and T a stopping time such that either (a) E[T] < or (b) (B t T ) t is dominated by an integrable RV. Then, we have E[B T ] =. Remark 4.1 One does need a condition on T, as the following example shows: Let T = inf{t : B t = 1}. (Then T < P-a.s., see exercise 5.2). Clearly, E[B T ] = 1. We conclude from Theorem 4.9 that E[T] = and that (B t T ) t is not dominated by an integrable RV. 18
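Both Theorem 4.9 and the counterexample in Remark 4.10 can be illustrated numerically. The sketch below (numpy assumed; the helper stopped_value, the grid size and the truncation at t_max are ours) estimates B_T for the exit time of (-1, 2), where E[T] < ∞ and hence E[B_T] = 0, and for the hitting time of 1, where B_T = 1 and no such identity can hold.

```python
import numpy as np

rng = np.random.default_rng(6)

def stopped_value(a, b, dt=1e-3, t_max=30.0, n_paths=1000):
    """B_T for T = first exit time of a discretised BM from (-a, b); np.nan if T > t_max."""
    vals = []
    for _ in range(n_paths):
        B = np.cumsum(np.sqrt(dt) * rng.standard_normal(int(t_max / dt)))
        hit = np.nonzero((B <= -a) | (B >= b))[0]
        vals.append(B[hit[0]] if hit.size else np.nan)
    return np.array(vals)

# T = exit time of (-1, 2): E[T] < infinity, so Theorem 4.9 gives E[B_T] = 0
v = stopped_value(1.0, 2.0)
print("E[B_T], exit from (-1, 2):", np.nanmean(v))        # ~ 0

# T = first hitting time of 1 (Remark 4.10): here B_T = 1, so E[B_T] = 1 and E[T] must be infinite
w = stopped_value(np.inf, 1.0)
print("fraction of paths hitting 1 before t_max:", np.mean(~np.isnan(w)))   # strictly below 1
print("E[B_T | T <= t_max]                     :", np.nanmean(w))           # ~ 1 (up to overshoot)
```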

19 Proof of Theorem 4.9: We show that (a) implies (b). Suppose E[T] <, and define M k = max t 1 B t+k B k and M = T k=1 M k. Then T E[M] = E[ M k ] = = k=1 E[I {T>k 1} M k ] k=1 P[T > k 1]E[M k ] k=1 E[M ]E[T +1] But E[M ] = E[max t 1 B t ] < (Proof: exercise). If (b) is satisfied, we can apply the optional stopping theorem (Theorem 4.8) with S =, giving E[X T F ] = X, P-a.s. which yields that E[B T ] =. 5 Stochastic integrals with respect to Brownian Motion Let (B t ) t be a BM on some probability space (Ω,F,P) and F t the completion of σ({b s,s t}) (see Theorem 1.32 in the Probability Theory lecture notes). Then, (B t ) t is adapted to (F t ) t. Definition 5.1 A process {X t (ω) : t,ω Ω} is progressively measurable if for each t the mapping X : [,t] Ω R is measurable with respect to the σ-field B [,t] F t. Lemma 5.2 Any process (X t ) t which is adapted and either right-continuous or leftcontinuous is also progressively measurable. Proof: Assume (X t ) t is right-continuous. Fix t >. For n N, s t define X (n) (ω) = X (ω) and X (n) kt (k +1)t s (ω) = X(k+1)t(ω) for < s k =,1,2,...,2 n 1. 2n 2n 2 n The mapping (s,ω) X s (n) (ω) is B [,t] F t -measurable. By right-continuity, we have lim X s (n) (ω) = X s (ω) for all s [,t] and ω Ω, hence the limit mapping (s,ω) X s (ω) is also B [,t] F t -measurable. The left-continuous case is analogous. 19

20 We construct the integral by starting with simple progressively measurable processes H t (ω) and then proceeding to more complicated ones. Consider first step processes {H t (ω) : t,ω Ω} of the form H t (ω) = k A j (ω)i (tj,t j+1 ](t) for t 1 <... < t k+1 j=1 where A j is F tj -measurable, 1 j k. We define the integral as H s db s := k A j (B tj+1 B tj ) j=1 NowletH beaprogressively measurableprocesssatisfyinge [ H 2 s ds] <.SupposeH canbeapproximatedbyasequence ofprogressively measurablestepprocesses H (n),n 1, then we define H s db s := lim More precisely, let H 2 2 := E [ H 2 sds ]. We will show that H (n) s db s. (5.1) (1) Every progressively measurable H satisfying E [ H 2 s ds] < can be approximated in the 2 - norm by progressively measurable step processes. (2) For each approximating sequence, the limit in (5.1) exists in the L 2 -sense. (3) This limit does not depend on the approximating sequence of step processes. We start with (1): Lemma 5.3 For every progressively measurable process {H s (ω) : s,ω Ω} satisfying E [ H 2 s ds] < there exists a sequence (H (n) ) n N of progressively measurable step processes such that lim H (n) H 2 =. Proof: We approximate the progressively measurable process successively by a bounded progressively measurable process a bounded, almost surely continuous progressively measurable process a progressively measurable step process. Let H = {H s (ω),s,ω Ω} be a progressively measurable process with H 2 <. First define H (n) s (ω) = H s (ω) s n otherwise. 2

21 Clearly, lim H (n) H 2 =. Second, approximate any progressively measurable process H on a finite interval by truncating its values, i.e. define H (n) by H (n) s = (H s (ω) n) ( n). Clearly, H (n) is progressively measurable and H (n) H 2. Now we approximate any uniformly bounded progressively measurable H by bounded, almost surely continuous progressively measurable processes. Let h = 1 and n H s (n) (ω) = 1 s H t (ω)dt h s h (where we set H s (ω) = H (ω) for s < ). H (n) is again progressively measurable (since we only take averages over the past). H (n) is almost surely continuous. Further, ω Ω and almost every (with respect to Lebesgue measure) s [,t], 1 lim h h s s h H t (ω)dt = H s (ω). Since H is uniformly bounded, we obtain that lim H (n) H 2 =. Finally, a bounded, amost surely continuous, progressively measurable process H can be approximated by a sequence of progressively measurable step processes H (n) by taking ( ) j H s (n) = H n,ω for j n s j +1 n. The process H (n) are again progressively measurable and one easily sees that This completes the proof of Lemma 5.3. lim H(n) H 2 =. Lemma 5.4 Let H be a progressively measurable step process and E [ H 2 sds ] <. Then [ ( ) ] 2 [ ] E H s db s = E Hsds 2 Proof: Let H = k i=1 A ii (ai,a i+1 ] be a progressively measurable step process. Then [ ( ) ] [ 2 k ] E H s db s = E A i A j (B ai+1 B ai )(B aj+1 B aj ) = k E[A 2 i(b ai+1 B ai ) 2 ]+2 i=1 = k i,j=1 k i=1 j=i+1 E[A i A j (B ai+1 B ai ) E[(B aj+1 B aj ) F aj ]] k [ E[A 2 i]e[(b ai+1 B ai ) 2 ] = E i=1 ] Hsds 2 21
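Lemma 5.4 can be checked by Monte Carlo for a concrete step process. Below (illustration only; numpy assumed) the integrand is H_s = B_{1/2} 1_{(1/2,1]}(s), a step process with an F_{1/2}-measurable value, so ∫_0^1 H_s dB_s = B_{1/2}(B_1 - B_{1/2}) and both sides of the isometry equal 1/4.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 500_000

B_half = np.sqrt(0.5) * rng.standard_normal(n)     # B_{1/2}
incr   = np.sqrt(0.5) * rng.standard_normal(n)     # B_1 - B_{1/2}, independent of B_{1/2}

# H_s = B_{1/2} * 1_{(1/2, 1]}(s) is a step process with F_{1/2}-measurable value A_1 = B_{1/2}
ito_integral  = B_half * incr                      # int_0^1 H_s dB_s = A_1 (B_1 - B_{1/2})
time_integral = B_half ** 2 * 0.5                  # int_0^1 H_s^2 ds

print("E[(int H dB)^2] :", np.mean(ito_integral ** 2))   # both ~ 1/4
print("E[int H^2 ds]   :", np.mean(time_integral))
```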

22 Corollary 5.5 Suppose (H (n) ) n N is a sequence of progressively measurable step processes such that Then E[ E[( (H (n) s H (m) s ) 2 ds] n,m. (H (n) s H (m) s )db s ) 2 ] n,m. Proof: Because the difference of two progressively measurable step processes is again a progressively measurable step process, Lemma 5.4 can be applied to H (n) H (m) and yields the claim. We showed (1). The following theorem addresses (2) and (3). Theorem 5.6 Suppose (H (n) ) n N is a sequence of progressively measurable step processes and H a progressively measurable process such that Then lim E[ (H s (n) H s ) 2 ds] = lim H (n) s db s =: H s db s exists as a limit in the L 2 -sense and does not depend on the choice of (H (n) ) n N. Moreover, we have E[ (H (n) s H (m) s ) 2 db s ] n,m (5.2) Proof: ( By the triangle inequality (H (n) ) n N satisfies the assumptions of Corollary 5.5 and ) hence H (n) s db s is a Cauchy sequence in L 2. Since L 2 is complete, the limit exists n N and does not depend on the choice of the approximating sequence. Finally, (5.2) follows from Lemma 5.4, applied to H (n), taking the limit as n. This completes the construction of the stochastic integral H 2 sdb s for progressively measurable processes H with E [ H 2 s ds] <. Remark 5.7 If the sequence of step processes in Theorem 5.6 is chosen such that then by (5.2) we get n=1 and therefore, almost surely, [ ] E (H s (n) H s ) 2 ds n=1 <, [ ( ) ] 2 E (H s (n) H s )db s < ( n=1 ) 2 H s (n) B s H s db s <. 22

23 This implies that, almost surely, lim H (n) s db s = H s db s. We now want to describe the stochastic integral as a process in time. We will see that it will be a continuous martingale. Definition 5.8 Suppose H = {H s (ω) : s,ω Ω} is progressively measurable with E [ H 2 s ds] <. Define the progressively measurable process {Hs t (ω),s,ω Ω} H t s (ω) := H s(ω)i {s t}. Then, the stochastic integral of H up to time t is defined H s db s := H t s db s. Remark 5.9 We have seen already in Lemma 3.4 that for g L 2 [,1] we can define 1 g(s)db s. Provided that both integrals exist, the Paley-Wiener integral from Lemma 3.4 agrees with the stochastic integral just defined. Proof: See exercises. Definition 5.1 A stochastic process (X t ) t is a modification of a stochastic process (Y t ) t if, for every t, we have P[X t = Y t ] = 1. [ ] t Theorem5.11 Assume that(h s (ω)) s is progressivelymeasurableand E H s(ω) 2 ds < (, t. Then there exists a modification (M t ) t of H sdb s such that )t P[t M t (ω) is continuous] = 1. Further, (M t ) t is a martingale and hence [ ] E H s db s = t. Proof: Fix t N and let H (n) be a sequence of progressively measurable step processes such that E [ ( H (n) H t 2 ) ] 2 (H s (n) H t s )db s. For s t the random variable s H(n) u db u is F s -measurable and E[ s H(n) u ( db u F s ] = (proof: exercise!) which implies that the process H(n) u db u is a martingale, n. ) t t By Doob s maximal inequality, see below, for p = 2, [ ( ) 2 ] [ ( E H s (n) db s H s (m) db s 4E sup t t 23 (H (n) s H (m) s )db s ) 2 ].

24 This implies that M (n) t := H(n) s db s, t t, n =,1,2,... defines a Cauchy sequence in the space of continuous functions on [,t ] (equipped with the supremum norm). We denote the limit of this Cauchy sequence by (M t ) t t. Hence, the process (M t ) t t is almost surely a uniform limit of continuous processes and therefore almost surely continuous. Due to Theorem 5.6, [ P M t = H s db s ] = 1. For fixed t [,t ], the random variable H sdb s is the limit (in L 2 ) of H(n) s db s, hence it is F t -measurable, and H t s db s has conditional expectation E[ H t s db s F t ] =. Therefore, H sdb s is a conditional expectation of M t, given F t, i.e. [ ] M t = E H s db s F t. Therefore, (M t ) t t is a martingale, as a process of successive predictions, (see (14.4) in Probability Theory lecture notes). Doob s maximal inequality Suppose (X t ) t is a continuous martingale and p > 1. Then, for any t ( ) p p E[ sup X s p ] E[ X t p ]. s t p 1 Proof: See literature. 6 Itô s formula and examples Let f C 1 (R) (f continuously differentiable) and x : [, ) R x continuous and BV on [,t]. Then f(x(t)) f(x()) = f (x(s))dx(s) Example: x(s) = s, s. Then f(t) f() = f (s)ds. Itô s formula gives an analogue for the case when x is replaced by a BM B t. The crucial difference is that the second derivative of f is needed. Theorem 6.1 (Itô s formula I) Let f : R R be twice continuously differentiable such that E[ (f (B s )) 2 ds] < for some t >, where (B t ) t is a BM. Then, almost surely for all s [,t] f(b s ) f(b ) = s f (B u )db u s f (B u )du. 24

25 Example 6.2 Let f(x) = x 2. We have E[ s B2 u du] = s udu <, s >. Hence s s Bs 2 = 2 B u db u +s Bs 2 s = 2 B u db u. We conclude from Theorem 5.11 that (B 2 s s) s is a martingale. Example 6.3 Let f(x) = x 3. We have E[ s B4 u du] = s E[B4 u ]du <, s >. Hence Let B 3 s = 3 s B 2 udb u +3 s B u du. s M s = Bs 3 3 B u du, s. We conclude from Theorem 5.11 that (M s ) s is a martingale. To prove Theorem. 6.1, we will need the following. Theorem 6.4 Let f : R R is continuous, t >, and = t (n) 1 <... < t (n) n partitions such that their mesh max 1 i n 1 t(n) i+1 t(n) i. = t are Then n 1 j=1 f(b (n) t )(B (n) j t j+1 B (n) t ) 2 f(b s )ds j Proof: For f 1, see Theorem For general f, see Theorem 7.12 in Mörter/Peres. Proof of Theorem 6.1: We write w(δ,m) for the modulus of continuity of f on [ M,M]: w(δ,m) = sup f (s) f (t) s,t [ M,M], s t <δ Using Taylor s formula, for any x,y [ M,M],x < y with x y < δ, f(y) f(x) f (x)(y x)+ 1 2 f (x)(y x) 2 w(δ,m)(y x) 2. Take a sequence = t (n) 1 <... < t (n) n = t. We write = t 1 <... < t n = t for simplicity. With and we get δ B := max 1 i n 1 B t i+1 B ti M B := max s t B s, n 1 (f(b ti+1 ) f(b ti )) i=1 n f (B ti )(B ti+1 B ti ) i=1 n 1 i=1 1 2 f (B ti )(B ti +1 B ti ) 2 25

26 n 1 w(δ B,M B ) (B ti+1 B ti ) 2 i=1 Now, n 1 i=1 f(b t i+1 ) f(b ti ) = f(b t ) f(b ), and there is a sequence of partitions with mesh going to s.t. n 1 f (B ti )(B ti+1 B ti ) i=1 n 1 f (B ti )(B ti+1 B ti ) 2 i=1 f (B s )db s f (B s )ds n 1 (B ti+1 B ti ) 2 t P a.s. i=1 P a.s. P a.s. By continuity of the Brownian path, w(δ B,M B ) converges almost surely to. This proves Itô s formula for fixed t, or indeed almost surely for all s Q [,t]. Since all the terms in Itô s formula are continuous almost surely, we get the result simultaneously for all s [,t]. We next state Itô s formula for functions f which can depend also on time. Theorem 6.5 (Itô s formula II) Let f : R R R, (x,t) f(x,t) be twice continuously differentiable in the x-coordinate and once continuously differentiable in the t-coordinate. Assume that E[ ( xf(b s,s)) 2 ds] < for some t >. Then, almost surely for all s [,t]: f(b s,s) f(b,) = Proof: See Mörter/Peres. Example 6.6 Fix α >. s Then f(b s,s) 1 = x f(b u,u)db u + s d t f(b u,u)du+ 1 2 f(b t,t) = e αbt 1 2 α2t = M t (M = 1) s αm u db u + s M s M = s s ( 1 2 α2 )M u du+ 1 2 αm u db u. s xx f(b u,u)du α 2 M u du We conclude from Theorem 5.11 that (M t ) t is a martingale (details: exercise). M solves the stochastic differential equation dm s = αm s db s, M = 1 which can be written in integral form M s = 1+ s M u db u, s. 26
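The martingale of Example 6.6 can be tested directly: since M_0 = 1, one should see E[M_t] = 1 for every t. A minimal Monte Carlo sketch, assuming numpy (the value of α is arbitrary):

```python
import numpy as np

rng = np.random.default_rng(9)
alpha = 1.0
n = 1_000_000

for t in (0.5, 1.0, 2.0):
    B_t = np.sqrt(t) * rng.standard_normal(n)
    M_t = np.exp(alpha * B_t - 0.5 * alpha ** 2 * t)
    print(f"t = {t}:  E[M_t] approx {M_t.mean():.4f}")    # ~ 1 for every t, as for a martingale with M_0 = 1
```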

27 Definition 6.7 (B t ) t = (B (1) t,...,b (d) t ) t is a d-dim. BM if (B (1) t ) t,...,(b (d) t ) t are iid one-dimensional BMs. For H s = (H s (1),...,H s (d) ) we write H s db s = d i=1 H (i) s db(i) s. Theorem 6.8 (Multidimensional Itô formula) Let (B t ) t be a d-dim. BM and f : R d+1 R be such that the partial derivatives i f and jk f exist for all 1 i d+1,1 j,k d and are continuous. If for some t > [ E ] x f(b s,s) 2 ds < where x f = ( 1 f,..., d f) then, almost surely, for all s t f(b s,s) f(b,) = where x f = d j=1 jjf. s x f(b u,u)db u + s d+1 f(b u,u)du+ 1 2 s x f(b u,u)du 7 Pathwise stochastic integration with respect to continuous semimartingales We saw that for a continuous process H with E[ H2 sds] <, H sdb s can be defined as an almost sure limit, see Remark 5.7. Can we replace (B t ) t with another continuous martingale? Definition 7.1 Let E n = { = t (n) 1 <... < t (n) n } be a sequence of partitions with s(e n ) = sup t (n) i+1 t(n) i. Then, the function X t has continuous quadratic variation along the sequence E n if exists P-a.s. X t = lim (X tj+1 X tj ) 2 (7.1) t i E n Remark 7.2 X t is increasing and continuous, hence it defines a measure ν on (R,B) given by ν((a,b]) = X b X a. In particular, if f : R R is continuous, f(s)d X s is well-defined. In analogy to Theorem 6.4, we have 27

28 Theorem 7.3 Assume that X t has continuous quadratic variation X t and f : R R is continuous. Then, for t >, f(x ti )(X tj+1 X tj ) 2 t i E n Proof: Clear for f 1 due to (7.1). Rest see literature. f(x s )d X s Note that the assumption X t has continuous quadratic variation can only be satisfied for continuous processes. Theorem 7.4 (Itô s formula for (deterministic) functions with continuous quadratic variation) Assume that the function X t has continuous quadratic variation X t and let f : R R be twice continuous differentiable. Then, where f(x t ) f(x ) = f (X u )dx u f (X u )dx u = lim t i E n Sketch of proof: As in the proof of Theorem 6.1, we have f (X u )d X u (7.2) f (X ti )(X ti+1 X ti ). f(x ti+1 ) f(x ti ) f (X ti )(X ti+1 X ti ) t i E n t i E n t i E n w(δ X,M X ) (X ti+1 X ti ) f (X ti )(X ti+1 X ti ) 2 where and and Now, and t i E n δ X = max X ti+1 X ti t i E n w(δ,m) = sup M X = max s t X s s,t [ M,M] s t <δ f (s) f (t). (f(x ti+1 ) f(x ti )) f(x t ) f(x ) t i E n f (X ti )(X ti+1 X ti ) 2 t i E n f (X s )d X s 28

29 and (X ti+1 X ti ) 2 X t. t i E n Moreover, w(δ X,M X ) δ. Hence, f (X ti )(X ti+1 X ti ) t i E n has to converge as well and (7.2) holds. Remark 7.5 (1) Theorem 7.4 gives a pathwise version of Itô s formula, without probability. (2) If X is BV on [,t], X t = and (7.2) becomes Short notation: f(x t ) f(x ) = f (X u )dx u. df(x) = f (X)dX classical differential If X has continuous quadratic variation X and X t, then df(x) = f (X)dX f (X)d X Itô differential We have defined f (X u )dx u for all continuous X t with continuous quadratic variation X t. Which stochastic processes X t have a.s. continuous quadratic variation X t? Lemma 7.6 (i) Assume X t has continuous quadratic variation X t. Then, if f is continuously differentable, f(x t ) has continuous quadratic variation f(x) t = f (X s ) 2 d X s (ii) Let X t = M t +A t,t, where M t has quadratic variation M t and A t has quadratic variation A t =, then X t has quadratic variation X t and X t = M t. (iii) Let f C 1 (R) and assume X t has continuous quadratic variation X t. Then, M t = f(x s)dx s has continuous quadratic variation M t = f(x s ) 2 d X s. (iv) Let f C 1 (R 2 ) and assume X t has continuous quadratic variation X t. Then, g(t) = f(x t,t) has continuous quadratic variation ( ) 2 f g t = x (X s,s) d X s. 29

30 Examples: 1. (B t ) t BM, α >,Z t = e αbt,t. Then, Lemma 7.4 (i) implies that Z t = α2 e 2αBs ds = α2 Zs 2ds P-a.s. Note that Z t is random (whereas B t = t, t P-a.s.). 2. (B t ) t BM,α >,M t = e αbt 1 2 α2t,t. Then, weknowthatm t = 1+ αm sdb s,t. Hence, Lemma 7.4 (iv) implies that M t = α2 Ms 2ds P-a.s. Note that M t is random (whereas B t = t, t P-a.s.). Proof of Lemma 7.4: (i) i X := X ti+1 X ti. Then and Hence, f(x ti+1 ) f(x ti ) = f(x ti ) i X +R i R i sup f (X s ) f (X u ) i X s,u [t i,t i+1 ] (f(x ti+1 ) f(x ti )) 2 = f (X ti ) 2 ( i X) 2 + Ri 2 +2 f (X ti ) i X R i t i E n But due to Theorem 7.3. Rest: Exercise. t i E n f (X ti ) 2 ( i X) 2 t i E n (ii) i M = M ti+1 M ti, i A = A ti+1 A ti. Then Hence i:t i E n ( i X) 2 = t i E n f (X s ) 2 d X s ( i X) 2 = ( i M) 2 +( i A) 2 +2 i M i A. i:t i E n ( i M) 2 + i:t i E n ( i A) 2 +2 i:t i E n t i E n i M i A M t, since ( i A) 2 and (details: exercise). i:t i E n i:t i E n i M i A 3

31 (iii) Due to (7.2), taking g such that g = f, M t = g(x t ) g(x ) 1 2 g (X s )d X s. But g (X s )d X s is continuous and BV as a function of t. Using (ii) and (i), M t = g(x) t = (iv) See [5], Remark g (X s ) 2 d X s = f(x s ) 2 d X s Theorem 7.7 If (X t ) t is a continuous martingale with X = and E[X 2 t] <, t there exists a unique process X t with X = which is continuous, adapted and increasing, such X 2 t X t is a martingale. Moreover, X t has quadratic variation X t. Proof: If (B t ) t is a BM and (H t ) t progressively measurable and X t = H sdb s, then X t = H2 sds P-a.s. (see Lemma 5.4 and Lemma 7.4(i).) General case: See literature. X t is called quadratic variation or bracket or compensator of (X t ). The first part of Theorem 7.6 has a discrete time analogue. A stochastic process (Y n ) is previsible w.r.t. a filtration (A n ) if Y n is A n 1 -measurable n. Theorem 7.8 (Doob decomposition) Suppose (X n ) n=,1,2,... is a martingale w.r.t. the filtration (A n ) and E[X 2 n] <, n. Then, there is a unique previsible increasing process (A n ) with A = such that (X 2 n A n ) n=,1,2,... is a martingale w.r.t. (A n ). Proof: Let A = and define, for n 1,A n = A n 1 +E[X 2 n A n 1] X 2 n 1. Clearly, A n is A n 1 -measurable n. Since (X 2 n) n=,1,2,... is a submartingale w.r.t. A n, A n is increasing. Further, E[X 2 n A n A n 1 ] = E[X 2 n 1 A n 1 A n 1 ] = X 2 n 1 A n 1 (X 2 n A n) n=,1,2,... is a martingale w.r.t. (A n ). To prove the uniqueness, assume that (A n ) and (B n ) both fulfill the requirements. Then A n B n is a previsible martingale starting at, and we infer that A n B n = E[A n B n A n 1 ], hence, by induction, A n = B n n. Definition 7.9 A continuous semimartingale w.r.t. a filtration (F t ) t is a process (X t ) t which is adapted to (F t ) t and which has a decomposition X t = X + M t + A t, t P-a.s. where (M t ) t is a continuous martingale and (A t ) t is a continuous adapted process which is BV (on each interval [,t]). If E[M 2 t] <, t, we define for f C 1 (R) f(s)dx s := f(s)dm s + 31 f(s)da s

32 8 Cross-variation and Itô s product rule Definition 8.1 The cross-variation X, Y is given by (provided that the limit exists). Clearly, X,X = X. X,Y t = lim t i E n Lemma 8.2 The following statements are equivalent (i) X,Y exists and t X,Y t is continuous. (ii) X +Y exists and t X +Y t is continuous. If (i) and (ii) hold, then (X ti+ X ti )(Y ti+ Y ti ) X,Y = 1 ( X +Y X Y ) (8.1) 2 In particular, in this case g(s)d X,Y s is well-defined for g C[, ). Proof: (X ti+1 X ti )(Y ti+1 Y ti ) = 1 ( ) 2 ((Xti+1 +Y ) (X +Y )) 2 X ti+1 ti ti (Xti+1 ti )2 (Y Y ti+1 ti )2. Example 8.3 Let (X t ) t and (Y t ) t be independent BMs. Claim: For P-a.a. ω Proof: X(ω),Y(ω) t = t X,Y = 1 ( X +Y X Y ) 2 and we have X t = t, t, and Y t = t, t P-a.s. Suffices to show X+Y = 2t t P-a.s. Z t := 1 2 (X t +Y t ), t is again a BM (Proof: exercise). Therefore, Z t = t t P-a.s. implying that X +Y = 2t, t P-a.s. Lemma 8.4 X, X continuous as before, f,g C 1 (R), Then Y t = f(x s )dx s, Z t = Y,Z t = g(x s )dx s f(x s )g(x s )d X s (8.2) 32
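Example 8.3 can be seen on simulated paths: for two independent BMs the sums Σ(X_{t_{i+1}} - X_{t_i})(Y_{t_{i+1}} - Y_{t_i}) along finer partitions tend to 0, while the corresponding sums for ⟨X⟩ and ⟨X+Y⟩ tend to t and 2t, consistent with (8.1). A sketch, assuming numpy:

```python
import numpy as np

rng = np.random.default_rng(10)
t, n_fine = 1.0, 2 ** 16

dX = np.sqrt(t / n_fine) * rng.standard_normal(n_fine)
dY = np.sqrt(t / n_fine) * rng.standard_normal(n_fine)    # independent of dX
X = np.concatenate(([0.0], np.cumsum(dX)))
Y = np.concatenate(([0.0], np.cumsum(dY)))

for n in (8, 12, 16):
    step = n_fine // 2 ** n
    iX, iY = np.diff(X[::step]), np.diff(Y[::step])
    print(f"2^{n} intervals:   <X,Y> ~ {np.sum(iX * iY):+.4f}"
          f"   <X> ~ {np.sum(iX ** 2):.4f}"
          f"   <X+Y> ~ {np.sum((iX + iY) ** 2):.4f}")
# cross-variation -> 0, <X>_t -> t = 1, <X+Y>_t -> 2t = 2, in line with (8.1)
```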

33 Proof: Define F and G by F = f, G = g, F() = G() =. Then, Y t = F(X t )+A t where and Z t = G(X t ) B t where A t = F(X ) 1 2 F (X s )d X s B t = G(X ) 1 2 G (X s )d X s. A t,b t are continuous with quadratic variation. Hence, Y,Z t = 1 2 ( Y +Z t Y t Z t ) = 1 ( (f(x s )+g(x s )) 2 d X s 2 = f(x s )g(x s )d X s f(x s ) 2 d X s g(x s ) 2 d X s ) Theorem 8.5 (Itô s formula in d dimensions) X = (X (1),...,X (d) ) : [, ) R d where X (i), 1 i d are continuous with continuous quadratic variation X (i) t and continuous cross-variations X (i),x (k) t, 1 i, k d. Let f C 2 (R d ). Then, f(x t ) f(x ) = ( f,dx s )+ 1 2 d i,k=1 2 f x i x k (X s )d X (i),x (k) s where ( f,dx s ) = d k=1 f x i (X s )dx (i) s. Proof: Analogous to the proof of Theorem 6.1: Taylor formula forf(x ti+1 ) f(x ti ). See literature. Note that Theorem 6.8 follows, using Example 8.3. Corollary 8.6 (Itô s product rule) Assume that X,Y, X, Y and X,Y are continuous. Then, t > Short notation: X t Y t = X Y + Y s dx s + X s dy s + X,Y t (8.3) d(x Y) = YdX +XdY +d X,Y (8.4) Proof: Apply Theorem 8.5 with d = 2,f(x,y) = x y. 33

34 Example 8.7 (Ornstein-Uhlenbeck process) Let α > and (B t ) t a BM, x R. Then, we say that X t = e αt x +e αt e αs db s, t (8.5) is an Ornstein-Uhlenbeck process with parameter α and starting point x. Claim: (X t ) t solves the stochastic differential equation dx t = db t αx t dt X t = B t i.e. αx sds X = x X = x Proof: X t = e αt x +e αt e αs db s dx t = αx e αt dt α(e αt e αs db s )dt+e αt e αt db t = αx t dt+db t, where we applied Itô s product rule to e αt eαs db s. Claim: If X d = N(, 1 2α ), X independent of (B t ) t, then X t = e αt X +e αt e αs db s is a Gaussian process with E[X t ] =, t and Cov(X s,x t ) = 1 2α e α t s. Hence, for any t, E[Xt] 2 = 1 X d 2α t = N(, 1 ). 2α Inparticular, fortheprocessz t = e t B e 2t,t, inexample2.4, ( 1 2 Z t ) t isanornstein- Uhlenbeck process with α = 1 and X = N(, 1 2 ). Proof: Clearly, E[X t ] =, t. Assume s t. Then, E[X s X t ] = E[X 2 ]e αt e αs +++e α(s+t) E[ s e αu db u e αv db v ] For any martingale (M t ) t, E[M t M s ] = E[Ms 2 ] for s t.(proof: exercise). Hence, applying this to the martingalem t = eαu db u, E[X s X t ] = 1 2α e α(t+s) +e α(t+s) E[( = 1 2α e α(t+s) +e α(t+s) s s e αv db v ) 2 ] e 2αv dv = 1 2α e α(t+s) +e α(t+s) 1 2α (e2αs 1) = 1 2α e α(t s). 34

35 9 Stochastic Differential Equations In this chapter we want to study stochastic differential equations (SDE) of the form: X = ξ dx t = σ(t,x t ) db t +b(t,x t )dt (9.1) (X t ) t = (X (1) t,x (2) t,...,x (n) t ) t is an unknown R-valued process, (B t ) t = (B (1) t,b (2) t,...,b (m) t ) t is an m-dimensional Brownian motion and b(t,x) and σ(t,x) are measurable functions of (t,x) R + R n. The drift vector b(t,x) is R n -valued and the dispersion matrix σ(t,x) is an n m-matrix valued. Further ξ is a random variable with values in R n which is independent of (B t ) t. Definition 9.1 We say the SDE (9.1) has the strong solution (X t ) t if the following conditions hold: (i) (X t ) t is adapted to the filtration (F ξ t) t where F ξ t is the completion of σ(ξ,{b s : s t}) for t. (ii) (X t ) t satisfies the following integral equation: for t,i = 1,2,...n. X (i) t = ξ (i) + m j=1 σ ij db (j) s + b i (s,x s )ds (9.2) Remark 9.2 (1) Processes which fulfill the integral equations in (9.2) and are defined on a possibly enlarged probability space, but do not have to be adapted to the filtration (F ξ t) t, are called weak solutions. (2) The second property of a solution of a SDE includes the requirement that the integrals in (9.2) are well defined. Example 9.3 For α,β R consider the SDE X = 1 dx t = αx t db t +βx t dt for a one-dimensional BM (B t ) t. This SDE has the (unique) strong solution X t = exp(αb t +(β α2 2 )t). Proof: exercise Remark The SDE from Example 9.3 is used for the Black-Scholes model. 35
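Example 9.3 can be checked against a standard discretisation. The sketch below (illustrative; numpy assumed, and the Euler-Maruyama scheme is a textbook method, not something introduced in the notes) integrates dX_t = αX_t dB_t + βX_t dt with the same Brownian increments that enter the exact solution X_t = exp(αB_t + (β - α²/2)t), and compares the two at time t.

```python
import numpy as np

rng = np.random.default_rng(11)

alpha, beta = 0.4, 0.1
t, n = 1.0, 2 ** 12
dt = t / n
dB = np.sqrt(dt) * rng.standard_normal(n)

# Euler-Maruyama: X_{k+1} = X_k + alpha X_k dB_k + beta X_k dt, X_0 = 1
X = 1.0
for k in range(n):
    X += alpha * X * dB[k] + beta * X * dt

X_exact = np.exp(alpha * np.sum(dB) + (beta - 0.5 * alpha ** 2) * t)   # exact strong solution at time t
print("Euler-Maruyama X_t :", X)
print("exact solution X_t :", X_exact)      # close, since both use the same driving increments
```

The agreement on a fixed path improves as dt decreases; with a larger step the discretisation error becomes visible.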

36 Remark 9.4 In the following Theorem we use x 2 = n (x i ) 2 for x = (x 1,...,x n ) R n, i=1 σ 2 = n m (σ ij ) 2 for σ = (σ ij ) ij R n m. i=1 j=1 Theorem 9.5 (Existence and uniqueness) Assume that b : R + R n and σ : R + R n R n m are measurable and satisfy the Lipschitz-condition as well as the growth condition σ(t,x) σ(t,y) + b(t,x) b(t,y) K x y (9.3) σ(t,x) 2 + b(t,x) 2 K 2 (1+ x 2 ) (9.4) for a constant K >, for all t and x,y R n. Let ξ be a random variable with values in R n which is independent of the m-dimensional BM (B t ) t such that E[ ξ 2 ] <. Then the SDE (9.1) has a unique, continuous, strong solution (X t ) t such that [ T ] E X t 2 dt < T > (9.5) Remark 9.6 Uniqueness intheorem9.5meansthatfortwocontinuoussolutions(x t ) t, (X t) t of the SDE (9.1) which fulfill the properties of Theorem 9.5 we have P(X t = X t t ) = 1. Example 9.7 To illustrate that we need some conditions like (9.3) and (9.4) let us look at the following examples from ODEs: dx t dt = (X t ) 2, X = 1 corresponding to b(x) = x 2 (which does not satisfy (9.4)) has the unique solution Another example: X t = 1 1 t dx t dt for t < 1. = 3(X t ) 2/3, X = where b(x) = 3x 2/3 does not satisfy (9.3) at x = and, for t a, X t = (t a) 3, for t > a. are solutions for all a >. 36

37 Sketch of the proof of Theorem 9.5: Uniqueness: Uses the following lemma Lemma 9.8 Gronwall inequality Let f : [,T] R be integrable and A R,C > such that Then Proof: see literature. f(t) A+C f(t) A+e Ct f(s)ds t [,T], t [,T]. Consider two continuous solutions (X t ) t,(x t ) t of the SDE (9.1) which fulfill condition (9.5) X t X t = (σ(s,x s ) σ(s,x s )) db s + Therefore (using (a+b) 2 2a 2 +2b 2 ), X t X t 2 (σ(s,x s ) σ(s,x s)) db s 2 +2 (b(s,x s ) b(s,x s ))ds (b(s,x s ) b(s,x s))ds 2 A short calculation for the first term (see Theorem 5.6 for n = m = 1) and an application of the Cauchy-Schwarz inequality shows E[ X t X t 2 ] 2 E[ (σ(s,x s ) σ(s,x s)) 2 ]db s +2t E[ (b(s,x s ) b(s,x s)) 2 ]ds Therefore we have for f(t) := E[ X t X t 2 ] and C := 2(T + 1)K 2 (for T > ) due to condition (9.3) f(t) C f(s)ds for t T and the Gronwall inequality implies f. We can conclude due to the continuity of (X t ) t,(x t ) t that we have P(X t = X t t ) = 1. (using the fact that Q + is countable and dense in R + ). Existence: We define for n N X (n+1) t := ξ + σ(s,x (n) s ) db s + b(s,x (n) s )ds for t inductively where X t := ξ. Using the growth condition we can show inductively that for T > T E[ X (n) s 2 ]ds <, i.e. the stochastic integral is well defined in every step of the recursion. One can show that (X (n) ) n N converges a.s. uniformly on [,T] for every T > and the limit solves the SDE (9.1) and has the properties of Theorem 9.5 (more details can be found in the literature). 37

38 1 Girsanov transforms Goal: Construction of a stochastic process (X t ) t with dx t = b(x t,t)dt+db t where (B t ) t is a BM, i.e. X t = x + Interpretation: X = x b(x s,s)ds+b t (1.1) Deterministic process X t with d X dt t = b(x t,t) with additional noise (B t ) t. Possible strategies: 1) Construction of a strong solution: For a given BM (B t ) t on (Ω,A,P), solve (1.1). Example 1.1 (Ornstein-Uhlenbeck process) dx t = db t αx t dt X = x has the strong solution see Example 8.8. X t = e αt (x + e αs db s ), (1.2) Drawback: a strong solution does not always exist. 2) Construction of a weak solution: Find a BM (B t ) t and a process (X t ) t on some probability space (Ω,A,P) such that (1.1) holds, i.e. find (X t ) t such that is a BM. B t = X t x b(x s,s)ds General Girsanov transformation (Ω,A,P) probability space, (F t ) t filtration, P probability measure on (Ω,A). Assume that for all t, P Ft P Ft. Then, there are Radon-Nikodym derivatives Z t := d P dp F t = d P Ft dp Ft (1.3) 38

39 and (Z t ) t is a martingale with respect to (F t ) t and P (see Probability Theory lecture notes, ) We always assume that (Z t ) t has continuous paths and that inf{t : Z t (ω) = } = P-a.s. Definition 1.2 (M t ) t is a local martingale (up to ) if there is a sequence of stopping times T 1 T 2... such that i) supt n = n P-a.s. ii) (M t Tn ) t is a martingale, n. Each martingale is a local martingale but there are local martingales which are not martingales. We will use the following important fact: If M is a continuous local martingale and (Y t ) t a continuous adapted process, then Y sdm s, t is again a continuous local martingale. See the literature, for instance [5], Proposition Lemma 1.3 In the above setup, with Z t := d P dp F t, the following holds. (i) For s t and a function g t which is F t -measurable and bounded, we have Ẽ[g t F s ] = 1 Z s E[g t Z t F s ] P-a.s. where Ẽ denotes expectation with respect to P. Proof: (ii) For M = ( M t ) t continuous and adapted, the following two statements are equivalent: a) ( M t ) t is a local martingale with respect to P. b) ( M t Z t ) t is a local martingale with respect to P. (i) Assume g s is F s -measurable and bounded. Ẽ[g s g t ] = E[g s g t Z t ] = E[g s E[g t Z t F s ]] ] = E [g s Z s E[g t Z t F s ] 1Zs [ ] = Ẽ 1 g s E[g t Z t F s ] Z s Ẽ[g t F s ] = 1 Z s E[g t Z t F s ] 39

40 (ii) Assume that ( M t Z t ) t is a martingale with respect to P. Then Ẽ[ M t F s ] = (i) 1 Z s E[ M t Z t F s ] = 1 Z s Ms Z s = M s, P-a.s. ( M t ) t is a martingale with respect to P, hence b) a) for martingales. Rest of the proof: Exercise. Theorem 1.4 Assume that (M t ) is a continuous local martingale with respect to P and Z t is defined as in (1.3). Then, M t = M t is a continuous local martingale with respect to P. Short notation: 1 Z s d M,Z s dm = d M + 1 Z d M,Z Proof: Due to Lemma 1.1, it suffices to show that ( M t Z t ) t is a continuous local martingale with respect to P. Let A t = 1 Z s d M,Z s. Then, M t Z t = Z t (M t A t ) The definition of A implies that Itô s product rule = Z M + (M s A s )dz s Z s da s = Z,M t. Z s da s + Z,M t (1.4) Due to (1.4), Mt Z t M Z is a stochastic integral, hence again a continuous local martingale with respect to P. Remark 1.5 logz t = logz + Z continuous local martingale Y t = Y = 1 Z 2 sd Z s, hence 1 t dz s Z s 1 Z 2 s d Z s 1 Z s dz s is a continuous local martingale with logz t = logz +Y t 1 2 Y t Z t = Z e Yt 1 2 Y t and Z solves dz = ZdY. Using M,Z t = Z sd M,Y s, see Lemma 1.6 below we get from Theorem 1.3 that dm = d M +d M,Y (1.5) 4


PROBABILITY: LIMIT THEOREMS II, SPRING HOMEWORK PROBLEMS

PROBABILITY: LIMIT THEOREMS II, SPRING HOMEWORK PROBLEMS PROBABILITY: LIMIT THEOREMS II, SPRING 15. HOMEWORK PROBLEMS PROF. YURI BAKHTIN Instructions. You are allowed to work on solutions in groups, but you are required to write up solutions on your own. Please

More information

Stochastic Processes II/ Wahrscheinlichkeitstheorie III. Lecture Notes

Stochastic Processes II/ Wahrscheinlichkeitstheorie III. Lecture Notes BMS Basic Course Stochastic Processes II/ Wahrscheinlichkeitstheorie III Michael Scheutzow Lecture Notes Technische Universität Berlin Sommersemester 218 preliminary version October 12th 218 Contents

More information

MA8109 Stochastic Processes in Systems Theory Autumn 2013

MA8109 Stochastic Processes in Systems Theory Autumn 2013 Norwegian University of Science and Technology Department of Mathematical Sciences MA819 Stochastic Processes in Systems Theory Autumn 213 1 MA819 Exam 23, problem 3b This is a linear equation of the form

More information

Lecture 12. F o s, (1.1) F t := s>t

Lecture 12. F o s, (1.1) F t := s>t Lecture 12 1 Brownian motion: the Markov property Let C := C(0, ), R) be the space of continuous functions mapping from 0, ) to R, in which a Brownian motion (B t ) t 0 almost surely takes its value. Let

More information

A Concise Course on Stochastic Partial Differential Equations

A Concise Course on Stochastic Partial Differential Equations A Concise Course on Stochastic Partial Differential Equations Michael Röckner Reference: C. Prevot, M. Röckner: Springer LN in Math. 1905, Berlin (2007) And see the references therein for the original

More information

Lecture 22 Girsanov s Theorem

Lecture 22 Girsanov s Theorem Lecture 22: Girsanov s Theorem of 8 Course: Theory of Probability II Term: Spring 25 Instructor: Gordan Zitkovic Lecture 22 Girsanov s Theorem An example Consider a finite Gaussian random walk X n = n

More information

Verona Course April Lecture 1. Review of probability

Verona Course April Lecture 1. Review of probability Verona Course April 215. Lecture 1. Review of probability Viorel Barbu Al.I. Cuza University of Iaşi and the Romanian Academy A probability space is a triple (Ω, F, P) where Ω is an abstract set, F is

More information

Filtrations, Markov Processes and Martingales. Lectures on Lévy Processes and Stochastic Calculus, Braunschweig, Lecture 3: The Lévy-Itô Decomposition

Filtrations, Markov Processes and Martingales. Lectures on Lévy Processes and Stochastic Calculus, Braunschweig, Lecture 3: The Lévy-Itô Decomposition Filtrations, Markov Processes and Martingales Lectures on Lévy Processes and Stochastic Calculus, Braunschweig, Lecture 3: The Lévy-Itô Decomposition David pplebaum Probability and Statistics Department,

More information

Stochastic Integration and Continuous Time Models

Stochastic Integration and Continuous Time Models Chapter 3 Stochastic Integration and Continuous Time Models 3.1 Brownian Motion The single most important continuous time process in the construction of financial models is the Brownian motion process.

More information

Theoretical Tutorial Session 2

Theoretical Tutorial Session 2 1 / 36 Theoretical Tutorial Session 2 Xiaoming Song Department of Mathematics Drexel University July 27, 216 Outline 2 / 36 Itô s formula Martingale representation theorem Stochastic differential equations

More information

4th Preparation Sheet - Solutions

4th Preparation Sheet - Solutions Prof. Dr. Rainer Dahlhaus Probability Theory Summer term 017 4th Preparation Sheet - Solutions Remark: Throughout the exercise sheet we use the two equivalent definitions of separability of a metric space

More information

On a class of stochastic differential equations in a financial network model

On a class of stochastic differential equations in a financial network model 1 On a class of stochastic differential equations in a financial network model Tomoyuki Ichiba Department of Statistics & Applied Probability, Center for Financial Mathematics and Actuarial Research, University

More information

Topics in fractional Brownian motion

Topics in fractional Brownian motion Topics in fractional Brownian motion Esko Valkeila Spring School, Jena 25.3. 2011 We plan to discuss the following items during these lectures: Fractional Brownian motion and its properties. Topics in

More information

Lecture 21 Representations of Martingales

Lecture 21 Representations of Martingales Lecture 21: Representations of Martingales 1 of 11 Course: Theory of Probability II Term: Spring 215 Instructor: Gordan Zitkovic Lecture 21 Representations of Martingales Right-continuous inverses Let

More information

Selected Exercises on Expectations and Some Probability Inequalities

Selected Exercises on Expectations and Some Probability Inequalities Selected Exercises on Expectations and Some Probability Inequalities # If E(X 2 ) = and E X a > 0, then P( X λa) ( λ) 2 a 2 for 0 < λ

More information

Real Analysis Problems

Real Analysis Problems Real Analysis Problems Cristian E. Gutiérrez September 14, 29 1 1 CONTINUITY 1 Continuity Problem 1.1 Let r n be the sequence of rational numbers and Prove that f(x) = 1. f is continuous on the irrationals.

More information

Jump Processes. Richard F. Bass

Jump Processes. Richard F. Bass Jump Processes Richard F. Bass ii c Copyright 214 Richard F. Bass Contents 1 Poisson processes 1 1.1 Definitions............................. 1 1.2 Stopping times.......................... 3 1.3 Markov

More information

Lecture 4: Introduction to stochastic processes and stochastic calculus

Lecture 4: Introduction to stochastic processes and stochastic calculus Lecture 4: Introduction to stochastic processes and stochastic calculus Cédric Archambeau Centre for Computational Statistics and Machine Learning Department of Computer Science University College London

More information

P (A G) dp G P (A G)

P (A G) dp G P (A G) First homework assignment. Due at 12:15 on 22 September 2016. Homework 1. We roll two dices. X is the result of one of them and Z the sum of the results. Find E [X Z. Homework 2. Let X be a r.v.. Assume

More information

Stochastic Differential Equations

Stochastic Differential Equations Chapter 5 Stochastic Differential Equations We would like to introduce stochastic ODE s without going first through the machinery of stochastic integrals. 5.1 Itô Integrals and Itô Differential Equations

More information

Stochastic Calculus February 11, / 33

Stochastic Calculus February 11, / 33 Martingale Transform M n martingale with respect to F n, n =, 1, 2,... σ n F n (σ M) n = n 1 i= σ i(m i+1 M i ) is a Martingale E[(σ M) n F n 1 ] n 1 = E[ σ i (M i+1 M i ) F n 1 ] i= n 2 = σ i (M i+1 M

More information

Dynkin (λ-) and π-systems; monotone classes of sets, and of functions with some examples of application (mainly of a probabilistic flavor)

Dynkin (λ-) and π-systems; monotone classes of sets, and of functions with some examples of application (mainly of a probabilistic flavor) Dynkin (λ-) and π-systems; monotone classes of sets, and of functions with some examples of application (mainly of a probabilistic flavor) Matija Vidmar February 7, 2018 1 Dynkin and π-systems Some basic

More information

ELEMENTS OF PROBABILITY THEORY

ELEMENTS OF PROBABILITY THEORY ELEMENTS OF PROBABILITY THEORY Elements of Probability Theory A collection of subsets of a set Ω is called a σ algebra if it contains Ω and is closed under the operations of taking complements and countable

More information

Analysis Comprehensive Exam Questions Fall 2008

Analysis Comprehensive Exam Questions Fall 2008 Analysis Comprehensive xam Questions Fall 28. (a) Let R be measurable with finite Lebesgue measure. Suppose that {f n } n N is a bounded sequence in L 2 () and there exists a function f such that f n (x)

More information

Reflected Brownian Motion

Reflected Brownian Motion Chapter 6 Reflected Brownian Motion Often we encounter Diffusions in regions with boundary. If the process can reach the boundary from the interior in finite time with positive probability we need to decide

More information

The Pedestrian s Guide to Local Time

The Pedestrian s Guide to Local Time The Pedestrian s Guide to Local Time Tomas Björk, Department of Finance, Stockholm School of Economics, Box 651, SE-113 83 Stockholm, SWEDEN tomas.bjork@hhs.se November 19, 213 Preliminary version Comments

More information

The Wiener Itô Chaos Expansion

The Wiener Itô Chaos Expansion 1 The Wiener Itô Chaos Expansion The celebrated Wiener Itô chaos expansion is fundamental in stochastic analysis. In particular, it plays a crucial role in the Malliavin calculus as it is presented in

More information

p 1 ( Y p dp) 1/p ( X p dp) 1 1 p

p 1 ( Y p dp) 1/p ( X p dp) 1 1 p Doob s inequality Let X(t) be a right continuous submartingale with respect to F(t), t 1 P(sup s t X(s) λ) 1 λ {sup s t X(s) λ} X + (t)dp 2 For 1 < p

More information

Brownian Motion and Stochastic Calculus

Brownian Motion and Stochastic Calculus ETHZ, Spring 17 D-MATH Prof Dr Martin Larsson Coordinator A Sepúlveda Brownian Motion and Stochastic Calculus Exercise sheet 6 Please hand in your solutions during exercise class or in your assistant s

More information

Stochastic Analysis I S.Kotani April 2006

Stochastic Analysis I S.Kotani April 2006 Stochastic Analysis I S.Kotani April 6 To describe time evolution of randomly developing phenomena such as motion of particles in random media, variation of stock prices and so on, we have to treat stochastic

More information

Convergence at first and second order of some approximations of stochastic integrals

Convergence at first and second order of some approximations of stochastic integrals Convergence at first and second order of some approximations of stochastic integrals Bérard Bergery Blandine, Vallois Pierre IECN, Nancy-Université, CNRS, INRIA, Boulevard des Aiguillettes B.P. 239 F-5456

More information

Measure-theoretic probability

Measure-theoretic probability Measure-theoretic probability Koltay L. VEGTMAM144B November 28, 2012 (VEGTMAM144B) Measure-theoretic probability November 28, 2012 1 / 27 The probability space De nition The (Ω, A, P) measure space is

More information

In terms of measures: Exercise 1. Existence of a Gaussian process: Theorem 2. Remark 3.

In terms of measures: Exercise 1. Existence of a Gaussian process: Theorem 2. Remark 3. 1. GAUSSIAN PROCESSES A Gaussian process on a set T is a collection of random variables X =(X t ) t T on a common probability space such that for any n 1 and any t 1,...,t n T, the vector (X(t 1 ),...,X(t

More information

A Short Introduction to Diffusion Processes and Ito Calculus

A Short Introduction to Diffusion Processes and Ito Calculus A Short Introduction to Diffusion Processes and Ito Calculus Cédric Archambeau University College, London Center for Computational Statistics and Machine Learning c.archambeau@cs.ucl.ac.uk January 24,

More information

Exercises Measure Theoretic Probability

Exercises Measure Theoretic Probability Exercises Measure Theoretic Probability Chapter 1 1. Prove the folloing statements. (a) The intersection of an arbitrary family of d-systems is again a d- system. (b) The intersection of an arbitrary family

More information

Probability and Measure

Probability and Measure Part II Year 2018 2017 2016 2015 2014 2013 2012 2011 2010 2009 2008 2007 2006 2005 2018 84 Paper 4, Section II 26J Let (X, A) be a measurable space. Let T : X X be a measurable map, and µ a probability

More information

Probability Theory. Richard F. Bass

Probability Theory. Richard F. Bass Probability Theory Richard F. Bass ii c Copyright 2014 Richard F. Bass Contents 1 Basic notions 1 1.1 A few definitions from measure theory............. 1 1.2 Definitions............................. 2

More information

If Y and Y 0 satisfy (1-2), then Y = Y 0 a.s.

If Y and Y 0 satisfy (1-2), then Y = Y 0 a.s. 20 6. CONDITIONAL EXPECTATION Having discussed at length the limit theory for sums of independent random variables we will now move on to deal with dependent random variables. An important tool in this

More information

Lecture 2. We now introduce some fundamental tools in martingale theory, which are useful in controlling the fluctuation of martingales.

Lecture 2. We now introduce some fundamental tools in martingale theory, which are useful in controlling the fluctuation of martingales. Lecture 2 1 Martingales We now introduce some fundamental tools in martingale theory, which are useful in controlling the fluctuation of martingales. 1.1 Doob s inequality We have the following maximal

More information

Representations of Gaussian measures that are equivalent to Wiener measure

Representations of Gaussian measures that are equivalent to Wiener measure Representations of Gaussian measures that are equivalent to Wiener measure Patrick Cheridito Departement für Mathematik, ETHZ, 89 Zürich, Switzerland. E-mail: dito@math.ethz.ch Summary. We summarize results

More information

Some Tools From Stochastic Analysis

Some Tools From Stochastic Analysis W H I T E Some Tools From Stochastic Analysis J. Potthoff Lehrstuhl für Mathematik V Universität Mannheim email: potthoff@math.uni-mannheim.de url: http://ls5.math.uni-mannheim.de To close the file, click

More information

1/12/05: sec 3.1 and my article: How good is the Lebesgue measure?, Math. Intelligencer 11(2) (1989),

1/12/05: sec 3.1 and my article: How good is the Lebesgue measure?, Math. Intelligencer 11(2) (1989), Real Analysis 2, Math 651, Spring 2005 April 26, 2005 1 Real Analysis 2, Math 651, Spring 2005 Krzysztof Chris Ciesielski 1/12/05: sec 3.1 and my article: How good is the Lebesgue measure?, Math. Intelligencer

More information

An Overview of the Martingale Representation Theorem

An Overview of the Martingale Representation Theorem An Overview of the Martingale Representation Theorem Nuno Azevedo CEMAPRE - ISEG - UTL nazevedo@iseg.utl.pt September 3, 21 Nuno Azevedo (CEMAPRE - ISEG - UTL) LXDS Seminar September 3, 21 1 / 25 Background

More information

{σ x >t}p x. (σ x >t)=e at.

{σ x >t}p x. (σ x >t)=e at. 3.11. EXERCISES 121 3.11 Exercises Exercise 3.1 Consider the Ornstein Uhlenbeck process in example 3.1.7(B). Show that the defined process is a Markov process which converges in distribution to an N(0,σ

More information

Lecture 17 Brownian motion as a Markov process

Lecture 17 Brownian motion as a Markov process Lecture 17: Brownian motion as a Markov process 1 of 14 Course: Theory of Probability II Term: Spring 2015 Instructor: Gordan Zitkovic Lecture 17 Brownian motion as a Markov process Brownian motion is

More information

MAT 570 REAL ANALYSIS LECTURE NOTES. Contents. 1. Sets Functions Countability Axiom of choice Equivalence relations 9

MAT 570 REAL ANALYSIS LECTURE NOTES. Contents. 1. Sets Functions Countability Axiom of choice Equivalence relations 9 MAT 570 REAL ANALYSIS LECTURE NOTES PROFESSOR: JOHN QUIGG SEMESTER: FALL 204 Contents. Sets 2 2. Functions 5 3. Countability 7 4. Axiom of choice 8 5. Equivalence relations 9 6. Real numbers 9 7. Extended

More information

Riesz Representation Theorems

Riesz Representation Theorems Chapter 6 Riesz Representation Theorems 6.1 Dual Spaces Definition 6.1.1. Let V and W be vector spaces over R. We let L(V, W ) = {T : V W T is linear}. The space L(V, R) is denoted by V and elements of

More information

1.1 Definition of BM and its finite-dimensional distributions

1.1 Definition of BM and its finite-dimensional distributions 1 Brownian motion Brownian motion as a physical phenomenon was discovered by botanist Robert Brown as he observed a chaotic motion of particles suspended in water. The rigorous mathematical model of BM

More information

Part II Probability and Measure

Part II Probability and Measure Part II Probability and Measure Theorems Based on lectures by J. Miller Notes taken by Dexter Chua Michaelmas 2016 These notes are not endorsed by the lecturers, and I have modified them (often significantly)

More information

16.1. Signal Process Observation Process The Filtering Problem Change of Measure

16.1. Signal Process Observation Process The Filtering Problem Change of Measure 1. Introduction The following notes aim to provide a very informal introduction to Stochastic Calculus, and especially to the Itô integral. They owe a great deal to Dan Crisan s Stochastic Calculus and

More information

Harnack Inequalities and Applications for Stochastic Equations

Harnack Inequalities and Applications for Stochastic Equations p. 1/32 Harnack Inequalities and Applications for Stochastic Equations PhD Thesis Defense Shun-Xiang Ouyang Under the Supervision of Prof. Michael Röckner & Prof. Feng-Yu Wang March 6, 29 p. 2/32 Outline

More information

Stochastic Calculus and Black-Scholes Theory MTH772P Exercises Sheet 1

Stochastic Calculus and Black-Scholes Theory MTH772P Exercises Sheet 1 Stochastic Calculus and Black-Scholes Theory MTH772P Exercises Sheet. For ξ, ξ 2, i.i.d. with P(ξ i = ± = /2 define the discrete-time random walk W =, W n = ξ +... + ξ n. (i Formulate and prove the property

More information

MA50251: Applied SDEs

MA50251: Applied SDEs MA5251: Applied SDEs Alexander Cox February 12, 218 Preliminaries These notes constitute the core theoretical content of the unit MA5251, which is a course on Applied SDEs. Roughly speaking, the aim of

More information

Stochastic calculus without probability: Pathwise integration and functional calculus for functionals of paths with arbitrary Hölder regularity

Stochastic calculus without probability: Pathwise integration and functional calculus for functionals of paths with arbitrary Hölder regularity Stochastic calculus without probability: Pathwise integration and functional calculus for functionals of paths with arbitrary Hölder regularity Rama Cont Joint work with: Anna ANANOVA (Imperial) Nicolas

More information

MATH MEASURE THEORY AND FOURIER ANALYSIS. Contents

MATH MEASURE THEORY AND FOURIER ANALYSIS. Contents MATH 3969 - MEASURE THEORY AND FOURIER ANALYSIS ANDREW TULLOCH Contents 1. Measure Theory 2 1.1. Properties of Measures 3 1.2. Constructing σ-algebras and measures 3 1.3. Properties of the Lebesgue measure

More information

µ X (A) = P ( X 1 (A) )

µ X (A) = P ( X 1 (A) ) 1 STOCHASTIC PROCESSES This appendix provides a very basic introduction to the language of probability theory and stochastic processes. We assume the reader is familiar with the general measure and integration

More information

Stability of Stochastic Differential Equations

Stability of Stochastic Differential Equations Lyapunov stability theory for ODEs s Stability of Stochastic Differential Equations Part 1: Introduction Department of Mathematics and Statistics University of Strathclyde Glasgow, G1 1XH December 2010

More information

Multiple points of the Brownian sheet in critical dimensions

Multiple points of the Brownian sheet in critical dimensions Multiple points of the Brownian sheet in critical dimensions Robert C. Dalang Ecole Polytechnique Fédérale de Lausanne Based on joint work with: Carl Mueller Multiple points of the Brownian sheet in critical

More information

Generalized Gaussian Bridges of Prediction-Invertible Processes

Generalized Gaussian Bridges of Prediction-Invertible Processes Generalized Gaussian Bridges of Prediction-Invertible Processes Tommi Sottinen 1 and Adil Yazigi University of Vaasa, Finland Modern Stochastics: Theory and Applications III September 1, 212, Kyiv, Ukraine

More information

Ergodic Theorems. Samy Tindel. Purdue University. Probability Theory 2 - MA 539. Taken from Probability: Theory and examples by R.

Ergodic Theorems. Samy Tindel. Purdue University. Probability Theory 2 - MA 539. Taken from Probability: Theory and examples by R. Ergodic Theorems Samy Tindel Purdue University Probability Theory 2 - MA 539 Taken from Probability: Theory and examples by R. Durrett Samy T. Ergodic theorems Probability Theory 1 / 92 Outline 1 Definitions

More information

Mathematical Methods for Neurosciences. ENS - Master MVA Paris 6 - Master Maths-Bio ( )

Mathematical Methods for Neurosciences. ENS - Master MVA Paris 6 - Master Maths-Bio ( ) Mathematical Methods for Neurosciences. ENS - Master MVA Paris 6 - Master Maths-Bio (2014-2015) Etienne Tanré - Olivier Faugeras INRIA - Team Tosca October 22nd, 2014 E. Tanré (INRIA - Team Tosca) Mathematical

More information

On the martingales obtained by an extension due to Saisho, Tanemura and Yor of Pitman s theorem

On the martingales obtained by an extension due to Saisho, Tanemura and Yor of Pitman s theorem On the martingales obtained by an extension due to Saisho, Tanemura and Yor of Pitman s theorem Koichiro TAKAOKA Dept of Applied Physics, Tokyo Institute of Technology Abstract M Yor constructed a family

More information

Tools of stochastic calculus

Tools of stochastic calculus slides for the course Interest rate theory, University of Ljubljana, 212-13/I, part III József Gáll University of Debrecen Nov. 212 Jan. 213, Ljubljana Itô integral, summary of main facts Notations, basic

More information

Poisson random measure: motivation

Poisson random measure: motivation : motivation The Lévy measure provides the expected number of jumps by time unit, i.e. in a time interval of the form: [t, t + 1], and of a certain size Example: ν([1, )) is the expected number of jumps

More information

Discretization of SDEs: Euler Methods and Beyond

Discretization of SDEs: Euler Methods and Beyond Discretization of SDEs: Euler Methods and Beyond 09-26-2006 / PRisMa 2006 Workshop Outline Introduction 1 Introduction Motivation Stochastic Differential Equations 2 The Time Discretization of SDEs Monte-Carlo

More information

Random Process Lecture 1. Fundamentals of Probability

Random Process Lecture 1. Fundamentals of Probability Random Process Lecture 1. Fundamentals of Probability Husheng Li Min Kao Department of Electrical Engineering and Computer Science University of Tennessee, Knoxville Spring, 2016 1/43 Outline 2/43 1 Syllabus

More information

LECTURE 2: LOCAL TIME FOR BROWNIAN MOTION

LECTURE 2: LOCAL TIME FOR BROWNIAN MOTION LECTURE 2: LOCAL TIME FOR BROWNIAN MOTION We will define local time for one-dimensional Brownian motion, and deduce some of its properties. We will then use the generalized Ray-Knight theorem proved in

More information

A Fourier analysis based approach of rough integration

A Fourier analysis based approach of rough integration A Fourier analysis based approach of rough integration Massimiliano Gubinelli Peter Imkeller Nicolas Perkowski Université Paris-Dauphine Humboldt-Universität zu Berlin Le Mans, October 7, 215 Conference

More information

Gaussian, Markov and stationary processes

Gaussian, Markov and stationary processes Gaussian, Markov and stationary processes Gonzalo Mateos Dept. of ECE and Goergen Institute for Data Science University of Rochester gmateosb@ece.rochester.edu http://www.ece.rochester.edu/~gmateosb/ November

More information

GAUSSIAN PROCESSES; KOLMOGOROV-CHENTSOV THEOREM

GAUSSIAN PROCESSES; KOLMOGOROV-CHENTSOV THEOREM GAUSSIAN PROCESSES; KOLMOGOROV-CHENTSOV THEOREM STEVEN P. LALLEY 1. GAUSSIAN PROCESSES: DEFINITIONS AND EXAMPLES Definition 1.1. A standard (one-dimensional) Wiener process (also called Brownian motion)

More information

MSH7 - APPLIED PROBABILITY AND STOCHASTIC CALCULUS. Contents

MSH7 - APPLIED PROBABILITY AND STOCHASTIC CALCULUS. Contents MSH7 - APPLIED PROBABILITY AND STOCHASTIC CALCULUS ANDREW TULLOCH Contents 1. Lecture 1 - Tuesday 1 March 2 2. Lecture 2 - Thursday 3 March 2 2.1. Concepts of convergence 2 3. Lecture 3 - Tuesday 8 March

More information

Stochastic Integration and Stochastic Differential Equations: a gentle introduction

Stochastic Integration and Stochastic Differential Equations: a gentle introduction Stochastic Integration and Stochastic Differential Equations: a gentle introduction Oleg Makhnin New Mexico Tech Dept. of Mathematics October 26, 27 Intro: why Stochastic? Brownian Motion/ Wiener process

More information

Fundamental Inequalities, Convergence and the Optional Stopping Theorem for Continuous-Time Martingales

Fundamental Inequalities, Convergence and the Optional Stopping Theorem for Continuous-Time Martingales Fundamental Inequalities, Convergence and the Optional Stopping Theorem for Continuous-Time Martingales Prakash Balachandran Department of Mathematics Duke University April 2, 2008 1 Review of Discrete-Time

More information