Solutions to the Exercises in Stochastic Analysis

Lecturer: Xue-Mei Li

1 Problem Sheet 1

In these solutions I avoid using conditional expectations. But do try to give alternative proofs once we have learnt conditional expectations.

Exercise 1 For $x, y \in \mathbb{R}^d$ define $p_t(x,y) = (2\pi t)^{-\frac d2} e^{-\frac{|x-y|^2}{2t}}$. Prove that $P_t(x,dy) = p_t(x,y)\,dy$ satisfies the Chapman-Kolmogorov equation: for $\Gamma$ a Borel subset of $\mathbb{R}^d$,
\[ P_{t+s}(x,\Gamma) = \int_{\mathbb{R}^d} P_s(y,\Gamma)\,P_t(x,dy). \]

Solution: On one hand,
\[ P_{t+s}(x,\Gamma) = (2\pi(t+s))^{-\frac d2} \int_\Gamma e^{-\frac{|x-z|^2}{2(t+s)}}\,dz. \]
On the other hand,
\[ \int_{\mathbb{R}^d} P_s(y,\Gamma)\,P_t(x,dy) = (2\pi t)^{-\frac d2}(2\pi s)^{-\frac d2} \int_{\mathbb{R}^d}\int_\Gamma e^{-\frac{|y-z|^2}{2s}}\, e^{-\frac{|y-x|^2}{2t}}\,dz\,dy. \]
We now complete the square in $y$:
\[ \frac{|y-z|^2}{2s} + \frac{|y-x|^2}{2t} = \frac{t+s}{2st}\Big|y - \frac{tz+sx}{t+s}\Big|^2 + \frac12\,\frac{|x-z|^2}{t+s}. \]

Stochastic Analysis (2014). Lecturer: Xue-Mei Li. Support Class: Andris Gerasimovics.

Next we change the variable $y$ to $\tilde y = y - \frac{tz+sx}{t+s}$. Then
\[ \int_{\mathbb{R}^d} P_s(y,\Gamma)\,P_t(x,dy) = (2\pi t)^{-\frac d2}(2\pi s)^{-\frac d2} \int_\Gamma\int_{\mathbb{R}^d} e^{-\frac{t+s}{2st}|\tilde y|^2}\, e^{-\frac{|x-z|^2}{2(t+s)}}\,d\tilde y\,dz \]
\[ = (2\pi t)^{-\frac d2}(2\pi s)^{-\frac d2}\Big(\frac{2\pi st}{t+s}\Big)^{\frac d2} \int_\Gamma e^{-\frac{|x-z|^2}{2(t+s)}}\,dz = (2\pi(t+s))^{-\frac d2}\int_\Gamma e^{-\frac{|x-z|^2}{2(t+s)}}\,dz. \]

Exercise 2 Let $(X_t, t\ge0)$ be a Markov process with $X_0 = 0$ and transition function $p_t(x,y)\,dy$, where $p_t(x,y)$ is the heat kernel. Prove the following statements.
(1) For any $s<t$, $X_t - X_s \sim P_{t-s}(0,dy)$.
(2) Prove that $(X_t)$ has independent increments.
(3) For every number $p>0$ there exists a constant $c(p)$ such that $E|X_t - X_s|^p = c(p)|t-s|^{\frac p2}$.
(4) State Kolmogorov's continuity theorem and conclude that for almost surely all $\omega$, $X_t(\omega)$ is locally Hölder continuous with exponent $\alpha$ for any number $\alpha < 1/2$.
(5) Prove that this is a Brownian motion on $\mathbb{R}^d$.

Solution: Let $f$ be a bounded measurable function.
(1) Since $(x_s, x_t) \sim P_s(0,dx)P_{t-s}(x,dy)$,
\[ Ef(x_t - x_s) = \int_{\mathbb{R}^d}\int_{\mathbb{R}^d} f(y-x)\,p_s(0,x)p_{t-s}(x,y)\,dx\,dy = \int_{\mathbb{R}^d}\int_{\mathbb{R}^d} f(z)\,p_s(0,x)p_{t-s}(0,z)\,dx\,dz \]
\[ = \int_{\mathbb{R}^d} f(z)\,p_{t-s}(0,z)\,dz \int_{\mathbb{R}^d} p_s(0,x)\,dx = \int_{\mathbb{R}^d} f(z)\,p_{t-s}(0,z)\,dz, \]
where we changed the variable $z = y-x$ and used $p_{t-s}(x,x+z) = p_{t-s}(0,z)$. Hence $x_t - x_s \sim P_{t-s}(0,dz)$.
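The Chapman-Kolmogorov identity proved in Exercise 1 can also be checked numerically. The sketch below (not part of the original solutions; the sample points $t, s, x, z$ are arbitrary choices) integrates $p_t(x,\cdot)\,p_s(\cdot,z)$ in dimension one with the trapezoidal rule and compares the result with $p_{t+s}(x,z)$.

```python
import math

def p(t, x, y):
    """1-d heat kernel p_t(x, y) = (2*pi*t)^(-1/2) exp(-(x-y)^2 / (2t))."""
    return math.exp(-(x - y) ** 2 / (2 * t)) / math.sqrt(2 * math.pi * t)

t, s, x, z = 0.7, 0.3, 0.2, -0.5
lhs = p(t + s, x, z)

# integrate p_t(x, y) * p_s(y, z) over y with the trapezoidal rule
n, a, b = 20000, -30.0, 30.0
h = (b - a) / n
vals = [p(t, x, a + i * h) * p(s, a + i * h, z) for i in range(n + 1)]
rhs = h * (sum(vals) - 0.5 * (vals[0] + vals[-1]))

print(abs(lhs - rhs))  # difference should be tiny
```

The trapezoidal rule is extremely accurate here because the integrand is smooth and decays like a Gaussian.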

(2) Let us fix $0 = t_0 < t_1 < t_2 < \dots < t_n$ and Borel sets $A_i\in\mathcal B(\mathbb R)$, $i = 1,\dots,n$, and let $f_i(x) = 1_{x\in A_i}$. Then we obtain
\[ P(X_{t_1}\in A_1, X_{t_2}-X_{t_1}\in A_2,\dots, X_{t_n}-X_{t_{n-1}}\in A_n) \]
\[ = \int_{\mathbb R^d}\cdots\int_{\mathbb R^d} f_1(x_1)f_2(x_2-x_1)\cdots f_n(x_n-x_{n-1})\, p_{t_1}(0,x_1)p_{t_2-t_1}(x_1,x_2)\cdots p_{t_n-t_{n-1}}(x_{n-1},x_n)\,dx_n\cdots dx_1, \]
where we have used the formula for the finite dimensional distributions of a Markov process. Introducing new variables $y_1 = x_1,\ y_2 = x_2-x_1,\dots,\ y_n = x_n - x_{n-1}$, and using $p_t(x,y) = p_t(0,y-x)$, we obtain
\[ P(X_{t_1}\in A_1,\dots, X_{t_n}-X_{t_{n-1}}\in A_n) = \int\cdots\int f_1(y_1)\cdots f_n(y_n)\, p_{t_1}(0,y_1)p_{t_2-t_1}(0,y_2)\cdots p_{t_n-t_{n-1}}(0,y_n)\,dy_n\cdots dy_1 = \prod_{i=1}^n \int f_i(y_i)\,p_{t_i-t_{i-1}}(0,y_i)\,dy_i. \]
This means that $\{X_{t_i}-X_{t_{i-1}},\ i = 1,\dots,n\}$ are independent random variables.
(3)
\[ E|x_t - x_s|^p = (2\pi(t-s))^{-\frac d2}\int_{\mathbb R^d} |z|^p\, e^{-\frac{|z|^2}{2(t-s)}}\,dz = |t-s|^{\frac p2}\,(2\pi)^{-\frac d2}\int_{\mathbb R^d} |y|^p\, e^{-\frac{|y|^2}{2}}\,dy, \]
after the change of variable $z = \sqrt{t-s}\,y$. The integral is finite. Let
\[ c(p) = \int_{\mathbb R^d} |y|^p\, p_1(0,y)\,dy, \]
which is the $p$th absolute moment of an $N(0, I_{d\times d})$ distributed variable.
(4) and (5) are straightforward applications of Kolmogorov's theorem.

Exercise 3 If $(B_t)$ is a Brownian motion, prove that $(B_t)$ is a Markov process with transition function $p_t(x,y)\,dy$.

Solution: Let us denote for simplicity $f_i(x) = 1_{x\in A_i}$. Furthermore, we define the random variables $Y_i = X_{t_i} - X_{t_{i-1}}$ for $i = 1,\dots,n$, where we have postulated $t_0 = 0$. From the properties of the Brownian motion, the random variables $\{Y_i\}$ are independent and moreover $Y_i\sim N(0, t_i - t_{i-1})$. Thus we have
\[ P[X_{t_1}\in A_1,\dots,X_{t_n}\in A_n] = E\prod_{i=1}^n f_i(X_{t_i}) = E[f_1(Y_1)f_2(Y_2+Y_1)\cdots f_n(Y_n+\dots+Y_1)] \]
\[ = \int\cdots\int f_1(y_1)f_2(y_2+y_1)\cdots f_n(y_n+\dots+y_1)\, p_{t_1}(0,y_1)p_{t_2-t_1}(0,y_2)\cdots p_{t_n-t_{n-1}}(0,y_n)\,dy_n\cdots dy_1. \]
Now we introduce new variables $x_1 = y_1,\ x_2 = y_2+y_1,\dots,\ x_n = y_n+\dots+y_1$, and obtain that the last integral equals
\[ \int\cdots\int f_1(x_1)f_2(x_2)\cdots f_n(x_n)\, p_{t_1}(0,x_1)p_{t_2-t_1}(0,x_2-x_1)\cdots p_{t_n-t_{n-1}}(0,x_n-x_{n-1})\,dx_n\cdots dx_1. \]
Noticing that $p_t(0,y-x) = p_t(x,y)$ and recalling the definition of the functions $f_i$, the finite dimensional distributions agree with those of the Markov process determined by the heat kernels. The two processes must agree.

Exercise 4 Let $(X_t, t\ge0)$ be a continuous real-valued stochastic process with $X_0 = 0$ and let $p_t(x,y)$ be the heat kernel on $\mathbb R$. Prove that the following statements are equivalent:
(i) $(X_t, t\ge0)$ is a one dimensional Brownian motion.
(ii) For any number $n\in\mathbb N$, any sets $A_i\in\mathcal B(\mathbb R)$, $i = 1,\dots,n$, and any $0 < t_1 < t_2 < \dots < t_n$,
\[ P[X_{t_1}\in A_1,\dots,X_{t_n}\in A_n] = \int_{A_1}\cdots\int_{A_n} p_{t_1}(0,y_1)p_{t_2-t_1}(y_1,y_2)\cdots p_{t_n-t_{n-1}}(y_{n-1},y_n)\,dy_n\cdots dy_1. \]

Solution: This follows from the previous exercises.
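The scaling identity of Exercise 2(3), $E|X_t - X_s|^p = c(p)|t-s|^{p/2}$, can be illustrated numerically in dimension one. The sketch below (not part of the original solutions; the values $\tau = t-s = 0.37$ and $p = 4$ are arbitrary) computes both sides by the trapezoidal rule; for $p = 4$ and $d = 1$ one also knows $c(4) = 3$.

```python
import math

def heat(t, y):
    """1-d heat kernel p_t(0, y)."""
    return math.exp(-y * y / (2 * t)) / math.sqrt(2 * math.pi * t)

def abs_moment(pw, t, n=40000, lim=40.0):
    """Trapezoidal approximation of the integral of |z|^pw * p_t(0, z) dz."""
    h = 2 * lim / n
    total = 0.0
    for i in range(n + 1):
        z = -lim + i * h
        w = 0.5 if i in (0, n) else 1.0
        total += w * abs(z) ** pw * heat(t, z)
    return total * h

tau, pw = 0.37, 4
c_p = abs_moment(pw, 1.0)      # c(p) = E|N(0,1)|^p, equals 3 for p = 4
lhs = abs_moment(pw, tau)      # E|X_t - X_s|^p with t - s = tau
print(c_p, lhs, c_p * tau ** (pw / 2))
```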

Exercise 5 A zero mean Gaussian process $B^H_t$ is a fractional Brownian motion of Hurst parameter $H$, $H\in(0,1)$, if its covariance is
\[ E(B^H_t B^H_s) = \tfrac12\big(t^{2H} + s^{2H} - |t-s|^{2H}\big). \]
Then $E|B^H_t - B^H_s|^p = C_p|t-s|^{pH}$. If $H = 1/2$ this is Brownian motion (otherwise this process is not even a semi-martingale). Show that $(B^H_t)$ has a continuous modification whose sample paths are Hölder continuous of every order $\alpha < H$.

Solution: Since $E|B^H_t - B^H_s|^p = C_p|t-s|^{pH} = C_p|t-s|^{1+(pH-1)}$, we can apply the Kolmogorov continuity criterion to obtain that $B^H_t$ has a modification whose sample paths are Hölder continuous of order $\alpha < (pH-1)/p$. For any $\alpha < H$ we can take $p$ large enough to have $\alpha < (pH-1)/p = H - 1/p$. This finishes the proof.

Exercise 6 Let $(B_t)$ be a Brownian motion on $\mathbb R^d$. Let $T$ be a positive number. For $t\in[0,T]$ set $Y_t = B_t - \frac tT B_T$. Compute the probability distribution of $Y_t$.

Solution:
\[ Ef\Big(B_t - \frac tT B_T\Big) = \int_{\mathbb R^d}\int_{\mathbb R^d} f\Big(x - \frac tT y\Big)\,(2\pi t)^{-\frac d2} e^{-\frac{|x|^2}{2t}}\,(2\pi(T-t))^{-\frac d2} e^{-\frac{|y-x|^2}{2(T-t)}}\,dx\,dy. \]
We observe that
\[ |x|^2 = \Big|x - \frac tT y\Big|^2 + \frac{2t}{T}\langle x,y\rangle - \frac{t^2}{T^2}|y|^2 \]
and that
\[ |x-y|^2 = \Big|x - \frac tT y\Big|^2 - \frac{2(T-t)}{T}\langle x,y\rangle + \frac{T^2 - t^2}{T^2}|y|^2. \]
Thus,
\[ \frac{|x|^2}{t} + \frac{|y-x|^2}{T-t} = \frac1t\Big|x - \frac tT y\Big|^2 + \frac1{T-t}\Big|x - \frac tT y\Big|^2 + \frac1T|y|^2. \]

Finally, changing the variable $z = x - \frac tT y$ (for fixed $y$),
\[ Ef\Big(B_t - \frac tT B_T\Big) = (2\pi t)^{-\frac d2}(2\pi(T-t))^{-\frac d2} \int_{\mathbb R^d}\int_{\mathbb R^d} f(z)\, e^{-\frac1{2t}|z|^2}\, e^{-\frac1{2(T-t)}|z|^2}\, e^{-\frac{|y|^2}{2T}}\,dz\,dy \]
\[ = (2\pi t)^{-\frac d2}(2\pi(T-t))^{-\frac d2}(2\pi T)^{\frac d2} \int_{\mathbb R^d} f(z)\, e^{-\frac{|z|^2}{2t}}\, e^{-\frac{|z|^2}{2(T-t)}}\,dz = \int_{\mathbb R^d} f(z)\,\frac{p_t(0,z)\,p_{T-t}(z,0)}{p_T(0,0)}\,dz. \]
Finally we see
\[ B_t - \frac tT B_T \sim \frac{p_t(0,z)\,p_{T-t}(z,0)}{p_T(0,0)}\,dz, \]
which, since $\frac1{2t} + \frac1{2(T-t)} = \frac{T}{2t(T-t)}$, is the $N\big(0, \frac{t(T-t)}{T} I_{d\times d}\big)$ distribution.
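The final identity of Exercise 6 can be verified pointwise in dimension one. The sketch below (not part of the original solutions; $T = 2$, $t = 0.7$ and the test points $z$ are arbitrary) compares the density $p_t(0,z)\,p_{T-t}(z,0)/p_T(0,0)$ with the $N(0, t(T-t)/T)$ density.

```python
import math

def p(t, x, y):
    """1-d heat kernel."""
    return math.exp(-(x - y) ** 2 / (2 * t)) / math.sqrt(2 * math.pi * t)

T, t = 2.0, 0.7
var = t * (T - t) / T   # claimed variance of Y_t = B_t - (t/T) B_T
diffs = []
for z in (-1.5, -0.2, 0.0, 0.8, 2.3):
    bridge = p(t, 0.0, z) * p(T - t, z, 0.0) / p(T, 0.0, 0.0)
    gauss = math.exp(-z * z / (2 * var)) / math.sqrt(2 * math.pi * var)
    diffs.append(abs(bridge - gauss))
print(max(diffs))  # agreement up to floating point rounding
```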

2 Brownian motion, conditional expectation, and uniform integrability

Exercise 7 Let $(B_t)$ be a standard Brownian motion. Prove that
(a) (i) for any $t>0$, $E[B_t] = 0$ and $E[B_t^2] = t$; (ii) for any $s,t\ge0$, $E[B_sB_t] = s\wedge t$, where $s\wedge t = \min(s,t)$.
(b) (Scaling invariance) For any $a>0$, $\frac1{\sqrt a}B_{at}$ is a Brownian motion.
(c) (Translation invariance) For any $t_0\ge0$, $B_{t_0+t} - B_{t_0}$ is a standard Brownian motion.
(d) If $X_t = B_t - tB_1$, $t\le1$, then $E(X_sX_t) = s(1-t)$ for $s\le t$. Compute the probability distribution of $X_t$. (Hint: break $X_t$ down as the sum of two independent Gaussian random variables, then compute its characteristic function.)

Solution: (a)(i) Since the distribution of $B_t$ is $N(0,t)$, we have $E[B_t] = 0$ and $E[B_t^2] = t$.
(a)(ii) We fix any $t\ge s$. Then we have $E[B_sB_t] = E[(B_t-B_s)B_s] + E[B_s^2]$. Since the Brownian motion has independent increments, the random variables $B_t - B_s$ and $B_s$ are independent and we have $E[(B_t-B_s)B_s] = E[B_t-B_s]\,E[B_s] = 0$. Furthermore, from (i) we know that $E[B_s^2] = s$. Hence we conclude $E[B_sB_t] = s$, which is the required identity.
(b) Let us denote $W_t = \frac1{\sqrt a}B_{at}$. Then $W$ has continuous sample paths and independent increments, which follows from the same properties of $B$. Furthermore, for any $t>s$, we have
\[ W_t - W_s = \frac1{\sqrt a}(B_{at} - B_{as}) \sim N\Big(0, \frac{a(t-s)}{a}\Big) = N(0, t-s), \]
which finishes the proof.

(c) If we denote $W_t = B_{t_0+t} - B_{t_0}$, then $W$ has continuous sample paths and independent increments, which follows from the same properties of $B$. Moreover, for any $t>s$, we have
\[ W_t - W_s = B_{t_0+t} - B_{t_0+s} \sim N(0, (t_0+t) - (t_0+s)) = N(0, t-s), \]
which finishes the proof.
(d) For $t = 0$ or $t = 1$ we have $X_t = 0$. For $0 < s \le t < 1$ we have
\[ E[X_tX_s] = E[(B_t - tB_1)(B_s - sB_1)] = E[B_tB_s] - sE[B_tB_1] - tE[B_1B_s] + stE[B_1^2] = s - st - st + st = s(1-t). \]
Let us take $t\in(0,1)$. Then we have $X_t = (1-t)B_t - t(B_1 - B_t)$. Since the random variables $B_t$ and $B_1 - B_t$ are independent, the distribution of $X_t$ is
\[ N\big(0, (1-t)^2t + t^2(1-t)\big) = N\big(0, t(1-t)\big). \]

Exercise 8 Let $X\in L^1(\Omega,\mathcal F,P)$. Prove that the family of random variables $\{E\{X|\mathcal G\} : \mathcal G\ \text{a sub-}\sigma\text{-algebra of}\ \mathcal F\}$ is $L^1$ bounded, i.e. $\sup_{\mathcal G} E(|E\{X|\mathcal G\}|) < \infty$.

Solution: Let us take any sub-$\sigma$-algebra $\mathcal G\subset\mathcal F$. Then using the conditional Jensen inequality we have
\[ E(|E\{X|\mathcal G\}|) \le E(E\{|X|\,|\,\mathcal G\}) = E|X| < \infty, \]
which proves the claim.

Exercise 9 Let $X\in L^1(\Omega;\mathbb R)$. Prove that the family of functions
\[ \{E\{X|\mathcal G\} : \mathcal G\ \text{is a sub-}\sigma\text{-algebra of}\ \mathcal F\} \]
is uniformly integrable.

Solution: We note that if a family of measurable sets $A_C$ satisfies $\lim_{C\to\infty}P[A_C] = 0$, then $\lim_{C\to\infty}E[1_{A_C}|X|] = 0$ (absolute continuity of the integral, since $X\in L^1$). Let us take any sub-$\sigma$-algebra $\mathcal G\subset\mathcal F$ and consider the family of events
\[ A(C,\mathcal G) = \{\omega : |E[X|\mathcal G](\omega)|\ge C\}. \]
Applying the Markov and Jensen inequalities we obtain
\[ P(A(C,\mathcal G)) \le C^{-1}E(|E[X|\mathcal G]|) \le C^{-1}E[E[|X|\,|\,\mathcal G]] = C^{-1}E|X| \to 0, \]
as $C\to\infty$, uniformly in $\mathcal G$ (since $E|X| < \infty$).

For any $\varepsilon>0$ there exists a $\delta>0$ such that if $P[A]<\delta$, then $E(1_A|X|)<\varepsilon$. For this $\delta$, take $C > E|X|/\delta$; then $P[A(C,\mathcal G)]<\delta$ for any $\mathcal G$, which implies
\[ \sup_{\mathcal G} E[1_{A(C,\mathcal G)}|X|] < \varepsilon. \]
Finally, we conclude (using $A(C,\mathcal G)\in\mathcal G$)
\[ \sup_{\mathcal G} E\big[1_{A(C,\mathcal G)}|E[X|\mathcal G]|\big] \le \sup_{\mathcal G} E\big[1_{A(C,\mathcal G)}E[|X|\,|\,\mathcal G]\big] = \sup_{\mathcal G} E[1_{A(C,\mathcal G)}|X|] < \varepsilon, \]
which proves the claim.

Exercise 10 Let $(\mathcal G_t, t\ge0)$, $(\mathcal F_t, t\ge0)$ be filtrations with the property that $\mathcal G_t\subset\mathcal F_t$ for each $t$. Suppose that $(X_t)$ is adapted to $(\mathcal G_t)$. If $(X_t)$ is an $(\mathcal F_t)$-martingale, prove that $(X_t)$ is a $(\mathcal G_t)$-martingale.

Solution: The fact that $(X_t)$ is $(\mathcal G_t)$-adapted follows from the inclusion $\mathcal G_t\subset\mathcal F_t$ for each $t$. Furthermore, for any $t>s$, using the tower property of the conditional expectation, we obtain
\[ E[X_t|\mathcal G_s] = E[E[X_t|\mathcal F_s]\,|\,\mathcal G_s] = E[X_s|\mathcal G_s] = X_s. \]
This shows that $(X_t)$ is a $(\mathcal G_t)$-martingale.

Exercise 11 (Elementary processes) Let $0 = t_0 < \dots < t_n < t_{n+1}$, let $H_i$ be bounded $\mathcal F_{t_i}$-measurable functions, and
\[ H_t(\omega) = H_0(\omega)1_{\{0\}}(t) + \sum_{i=0}^n H_i(\omega)1_{(t_i,t_{i+1}]}(t). \tag{1} \]
Prove that $H : \mathbb R_+\times\Omega\to\mathbb R$ is Borel measurable. Define the stochastic integral
\[ I_t = \int_0^t H_s\,dM_s = \sum_{i=0}^n H_i\big(M_{t_{i+1}\wedge t} - M_{t_i\wedge t}\big) \]
and prove that
\[ E\Big[\int_0^t H_s\,dB_s\Big] = 0, \qquad E\Big[\Big(\int_0^t H_s\,dB_s\Big)^2\Big] = E\Big[\int_0^t (H_s)^2\,ds\Big]. \tag{2} \]

Solution: First, we will prove that the function (1) is Borel measurable. To this end, we take any Borel set $A\in\mathcal B(\mathbb R)$ and we have to show that the set $\{(t,\omega) : H(t,\omega)\in A\}$ is measurable in the product space $(\mathbb R_+,\mathcal B(\mathbb R_+))\times(\Omega,\mathcal F)$. We can rewrite this set in the following way:
\[ \{(t,\omega) : H(t,\omega)\in A\} = \big(\{0\}\times\{\omega : H_0(\omega)\in A\}\big)\cup\bigcup_{i=0}^n\big((t_i,t_{i+1}]\times\{\omega : H_i(\omega)\in A\}\big). \]
Since the sets $\{0\}$ and $(t_i,t_{i+1}]$ belong to $\mathcal B(\mathbb R_+)$, and $\{\omega : H_i(\omega)\in A\}\in\mathcal F_{t_i}\subset\mathcal F$, the claim now follows from the fact that the product of two measurable sets is measurable in the product space.

Next, we will show the identities (2). Let us denote $I_t = \int_0^t H_s\,dB_s$. Then we have
\[ E[I_t] = \sum_i E\big[H_i(B_{t_{i+1}\wedge t} - B_{t_i\wedge t})\big] = \sum_i E[H_i]\,E\big[B_{t_{i+1}\wedge t} - B_{t_i\wedge t}\big] = 0, \]
where in the second equality we have used the independence of $B_{t_{i+1}\wedge t} - B_{t_i\wedge t}$ from $\mathcal F_{t_i}$, which follows from the properties of the Brownian motion and the fact that $B_{t_{i+1}\wedge t} - B_{t_i\wedge t} = 0$ if $t\le t_i$. In the last equality we have used $E[B_{t_{i+1}\wedge t} - B_{t_i\wedge t}] = 0$.

For the variance of the stochastic integral we have
\[ E[I_t^2] = \sum_i E\big[H_i^2(B_{t_{i+1}\wedge t} - B_{t_i\wedge t})^2\big] + 2\sum_{i<j} E\big[H_iH_j(B_{t_{i+1}\wedge t} - B_{t_i\wedge t})(B_{t_{j+1}\wedge t} - B_{t_j\wedge t})\big]. \]
In the same way as before, using the independence of the increments of the Brownian motion, we obtain that the second sum is $0$. Thus we have
\[ E[I_t^2] = \sum_i E\big[H_i^2(B_{t_{i+1}\wedge t} - B_{t_i\wedge t})^2\big] = \sum_i E[H_i^2]\,E\big[(B_{t_{i+1}\wedge t} - B_{t_i\wedge t})^2\big] = \sum_i E[H_i^2]\,(t_{i+1}\wedge t - t_i\wedge t) = E\Big[\int_0^t (H_s)^2\,ds\Big], \]
where in the second equality we have used the fact that $H_i^2$ and $(B_{t_{i+1}\wedge t} - B_{t_i\wedge t})^2$ are independent.
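The identities (2) can be illustrated by simulation. The sketch below (not part of the original solutions) uses the hypothetical elementary integrand $H_i = \operatorname{sign}(B_{t_i})$, which is $\mathcal F_{t_i}$-measurable with $(H_s)^2 = 1$, so that $E[I_1] = 0$ and $E[I_1^2] = 1$; the grid and sample size are arbitrary choices.

```python
import math
import random

random.seed(1)
times = [0.0, 0.25, 0.5, 0.75, 1.0]
n_paths = 20000
mean_i, mean_i2 = 0.0, 0.0
for _ in range(n_paths):
    b, integral = 0.0, 0.0
    for t0, t1 in zip(times, times[1:]):
        h_i = 1.0 if b >= 0 else -1.0                 # F_{t_i}-measurable sign
        db = random.gauss(0.0, math.sqrt(t1 - t0))    # Brownian increment
        integral += h_i * db
        b += db
    mean_i += integral / n_paths
    mean_i2 += integral ** 2 / n_paths
print(mean_i, mean_i2)  # close to 0 and 1 up to Monte Carlo error
```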

3 Martingales and Conditional Expectations

A given non-specific filtration $\{\mathcal F_t\}$ is used unless otherwise stated.

Exercise 12 Let $\mu$ be a probability measure. If $\{X_n\}$ is u.i. and $X_n\to X$ in measure, prove that $\{X_n\}$ is $L^1$ bounded and $X\in L^1$.

Solution: By the uniform integrability, there exists a number $C$ such that $\sup_n\int_{|X_n|\ge C}|X_n|\,d\mu \le 1$. Then
\[ \int |X_n|\,d\mu \le \int_{|X_n|\ge C}|X_n|\,d\mu + \int_{|X_n|<C}|X_n|\,d\mu \le 1 + C. \]
Taking an almost surely convergent subsequence if necessary, we may and will assume that $X_n\to X$ almost surely. Then by Fatou's lemma
\[ E|X| = E[\liminf_n|X_n|] \le \liminf_n E|X_n| \le \sup_n E|X_n|, \]
which is finite.

Exercise 13 If $X$ is an $L^1$ function, prove that $X_t := E\{X|\mathcal F_t\}$ is an $\mathcal F_t$-martingale.

Solution: By the definition of conditional expectations, each $X_t\in L^1$ and $X_t\in\mathcal F_t$ for each $t$. By the tower property, if $s<t$,
\[ E(X_t|\mathcal F_s) = E(E(X|\mathcal F_t)\,|\,\mathcal F_s) = E(X|\mathcal F_s) = X_s. \]

Exercise 14 (Discrete martingales) If $(M_n)$, $n\in\mathbb N$, is a martingale, its quadratic variation is the unique process $\langle M\rangle_n$ (discrete time Doob-Meyer decomposition theorem) such that $\langle M\rangle_0 = 0$ and $M_n^2 - \langle M\rangle_n$ is a martingale. Let $(X_i)$ be a family of i.i.d. random variables with $EX_i = 0$ and $E(X_i^2) = 1$. Then $S_n = \sum_{i=1}^n X_i$ is a martingale w.r.t. its natural filtration. This is the random walk. Prove that $S_n$ is a martingale and that its quadratic variation is $\langle S\rangle_n = n$.

Solution: Let $\mathcal F_n$ denote the natural filtration of $\{X_n\}$. Then
\[ E\{S_n|\mathcal F_{n-1}\} = E\{S_{n-1}|\mathcal F_{n-1}\} + E\{X_n|\mathcal F_{n-1}\} = S_{n-1} + 0. \]
We used the fact that $X_n$ is independent of $\mathcal F_{n-1}$. Similarly,
\[ E\{S_n^2 - n\,|\,\mathcal F_{n-1}\} = E\{S_{n-1}^2|\mathcal F_{n-1}\} + 2E\{S_{n-1}X_n|\mathcal F_{n-1}\} + E\{X_n^2|\mathcal F_{n-1}\} - n = S_{n-1}^2 + 2S_{n-1}\cdot 0 + 1 - n = S_{n-1}^2 - (n-1). \]
So $S_n^2 - n$ is an $\mathcal F_n$-martingale, and hence $\langle S\rangle_n = n$.
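For the $\pm1$ random walk of Exercise 14, the martingale property of $S_n^2 - n$ can be verified exactly by enumerating all paths. The sketch below (not part of the original solutions; $n = 6$ is an arbitrary small choice) checks $E[S_n^2 - n\,|\,X_1,\dots,X_{n-1}] = S_{n-1}^2 - (n-1)$ on every prefix.

```python
from itertools import product

def check_quadratic_variation(n):
    """Check E[S_n^2 - n | X_1..X_{n-1}] = S_{n-1}^2 - (n-1) for the +-1 walk."""
    for prefix in product([-1, 1], repeat=n - 1):
        s_prev = sum(prefix)
        # average of S_n^2 - n over the two equally likely values of X_n
        cond_exp = sum((s_prev + x) ** 2 - n for x in (-1, 1)) / 2
        if cond_exp != s_prev ** 2 - (n - 1):
            return False
    return True

ok = check_quadratic_variation(6)
print(ok)  # True: the identity holds on every path prefix
```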

Exercise 15 Let $\varphi:\mathbb R^d\to\mathbb R$ be a convex function. Show that
(a) If $(X_t)$ is a sub-martingale and $\varphi$ is increasing s.t. $\varphi(X_t)\in L^1$, then $(\varphi(X_t))$ is a sub-martingale.
(b) If $(X_t)$ is a martingale and $\varphi(X_t)\in L^1$, then $(\varphi(X_t))$ is a sub-martingale.
(c) If $(X_t)$ is an $L^p$ integrable martingale, for a number $p\ge1$, prove that $(|X_t|^p)$ is a sub-martingale.
(d) If $(X_t)$ is a real valued martingale, prove that $(|X_t|)$ is a sub-martingale.

Solution: Since $\varphi$ is convex, it is continuous and hence measurable. This means that in what follows the process $\varphi(X_t)$ is adapted.
(a) For $t>s$, using the conditional Jensen inequality we obtain
\[ E[\varphi(X_t)|\mathcal F_s] \ge \varphi(E[X_t|\mathcal F_s]) \ge \varphi(X_s), \]
where in the last inequality we have used $E[X_t|\mathcal F_s]\ge X_s$ and the fact that $\varphi$ is increasing.
(b) The claim can be shown in the same way as (a), but now $\varphi(E[X_t|\mathcal F_s]) = \varphi(X_s)$.
(c), (d) The claims follow from (b) and the fact that the maps $x\mapsto|x|^p$ and $x\mapsto|x|$ are convex.

Exercise 16 (Limit of martingales) Let $\{(M_n(t), t\in[0,1]), n\in\mathbb N\}$ be a family of martingales. Suppose that for each $t$, $\lim_n M_n(t) = M_t$ almost surely and $\{M_n(t), n\in\mathbb N\}$ is uniformly integrable for each $t$. Prove that $(M_t)$ is a martingale.

Solution: For any $t\in[0,1]$ and any $A\in\mathcal F$ fixed, the family $\{M_n(t)1_A, n\in\mathbb N\}$ is uniformly integrable. Indeed,
\[ \lim_{C\to\infty}\sup_n\int_{\{|M_n(t)1_A|\ge C\}}|M_n(t)1_A|\,dP \le \lim_{C\to\infty}\sup_n\int_{\{|M_n(t)|\ge C\}}|M_n(t)|\,dP = 0, \]
where the inequality follows from $|M_n(t)1_A|\le|M_n(t)|$ and the last equality follows from the fact that $\{M_n(t), n\in\mathbb N\}$ is uniformly integrable. Since $\lim_n M_n(t)1_A = M_t1_A$ almost surely, the uniform integrability implies convergence in $L^1$. Taking $A = \Omega$, we see that $M_t$ is integrable. For any $s < t \le 1$ and any $A\in\mathcal F_s$ we have
\[ E[M(t)1_A] = \lim_n E[M_n(t)1_A] = \lim_n E[M_n(s)1_A] = E[M(s)1_A], \]

where the second equality holds because each $M_n$ is a martingale. This implies that $E[M(t)|\mathcal F_s] = M(s)$, which finishes the proof.

Exercise 17 Let $a>0$ be a real number. For $0 = t_0 < t_1 < \dots < t_n < t_{n+1} < \dots$ and $i = 1,\dots,n,\dots$, let $H_i$ be bounded $\mathcal F_{t_i}$-measurable functions and let $H_0$ be a bounded $\mathcal F_0$-measurable function. Define
\[ H_t(\omega) = H_0(\omega)1_{\{0\}}(t) + \sum_i H_i(\omega)1_{(t_i,t_{i+1}]}(t). \]
Let $(M_t, t\le a)$ be a martingale with $M_0 = 0$, and $(H_t)$ an elementary process. Define the elementary integral
\[ I_t = \int_0^t H_s\,dM_s = \sum_i H_i\big(M_{t_{i+1}\wedge t} - M_{t_i\wedge t}\big). \]
Prove that $(I_t, t\le a)$ is an $\mathcal F_t$-martingale.

Solution: Note: the sum is always a finite sum. Please refer also to Exercise 11. It is clear that $M_{t_{i+1}\wedge t} - M_{t_i\wedge t}\in\mathcal F_t$. There exists an $n$ s.t. $t < t_{n+1}$. Then $M_{t_{i+1}\wedge t} - M_{t_i\wedge t} = 0$ for $i\ge n+1$, so
\[ \int_0^t H_s\,dM_s = \sum_{i=0}^n H_i\big(M_{t_{i+1}\wedge t} - M_{t_i\wedge t}\big), \]
and for $0\le i\le n$, $H_i\in\mathcal F_{t_i}\subset\mathcal F_t$. In particular, $I_t\in\mathcal F_t$ for each $t$ and the process $(I_t)$ is $(\mathcal F_t)$-adapted. The finite number of bounded random variables $\{H_i, i = 0,\dots,n\}$ is bounded by a common constant $C$. Furthermore, since each $M_s$ is integrable,
\[ E\Big|\int_0^t H_s\,dM_s\Big| \le \sum_{i=0}^n E\big(|H_i|\,|M_{t_{i+1}\wedge t} - M_{t_i\wedge t}|\big) \le C\sum_{i=0}^n\big(E|M_{t_{i+1}\wedge t}| + E|M_{t_i\wedge t}|\big) < \infty, \]
proving that for each $t$, $I_t$ is integrable. Finally, we prove the martingale property. First note that if $(M_s, s\ge0)$ is a martingale then
\[ E\{M_t|\mathcal F_s\} = M_{s\wedge t}, \qquad s,t\ge0. \tag{3} \]

Take $t\ge s$ and assume that $s\in[t_k, t_{k+1})$ for some $k\in\{0,\dots,n\}$. Explanation: we only need to consider two cases: (1) $i\le k$, in which case $H_i\in\mathcal F_s$ and we can take $H_i$ out of the conditional expectation; and (2) $i\ge k+1$, in which case $s<t_{k+1}\le t_i$ and we may use the tower property to condition in addition w.r.t. $\mathcal F_{t_i}$ and take $H_i$ out. Then we have
\[ E[I_t|\mathcal F_s] = \sum_{i=0}^n E\big[H_i(M_{t_{i+1}\wedge t} - M_{t_i\wedge t})\,\big|\,\mathcal F_s\big] = \sum_{i=0}^k E\big[H_i(M_{t_{i+1}\wedge t} - M_{t_i\wedge t})\,\big|\,\mathcal F_s\big] + \sum_{i=k+1}^n E\big[H_i(M_{t_{i+1}\wedge t} - M_{t_i\wedge t})\,\big|\,\mathcal F_s\big]. \]
For $i\le k$ we have $t_i\le s$, so the random variable $H_i\in\mathcal F_s$, and using (3),
\[ \sum_{i=0}^k E\big[H_i(M_{t_{i+1}\wedge t} - M_{t_i\wedge t})\,\big|\,\mathcal F_s\big] = \sum_{i=0}^k H_i\big(M_{t_{i+1}\wedge s} - M_{t_i\wedge s}\big) = I_s, \]
the last equality because the increments at time $s$ with $i\ge k+1$ vanish. If $i\ge k+1$ then $s\le t_i$, and we may use the tower property of the conditional expectation:
\[ E\big[H_i(M_{t_{i+1}\wedge t} - M_{t_i\wedge t})\,\big|\,\mathcal F_s\big] = E\Big[H_i\underbrace{E\big[M_{t_{i+1}\wedge t} - M_{t_i\wedge t}\,\big|\,\mathcal F_{t_i}\big]}_{=\,M_{t_i\wedge t}-M_{t_i\wedge t}\,=\,0}\,\Big|\,\mathcal F_s\Big] = 0. \]
Combining all these equalities we conclude $E[I_t|\mathcal F_s] = I_s$, and $(I_t)$ is a martingale.
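The martingale property of the elementary integral can be probed by simulation through the characterisation $E[(I_t - I_s)1_A] = 0$ for $A\in\mathcal F_s$. The sketch below (not part of the original solutions) takes the hypothetical integrand $H_i = \operatorname{sign}(B_{t_i})$, $s = 0.5$, $t = 1$ and $A = \{B_s > 0\}$; all concrete choices are illustrative.

```python
import math
import random

random.seed(3)
times = [0.0, 0.25, 0.5, 0.75, 1.0]
s_index = 2                      # s = times[2] = 0.5
n_paths = 20000
acc = 0.0                        # Monte Carlo estimate of E[(I_t - I_s) 1_A]
for _ in range(n_paths):
    b, integral = 0.0, 0.0
    integral_s, b_s = 0.0, 0.0
    for k, (t0, t1) in enumerate(zip(times, times[1:])):
        h_i = 1.0 if b >= 0 else -1.0
        db = random.gauss(0.0, math.sqrt(t1 - t0))
        integral += h_i * db
        b += db
        if k == s_index - 1:     # just reached time s
            integral_s, b_s = integral, b
    indicator = 1.0 if b_s > 0 else 0.0
    acc += (integral - integral_s) * indicator / n_paths
print(acc)  # close to 0 up to Monte Carlo error
```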

4 Stopping times, progressive measurability

The filtration $(\mathcal F_t)$ satisfies the usual conditions.

Exercise 18 Let $S, T$ be stopping times. Prove that
(1) $S\wedge T$, $S\vee T$ and $aS$, where $a>1$, are stopping times.
(2) If $T$ is a stopping time, then there exists a sequence of stopping times $T_n$ such that each $T_n$ takes only a finite number of values and $T_n$ decreases to $T$.

Solution: (1) For any $t\ge0$ we have
\[ \{S\wedge T\le t\} = \{S\le t\}\cup\{T\le t\}\in\mathcal F_t, \qquad \{S\vee T\le t\} = \{S\le t\}\cap\{T\le t\}\in\mathcal F_t, \]
\[ \{aS\le t\} = \{S\le t/a\}\in\mathcal F_{t/a}\subset\mathcal F_t, \]
which proves the claim (here $t/a\le t$ since $a>1$).
(2) For any $n\in\mathbb N$ we define the stopping time
\[ T_n = \begin{cases} i2^{-n}, & \text{if } (i-1)2^{-n}\le T < i2^{-n} \text{ for some } i\le n2^n,\\ +\infty, & \text{if } T\ge n. \end{cases} \]
These stopping times satisfy the required conditions.

Exercise 19 Prove that $T$ is a stopping time iff $\{T<t\}\in\mathcal F_t$ for any $t>0$.

Solution: If $T$ is a stopping time, then
\[ \{T<t\} = \bigcup_{n\ge1}\{T\le t - \tfrac1n\}\in\mathcal F_t, \]
because $\{T\le t-\frac1n\}\in\mathcal F_{t-1/n}\subset\mathcal F_t$. Conversely, if $\{T<t\}\in\mathcal F_t$ for any $t>0$, then
\[ \{T\le t\} = \bigcap_{n\ge1}\{T<t+\tfrac1n\}\in\mathcal F_{t+} = \mathcal F_t, \]
because the filtration is right-continuous.

Exercise 20 Let $(M_t, t\in I)$ be an $(\mathcal F_t)$-martingale. Let $\tau$ be a bounded stopping time that takes countably many values. Let $Y$ be a bounded $\mathcal F_\tau$-measurable random variable. Let $N_t = Y(M_t - M_{t\wedge\tau})$. Prove that $(N_t)$ is a martingale.
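The dyadic approximation $T_n$ from Exercise 18(2) is easy to compute explicitly. The sketch below (not part of the original solutions; the value $T = 2.718$ is an arbitrary sample) evaluates $T_n$ for a fixed realisation of $T$ and exhibits the claimed behaviour: $T_n > T$, $T_n$ decreasing, $T_n\to T$.

```python
import math

def dyadic_approx(T_value, n):
    """T_n from the solution: the dyadic level i/2^n with (i-1)/2^n <= T < i/2^n,
    when T < n (so i <= n 2^n), and +infinity when T >= n."""
    if T_value >= n:
        return math.inf
    i = math.floor(T_value * 2 ** n) + 1
    return i / 2 ** n

T_value = 2.718
approx = [dyadic_approx(T_value, n) for n in range(1, 12)]
print(approx)
```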

Solution: Since $Y$ is bounded, $N_t$ is an integrable process. Furthermore, it is adapted, since for any $t$ and any Borel set $B$ we have
\[ \{N_t\in B\} = \big(\{Y(M_t - M_{t\wedge\tau})\in B\}\cap\{\tau\le t\}\big)\cup\big(\{0\in B\}\cap\{\tau>t\}\big)\in\mathcal F_t. \]
Let us now take any $s<t$. Then we have
\[ E[N_t|\mathcal F_s] = E[Y(M_t - M_{t\wedge\tau})1_{\tau>s}|\mathcal F_s] + E[Y(M_t - M_{t\wedge\tau})1_{\tau\le s}|\mathcal F_s] = I_1 + I_2. \]
For the first term we have, by the optional stopping theorem,
\[ I_1 = E\big[E[Y(M_t - M_{t\wedge\tau})1_{\tau>s}|\mathcal F_\tau]\,\big|\,\mathcal F_s\big] = E\big[Y\underbrace{E[M_t - M_{t\wedge\tau}|\mathcal F_\tau]}_{=0}1_{\tau>s}\,\big|\,\mathcal F_s\big] = 0. \]
For the second term we can get, again by the optional stopping theorem,
\[ I_2 = E[Y(M_t - M_\tau)1_{\tau\le s}|\mathcal F_s] = Y1_{\tau\le s}E[M_t - M_\tau|\mathcal F_s] = 1_{\tau\le s}Y(M_s - M_{\tau\wedge s}) = N_s, \]
the last equality because on $\{\tau>s\}$ we have $N_s = Y(M_s - M_{s\wedge\tau}) = 0$. This finishes the proof.

Exercise 21 Show that for $s<t$ and $A\in\mathcal F_s$, $\tau = s1_A + t1_{A^c}$ is a stopping time.

Solution: Indeed, for any $r\ge0$ we have
\[ \{\tau\le r\} = \begin{cases} \emptyset, & \text{if } r<s,\\ A, & \text{if } s\le r<t,\\ \Omega, & \text{if } r\ge t, \end{cases} \]
and in each case the set belongs to $\mathcal F_r$.

Exercise 22 Let $0 = t_0 < t_1 < \dots < t_{n+1} < \dots$ with $\lim_n t_n = \infty$. For each $i = 0,1,\dots$, let $H_i$ be a real valued $\mathcal F_{t_i}$-measurable random variable and $H_0$ an $\mathcal F_0$-measurable random variable. For $t\ge0$, we define
\[ X_t(\omega) = H_0(\omega)1_{\{0\}}(t) + \sum_{i=0}^\infty H_i(\omega)1_{(t_i,t_{i+1}]}(t). \]
Prove that $(X_t, t\ge0)$ is progressively measurable.

Solution: To this end, we take any Borel set $A\in\mathcal B(\mathbb R)$ and we have to show that for any $t$ the set $\{(s,\omega) : X_s(\omega)\in A\}$ is measurable in the product space $([0,t],\mathcal B([0,t]))\times(\Omega,\mathcal F_t)$. We can rewrite this set in the following way:
\[ \{(s,\omega)\in[0,t]\times\Omega : X_s(\omega)\in A\} = \big(\{0\}\times\{\omega : H_0(\omega)\in A\}\big)\cup\bigcup_{i:\,t_i\le t}\big((t_i\wedge t, t_{i+1}\wedge t]\times\{\omega : H_i(\omega)\in A\}\big). \]
Since the sets $\{0\}$ and $(t_i\wedge t, t_{i+1}\wedge t]$ belong to $\mathcal B([0,t])$, and $\{\omega : H_i(\omega)\in A\}\in\mathcal F_{t_i}\subset\mathcal F_t$ for $t_i\le t$, the claim follows from the fact that the product of two measurable sets is measurable in the product space.

Exercise 23 Let $s<t\le u<v$ and let $(H_r)$ and $(K_r)$ be two elementary processes. We define $\int_s^t H_r\,dB_r = \int_0^t H_r\,dB_r - \int_0^s H_r\,dB_r$. Prove that
\[ E\Big[\int_s^t H_r\,dB_r\int_u^v K_r\,dB_r\Big] = 0. \]

Solution: Recall that the stochastic process $(\int_0^t H_r\,dB_r, t\ge0)$ is adapted to $\mathcal F_t$ and is a martingale. We use the tower property to obtain the following:
\[ E\Big[\int_s^t H_r\,dB_r\int_u^v K_r\,dB_r\Big] = E\Big[E\Big[\int_s^t H_r\,dB_r\int_u^v K_r\,dB_r\,\Big|\,\mathcal F_u\Big]\Big] = E\Big[\int_s^t H_r\,dB_r\; E\Big[\int_u^v K_r\,dB_r\,\Big|\,\mathcal F_u\Big]\Big] = 0, \]
because the stochastic integral is a martingale, and hence the inner conditional expectation vanishes.

Exercise 24 Let $f:\mathbb R_+\to\mathbb R$ be a differentiable function with $f'\in L^1([0,t])$. Let
\[ \Delta_n : 0 = t_1^{(n)} < t_2^{(n)} < \dots < t_{N_n}^{(n)} = t \]
be a sequence of partitions of $[0,t]$ with mesh $|\Delta_n|\to0$. Prove that
\[ \lim_n \sum_{j=1}^{N_n}\big(f(t_j^{(n)}) - f(t_{j-1}^{(n)})\big)^2 = 0. \]
Hint: $f$ is uniformly continuous on $[0,t]$ and $f(t) - f(s) = \int_s^t f'(r)\,dr$.

Solution: We have a simple estimate
\[ \sum_{j=1}^{N_n}\big(f(t_j^{(n)}) - f(t_{j-1}^{(n)})\big)^2 \le \max_j\big|f(t_j^{(n)}) - f(t_{j-1}^{(n)})\big|\sum_{j=1}^{N_n}\big|f(t_j^{(n)}) - f(t_{j-1}^{(n)})\big|. \]

Firstly, we have $\max_j|f(t_j^{(n)}) - f(t_{j-1}^{(n)})|\to0$ as $n\to\infty$, because of the uniform continuity of $f$ on $[0,t]$. Secondly, we can estimate
\[ \sum_{j=1}^{N_n}\big|f(t_j^{(n)}) - f(t_{j-1}^{(n)})\big| = \sum_{j=1}^{N_n}\Big|\int_{t_{j-1}^{(n)}}^{t_j^{(n)}} f'(r)\,dr\Big| \le \sum_{j=1}^{N_n}\int_{t_{j-1}^{(n)}}^{t_j^{(n)}}|f'(r)|\,dr = \int_0^t|f'(r)|\,dr < \infty, \]
which follows from the properties of $f$. Thus, from these facts the claim follows.
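Exercise 24 says that a $C^1$ function has vanishing quadratic variation, in contrast to Brownian motion. The sketch below (not part of the original solutions; $f = \sin$ on $[0,1]$ with uniform dyadic partitions is an arbitrary example) shows the sum of squared increments shrinking as the partition is refined.

```python
import math

def sq_increments(f, t, n):
    """Sum of squared increments of f over the uniform n-partition of [0, t]."""
    pts = [t * j / n for j in range(n + 1)]
    return sum((f(pts[j]) - f(pts[j - 1])) ** 2 for j in range(1, n + 1))

t = 1.0
vals = [sq_increments(math.sin, t, 2 ** k) for k in range(1, 12)]
print(vals[0], vals[-1])  # the sum roughly halves with each refinement
```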

5 Martingales and Optional Stopping Theorem

Let $(\mathcal F_t)$ be a filtration satisfying the usual assumptions, unless otherwise stated.

Exercise 25 Use the super-martingale convergence theorem to prove the following statement. Let $(X_n)$ be a sub-martingale sequence with $\sup_n E(X_n^+)<\infty$. Then $\lim_n X_n$ exists almost surely.

Solution: The process $Y_n = -X_n$ is a super-martingale. Furthermore, $\sup_n E(Y_n^-) = \sup_n E(X_n^+) < \infty$. Thus, from the super-martingale convergence theorem, the limit $\lim_n Y_n = -\lim_n X_n$ exists almost surely.

Exercise 26 Let $(M_t, t\le t_0)$ be a continuous local martingale with $M_0 = 0$. Prove that $(M_t, t\le t_0)$ is a martingale if $\sup_{t\le t_0}|M_t|\in L^1$.

Solution: Since $M_0 = 0$, there is a sequence of stopping times $T_n\uparrow\infty$ almost surely such that $(M_{t\wedge T_n})$ is a martingale. Hence, for any $t>s$, we have $E[M_{t\wedge T_n}|\mathcal F_s] = M_{s\wedge T_n}\to M_s$ almost surely as $n\to\infty$. Observe that $|M_{t\wedge T_n}|\le\sup_{t\le t_0}|M_t|$; the condition $\sup_{t\le t_0}|M_t|\in L^1$ allows us to apply the (conditional) dominated convergence theorem to obtain $E[M_{t\wedge T_n}|\mathcal F_s]\to E[M_t|\mathcal F_s]$ almost surely as $n\to\infty$. This shows that $E[M_t|\mathcal F_s] = M_s$, which means that $(M_t)$ is a martingale.

Exercise 27 Let $\{B_t\}_{t\ge0}$ be a one dimensional Brownian motion starting at $x$ ($B_0 = x$), and let $a<x<b$. In this question we are going to find the probability of the Brownian motion hitting $b$ before $a$, using the Optional Stopping Theorem (OST). Set
\[ T_a = \inf\{t\ge0 : B_t = a\}, \qquad T_b = \inf\{t\ge0 : B_t = b\}, \qquad T = T_a\wedge T_b. \]
(a) Give an easy argument why $T_a$, $T_b$ and $T$ are all stopping times with respect to the natural filtration of the Brownian motion.
(b) One would like to compute $E[B_T]$ using the OST, but $(B_t, t\ge0)$ is not a uniformly integrable martingale, and applying the OST would require $T$ to be a bounded stopping time. Instead we use a limiting argument.
(b1) For $n\in\mathbb N$, use the OST to prove that $E[B_{T\wedge n}] = x$.

(b2) Conclude that $E[B_T] = x$.
(c) Compute $P(T_a > T_b)$.

Solution: (a) $T_a$ is a stopping time since it is the hitting time by $(B_t)$ of the closed set $\{a\}$, and the same can be said for $T_b$. That $T = T_a\wedge T_b$ is a stopping time follows from Exercise 18.
(b1) Note that both $0$ and $T\wedge n$ are bounded stopping times. Hence by the OST, $E[B_{T\wedge n}|\mathcal F_0] = B_0 = x$ almost surely. Taking expectations of both sides and using the tower law, we get $E[B_{T\wedge n}] = x$.
(b2) It is clear that pointwise $B_{T\wedge n}\to B_T$ as $n\to\infty$. Moreover $B_{T\wedge n}$ is bounded between $a$ and $b$, hence we can use the DCT to conclude:
\[ E[B_T] = \lim_n E[B_{T\wedge n}] = x. \]
(c) Denote $P(T_a>T_b)$ by $p$; then
\[ x = E[B_T] = E\big[B_{T_b}1_{\{T_a>T_b\}} + B_{T_a}1_{\{T_b>T_a\}}\big] = E\big[b1_{\{T_a>T_b\}} + a1_{\{T_b>T_a\}}\big] = bp + a(1-p). \]
From here we deduce that $P(T_a>T_b) = \frac{x-a}{b-a}$.

Exercise 28 If $(M_t, t\in I)$ is a martingale and $S$ and $T$ are two stopping times with $T$ bounded, prove that $M_{S\wedge T} = E\{M_T|\mathcal F_S\}$.

Solution: We can write
\[ E\{M_T|\mathcal F_S\} = E\{1_{T<S}M_T|\mathcal F_S\} + E\{1_{T\ge S}M_T|\mathcal F_S\}. \]
Note that $1_{T<S}M_T$ is $\mathcal F_S$-measurable. Thus, for the first term we have
\[ E\{1_{T<S}M_T|\mathcal F_S\} = 1_{T<S}M_T = 1_{T<S}M_{S\wedge T}. \]
One can see that $\{T\ge S\}\in\mathcal F_{S\wedge T}$. Indeed, since $S\wedge t$ and $T\wedge t$ are $\mathcal F_t$-measurable, for any $t$ one has
\[ \{T\ge S\}\cap\{S\wedge T\le t\} = \{S\wedge t\le T\wedge t\}\cap\{S\le t\}\in\mathcal F_t. \]
Thus, we conclude
\[ E\{1_{T\ge S}M_T|\mathcal F_S\} = E\{1_{T\ge S}M_T|\mathcal F_{S\wedge T}\} = 1_{T\ge S}E\{M_T|\mathcal F_{S\wedge T}\} = 1_{T\ge S}M_{S\wedge T}, \]
where we have used Doob's optional stopping theorem. Combining all these equalities together, we obtain the claim.
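The hitting probability $(x-a)/(b-a)$ of Exercise 27 has an exact discrete analogue for the simple symmetric random walk on the integers: $h(x) = P(\text{hit } b \text{ before } a)$ satisfies $h(a) = 0$, $h(b) = 1$ and $h(x) = \frac12(h(x-1)+h(x+1))$ in between. The sketch below (not part of the original solutions; $a = 0$, $b = 10$, $x = 3$ are arbitrary) solves this boundary value problem by Gauss-Seidel iteration and compares with the linear formula.

```python
a, b, x0 = 0, 10, 3
# boundary value problem: h(a) = 0, h(b) = 1,
# h(x) = (h(x-1) + h(x+1)) / 2 for a < x < b
h = {x: 0.0 for x in range(a, b + 1)}
h[b] = 1.0
for _ in range(20000):                # Gauss-Seidel sweeps; converges quickly
    for x in range(a + 1, b):
        h[x] = 0.5 * (h[x - 1] + h[x + 1])
print(h[x0], (x0 - a) / (b - a))      # both equal 0.3
```

The solution of the discrete Dirichlet problem is the linear function $h(x) = (x-a)/(b-a)$, matching the continuous case.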

Exercise 29 Prove the following:
(1) A positive right continuous local martingale $(M_t, t\ge0)$ with $M_0 = 1$ is a super-martingale.
(2) A positive right continuous local martingale $(M_t, t\ge0)$ is a martingale if $EM_0 < \infty$ and $EM_t = EM_0$ for all $t>0$.

Solution: (1) Since $(M_t)$ is a local martingale, there is a sequence of increasing stopping times $T_n\uparrow\infty$ such that the process $M_{t\wedge T_n}$ is a martingale; in particular, $M_{t\wedge T_n}$ is integrable. For $s<t$, we use Fatou's lemma (in its conditional form) to obtain
\[ E[M_t|\mathcal F_s] = E[\liminf_n M_{t\wedge T_n}|\mathcal F_s] \le \liminf_n E[M_{t\wedge T_n}|\mathcal F_s] = \liminf_n M_{s\wedge T_n} = M_s. \]
(2) Let $S\le T$ be two stopping times, bounded by a constant $K$. Then, from (1) and optional stopping for positive super-martingales, we have
\[ EM_0 \ge EM_S \ge EM_T \ge EM_K = EM_0, \]
which implies $EM_T = EM_S$. We know that $EM_T = EM_S$ for any two bounded stopping times $S\le T$ implies that $(M_t)$ is a martingale, completing the proof.

Exercise 30 Let $(M_t, t\ge0)$ and $(N_t, t\ge0)$ be continuous local martingales with $M_0 = 0$ and $N_0 = 0$.
(1) Let $(A_t)$ and $(A'_t)$ be two continuous stochastic processes of finite variation with initial values $0$ and such that $(M_tN_t - A_t)$ and $(M_tN_t - A'_t)$ are local martingales. Prove that $(A_t)$ and $(A'_t)$ are indistinguishable.
(2) Prove that $\langle M,N\rangle_t$ is symmetric in $(M_t)$ and $(N_t)$ and is bilinear.
(3) Prove that $\langle M,N\rangle_t = \frac14\big(\langle M+N, M+N\rangle_t - \langle M-N, M-N\rangle_t\big)$.
(4) Prove that $\langle M - M_0, N - N_0\rangle_t = \langle M,N\rangle_t$.
(5) Let $T$ be a stopping time; prove that $\langle M^T, N^T\rangle = \langle M,N\rangle^T = \langle M, N^T\rangle$.

Solution: (1) We use the following theorem: if $(M_t, t\le T)$ is a continuous local martingale with $M_0 = 0$ and $(M_t, t\le T)$ has finite total variation, then $M_t = 0$ for any $t\le T$. The process
\[ A_t - A'_t = (M_tN_t - A'_t) - (M_tN_t - A_t) \]
is a continuous local martingale, and at the same time a process of finite total variation. Then $A_t - A'_t = A_0 - A'_0 = 0$.

(2) The symmetry equality $\langle M,N\rangle = \langle N,M\rangle$ comes from that of the product: $MN = NM$. Next, we prove
\[ \langle M^1 + M^2, N\rangle = \langle M^1, N\rangle + \langle M^2, N\rangle. \]
The process
\[ M^1N + M^2N - \langle M^1,N\rangle - \langle M^2,N\rangle \]
is a local martingale. Thus, the claim follows from the uniqueness of the bracket process. Similarly, for $k\in\mathbb R$, $kMN - k\langle M,N\rangle$ is a local martingale and $\langle kM,N\rangle = k\langle M,N\rangle$.
(3) is a consequence of the bilinearity, proved earlier.
(4) Since $(M_tN_0, t\ge0)$ is a local martingale, the bracket process $\langle M, N_0\rangle$ vanishes. By the same reason we have $\langle M_0, N\rangle = \langle M_0, N_0\rangle = 0$. Hence, the claim follows from the bilinearity.
(5) By the definition, $M^TN^T - \langle M^T,N^T\rangle$ and $(MN)^T - \langle M,N\rangle^T$ are local martingales. This implies $\langle M^T,N^T\rangle = \langle M,N\rangle^T$, from the uniqueness of the bracket process. Furthermore, both $M^TN^T - \langle M^T,N^T\rangle$ and $(M - M^T)N^T$ are local martingales, hence their sum $MN^T - \langle M^T,N^T\rangle$ is a local martingale as well. This implies $\langle M,N^T\rangle = \langle M^T,N^T\rangle$, from the uniqueness of the bracket process.

Exercise 31 Let $(M_t, t\in[0,1])$ and $(N_t, t\in[0,1])$ be bounded continuous martingales with $M_0 = N_0 = 0$. If $(M_t)$ and $(N_t)$ are furthermore independent, prove that their quadratic variation process $(\langle M,N\rangle_t)$ vanishes.

Solution: Let us take a partition $0 = t_0 < t_1 < \dots < t_n = 1$. Then, expanding the square and using the independence of $(M_t)$ and $(N_t)$,
\[ E\Big[\Big(\sum_{i=1}^n(M_{t_i}-M_{t_{i-1}})(N_{t_i}-N_{t_{i-1}})\Big)^2\Big] = \sum_{i,j=1}^n E\big[(M_{t_i}-M_{t_{i-1}})(M_{t_j}-M_{t_{j-1}})\big]\,E\big[(N_{t_i}-N_{t_{i-1}})(N_{t_j}-N_{t_{j-1}})\big]. \]
By the martingale property the off-diagonal terms vanish, and $E[(M_{t_i}-M_{t_{i-1}})^2] = E[M_{t_i}^2 - M_{t_{i-1}}^2]$, so the sum equals
\[ \sum_{i=1}^n E[M_{t_i}^2 - M_{t_{i-1}}^2]\,E[N_{t_i}^2 - N_{t_{i-1}}^2] \le \Big(\sum_{i=1}^n E[M_{t_i}^2 - M_{t_{i-1}}^2]\Big)\, E\sup_j|N_{t_j}^2 - N_{t_{j-1}}^2| = E[M_1^2]\; E\sup_j|N_{t_j}^2 - N_{t_{j-1}}^2|. \]
Because $t\mapsto N_t^2$ is uniformly continuous on $[0,1]$, $\sup_j|N_{t_j}^2 - N_{t_{j-1}}^2|\to0$ almost surely as the mesh of the partition goes to $0$, and since $N_t^2$ is bounded, $E\sup_j|N_{t_j}^2 - N_{t_{j-1}}^2|\to0$. Since the bracket process is the limit

of $\sum_{i=1}^n(M_{t_i}-M_{t_{i-1}})(N_{t_i}-N_{t_{i-1}})$ (in probability), this implies $\langle M,N\rangle_t = 0$.

Exercise 32 Prove that (1) for almost surely all $\omega$, the Brownian path $t\mapsto B_t(\omega)$ has infinite total variation on any interval $[a,b]$. (2) For almost surely all $\omega$, $B_t(\omega)$ cannot have a Hölder continuous path of order $\alpha>\frac12$.

Solution: (1) We know that $t$ is the quadratic variation of $(B_t)$. We can choose a sequence of partitions $\{t_j^{(n)}\}_{j=1,\dots,M_n}$ of $[a,b]$ such that for almost surely all $\omega$,
\[ \sum_{j=1}^{M_n}\big(B_{t_j^{(n)}}(\omega) - B_{t_{j-1}^{(n)}}(\omega)\big)^2 \to (b-a), \tag{4} \]
as $n\to\infty$. Also, for every $\omega$,
\[ \sum_{j=1}^{M_n}\big(B_{t_j^{(n)}}(\omega) - B_{t_{j-1}^{(n)}}(\omega)\big)^2 \le \max_j\big|B_{t_j^{(n)}}(\omega) - B_{t_{j-1}^{(n)}}(\omega)\big|\sum_{j=1}^{M_n}\big|B_{t_j^{(n)}}(\omega) - B_{t_{j-1}^{(n)}}(\omega)\big| \le \max_j\big|B_{t_j^{(n)}}(\omega) - B_{t_{j-1}^{(n)}}(\omega)\big|\cdot\mathrm{TV}_{[a,b]}(B(\omega)). \]
If $\mathrm{TV}_{[a,b]}(B(\omega))$ is finite then, since $t\mapsto B_t(\omega)$ is uniformly continuous on $[a,b]$, the maximum tends to $0$ and hence
\[ \sum_{j=1}^{M_n}\big(B_{t_j^{(n)}}(\omega) - B_{t_{j-1}^{(n)}}(\omega)\big)^2 \to 0. \]
But for almost surely all $\omega$ this limit is $b-a > 0$, and hence $\mathrm{TV}_{[a,b]}(B(\omega)) = \infty$ for almost surely all $\omega$.
(2) Suppose that for some $\alpha>\frac12$ and some number $C$, both of which may depend on $\omega$, $|B_t(\omega) - B_s(\omega)|\le C(\omega)|t-s|^\alpha$. Taking uniform partitions $\{t_j^{(n)}\}$ of $[a,b]$ with mesh $(b-a)/n$, for any such $\omega$ we have
\[ \sum_{j=1}^n\big(B_{t_j^{(n)}}(\omega) - B_{t_{j-1}^{(n)}}(\omega)\big)^2 \le C^2\sum_{j=1}^n\big(t_j^{(n)} - t_{j-1}^{(n)}\big)^{2\alpha} = C^2(b-a)^{2\alpha}n^{1-2\alpha} \to 0, \]
which contradicts (4) unless $b = a$. Hence for almost surely all $\omega$, the Brownian path is not Hölder continuous of order $\alpha>1/2$ on any interval $[a,b]$.
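The fact that the sum of squared Brownian increments over $[0,t]$ concentrates at $t$ (used as (4) above) is easy to see by simulation. The sketch below (not part of the original solutions; step count and seed are arbitrary) samples a few discretised paths on $[0,1]$ and prints their realised quadratic variations, each close to $1$.

```python
import math
import random

random.seed(42)

def realised_qv(n_steps, t=1.0):
    """Sum of squared increments of one sampled Brownian path on [0, t]."""
    h = t / n_steps
    return sum(random.gauss(0.0, math.sqrt(h)) ** 2 for _ in range(n_steps))

estimates = [realised_qv(20000) for _ in range(5)]
print(estimates)  # each close to t = 1
```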

6 Local Martingales and Stochastic Integration

If $M\in\mathcal H^2$, $L^2(M)$ denotes the $L^2$ space of progressively measurable processes with the following $L^2$ norm:
\[ \|f\|_{L^2(M)} := \Big(E\int_0^\infty(f_s)^2\,d\langle M,M\rangle_s\Big)^{\frac12} < \infty. \]
If $(M_t)$ is a continuous local martingale, we denote by $L^2_{\mathrm{loc}}(M)$ the space of progressively measurable processes with $\int_0^t(K_s)^2\,d\langle M,M\rangle_s < \infty$ for every $t$.

Exercise 33 Let $(B_t)$ be a standard Brownian motion and let $f_s, g_s\in L^2_{\mathrm{loc}}(B)$. Compute the indicated bracket processes; the final expression should not involve stochastic integration.
1. $\big\langle\int_0^\cdot f_s\,dB_s, \int_0^\cdot g_s\,dB_s\big\rangle_t$.
2. $\big\langle\int_0^\cdot B_s\,dB_s, \int_0^\cdot B_s^3\,dB_s\big\rangle_t$.

Solution: (1) By the defining property of the bracket of stochastic integrals,
\[ \Big\langle\int_0^\cdot f_s\,dB_s, \int_0^\cdot g_s\,dB_s\Big\rangle_t = \int_0^t f_sg_s\,d\langle B\rangle_s = \int_0^t f_sg_s\,ds. \]
(2) By the same rule (the Itô isometry),
\[ \Big\langle\int_0^\cdot B_s\,dB_s, \int_0^\cdot B_s^3\,dB_s\Big\rangle_t = \int_0^t B_s^4\,d\langle B\rangle_s = \int_0^t B_s^4\,ds. \]
We also have the following identities, by the associativity of the stochastic integral:
\[ I_t = \int_0^t(2B_s+1)\,d\Big(\int_0^s B_r\,d(B_r+r)\Big) = \int_0^t(2B_s+1)B_s\,d(B_s+s) = 2\int_0^t B_s^2\,dB_s + \int_0^t B_s\,dB_s + 2\int_0^t B_s^2\,ds + \int_0^t B_s\,ds. \]

Exercise 34 We say that a stochastic process belongs to $C^\alpha$ if its sample paths are locally Hölder continuous of order $\gamma$ for every $\gamma<\alpha$. Let $(H_t, t\le1)$ be an adapted continuous and $L^p$ bounded stochastic process where $p>2$, with $H\in L^2([0,1]\times\Omega)$. Let $(B_t)$ be a one dimensional Brownian motion and set $M_t = \int_0^t H_s\,dB_s$.

(a) Prove that (M_t) belongs to C^{(p−2)/(2p)}. (b) If (H_t) is a bounded process, i.e. sup_{(t,ω)} |H_t(ω)| < ∞, prove that (M_t) belongs to C^{1/2}.

Proof: First note that the bracket process of s ↦ ∫_0^s H_r dB_r is ∫_0^s (H_r)² dr.

(a) Since H is L^p-bounded, there exists C such that sup_{s∈[0,1]} E|H_s|^p < C. By the Burkholder–Davis–Gundy inequality and Hölder's inequality,

E|M_t − M_s|^p ≤ c_p E(∫_s^t (H_r)² dr)^{p/2} ≤ c_p (t − s)^{p/2 − 1} ∫_s^t E|H_r|^p dr ≤ C c_p (t − s)^{p/2}.

By Kolmogorov's continuity criterion, (M_t) has a modification which is Hölder continuous of any order γ < (1/p)(p/2 − 1) = (p − 2)/(2p).

(b) Similarly, for any p > 2,

E|M_t − M_s|^p ≤ c_p E(∫_s^t (H_r)² dr)^{p/2} ≤ c_p (t − s)^{p/2} sup_{r∈[0,1], ω∈Ω} |H_r(ω)|^p,

so (M_t) has a modification which is Hölder continuous of any order γ < (p − 2)/(2p), for every p > 2. Letting p → ∞ gives Hölder continuity of every order γ < 1/2, concluding the second assertion, again by Kolmogorov's continuity theorem.

Exercise 35 Let T > 0. Let f be a left continuous and adapted process such that E∫_0^T (f_s)² ds < ∞. Prove that (∫_0^t f_s dB_s, t ≤ T) is a martingale.

Solution: We know that (∫_0^t f_s dB_s, t ≤ T) is a local martingale. Furthermore, by the Burkholder–Davis–Gundy inequality,

E[sup_{t≤T} (∫_0^t f_s dB_s)²] ≤ c₂ E∫_0^T (f_s)² ds < ∞.

The local martingale is therefore dominated by an integrable random variable, hence of class D, and so a true martingale.

Exercise 36 Let M ∈ H² and H ∈ L²(M). Let τ be a stopping time. Prove that

∫_0^{t∧τ} H_s dM_s = ∫_0^t 1_{s≤τ} H_s dM_s = ∫_0^t H_s dM_s^τ.

Solution: For any N ∈ H², we use Exercise 37 to obtain the following identities:

∫_0^t H_s d⟨M^τ, N⟩_s = ∫_0^t 1_{s≤τ} H_s d⟨M, N⟩_s = ∫_0^{t∧τ} H_s d⟨M, N⟩_s.

The claim now follows from the fact that the stochastic integral I_t = ∫_0^t H_s dM_s^τ is defined uniquely by the identities

⟨I, N⟩_t = ∫_0^t H_s d⟨M^τ, N⟩_s,  t ≥ 0,

for any N ∈ H².

Exercise 37 Let M ∈ H² and K ∈ E (an elementary process). Prove that the elementary integral I_t := ∫_0^t K_s dM_s satisfies

⟨I, N⟩_t = ∫_0^t K_s d⟨M, N⟩_s,  t ≥ 0,

for any N ∈ H².

Solution: See lecture notes.

Exercise 38 Let B_t = (B_t¹, ..., B_t^n) be an n-dimensional Brownian motion. Prove that

|B_t|² = 2 Σ_{i=1}^n ∫_0^t B_s^i dB_s^i + nt.

Solution: Either use |B_t|² = Σ_{i=1}^n (B_t^i)², apply the product formula to each component (B_t^i) and sum up, or use the multi-dimensional version of Itô's formula. Consider the function f: R^n → R, f(x) = |x|². Its derivatives are given by ∂_i f(x) = 2x_i and ∂²_{ij} f(x) = 2δ_{ij}. Applying the Itô formula, and using ⟨B^i, B^j⟩_s = δ_{ij} s, we obtain

|B_t|² = f(B_t) = f(B_0) + Σ_{i=1}^n ∫_0^t ∂_i f(B_s) dB_s^i + ½ Σ_{i,j=1}^n ∫_0^t ∂²_{ij} f(B_s) d⟨B^i, B^j⟩_s = 0 + 2 Σ_{i=1}^n ∫_0^t B_s^i dB_s^i + nt,

which is the required identity.
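Taking expectations in Exercise 38, the stochastic integrals vanish and E|B_t|² = nt. A quick Monte Carlo sanity check of this consequence (illustrative only, not part of the solutions):

```python
import numpy as np

rng = np.random.default_rng(1)
n, t, paths = 3, 1.0, 200_000

# B_t of an n-dimensional Brownian motion is N(0, t * I_n); sample many copies.
B_t = rng.normal(0.0, np.sqrt(t), size=(paths, n))
estimate = np.mean(np.sum(B_t**2, axis=1))

print(estimate)  # should be close to n * t = 3
```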

Exercise 39 Give an expression for ∫_0^t s dB_s that does not involve stochastic integration.

Solution: Applying the classical integration by parts formula, we obtain

∫_0^t s dB_s = tB_t − ∫_0^t B_s ds.

Exercise 40 Write ∫_0^t (2B_s + 1) d(∫_0^s B_r d(B_r + r)) as a function of (B_t), not involving stochastic integrals.

Solution: Firstly, as in Exercise 33(2),

I_t := ∫_0^t (2B_s + 1) d(∫_0^s B_r d(B_r + r)) = ∫_0^t (2B_s + 1) B_s d(B_s + s) = 2∫_0^t B_s² dB_s + ∫_0^t B_s dB_s + ∫_0^t (2B_s² + B_s) ds.

From the Itô formula for the function x³ we obtain

∫_0^t B_s² dB_s = ⅓ B_t³ − ∫_0^t B_s ds,

and the Itô formula for the function x² gives

∫_0^t B_s dB_s = ½ (B_t² − t).

Substituting these stochastic integrals into the formula obtained above, we get

I_t = ⅔ B_t³ + ½ (B_t² − t) + 2∫_0^t B_s² ds − ∫_0^t B_s ds.

Exercise 41 If (H_t) and (K_t) are continuous martingales, prove that

⟨HK, B⟩_t = ∫_0^t H_r d⟨K, B⟩_r + ∫_0^t K_r d⟨H, B⟩_r.

Solution: The product formula gives

H_t K_t = H_0 K_0 + ∫_0^t H_r dK_r + ∫_0^t K_r dH_r + ⟨H, K⟩_t.

Since the bounded variation term ⟨H, K⟩ does not contribute to the bracket, the bracket identity for stochastic integrals gives

⟨HK, B⟩_t = ⟨∫_0^· H_r dK_r + ∫_0^· K_r dH_r, B⟩_t = ∫_0^t H_r d⟨K, B⟩_r + ∫_0^t K_r d⟨H, B⟩_r,

which is the required identity.
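The closed form obtained in Exercise 40 can be sanity-checked by comparing Euler approximations of both sides along one simulated path (illustrative only; the Itô sums use left endpoints):

```python
import numpy as np

rng = np.random.default_rng(7)
steps = 100_000
dt = 1.0 / steps

dB = rng.normal(0.0, np.sqrt(dt), size=steps)
B = np.concatenate(([0.0], np.cumsum(dB)))  # B_0, ..., B_1
B_left, B_1 = B[:-1], B[-1]

# Left-endpoint sum for I_t = int (2B+1) B d(B+s) at t = 1 ...
lhs = np.sum((2.0 * B_left + 1.0) * B_left * (dB + dt))
# ... against the closed form (2/3)B^3 + (1/2)(B^2 - t) + 2 int B^2 ds - int B ds.
rhs = (2.0 / 3.0) * B_1**3 + 0.5 * (B_1**2 - 1.0) \
      + 2.0 * np.sum(B_left**2) * dt - np.sum(B_left) * dt

print(lhs, rhs)  # the two agree up to discretisation error
```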

7 Itô's Formula

Exercise 42 If (N_t) is a continuous local martingale with N_0 = 0, show that (e^{N_t − ½⟨N,N⟩_t}) is a local martingale, and E[e^{N_t − ½⟨N,N⟩_t}] ≤ 1.

Solution: Let us denote X_t = e^{N_t − ½⟨N,N⟩_t}. The process Y_t := N_t − ½⟨N, N⟩_t is a semi-martingale with ⟨Y⟩ = ⟨N⟩, and applying the Itô formula with the function e^x, the d⟨N⟩ terms cancel:

X_t = 1 + ∫_0^t X_s dN_s.

Since the stochastic integral is a local martingale, we conclude that (X_t) is a local martingale. Moreover, let (T_n) be an increasing sequence of stopping times, T_n → ∞, such that X^{T_n} is a uniformly integrable martingale, so that E[X_{t∧T_n}] = E[X_0] = 1. Since X ≥ 0, applying Fatou's lemma we get

E[X_t] = E[lim_n X_{t∧T_n}] ≤ liminf_n E[X_{t∧T_n}] = 1,

which is the required bound.

Exercise 43 Show that a positive continuous local martingale (N_t) with N_0 = 1 can be written in the form N_t = exp(M_t − ½⟨M, M⟩_t), where (M_t) is a continuous local martingale.

Solution: Let us define M_t = ∫_0^t N_s^{−1} dN_s, a continuous local martingale with ⟨M⟩_t = ∫_0^t N_s^{−2} d⟨N⟩_s. Applying the Itô formula with the function log x, we obtain

log N_t = log N_0 + ∫_0^t N_s^{−1} dN_s − ½ ∫_0^t N_s^{−2} d⟨N⟩_s = M_t − ½⟨M⟩_t,

which implies the claim.

Exercise 44 Let (B_t) be a one-dimensional Brownian motion on (Ω, F, F_t, P) and let g: R → R be a bounded Borel measurable function. Find a solution to x_t = x_0 + ∫_0^t x_s g(B_s) dB_s.

Solution: We define M_t = ∫_0^t g(B_s) dB_s. Then dx_t = x_t dM_t, whose solution is the exponential (local) martingale x_t = x_0 e^{M_t − ½⟨M⟩_t}. Indeed,

let f(x) = e^x and y_t = M_t − ½⟨M⟩_t. We see that

e^{y_t} = 1 + ∫_0^t e^{y_s} dy_s + ½ ∫_0^t e^{y_s} d⟨M⟩_s = 1 + ∫_0^t e^{y_s} dM_s,

proving the claim.

Exercise 45 1. Let f, g: R → R be C² functions; prove that ⟨f(B), g(B)⟩_t = ∫_0^t f′(B_s) g′(B_s) ds.
2. Compute ⟨exp(M − ½⟨M⟩), exp(N − ½⟨N⟩)⟩_t, where (M_t) and (N_t) are continuous local martingales.

Solution: (1) By the Itô formula we have

f(B_t) = f(0) + ∫_0^t f′(B_s) dB_s + ½ ∫_0^t f″(B_s) ds,  g(B_t) = g(0) + ∫_0^t g′(B_s) dB_s + ½ ∫_0^t g″(B_s) ds.

Since the bounded variation terms do not contribute to the bracket, we obtain

⟨f(B), g(B)⟩_t = ⟨∫_0^· f′(B_s) dB_s, ∫_0^· g′(B_s) dB_s⟩_t = ∫_0^t f′(B_s) g′(B_s) ds,

which is the required identity.

(2) Let us denote X_t = exp(M_t − ½⟨M⟩_t) and Y_t = exp(N_t − ½⟨N⟩_t). Then, applying the Itô formula with the function e^x as in Exercise 42, we obtain

X_t = exp(M_0) + ∫_0^t X_s dM_s,  Y_t = exp(N_0) + ∫_0^t Y_s dN_s.

Hence, we conclude

⟨X, Y⟩_t = ⟨∫_0^· X_s dM_s, ∫_0^· Y_s dN_s⟩_t = ∫_0^t X_s Y_s d⟨M, N⟩_s.
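For the simplest case N = B a standard Brownian motion, the exponential local martingale of Exercise 42 is a true martingale, so the bound E[X_t] ≤ 1 is an equality. A Monte Carlo illustration (not part of the original solutions):

```python
import numpy as np

rng = np.random.default_rng(2)

# For N_t = B_t, X_t = exp(B_t - t/2); at t = 1, B_1 ~ N(0, 1) and E[X_1] = 1.
B_1 = rng.normal(0.0, 1.0, size=500_000)
X_1 = np.exp(B_1 - 0.5)

print(X_1.mean())  # should be close to 1
```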

Exercise 46 Let B_t = (B_t¹, ..., B_t^n) be an n-dimensional Brownian motion on (Ω, F, F_t, P), with n > 1, and let x_0 ≠ 0. We know that for any t > 0, {x_0 + B_s ≠ 0 for all s ≤ t} has probability one. Prove that the process r_t = |x_0 + B_t| is a solution of the equation

r_t = r_0 + β_t + ∫_0^t (n − 1)/(2r_s) ds,  r_0 = |x_0|,

where (β_t) is a one-dimensional Brownian motion.

Solution: Let us take the function f(x) = |x|, for x ∈ R^n. Its derivatives are given by

∂f/∂x_i (x) = x_i/|x|,  ∂²f/∂x_i² (x) = (|x|² − x_i²)/|x|³,

for x ≠ 0, so f is C² on R^n \ {0} with Σ_i ∂²f/∂x_i² (x) = (n − 1)/|x|. Let τ be the first time that X_t := x_0 + B_t hits 0. Then τ = ∞ almost surely. Applying the Itô formula to f and X^τ, the latter being equal to X almost surely, we obtain

r_t = r_0 + Σ_{i=1}^n ∫_0^t (X_s^i/|X_s|) dB_s^i + ½ ∫_0^t Σ_{i=1}^n (|X_s|² − (X_s^i)²)/|X_s|³ ds = r_0 + β_t + ∫_0^t (n − 1)/(2r_s) ds,

where β_t := Σ_{i=1}^n ∫_0^t (X_s^i/|X_s|) dB_s^i. To finish the proof, we have to show that (β_t) is a Brownian motion. Firstly, it is a continuous local martingale starting at 0. It has quadratic variation

⟨β, β⟩_t = Σ_{i=1}^n ∫_0^t (X_s^i)²/|X_s|² ds = t.

Hence, the claim follows from the Lévy characterisation theorem.

Exercise 47 Let f: R → R be C². Suppose that there is a constant C > 0 such that f ≥ C. Let

h(x) = ∫_0^x 1/f(y) dy.

Suppose that lim_{|x|→∞} |h(x)| = ∞, so that h is a strictly increasing bijection of R; denote by h^{−1} its inverse. Prove that x_t := h^{−1}(h(x_0) + B_t) satisfies

x_t = x_0 + ∫_0^t f(x_s) dB_s + ½ ∫_0^t f(x_s) f′(x_s) ds.

Solution: Let us denote g(x) = h^{−1}(h(x_0) + x). Then its derivatives are given by

g′(x) = f(g(x)),  g″(x) = f′(g(x)) f(g(x)).

The claim now follows by applying the Itô formula to g(B_t).

Exercise 48 Let (B_t) be a Brownian motion and τ its hitting time of the set [2, ∞). Is the stochastic process (1/(B_t − 2), t < τ) a solution to a stochastic differential equation? (The time τ is the natural life time of the process 1/(B_t − 2).)

Solution: Note that τ = inf{t ≥ 0: B_t ∈ [2, ∞)}. Let (τ_n) be a sequence of stopping times increasing to τ. Let f(x) = (x − 2)^{−1}. Where f is differentiable,

f′(x) = −(x − 2)^{−2} = −f²,  f″(x) = 2(x − 2)^{−3} = 2f³.

Applying the Itô formula to f and the stopped process, with X_t := (B_t − 2)^{−1} and X_0 = f(B_0) = −½, we obtain

X_t^{τ_n} = f(B_{t∧τ_n}) = −½ − ∫_0^{t∧τ_n} (X_s)² dB_s + ∫_0^{t∧τ_n} (X_s)³ ds.

We see that on the set {t < τ_n},

X_t = −½ − ∫_0^t X_s² dB_s + ∫_0^t X_s³ ds,

which is the required equation. Since {t < τ} = ∪_n {t < τ_n}, the equation holds on {t < τ}.

Exercise 49 Let (X_t) be a continuous local martingale starting from 0. Prove that ⟨X², X⟩_t = 2∫_0^t X_s d⟨X, X⟩_s.

Solution: Note that X_t = ∫_0^t dX_s and

X_t² = 2∫_0^t X_s dX_s + ⟨X, X⟩_t.

Consequently,

⟨X², X⟩_t = ⟨2∫_0^· X_s dX_s, ∫_0^· dX_s⟩_t = 2∫_0^t X_s d⟨X, X⟩_s.

8 Lévy's Characterisation Theorem, SDEs

Exercise 50 Let σ_k = (σ_k¹, ..., σ_k^d), k = 1, ..., m, and b = (b¹, ..., b^d) be C² functions from R^d to R^d. Let

L = ½ Σ_{i,j=1}^d (Σ_{k=1}^m σ_k^i σ_k^j) ∂²/∂x_i ∂x_j + Σ_{l=1}^d b^l ∂/∂x_l.

Suppose that for some c > 0,

|σ(x)|² ≤ c(1 + |x|²),  ⟨b(x), x⟩ ≤ c(1 + |x|²).

(a) Let f(x) = 1 + |x|². Prove that Lf ≤ af for some constant a ≥ 0.
(b) If x_t = (x_t¹, ..., x_t^d) is a stochastic process with values in R^d satisfying the relations

x_t^i = x_0^i + Σ_{k=1}^m ∫_0^t σ_k^i(x_s) dB_s^k + ∫_0^t b^i(x_s) ds,

and f: R^d → R is C², prove that f(x_t) − f(x_0) − ∫_0^t Lf(x_s) ds is a local martingale, and give the semi-martingale decomposition of f(x_t).

Solution: (a) The partial derivatives of f are given by ∂f/∂x_i (x) = 2x_i and ∂²f/∂x_i ∂x_j (x) = 2δ_{ij}. Hence, we obtain the following bound:

Lf(x) = Σ_{k=1}^m Σ_{i=1}^d (σ_k^i)² + 2 Σ_{i=1}^d b^i x_i = |σ(x)|² + 2⟨b(x), x⟩ ≤ cf(x) + 2cf(x) = 3cf(x),

which is the required bound, with a = 3c.

(b) Firstly, ⟨B^i, B^j⟩_s = δ_{ij} s. Hence,

⟨x^i, x^j⟩_t = ⟨Σ_{k=1}^m ∫_0^· σ_k^i(x_s) dB_s^k, Σ_{l=1}^m ∫_0^· σ_l^j(x_s) dB_s^l⟩_t = Σ_{k=1}^m ∫_0^t σ_k^i(x_s) σ_k^j(x_s) ds.

By the Itô formula,

f(x_t) = f(x_0) + Σ_{i=1}^d ∫_0^t ∂f/∂x_i (x_s) dx_s^i + ½ Σ_{i,j=1}^d ∫_0^t ∂²f/∂x_i ∂x_j (x_s) d⟨x^i, x^j⟩_s
= f(x_0) + Σ_{k=1}^m Σ_{i=1}^d ∫_0^t ∂f/∂x_i (x_s) σ_k^i(x_s) dB_s^k + ∫_0^t (½ Σ_{i,j=1}^d Σ_{k=1}^m σ_k^i(x_s) σ_k^j(x_s) ∂²f/∂x_i ∂x_j (x_s) + Σ_{i=1}^d b^i(x_s) ∂f/∂x_i (x_s)) ds.

Thus f(x_t) − f(x_0) − ∫_0^t Lf(x_s) ds = Σ_{k=1}^m ∫_0^t Σ_{i=1}^d ∂f/∂x_i (x_s) σ_k^i(x_s) dB_s^k is a local martingale, and the semi-martingale decomposition of f(x_t) is as given in the identity above.

Exercise 51 Let T be a bounded stopping time and (B_t) a one-dimensional Brownian motion. Prove that (B_{T+s} − B_T, s ≥ 0) is a Brownian motion.

Solution: Let us denote W_s = B_{T+s} − B_T. It is obvious that W_0 = 0 and that W has continuous sample paths. Let G_t = F_{T+t}. Moreover, by the optional stopping theorem we have, for any s < t,

E[W_t | G_s] = E[B_{T+t} | F_{T+s}] − B_T = B_{T+s} − B_T = W_s,

so (W_t) is a (G_t)-martingale. Next, we take s < t and derive

E[W_t² | G_s] = E[B_{T+t}² + B_T² − 2B_T B_{T+t} | F_{T+s}]
= E[B_{T+t}² − (T + t) | F_{T+s}] + (T + t) + B_T² − 2B_T B_{T+s}
= (B_{T+s}² − (T + s)) + (T + t) + B_T² − 2B_T B_{T+s} = W_s² + t − s,

where we have used the fact that (B_{T+t}² − (T + t)) is a (G_t)-martingale, again by optional stopping. This implies that (W_t² − t) is a (G_t)-martingale, and hence ⟨W, W⟩_t = t. Using the Lévy characterisation theorem, we conclude that (W_t) is a (G_t)-Brownian motion.

Exercise 52 Let {B_t, W_t¹, W_t²} be independent one-dimensional Brownian motions.

1. Let (x_s) be an adapted continuous stochastic process. Prove that (W_t) defined below is a Brownian motion:

W_t = ∫_0^t cos(x_s) dW_s¹ + ∫_0^t sin(x_s) dW_s².

2. Let sgn(x) = 1 if x > 0 and sgn(x) = −1 if x ≤ 0. Prove that (W_t) is a Brownian motion if it satisfies the relation

W_t = ∫_0^t sgn(W_s) dB_s.

3. Prove that the process (X_t, Y_t) is a (two-dimensional) Brownian motion if they satisfy the relations

X_t = ∫_0^t 1_{X_s>Y_s} dW_s¹ + ∫_0^t 1_{X_s≤Y_s} dW_s²,
Y_t = ∫_0^t 1_{X_s≤Y_s} dW_s¹ + ∫_0^t 1_{X_s>Y_s} dW_s².

Solution: (1) The process (W_t) is a continuous martingale with quadratic variation

⟨W⟩_t = ∫_0^t ((cos(x_s))² + (sin(x_s))²) ds = t.

Hence, by the Lévy characterisation theorem, we conclude that (W_t) is a Brownian motion.

(2) Can be shown in the same way: W is a continuous martingale with ⟨W⟩_t = ∫_0^t (sgn(W_s))² ds = t.

(3) The processes (X_t) and (Y_t) are continuous martingales, and their quadratic variations are

⟨X⟩_t = ∫_0^t (1²_{X_s>Y_s} + 1²_{X_s≤Y_s}) ds = t,  ⟨Y⟩_t = ∫_0^t (1²_{X_s≤Y_s} + 1²_{X_s>Y_s}) ds = t.

The bracket process is equal to

⟨X, Y⟩_t = ∫_0^t 1_{X_s>Y_s} 1_{X_s≤Y_s} ds + ∫_0^t 1_{X_s≤Y_s} 1_{X_s>Y_s} ds = 0,

where we have used the fact that 1_{X_s>Y_s} 1_{X_s≤Y_s} = 0. The claim now follows from the (two-dimensional) Lévy characterisation theorem.
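Part (2) of Exercise 52 (a Tanaka-type equation) can be illustrated numerically: building W recursively from its own past via W_{i+1} = W_i + sgn(W_i) ΔB_i makes each increment conditionally N(0, Δt), so the marginal at t = 1 is exactly standard Gaussian. A sketch (illustrative only):

```python
import numpy as np

rng = np.random.default_rng(3)
paths, steps = 100_000, 200
dt = 1.0 / steps

# Solve W_{i+1} = W_i + sgn(W_i) * dB_i recursively, with sgn(0) = -1 as in the exercise.
dB = rng.normal(0.0, np.sqrt(dt), size=(paths, steps))
W = np.zeros(paths)
for i in range(steps):
    W += np.where(W > 0, 1.0, -1.0) * dB[:, i]

# At t = 1 the law of W should be N(0, 1).
print(W.mean(), W.var())
```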

Exercise 53 Let t ∈ [0, 1). Define x_0 = 0 and

x_t = (1 − t) ∫_0^t dB_s/(1 − s).

(1) Prove that

E[(∫_0^t dB_s/(1 − s))²] = t/(1 − t).

(2) Prove that

⟨∫_0^· dB_s/(1 − s)⟩_t = t/(1 − t).

(3) Define A(t) = t/(1 − t), so that A^{−1}(r) = r/(r + 1). Define

W_r = ∫_0^{A^{−1}(r)} dB_s/(1 − s),  0 ≤ r < ∞,

and let G_r = F_{A^{−1}(r)}. Prove that (W_r) is a (G_r)-martingale.

(4) For a standard one-dimensional Brownian motion (B_t), lim_{r→0} r B_{1/r} = 0. Use this to prove that lim_{t→1} x_t = 0.

(5) Prove that (x_t) solves

x_t = B_t − ∫_0^t x_s/(1 − s) ds.

(6) Prove that ∫_0^· x_s/(1 − s) ds is of finite total variation on [0, 1]. Hint: it is sufficient to prove that

∫_0^1 |x_s|/(1 − s) ds < ∞

almost surely.

Solution: (1) By the Itô isometry we obtain

E[(∫_0^t dB_s/(1 − s))²] = ∫_0^t ds/(1 − s)² = 1/(1 − t) − 1 = t/(1 − t).

(2) We have

⟨∫_0^· dB_s/(1 − s)⟩_t = ∫_0^t d⟨B⟩_s/(1 − s)² = t/(1 − t).

(3) It is obvious that (W_r) is integrable and adapted to (G_r). Moreover, for any t > s, we have

E[W_t | G_s] = E[∫_0^{t/(t+1)} dB_r/(1 − r) | F_{s/(s+1)}] = ∫_0^{s/(s+1)} dB_r/(1 − r) = W_s,

because s/(s + 1) < t/(t + 1) and the Itô integral ∫_0^· dB_r/(1 − r) is a martingale with respect to (F_t). This shows that (W_r) is a martingale with respect to (G_r).

(4) By (2), we can calculate ⟨W⟩_t = A^{−1}(t)/(1 − A^{−1}(t)) = t. By the Lévy characterisation theorem, we conclude that (W_t) is a Brownian motion. Then lim_{r→0} r W_{1/r} = 0, equivalently lim_{r→∞} W_r/r = 0, and since x_{r/(r+1)} = W_r/(r + 1), this implies

lim_{t→1} x_t = lim_{r→∞} x_{r/(r+1)} = lim_{r→∞} W_r/(r + 1) = 0.

(5) Since the function 1/(1 − s) has finite total variation on [0, c], for every c < 1, the integrals below can, after integration by parts, be defined in the Stieltjes sense, and we can use classical analysis to derive the following equalities (note that W_{A(s)} = ∫_0^s dB_r/(1 − r) and s/(1 − s) = 1/(1 − s) − 1):

∫_0^t x_s/(1 − s) ds = ∫_0^t W_{A(s)} ds = t W_{A(t)} − ∫_0^t s dW_{A(s)} = t W_{A(t)} − ∫_0^t (s/(1 − s)) dB_s
= t W_{A(t)} − ∫_0^t dB_s/(1 − s) + B_t = −(1 − t) W_{A(t)} + B_t = −x_t + B_t,

which is the claimed equality, for any t ≤ c. Passing c → 1, we conclude that the equality holds for any t ∈ [0, 1).

(6) Since x_s is defined as an integral of a non-random process with respect to the Brownian motion, it has the normal distribution, whose parameters are easily seen to be 0 and s(1 − s) (the latter follows from (1)), so E|x_s| = √(2s(1 − s)/π). Thus, we apply the Fubini–Tonelli theorem to derive

E ∫_0^1 |x_s|/(1 − s) ds = ∫_0^1 √(2s(1 − s)/π)/(1 − s) ds ≤ C ∫_0^1 ds/√(1 − s) < ∞,

for some constant C > 0. This implies that ∫_0^1 |x_s|/(1 − s) ds < ∞ almost surely. Defining the process Y_t = ∫_0^t x_s/(1 − s) ds and taking any partition 0 = t_0 < t_1 < ... < t_n = 1,

we derive

Σ_{i=0}^{n−1} |Y_{t_{i+1}} − Y_{t_i}| = Σ_{i=0}^{n−1} |∫_{t_i}^{t_{i+1}} x_s/(1 − s) ds| ≤ Σ_{i=0}^{n−1} ∫_{t_i}^{t_{i+1}} |x_s|/(1 − s) ds = ∫_0^1 |x_s|/(1 − s) ds < ∞,

from which the claim follows.
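The process of Exercise 53 is a Brownian bridge: x_t is Gaussian with mean 0 and variance t(1 − t), and x_t → 0 as t → 1. A quick numerical check of the variance at t = 1/2 (illustrative only, using a left-endpoint discretisation of the Itô integral):

```python
import numpy as np

rng = np.random.default_rng(4)
paths, steps, t = 200_000, 1_000, 0.5

# x_t = (1 - t) * int_0^t dB_s / (1 - s), discretised with left endpoints on [0, t].
s = np.linspace(0.0, t, steps, endpoint=False)
dB = rng.normal(0.0, np.sqrt(t / steps), size=(paths, steps))
x_t = (1.0 - t) * (dB / (1.0 - s)).sum(axis=1)

print(x_t.var())  # should be close to t * (1 - t) = 0.25
```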

Problem Sheet 9: SDEs and Girsanov

Exercise 54 Let us consider an SDE dx_t = Σ_{k=1}^m σ_k(x_t) dB_t^k + σ_0(x_t) dt on R^d with infinitesimal generator L. Let D be a bounded open subset of R^d with closure D̄ and C² boundary ∂D. Let x_0 ∈ D. Suppose that there is a global solution to the SDE with initial value x_0, and denote by τ its first exit time from D. Suppose that τ < ∞ almost surely. Let g: D̄ → R be a continuous function, and u: D̄ → R a solution to the Dirichlet problem Lu = −g on D with u = 0 on the boundary. Prove that u(x_0) = E∫_0^τ g(x_s) ds. [Hint: Use a sequence of increasing stopping times τ_n with limit τ.]

Solution: Let (D_n) be an increasing sequence of open sets with D̄_n ⊂ D_{n+1} and ∪_n D_n = D, and let τ_n be the first exit time of x_t from D_n. By Itô's formula,

u(x_{t∧τ_n}) = u(x_0) + ∫_0^{t∧τ_n} Lu(x_s) ds + Σ_{k=1}^m ∫_0^{t∧τ_n} du(σ_k(x_s)) dB_s^k
= u(x_0) − ∫_0^{t∧τ_n} g(x_s) ds + Σ_{k=1}^m ∫_0^{t∧τ_n} du(σ_k(x_s)) dB_s^k.

Since u and g are continuous on D̄_n, the stochastic integral is a martingale, so

E[u(x_{t∧τ_n})] = u(x_0) − E∫_0^{t∧τ_n} g(x_s) ds.

Since τ_n < τ < ∞, lim_{t→∞} (t ∧ τ_n) = τ_n a.s. We take t → ∞ followed by n → ∞, and use the dominated convergence theorem to take the limits inside the expectations; since x_{τ_n} → x_τ ∈ ∂D and u = 0 on ∂D, we see that

lim_n lim_t E[u(x_{t∧τ_n})] = E[u(x_τ)] = 0  and  lim_n lim_t E∫_0^{t∧τ_n} g(x_s) ds = E∫_0^τ g(x_s) ds.

This completes the proof.

Exercise 55 A sample continuous Markov process on R² is a Brownian motion on the hyperbolic space (the upper half plane model) if its infinitesimal generator is ½y²(∂²/∂x² + ∂²/∂y²). Prove that the solution to the following equation is a hyperbolic Brownian motion:

dx_t = y_t dB_t¹,  dy_t = y_t dB_t².

[Hint: Show that if y_0 > 0 then y_t is positive, and hence the SDE can be considered to be defined on the upper half plane. Compute its infinitesimal generator L.]

Solution: Just observe that y_t = y_0 e^{B_t² − t/2}, so y_t > 0 if y_0 > 0. Compute the generator by Itô's formula: ⟨x⟩_t = ⟨y⟩_t = ∫_0^t y_s² ds and ⟨x, y⟩_t = 0, so the generator is ½y²(∂²/∂x² + ∂²/∂y²).

Exercise 56 Discuss the uniqueness and existence problem for the SDE

dx_t = sin(x_t) dB_t¹ + cos(x_t) dB_t².

Solution: The functions sin x and cos x are C¹ with bounded derivative, and hence Lipschitz continuous. For each initial value there is a unique global strong solution.

Exercise 57 Let (x_t) and (y_t) be solutions to the following respective SDEs (in Stratonovich form):

x_t = x_0 − ∫_0^t y_s ∘ dB_s,  y_t = y_0 + ∫_0^t x_s ∘ dB_s.

Show that x_t² + y_t² is independent of t.

Solution: Let us rewrite the SDEs in the Itô form:

x_t = x_0 − ∫_0^t y_s dB_s − ½⟨y, B⟩_t,  y_t = y_0 + ∫_0^t x_s dB_s + ½⟨x, B⟩_t.

We can calculate the bracket processes in these expressions:

⟨y, B⟩_t = ⟨∫_0^· x_s dB_s, B⟩_t = ∫_0^t x_s ds,  ⟨x, B⟩_t = −∫_0^t y_s ds.

Moreover, the quadratic variations of (x_t) and (y_t) are

⟨x⟩_t = ∫_0^t y_s² ds,  ⟨y⟩_t = ∫_0^t x_s² ds.

Applying now the Itô formula to the function f(x, y) = x² + y², we obtain

f(x_t, y_t) = f(x_0, y_0) + 2∫_0^t x_s dx_s + 2∫_0^t y_s dy_s + ⟨x⟩_t + ⟨y⟩_t
= f(x_0, y_0) − 2∫_0^t x_s y_s dB_s + 2∫_0^t x_s y_s dB_s − ∫_0^t x_s² ds − ∫_0^t y_s² ds + ⟨x⟩_t + ⟨y⟩_t = f(x_0, y_0).
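The conservation of x² + y² in Exercise 57 (the exact solution is a rotation of (x_0, y_0) by the random angle B_t) can be checked numerically from the Itô form dx = −y dB − ½x dt, dy = x dB − ½y dt. A sketch (illustrative only; the Euler scheme only conserves the radius up to discretisation noise):

```python
import numpy as np

rng = np.random.default_rng(6)
steps = 100_000
dt = 1.0 / steps

# Euler scheme for the Ito form of the Stratonovich rotation SDE, started at (1, 0).
x, y = 1.0, 0.0
for dB in rng.normal(0.0, np.sqrt(dt), size=steps):
    x, y = x - y * dB - 0.5 * x * dt, y + x * dB - 0.5 * y * dt

print(x**2 + y**2)  # should be close to 1, the conserved value
```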


More information

Stability of Stochastic Differential Equations

Stability of Stochastic Differential Equations Lyapunov stability theory for ODEs s Stability of Stochastic Differential Equations Part 1: Introduction Department of Mathematics and Statistics University of Strathclyde Glasgow, G1 1XH December 2010

More information

STOCHASTIC CALCULUS JASON MILLER AND VITTORIA SILVESTRI

STOCHASTIC CALCULUS JASON MILLER AND VITTORIA SILVESTRI STOCHASTIC CALCULUS JASON MILLER AND VITTORIA SILVESTRI Contents Preface 1 1. Introduction 1 2. Preliminaries 4 3. Local martingales 1 4. The stochastic integral 16 5. Stochastic calculus 36 6. Applications

More information

2 (Bonus). Let A X consist of points (x, y) such that either x or y is a rational number. Is A measurable? What is its Lebesgue measure?

2 (Bonus). Let A X consist of points (x, y) such that either x or y is a rational number. Is A measurable? What is its Lebesgue measure? MA 645-4A (Real Analysis), Dr. Chernov Homework assignment 1 (Due 9/5). Prove that every countable set A is measurable and µ(a) = 0. 2 (Bonus). Let A consist of points (x, y) such that either x or y is

More information

Hardy-Stein identity and Square functions

Hardy-Stein identity and Square functions Hardy-Stein identity and Square functions Daesung Kim (joint work with Rodrigo Bañuelos) Department of Mathematics Purdue University March 28, 217 Daesung Kim (Purdue) Hardy-Stein identity UIUC 217 1 /

More information

Part III Stochastic Calculus and Applications

Part III Stochastic Calculus and Applications Part III Stochastic Calculus and Applications Based on lectures by R. Bauerschmidt Notes taken by Dexter Chua Lent 218 These notes are not endorsed by the lecturers, and I have modified them often significantly

More information

FE 5204 Stochastic Differential Equations

FE 5204 Stochastic Differential Equations Instructor: Jim Zhu e-mail:zhu@wmich.edu http://homepages.wmich.edu/ zhu/ January 20, 2009 Preliminaries for dealing with continuous random processes. Brownian motions. Our main reference for this lecture

More information

A Concise Course on Stochastic Partial Differential Equations

A Concise Course on Stochastic Partial Differential Equations A Concise Course on Stochastic Partial Differential Equations Michael Röckner Reference: C. Prevot, M. Röckner: Springer LN in Math. 1905, Berlin (2007) And see the references therein for the original

More information

Markov processes Course note 2. Martingale problems, recurrence properties of discrete time chains.

Markov processes Course note 2. Martingale problems, recurrence properties of discrete time chains. Institute for Applied Mathematics WS17/18 Massimiliano Gubinelli Markov processes Course note 2. Martingale problems, recurrence properties of discrete time chains. [version 1, 2017.11.1] We introduce

More information

Topics in fractional Brownian motion

Topics in fractional Brownian motion Topics in fractional Brownian motion Esko Valkeila Spring School, Jena 25.3. 2011 We plan to discuss the following items during these lectures: Fractional Brownian motion and its properties. Topics in

More information

(B(t i+1 ) B(t i )) 2

(B(t i+1 ) B(t i )) 2 ltcc5.tex Week 5 29 October 213 Ch. V. ITÔ (STOCHASTIC) CALCULUS. WEAK CONVERGENCE. 1. Quadratic Variation. A partition π n of [, t] is a finite set of points t ni such that = t n < t n1

More information

P (A G) dp G P (A G)

P (A G) dp G P (A G) First homework assignment. Due at 12:15 on 22 September 2016. Homework 1. We roll two dices. X is the result of one of them and Z the sum of the results. Find E [X Z. Homework 2. Let X be a r.v.. Assume

More information

Lecture 22 Girsanov s Theorem

Lecture 22 Girsanov s Theorem Lecture 22: Girsanov s Theorem of 8 Course: Theory of Probability II Term: Spring 25 Instructor: Gordan Zitkovic Lecture 22 Girsanov s Theorem An example Consider a finite Gaussian random walk X n = n

More information

Lecture 17 Brownian motion as a Markov process

Lecture 17 Brownian motion as a Markov process Lecture 17: Brownian motion as a Markov process 1 of 14 Course: Theory of Probability II Term: Spring 2015 Instructor: Gordan Zitkovic Lecture 17 Brownian motion as a Markov process Brownian motion is

More information

{σ x >t}p x. (σ x >t)=e at.

{σ x >t}p x. (σ x >t)=e at. 3.11. EXERCISES 121 3.11 Exercises Exercise 3.1 Consider the Ornstein Uhlenbeck process in example 3.1.7(B). Show that the defined process is a Markov process which converges in distribution to an N(0,σ

More information

Lecture 19 L 2 -Stochastic integration

Lecture 19 L 2 -Stochastic integration Lecture 19: L 2 -Stochastic integration 1 of 12 Course: Theory of Probability II Term: Spring 215 Instructor: Gordan Zitkovic Lecture 19 L 2 -Stochastic integration The stochastic integral for processes

More information

An essay on the general theory of stochastic processes

An essay on the general theory of stochastic processes Probability Surveys Vol. 3 (26) 345 412 ISSN: 1549-5787 DOI: 1.1214/1549578614 An essay on the general theory of stochastic processes Ashkan Nikeghbali ETHZ Departement Mathematik, Rämistrasse 11, HG G16

More information

Reflected Brownian Motion

Reflected Brownian Motion Chapter 6 Reflected Brownian Motion Often we encounter Diffusions in regions with boundary. If the process can reach the boundary from the interior in finite time with positive probability we need to decide

More information

GAUSSIAN PROCESSES; KOLMOGOROV-CHENTSOV THEOREM

GAUSSIAN PROCESSES; KOLMOGOROV-CHENTSOV THEOREM GAUSSIAN PROCESSES; KOLMOGOROV-CHENTSOV THEOREM STEVEN P. LALLEY 1. GAUSSIAN PROCESSES: DEFINITIONS AND EXAMPLES Definition 1.1. A standard (one-dimensional) Wiener process (also called Brownian motion)

More information

IEOR 4701: Stochastic Models in Financial Engineering. Summer 2007, Professor Whitt. SOLUTIONS to Homework Assignment 9: Brownian motion

IEOR 4701: Stochastic Models in Financial Engineering. Summer 2007, Professor Whitt. SOLUTIONS to Homework Assignment 9: Brownian motion IEOR 471: Stochastic Models in Financial Engineering Summer 27, Professor Whitt SOLUTIONS to Homework Assignment 9: Brownian motion In Ross, read Sections 1.1-1.3 and 1.6. (The total required reading there

More information

Math 735: Stochastic Analysis

Math 735: Stochastic Analysis First Prev Next Go To Go Back Full Screen Close Quit 1 Math 735: Stochastic Analysis 1. Introduction and review 2. Notions of convergence 3. Continuous time stochastic processes 4. Information and conditional

More information

Stochastic Differential Equations.

Stochastic Differential Equations. Chapter 3 Stochastic Differential Equations. 3.1 Existence and Uniqueness. One of the ways of constructing a Diffusion process is to solve the stochastic differential equation dx(t) = σ(t, x(t)) dβ(t)

More information

1/12/05: sec 3.1 and my article: How good is the Lebesgue measure?, Math. Intelligencer 11(2) (1989),

1/12/05: sec 3.1 and my article: How good is the Lebesgue measure?, Math. Intelligencer 11(2) (1989), Real Analysis 2, Math 651, Spring 2005 April 26, 2005 1 Real Analysis 2, Math 651, Spring 2005 Krzysztof Chris Ciesielski 1/12/05: sec 3.1 and my article: How good is the Lebesgue measure?, Math. Intelligencer

More information

Part II Probability and Measure

Part II Probability and Measure Part II Probability and Measure Theorems Based on lectures by J. Miller Notes taken by Dexter Chua Michaelmas 2016 These notes are not endorsed by the lecturers, and I have modified them (often significantly)

More information

Lecture 2. We now introduce some fundamental tools in martingale theory, which are useful in controlling the fluctuation of martingales.

Lecture 2. We now introduce some fundamental tools in martingale theory, which are useful in controlling the fluctuation of martingales. Lecture 2 1 Martingales We now introduce some fundamental tools in martingale theory, which are useful in controlling the fluctuation of martingales. 1.1 Doob s inequality We have the following maximal

More information

Brownian Motion. Chapter Definition of Brownian motion

Brownian Motion. Chapter Definition of Brownian motion Chapter 5 Brownian Motion Brownian motion originated as a model proposed by Robert Brown in 1828 for the phenomenon of continual swarming motion of pollen grains suspended in water. In 1900, Bachelier

More information

Random Process Lecture 1. Fundamentals of Probability

Random Process Lecture 1. Fundamentals of Probability Random Process Lecture 1. Fundamentals of Probability Husheng Li Min Kao Department of Electrical Engineering and Computer Science University of Tennessee, Knoxville Spring, 2016 1/43 Outline 2/43 1 Syllabus

More information

1.1 Definition of BM and its finite-dimensional distributions

1.1 Definition of BM and its finite-dimensional distributions 1 Brownian motion Brownian motion as a physical phenomenon was discovered by botanist Robert Brown as he observed a chaotic motion of particles suspended in water. The rigorous mathematical model of BM

More information

Stochastic Calculus February 11, / 33

Stochastic Calculus February 11, / 33 Martingale Transform M n martingale with respect to F n, n =, 1, 2,... σ n F n (σ M) n = n 1 i= σ i(m i+1 M i ) is a Martingale E[(σ M) n F n 1 ] n 1 = E[ σ i (M i+1 M i ) F n 1 ] i= n 2 = σ i (M i+1 M

More information

Feller Processes and Semigroups

Feller Processes and Semigroups Stat25B: Probability Theory (Spring 23) Lecture: 27 Feller Processes and Semigroups Lecturer: Rui Dong Scribe: Rui Dong ruidong@stat.berkeley.edu For convenience, we can have a look at the list of materials

More information

Lecture 22: Variance and Covariance

Lecture 22: Variance and Covariance EE5110 : Probability Foundations for Electrical Engineers July-November 2015 Lecture 22: Variance and Covariance Lecturer: Dr. Krishna Jagannathan Scribes: R.Ravi Kiran In this lecture we will introduce

More information

Stochastic Calculus (Lecture #3)

Stochastic Calculus (Lecture #3) Stochastic Calculus (Lecture #3) Siegfried Hörmann Université libre de Bruxelles (ULB) Spring 2014 Outline of the course 1. Stochastic processes in continuous time. 2. Brownian motion. 3. Itô integral:

More information

Bernardo D Auria Stochastic Processes /10. Notes. Abril 13 th, 2010

Bernardo D Auria Stochastic Processes /10. Notes. Abril 13 th, 2010 1 Stochastic Calculus Notes Abril 13 th, 1 As we have seen in previous lessons, the stochastic integral with respect to the Brownian motion shows a behavior different from the classical Riemann-Stieltjes

More information

Introduction to stochastic analysis

Introduction to stochastic analysis Introduction to stochastic analysis A. Guionnet 1 2 Department of Mathematics, MIT, 77 Massachusetts Avenue, Cambridge, MA 2139-437, USA. Abstract These lectures notes are notes in progress designed for

More information

Mathematical Methods for Neurosciences. ENS - Master MVA Paris 6 - Master Maths-Bio ( )

Mathematical Methods for Neurosciences. ENS - Master MVA Paris 6 - Master Maths-Bio ( ) Mathematical Methods for Neurosciences. ENS - Master MVA Paris 6 - Master Maths-Bio (2014-2015) Etienne Tanré - Olivier Faugeras INRIA - Team Tosca November 26th, 2014 E. Tanré (INRIA - Team Tosca) Mathematical

More information

MASSACHUSETTS INSTITUTE OF TECHNOLOGY 6.265/15.070J Fall 2013 Lecture 9 10/2/2013. Conditional expectations, filtration and martingales

MASSACHUSETTS INSTITUTE OF TECHNOLOGY 6.265/15.070J Fall 2013 Lecture 9 10/2/2013. Conditional expectations, filtration and martingales MASSACHUSETTS INSTITUTE OF TECHNOLOGY 6.265/15.070J Fall 2013 Lecture 9 10/2/2013 Conditional expectations, filtration and martingales Content. 1. Conditional expectations 2. Martingales, sub-martingales

More information

Exponential martingales: uniform integrability results and applications to point processes

Exponential martingales: uniform integrability results and applications to point processes Exponential martingales: uniform integrability results and applications to point processes Alexander Sokol Department of Mathematical Sciences, University of Copenhagen 26 September, 2012 1 / 39 Agenda

More information

An Overview of the Martingale Representation Theorem

An Overview of the Martingale Representation Theorem An Overview of the Martingale Representation Theorem Nuno Azevedo CEMAPRE - ISEG - UTL nazevedo@iseg.utl.pt September 3, 21 Nuno Azevedo (CEMAPRE - ISEG - UTL) LXDS Seminar September 3, 21 1 / 25 Background

More information

4th Preparation Sheet - Solutions

4th Preparation Sheet - Solutions Prof. Dr. Rainer Dahlhaus Probability Theory Summer term 017 4th Preparation Sheet - Solutions Remark: Throughout the exercise sheet we use the two equivalent definitions of separability of a metric space

More information

1 Independent increments

1 Independent increments Tel Aviv University, 2008 Brownian motion 1 1 Independent increments 1a Three convolution semigroups........... 1 1b Independent increments.............. 2 1c Continuous time................... 3 1d Bad

More information

Homework #6 : final examination Due on March 22nd : individual work

Homework #6 : final examination Due on March 22nd : individual work Université de ennes Année 28-29 Master 2ème Mathématiques Modèles stochastiques continus ou à sauts Homework #6 : final examination Due on March 22nd : individual work Exercise Warm-up : behaviour of characteristic

More information

Stochastic Integration.

Stochastic Integration. Chapter Stochastic Integration..1 Brownian Motion as a Martingale P is the Wiener measure on (Ω, B) where Ω = C, T B is the Borel σ-field on Ω. In addition we denote by B t the σ-field generated by x(s)

More information

Useful Probability Theorems

Useful Probability Theorems Useful Probability Theorems Shiu-Tang Li Finished: March 23, 2013 Last updated: November 2, 2013 1 Convergence in distribution Theorem 1.1. TFAE: (i) µ n µ, µ n, µ are probability measures. (ii) F n (x)

More information

Notes 18 : Optional Sampling Theorem

Notes 18 : Optional Sampling Theorem Notes 18 : Optional Sampling Theorem Math 733-734: Theory of Probability Lecturer: Sebastien Roch References: [Wil91, Chapter 14], [Dur10, Section 5.7]. Recall: DEF 18.1 (Uniform Integrability) A collection

More information

The Pedestrian s Guide to Local Time

The Pedestrian s Guide to Local Time The Pedestrian s Guide to Local Time Tomas Björk, Department of Finance, Stockholm School of Economics, Box 651, SE-113 83 Stockholm, SWEDEN tomas.bjork@hhs.se November 19, 213 Preliminary version Comments

More information

Product measure and Fubini s theorem

Product measure and Fubini s theorem Chapter 7 Product measure and Fubini s theorem This is based on [Billingsley, Section 18]. 1. Product spaces Suppose (Ω 1, F 1 ) and (Ω 2, F 2 ) are two probability spaces. In a product space Ω = Ω 1 Ω

More information

EULER MARUYAMA APPROXIMATION FOR SDES WITH JUMPS AND NON-LIPSCHITZ COEFFICIENTS

EULER MARUYAMA APPROXIMATION FOR SDES WITH JUMPS AND NON-LIPSCHITZ COEFFICIENTS Qiao, H. Osaka J. Math. 51 (14), 47 66 EULER MARUYAMA APPROXIMATION FOR SDES WITH JUMPS AND NON-LIPSCHITZ COEFFICIENTS HUIJIE QIAO (Received May 6, 11, revised May 1, 1) Abstract In this paper we show

More information

A Change of Variable Formula with Local Time-Space for Bounded Variation Lévy Processes with Application to Solving the American Put Option Problem 1

A Change of Variable Formula with Local Time-Space for Bounded Variation Lévy Processes with Application to Solving the American Put Option Problem 1 Chapter 3 A Change of Variable Formula with Local Time-Space for Bounded Variation Lévy Processes with Application to Solving the American Put Option Problem 1 Abstract We establish a change of variable

More information

Stochastic Processes III/ Wahrscheinlichkeitstheorie IV. Lecture Notes

Stochastic Processes III/ Wahrscheinlichkeitstheorie IV. Lecture Notes BMS Advanced Course Stochastic Processes III/ Wahrscheinlichkeitstheorie IV Michael Scheutzow Lecture Notes Technische Universität Berlin Wintersemester 218/19 preliminary version November 28th 218 Contents

More information

Stochastic Calculus. Alan Bain

Stochastic Calculus. Alan Bain Stochastic Calculus Alan Bain 1. Introduction The following notes aim to provide a very informal introduction to Stochastic Calculus, and especially to the Itô integral and some of its applications. They

More information

1 Brownian Local Time

1 Brownian Local Time 1 Brownian Local Time We first begin by defining the space and variables for Brownian local time. Let W t be a standard 1-D Wiener process. We know that for the set, {t : W t = } P (µ{t : W t = } = ) =

More information

Survival Analysis: Counting Process and Martingale. Lu Tian and Richard Olshen Stanford University

Survival Analysis: Counting Process and Martingale. Lu Tian and Richard Olshen Stanford University Survival Analysis: Counting Process and Martingale Lu Tian and Richard Olshen Stanford University 1 Lebesgue-Stieltjes Integrals G( ) is a right-continuous step function having jumps at x 1, x 2,.. b f(x)dg(x)

More information

Doléans measures. Appendix C. C.1 Introduction

Doléans measures. Appendix C. C.1 Introduction Appendix C Doléans measures C.1 Introduction Once again all random processes will live on a fixed probability space (Ω, F, P equipped with a filtration {F t : 0 t 1}. We should probably assume the filtration

More information

Letting p shows that {B t } t 0. Definition 0.5. For λ R let δ λ : A (V ) A (V ) be defined by. 1 = g (symmetric), and. 3. g

Letting p shows that {B t } t 0. Definition 0.5. For λ R let δ λ : A (V ) A (V ) be defined by. 1 = g (symmetric), and. 3. g 4 Contents.1 Lie group p variation results Suppose G, d) is a group equipped with a left invariant metric, i.e. Let a := d e, a), then d ca, cb) = d a, b) for all a, b, c G. d a, b) = d e, a 1 b ) = a

More information