Solutions to the Exercises in Stochastic Analysis


Lecturer: Xue-Mei Li

1 Problem Sheet 1

In these solutions I avoid using conditional expectations, but do try to give alternative proofs once we have learnt conditional expectations.

Exercise 1. For $x, y \in \mathbb{R}^d$ define $p_t(x,y) = (2\pi t)^{-d/2} e^{-|x-y|^2/(2t)}$. Prove that $P_t(x, dy) = p_t(x,y)\,dy$ satisfies the Chapman-Kolmogorov equation: for $\Gamma$ a Borel subset of $\mathbb{R}^d$,

$$P_{t+s}(x, \Gamma) = \int_{\mathbb{R}^d} P_s(y, \Gamma)\, P_t(x, dy).$$

Solution: On one hand,

$$P_{t+s}(x, \Gamma) = (2\pi(t+s))^{-d/2} \int_\Gamma e^{-\frac{|x-z|^2}{2(t+s)}}\, dz.$$

On the other hand,

$$\int_{\mathbb{R}^d} P_s(y, \Gamma)\, P_t(x, dy) = (2\pi t)^{-d/2} (2\pi s)^{-d/2} \int_\Gamma \int_{\mathbb{R}^d} e^{-\frac{|y-z|^2}{2s}}\, e^{-\frac{|y-x|^2}{2t}}\, dy\, dz.$$

We now complete the square in $y$:

$$\frac{|y-z|^2}{2s} + \frac{|y-x|^2}{2t} = \frac{t+s}{2st}\Big|y - \frac{tz+sx}{t+s}\Big|^2 + \frac{1}{2}\,\frac{|x-z|^2}{t+s}.$$
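The Chapman-Kolmogorov identity above can be sanity-checked numerically in one dimension: the convolution of two heat kernels should reproduce the heat kernel at the summed time. The following sketch assumes NumPy; the values of $t$, $s$, $x$, $z$ and the integration grid are illustrative choices, not from the text.

```python
import numpy as np

# Numerical check of Chapman-Kolmogorov in d = 1:
# p_{t+s}(x, z) should equal the convolution  int p_t(x, y) p_s(y, z) dy.

def p(t, x, y):
    """Heat kernel p_t(x, y) on the real line."""
    return np.exp(-(x - y) ** 2 / (2 * t)) / np.sqrt(2 * np.pi * t)

t, s, x, z = 0.7, 1.3, 0.2, -0.5
y = np.linspace(-20.0, 20.0, 200_001)                 # wide, fine grid
dy = y[1] - y[0]
conv = float(np.sum(p(t, x, y) * p(s, y, z)) * dy)    # int p_t(x,y) p_s(y,z) dy
direct = float(p(t + s, x, z))                        # p_{t+s}(x, z)
print(conv, direct)                                    # the two values agree
```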

Stochastic Analysis (2014). Lecturer: Xue-Mei Li. Support class: Andris Gerasimovics.

Next we change the variable from $y$ to $\tilde y = y - \frac{tz+sx}{t+s}$; then

$$\int_{\mathbb{R}^d} P_s(y,\Gamma)\,P_t(x,dy) = (2\pi t)^{-d/2}(2\pi s)^{-d/2} \int_\Gamma \int_{\mathbb{R}^d} e^{-\frac{t+s}{2st}|\tilde y|^2}\, e^{-\frac{|x-z|^2}{2(t+s)}}\, d\tilde y\, dz = (2\pi(t+s))^{-d/2} \int_\Gamma e^{-\frac{|x-z|^2}{2(t+s)}}\, dz,$$

using $\int_{\mathbb{R}^d} e^{-\frac{t+s}{2st}|\tilde y|^2}\, d\tilde y = \big(\frac{2\pi st}{t+s}\big)^{d/2}$. This equals $P_{t+s}(x,\Gamma)$, as required.

Exercise 2. Let $(X_t, t \ge 0)$ be a Markov process with $X_0 = 0$ and transition function $p_t(x,y)\,dy$, where $p_t(x,y)$ is the heat kernel. Prove the following statements.

(1) For any $s < t$, $X_t - X_s \sim P_{t-s}(0, dy)$.

(2) $(X_t)$ has independent increments.

(3) For every number $p > 0$ there exists a constant $c(p)$ such that $E|X_t - X_s|^p = c(p)|t-s|^{p/2}$.

(4) State Kolmogorov's continuity theorem and conclude that for almost all $\omega$, $X_t(\omega)$ is locally Hölder continuous with exponent $\alpha$ for any number $\alpha < 1/2$.

(5) Prove that this is a Brownian motion on $\mathbb{R}^d$.

Solution: Let $f$ be a bounded measurable function.

(1) Since $(X_s, X_t) \sim P_s(0,dx)\,P_{t-s}(x,dy)$,

$$E f(X_t - X_s) = \int_{\mathbb{R}^d}\int_{\mathbb{R}^d} f(y-x)\, p_s(0,x)\, p_{t-s}(x,y)\, dx\, dy = \int_{\mathbb{R}^d}\int_{\mathbb{R}^d} f(z)\, p_s(0,x)\, p_{t-s}(0,z)\, dx\, dz$$
$$= \int_{\mathbb{R}^d} f(z)\, p_{t-s}(0,z)\, dz \int_{\mathbb{R}^d} p_s(0,x)\, dx = \int_{\mathbb{R}^d} f(z)\, p_{t-s}(0,z)\, dz,$$

where we substituted $z = y - x$ and used $p_{t-s}(x, x+z) = p_{t-s}(0,z)$. Hence $X_t - X_s \sim P_{t-s}(0, dz)$.

(2) Fix $0 = t_0 < t_1 < t_2 < \dots < t_n$ and Borel sets $A_i \in \mathcal{B}(\mathbb{R}^d)$, $i = 1, \dots, n$, and let $f_i(x) = 1_{x \in A_i}$. Then

$$P(X_{t_1} \in A_1, \dots, X_{t_n} - X_{t_{n-1}} \in A_n) = \int \cdots \int f_1(x_1)\, f_2(x_2 - x_1) \cdots f_n(x_n - x_{n-1})\; p_{t_1}(0,x_1)\, p_{t_2-t_1}(x_1,x_2) \cdots p_{t_n-t_{n-1}}(x_{n-1},x_n)\, dx_n \cdots dx_1,$$

where in the last line we have used the identity (ii) of Exercise 4. Introducing new variables $y_1 = x_1$, $y_2 = x_2 - x_1$, ..., $y_n = x_n - x_{n-1}$, we obtain

$$P(X_{t_1} \in A_1, \dots, X_{t_n} - X_{t_{n-1}} \in A_n) = \int\cdots\int f_1(y_1) f_2(y_2)\cdots f_n(y_n)\; p_{t_1}(0,y_1)\, p_{t_2-t_1}(0,y_2)\cdots p_{t_n-t_{n-1}}(0,y_n)\, dy_n\cdots dy_1 = \prod_{i=1}^n \int f_i(y_i)\, p_{t_i - t_{i-1}}(0,y_i)\, dy_i.$$

This means that the increments $\{X_{t_i} - X_{t_{i-1}},\, i = 1, \dots, n\}$ are independent random variables.

(3) After the change of variable $z = \sqrt{t-s}\, y$,

$$E|X_t - X_s|^p = (2\pi(t-s))^{-d/2} \int_{\mathbb{R}^d} |z|^p\, e^{-\frac{|z|^2}{2(t-s)}}\, dz = (t-s)^{p/2}\, (2\pi)^{-d/2} \int_{\mathbb{R}^d} |y|^p\, e^{-\frac{|y|^2}{2}}\, dy.$$

The integral is finite. Let

$$c(p) = \int_{\mathbb{R}^d} |y|^p\, p_1(0,y)\, dy,$$

which is the $p$-th absolute moment of an $N(0, I_{d\times d})$ distributed variable.

(4) and (5) are straightforward applications of Kolmogorov's theorem.

Exercise 3. If $(B_t)$ is a Brownian motion, prove that $(B_t)$ is a Markov process with transition function $p_t(x,y)\,dy$.
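The moment identity in part (3) can be checked by Monte Carlo in one dimension. A minimal sketch, assuming NumPy; the values of $t$, $s$, $p$ and the sample size are illustrative choices:

```python
import numpy as np

# Monte Carlo check of Exercise 2(3) in d = 1: an increment X_t - X_s is
# N(0, t - s), so E|X_t - X_s|^p = c(p) (t - s)^{p/2} with c(p) the p-th
# absolute moment of N(0, 1).  Here p = 4, where c(4) = 3.

rng = np.random.default_rng(0)
t, s, p = 2.0, 0.5, 4
incr = rng.normal(0.0, np.sqrt(t - s), size=1_000_000)   # samples of X_t - X_s
emp = float(np.mean(np.abs(incr) ** p))                  # empirical E|X_t - X_s|^p
theory = 3.0 * (t - s) ** (p / 2)                        # c(4) (t - s)^2 = 6.75
print(emp, theory)
```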

Solution: Fix $0 = t_0 < t_1 < \dots < t_n$ and Borel sets $A_i$, and denote for simplicity $f_i(x) = 1_{x \in A_i}$. Furthermore, define the random variables $Y_i = B_{t_i} - B_{t_{i-1}}$ for $i = 1, \dots, n$. From the properties of Brownian motion, the random variables $\{Y_i\}$ are independent and $Y_i \sim N(0, t_i - t_{i-1})$. Thus we have

$$P[B_{t_1} \in A_1, \dots, B_{t_n} \in A_n] = E \prod_{i=1}^n f_i(B_{t_i}) = E[f_1(Y_1)\, f_2(Y_2 + Y_1) \cdots f_n(Y_n + \dots + Y_1)]$$
$$= \int\cdots\int f_1(y_1)\, f_2(y_2+y_1)\cdots f_n(y_n + \dots + y_1)\; p_{t_1}(0,y_1)\, p_{t_2-t_1}(0,y_2)\cdots p_{t_n-t_{n-1}}(0,y_n)\, dy_n\cdots dy_1.$$

Now we introduce new variables $x_1 = y_1$, $x_2 = y_2 + y_1$, ..., $x_n = y_n + \dots + y_1$, and obtain that the last integral equals

$$\int\cdots\int f_1(x_1)\, f_2(x_2)\cdots f_n(x_n)\; p_{t_1}(0,x_1)\, p_{t_2-t_1}(0, x_2 - x_1)\cdots p_{t_n-t_{n-1}}(0, x_n - x_{n-1})\, dx_n\cdots dx_1.$$

Noticing that $p_t(0, y-x) = p_t(x,y)$ and recalling the definition of the functions $f_i$, the finite-dimensional distributions agree with those of the Markov process determined by the heat kernel, so the two processes must agree.

Exercise 4. Let $(X_t, t \ge 0)$ be a continuous real-valued stochastic process with $X_0 = 0$ and let $p_t(x,y)$ be the heat kernel on $\mathbb{R}$. Prove that the following statements are equivalent:

(i) $(X_t, t \ge 0)$ is a one dimensional Brownian motion.

(ii) For any $n \in \mathbb{N}$, any sets $A_i \in \mathcal{B}(\mathbb{R})$, $i = 1, \dots, n$, and any $0 < t_1 < t_2 < \dots < t_n$,

$$P[X_{t_1} \in A_1, \dots, X_{t_n} \in A_n] = \int_{A_1}\cdots\int_{A_n} p_{t_1}(0,y_1)\, p_{t_2-t_1}(y_1,y_2)\cdots p_{t_n-t_{n-1}}(y_{n-1},y_n)\, dy_n\cdots dy_1.$$

Solution: This follows from the previous exercises.

Exercise 5. A zero mean Gaussian process $(B^H_t)$ is a fractional Brownian motion of Hurst parameter $H$, $H \in (0,1)$, if its covariance is

$$E(B^H_t B^H_s) = \tfrac{1}{2}\big(t^{2H} + s^{2H} - |t-s|^{2H}\big).$$

Then $E|B^H_t - B^H_s|^p = C|t-s|^{pH}$. If $H = 1/2$ this is Brownian motion (otherwise this process is not even a semi-martingale). Show that $(B^H_t)$ has a continuous modification whose sample paths are Hölder continuous of order $\alpha < H$.

Solution: Since, for $p > 1/H$,

$$E|B^H_t - B^H_s|^p = C|t-s|^{pH} = C|t-s|^{1 + (pH - 1)},$$

we can apply the Kolmogorov continuity criterion to obtain that $(B^H_t)$ has a modification whose sample paths are Hölder continuous of every order $\alpha < (pH-1)/p$. For any $\alpha < H$ we can take $p$ large enough to have $\alpha < (pH-1)/p$. This finishes the proof.

Exercise 6. Let $(B_t)$ be a Brownian motion on $\mathbb{R}^d$ and let $T$ be a positive number. For $t \in [0,T]$ set $Y_t = B_t - \frac{t}{T} B_T$. Compute the probability distribution of $Y_t$.

Solution: For a bounded measurable $f$, writing $x$ for $B_t$ and $y$ for $B_T$,

$$E f\big(B_t - \tfrac{t}{T} B_T\big) = \int_{\mathbb{R}^d}\int_{\mathbb{R}^d} f\big(x - \tfrac{t}{T} y\big)\, (2\pi t)^{-d/2}\, e^{-\frac{|x|^2}{2t}}\, (2\pi(T-t))^{-d/2}\, e^{-\frac{|y-x|^2}{2(T-t)}}\, dx\, dy.$$

We observe that

$$|x|^2 = \big|x - \tfrac{t}{T} y\big|^2 + \tfrac{2t}{T}\langle x, y\rangle - \tfrac{t^2}{T^2}|y|^2$$

and that

$$|x - y|^2 = \big|x - \tfrac{t}{T} y\big|^2 - \tfrac{2(T-t)}{T}\langle x, y\rangle + \tfrac{(T-t)(T+t)}{T^2}|y|^2.$$

Thus,

$$\frac{|x|^2}{t} + \frac{|x-y|^2}{T-t} = \frac{1}{t}\Big|x - \frac{t}{T}y\Big|^2 + \frac{1}{T-t}\Big|x - \frac{t}{T}y\Big|^2 + \frac{1}{T}|y|^2.$$

Finally, changing variables from $x$ to $z = x - \frac{t}{T} y$ (with $y$ fixed),

$$E f\big(B_t - \tfrac{t}{T} B_T\big) = (2\pi t)^{-d/2}(2\pi(T-t))^{-d/2} \int_{\mathbb{R}^d}\int_{\mathbb{R}^d} f\big(x - \tfrac{t}{T}y\big)\, e^{-\frac{1}{2t}|x - \frac{t}{T}y|^2}\, e^{-\frac{1}{2(T-t)}|x - \frac{t}{T}y|^2}\, e^{-\frac{|y|^2}{2T}}\, dx\, dy$$
$$= (2\pi t)^{-d/2}(2\pi(T-t))^{-d/2} \int_{\mathbb{R}^d} f(z)\, e^{-\frac{1}{2t}|z|^2}\, e^{-\frac{1}{2(T-t)}|z|^2}\, dz \int_{\mathbb{R}^d} e^{-\frac{|y|^2}{2T}}\, dy = \int_{\mathbb{R}^d} f(z)\, p_t(0,z)\, p_{T-t}(z,0)\, (2\pi T)^{d/2}\, dz.$$

Hence we see that

$$B_t - \tfrac{t}{T} B_T \sim \frac{p_t(0,z)\, p_{T-t}(z,0)}{p_T(0,0)}\, dz,$$

that is, $Y_t \sim N\big(0, \tfrac{t(T-t)}{T}\, I_{d\times d}\big)$.
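The Brownian-bridge variance $t(T-t)/T$ obtained in Exercise 6 can be checked by simulation in one dimension. A sketch assuming NumPy; $T$, $t$ and the sample size are illustrative choices:

```python
import numpy as np

# Simulation check of Exercise 6 in d = 1: Y_t = B_t - (t/T) B_T should be
# N(0, t(T - t)/T).

rng = np.random.default_rng(1)
T, t, n = 2.0, 0.8, 1_000_000
B_t = rng.normal(0.0, np.sqrt(t), size=n)             # B_t
B_T = B_t + rng.normal(0.0, np.sqrt(T - t), size=n)   # B_T via an independent increment
Y = B_t - (t / T) * B_T
var_Y = float(np.var(Y))
print(var_Y, t * (T - t) / T)                          # both should be about 0.48
```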

2 Brownian motion, conditional expectation, and uniform integrability

Exercise 7. Let $(B_t)$ be a standard Brownian motion. Prove that

(a) (i) for any $t > 0$, $E[B_t] = 0$ and $E[B_t^2] = t$; (ii) for any $s, t \ge 0$, $E[B_s B_t] = s \wedge t$, where $s \wedge t = \min(s,t)$.

(b) (Scaling invariance) For any $a > 0$, $\frac{1}{\sqrt a} B_{at}$ is a Brownian motion.

(c) (Translation invariance) For any $t_0 \ge 0$, $B_{t_0+t} - B_{t_0}$ is a standard Brownian motion.

(d) If $X_t = B_t - tB_1$, $0 \le t \le 1$, then $E(X_s X_t) = s(1-t)$ for $s \le t$. Compute the probability distribution of $X_t$. (Hint: break $X_t$ down as the sum of two independent Gaussian random variables, then compute its characteristic function.)

Solution:

(a)(i) Since the distribution of $B_t$ is $N(0,t)$, we have $E[B_t] = 0$ and $E[B_t^2] = t$.

(a)(ii) Fix $t \ge s$. Then $E[B_s B_t] = E[(B_t - B_s)B_s] + E[B_s^2]$. Since Brownian motion has independent increments, the random variables $B_t - B_s$ and $B_s$ are independent, so $E[(B_t - B_s)B_s] = E[B_t - B_s]\,E[B_s] = 0$. Furthermore, from (i) we know that $E[B_s^2] = s$. Hence we conclude $E[B_s B_t] = s = s \wedge t$, which is the required identity.

(b) Denote $W_t = \frac{1}{\sqrt a} B_{at}$. Then $W$ has continuous sample paths and independent increments, which follow from the same properties of $B$. Furthermore, for any $t > s$, we have

$$W_t - W_s = \frac{1}{\sqrt a}(B_{at} - B_{as}) \sim N\Big(0, \frac{a(t-s)}{a}\Big) = N(0, t-s),$$

which finishes the proof.
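Parts (a)(ii) and (b) lend themselves to a quick Monte Carlo check. A sketch assuming NumPy; $a$, $s$, $t$ and the sample size are illustrative choices:

```python
import numpy as np

# Simulation check of Exercise 7 in d = 1: E[B_s B_t] = min(s, t), and the
# scaled process W_t = a^{-1/2} B_{at} has variance t at time t.

rng = np.random.default_rng(2)
a, s, t, n = 4.0, 0.6, 1.5, 1_000_000
B_s = rng.normal(0.0, np.sqrt(s), size=n)
B_t = B_s + rng.normal(0.0, np.sqrt(t - s), size=n)          # independent increment
cov = float(np.mean(B_s * B_t))                              # should be s = 0.6
W_t = rng.normal(0.0, np.sqrt(a * t), size=n) / np.sqrt(a)   # a^{-1/2} B_{at}
w_var = float(np.var(W_t))                                   # should be t = 1.5
print(cov, w_var)
```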

(c) If we denote $W_t = B_{t_0+t} - B_{t_0}$, then $W$ has continuous sample paths and independent increments, which follow from the same properties of $B$. Moreover, for any $t > s$, we have

$$W_t - W_s = B_{t_0+t} - B_{t_0+s} \sim N\big(0, (t_0+t) - (t_0+s)\big) = N(0, t-s),$$

which finishes the proof.

(d) For $t = 0$ or $t = 1$ we have $X_t = 0$. For $0 < s \le t < 1$ we have

$$E[X_t X_s] = E[(B_t - tB_1)(B_s - sB_1)] = E[B_t B_s] - sE[B_t B_1] - tE[B_1 B_s] + stE[B_1^2] = s - st - st + st = s(1-t).$$

Let us take $t \in (0,1)$. Then $X_t = (1-t)B_t - t(B_1 - B_t)$. Since the random variables $B_t$ and $B_1 - B_t$ are independent, the distribution of $X_t$ is

$$N\big(0, (1-t)^2 t + t^2(1-t)\big) = N\big(0, t(1-t)\big).$$

Exercise 8. Let $X \in L^1(\Omega, \mathcal{F}, P)$. Prove that the family of random variables $\{E\{X \mid \mathcal{G}\} : \mathcal{G} \subset \mathcal{F}\}$ is $L^1$ bounded, i.e. $\sup_{\mathcal{G} \subset \mathcal{F}} E\big(|E\{X \mid \mathcal{G}\}|\big) < \infty$.

Solution: Take any sub-$\sigma$-algebra $\mathcal{G} \subset \mathcal{F}$. Using Jensen's inequality we have

$$E\big(|E\{X \mid \mathcal{G}\}|\big) \le E\big(E\{|X| \mid \mathcal{G}\}\big) = E|X| < \infty,$$

which proves the claim.

Exercise 9. Let $X \in L^1(\Omega; \mathbb{R})$. Prove that the family of functions

$$\{E\{X \mid \mathcal{G}\} : \mathcal{G} \text{ is a sub-}\sigma\text{-algebra of } \mathcal{F}\}$$

is uniformly integrable.

Solution: We note that if a family of measurable sets $A_C$ satisfies $\lim_{C\to\infty} P[A_C] = 0$, then $\lim_{C\to\infty} E[1_{A_C}|X|] = 0$ (since $X \in L^1$; dominated convergence). Take any $\mathcal{G} \subset \mathcal{F}$ and consider the family of events $A(C, \mathcal{G}) = \{\omega : |E[X \mid \mathcal{G}](\omega)| \ge C\}$. Applying Markov's and Jensen's inequalities we obtain

$$P\big(A(C, \mathcal{G})\big) \le C^{-1} E\big(|E[X \mid \mathcal{G}]|\big) \le C^{-1} E\big[E[|X| \mid \mathcal{G}]\big] = C^{-1} E|X| \to 0,$$

as $C \to \infty$ (since $E|X| < \infty$), uniformly in $\mathcal{G}$.

For any $\varepsilon > 0$ there exists a $\delta > 0$ such that if $P[A] < \delta$ then $E(1_A|X|) < \varepsilon$. For this $\delta$, take $C > \frac{E|X|}{\delta}$; then $P[A(C, \mathcal{G})] < \delta$ for every $\mathcal{G} \subset \mathcal{F}$, which implies

$$\sup_{\mathcal{G} \subset \mathcal{F}} E\big[1_{A(C,\mathcal{G})}|X|\big] < \varepsilon.$$

Finally, since $A(C,\mathcal{G}) \in \mathcal{G}$, we conclude

$$\sup_{\mathcal{G} \subset \mathcal{F}} E\big[1_{A(C,\mathcal{G})}\,|E[X \mid \mathcal{G}]|\big] \le \sup_{\mathcal{G} \subset \mathcal{F}} E\big[1_{A(C,\mathcal{G})}|X|\big] < \varepsilon,$$

which proves the claim.

Exercise 10. Let $(\mathcal{G}_t, t \ge 0)$ and $(\mathcal{F}_t, t \ge 0)$ be filtrations with the property that $\mathcal{G}_t \subset \mathcal{F}_t$ for each $t$. Suppose that $(X_t)$ is adapted to $(\mathcal{G}_t)$. If $(X_t)$ is an $(\mathcal{F}_t)$-martingale, prove that $(X_t)$ is a $(\mathcal{G}_t)$-martingale.

Solution: The process $(X_t)$ is $(\mathcal{G}_t)$-adapted by assumption, and integrability is unchanged. Furthermore, for any $t > s$, using the tower property of conditional expectation (valid since $\mathcal{G}_s \subset \mathcal{F}_s$), we obtain

$$E[X_t \mid \mathcal{G}_s] = E\big[E[X_t \mid \mathcal{F}_s] \mid \mathcal{G}_s\big] = E[X_s \mid \mathcal{G}_s] = X_s.$$

This shows that $(X_t)$ is a $(\mathcal{G}_t)$-martingale.

Exercise 11 (Elementary processes). Let $0 = t_1 < \dots < t_n < t_{n+1}$, let $H_i$ be bounded $\mathcal{F}_{t_i}$-measurable functions, and

$$H_t(\omega) = H_0(\omega)\,1_{\{0\}}(t) + \sum_{i=1}^n H_i(\omega)\,1_{(t_i, t_{i+1}]}(t). \qquad (1)$$

Prove that $H : \mathbb{R}_+ \times \Omega \to \mathbb{R}$ is Borel measurable. Define the stochastic integral

$$I_t \equiv \int_0^t H_s\, dM_s \equiv \sum_{i=1}^n H_i\,(M_{t_{i+1} \wedge t} - M_{t_i \wedge t})$$

and prove that

$$E\Big[\int_0^t H_s\, dB_s\Big] = 0, \qquad E\Big[\Big(\int_0^t H_s\, dB_s\Big)^2\Big] = E\Big[\int_0^t (H_s)^2\, ds\Big]. \qquad (2)$$

Solution: First, we prove that the function (1) is Borel measurable. To this end, we take any Borel set $A \in \mathcal{B}(\mathbb{R})$ and have to show that the set $\{(t,\omega) : H(t,\omega) \in A\}$ is measurable in the product space $(\mathbb{R}_+, \mathcal{B}(\mathbb{R}_+)) \times (\Omega, \mathcal{F})$. We can rewrite this set in the following way:

$$\{(t,\omega) : H(t,\omega) \in A\} = \big(\{0\} \times \{\omega : H_0(\omega) \in A\}\big) \cup \bigcup_{i=1}^n \big((t_i, t_{i+1}] \times \{\omega : H_i(\omega) \in A\}\big).$$

Since the sets $\{0\}$ and $(t_i, t_{i+1}]$, $i = 1, \dots, n$, belong to $\mathcal{B}(\mathbb{R}_+)$, and $\{\omega : H_i(\omega) \in A\} \in \mathcal{F}_{t_i} \subset \mathcal{F}$, the claim follows from the fact that the product of two measurable sets is measurable in the product space.

Next, we show the identities (2). Denote $I_t = \int_0^t H_s\, dB_s$. Then we have

$$E[I_t] = \sum_{i=1}^n E\big[H_i\,(B_{t_{i+1}\wedge t} - B_{t_i \wedge t})\big] = \sum_{i=1}^n E[H_i]\; E\big[B_{t_{i+1}\wedge t} - B_{t_i \wedge t}\big] = 0,$$

where in the second equality we have used the independence of $B_{t_{i+1}\wedge t} - B_{t_i\wedge t}$ from $\mathcal{F}_{t_i}$, which follows from the properties of Brownian motion and the fact that $B_{t_{i+1}\wedge t} - B_{t_i\wedge t} = 0$ if $t_i \ge t$; in the last equality we have used $E[B_{t_{i+1}\wedge t} - B_{t_i\wedge t}] = 0$.

For the variance of the stochastic integral we have

$$E[I_t^2] = \sum_{i=1}^n E\big[H_i^2\,(B_{t_{i+1}\wedge t} - B_{t_i\wedge t})^2\big] + 2\sum_{i<j} E\big[H_i H_j\,(B_{t_{i+1}\wedge t} - B_{t_i\wedge t})(B_{t_{j+1}\wedge t} - B_{t_j\wedge t})\big].$$

In the same way as before, using the independence of the increments of Brownian motion, we obtain that the second sum is $0$. Thus we have

$$E[I_t^2] = \sum_{i=1}^n E\big[H_i^2\big]\, E\big[(B_{t_{i+1}\wedge t} - B_{t_i\wedge t})^2\big] = \sum_{i=1}^n E\big[H_i^2\big]\,(t_{i+1}\wedge t - t_i\wedge t) = E\Big[\int_0^t (H_s)^2\, ds\Big],$$

where we have used the fact that $H_i^2$ and $(B_{t_{i+1}\wedge t} - B_{t_i\wedge t})^2$ are independent.
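The identities (2) can be illustrated by Monte Carlo for a two-step elementary process. A sketch assuming NumPy; the particular choice of integrand and all numeric values are illustrative:

```python
import numpy as np

# Check of the identities (2): take H_0 = 1 on (0, t1] and
# H_1 = sign(B_{t1}) on (t1, t2] (bounded and F_{t1}-measurable).  Then
# I = 1*(B_{t1} - B_0) + H_1*(B_{t2} - B_{t1}) should satisfy
# E[I] = 0 and E[I^2] = E int_0^{t2} H_s^2 ds = t2.

rng = np.random.default_rng(6)
t1, t2, n = 1.0, 2.5, 1_000_000
dB1 = rng.normal(0.0, np.sqrt(t1), size=n)        # B_{t1} - B_0
dB2 = rng.normal(0.0, np.sqrt(t2 - t1), size=n)   # B_{t2} - B_{t1}, independent
H1 = np.sign(dB1)                                  # F_{t1}-measurable integrand
I = 1.0 * dB1 + H1 * dB2
mean_I = float(np.mean(I))
second_I = float(np.mean(I ** 2))
print(mean_I, second_I)                            # approximately 0 and t2 = 2.5
```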

3 Martingales and Conditional Expectations

A given non-specific filtration $\{\mathcal{F}_t\}$ is used unless otherwise stated.

Exercise 12. Let $\mu$ be a probability measure. If $\{X_n\}$ is u.i. and $X_n \to X$ in measure, prove that $\{X_n\}$ is $L^1$ bounded and $X \in L^1$.

Solution: By the uniform integrability, there exists a number $C$ such that $\sup_n \int_{\{|X_n| \ge C\}} |X_n|\, d\mu \le 1$. Then

$$\int |X_n|\, d\mu = \int_{\{|X_n| < C\}} |X_n|\, d\mu + \int_{\{|X_n| \ge C\}} |X_n|\, d\mu \le C + 1.$$

Taking an almost surely convergent subsequence if necessary, we may and will assume that $X_n \to X$ a.s. Then, by Fatou's lemma,

$$E[|X|] = E[\liminf_n |X_n|] \le \liminf_n E|X_n| \le \sup_n E|X_n|,$$

which is finite.

Exercise 13. If $X$ is an $L^1$ function, prove that $X_t := E\{X \mid \mathcal{F}_t\}$ is an $\mathcal{F}_t$-martingale.

Solution: By the definition of conditional expectations, each $X_t \in L^1$ and $X_t$ is $\mathcal{F}_t$-measurable for each $t$. By the tower property, if $s < t$,

$$E(X_t \mid \mathcal{F}_s) = E\big(E(X \mid \mathcal{F}_t) \mid \mathcal{F}_s\big) = E(X \mid \mathcal{F}_s) = X_s.$$

Exercise 14 (Discrete martingales). If $(M_n)$, $n \in \mathbb{N}$, is a martingale, its quadratic variation is the unique process $\langle M \rangle_n$ (discrete time Doob-Meyer decomposition theorem) such that $\langle M \rangle_0 = 0$ and $M_n^2 - \langle M \rangle_n$ is a martingale. Let $(X_i)$ be a family of i.i.d. random variables with $EX_i = 0$ and $E(X_i^2) = 1$. Then $S_n = \sum_{i=1}^n X_i$ is the random walk. Prove that $(S_n)$ is a martingale with respect to its natural filtration and that its quadratic variation is $\langle S \rangle_n = n$.

Solution: Let $\mathcal{F}_n$ denote the natural filtration of $\{X_n\}$. Then

$$E\{S_n \mid \mathcal{F}_{n-1}\} = E\{S_{n-1} \mid \mathcal{F}_{n-1}\} + E\{X_n \mid \mathcal{F}_{n-1}\} = S_{n-1} + 0.$$

We used the fact that $X_n$ is independent of $\mathcal{F}_{n-1}$. Similarly,

$$E\{S_n^2 - n \mid \mathcal{F}_{n-1}\} = E\{S_{n-1}^2 \mid \mathcal{F}_{n-1}\} + 2E\{S_{n-1}X_n \mid \mathcal{F}_{n-1}\} + E\{X_n^2 \mid \mathcal{F}_{n-1}\} - n = S_{n-1}^2 + 2S_{n-1}\,E\{X_n \mid \mathcal{F}_{n-1}\} + E[X_n^2] - n = S_{n-1}^2 - (n-1).$$

So $S_n^2 - n$ is an $\mathcal{F}_n$-martingale, and $\langle S \rangle_n = n$.
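The martingale property of $S_n^2 - n$ implies in particular that $E[S_n^2] = n$, which is easy to check by simulation. A sketch assuming NumPy; Rademacher steps and the sizes are illustrative choices:

```python
import numpy as np

# Simulation check of Exercise 14: for the random walk S_n = X_1 + ... + X_n
# with i.i.d. steps of mean 0 and variance 1, S_n^2 - n is a martingale, so
# in particular E[S_n^2] = n.

rng = np.random.default_rng(3)
n_steps, n_paths = 50, 100_000
X = rng.choice([-1.0, 1.0], size=(n_paths, n_steps))  # Rademacher increments
S = np.cumsum(X, axis=1)                               # S_1, ..., S_50 per path
m25 = float(np.mean(S[:, 24] ** 2))                    # empirical E[S_25^2]
m50 = float(np.mean(S[:, 49] ** 2))                    # empirical E[S_50^2]
print(m25, m50)                                        # close to 25 and 50
```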

Exercise 15. Let $\phi : \mathbb{R}^d \to \mathbb{R}$ be a convex function. Show that

(a) If $(X_t)$ is a sub-martingale and $\phi$ is increasing such that $\phi(X_t) \in L^1$, then $(\phi(X_t))$ is a sub-martingale.

(b) If $(X_t)$ is a martingale and $\phi(X_t) \in L^1$, then $(\phi(X_t))$ is a sub-martingale.

(c) If $(X_t)$ is an $L^p$ integrable martingale, for a number $p \ge 1$, prove that $(|X_t|^p)$ is a sub-martingale.

(d) If $(X_t)$ is a real valued martingale, prove that $(|X_t|)$ is a sub-martingale.

Solution: Since $\phi$ is convex, it is continuous and hence measurable. This means that in what follows the process $\phi(X_t)$ is adapted.

(a) For $t > s$, using the conditional Jensen's inequality we obtain

$$E[\phi(X_t) \mid \mathcal{F}_s] \ge \phi\big(E[X_t \mid \mathcal{F}_s]\big) \ge \phi(X_s),$$

where in the last inequality we have used $E[X_t \mid \mathcal{F}_s] \ge X_s$ and the fact that $\phi$ is increasing.

(b) The claim can be shown in the same way as (a), but now $\phi(E[X_t \mid \mathcal{F}_s]) = \phi(X_s)$.

(c), (d) The claims follow from (b) and the fact that the maps $x \mapsto |x|^p$ and $x \mapsto |x|$ are convex.

Exercise 16 (Limit of martingales). Let $\{(M_n(t), t \in [0,1]), n \in \mathbb{N}\}$ be a family of martingales. Suppose that for each $t$, $\lim_n M_n(t) = M(t)$ almost surely and $\{M_n(t), n \in \mathbb{N}\}$ is uniformly integrable for each $t$. Prove that $(M(t))$ is a martingale.

Solution: For any $t \in [0,1]$ and any $A \in \mathcal{F}$ fixed, the family $\{M_n(t)1_A, n \in \mathbb{N}\}$ is uniformly integrable. Indeed,

$$\lim_{C\to\infty} \sup_n \int_{\{|M_n(t)1_A| \ge C\}} |M_n(t)1_A|\, dP \le \lim_{C\to\infty} \sup_n \int_{\{|M_n(t)| \ge C\}} |M_n(t)|\, dP = 0,$$

where the inequality follows from $|M_n(t)1_A| \le |M_n(t)|$ and the last equality from the uniform integrability of $\{M_n(t), n \in \mathbb{N}\}$. Since $\lim_n M_n(t)1_A = M(t)1_A$ almost surely, the uniform integrability implies convergence in $L^1$. Taking $A = \Omega$, we see that $M(t)$ is integrable. For any $0 \le s < t \le 1$ and any $A \in \mathcal{F}_s$ we have

$$E[M(t)1_A] = \lim_n E[M_n(t)1_A] = \lim_n E[M_n(s)1_A] = E[M(s)1_A],$$

where the second equality holds because each $M_n$ is a martingale. This implies $E[M(t) \mid \mathcal{F}_s] = M(s)$, which finishes the proof.

Exercise 17. Let $a > 0$ be a real number. For $0 = t_1 < \dots < t_n < t_{n+1} < \dots$ and $i = 1, 2, \dots$, let $H_i$ be bounded $\mathcal{F}_{t_i}$-measurable functions, and let $H_0$ be a bounded $\mathcal{F}_0$-measurable function. Define

$$H_t(\omega) = H_0(\omega)\,1_{\{0\}}(t) + \sum_i H_i(\omega)\,1_{(t_i, t_{i+1}]}(t).$$

Let $(M_t, t \le a)$ be a martingale with $M_0 = 0$, and $(H_t)$ an elementary process. Define the elementary integral

$$I_t \equiv \int_0^t H_s\, dM_s \equiv \sum_i H_i\,(M_{t_{i+1}\wedge t} - M_{t_i \wedge t}).$$

Prove that $(I_t, t \le a)$ is an $\mathcal{F}_t$-martingale.

Solution: Note: the sum is always a finite sum. Please refer also to Exercise 11. It is clear that $M_{t_{i+1}\wedge t} - M_{t_i\wedge t} \in \mathcal{F}_t$. There exists an $n$ such that $t < t_{n+1}$. Then $M_{t_{i+1}\wedge t} - M_{t_i\wedge t} = 0$ for $i \ge n+1$, so

$$\int_0^t H_s\, dM_s = \sum_{i=1}^n H_i\,(M_{t_{i+1}\wedge t} - M_{t_i\wedge t}),$$

and for $1 \le i \le n$, $H_i \in \mathcal{F}_{t_i} \subset \mathcal{F}_t$. In particular, $I_t \in \mathcal{F}_t$ for each $t$ and the process $(I_t)$ is $(\mathcal{F}_t)$-adapted. The finite number of bounded random variables $\{H_i, i = 1, \dots, n\}$ is bounded by a common constant $C$. Furthermore, since for each $s$, $M_s$ is integrable,

$$E\Big|\int_0^t H_s\, dM_s\Big| \le \sum_{i=1}^n E\big(|H_i|\,|M_{t_{i+1}\wedge t} - M_{t_i\wedge t}|\big) \le C \sum_{i=1}^n \big(E|M_{t_{i+1}\wedge t}| + E|M_{t_i\wedge t}|\big) < \infty,$$

proving that for each $t$, $I_t$ is integrable. Finally, we prove the martingale property. First note that if $(M_s, s \ge 0)$ is a martingale then

$$E\{M_t \mid \mathcal{F}_s\} = M_{s \wedge t}, \qquad s, t \ge 0. \qquad (3)$$

Take $t \ge s$ and assume that $s \in [t_k, t_{k+1})$ for some $k \in \{0, \dots, n\}$. Explanation: we only need to consider two cases: (1) $i \le k$, in which case $H_i \in \mathcal{F}_s$ and we can take $H_i$ out of the conditional expectation; and (2) $i \ge k+1$, in which case $s < t_i$ and we may use the tower property to condition in addition with respect to $\mathcal{F}_{t_i}$ and take $H_i$ out. Then we have

$$E[I_t \mid \mathcal{F}_s] = \sum_{i=1}^n E[H_i(M_{t_{i+1}\wedge t} - M_{t_i\wedge t}) \mid \mathcal{F}_s] = \sum_{i=1}^k E[H_i(M_{t_{i+1}\wedge t} - M_{t_i\wedge t}) \mid \mathcal{F}_s] + \sum_{i=k+1}^n E[H_i(M_{t_{i+1}\wedge t} - M_{t_i\wedge t}) \mid \mathcal{F}_s].$$

For $i \le k$ we have $t_i \le s$, so the random variable $H_i \in \mathcal{F}_s$ and, using (3),

$$\sum_{i=1}^k E[H_i(M_{t_{i+1}\wedge t} - M_{t_i\wedge t}) \mid \mathcal{F}_s] = \sum_{i=1}^k H_i\, E[M_{t_{i+1}\wedge t} - M_{t_i\wedge t} \mid \mathcal{F}_s] = \sum_{i=1}^k H_i\,(M_{t_{i+1}\wedge s} - M_{t_i\wedge s}) = I_s.$$

If $i \ge k+1$ then $s \le t_i$, and we may use the tower property of the conditional expectation:

$$\sum_{i=k+1}^n E[H_i(M_{t_{i+1}\wedge t} - M_{t_i\wedge t}) \mid \mathcal{F}_s] = \sum_{i=k+1}^n E\big[E[H_i(M_{t_{i+1}\wedge t} - M_{t_i\wedge t}) \mid \mathcal{F}_{t_i}] \mid \mathcal{F}_s\big] = \sum_{i=k+1}^n E\big[H_i\, E[M_{t_{i+1}\wedge t} - M_{t_i\wedge t} \mid \mathcal{F}_{t_i}]\mid \mathcal{F}_s\big] = 0,$$

since by (3), $E[M_{t_{i+1}\wedge t} - M_{t_i\wedge t} \mid \mathcal{F}_{t_i}] = M_{t_{i+1}\wedge t\wedge t_i} - M_{t\wedge t_i} = 0$. Combining all these equalities we conclude $E[I_t \mid \mathcal{F}_s] = I_s$, and $(I_t)$ is a martingale.

4 Stopping times, progressive measurability

The filtration $\mathcal{F}_t$ satisfies the usual conditions.

Exercise 18. Let $S, T$ be stopping times. Prove that

(1) $S \wedge T$, $S \vee T$, and $aS$, where $a > 1$, are stopping times.

(2) If $T$ is a stopping time, then there exists a sequence of stopping times $T_n$ such that each $T_n$ takes only a finite number of values and $T_n$ decreases to $T$.

Solution: (1) For any $t$ we have

$$\{S \wedge T \le t\} = \{S \le t\} \cup \{T \le t\} \in \mathcal{F}_t, \quad \{S \vee T \le t\} = \{S \le t\} \cap \{T \le t\} \in \mathcal{F}_t, \quad \{aS \le t\} = \{S \le t/a\} \in \mathcal{F}_{t/a} \subset \mathcal{F}_t,$$

which proves the claim.

(2) For any $n \in \mathbb{N}$ we define the stopping time

$$T_n = \begin{cases} i2^{-n}, & \text{if } (i-1)2^{-n} \le T < i2^{-n} \text{ for some } i < n2^n, \\ +\infty, & \text{if } T \ge n. \end{cases}$$

These stopping times satisfy the required conditions.

Exercise 19. Prove that $T$ is a stopping time iff $\{T < t\} \in \mathcal{F}_t$ for any $t > 0$.

Solution: If $T$ is a stopping time, then

$$\{T < t\} = \bigcup_{n \ge 1} \{T \le t - \tfrac{1}{n}\} \in \mathcal{F}_t,$$

because $\{T \le t - \frac{1}{n}\} \in \mathcal{F}_{t-1/n} \subset \mathcal{F}_t$. Conversely, if $\{T < t\} \in \mathcal{F}_t$ for any $t > 0$, then

$$\{T \le t\} = \bigcap_{n \ge 1} \{T < t + \tfrac{1}{n}\} \in \mathcal{F}_{t+} = \mathcal{F}_t,$$

because the filtration is right-continuous.

Exercise 20. Let $(M_t, t \in I)$ be an $(\mathcal{F}_t)$-martingale. Let $\tau$ be a bounded stopping time that takes countably many values. Let $Y$ be a bounded $\mathcal{F}_\tau$-measurable random variable. Let $N_t = Y(M_t - M_{t\wedge\tau})$. Prove that $(N_t)$ is a martingale.

Solution: Since $Y$ is bounded, $N_t$ is an integrable process. Furthermore, it is adapted since, for any $t$ and any Borel set $B$, we have

$$\{N_t \in B\} = \big(\{Y(M_t - M_{t\wedge\tau}) \in B\} \cap \{\tau \le t\}\big) \cup \big(\{0 \in B\} \cap \{\tau > t\}\big) \in \mathcal{F}_t,$$

noting that $N_t = 0$ on $\{\tau > t\}$. Let us now take any $s < t$. Then we have

$$E[N_t \mid \mathcal{F}_s] = E[Y(M_t - M_{t\wedge\tau})1_{\tau>s} \mid \mathcal{F}_s] + E[Y(M_t - M_{t\wedge\tau})1_{\tau\le s} \mid \mathcal{F}_s] = I_1 + I_2.$$

For the first term we have, by the optional stopping theorem,

$$I_1 = E\big[E[Y(M_t - M_{t\wedge\tau})1_{\tau>s} \mid \mathcal{F}_\tau] \mid \mathcal{F}_s\big] = E\big[Y\, E[M_t - M_{t\wedge\tau} \mid \mathcal{F}_\tau]\, 1_{\tau>s} \mid \mathcal{F}_s\big] = 0,$$

since $E[M_t - M_{t\wedge\tau} \mid \mathcal{F}_\tau] = 0$. For the second term we get, again by the optional stopping theorem,

$$I_2 = E[Y(M_t - M_\tau)1_{\tau\le s} \mid \mathcal{F}_s] = Y 1_{\tau\le s}\, E[M_t - M_\tau \mid \mathcal{F}_s] = 1_{\tau\le s}\, Y\,(M_s - M_{\tau\wedge s}) = N_s.$$

This finishes the proof.

Exercise 21. Show that for $s < t$ and $A \in \mathcal{F}_s$, $\tau = s1_A + t1_{A^c}$ is a stopping time.

Solution: Indeed, for any $r$ we have

$$\{\tau \le r\} = \begin{cases} \emptyset, & \text{if } r < s, \\ A, & \text{if } s \le r < t, \\ \Omega, & \text{if } r \ge t, \end{cases}$$

and in each case $\{\tau \le r\} \in \mathcal{F}_r$.

Exercise 22. Let $0 = t_1 < \dots < t_{n+1} < \dots$ with $\lim_n t_n = \infty$. For each $i = 0, 1, \dots$, let $H_i$ be a real valued $\mathcal{F}_{t_i}$-measurable random variable and $H_0$ an $\mathcal{F}_0$-measurable random variable. For $t > 0$, we define

$$X_t(\omega) = H_0(\omega)\,1_{\{0\}}(t) + \sum_{i=1}^\infty H_i(\omega)\,1_{(t_i,t_{i+1}]}(t).$$

Prove that $(X_t, t \ge 0)$ is progressively measurable.

Solution: To this end, we take any Borel set $A \in \mathcal{B}(\mathbb{R})$ and have to show that for any $t$, the set $\{(s,\omega) : X(s,\omega) \in A\}$ is measurable in the product space $([0,t], \mathcal{B}([0,t])) \times (\Omega, \mathcal{F}_t)$. We can rewrite this set in the following way:

$$\{(s,\omega) \in [0,t]\times\Omega : X(s,\omega) \in A\} = \big(\{0\}\times\{\omega : H_0(\omega)\in A\}\big) \cup \bigcup_{i\,:\,t_i \le t} \big((t_i\wedge t, t_{i+1}\wedge t] \times \{\omega : H_i(\omega)\in A\}\big).$$

Since the sets $\{0\}$ and $(t_i\wedge t, t_{i+1}\wedge t]$ belong to $\mathcal{B}([0,t])$, and $\{\omega : H_i(\omega)\in A\} \in \mathcal{F}_{t_i} \subset \mathcal{F}_t$ for $t_i \le t$, the claim now follows from the fact that the product of two measurable sets is measurable in the product space.

Exercise 23. Let $s < t \le u < v$ and let $(H_r)$ and $(K_r)$ be two elementary processes. We define $\int_s^t H_r\,dB_r = \int_0^t H_r\,dB_r - \int_0^s H_r\,dB_r$. Prove that

$$E\Big[\int_s^t H_r\, dB_r \int_u^v K_r\, dB_r\Big] = 0.$$

Solution: Recall that the stochastic process $(\int_0^t H_r\,dB_r,\, t \ge 0)$ is measurable with respect to $\mathcal{F}_t$ and is a martingale. We use the tower property to obtain the following:

$$E\Big[\int_s^t H_r\,dB_r \int_u^v K_r\,dB_r\Big] = E\Big[E\Big[\int_s^t H_r\,dB_r \int_u^v K_r\,dB_r \,\Big|\, \mathcal{F}_u\Big]\Big] = E\Big[\int_s^t H_r\,dB_r\; E\Big[\int_u^v K_r\,dB_r \,\Big|\, \mathcal{F}_u\Big]\Big] = 0,$$

because the stochastic integral is a martingale, and hence the inner conditional expectation vanishes.

Exercise 24. Let $f : \mathbb{R}_+ \to \mathbb{R}$ be a differentiable function with $f' \in L^1([0,1])$. Let $\Delta_n : 0 = t_0^{(n)} < t_1^{(n)} < \dots < t_{N_n}^{(n)} = t$ be a sequence of partitions of $[0,t]$ with mesh $|\Delta_n| \to 0$. Prove that

$$\lim_{n\to\infty} \sum_{j=1}^{N_n} \big(f(t_j^{(n)}) - f(t_{j-1}^{(n)})\big)^2 = 0.$$

Hint: $f$ is uniformly continuous on $[0,t]$ and $f(t) - f(s) = \int_s^t f'(r)\, dr$.

Solution: We have the simple estimate

$$\sum_{j=1}^{N_n} \big(f(t_j^{(n)}) - f(t_{j-1}^{(n)})\big)^2 \le \max_j \big|f(t_j^{(n)}) - f(t_{j-1}^{(n)})\big|\; \sum_{j=1}^{N_n} \big|f(t_j^{(n)}) - f(t_{j-1}^{(n)})\big|.$$

Firstly, we have $\max_j |f(t_j^{(n)}) - f(t_{j-1}^{(n)})| \to 0$ as $n \to \infty$, because of the uniform continuity of $f$ on $[0,t]$. Secondly, we can estimate

$$\sum_{j=1}^{N_n} \big|f(t_j^{(n)}) - f(t_{j-1}^{(n)})\big| = \sum_{j=1}^{N_n} \Big|\int_{t_{j-1}^{(n)}}^{t_j^{(n)}} f'(r)\, dr\Big| \le \sum_{j=1}^{N_n} \int_{t_{j-1}^{(n)}}^{t_j^{(n)}} |f'(r)|\, dr = \int_0^t |f'(r)|\, dr < \infty,$$

which follows from the properties of $f'$. From these two facts the claim follows.
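The vanishing of the squared-increment sums for a smooth function can be observed numerically. A sketch assuming NumPy; the test function $f(r) = \sin(3r)$ and the partition sizes are illustrative choices:

```python
import numpy as np

# Numeric illustration of Exercise 24: for a differentiable f with integrable
# derivative, the sum of squared increments over a uniform partition of [0, 1]
# with mesh 1/N tends to 0 (here at rate 1/N).

def sq_incr_sum(f, N):
    grid = np.linspace(0.0, 1.0, N + 1)
    return float(np.sum(np.diff(f(grid)) ** 2))

f = lambda r: np.sin(3.0 * r)
v_coarse = sq_incr_sum(f, 100)
v_fine = sq_incr_sum(f, 10_000)
print(v_coarse, v_fine)   # the finer partition gives a much smaller sum
```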

5 Martingales and Optional Stopping Theorem

Let $(\mathcal{F}_t)$ be a filtration satisfying the usual assumptions, unless otherwise stated.

Exercise 25. Use the super-martingale convergence theorem to prove the following statement. Let $(X_n)$ be a sub-martingale with $\sup_n E(X_n^+) < \infty$. Then $\lim_n X_n$ exists almost surely.

Solution: The process $Y_n = -X_n$ is a super-martingale. Furthermore, $\sup_n E(Y_n^-) = \sup_n E(X_n^+) < \infty$. Thus, by the super-martingale convergence theorem, the limit $\lim_n Y_n = -\lim_n X_n$ exists almost surely.

Exercise 26. Let $(M_t, t \le t_0)$ be a continuous local martingale with $M_0 = 0$. Prove that $(M_t, t \le t_0)$ is a martingale if $\sup_{t\le t_0} |M_t| \in L^1$.

Solution: Since $M_0 = 0$, there is a sequence of stopping times $T_n \uparrow \infty$ almost surely such that each $(M_{t\wedge T_n})$ is a martingale. Hence, for any $t > s$, we have $E[M_{t\wedge T_n} \mid \mathcal{F}_s] = M_{s\wedge T_n} \to M_s$ almost surely as $n \to \infty$. Observing that $|M_{t\wedge T_n}| \le \sup_{t\le t_0}|M_t|$, the condition $\sup_{t\le t_0}|M_t| \in L^1$ allows us to apply the dominated convergence theorem (in its conditional form) to obtain

$$E[M_{t\wedge T_n} \mid \mathcal{F}_s] \to E[M_t \mid \mathcal{F}_s], \quad \text{almost surely, as } n \to \infty.$$

This shows that $E[M_t \mid \mathcal{F}_s] = M_s$, which means that $(M_t)$ is a martingale.

Exercise 27. Let $(B_t)_{t\ge 0}$ be a one dimensional Brownian motion starting at $x$ ($B_0 = x$), and let $a < x < b$. In this question we are going to find the probability of the Brownian motion hitting $b$ before $a$, using the Optional Stopping Theorem (OST). Set $T_a = \inf\{t : B_t = a\}$, $T_b = \inf\{t : B_t = b\}$, and $T = T_a \wedge T_b$.

(a) Give an easy argument why $T_a$, $T_b$ and $T$ are all stopping times with respect to the natural filtration of the Brownian motion.

(b) One would like to compute $E[B_T]$ using the OST, but $(B_t, t \ge 0)$ is not a uniformly integrable martingale, and applying the OST would require $T$ to be a bounded stopping time. Instead we use a limiting argument.

(b1) For $n \in \mathbb{N}$, use the OST to prove that $E[B_{T\wedge n}] = x$.

(b2) Conclude that $E[B_T] = x$.

(c) Compute $P(T_a > T_b)$.

Solution: (a) $T_a$ is a stopping time since it is the hitting time by $(B_t)$ of the closed set $\{a\}$, and the same can be said for $T_b$. That $T = T_a \wedge T_b$ is a stopping time follows from Exercise 18.

(b1) Note that both $0$ and $T \wedge n$ are bounded stopping times. Hence by the OST, $E[B_{T\wedge n} \mid \mathcal{F}_0] = B_0 = x$ almost surely. Taking the expectation of both sides and using the tower law, we get $E[B_{T\wedge n}] = x$.

(b2) It is clear that pointwise $B_{T\wedge n} \to B_T$ as $n \to \infty$. Moreover, $B_{T\wedge n}$ is bounded between $a$ and $b$, hence we can use the dominated convergence theorem to conclude

$$E[B_T] = \lim_n E[B_{T\wedge n}] = x.$$

(c) Denote $P(T_a > T_b)$ by $p$. Then

$$x = E[B_T] = E\big[B_{T_b}1_{\{T_a>T_b\}} + B_{T_a}1_{\{T_b>T_a\}}\big] = E\big[b\,1_{\{T_a>T_b\}} + a\,1_{\{T_b>T_a\}}\big] = bp + a(1-p).$$

From here we deduce that

$$P(T_a > T_b) = \frac{x-a}{b-a}.$$

Exercise 28. If $(M_t, t \in I)$ is a martingale and $S$ and $T$ are two stopping times with $T$ bounded, prove that

$$M_{S\wedge T} = E\{M_T \mid \mathcal{F}_S\}.$$

Solution: We can write

$$E\{M_T \mid \mathcal{F}_S\} = E\{1_{T<S} M_T \mid \mathcal{F}_S\} + E\{1_{T\ge S} M_T \mid \mathcal{F}_S\}.$$

Note that $1_{T<S}M_T$ is $\mathcal{F}_S$-measurable. Thus, for the first term we have

$$E\{1_{T<S}M_T \mid \mathcal{F}_S\} = 1_{T<S}M_T = 1_{T<S}M_{S\wedge T}.$$

One can see that $\{T \ge S\} \in \mathcal{F}_{S\wedge T}$. Indeed, for any $t$ one has

$$\{T \ge S\} \cap \{S\wedge T \le t\} = \{S \le T\} \cap \{S \le t\} = \big(\{S \le T\} \cap \{T \le t\}\big) \cup \big(\{S \le t\} \cap \{T > t\}\big) \in \mathcal{F}_t,$$

because every set on the right belongs to $\mathcal{F}_t$. Thus, we conclude

$$E\{1_{T\ge S}M_T \mid \mathcal{F}_S\} = E\{1_{T\ge S}M_T \mid \mathcal{F}_{S\wedge T}\} = 1_{T\ge S}\,E\{M_T \mid \mathcal{F}_{S\wedge T}\} = 1_{T\ge S}\,M_{S\wedge T},$$

where we have used Doob's optional stopping theorem. Combining all these equalities together, we obtain the claim.
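The hitting probability $(x-a)/(b-a)$ from Exercise 27(c) can be checked by Monte Carlo. A sketch assuming NumPy; a symmetric random walk on an integer grid has the same hitting probability (gambler's ruin), so it serves as a discrete proxy for Brownian motion here. The grid units and number of paths are illustrative choices:

```python
import numpy as np

# Monte Carlo check of Exercise 27(c): P(T_a > T_b) = (x - a)/(b - a),
# via the gambler's-ruin identity for a simple symmetric random walk.

rng = np.random.default_rng(4)
a, b, x = 0, 20, 6                 # in grid units: (x - a)/(b - a) = 0.3
n_paths = 100_000
pos = np.full(n_paths, x)
active = (pos > a) & (pos < b)
while active.any():
    pos[active] += rng.choice([-1, 1], size=int(active.sum()))
    active = (pos > a) & (pos < b)
p_emp = float(np.mean(pos == b))   # fraction of paths absorbed at b
print(p_emp, (x - a) / (b - a))    # both should be about 0.3
```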

Exercise 29. Prove the following:

(1) A positive right continuous local martingale $(M_t, t \ge 0)$ with $M_0 = 1$ is a super-martingale.

(2) A positive right continuous local martingale $(M_t, t \ge 0)$ is a martingale if $E M_0 < \infty$ and $EM_t = EM_0$ for all $t > 0$.

Solution: (1) Since $EM_0 < \infty$, there is a sequence of increasing stopping times $T_n \uparrow \infty$ such that each process $M_{t\wedge T_n}$ is a martingale; in particular, $M_{t\wedge T_n}$ is integrable. For $s < t$, we use the conditional Fatou's lemma to obtain

$$E[M_t \mid \mathcal{F}_s] = E[\liminf_n M_{t\wedge T_n} \mid \mathcal{F}_s] \le \liminf_n E[M_{t\wedge T_n} \mid \mathcal{F}_s] = \liminf_n M_{s\wedge T_n} = M_s.$$

(2) Let $S \le T$ be two stopping times, bounded by a constant $K$. Then, from (1) and the optional sampling theorem for super-martingales,

$$EM_0 \ge EM_S \ge EM_T \ge EM_K = EM_0,$$

which implies $EM_T = EM_S$. The fact that $EM_T = EM_S$ for any two bounded stopping times $S \le T$ implies that $(M_t)$ is a martingale, completing the proof.

Exercise 30. Let $(M_t, t \ge 0)$ and $(N_t, t \ge 0)$ be continuous local martingales with $M_0 = 0$ and $N_0 = 0$.

(1) Let $(A_t)$ and $(A'_t)$ be two continuous stochastic processes of finite variation with initial values $0$ and such that $(M_tN_t - A_t)$ and $(M_tN_t - A'_t)$ are local martingales. Prove that $(A_t)$ and $(A'_t)$ are indistinguishable.

(2) Prove that $\langle M, N\rangle_t$ is symmetric in $(M_t)$ and $(N_t)$ and is bilinear.

(3) Prove that $\langle M, N\rangle_t = \frac{1}{4}\big(\langle M+N, M+N\rangle_t - \langle M-N, M-N\rangle_t\big)$.

(4) Prove that $\langle M - M_0, N - N_0\rangle_t = \langle M, N\rangle_t$.

(5) Let $T$ be a stopping time; prove that $\langle M^T, N^T\rangle = \langle M, N^T\rangle = \langle M, N\rangle^T$.

Solution: (1) We use the following theorem: if $(M_t, t \le T)$ is a continuous local martingale with $M_0 = 0$ and $(M_t, t \le T)$ has finite total variation, then $M_t = 0$ for any $t \le T$. The process

$$A_t - A'_t = (M_tN_t - A'_t) - (M_tN_t - A_t)$$

is a continuous local martingale, and at the same time a process of finite total variation. Then $A_t - A'_t = A_0 - A'_0 = 0$.

(2) The symmetry $\langle M,N\rangle=\langle N,M\rangle$ comes from that of the product: $MN=NM$. Next we prove
$$\langle M^1+M^2, N\rangle = \langle M^1,N\rangle + \langle M^2,N\rangle.$$
The process
$$(M^1+M^2)N - \big(\langle M^1,N\rangle+\langle M^2,N\rangle\big) = \big(M^1N - \langle M^1,N\rangle\big) + \big(M^2N - \langle M^2,N\rangle\big)$$
is a local martingale, so the claim follows from the uniqueness of the bracket process. Similarly, for $k\in\mathbb R$, $kMN - k\langle M,N\rangle$ is a local martingale and hence $\langle kM,N\rangle = k\langle M,N\rangle$.
(3) is a consequence of the bilinearity and symmetry proved earlier.
(4) Since $(M_tN_0, t\ge 0)$ is a local martingale, the bracket process $\langle M,N_0\rangle$ vanishes. By the same reason $\langle M_0,N\rangle = \langle M_0,N_0\rangle = 0$. Hence the claim follows from bilinearity.
(5) By definition, $M^TN^T - \langle M^T,N^T\rangle$ and $(MN)^T - \langle M,N\rangle^T$ are local martingales, and $(MN)^T = M^TN^T$; this implies $\langle M^T,N^T\rangle = \langle M,N\rangle^T$ from the uniqueness of the bracket process. Furthermore, both $M^TN^T - \langle M,N\rangle^T$ and $(M-M^T)N^T$ are local martingales, hence their sum $MN^T - \langle M,N\rangle^T$ is a local martingale as well. This implies $\langle M,N^T\rangle = \langle M,N\rangle^T$, again from the uniqueness of the bracket process.

Exercise 31 Let $(M_t, t\in[0,1])$ and $(N_t, t\in[0,1])$ be bounded continuous martingales with $M_0=N_0=0$. If $(M_t)$ and $(N_t)$ are furthermore independent, prove that their bracket process $\langle M,N\rangle_t$ vanishes.

Solution: Let us take a partition $0=t_0<t_1<\dots<t_n=1$. By independence, and then by the orthogonality of martingale increments ($E[(M_{t_i}-M_{t_{i-1}})(M_{t_j}-M_{t_{j-1}})]=0$ for $i\ne j$), we have
$$E\Big[\Big(\sum_{i=1}^n (M_{t_i}-M_{t_{i-1}})(N_{t_i}-N_{t_{i-1}})\Big)^2\Big] = \sum_{i,j=1}^n E\big[(M_{t_i}-M_{t_{i-1}})(M_{t_j}-M_{t_{j-1}})\big]\,E\big[(N_{t_i}-N_{t_{i-1}})(N_{t_j}-N_{t_{j-1}})\big]$$
$$= \sum_{i=1}^n E\big[(M_{t_i}-M_{t_{i-1}})^2\big]\,E\big[(N_{t_i}-N_{t_{i-1}})^2\big] = \sum_{i=1}^n E\big[M_{t_i}^2-M_{t_{i-1}}^2\big]\,E\big[N_{t_i}^2-N_{t_{i-1}}^2\big]$$
$$\le \Big(\sum_{i=1}^n E\big[M_{t_i}^2-M_{t_{i-1}}^2\big]\Big)\,\max_j E\big[N_{t_j}^2-N_{t_{j-1}}^2\big] \le E[M_1^2]\; E\Big[\max_j \big|N_{t_j}^2-N_{t_{j-1}}^2\big|\Big].$$
Because $t\mapsto N_t^2$ is uniformly continuous on $[0,1]$, $\max_j|N_{t_j}^2-N_{t_{j-1}}^2|\to 0$ almost surely as the mesh of the partition tends to $0$; since $N^2$ is bounded, $E[\max_j|N_{t_j}^2-N_{t_{j-1}}^2|]\to 0$ by dominated convergence. Since the bracket process is the limit in probability of $\sum_{i}(M_{t_i}-M_{t_{i-1}})(N_{t_i}-N_{t_{i-1}})$ along such partitions, this implies $\langle M,N\rangle_t = 0$ (for $t<1$ apply the same argument on $[0,t]$).

Exercise 32 Prove that: (1) for almost all $\omega$, the Brownian path $t\mapsto B_t(\omega)$ has infinite total variation on any interval $[a,b]$ with $a<b$; (2) for almost all $\omega$, $t\mapsto B_t(\omega)$ is not Hölder continuous of order $\alpha>\frac12$ on any interval.

Solution: (1) We know that $t$ is the quadratic variation of $(B_t)$. We can choose a sequence of partitions $\{t_j^{(n)}\}_{j=0,\dots,M_n}$ of $[a,b]$, with mesh tending to $0$, such that for almost all $\omega$,
$$\sum_{j=1}^{M_n}\big(B_{t_j^{(n)}}(\omega)-B_{t_{j-1}^{(n)}}(\omega)\big)^2 \longrightarrow b-a \tag{4}$$
as $n\to\infty$. Also, for every $\omega$,
$$\sum_{j=1}^{M_n}\big(B_{t_j^{(n)}}-B_{t_{j-1}^{(n)}}\big)^2 \le \max_j\big|B_{t_j^{(n)}}-B_{t_{j-1}^{(n)}}\big|\; \sum_{j=1}^{M_n}\big|B_{t_j^{(n)}}-B_{t_{j-1}^{(n)}}\big| \le \max_j\big|B_{t_j^{(n)}}-B_{t_{j-1}^{(n)}}\big|\;\mathrm{TV}_{[a,b]}(B(\omega)).$$
If $\mathrm{TV}_{[a,b]}(B(\omega))$ were finite then, since $t\mapsto B_t(\omega)$ is uniformly continuous on $[a,b]$, $\max_j|B_{t_j^{(n)}}-B_{t_{j-1}^{(n)}}|\to 0$ and the left-hand side would tend to $0$. But for almost all $\omega$ this limit is $b-a>0$; hence $\mathrm{TV}_{[a,b]}(B(\omega))=\infty$ for almost all $\omega$.
(2) Suppose that for some $\alpha>\frac12$ and some number $C$, both possibly depending on $\omega$,
$$|B_t(\omega)-B_s(\omega)|\le C(\omega)\,|t-s|^\alpha, \qquad s,t\in[a,b].$$
Taking the uniform partitions $t_j^{(n)} = a + j\frac{b-a}{n}$, $j=0,\dots,n$, we have
$$\sum_{j=1}^n \big(B_{t_j^{(n)}}-B_{t_{j-1}^{(n)}}\big)^2 \le C^2\sum_{j=1}^n\Big(\frac{b-a}{n}\Big)^{2\alpha} = C^2(b-a)^{2\alpha}\,n^{1-2\alpha}\longrightarrow 0,$$
since $2\alpha-1>0$, which contradicts (4) unless $b-a=0$. Hence for almost all $\omega$ the Brownian path is not Hölder continuous of order $\alpha>1/2$ on any interval $[a,b]$.
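The phenomena in Exercises 31 and 32 are easy to see on a grid. In the sketch below (grid sizes illustrative), the cross-variation sum of two independent discretised Brownian motions is near $0$, the quadratic variation sum is near $t=1$, while the absolute-increment sum, a lower bound for the total variation, is of order $\sqrt{n}$ and blows up as the mesh shrinks.

```python
import numpy as np

# Numerical illustration of Exercises 31 and 32 on a grid of mesh 1/n.
rng = np.random.default_rng(2)
n = 100_000
dM = rng.standard_normal(n) * np.sqrt(1 / n)   # increments of one BM
dN = rng.standard_normal(n) * np.sqrt(1 / n)   # increments of an independent BM
cross = (dM * dN).sum()    # Σ ΔM ΔN ≈ 0          (Exercise 31)
qv = (dM ** 2).sum()       # Σ (ΔM)² ≈ t = 1      (quadratic variation)
tv = np.abs(dM).sum()      # Σ |ΔM| ≈ sqrt(2n/π)  (diverges: Exercise 32)
```

Here `tv` is already in the hundreds for $n=10^5$, consistent with the total variation being infinite in the limit.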

6 Local Martingales and Stochastic Integration

If $M\in\mathcal H^2$, $L^2(M)$ denotes the $L^2$ space of progressively measurable processes $f$ with the norm
$$\|f\|_{L^2(M)} := \Big(E\int_0^\infty (f_s)^2\, d\langle M,M\rangle_s\Big)^{1/2} < \infty.$$
If $(M_t)$ is a continuous local martingale, we denote by $L^2_{\mathrm{loc}}(M)$ the space of progressively measurable processes $(K_s)$ with $\int_0^t (K_s)^2\, d\langle M,M\rangle_s<\infty$ for every $t$.

Exercise 33 Let $(B_t)$ be a standard Brownian motion and let $f_s, g_s\in L^2_{\mathrm{loc}}(B)$. Compute the indicated bracket processes; the final expression should not involve stochastic integrals.
1. $\big\langle \int_0^\cdot f_s\, dB_s,\ \int_0^\cdot g_s\, dB_s\big\rangle_t$.
2. $\big\langle \int_0^\cdot B_s\, dB_s,\ \int_0^\cdot B_s^3\, dB_s\big\rangle_t$.

Solution: (1) By the characterising property of the stochastic integral,
$$\Big\langle \int_0^\cdot f_s\, dB_s,\ \int_0^\cdot g_s\, dB_s\Big\rangle_t = \int_0^t f_sg_s\, d\langle B\rangle_s = \int_0^t f_sg_s\, ds.$$
(2) Similarly,
$$\Big\langle \int_0^\cdot B_s\, dB_s,\ \int_0^\cdot B_s^3\, dB_s\Big\rangle_t = \int_0^t B_s^4\, d\langle B\rangle_s = \int_0^t B_s^4\, ds.$$
We also record the following identities, used in Exercise 40:
$$I_t = \int_0^t (2B_s+1)\, d\Big(\int_0^s B_r\, d(B_r+r)\Big) = \int_0^t (2B_s+1)B_s\, d(B_s+s) = 2\int_0^t B_s^2\, dB_s + \int_0^t B_s\, dB_s + \int_0^t (2B_s^2+B_s)\, ds.$$

Exercise 34 We say that a stochastic process belongs to $C^\alpha$ if its sample paths are locally Hölder continuous of order $\gamma$ for every $\gamma<\alpha$. Let $(H_t, t\le 1)$ be an adapted continuous stochastic process, bounded in $L^p$ for some $p>2$, with $H\in L^2([0,1]\times\Omega)$. Let $(B_t)$ be a one-dimensional Brownian motion and set $M_t=\int_0^t H_s\, dB_s$.

(a) Prove that $(M_t)$ belongs to $C^{\frac12-\frac1p}$. (b) If $(H_t)$ is a bounded process, i.e. $\sup_{(t,\omega)}|H_t(\omega)|<\infty$, prove that $(M_t)$ belongs to $C^{\frac12}$.

Proof: First note that the bracket process of $s\mapsto\int_0^s H_r\, dB_r$ is $\int_0^s (H_r)^2\, dr$.
(1) Since $H$ is $L^p$ bounded, there exists $C$ such that $\sup_{s\in[0,1]}E(|H_s|^p)<C$. By the Burkholder–Davis–Gundy inequality and Hölder's inequality,
$$E|M_t-M_s|^p \le c_p\, E\Big(\int_s^t (H_r)^2\, dr\Big)^{p/2} \le c_p\,(t-s)^{\frac p2-1}\, E\int_s^t |H_r|^p\, dr \le C c_p\,(t-s)^{\frac p2}.$$
By Kolmogorov's continuity criterion, $(M_t)$ has a modification which is Hölder continuous of any order $\gamma < \frac1p\big(\frac p2-1\big) = \frac12-\frac1p$.
(2) Similarly, for any $p>2$,
$$E|M_t-M_s|^p \le c_p\, E\Big(\int_s^t (H_r)^2\, dr\Big)^{p/2} \le c_p\,(t-s)^{p/2}\sup_{r\in[0,1],\,\omega\in\Omega}|H_r(\omega)|^p,$$
so $(M_t)$ has a modification which is Hölder continuous of any order $\gamma<\frac12-\frac1p$, for every $p>2$. Letting $p\to\infty$ gives the second assertion, again by Kolmogorov's continuity theorem.

Exercise 35 Let $T>0$. Let $f$ be a left continuous adapted process such that $E\int_0^T (f_s)^2\, ds<\infty$. Prove that $\big(\int_0^t f_s\, dB_s, t\le T\big)$ is a martingale.

Solution: We know that $\big(\int_0^t f_s\, dB_s, t\le T\big)$ is a local martingale. Furthermore, by the Burkholder–Davis–Gundy inequality,
$$E\Big(\int_0^t f_s\, dB_s\Big)^2 \le c_2\, E\int_0^t (f_s)^2\, ds \le c_2\, E\int_0^T (f_s)^2\, ds < \infty.$$
It is therefore an $L^2$ bounded local martingale on $[0,T]$, and so a martingale.

Exercise 36 Let $M\in\mathcal H^2$ and $H\in L^2(M)$. Let $\tau$ be a stopping time. Prove that
$$\int_0^{t\wedge\tau} H_s\, dM_s = \int_0^t 1_{\{s\le\tau\}}H_s\, dM_s = \int_0^t H_s\, dM^\tau_s.$$

Solution: For any $N\in\mathcal H^2$, we use Exercise 30 (5) to obtain the identities
$$\int_0^t H_s\, d\langle M^\tau,N\rangle_s = \int_0^t H_s\, d\langle M,N\rangle^\tau_s = \int_0^t 1_{\{s\le\tau\}}H_s\, d\langle M,N\rangle_s = \int_0^{t\wedge\tau} H_s\, d\langle M,N\rangle_s.$$
The claim now follows from the fact that the stochastic integral $I_t = \int_0^t H_s\, dM^\tau_s$ is uniquely determined by the identities
$$\langle I,N\rangle_t = \int_0^t H_s\, d\langle M^\tau,N\rangle_s, \qquad t\ge 0,$$
for all $N\in\mathcal H^2$.

Exercise 37 Let $M\in\mathcal H^2$ and $K\in\mathcal E$. Prove that the elementary integral $I_t := \int_0^t K_s\, dM_s$ satisfies
$$\langle I,N\rangle_t = \int_0^t K_s\, d\langle M,N\rangle_s, \qquad t\ge 0,$$
for any $N\in\mathcal H^2$. Solution: See lecture notes.

Exercise 38 Let $B_t=(B_t^1,\dots,B_t^n)$ be an $n$-dimensional Brownian motion. Prove that
$$|B_t|^2 = 2\sum_{i=1}^n\int_0^t B_s^i\, dB_s^i + nt.$$
Solution: Either write $|B_t|^2=\sum_i (B_t^i)^2$, apply the product formula to each component $(B_t^i)$ and sum up, or use the multi-dimensional version of the Itô formula. Let us consider the function $f:\mathbb R^n\to\mathbb R$, $f(x)=|x|^2$. Its derivatives are $\partial_i f(x)=2x_i$ and $\partial^2_{ij}f(x)=2\delta_{ij}$. Since $\langle B^i,B^j\rangle_s=\delta_{ij}\,s$, applying the Itô formula we obtain
$$|B_t|^2 = f(B_t) = f(B_0) + \sum_{i=1}^n\int_0^t \partial_i f(B_s)\, dB_s^i + \frac12\sum_{i,j=1}^n\int_0^t \partial^2_{ij}f(B_s)\, d\langle B^i,B^j\rangle_s = 2\sum_{i=1}^n\int_0^t B_s^i\, dB_s^i + nt,$$
which is the required identity.
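The identity of Exercise 38 can be checked pathwise on a grid: replacing the Itô integrals by left-point Riemann sums $2\sum_i\sum_j B^i_{t_j}(B^i_{t_{j+1}}-B^i_{t_j}) + nt$ should approximate $|B_t|^2$. A sketch, with illustrative grid size and dimension:

```python
import numpy as np

# Pathwise check of |B_t|^2 = 2 Σ_i ∫ B^i dB^i + n t using left-point sums.
rng = np.random.default_rng(4)
n_dim, n_steps, t = 3, 200_000, 1.0
dB = rng.standard_normal((n_dim, n_steps)) * np.sqrt(t / n_steps)
B = np.cumsum(dB, axis=1)
B_left = np.hstack([np.zeros((n_dim, 1)), B[:, :-1]])   # B at left endpoints
ito_sum = 2 * (B_left * dB).sum() + n_dim * t            # discrete RHS
lhs = (B[:, -1] ** 2).sum()                              # |B_t|^2
err = abs(ito_sum - lhs)
```

The discrepancy is exactly $nt-\sum_{i,j}(\Delta B^i_j)^2$, which is small for a fine grid; it is the discrete trace of the Itô correction.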

Exercise 39 Give an expression for $\int_0^t s\, dB_s$ that does not involve stochastic integrals.
Solution: Applying the classical integration by parts formula, we obtain
$$\int_0^t s\, dB_s = tB_t - \int_0^t B_s\, ds.$$

Exercise 40 Write $I_t = \int_0^t (2B_s+1)\, d\big(\int_0^s B_r\, d(B_r+r)\big)$ as a function of $(B_t)$, not involving stochastic integrals.
Solution: Firstly,
$$I_t = \int_0^t (2B_s+1)B_s\, d(B_s+s) = 2\int_0^t B_s^2\, dB_s + \int_0^t B_s\, dB_s + \int_0^t (2B_s^2+B_s)\, ds.$$
From the Itô formula for the function $x^3$ we obtain
$$\int_0^t B_s^2\, dB_s = \frac13 B_t^3 - \int_0^t B_s\, ds,$$
and the Itô formula for the function $x^2$ gives
$$\int_0^t B_s\, dB_s = \frac12\big(B_t^2 - t\big).$$
Substituting these stochastic integrals into the first identity, we get
$$I_t = \frac23 B_t^3 + \frac12\big(B_t^2 - t\big) + 2\int_0^t B_s^2\, ds - \int_0^t B_s\, ds.$$

Exercise 41 If $(H_t)$ and $(K_t)$ are continuous martingales, prove that
$$\langle HK, B\rangle_t = \int_0^t H_r\, d\langle K,B\rangle_r + \int_0^t K_r\, d\langle H,B\rangle_r.$$

Solution: The product formula gives
$$H_tK_t = H_0K_0 + \int_0^t H_r\, dK_r + \int_0^t K_r\, dH_r + \langle H,K\rangle_t.$$
The terms $H_0K_0$ and $\langle H,K\rangle_t$ have finite variation and so do not contribute to the bracket with $B$. Hence, by the characterising property of the stochastic integral (Exercise 37),
$$\langle HK,B\rangle_t = \Big\langle \int_0^\cdot H_r\, dK_r + \int_0^\cdot K_r\, dH_r,\ B\Big\rangle_t = \int_0^t H_r\, d\langle K,B\rangle_r + \int_0^t K_r\, d\langle H,B\rangle_r,$$
which is the required identity.
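The integration by parts identity of Exercise 39 can also be verified pathwise on a grid, where it reduces to a discrete summation by parts. A sketch (grid size illustrative):

```python
import numpy as np

# Pathwise check of ∫_0^t s dB_s = t B_t - ∫_0^t B_s ds via left-point sums.
rng = np.random.default_rng(5)
n, t = 100_000, 1.0
ds = t / n
s = np.arange(n) * ds                          # left endpoints s_j
dB = rng.standard_normal(n) * np.sqrt(ds)
B = np.cumsum(dB)
B_left = np.concatenate(([0.0], B[:-1]))       # B at left endpoints
ito_sum = (s * dB).sum()                       # ≈ ∫_0^t s dB_s
parts = t * B[-1] - (B_left * ds).sum()        # t B_t - ∫_0^t B_s ds
err = abs(ito_sum - parts)
```

Abel summation shows the two discrete sums differ by exactly $\Delta s\cdot B_t$, so the error here is of order $10^{-5}$.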

7 Itô's Formula

Exercise 42 If $(N_t)$ is a continuous local martingale with $N_0=0$, show that $\big(e^{N_t-\frac12\langle N,N\rangle_t}\big)$ is a local martingale, and that $E\big(e^{N_t-\frac12\langle N,N\rangle_t}\big)\le 1$.
Solution: Let us denote $X_t = e^{N_t-\frac12\langle N,N\rangle_t}$. The process $N_t-\frac12\langle N,N\rangle_t$ is a semimartingale, and applying the Itô formula with the function $e^x$, the two $\langle N\rangle$ terms cancel:
$$X_t = 1 + \int_0^t X_s\, dN_s.$$
Since the stochastic integral is a local martingale, we conclude that $(X_t)$ is a local martingale. Moreover, let $T_n$ be an increasing sequence of stopping times such that each $X^{T_n}_t$ is a martingale. Then, applying Fatou's lemma, we get
$$E[X_t] = E[\lim_n X^{T_n}_t] \le \liminf_n E[X^{T_n}_t] = E[X_0] = 1,$$
which is the required bound.

Exercise 43 Show that a positive continuous local martingale $(N_t)$ with $N_0=1$ can be written in the form $N_t = \exp\big(M_t - \frac12\langle M,M\rangle_t\big)$, where $(M_t)$ is a continuous local martingale.
Solution: Let us define $M_t = \int_0^t N_s^{-1}\, dN_s$, so that $\langle M\rangle_t = \int_0^t N_s^{-2}\, d\langle N\rangle_s$. Applying the Itô formula with the function $\log x$, we obtain
$$\log N_t = \log N_0 + \int_0^t N_s^{-1}\, dN_s - \frac12\int_0^t N_s^{-2}\, d\langle N\rangle_s = M_t - \frac12\langle M\rangle_t,$$
which implies the claim.

Exercise 44 Let $(B_t)$ be a one-dimensional Brownian motion on $(\Omega,\mathcal F,\mathcal F_t,P)$ and let $g:\mathbb R\to\mathbb R$ be a bounded Borel measurable function. Find a solution to
$$x_t = x_0 + \int_0^t x_s\, g(B_s)\, dB_s.$$
Solution: We define $M_t = \int_0^t g(B_s)\, dB_s$. Then $dx_t = x_t\, dM_t$, whose solution is given by the exponential (local) martingale: $x_t = x_0\, e^{M_t - \frac12\langle M\rangle_t}$.

Indeed, let $f(x)=e^x$ and $y_t = M_t - \frac12\langle M\rangle_t$. We see that
$$e^{y_t} = 1 + \int_0^t e^{y_s}\, dy_s + \frac12\int_0^t e^{y_s}\, d\langle M\rangle_s = 1 + \int_0^t e^{y_s}\, dM_s,$$
proving the claim.

Exercise 45 1. Let $f,g:\mathbb R\to\mathbb R$ be $C^2$ functions; prove that
$$\langle f(B), g(B)\rangle_t = \int_0^t f'(B_s)g'(B_s)\, ds.$$
2. Compute $\big\langle \exp\big(M-\frac12\langle M\rangle\big),\ \exp\big(N-\frac12\langle N\rangle\big)\big\rangle_t$, where $(M_t)$ and $(N_t)$ are continuous local martingales.
Solution: (1) By the Itô formula we have
$$f(B_t) = f(B_0) + \int_0^t f'(B_s)\, dB_s + \frac12\int_0^t f''(B_s)\, ds, \qquad g(B_t) = g(B_0) + \int_0^t g'(B_s)\, dB_s + \frac12\int_0^t g''(B_s)\, ds.$$
The finite variation parts do not contribute to the bracket; hence we obtain
$$\langle f(B), g(B)\rangle_t = \Big\langle \int_0^\cdot f'(B_s)\, dB_s,\ \int_0^\cdot g'(B_s)\, dB_s\Big\rangle_t = \int_0^t f'(B_s)g'(B_s)\, ds,$$
which is the required identity.
(2) Let us denote $X_t = \exp\big(M_t-\frac12\langle M\rangle_t\big)$ and $Y_t = \exp\big(N_t-\frac12\langle N\rangle_t\big)$. Applying the Itô formula with $e^x$ as in Exercise 42, we obtain
$$X_t = X_0 + \int_0^t X_s\, dM_s, \qquad Y_t = Y_0 + \int_0^t Y_s\, dN_s.$$
Hence we conclude
$$\langle X,Y\rangle_t = \Big\langle \int_0^\cdot X_s\, dM_s,\ \int_0^\cdot Y_s\, dN_s\Big\rangle_t = \int_0^t X_sY_s\, d\langle M,N\rangle_s.$$
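For Exercise 42 with the concrete choice $N_t=B_t$ (so $\langle N\rangle_t=t$), the exponential local martingale $e^{B_t-t/2}$ is a true martingale and has mean exactly $1$; a quick Monte Carlo sketch (sample size illustrative):

```python
import numpy as np

# Monte Carlo illustration of Exercise 42 with N = B: E[exp(B_t - t/2)] = 1.
rng = np.random.default_rng(6)
t, n = 1.0, 1_000_000
Bt = rng.standard_normal(n) * np.sqrt(t)       # B_t ~ N(0, t)
mean_exp = np.exp(Bt - t / 2).mean()           # should be close to 1
```

This is the lognormal identity $E[e^{\sigma Z}] = e^{\sigma^2/2}$ in disguise; for genuinely *strict* local martingales the Fatou bound of Exercise 42 can be a strict inequality.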

Exercise 46 Let $B_t=(B_t^1,\dots,B_t^n)$ be an $n$-dimensional Brownian motion on $(\Omega,\mathcal F,\mathcal F_t,P)$, with $n>1$, and let $x_0\ne 0$. Set $X_t = x_0 + B_t$. We know that for any $t>0$, $\{X_s\ne 0 \text{ for all } s\le t\}$ has probability one. Prove that the process $r_t = |X_t|$ is a solution of the equation
$$r_t = r_0 + \beta_t + \int_0^t \frac{n-1}{2r_s}\, ds, \qquad r_0 = |x_0|,$$
where $(\beta_t)$ is a one-dimensional Brownian motion.
Solution: Let us take the function $f(x)=|x|$, for $x\in\mathbb R^n$. Its derivatives, for $x\ne 0$, are given by
$$\frac{\partial f}{\partial x_i}(x) = \frac{x_i}{|x|}, \qquad \frac{\partial^2 f}{\partial x_i^2}(x) = \frac{|x|^2 - x_i^2}{|x|^3},$$
so $f$ is $C^2$ on $\mathbb R^n\setminus\{0\}$. Let $\tau$ be the first time that $X_t := x_0+B_t$ hits $0$; then $\tau=\infty$ almost surely, so $X_{t\wedge\tau} = X_t$ almost surely. Applying the Itô formula to $f$ and $X_{t\wedge\tau}$, we obtain
$$r_t = r_0 + \sum_{i=1}^n\int_0^t \frac{X_s^i}{|X_s|}\, dB_s^i + \frac12\int_0^t \sum_{i=1}^n\frac{|X_s|^2-(X_s^i)^2}{|X_s|^3}\, ds = r_0 + \sum_{i=1}^n\int_0^t \frac{X_s^i}{|X_s|}\, dB_s^i + \frac12\int_0^t \frac{n-1}{|X_s|}\, ds.$$
To finish the proof, we have to show that $\beta_t := \sum_{i=1}^n\int_0^t \frac{X_s^i}{|X_s|}\, dB_s^i$ is a Brownian motion. Firstly, it is a continuous local martingale starting at $0$. It has the quadratic variation
$$\langle\beta,\beta\rangle_t = \int_0^t \sum_{i=1}^n\frac{(X_s^i)^2}{|X_s|^2}\, ds = t.$$
Hence, the claim follows from the Lévy characterisation theorem.
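A numerical sanity check of Exercise 46 with $n=3$: along discretised paths of a three-dimensional Brownian motion started away from the origin, the process $\beta_t := r_t - r_0 - \int_0^t \frac{n-1}{2r_s}\,ds$ should behave like a one-dimensional Brownian motion, so its mean is near $0$ and its variance near $t$. A sketch; step and sample sizes are illustrative.

```python
import numpy as np

# Check that beta_t = r_t - r_0 - ∫ (n-1)/(2 r_s) ds has mean ≈ 0, var ≈ t.
rng = np.random.default_rng(7)
n_dim, n_paths, n_steps, t = 3, 5000, 400, 1.0
dt = t / n_steps
X = np.tile(np.array([2.0, 0.0, 0.0]), (n_paths, 1))   # start at |x0| = 2
beta = np.zeros(n_paths)
for _ in range(n_steps):
    r = np.linalg.norm(X, axis=1)
    X += rng.standard_normal((n_paths, n_dim)) * np.sqrt(dt)
    r_new = np.linalg.norm(X, axis=1)
    beta += r_new - r - (n_dim - 1) / (2 * r) * dt      # left-point Riemann sum
mean_beta, var_beta = beta.mean(), beta.var()
```

Starting at radius $2$ keeps the paths away from the origin over $[0,1]$, so the $1/r_s$ drift term stays well behaved in the discretisation.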

Exercise 47 Let $f:\mathbb R\to\mathbb R$ be $C^2$. Suppose that there is a constant $C>0$ such that $f\ge C$. Let
$$h(x) = \int_0^x \frac{1}{f(y)}\, dy.$$
Suppose that $\lim_{|x|\to\infty}|h(x)|=\infty$, and denote by $h^{-1}$ its inverse. Prove that $x_t := h^{-1}\big(h(x_0)+B_t\big)$ satisfies
$$x_t = x_0 + \int_0^t f(x_s)\, dB_s + \frac12\int_0^t f(x_s)f'(x_s)\, ds.$$
Solution: Let us denote $g(x) = h^{-1}\big(h(x_0)+x\big)$. Since $(h^{-1})' = 1/h'(h^{-1}) = f(h^{-1})$, the derivatives of $g$ are given by
$$g'(x) = f(g(x)), \qquad g''(x) = f'(g(x))\,f(g(x)).$$
The claim now follows by applying the Itô formula to $g(B_t)$.

Exercise 48 Let $(B_t)$ be a Brownian motion and $\tau$ its hitting time of the set $[2,\infty)$. Is the stochastic process $\big(\frac{1}{B_t-2},\ t<\tau\big)$ a solution of a stochastic differential equation? (The time $\tau$ is the natural lifetime of the process $\frac{1}{B_t-2}$.)
Solution: Note that $\tau = \inf\{t: B_t\in[2,\infty)\}$, and let $\tau_n$ be a sequence of stopping times increasing to $\tau$. Let $f(x)=(x-2)^{-1}$. Where $f$ is differentiable,
$$f'(x) = -(x-2)^{-2} = -f(x)^2, \qquad f''(x) = 2(x-2)^{-3} = 2f(x)^3.$$
Applying the Itô formula to $f$ and the stopped process, with $X_t := (B_t-2)^{-1}$ and $X_0 = -\frac12$, we obtain
$$X_{t\wedge\tau_n} = -\frac12 - \int_0^{t\wedge\tau_n} X_s^2\, dB_s + \int_0^{t\wedge\tau_n} X_s^3\, ds.$$
We see that on the set $\{t<\tau_n\}$,
$$X_t = -\frac12 - \int_0^t X_s^2\, dB_s + \int_0^t X_s^3\, ds,$$
which is the required equation. Since $\{t<\tau\}=\bigcup_n\{t<\tau_n\}$, the equation holds on $\{t<\tau\}$.

Exercise 49 Let $(X_t)$ be a continuous local martingale starting from $0$. Prove that $\langle X^2, X\rangle_t = 2\int_0^t X_s\, d\langle X,X\rangle_s$.
Solution: By the Itô formula, $X_t^2 = 2\int_0^t X_s\, dX_s + \langle X,X\rangle_t$. Since the finite variation part does not contribute to the bracket, consequently
$$\langle X^2, X\rangle_t = \Big\langle 2\int_0^\cdot X_s\, dX_s,\ X\Big\rangle_t = 2\int_0^t X_s\, d\langle X,X\rangle_s.$$
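Exercise 47 can be checked numerically for a concrete choice not taken from the text: $f(x)=\sqrt{1+x^2}$ (so $f\ge 1$, $h=\operatorname{arcsinh}$, $h^{-1}=\sinh$, and $ff'(x)=x$). The closed form $x_t=\sinh(\operatorname{arcsinh}(x_0)+B_t)$ is compared against an Euler–Maruyama discretisation of $dx = f(x)\,dB + \frac12 ff'(x)\,dt$ driven by the same Brownian increments. A sketch; the choice of $f$, step size, and tolerance are illustrative.

```python
import numpy as np

# Compare the transform solution of Exercise 47 with an Euler-Maruyama path,
# for f(x) = sqrt(1 + x^2), h = arcsinh, h^{-1} = sinh, f f'(x) = x.
rng = np.random.default_rng(8)
n, t, x0 = 20_000, 1.0, 0.5
dt = t / n
dB = rng.standard_normal(n) * np.sqrt(dt)
x_exact = np.sinh(np.arcsinh(x0) + dB.sum())       # h^{-1}(h(x0) + B_t)
x = x0
for db in dB:
    x += np.sqrt(1 + x * x) * db + 0.5 * x * dt    # Euler-Maruyama step
err = abs(x - x_exact)
```

The agreement is up to the strong discretisation error of the Euler scheme (order $\sqrt{\Delta t}$ for multiplicative noise).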

8 Lévy's Characterisation Theorem, SDEs

Exercise 50 Let $\sigma_k=(\sigma_k^1,\dots,\sigma_k^d)$, $k=1,\dots,m$, and $b=(b^1,\dots,b^d)$ be $C^2$ functions from $\mathbb R^d$ to $\mathbb R^d$. Let
$$L = \frac12\sum_{i,j=1}^d\Big(\sum_{k=1}^m \sigma_k^i\sigma_k^j\Big)\frac{\partial^2}{\partial x_i\partial x_j} + \sum_{l=1}^d b^l\frac{\partial}{\partial x_l}.$$
Suppose that for some $c>0$,
$$|\sigma(x)|^2 \le c(1+|x|^2), \qquad \langle b(x), x\rangle \le c(1+|x|^2).$$
(a) Let $f(x)=|x|^2+1$. Prove that $Lf\le af$ for some $a$.
(b) If $x_t=(x_t^1,\dots,x_t^d)$ is a stochastic process with values in $\mathbb R^d$, satisfying the relations
$$x_t^i = x_0^i + \sum_{k=1}^m\int_0^t \sigma_k^i(x_s)\, dB_s^k + \int_0^t b^i(x_s)\, ds,$$
and $f:\mathbb R^d\to\mathbb R$ is $C^2$, prove that $f(x_t) - f(x_0) - \int_0^t Lf(x_s)\, ds$ is a local martingale, and give the semimartingale decomposition of $f(x_t)$.
Solution: (a) The partial derivatives of $f$ are given by
$$\frac{\partial f}{\partial x_i}(x) = 2x_i, \qquad \frac{\partial^2 f}{\partial x_i\partial x_j}(x) = 2\delta_{ij}.$$
Hence, we obtain the bound
$$Lf(x) = \sum_{i=1}^d\sum_{k=1}^m (\sigma_k^i)^2 + 2\sum_{i=1}^d b^ix_i = |\sigma(x)|^2 + 2\langle b(x), x\rangle \le c(1+|x|^2) + 2c(1+|x|^2) \le 3c\,f(x),$$
which is the required bound with $a=3c$.
(b) Firstly, $\langle B^i, B^j\rangle_t = t$ if $i=j$ and vanishes otherwise. Hence
$$\langle x^i, x^j\rangle_t = \Big\langle \sum_{k=1}^m\int_0^\cdot \sigma_k^i(x_s)\, dB_s^k,\ \sum_{l=1}^m\int_0^\cdot \sigma_l^j(x_s)\, dB_s^l\Big\rangle_t = \sum_{k=1}^m\int_0^t \sigma_k^i(x_s)\sigma_k^j(x_s)\, ds.$$

By the Itô formula,
$$f(x_t) = f(x_0) + \sum_{i=1}^d\int_0^t \frac{\partial f}{\partial x_i}(x_s)\, dx_s^i + \frac12\sum_{i,j=1}^d\int_0^t \frac{\partial^2 f}{\partial x_i\partial x_j}(x_s)\, d\langle x^i,x^j\rangle_s$$
$$= f(x_0) + \sum_{i=1}^d\sum_{k=1}^m\int_0^t \frac{\partial f}{\partial x_i}(x_s)\,\sigma_k^i(x_s)\, dB_s^k + \sum_{i=1}^d\int_0^t \frac{\partial f}{\partial x_i}(x_s)\,b^i(x_s)\, ds + \frac12\sum_{i,j=1}^d\sum_{k=1}^m\int_0^t \frac{\partial^2 f}{\partial x_i\partial x_j}(x_s)\,\sigma_k^i(x_s)\sigma_k^j(x_s)\, ds$$
$$= f(x_0) + \sum_{k=1}^m\int_0^t\Big(\sum_{i=1}^d\frac{\partial f}{\partial x_i}(x_s)\,\sigma_k^i(x_s)\Big)\, dB_s^k + \int_0^t Lf(x_s)\, ds.$$
Thus $f(x_t) - f(x_0) - \int_0^t Lf(x_s)\, ds = \sum_{k=1}^m\int_0^t \sum_i\frac{\partial f}{\partial x_i}(x_s)\sigma_k^i(x_s)\, dB_s^k$ is a local martingale, and the semimartingale decomposition of $f(x_t)$ is as given in the identity above.

Exercise 51 Let $T$ be a bounded stopping time and $(B_t)$ a one-dimensional Brownian motion. Prove that $(B_{T+s}-B_T,\ s\ge 0)$ is a Brownian motion.
Solution: Let us denote $W_s = B_{T+s}-B_T$. It is obvious that $W_0=0$ and that $W$ has continuous sample paths. Moreover, by the optional stopping theorem we have, for any $s<t$,
$$E[W_t\,|\,\mathcal F_{T+s}] = E[B_{T+t}\,|\,\mathcal F_{T+s}] - B_T = B_{T+s} - B_T = W_s.$$
Let $\mathcal G_t = \mathcal F_{T+t}$; then $(W_t)$ is a $(\mathcal G_t)$-martingale. Next, we take $s<t$ and, using the fact that $B_{T+t}^2-(T+t)$ is a martingale (again by optional stopping), derive
$$E[W_t^2\,|\,\mathcal F_{T+s}] = E[B_{T+t}^2 + B_T^2 - 2B_TB_{T+t}\,|\,\mathcal F_{T+s}] = E[B_{T+t}^2-(T+t)\,|\,\mathcal F_{T+s}] + (T+t) + B_T^2 - 2B_TB_{T+s}$$
$$= \big(B_{T+s}^2-(T+s)\big) + (T+t) + B_T^2 - 2B_TB_{T+s} = W_s^2 + t - s.$$
This implies that $W_t^2 - t$ is a $(\mathcal G_t)$-martingale, and hence $\langle W,W\rangle_t = t$. Using the Lévy characterisation theorem, we conclude that $(W_t)$ is a $(\mathcal G_t)$-Brownian motion.

Exercise 52 Let $\{B_t, W_t^1, W_t^2\}$ be independent one-dimensional Brownian motions.

1. Let $(x_s)$ be an adapted continuous stochastic process. Prove that $(W_t)$ defined below is a Brownian motion:
$$W_t = \int_0^t \cos(x_s)\, dW_s^1 + \int_0^t \sin(x_s)\, dW_s^2.$$
2. Let $\mathrm{sgn}(x)=1$ if $x>0$ and $\mathrm{sgn}(x)=-1$ if $x\le 0$. Prove that $(W_t)$ is a Brownian motion if it satisfies the relation
$$W_t = \int_0^t \mathrm{sgn}(W_s)\, dB_s.$$
3. Prove that the process $(X_t, Y_t)$ is a Brownian motion if it satisfies the relations
$$X_t = \int_0^t 1_{\{X_s>Y_s\}}\, dW_s^1 + \int_0^t 1_{\{X_s\le Y_s\}}\, dW_s^2, \qquad Y_t = \int_0^t 1_{\{X_s\le Y_s\}}\, dW_s^1 + \int_0^t 1_{\{X_s>Y_s\}}\, dW_s^2.$$
Solution: (1) The process $(W_t)$ is a continuous local martingale with quadratic variation
$$\langle W\rangle_t = \int_0^t \big(\cos^2(x_s) + \sin^2(x_s)\big)\, ds = t.$$
Hence, by the Lévy characterisation theorem, we conclude that $(W_t)$ is a Brownian motion.
(2) Can be shown in the same way, since $\mathrm{sgn}^2 = 1$.
(3) The processes $(X_t)$ and $(Y_t)$ are continuous local martingales, and their quadratic variations are
$$\langle X\rangle_t = \int_0^t \big(1^2_{\{X_s>Y_s\}} + 1^2_{\{X_s\le Y_s\}}\big)\, ds = t, \qquad \langle Y\rangle_t = \int_0^t \big(1^2_{\{X_s\le Y_s\}} + 1^2_{\{X_s>Y_s\}}\big)\, ds = t.$$
The bracket process is equal to
$$\langle X,Y\rangle_t = \int_0^t 1_{\{X_s>Y_s\}}1_{\{X_s\le Y_s\}}\, ds + \int_0^t 1_{\{X_s\le Y_s\}}1_{\{X_s>Y_s\}}\, ds = 0,$$
where we have used the fact that $1_{\{X_s>Y_s\}}1_{\{X_s\le Y_s\}}=0$. The claim now follows from the (multi-dimensional) Lévy characterisation theorem.
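Part (2) above can be illustrated on a grid: building $W$ step by step from its own sign and the increments of $B$, the resulting $W_1$ should again be centred with variance $1$. A sketch; grid and sample sizes are illustrative.

```python
import numpy as np

# Monte Carlo illustration of Exercise 52 (2): W_t = ∫ sgn(W_s) dB_s.
rng = np.random.default_rng(9)
n_paths, n_steps = 20_000, 500
dt = 1.0 / n_steps
W = np.zeros(n_paths)
for _ in range(n_steps):
    dB = rng.standard_normal(n_paths) * np.sqrt(dt)
    sgn = np.where(W > 0, 1.0, -1.0)       # sgn(0) = -1, as in the exercise
    W += sgn * dB
mean_W, var_W = W.mean(), W.var()          # should be ≈ 0 and ≈ 1
```

Since $\mathrm{sgn}^2=1$ and the sign at the left endpoint is independent of the next increment, each discrete step contributes exactly $dt$ to the variance.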

Exercise 53 Let $t\in[0,1)$. Define $x_0=0$ and
$$x_t = (1-t)\int_0^t \frac{dB_s}{1-s}.$$
(1) Prove that
$$E\Big[\Big(\int_0^t \frac{dB_s}{1-s}\Big)^2\Big] = \frac{t}{1-t}.$$
(2) Prove that
$$\Big\langle \int_0^\cdot \frac{dB_s}{1-s}\Big\rangle_t = \frac{t}{1-t}.$$
Define $A(t)=\frac{t}{1-t}$; then $A^{-1}(r) = \frac{r}{r+1}$.
(3) Define
$$W_r = \int_0^{A^{-1}(r)} \frac{dB_s}{1-s}, \qquad 0\le r<\infty.$$
Let $\mathcal G_r = \mathcal F_{A^{-1}(r)}$. Prove that $(W_r)$ is a $(\mathcal G_r)$-martingale.
(4) For a standard one-dimensional Brownian motion $(\beta_t)$, $\lim_{t\to 0+} t\,\beta_{1/t}=0$, equivalently $\lim_{r\to\infty}\beta_r/r = 0$. Use this to prove that $\lim_{t\to 1} x_t = 0$.
(5) Prove that $x_t$ solves
$$x_t = B_t - \int_0^t \frac{x_s}{1-s}\, ds.$$
(6) Prove that $\int_0^\cdot \frac{x_s}{1-s}\, ds$ is of finite total variation on $[0,1]$. Hint: it is sufficient to prove that
$$\int_0^1 \frac{|x_s|}{1-s}\, ds < \infty$$
almost surely.
Solution: (1) By the Itô isometry we obtain
$$E\Big[\Big(\int_0^t \frac{dB_s}{1-s}\Big)^2\Big] = \int_0^t \frac{ds}{(1-s)^2} = \frac{1}{1-t} - 1 = \frac{t}{1-t}.$$
(2) We have
$$\Big\langle \int_0^\cdot \frac{dB_s}{1-s}\Big\rangle_t = \int_0^t \frac{d\langle B\rangle_s}{(1-s)^2} = \int_0^t \frac{ds}{(1-s)^2} = \frac{t}{1-t}.$$

(3) It is obvious that $(W_r)$ is integrable and adapted to $(\mathcal G_r)$. Moreover, for any $t>s$,
$$E[W_t\,|\,\mathcal G_s] = E\Big[\int_0^{t/(t+1)}\frac{dB_r}{1-r}\,\Big|\,\mathcal F_{s/(s+1)}\Big] = \int_0^{s/(s+1)}\frac{dB_r}{1-r} = W_s,$$
because $s/(s+1)<t/(t+1)<1$ and the Itô integral $u\mapsto\int_0^u \frac{dB_r}{1-r}$ is an $(\mathcal F_u)$-martingale on $[0,1)$ (it is $L^2$ bounded on compact subintervals by (1)). This shows that $(W_r)$ is a martingale with respect to $(\mathcal G_r)$.
(4) By (2), we can calculate
$$\langle W\rangle_r = \frac{A^{-1}(r)}{1-A^{-1}(r)} = r.$$
By the Lévy characterisation theorem, we conclude that $(W_r)$ is a $(\mathcal G_r)$-Brownian motion. Then $\lim_{r\to\infty}W_r/r = 0$, and since $x_{r/(r+1)} = \frac{1}{r+1}W_r$, this implies
$$\lim_{t\to 1}x_t = \lim_{r\to\infty}x_{r/(r+1)} = \lim_{r\to\infty}\frac{W_r}{r+1} = 0.$$
(5) Since the function $s\mapsto \frac{1}{1-s}$ has finite total variation on $[0,c]$ for every $c<1$, the integration by parts (product) formula applies in the Stieltjes sense on $[0,c]$: writing $x_t = (1-t)\int_0^t\frac{dB_s}{1-s}$,
$$dx_t = (1-t)\,\frac{dB_t}{1-t} - \Big(\int_0^t\frac{dB_s}{1-s}\Big)\, dt = dB_t - \frac{x_t}{1-t}\, dt.$$
Integrating, $x_t = B_t - \int_0^t \frac{x_s}{1-s}\, ds$ for $t\le c$; letting $c\to 1$, we conclude that the equality holds for all $t\in[0,1)$.
(6) Since $x_s$ is defined as an integral of a deterministic integrand with respect to Brownian motion, it has a centred normal distribution; by (1) its variance is $(1-s)^2\cdot\frac{s}{1-s} = s(1-s)$, so $E|x_s| = \sqrt{2s(1-s)/\pi}$. Thus, we apply the Fubini–Tonelli theorem to derive
$$E\int_0^1 \frac{|x_s|}{1-s}\, ds = \int_0^1 \frac{E|x_s|}{1-s}\, ds = \sqrt{\frac{2}{\pi}}\int_0^1 \frac{\sqrt{s(1-s)}}{1-s}\, ds = \sqrt{\frac{2}{\pi}}\int_0^1 \sqrt{\frac{s}{1-s}}\, ds < \infty,$$
since $(1-s)^{-1/2}$ is integrable on $[0,1]$. This implies that $\int_0^1 \frac{|x_s|}{1-s}\, ds < \infty$ almost surely. Defining the process $Y_t = \int_0^t \frac{x_s}{1-s}\, ds$ and taking any partition $0=t_0<t_1<\dots<t_n=1$,

we derive
$$\sum_{i=0}^{n-1}\big|Y_{t_{i+1}}-Y_{t_i}\big| = \sum_{i=0}^{n-1}\Big|\int_{t_i}^{t_{i+1}}\frac{x_s}{1-s}\, ds\Big| \le \sum_{i=0}^{n-1}\int_{t_i}^{t_{i+1}}\frac{|x_s|}{1-s}\, ds = \int_0^1\frac{|x_s|}{1-s}\, ds < \infty,$$
from which the claim follows.
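The process of Exercise 53 is the Brownian bridge, and the variance identity $\operatorname{Var}(x_t) = t(1-t)$ implicit in part (6) can be checked by simulating $x_t = (1-t)\int_0^t \frac{dB_s}{1-s}$ on a grid. A sketch; grid and sample sizes are illustrative.

```python
import numpy as np

# Monte Carlo check that Var(x_t) = t(1-t) for x_t = (1-t) ∫_0^t dB_s/(1-s).
rng = np.random.default_rng(10)
n_paths, n_steps = 20_000, 2000
dt = 1.0 / n_steps
I = np.zeros(n_paths)                       # running ∫_0^t dB_s/(1-s)
checkpoints = {n_steps // 2: 0.5, (9 * n_steps) // 10: 0.9}
var_at = {}
for j in range(n_steps):
    dB = rng.standard_normal(n_paths) * np.sqrt(dt)
    I += dB / (1.0 - j * dt)                # left-endpoint value of 1/(1-s)
    if (j + 1) in checkpoints:
        t_now = checkpoints[j + 1]
        var_at[t_now] = ((1.0 - t_now) * I).var()   # sample Var(x_t)
```

The prefactor $(1-t)$ tames the exploding integrand, which is how the bridge is pinned to $0$ at $t=1$ (part (4)).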

Problem Sheet 9: SDEs and Girsanov

Exercise 54 Let us consider an SDE $dx_t = \sum_{k=1}^m \sigma_k(x_t)\, dB_t^k + \sigma_0(x_t)\, dt$ on $\mathbb R^d$ with infinitesimal generator $L$. Let $D$ be a bounded open subset of $\mathbb R^d$ with closure $\bar D$ and $C^2$ boundary $\partial D$. Let $x_0\in D$. Suppose that there is a global solution to the SDE with initial value $x_0$, and denote by $\tau$ its first exit time from $D$; suppose that $\tau<\infty$ almost surely. Let $g:\bar D\to\mathbb R$ be a continuous function, and $u:\bar D\to\mathbb R$ a solution to the Dirichlet problem $Lu = -g$ on $D$ with $u=0$ on the boundary. Prove that $u(x_0) = E\int_0^\tau g(x_s)\, ds$. [Hint: use a sequence of increasing stopping times $\tau_n$ with limit $\tau$.]

Solution: Let $D_n$ be an increasing sequence of open sets with $\bar D_n\subset D_{n+1}\subset D$ and $\bigcup_n D_n = D$, and let $\tau_n$ be the first exit time of $(x_t)$ from $D_n$. By the Itô formula (cf. Exercise 50),
$$u(x_{t\wedge\tau_n}) = u(x_0) + \int_0^{t\wedge\tau_n} Lu(x_s)\, ds + \sum_{k=1}^m\int_0^{t\wedge\tau_n} du\big(\sigma_k(x_s)\big)\, dB_s^k = u(x_0) - \int_0^{t\wedge\tau_n} g(x_s)\, ds + \sum_{k=1}^m\int_0^{t\wedge\tau_n} du\big(\sigma_k(x_s)\big)\, dB_s^k.$$
Since $u$ and $\sigma$ are continuous on the compact set $\bar D_n$, the stochastic integral is a martingale. We have
$$E[u(x_{t\wedge\tau_n})] = u(x_0) - E\int_0^{t\wedge\tau_n} g(x_s)\, ds.$$
Since $\tau_n<\tau<\infty$, $\lim_{t\to\infty}(t\wedge\tau_n) = \tau_n$ and $\lim_n\tau_n = \tau$ almost surely. We take $t\to\infty$ followed by $n\to\infty$ and use the dominated convergence theorem ($u$ and $g$ are bounded on $\bar D$) to take the limits inside the expectation; using $u=0$ on $\partial D$ we see that
$$\lim_n\lim_t E[u(x_{t\wedge\tau_n})] = E[u(x_\tau)] = 0 \qquad\text{and}\qquad \lim_n\lim_t E\int_0^{t\wedge\tau_n} g(x_s)\, ds = E\int_0^\tau g(x_s)\, ds.$$
This completes the proof: $u(x_0) = E\int_0^\tau g(x_s)\, ds$.

Exercise 55 A sample continuous Markov process on $\mathbb R\times(0,\infty)$ is a Brownian motion on the hyperbolic space (the upper half plane model) if its infinitesimal generator is $\frac12 y^2\big(\frac{\partial^2}{\partial x^2} + \frac{\partial^2}{\partial y^2}\big)$. Prove that the solution to the following system is a hyperbolic Brownian motion:
$$dx_t = y_t\, dB_t^1, \qquad dy_t = y_t\, dB_t^2.$$

[Hint: Show that if $y_0>0$ then $y_t$ is positive, and hence the SDE can be considered to be defined on the upper half plane. Compute its infinitesimal generator $L$.]
Solution: Just observe that $y_t = y_0\, e^{B_t^2 - t/2}$, so $y_t>0$ if $y_0>0$. Computing the generator by the Itô formula gives $L = \frac12 y^2\big(\partial_x^2 + \partial_y^2\big)$, as required.

Exercise 56 Discuss the uniqueness and existence problem for the SDE
$$dx_t = \sin(x_t)\, dB_t^1 + \cos(x_t)\, dB_t^2.$$
Solution: The functions $\sin$ and $\cos$ are $C^1$ with bounded derivatives, and hence Lipschitz continuous (and bounded). For each initial value there is a unique global strong solution.

Exercise 57 Let $(x_t)$ and $(y_t)$ be solutions to the following respective SDEs (in Stratonovich form):
$$x_t = x_0 - \int_0^t y_s\circ dB_s, \qquad y_t = y_0 + \int_0^t x_s\circ dB_s.$$
Show that $x_t^2 + y_t^2$ is independent of $t$.
Solution: Let us rewrite the SDEs in Itô form:
$$x_t = x_0 - \int_0^t y_s\, dB_s - \frac12\langle y,B\rangle_t, \qquad y_t = y_0 + \int_0^t x_s\, dB_s + \frac12\langle x,B\rangle_t.$$
We can calculate the bracket processes in these expressions:
$$\langle y,B\rangle_t = \Big\langle\int_0^\cdot x_s\, dB_s,\ B\Big\rangle_t = \int_0^t x_s\, ds, \qquad \langle x,B\rangle_t = -\int_0^t y_s\, ds.$$
Moreover, the quadratic variations of $(x_t)$ and $(y_t)$ are
$$\langle x\rangle_t = \int_0^t y_s^2\, ds, \qquad \langle y\rangle_t = \int_0^t x_s^2\, ds.$$
Applying now the Itô formula to the function $f(x,y)=x^2+y^2$, we obtain
$$f(x_t,y_t) = f(x_0,y_0) + 2\int_0^t x_s\, dx_s + 2\int_0^t y_s\, dy_s + \langle x\rangle_t + \langle y\rangle_t$$
$$= f(x_0,y_0) - 2\int_0^t x_sy_s\, dB_s - \int_0^t x_s^2\, ds + 2\int_0^t x_sy_s\, dB_s - \int_0^t y_s^2\, ds + \int_0^t y_s^2\, ds + \int_0^t x_s^2\, ds = f(x_0,y_0).$$
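The conservation law of Exercise 57 is visible numerically: a Heun (predictor–corrector) scheme, which is consistent with the Stratonovich interpretation, approximately preserves $x^2+y^2$, whereas a naive Euler scheme on the same coefficients is consistent with the Itô equations, for which $d(x^2+y^2) = (x^2+y^2)\,dt$ and the radius grows like $e^t$. A sketch; step count is illustrative.

```python
import numpy as np

# Stratonovich system dx = -y ∘ dB, dy = x ∘ dB: Heun conserves the radius,
# Euler (Ito-consistent) inflates it like e^t.
rng = np.random.default_rng(12)
n, t = 20_000, 1.0
dB = rng.standard_normal(n) * np.sqrt(t / n)
x, y = 1.0, 0.0       # Heun state
xe, ye = 1.0, 0.0     # Euler state
for db in dB:
    xp, yp = x - y * db, y + x * db                          # Heun predictor
    x, y = x - 0.5 * (y + yp) * db, y + 0.5 * (x + xp) * db  # Heun corrector
    xe, ye = xe - ye * db, ye + xe * db                      # naive Euler
radius_heun = x * x + y * y       # ≈ 1
radius_euler = xe * xe + ye * ye  # ≈ e ≈ 2.72
```

This matches the solution above: in Stratonovich form the flow is a random rotation by the angle $B_t$, while the missing Itô correction in the Euler scheme multiplies the squared radius by $1+(\Delta B)^2$ at every step.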