Stochastic Calculus Example Sheet 2 - Lent 2015 - Michael Tehranchi

Problem 1. Let $X$ be a local martingale. Prove that $X$ is a uniformly integrable martingale if and only if $X$ is of class D.

Solution 1. The "if" direction: let $(T_n)_n$ be a localising sequence, so that each stopped process $X^{T_n}$ is a martingale. Since $X_{t\wedge T_n} \to X_t$ a.s. and the family $(X_{t\wedge T_n})_n$ is uniformly integrable (the times $t\wedge T_n$ are bounded stopping times), we have $X_{t\wedge T_n} \to X_t$ in $L^1$. Hence for $s \le t$,
$$E(X_t \mid \mathcal F_s) = E\big(\lim_n X_{t\wedge T_n} \mid \mathcal F_s\big) = \lim_n E(X_{t\wedge T_n} \mid \mathcal F_s) = \lim_n X_{s\wedge T_n} = X_s,$$
so $X$ is a martingale. And since $X$ is of class D, the family $(X_t)_{t\ge 0}$ is uniformly integrable, since non-random times $t$ are also stopping times.

The "only if" direction: first, a general result: for any integrable random variable $Y$ and any family $\mathcal G$ of sub-$\sigma$-fields of $\mathcal F$, the family $\{E(Y\mid\mathcal G) : \mathcal G \in \mathcal G\}$ is uniformly integrable. Now if $X$ is a uniformly integrable martingale, then $X_t \to X_\infty$ a.s. and in $L^1$. Hence for any bounded stopping time $T$ we have $X_T = E(X_\infty \mid \mathcal F_T)$ by the optional stopping theorem. This shows that $\{X_T : T \text{ a bounded stopping time}\}$ is uniformly integrable, i.e. $X$ is of class D.

Problem 2. Let $X$ be a continuous local martingale. Show that if
$$E\Big[\sup_{s\le t}|X_s|\Big] < \infty \quad \text{for each } t,$$
then $X$ is a true martingale.

Solution 2. Since for any stopping time $T$ we have $|X_{t\wedge T}| \le \sup_{s\le t}|X_s|$, we can conclude that $\{X_{t\wedge T} : T \text{ a stopping time}\}$ is uniformly integrable. In particular, for any localising sequence $(T_n)$ we see that $X_{t\wedge T_n} \to X_t$ a.s. and in $L^1$, and hence $X$ is a martingale, arguing as in Solution 1. [We could also argue directly via the dominated convergence theorem.]

Problem 3. Let $X$ be a non-negative local martingale. Prove that $X$ is a supermartingale.

Solution 3. By definition, there exists an increasing sequence of stopping times $(T_n)_n$ such that $T_n \to \infty$ almost surely and each stopped process $X^{T_n}$ is a non-negative martingale. Hence, by (the conditional version of) Fatou's lemma, for $s \le t$ we have the inequality
$$E(X_t\mid\mathcal F_s) = E\big(\lim_n X_{t\wedge T_n}\mid \mathcal F_s\big) \le \liminf_n E(X_{t\wedge T_n}\mid\mathcal F_s) = \lim_n X_{s\wedge T_n} = X_s.$$

Problem 4. Let $X$ be a supermartingale. Show that if $E(X_t) = E(X_0)$ for all $t$, then $X$ is a martingale.
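As an illustration of Problem 3 (not part of the original sheet): the classic example of a non-negative local martingale that is a *strict* supermartingale is the inverse three-dimensional Bessel process $X_t = 1/|x + B_t|$, with $B$ a three-dimensional Brownian motion and $x \ne 0$. The sketch below, assuming only numpy, estimates $E[X_t]$ by Monte Carlo; the function name and parameters are illustrative choices, not anything from the sheet.

```python
# Monte Carlo illustration of Problem 3: X_t = 1/|x + B_t| for a 3d Brownian
# motion B and x = (1,0,0) is a non-negative local martingale, yet t -> E[X_t]
# is strictly decreasing, so X is a supermartingale but not a martingale.
import numpy as np

def mean_inverse_bessel(t, n_paths=50_000, seed=0):
    """Estimate E[1/|x + B_t|] for a 3d Brownian motion B and x = (1,0,0)."""
    rng = np.random.default_rng(seed)
    B = rng.standard_normal((n_paths, 3)) * np.sqrt(t)  # B_t ~ N(0, t I_3)
    B[:, 0] += 1.0                                      # shift by x = (1,0,0)
    return np.mean(1.0 / np.linalg.norm(B, axis=1))

if __name__ == "__main__":
    # X_0 = 1 exactly, and the mean decreases in t
    # (the exact value is E[X_t] = erf(1/sqrt(2t)), approx. 0.683 at t = 1).
    for t in (0.5, 1.0, 2.0):
        print(t, mean_inverse_bessel(t))
```

The strict decrease of the estimated means is exactly the failure of the martingale property that Problem 3 leaves open.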
Solution 4. Note that $X_s - E(X_t\mid\mathcal F_s) \ge 0$ a.s. for each $s \le t$ by assumption. But by the tower property we have
$$E(X_s) - E\big[E(X_t\mid\mathcal F_s)\big] = E(X_s) - E(X_t) = E(X_0) - E(X_0) = 0.$$
Hence $X_s - E(X_t\mid\mathcal F_s) = 0$ a.s., since a non-negative random variable with zero expectation vanishes almost surely.

Problem 5. Suppose $X$ is an adapted integrable process such that $E(X_T) = E(X_0)$ for all bounded stopping times $T$. Show that $X$ is a martingale. [Hint: Fix non-random $s < t$ and an event $A \in \mathcal F_s$, and let $T = s\,1_A + t\,1_{A^c}$.]

Solution 5. Let $T$ be as in the hint, and note that $T$ is a stopping time since
$$\{T \le u\} = \begin{cases} \emptyset & \text{if } u < s, \\ A & \text{if } s \le u < t, \\ \Omega & \text{if } u \ge t, \end{cases}$$
and $A \in \mathcal F_u$ whenever $u \ge s$. Hence by assumption
$$0 = E(X_T - X_0) = E(X_s 1_A + X_t 1_{A^c} - X_0) = E[(X_s - X_t)1_A],$$
where we have used the assumption $E(X_t - X_0) = 0$ (the constant time $t$ is itself a bounded stopping time). Since $X$ is adapted and $A$ was an arbitrary event in $\mathcal F_s$, we have $E(X_t\mid\mathcal F_s) = X_s$.

Problem 6. Let $X$ be a continuous square-integrable martingale with $X_0 = 0$, and $A$ a continuous, adapted process of finite variation with $A_0 = 0$. Suppose for all bounded stopping times $T$ that $E(X_T^2) = E(A_T)$. Show that $A = [X]$. Find a similar characterisation of $[X]$ when $X$ is only a local martingale.

Solution 6. By Problem 5, the process $X^2 - A$ is a martingale. The conclusion follows from the characterisation of quadratic variation from lectures. Similarly, let $X$ be a continuous local martingale and $(S_n)_n$ a sequence of stopping times reducing $X$ to a square-integrable martingale. If $E(X_{S_n\wedge T}^2) = E(A_{S_n\wedge T})$ for all bounded stopping times $T$ and all $n$, then $A = [X]$.

Problem 7. For a continuous local martingale $X$ and stopping time $T$, show that $[X^T] = [X]^T$.

Solution 7. One could do this by hand, or by using the previous problem after noticing that $(X^T)^2 = (X^2)^T$. [To be clear, the $2$ is an exponent and the $T$ is a superscript denoting stopping.]

Problem 8. Show that if $X$ is a continuous local martingale and $Y$ is predictable and such that $\int_0^t Y_s^2\, d[X]_s < \infty$ for all $t$, then
$$\Big[\int Y\, dX\Big] = \int Y^2\, d[X].$$
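The identity behind Problem 6 can be sanity-checked numerically (a sketch added here, not in the original sheet): for a Brownian motion $W$ one has $[W]_t = t$, so $E(W_T^2) = E(T)$ for every bounded stopping time $T$. The particular stopping time $T = \inf\{t : |W_t| \ge 1\} \wedge 2$, the grid discretisation, and all names below are illustrative choices; agreement is up to Monte Carlo and discretisation error.

```python
# Numerical check of E[W_T^2] = E[T] (Problem 6 with A_t = t) for the bounded
# stopping time T = inf{t : |W_t| >= 1} ∧ 2, using an Euler grid of step dt.
import numpy as np

def check_quadratic_variation(n_paths=20_000, dt=1e-3, horizon=2.0, seed=1):
    rng = np.random.default_rng(seed)
    n_steps = int(horizon / dt)
    W = np.zeros(n_paths)                       # current value of each path
    T = np.full(n_paths, horizon)               # stopping time, default = horizon
    stopped = np.zeros(n_paths, dtype=bool)
    for i in range(1, n_steps + 1):
        step = rng.standard_normal(n_paths) * np.sqrt(dt)
        W = np.where(stopped, W, W + step)      # freeze stopped paths
        hit = (~stopped) & (np.abs(W) >= 1.0)
        T[hit] = i * dt
        stopped |= hit
    return np.mean(W ** 2), np.mean(T)          # E[W_T^2] vs E[T]

if __name__ == "__main__":
    m_sq, m_T = check_quadratic_variation()
    print(m_sq, m_T)   # the two means should agree closely
```

Since $W_n^2 - n\,dt$ is exactly a martingale for the discretised walk, the two estimates differ only by Monte Carlo noise.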
Solution 8. In lectures, it was shown that if $X \in \mathcal M^2$ and $Y \in L^2(X)$ then
$$E\Big[\Big(\int_0^\infty Y_s\, dX_s\Big)^2\Big] = E\Big[\int_0^\infty Y_s^2\, d[X]_s\Big].$$
Now let $X \in \mathcal M_{\mathrm{loc}}$ and $Y \in L^2_{\mathrm{loc}}(X)$. Let
$$S_n = \inf\Big\{t \ge 0 : |X_t| > n \ \text{ or } \ \int_0^t Y_s^2\, d[X]_s > n\Big\}$$
and let $T$ be a bounded stopping time. We then have
$$E\Big[\Big(\int_0^{S_n\wedge T} Y_s\, dX_s\Big)^2\Big] = E\Big[\Big(\int_0^\infty Y_s 1_{(0,S_n\wedge T]}(s)\, dX_s\Big)^2\Big] = E\Big[\int_0^\infty Y_s^2 1_{(0,S_n\wedge T]}(s)\, d[X]_s\Big] = E\Big[\int_0^{S_n\wedge T} Y_s^2\, d[X]_s\Big],$$
and the result follows from Problem 6.

Problem 9. Suppose that $f : [0,\infty) \to \mathbb R$ is absolutely continuous, in the sense that
$$f(t) = f(0) + \int_0^t f'(s)\, ds$$
for an integrable function $f'$. Show that
$$\|f\|_{t,\mathrm{var}} = \int_0^t |f'(s)|\, ds.$$

Solution 9. The lower bound is easy: since
$$f(p_k) - f(p_{k-1}) = \int_{p_{k-1}}^{p_k} f'(u)\, du,$$
for a partition $0 = p_0 < p_1 < \cdots < p_N = t$ of $[0,t]$ we have
$$\sum_k |f(p_k) - f(p_{k-1})| \le \sum_k \int_{p_{k-1}}^{p_k} |f'(u)|\, du = \int_0^t |f'(u)|\, du.$$
Now take the supremum over partitions.

To prove the upper bound, fix $t$ and $\epsilon > 0$. By the density of step functions in $L^1$, there exists a step function $g$ such that
$$\int_0^t |f'(s) - g(s)|\, ds < \epsilon.$$
Now suppose that the points of discontinuity of $g$ on $[0,t]$ are $(p_k)$, viewed as a partition $0 = p_0 < p_1 < \cdots < p_N = t$. Hence
$$\|f\|_{t,\mathrm{var}} \ge \sum_k \Big|\int_{p_{k-1}}^{p_k} f'(u)\, du\Big| \ge \sum_k \Big|\int_{p_{k-1}}^{p_k} g(u)\, du\Big| - \sum_k \Big|\int_{p_{k-1}}^{p_k} [f'(u) - g(u)]\, du\Big|$$
$$\ge \sum_k \int_{p_{k-1}}^{p_k} |g(u)|\, du - \sum_k \int_{p_{k-1}}^{p_k} |f'(u) - g(u)|\, du \qquad (*)$$
$$= \int_0^t |g(u)|\, du - \int_0^t |f'(u) - g(u)|\, du \ \ge\ \int_0^t |f'(u)|\, du - 2\int_0^t |f'(s) - g(s)|\, ds \ \ge\ \int_0^t |f'(u)|\, du - 2\epsilon$$
by repeated use of the triangle inequality. The trick is at line $(*)$, where $\big|\int_{p_{k-1}}^{p_k} g(u)\,du\big| = \int_{p_{k-1}}^{p_k} |g(u)|\,du$ since $g$ is constant on the interval $[p_{k-1}, p_k]$. Since $\epsilon > 0$ was arbitrary, the upper bound follows.

Problem 10. Let $A, B : [0,\infty) \to \mathbb R$ be bounded and measurable, and let $f : [0,\infty) \to \mathbb R$ be continuous and of finite variation. Show that
$$\int A\, d\Big(\int B\, df\Big) = \int AB\, df$$
in the sense of Lebesgue-Stieltjes integration.

Solution 10. Use a monotone class argument.

Problem 11. Show that a sequence of processes $(X^n)_n$ converges uniformly on compacts in probability to $X$ if and only if
$$\sum_{k=1}^\infty 2^{-k}\, E\Big[\sup_{t\le k} |X^n_t - X_t| \wedge 1\Big] \to 0.$$

Solution 11. The "if" direction: note the inequality $\epsilon\, 1_{\{Y \ge \epsilon\}} \le Y \wedge 1$ for $Y \ge 0$ and $0 < \epsilon \le 1$. Also note that for fixed $K \ge 1$,
$$E\Big[\sup_{t\le K} |X^n_t - X_t| \wedge 1\Big] \le 2^K \sum_{k=1}^\infty 2^{-k}\, E\Big[\sup_{t\le k} |X^n_t - X_t| \wedge 1\Big].$$
Hence
$$P\Big(\sup_{t\le K} |X^n_t - X_t| \ge \epsilon\Big) \le \frac{E[\sup_{t\le K} |X^n_t - X_t| \wedge 1]}{\epsilon} \to 0.$$

The "only if" direction: note the inequality $Y \wedge 1 \le \epsilon + 1_{\{Y \ge \epsilon\}}$ for $Y \ge 0$ and $0 < \epsilon \le 1$. Now, for every $K \ge 1$ and $\epsilon > 0$ we have
$$\sum_{k=1}^\infty 2^{-k}\, E\Big[\sup_{t\le k} |X^n_t - X_t| \wedge 1\Big] \le E\Big[\sup_{t\le K} |X^n_t - X_t| \wedge 1\Big] + 2^{-K} \le \epsilon + P\Big(\sup_{t\le K} |X^n_t - X_t| \ge \epsilon\Big) + 2^{-K}.$$
Now send $n \to \infty$, then $\epsilon \to 0$ and $K \to \infty$. The point of this problem is to show that the u.c.p. topology is metrisable.

Problem 12. Fix a $T > 0$ and let $W$ be a scalar Brownian motion. Show that for any $p > 0$,
$$N^{p/2 - 1} \sum_{k=1}^N \big|W_{kT/N} - W_{(k-1)T/N}\big|^p \to c_p\, T^{p/2}$$
in probability, where $c_p = \pi^{-1/2}\, 2^{p/2}\, \Gamma\big(\tfrac{p+1}{2}\big)$. (Hint: use the law of large numbers.) Why is this conclusion not surprising when $p = 2$?

Solution 12. Write $|W_{kT/N} - W_{(k-1)T/N}|^p = (T/N)^{p/2} Z_{k,N}$, where $E(Z_{k,N}) = c_p$ and $\mathrm{Var}(Z_{k,N}) = c_{2p} - c_p^2 < \infty$, and for every $N$ the random variables $Z_{1,N}, \ldots, Z_{N,N}$ are independent. Now use a weak law of large numbers. [When $p = 2$ we have $c_2 = 1$, and the statement says that the sum of squared increments converges to $T = [W]_T$, the quadratic variation.]

Problem 13. Let $W$ be a Brownian motion and let
$$\hat W_t = W_t - \int_0^t \frac{W_s}{s}\, ds.$$
Show that $\hat W$ is not a martingale in the filtration generated by $W$, but that $\hat W$ is a martingale in its own filtration.

Solution 13. Letting $(\mathcal F_t)_t$ be the filtration generated by $W$, we have the direct calculation
$$E(\hat W_t \mid \mathcal F_s) = \hat W_s - W_s \log(t/s)$$
for $0 < s \le t$. Or simply note that the difference $\hat W - W$ is not constant and has sample paths of finite variation on any interval $[a,b]$ for $0 < a \le b$; if $\hat W$ were an $(\mathcal F_t)$-martingale, then $\hat W - W$ would be a non-constant continuous martingale of finite variation, which is impossible. For the second part, note that $\hat W$ is a Brownian motion, from Example Sheet 1.

Problem 14. Let $(W_t)_{t\ge 0}$ be a scalar Brownian motion, and for $h \in H = L^2[0,\infty)$ let
$$W(h) = \int_0^\infty h(s)\, dW_s.$$
Show that $\{W(h) : h \in H\}$ is an isonormal process.

Solution 14. Since step functions are dense in $L^2[0,\infty)$, we can find a sequence of step functions $h_n \to h$ in $L^2[0,\infty)$, and hence $W(h_n) \to W(h)$ in $L^2(\Omega)$, since
$$E[|W(h) - W(h_n)|^2] = \int_0^\infty |h(s) - h_n(s)|^2\, ds.$$
Note that we have used $[W]_t = t$. Now
$$W(h_n) = \int_0^\infty h_n(s)\, dW_s = \sum_{k=1}^N h_n(t_{k-1})\,(W_{t_k} - W_{t_{k-1}})$$
is Gaussian, as it is the sum of independent Gaussians, with $E[W(h_n)] = 0$ and
$$E[W(h_n)^2] = \int_0^\infty h_n(s)^2\, ds.$$
To show that $W(h) \sim N(0, \|h\|^2)$, we appeal to a general result: if $X_n \sim N(\mu_n, \sigma_n^2)$ for each $n$, and $\mu_n \to \mu$ and $\sigma_n^2 \to \sigma^2$, then $X_n \to N(\mu, \sigma^2)$ in distribution.

Problem 15. Let $X$ be a continuous local martingale with $X_0 = 0$ such that $X_t^2 - t$ is also a local martingale. Show that $X$ is a Brownian motion.

Solution 15. The unique continuous adapted finite-variation process $A$ such that $A_0 = 0$ and $X^2 - A$ is a local martingale is $A = [X]$. Hence $[X]_t = t$, and the conclusion follows from Lévy's characterisation of Brownian motion.

Problem 16. Let $N$ be a Poisson process of rate $\lambda = 1$ and let $X_t = N_t - t$. Show that $X$ is of finite variation. Show that both $X$ and $X_t^2 - t$ are martingales. Why does this not contradict the previous question?

Solution 16. The point of the question is to highlight the complications that arise if we try to extend our results to discontinuous martingales. For instance, since $N$ has independent increments, we easily see that $X$ (and likewise $X_t^2 - t$) is a martingale in its own filtration. Also, as the difference of monotone paths, $X$ is of finite variation. But only *continuous* local martingales of finite variation are necessarily constant, and $X$ is not continuous, so there is no contradiction.

Problem 17. (Burkholder inequality) Let $p \ge 2$ and let $M$ be a continuous local martingale with $M_0 = 0$. Use Itô's formula, Doob's maximal inequality, and Hölder's inequality to show that there is a constant $C_p > 0$ such that
$$E\big(\max_{s\le t} |M_s|^p\big) \le C_p\, E\big([M]_t^{p/2}\big).$$

Solution 17. First suppose that $M$ is a bounded martingale. First, by Doob's maximal inequality, we have
$$E\big(\max_{s\le t} |M_s|^p\big) \le \Big(\frac{p}{p-1}\Big)^p E(|M_t|^p).$$
Now, Itô's formula yields
$$|M_t|^p = p \int_0^t \mathrm{sign}(M_s)\, |M_s|^{p-1}\, dM_s + \frac{p(p-1)}{2} \int_0^t |M_s|^{p-2}\, d[M]_s.$$
Now, since $M$ is bounded, the first term on the right defines a true martingale. Hence we can take expectations to get
$$E(|M_t|^p) = \frac{p(p-1)}{2}\, E\Big[\int_0^t |M_s|^{p-2}\, d[M]_s\Big] \le \frac{p(p-1)}{2}\, E\Big(\max_{s\le t} |M_s|^{p-2}\, [M]_t\Big) \le \frac{p(p-1)}{2}\, \Big[E\big(\max_{s\le t} |M_s|^p\big)\Big]^{1-2/p} \Big[E\big([M]_t^{p/2}\big)\Big]^{2/p}$$
by Hölder's inequality (with exponents $p/(p-2)$ and $p/2$). Combining this with Doob's inequality yields the result with
$$C_p = 2^{-p/2}\, p^{p(p+1)/2}\, (p-1)^{-p(p-1)/2}.$$
Now for the general case, let $T_N = \inf\{t : |M_t| > N\}$, so that $M^{T_N}$ is a bounded martingale.
The inequality we have just proven implies
$$E\big(\max_{s\le t\wedge T_N} |M_s|^p\big) \le C_p\, E\big([M]_{t\wedge T_N}^{p/2}\big).$$
Now send $N \to \infty$ and use the monotone convergence theorem.

Problem 18. Let $H_n(x)$ denote the $n$th Hermite polynomial, as defined in Example Sheet 1. Let $X$ be a continuous local martingale. Show that
$$[X]_t^{n/2}\, H_n\big(X_t / [X]_t^{1/2}\big)$$
defines a local martingale for each $n$. [Hint: use Itô's formula.]

Solution 18. From Example Sheet 1, $t^{n/2} H_n(W_t/\sqrt t)$ defines a martingale when $W$ is a scalar Brownian motion. Also, by Itô's formula, we have
$$d\big[t^{n/2} H_n(W_t/\sqrt t)\big] = \tfrac12\, t^{n/2-1} \big(n H_n(W_t/\sqrt t) - (W_t/\sqrt t)\, H_n'(W_t/\sqrt t) + H_n''(W_t/\sqrt t)\big)\, dt + t^{(n-1)/2} H_n'(W_t/\sqrt t)\, dW_t.$$
Since the finite-variation part must vanish, we have this identity for the $n$th Hermite polynomial:
$$n H_n(x) - x H_n'(x) + H_n''(x) = 0.$$
(Of course, this can be proven in other ways.) Now, using Itô's formula again, and writing $U_t = X_t / [X]_t^{1/2}$, we have
$$d\Big[[X]_t^{n/2}\, H_n(U_t)\Big] = \tfrac12\, [X]_t^{n/2-1} \big(n H_n(U_t) - U_t H_n'(U_t) + H_n''(U_t)\big)\, d[X]_t + [X]_t^{(n-1)/2} H_n'(U_t)\, dX_t = [X]_t^{(n-1)/2} H_n'(U_t)\, dX_t,$$
where the $d[X]$ term vanishes by the identity above. We have expressed the given process as a stochastic integral with respect to a continuous local martingale, so it must be a continuous local martingale itself.
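The identity $n H_n(x) - x H_n'(x) + H_n''(x) = 0$ derived in Solution 18 can be verified numerically. Assuming the sheet's $H_n$ are the probabilists' Hermite polynomials (which is what this identity forces), they coincide with numpy's `hermite_e` polynomial family; the helper name below is an illustrative choice.

```python
# Numerical check of the identity from Solution 18:
#     n*H_n(x) - x*H_n'(x) + H_n''(x) = 0,
# with H_n the probabilists' Hermite polynomial (numpy's "hermite_e" family).
import numpy as np
from numpy.polynomial import hermite_e as He

def hermite_ode_residual(n, x):
    """Evaluate n*He_n(x) - x*He_n'(x) + He_n''(x) on the array x."""
    c = np.zeros(n + 1)
    c[n] = 1.0                                 # coefficient vector of He_n
    h = He.hermeval(x, c)
    h1 = He.hermeval(x, He.hermeder(c))        # first derivative
    h2 = He.hermeval(x, He.hermeder(c, 2))     # second derivative
    return n * h - x * h1 + h2

if __name__ == "__main__":
    x = np.linspace(-3, 3, 31)
    for n in range(1, 7):
        assert np.max(np.abs(hermite_ode_residual(n, x))) < 1e-8
    print("identity holds for n = 1..6")
```

The residual is zero up to floating-point rounding, matching the vanishing of the finite-variation part in the Itô computation.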