Applications of Itô's Formula

CHAPTER 4

Applications of Itô's Formula

In this chapter, we discuss several basic theorems in stochastic analysis. Their proofs are good examples of applications of Itô's formula.

1. Lévy's martingale characterization of Brownian motion

Recall that B is a Brownian motion with respect to a filtration {F_t} if the increment B_t − B_s has the normal distribution N(0, t−s) and is independent of F_s. A consequence of this definition is that B is a continuous martingale with quadratic variation process ⟨B,B⟩_t = t. The following theorem shows that this property characterizes Brownian motion completely.

THEOREM 1.1. (Lévy's characterization of Brownian motion) Suppose that B is a continuous local martingale with respect to a filtration {F_t} whose quadratic variation process is ⟨B,B⟩_t = t (or, equivalently, the process {B_t² − t, t ≥ 0} is also a local martingale). Then B is a Brownian motion with respect to {F_t}.

PROOF. We will establish the following: for any s < t and any bounded F_s-measurable random variable Z,

(1.1)  E[ Z exp{ia(B_t − B_s)} ] = exp{−a²(t−s)/2} · EZ.

Letting Z = 1, we see that B_t − B_s is distributed as N(0, t−s). Letting Z = e^{ibY} for an F_s-measurable Y, we infer by the uniqueness of two-dimensional characteristic functions that B_t − B_s is independent of any F_s-measurable Y, which shows that B_t − B_s is independent of F_s. Therefore it is sufficient to prove (1.1).

Denote the left side of (1.1) by F(t). We first use Itô's formula to obtain

exp{ia(B_t − B_s)} = 1 + ia ∫_s^t exp{ia(B_u − B_s)} dB_u − (a²/2) ∫_s^t exp{ia(B_u − B_s)} du.

The stochastic integral is uniformly bounded because the other terms in this equality are uniformly bounded. Therefore the stochastic integral is not only a local martingale, but a martingale. Multiply both sides by Z and take the expectation. Noting that Z is F_s-measurable, we have

F(t) = EZ − (a²/2) ∫_s^t F(u) du.

Solving this integral equation for F(t), we obtain

F(t) = exp{−a²(t−s)/2} · EZ.

This is what we wanted to prove.

A very useful corollary of Lévy's criterion is that every continuous local martingale M is a time change of a Brownian motion. This can be seen as follows. We know that the quadratic variation process ⟨M,M⟩ is a continuous increasing process. Let us assume that ⟨M,M⟩_t → ∞ as t → ∞ with probability 1. Let τ = {τ_t} be the right inverse of ⟨M,M⟩ defined by

τ_t = inf{s ≥ 0 : ⟨M,M⟩_s > t}.

It is easy to show that t ↦ τ_t is a right-continuous, increasing process, and that ⟨M,M⟩_{τ_t} = t and τ_{⟨M,M⟩_t} = t. Furthermore, τ_t is a stopping time for each fixed t. Consider the time-changed process B_t = M_{τ_t}. Let σ_n = inf{t : |M_t| ≥ n}. Since M is a continuous local martingale, we have σ_n ↑ ∞ and each stopped process M^{σ_n} is a square-integrable martingale. From ⟨M,M⟩_{τ_t} = t, we have by Fatou's lemma

E B_t² ≤ liminf_n E M²_{σ_n ∧ τ_t} = liminf_n E ⟨M,M⟩_{σ_n ∧ τ_t} ≤ E ⟨M,M⟩_{τ_t} = t.

By the optional sampling theorem, B is a martingale. We now show that B is continuous. There exists a sequence of partitions {Δ_n} of the time set R_+ such that |Δ_n| → 0 and, with probability 1,

Σ_j (M_{t_j^n ∧ t} − M_{t_{j−1}^n ∧ t})² → ⟨M,M⟩_t  for all t.

Now for any fixed t, the square of the jump

(B_t − B_{t−})² = (M_{τ_t} − M_{τ_{t−}})²

is bounded by the quadratic variation of M in the time interval [τ_{t−δ}, τ_t] for any δ > 0. Hence

(B_t − B_{t−})² ≤ lim_n Σ_{τ_{t−δ} ≤ t_j^n ≤ τ_t} (M_{t_j^n ∧ τ_t} − M_{t_{j−1}^n ∧ τ_t})² = ⟨M,M⟩_{τ_t} − ⟨M,M⟩_{τ_{t−δ}} = δ.

Since δ is arbitrary, we see that B_{t−} = B_t. This shows that B is continuous with probability 1. So far we have proved that B is a continuous local martingale. We now compute the quadratic variation of B. The process M² − ⟨M,M⟩ is a continuous local martingale. By the optional sampling theorem, we see that

M²_{τ_t} − ⟨M,M⟩_{τ_t} = B_t² − t

is a continuous local martingale. It follows that the quadratic variation process of B is just ⟨B,B⟩_t = t. Now we can use Lévy's characterization to conclude that B_t = M_{τ_t} is a Brownian motion. Therefore we have shown that every continuous local martingale can be transformed into a Brownian motion by a time change. From B_t = M_{τ_t} and τ_{⟨M,M⟩_t} = t we have M_t = B_{⟨M,M⟩_t}. In this sense we say that every continuous local martingale is a time change of a Brownian motion.

2. Exponential martingale

We want to find an analog of the exponential function e^x. The defining property of the exponential function is the differential equation

df(x)/dx = f(x),  f(0) = 1,

or equivalently

f(x) = 1 + ∫_0^x f(t) dt.

So we define an exponential martingale E_t by the stochastic integral equation

E_t = 1 + ∫_0^t E_s dM_s,

where M is a continuous local martingale. This equation can be solved explicitly. Instead of writing down the formula and verifying it, let us discover the formula. Since E_0 = 1 we can take the logarithm of E_t, at least for small time t. Let C_t = log E_t. By Itô's formula we have

C_t = ∫_0^t E_s^{−1} dE_s − (1/2) ∫_0^t E_s^{−2} d⟨E,E⟩_s.

Since dE_s = E_s dM_s, we have d⟨E,E⟩_s = E_s² d⟨M,M⟩_s. Hence

C_t = M_t − (1/2)⟨M,M⟩_t.

Therefore the formula for the exponential martingale is

E_t = exp{M_t − (1/2)⟨M,M⟩_t}.

Now it is easy to verify directly that this process satisfies the defining equation for E_t. Every strictly positive continuous local martingale can be written in the form of an exponential martingale

E_t = exp{M_t − (1/2)⟨M,M⟩_t},

where the local martingale M can be expressed in terms of E by

M_t = ∫_0^t E_s^{−1} dE_s.
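The closed form can be compared with an Euler discretization of the defining equation dE_s = E_s dM_s in the simplest case M = B, where ⟨M,M⟩_t = t. A minimal sketch (step count and seed are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
n_steps = 100000
dt = 1.0 / n_steps

# One discretized Brownian path on [0, 1].
dB = rng.normal(0.0, np.sqrt(dt), size=n_steps)
B = np.concatenate([[0.0], np.cumsum(dB)])
t = np.linspace(0.0, 1.0, n_steps + 1)

# Closed form for M = B (so <M,M>_t = t): E_t = exp(B_t - t/2).
E_closed = np.exp(B - 0.5 * t)

# Euler scheme for dE_s = E_s dM_s, E_0 = 1: E_{k+1} = E_k (1 + dB_k).
E_euler = np.concatenate([[1.0], np.cumprod(1.0 + dB)])

rel_err = abs(E_euler[-1] - E_closed[-1]) / E_closed[-1]
print(rel_err)  # small for a fine discretization
```

The Euler scheme compounds the increments multiplicatively, which is exactly how the stochastic exponential accumulates its increments.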

We can say that an exponential martingale is nothing but a strictly positive continuous local martingale. The exponential martingale is related to the iterated stochastic integrals defined by I_0(t) = 1 and

I_n(t) = ∫_0^t I_{n−1}(s) dM_s.

By iterating the defining equation of the exponential martingale we have the expansion

E_t = exp{M_t − (1/2)⟨M,M⟩_t} = Σ_{n=0}^∞ I_n(t).

This formula can be verified rigorously; namely, it can be shown that the infinite series converges and that the remainder from the iteration tends to zero. To continue our discussion, let us introduce a parameter λ by replacing M with λM and obtain

(2.1)  exp{λM_t − (λ²/2)⟨M,M⟩_t} = Σ_{n=0}^∞ λⁿ I_n(t).

If we set

x = M_t / √⟨M,M⟩_t  and  θ = λ √⟨M,M⟩_t,

then the left side of (2.1) becomes exp{xθ − θ²/2}. The coefficients of its Taylor expansion in θ are called Hermite polynomials:

e^{xθ − θ²/2} = Σ_{n=0}^∞ H_n(x) θⁿ / n!.

It can be shown that

H_n(x) = (−1)ⁿ e^{x²/2} (dⁿ/dxⁿ) e^{−x²/2},

and {H_n, n ≥ 0} is the orthogonal basis of L²(R, e^{−x²/2} dx/√(2π)) obtained by the Gram-Schmidt procedure from the complete system {xⁿ, n ≥ 0}. Now we can write

exp{λM_t − (λ²/2)⟨M,M⟩_t} = Σ_{n=0}^∞ (λⁿ/n!) ⟨M,M⟩_t^{n/2} H_n(M_t / √⟨M,M⟩_t).

It follows that the iterated stochastic integrals can be expressed in terms of M_t and ⟨M,M⟩_t via Hermite polynomials as follows:

I_n(t) = (1/n!) ⟨M,M⟩_t^{n/2} H_n(M_t / √⟨M,M⟩_t).
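The Hermite representation of I_n can be sanity-checked with numpy's probabilists' Hermite polynomials (the module hermite_e implements exactly the H_n above): for n = 2 and n = 3 it reproduces I_2(t) = (M_t² − ⟨M,M⟩_t)/2 and I_3(t) = (M_t³ − 3M_t⟨M,M⟩_t)/6, which also follow from a direct application of Itô's formula. The sample values below are arbitrary.

```python
import numpy as np
from numpy.polynomial import hermite_e

# hermeval(x, c) computes sum_n c[n] He_n(x) for the probabilists'
# Hermite polynomials He_n, so a unit coefficient vector picks out He_n.
def He(n, x):
    c = np.zeros(n + 1)
    c[n] = 1.0
    return hermite_e.hermeval(x, c)

M_t, qv = 1.7, 0.9          # illustrative values of M_t and <M,M>_t
x = M_t / np.sqrt(qv)

# I_n(t) = (1/n!) <M,M>_t^{n/2} H_n(M_t / sqrt(<M,M>_t))
I2_hermite = 0.5 * qv * He(2, x)
I3_hermite = (1.0 / 6.0) * qv ** 1.5 * He(3, x)

# Direct forms obtained from Ito's formula.
I2_direct = 0.5 * (M_t ** 2 - qv)
I3_direct = (1.0 / 6.0) * (M_t ** 3 - 3.0 * M_t * qv)

print(I2_hermite - I2_direct, I3_hermite - I3_direct)  # both ~ 0
```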

Here are the first few iterated stochastic integrals:

I_0(t) = 1,  I_1(t) = M_t,
I_2(t) = (1/2)(M_t² − ⟨M,M⟩_t),
I_3(t) = (1/6)(M_t³ − 3 M_t ⟨M,M⟩_t).

Further discussion of this topic can be found in Hida [5].

3. Uniformly integrable exponential martingales

In stochastic analysis exponential martingales (positive martingales) often appear in the following context. Suppose that (Ω, {F_t}, P) is a filtered probability space. Let Q be another probability measure which is absolutely continuous with respect to P on the σ-algebra F_T. Then Q is also absolutely continuous with respect to P on F_t for all t ≤ T. Let E_t be the Radon-Nikodym derivative of Q with respect to P on F_t:

E_t = dQ/dP |_{F_t}.

Then it is easy to verify that E_t = E{E_T | F_t}. This shows that {E_t, t ≤ T} is a positive martingale. If it is continuous, then it is an exponential martingale. Not only that, it is a uniformly integrable martingale. Conversely, if {E_t, t ≤ T} is a uniformly integrable exponential martingale, then it defines a change of measure on the filtered probability space (Ω, {F_t}, P) up to time T. It often happens that we know the local martingale M and wish that the exponential local martingale

E_t = exp{M_t − (1/2)⟨M,M⟩_t}

defines a change of probability measures by dQ/dP = E_T. For this it is necessary that EE_T = 1. However, in general, we only know that {E_t} is a positive local martingale. It is therefore a supermartingale and EE_T ≤ 1. Therefore the requirement EE_T = 1 is not automatic and has to be proved by imposing further conditions on the local martingale M.

PROPOSITION 3.1. Let M be a continuous local martingale and E_t = exp{M_t − (1/2)⟨M,M⟩_t}. Then EE_T ≤ 1, and {E_t, t ≤ T} is a uniformly integrable martingale if and only if EE_T = 1.
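Proposition 3.1 says that EE_T ≤ 1 always, with equality exactly when the exponential local martingale is uniformly integrable. For M_t = σB_t with constant σ the criteria proved below hold, so EE_T = 1, which is easy to observe by Monte Carlo. A sketch (σ, path count, and seed are illustrative):

```python
import numpy as np

rng = np.random.default_rng(2)
n_paths, T, sigma = 200000, 1.0, 0.8

# M_T = sigma * B_T, <M,M>_T = sigma^2 T: the exponential martingale
# at time T only depends on B_T, so sampling B_T ~ N(0, T) suffices.
B_T = rng.normal(0.0, np.sqrt(T), size=n_paths)
E_T = np.exp(sigma * B_T - 0.5 * sigma ** 2 * T)

print(E_T.mean())  # close to 1
```

For an unbounded drift the expectation can be strictly less than 1, which is exactly the failure of uniform integrability that the criteria below rule out.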

PROOF. We know that E is a local martingale, hence there is a sequence of stopping times τ_n ↑ ∞ such that E_{t∧τ_n} is a martingale; hence EE_{T∧τ_n} = EE_0 = 1. Letting n → ∞ and using Fatou's lemma we have EE_T ≤ 1. This inequality also follows from the fact that a nonnegative local martingale is always a supermartingale.

Suppose that EE_T = 1. Since E is a supermartingale we have E_t ≥ E{E_T | F_t}. Taking expected values gives 1 ≥ EE_t ≥ EE_T = 1. This shows that the equality must hold throughout and we have E_t = E{E_T | F_t}, which means that {E_t, t ≤ T} is a uniformly integrable martingale.

We need to impose conditions on the local martingale M to ensure that the exponential local martingale E is uniformly integrable. We have the following result due to Kazamaki.

THEOREM 3.2. Suppose that M is a martingale and E e^{M_T/2} is finite. Then

{ exp{M_t − (1/2)⟨M,M⟩_t}, t ≤ T }

is uniformly integrable.

PROOF. This is a very interesting proof. Let

E(λ)_t = exp{λM_t − (λ²/2)⟨M,M⟩_t}.

We need to show that E(1) is uniformly integrable, but first we take a slight step back and prove that E(λ) is uniformly integrable for all 0 < λ < 1. We achieve this by proving that there are an r > 1 and a constant C such that EE(λ)_σ^r ≤ C for all stopping times σ ≤ T. We have

E(λ)_σ^r = exp{(λr − √(λ³r)) M_σ} · exp{√(λ³r) M_σ − (λ²r/2)⟨M,M⟩_σ}.

Using Hölder's inequality with the conjugate exponents 1/(1−λ) and 1/λ, we see that EE(λ)_σ^r is bounded by

( E exp{ [(λr − √(λ³r))/(1−λ)] M_σ } )^{1−λ} · ( E exp{√(λr) M_σ − (λr/2)⟨M,M⟩_σ} )^{λ}.

The second expectation on the right side does not exceed 1 (see PROPOSITION 3.1). For the first factor, we claim that σ can be replaced by T and the coefficient can be replaced by 1/2 if r > 1 is sufficiently close to 1. When r = 1 the coefficient is

(λ − λ^{3/2})/(1−λ) = λ/(1 + √λ) < 1/2

because λ < 1. Hence the coefficient is still less than 1/2 if r > 1 is sufficiently close to 1. On the other hand, we have assumed that M is a martingale, hence M_σ = E{M_T | F_σ}. By Jensen's inequality we have

e^{M_σ/2} ≤ E{ e^{M_T/2} | F_σ }.

It follows that

E exp{ [(λr − √(λ³r))/(1−λ)] M_σ } ≤ E e^{M_σ/2} ≤ E e^{M_T/2}.

We therefore have shown that for any λ < 1 there is an r > 1 such that

EE(λ)_σ^r ≤ E e^{M_T/2}

for all stopping times σ ≤ T. This shows that E(λ) is a uniformly integrable martingale, which implies that EE(λ)_T = 1 for all λ < 1.

We now use the same trick again to show EE(1)_T = 1. We have

E(λ)_T = exp{λ²(M_T − (1/2)⟨M,M⟩_T)} · exp{(λ − λ²) M_T}.

Using Hölder's inequality with the conjugate exponents 1/λ² and 1/(1−λ²) we have

1 = EE(λ)_T ≤ ( E exp{M_T − (1/2)⟨M,M⟩_T} )^{λ²} · ( E exp{ [λ/(1+λ)] M_T } )^{1−λ²}.

Because λ/(1+λ) ≤ 1/2, the second expectation on the right side can be replaced by E e^{M_T/2}, which is finite; hence the second factor tends to 1 as λ ↑ 1. Letting λ ↑ 1 and using EE(1)_T ≤ 1 we obtain

E exp{M_T − (1/2)⟨M,M⟩_T} = 1.

The condition E e^{M_T/2} < ∞ is not easy to verify because we usually know the quadratic variation ⟨M,M⟩_T much better than M_T itself. The following weaker criterion can often be used directly.

COROLLARY 3.3. (Novikov's criterion) Suppose that M is a martingale. If E exp{(1/2)⟨M,M⟩_T} is finite, then

E exp{M_T − (1/2)⟨M,M⟩_T} = 1.

PROOF. We have

e^{M_T/2} = exp{(1/2)M_T − (1/4)⟨M,M⟩_T} · exp{(1/4)⟨M,M⟩_T}.

By the Cauchy-Schwarz inequality we have

E e^{M_T/2} ≤ ( E exp{M_T − (1/2)⟨M,M⟩_T} )^{1/2} · ( E exp{(1/2)⟨M,M⟩_T} )^{1/2}.

The first factor on the right side does not exceed 1. Therefore

E e^{M_T/2} ≤ ( E exp{(1/2)⟨M,M⟩_T} )^{1/2} < ∞.

Therefore Novikov's condition implies Kazamaki's condition.

4. Girsanov and Cameron-Martin-Maruyama theorems

In stochastic analysis we often need to change the base probability measure from a given P to another measure Q. Suppose that B is a Brownian motion under P. In general it will no longer be a Brownian motion under Q. The Girsanov theorem describes the decomposition of B as the sum of a martingale (necessarily a Brownian motion) and a process of bounded variation under a class of changes of measure. We assume that B is an {F_t}-Brownian motion and V a progressively measurable process such that

exp{ ∫_0^t V_s dB_s − (1/2) ∫_0^t V_s² ds },  t ≤ T,

is a uniformly integrable martingale. This exponential martingale therefore defines a change of measure on F_T by

dQ_T/dP = exp{ ∫_0^T V_s dB_s − (1/2) ∫_0^T V_s² ds }.

This is the class of changes of measure we will consider.

THEOREM 4.1. Suppose that Q is the new measure described above. Consider the Brownian motion with a drift

(4.1)  X_t = B_t − ∫_0^t V_τ dτ,  t ≤ T.

Then X is a Brownian motion under the probability measure Q.

PROOF. Let {e_s} be the exponential martingale

e_s = exp{ ∫_0^s V_τ dB_τ − (1/2) ∫_0^s V_τ² dτ }.

Then the density of Q with respect to P on F_s is just e_s. From this fact it is easy to verify that if Y is an adapted process then

E_Q{Y_t | F_s} = e_s^{−1} E_P{Y_t e_t | F_s}.

This means that Y is a (local) martingale under Q if and only if eY = {e_s Y_s, s ≥ 0} is a (local) martingale under P. Now e is a martingale and de_s = e_s V_s dB_s. On the other hand, using Itô's formula we have

d(e_s X_s) = e_s dB_s + e_s X_s V_s dB_s.

This shows that eX is a local martingale under P, hence X is a local martingale under Q. Since Q and P are mutually absolutely continuous, it should be clear that the quadratic variation process of X under P and under Q should be the same, i.e., ⟨X,X⟩_t = t. We can also verify this fact directly by applying Itô's formula to Z_s = e_s(X_s² − s). We have

dZ_s = e_s d(X_s² − s) + (X_s² − s) de_s + 2X_s d⟨e, X⟩_s.

The second term on the right side is a local martingale. We have

d(X_s² − s) = 2X_s dX_s + d⟨X,X⟩_s − ds = 2X_s dB_s − 2X_s V_s ds.

On the other hand, from de_s = e_s V_s dB_s we have

d⟨e, X⟩_s = e_s V_s ds.

It follows that

dZ_s = 2 e_s X_s dB_s + (X_s² − s) de_s,

which shows that Z is a local martingale. Now we have shown that both X_s and X_s² − s are local martingales under Q. By Lévy's criterion we conclude that X is a Brownian motion under Q.

The classical Cameron-Martin-Maruyama theorem is a special case of the Girsanov theorem, but stated in a slightly different form. Let µ be the Wiener measure on the path space W(R). Let h ∈ W(R) and consider the shift in the path space ξ_h w = w + h. The shifted Wiener measure is µ^h = µ ∘ ξ_h^{−1}. If X is the coordinate process on W(R), then µ^h is just the law of X + h. We will prove the following dichotomy: either µ^h and µ are mutually absolutely continuous or they are mutually singular. In fact we have an explicit criterion for this dichotomy.

DEFINITION 4.2. A path (function) h ∈ W(R) is called a Cameron-Martin path (function) if it is absolutely continuous and its derivative is square integrable. The Cameron-Martin norm is defined by

|h|²_H = ∫_0^1 |ḣ_s|² ds.

The space of Cameron-Martin paths is denoted by H. It is clear that H is a Hilbert space.

THEOREM 4.3. (Cameron-Martin-Maruyama theorem) Suppose that h ∈ H. Then the shifted Wiener measure µ^h and µ are mutually absolutely continuous and

(dµ^h/dµ)(w) = exp{ ∫_0^1 ḣ_s dw_s − (1/2) ∫_0^1 |ḣ_s|² ds }.

PROOF. Denote the exponential density above by e_1(w). We need to show that for every nonnegative measurable function F on W(R),

E_{µ^h}(F) = E_µ(F e_1).

Let X be the coordinate process on W(R). Then the left side is simply E_µ F(X + h). Introduce the measure ν by

dν/dµ = exp{ −∫_0^1 ḣ_s dw_s − (1/2) ∫_0^1 |ḣ_s|² ds }.

We have

dµ/dν = exp{ ∫_0^1 ḣ_s dw_s + (1/2) ∫_0^1 |ḣ_s|² ds } = exp{ ∫_0^1 ḣ_s d(w_s + h_s) − (1/2) ∫_0^1 |ḣ_s|² ds }.

Hence we can write dµ/dν = e_1(X + h). It follows that

E_{µ^h} F = E_µ F(X + h) = E_ν [ F(X + h) dµ/dν ] = E_ν [ F(X + h) e_1(X + h) ].

By Girsanov's theorem X + h is a Brownian motion under ν. On the other hand, X is a Brownian motion under µ. Therefore on the right side of the above equality we can replace ν by µ and at the same time replace X + h by X; hence

E_{µ^h} F = E_µ [ F(X) e_1(X) ] = E_µ(F e_1).

THEOREM 4.4. Let h ∈ W(R). The shifted Wiener measure µ^h is mutually absolutely continuous or mutually singular with respect to µ according as h ∈ H or h ∉ H.

PROOF. We only need to show that if h ∉ H, then µ^h is singular with respect to µ. First we convert the condition h ∉ H into a more convenient condition. A function ḣ is square integrable if and only if |∫_0^1 f_s ḣ_s ds| ≤ C |f| for all step functions f and some constant C, where |f| is the L² norm. Therefore it is conceivable that if h ∉ H, then for any C there is a step function f such that

|f|² = Σ_{i=1}^n f_i² (s_i − s_{i−1}) = 1  and  ∫_0^1 f_s dh_s = Σ_{i=1}^n f_i (h_{s_i} − h_{s_{i−1}}) ≥ C.

This characterization of h ∉ H can indeed be verified rigorously. Second, a convenient sufficient condition for µ^h to be singular with respect to µ is the following: for any positive ε, there is a set A such that µ^h(A) ≥ 1 − ε and µ(A) ≤ ε.
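The Cameron-Martin-Maruyama formula can be spot-checked for the linear shift h_s = c·s, where ḣ ≡ c and e_1 = exp{cW_1 − c²/2}: for the functional F(w) = w_1 both sides of E_{µ^h}F = E_µ(F e_1) equal c. A Monte Carlo sketch (constants and seed are illustrative):

```python
import numpy as np

rng = np.random.default_rng(3)
n_paths, c = 400000, 0.7

# Only the endpoint w_1 matters for F(w) = w_1, so sample W_1 ~ N(0, 1).
W_1 = rng.normal(0.0, 1.0, size=n_paths)

# h_s = c * s is a Cameron-Martin path: hdot = c, |h|_H^2 = c^2.
e_1 = np.exp(c * W_1 - 0.5 * c ** 2)   # Radon-Nikodym density

lhs = (W_1 + c).mean()     # F under the shifted measure mu^h
rhs = (W_1 * e_1).mean()   # F * density under mu

print(lhs, rhs)  # both close to c = 0.7
```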

Consider the random variable

Z(w) = ∫_0^1 f_s dw_s = Σ_{i=1}^n f_i (w_{s_i} − w_{s_{i−1}}).

It is a Gaussian random variable with mean zero and variance |f|² = 1. Therefore it has the standard Gaussian distribution N(0,1). Let A = {Z ≥ C/2}. We have µ(A) ≤ ε for sufficiently large C. On the other hand,

µ^h(A) = µ{ w ∈ W(R) : Z(w + h) ≥ C/2 }.

By the definition of Z(w) we have

Z(w + h) = Z(w) + Σ_{i=1}^n f_i (h_{s_i} − h_{s_{i−1}}) ≥ Z(w) + C.

Therefore

µ^h(A) ≥ µ{ Z + C ≥ C/2 } = µ{ Z ≥ −C/2 },

and µ^h(A) ≥ 1 − ε for sufficiently large C. Thus we have shown that µ^h and µ are mutually singular.

5. Moment inequalities for martingales

Let M be a continuous martingale and M*_t = max_{s≤t} |M_s|. The moment E|M*_t|^p can be bounded both from above and from below by E⟨M,M⟩_t^{p/2}.

THEOREM 5.1. Let M be a continuous local martingale. For any p > 0, there are positive constants c_p, C_p such that

c_p E⟨M,M⟩_t^{p/2} ≤ E(M*_t)^p ≤ C_p E⟨M,M⟩_t^{p/2}.

PROOF. The case p = 2 is obvious. We only prove the case p > 2. The case 0 < p < 2 is slightly more complicated; see Ikeda and Watanabe [6]. By the usual stopping time argument we may assume without loss of generality that M is uniformly bounded, so there is no problem of integrability. We prove the upper bound first. We start with Doob's submartingale inequality

(5.1)  E(M*_t)^p ≤ ( p/(p−1) )^p E|M_t|^p.
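The two-sided bound can be observed numerically for M = B, where ⟨B,B⟩_1 = 1: taking p = 4, the moment E(B*_1)⁴ must lie between E|B_1|⁴ = 3 (since B*_1 ≥ |B_1| pathwise) and the Doob bound (4/3)⁴·3 ≈ 9.5 from (5.1). A simulation sketch (parameters and seed are illustrative):

```python
import numpy as np

rng = np.random.default_rng(4)
n_paths, n_steps = 10000, 500
dt = 1.0 / n_steps

dB = rng.normal(0.0, np.sqrt(dt), size=(n_paths, n_steps))
B = np.cumsum(dB, axis=1)

# B*_1 = max_{s <= 1} |B_s| along each discretized path.
B_star = np.abs(B).max(axis=1)
m4 = (B_star ** 4).mean()

# Theorem 5.1 with p = 4 brackets m4 by constant multiples of
# <B,B>_1^2 = 1; here 3 <= m4 <= (4/3)^4 * 3 ~ 9.5.
print(m4)
```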

We apply Itô's formula to |M_t|^p. Note that x ↦ |x|^p is twice continuously differentiable because p > 2. This gives

|M_t|^p = p ∫_0^t |M_s|^{p−1} sgn(M_s) dM_s + (p(p−1)/2) ∫_0^t |M_s|^{p−2} d⟨M,M⟩_s.

Take expectations and use the obvious bound |M_s| ≤ M*_s to obtain

E|M_t|^p ≤ (p(p−1)/2) E[ (M*_t)^{p−2} ⟨M,M⟩_t ].

We use Hölder's inequality on the right side and (5.1) on the left side to obtain

E(M*_t)^p ≤ C { E(M*_t)^p }^{(p−2)/p} { E⟨M,M⟩_t^{p/2} }^{2/p},

where C is a constant depending on p. The upper bound follows immediately.

The lower bound is slightly trickier. Using Itô's formula we have

M_t ⟨M,M⟩_t^{(p−2)/4} = ∫_0^t ⟨M,M⟩_s^{(p−2)/4} dM_s + ∫_0^t M_s d⟨M,M⟩_s^{(p−2)/4}.

The first term on the right side is the term we are aiming at, because its second moment is precisely (up to a constant) the left side of the inequality we want to prove. This is the reason why we choose the exponent (p−2)/4. We have

| ∫_0^t ⟨M,M⟩_s^{(p−2)/4} dM_s | ≤ 2 M*_t ⟨M,M⟩_t^{(p−2)/4}.

Squaring this inequality and taking expectations, we have, after using Hölder's inequality,

(2/p) E⟨M,M⟩_t^{p/2} ≤ 4 { E(M*_t)^p }^{2/p} { E⟨M,M⟩_t^{p/2} }^{(p−2)/p}.

The lower bound follows immediately.

6. Martingale representation theorem

Let B be a standard Brownian motion and F^B_t = σ{B_s, s ≤ t} its associated filtration of σ-fields, properly completed so that it satisfies the usual conditions. Now suppose M is a square-integrable martingale with respect to this filtration. We will show that it can always be represented as a stochastic integral with respect to the Brownian motion; namely, there exists a progressively measurable process H such that

M_t = ∫_0^t H_s dB_s.

Thus every martingale with respect to the filtration generated by a Brownian motion is a stochastic integral with respect to this Brownian motion. This is a very important result in stochastic analysis. In this section we will give a proof of this result by an approach we believe is direct, short, and

well motivated. It uses nothing more than Itô's formula in a very elementary way. In the next section we will discuss another approach to this useful theorem.

We make a few remarks before the proof. First of all, the martingale representation theorem is equivalent to the following representation theorem: if X is a square-integrable random variable measurable with respect to F^B_T, then it can be represented in the form

X = EX + ∫_0^T H_s dB_s;

for if we take X = M_T, then we have

M_T = EM_T + ∫_0^T H_s dB_s.

Since both sides are martingales, the equality must also hold if T is replaced by any t ≤ T.

Second, the representation is unique, because if

∫_0^T H_s dB_s = ∫_0^T G_s dB_s,

then from

E | ∫_0^T H_s dB_s − ∫_0^T G_s dB_s |² = E ∫_0^T |H_s − G_s|² ds

we immediately have H = G on [0,T] × Ω with respect to the measure L ⊗ P, where L is the Lebesgue measure.

Third, if X_n → X in L²(Ω, F, P) and X_n = EX_n + ∫_0^T H^n_s dB_s, then EX_n → EX and

E ∫_0^T |H^m_s − H^n_s|² ds ≤ E|X_m − X_n|².

This shows that {H^n} is a Cauchy sequence in L²([0,T] × Ω, L ⊗ P). It therefore converges to a process H, and X = EX + ∫_0^T H_s dB_s. The point here is that it is enough to show the representation theorem for a dense subset of random variables.

Finally, we observe that a square-integrable martingale with respect to the filtration generated by a Brownian motion is necessarily continuous, because every stochastic integral is continuous with respect to its upper time limit.

We may assume without loss of generality that T = 1. Our method starts with the simple case X = f(B_1) for a smooth bounded function f. It is easy to

prove the theorem in this case and find an explicit formula for the process H.

PROPOSITION 6.1. For any bounded smooth function f,

f(W_1) = E f(W_1) + ∫_0^1 E{ f′(W_1) | F_s } dW_s.

PROOF. We have f(W_1) = f(W_t + W_1 − W_t). We know that W_1 − W_t has the normal distribution N(0, 1−t) and is independent of F_t. Hence,

(6.1)  E{ f(W_1) | F_t } = (1/√(2π(1−t))) ∫_R f(W_t + x) e^{−x²/(2(1−t))} dx.

Now we regard the right side as a function of W_t and t and apply Itô's formula. Since we know that it is a martingale, we only need to find its martingale part, which is very easy: just differentiate with respect to W_t and integrate the derivative against dW_t. We have

f(W_1) = E f(W_1) + ∫_0^1 [ (1/√(2π(1−t))) ∫_R f′(W_t + x) e^{−x²/(2(1−t))} dx ] dW_t.

The difference between this integrand and the right side of (6.1) is simply that f is replaced by its derivative f′. This shows that

f(W_1) = E f(W_1) + ∫_0^1 E{ f′(W_1) | F_t } dW_t.

The general case can be handled by an induction argument.

THEOREM 6.2. Let X ∈ L²(Ω, F_1, P). Then there is a progressively measurable process H such that

X = EX + ∫_0^1 H_s dW_s.

PROOF. It is enough to show the representation theorem for random variables of the form

X = f(W_{s_1}, W_{s_2} − W_{s_1}, …, W_{s_n} − W_{s_{n−1}}),

where f is a bounded smooth function with bounded first derivatives, because random variables of this type form a dense subset of L²(Ω, F_1, P) (see the remarks at the beginning of this section). By the induction hypothesis applied to the Brownian motion {W_s − W_{s_1}, s_1 ≤ s ≤ 1}, we have

f(x, W_{s_2} − W_{s_1}, …, W_{s_n} − W_{s_{n−1}}) = h(x) + ∫_{s_1}^1 H^x_s dW_s,

where

h(x) = E f(x, W_{s_2} − W_{s_1}, …, W_{s_n} − W_{s_{n−1}}),

and the dependence of H^x on x is smooth. Hence, replacing x by W_{s_1}, we have

X = h(W_{s_1}) + ∫_{s_1}^1 H_s dW_s,

where H_s = H^{W_{s_1}}_s. We also have

h(W_{s_1}) = E h(W_{s_1}) + ∫_0^{s_1} H_s dW_s.

It is clear that E h(W_{s_1}) = EX; hence

X = EX + ∫_0^1 H_s dW_s.

We now give a general formula for the integrand in the martingale representation theorem for a wide class of random variables. Recall the definition of the Cameron-Martin space

H = { h ∈ C[0,1] : h(0) = 0, ḣ ∈ L²[0,1] }

and the Cameron-Martin norm

|h|²_H = ∫_0^1 |ḣ_s|² ds.

A function F : C[0,1] → R is called a cylinder function if it has the form

F(w) = f(w_{s_1}, …, w_{s_l}),

where 0 < s_1 < ⋯ < s_l ≤ 1 and f : R^l → R is a bounded smooth function. We also assume that f has bounded first derivatives and use the notation

F_{x_i}(w) = f_{x_i}(w_{s_1}, …, w_{s_l}).

For a cylinder function F and h ∈ C[0,1], the directional derivative along h ∈ W(R) is defined by

D_h F(w) = lim_{t→0} [ F(w + th) − F(w) ] / t.

If h ∈ H, then it is easy to verify that

D_h F(w) = ⟨DF(w), h⟩_H,

where

DF(w)_s = Σ_{i=1}^l min{s, s_i} F_{x_i}(w).

Note that the derivative

D_s F(w) = (d/ds) { DF(w)_s }

is given by

D_s F(w) = Σ_{i=1}^l F_{x_i}(w) I_{[0, s_i]}(s).

We have the following integration by parts formula.

THEOREM 6.3. Let H : Ω → H be adapted and suppose E exp{ |H|²_H / 2 } is finite. Then the following integration by parts formula holds for a cylinder function F:

E D_H F = E ⟨DF, H⟩_H = E [ F ∫_0^1 Ḣ_s dW_s ].

PROOF. By Girsanov's theorem, under the probability Q defined by

dQ/dP = exp{ t ∫_0^1 Ḣ_s dW_s − (t²/2) ∫_0^1 |Ḣ_s|² ds },

the process X_s = W_s − tH_s, 0 ≤ s ≤ 1, is a Brownian motion; hence E_Q F(X) = E F(W). This means that

E [ F(W − tH) exp{ t ∫_0^1 Ḣ_s dW_s − (t²/2) ∫_0^1 |Ḣ_s|² ds } ]

is independent of t. Differentiating with respect to t and setting t = 0, we obtain the integration by parts formula.

The explicit martingale representation is given by the following Clark-Ocone formula.

THEOREM 6.4. Let F be a cylinder function on the path space W(R). Then

F(W) = E F(W) + ∫_0^1 E{ D_s F(W) | F_s } dW_s.

PROOF. By the martingale representation theorem we have

F(W) = E F(W) + ∫_0^1 Ḣ_s dW_s,

where Ḣ is adapted. By definition, E D_G F = E ⟨DF, G⟩_H for G ∈ H. By the integration by parts formula, the left side is

E D_G F = E [ F ∫_0^1 Ġ_s dW_s ] = E [ ∫_0^1 Ḣ_s dW_s ∫_0^1 Ġ_s dW_s ] = E ∫_0^1 Ḣ_s Ġ_s ds.
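For a concrete check of the Clark-Ocone formula, take the cylinder function F(w) = w_1², so that D_s F = 2W_1 and E{D_s F | F_s} = 2W_s; the formula then reduces to Itô's identity W_1² = 1 + ∫_0^1 2W_s dW_s, which can be verified pathwise on a discretized path. A sketch (step count and seed are illustrative):

```python
import numpy as np

rng = np.random.default_rng(5)
n_steps = 200000
dt = 1.0 / n_steps

dW = rng.normal(0.0, np.sqrt(dt), size=n_steps)
W = np.concatenate([[0.0], np.cumsum(dW)])

# For F(W) = W_1^2: D_s F = 2 W_1, hence E{D_s F | F_s} = 2 W_s and
# Clark-Ocone reads W_1^2 = 1 + int_0^1 2 W_s dW_s.
ito_integral = np.sum(2.0 * W[:-1] * dW)   # left-point (Ito) sums
lhs = W[-1] ** 2
rhs = 1.0 + ito_integral

print(abs(lhs - rhs))  # small discretization error
```

Note that the left-point evaluation of the integrand is essential: a midpoint or right-point rule would converge to a different (Stratonovich-type) integral.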

The right side is

E ⟨DF, G⟩_H = E ∫_0^1 D_s F Ġ_s ds = E ∫_0^1 E{ D_s F | F_s } Ġ_s ds,

the last equality because Ġ is adapted. Hence for every adapted process Ġ we have

E ∫_0^1 E{ D_s F | F_s } Ġ_s ds = E ∫_0^1 Ḣ_s Ġ_s ds.

Since E{ D_s F | F_s } and Ḣ_s are both adapted, we must have Ḣ_s = E{ D_s F | F_s }.

7. Reflecting Brownian motion

Let B be a one-dimensional Brownian motion starting from zero. The process X_t = |B_t| is called reflecting Brownian motion. We have X_t = F(B_t), where F(x) = |x|. We want to apply Itô's formula to F(B_t), but unfortunately F is not C². We approximate F by

F_ε(x) = (1/ε) ∫_0^x du_1 ∫_0^{u_1} I_{[−ε,ε]}(u_2) du_2.

This function is still not C², because F_ε″ = I_{[−ε,ε]}/ε is not continuous, but it is clear that F_ε(x) → |x| and F_ε′(x) → sgn(x) as ε → 0. Now let φ be a continuous function and define

F_φ(x) = ∫_0^x du_1 ∫_0^{u_1} φ(u_2) du_2.

Itô's formula can be applied to F_φ(B_t), and we obtain

F_φ(B_t) = ∫_0^t F_φ′(B_s) dB_s + (1/2) ∫_0^t φ(B_s) ds.

Now for a fixed ε we let φ in the above formula be the continuous function

φ_n(x) = 0, if |x| ≥ ε + n^{−1};
φ_n(x) = 1/ε, if |x| ≤ ε;
φ_n linear, in the two remaining intervals.

Then F_{φ_n} → F_ε and F_{φ_n}′ → F_ε′. It follows that we have

F_ε(B_t) = ∫_0^t F_ε′(B_s) dB_s + (1/2ε) ∫_0^t I_{[−ε,ε]}(B_s) ds.

In the last term,

∫_0^t I_{[−ε,ε]}(B_s) ds

is the amount of time the Brownian path spends in the interval [−ε, ε] up to time t. Now let ε → 0 in the above identity for F_ε(B_t). The term on the left

side converges to $|B_t|$ and the first term on the right side converges to the stochastic integral $\int_0^t \mathrm{sgn}(B_s)\, dB_s$. Hence the limit
$$L_t = \lim_{\epsilon \to 0} \frac{1}{\epsilon} \int_0^t I_{(-\epsilon,\epsilon)}(B_s)\, ds$$
must exist and we have
$$|B_t| = \int_0^t \mathrm{sgn}(B_s)\, dB_s + \frac{1}{2} L_t.$$
We see that $L_t$ can be interpreted as the amount of time Brownian motion spends in the interval $(-\epsilon, \epsilon)$, properly normalized. It is called the local time of Brownian motion $B$ at $x = 0$. Let
$$W_t = \int_0^t \mathrm{sgn}(B_s)\, dB_s.$$
Then $W$ is a continuous martingale with quadratic variation process
$$\langle W, W \rangle_t = \int_0^t \mathrm{sgn}(B_s)^2\, ds = t.$$
Note that Brownian motion spends zero time at $x = 0$, because $E I_{\{0\}}(B_s) = P\{B_s = 0\} = 0$ and hence
$$E \int_0^t I_{\{0\}}(B_s)\, ds = \int_0^t E I_{\{0\}}(B_s)\, ds = 0;$$
by Lévy's criterion $W$ is therefore a Brownian motion. We thus conclude that reflecting Brownian motion $|B_t|$ is a submartingale with the decomposition
$$|B_t| = W_t + \frac{1}{2} L_t.$$
It is interesting to note that $W$ can be expressed in terms of reflecting Brownian motion by
$$W_t = X_t - \frac{1}{2} L_t,$$
where
$$(7.1) \qquad L_t = \lim_{\epsilon \to 0} \frac{1}{\epsilon} \int_0^t I_{[0,\epsilon)}(X_s)\, ds.$$
We now pose the question: can $X$ and $L$ be expressed in terms of $W$? That the answer to this question is affirmative is the content of the so-called Skorokhod problem.

DEFINITION 7.1. Given a continuous path $f : \mathbb{R}_+ \to \mathbb{R}$ such that $f(0) \ge 0$, a pair of functions $(g, h)$ is a solution of the Skorokhod problem if
(1) $g(t) \ge 0$ for all $t \ge 0$;
(2) $h$ is increasing with $h(0) = 0$ and increases only when $g = 0$;
(3) $g = f + h$.
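The Tanaka decomposition and the occupation-time definition of $L_t$ above can be illustrated numerically: on a simulated path, the two quantities $2\bigl(|B_1| - \int_0^1 \mathrm{sgn}(B_s)\,dB_s\bigr)$ and $\epsilon^{-1}\int_0^1 I_{(-\epsilon,\epsilon)}(B_s)\,ds$ should nearly agree. A minimal sketch (step count, $\epsilon$, and the seed are arbitrary choices):

```python
import numpy as np

# Two estimates of the local time L_1 of Brownian motion at 0:
#   tanaka     = 2 * (|B_1| - int_0^1 sgn(B_s) dB_s)   (Tanaka's formula)
#   occupation = (1/eps) * |{s <= 1 : |B_s| < eps}|    (normalized occupation time)
rng = np.random.default_rng(0)
n = 200_000
dt = 1.0 / n
dB = rng.normal(0.0, np.sqrt(dt), n)
B = np.concatenate(([0.0], np.cumsum(dB)))

# Left-point Riemann sum approximating the Ito integral int sgn(B) dB.
stoch_int = np.sum(np.sign(B[:-1]) * dB)

eps = 0.01
occupation = dt * np.count_nonzero(np.abs(B[:-1]) < eps) / eps
tanaka = 2.0 * (abs(B[-1]) - stoch_int)
print(tanaka, occupation)  # two estimates of L_1
```

Both estimators converge to $L_1$ as the step size and $\epsilon$ shrink, so the printed values should be close but not identical on a single path.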

The main result is that the Skorokhod problem can be solved uniquely and explicitly.

THEOREM 7.2. There exists a unique solution to the Skorokhod problem.

PROOF. It is interesting that the solution can be written down explicitly:
$$h(t) = -\min_{s \le t} f(s), \qquad g(t) = f(t) - \min_{s \le t} f(s).$$
Let us assume that $f(0) = 0$ for simplicity. If $f(0) > 0$, then $h(t) = 0$ and $g(t) = f(t)$ before the first time $f$ reaches $0$, and after this time it is as if the path starts from $0$. The explicit solution in this case is
$$h(t) = -\min_{s \le t}\{f(s) \wedge 0\}, \qquad g(t) = f(t) - \min_{s \le t}\{f(s) \wedge 0\}.$$
It is clear that $g(t) \ge 0$ for all $t$ and that $h$ increases from $h(0) = 0$. The equation $g = f + h$ is also obvious. We only need to show that $h$ increases only when $g(t) = 0$. This means that as a Borel measure $h$ only charges the zero set $\{t : g(t) = 0\}$. This requirement is often written as
$$h(t) = \int_0^t I_{\{0\}}(g(s))\, dh(s).$$
Equivalently, it is enough to show that for any $t$ such that $g(t) > 0$ there is a neighborhood $(t - \delta, t + \delta)$ of $t$ on which $h$ is constant. This should be clear, for if $g(t) > 0$, then $f(t) > \min_{s \le t} f(s)$, which means that the minimum must be achieved at a point $\xi \in [0, t)$ with $f(t) > f(\xi)$. By continuity a small change of $t$ will not alter this situation, which means that $h = -f(\xi)$ in a neighborhood of $t$. More precisely, from $g(t) = f(t) - \min_{s \le t} f(s) > 0$ and the continuity of $f$, there is a positive $\delta$ such that
$$\min_{t - \delta \le s \le t + \delta} f(s) > \min_{s \le t - \delta} f(s).$$
Hence
$$\min_{s \le t + \delta} f(s) = \min\left\{ \min_{s \le t - \delta} f(s),\ \min_{t - \delta \le s \le t + \delta} f(s) \right\} = \min_{s \le t - \delta} f(s).$$
This means that $h(t + \delta) = h(t - \delta)$, hence $h$ must be constant on $(t - \delta, t + \delta)$ because $h$ is increasing.

We now show that the solution to the Skorokhod problem is unique. Suppose that $(g, h)$ and $(g_1, h_1)$ are two solutions and let $\xi = h - h_1$. It is continuous and of bounded variation, hence
$$\xi(t)^2 = 2 \int_0^t \xi(s)\, d\xi(s).$$
On the other hand, $\xi(s) = g(s) - g_1(s)$, hence
$$\xi(t)^2 = 2 \int_0^t \{g(s) - g_1(s)\}\, d\{h(s) - h_1(s)\}.$$
There are four terms on the right side:
$$\int_0^t g(s)\, dh(s) = \int_0^t g_1(s)\, dh_1(s) = 0$$
because $h$ increases only when $g = 0$ and $h_1$ increases only when $g_1 = 0$;

$$\int_0^t g(s)\, dh_1(s) \ge 0 \quad \text{and} \quad \int_0^t g_1(s)\, dh(s) \ge 0$$
because $g(s) \ge 0$ and $g_1(s) \ge 0$. Putting these observations together we have $\xi(t)^2 \le 0$, which means that $\xi(t) = 0$. This proves the uniqueness.

If we apply the Skorokhod equation to Brownian motion by replacing $f$ with Brownian paths, we obtain some interesting results. We have shown that
$$|B_t| = W_t + \frac{1}{2} L_t,$$
where $W$ is a Brownian motion. From the solution of the Skorokhod problem we conclude that $|B|$ and $L$ are determined by $W$:
$$(7.2) \qquad |B_t| = W_t - \min_{s \le t} W_s, \qquad \frac{1}{2} L_t = -\min_{s \le t} W_s.$$

THEOREM 7.3. Let $W$ be a Brownian motion.
(1) The processes $\{\max_{s \le t} W_s - W_t,\ t \ge 0\}$ and $\{|W_t|,\ t \ge 0\}$ have the same law, i.e., that of a reflecting Brownian motion.
(2) We have
$$\max_{s \le t} W_s = \lim_{\epsilon \to 0} \frac{1}{2\epsilon} \int_0^t I_{[0,\epsilon)}\left( \max_{u \le s} W_u - W_s \right) ds.$$

PROOF. The assertions follow immediately from (7.2) by replacing $W$ with $-W$, which is also a Brownian motion; in the second assertion, we use the fact that $L$ is the normalized occupation time of reflecting Brownian motion (7.1).

REMARK 7.4. We have calculated the joint distribution of $\max_{s \le t} W_s$ and $W_t$ for a fixed $t$, and we know that $\max_{s \le t} W_s - W_t$ and $|W_t|$ have the same distribution for each fixed $t$. The above theorem claims much more: they have the same distribution as stochastic processes.

8. Brownian bridge

If we condition a Brownian motion to return to a fixed point $x$ at time $t = 1$ we obtain a Brownian bridge from $o$ to $x$ with time horizon $1$. Let
$$L_x(\mathbb{R}^n) = \{w \in W_o(\mathbb{R}^n) : w_1 = x\}.$$
The law of a Brownian bridge from $o$ to $x$ in time $1$ is a probability measure $\mu^x$ on $L_x(\mathbb{R}^n)$, which we will call the Wiener measure on $L_x(\mathbb{R}^n)$. Note that $L_x(\mathbb{R}^n)$ is a subspace of $W_o(\mathbb{R}^n)$, thus $\mu^x$ is also a measure on $W_o(\mathbb{R}^n)$. By definition, we can write intuitively
$$\mu^x(C) = \mu\{C \mid w_1 = x\}.$$
Here $\mu$ is the Wiener measure on $W_o(\mathbb{R}^n)$. The meaning of this suggestive formula is as follows.
If $F$ is a nice function measurable with respect to $\mathcal{B}_s$ with $s < 1$ and $f$ a measurable function on $\mathbb{R}^n$, then
$$E_\mu\{F\, f(W_1)\} = E_\mu\{E_{\mu^{W_1}}(F)\, f(W_1)\},$$

where $W$ denotes the coordinate process on $W_o(\mathbb{R}^n)$. Using the Markov property at time $s$ we have
$$E_\mu\left[ F \int_{\mathbb{R}^n} p(1 - s, W_s, y) f(y)\, dy \right] = \int_{\mathbb{R}^n} E_{\mu^y}(F)\, p(1, o, y) f(y)\, dy,$$
where
$$p(t, y, x) = \left( \frac{1}{2\pi t} \right)^{n/2} e^{-|y - x|^2 / 2t}$$
is the transition density function of Brownian motion. This being true for all measurable $f$, we have for all $F \in \mathcal{B}_s$,
$$(8.1) \qquad E_{\mu^x} F = E_\mu\left[ \frac{p(1 - s, W_s, x)}{p(1, o, x)}\, F \right].$$
Therefore $\mu^x$ is absolutely continuous with respect to $\mu$ on $\mathcal{F}_s$ for any $s < 1$ and the Radon-Nikodym density is given by
$$\left. \frac{d\mu^x}{d\mu} \right|_{\mathcal{F}_s}(w) = \frac{p(1 - s, w_s, x)}{p(1, o, x)} = e_s.$$
The process $\{e_s, s < 1\}$ is necessarily a positive (local) martingale under the probability $\mu$. It therefore must have the form of an exponential martingale, which can be found explicitly by computing the differential of $\log e_s$. The density function $p(t, y, x)$ satisfies the heat equation
$$\frac{\partial p}{\partial t} = \frac{1}{2} \Delta_y p$$
in $(t, y)$ for fixed $x$. This equation gives
$$\frac{\partial}{\partial t} \log p = \frac{1}{2} \Delta_y \log p + \frac{1}{2} |\nabla_y \log p|^2.$$
Using this fact and Itô's formula we find easily that
$$d \log e_s = \langle \nabla_y \log p(1 - s, w_s, x),\, dw_s \rangle - \frac{1}{2} |\nabla_y \log p(1 - s, w_s, x)|^2\, ds.$$
Hence $e_s$ is an exponential martingale of the form
$$\left. \frac{d\mu^x}{d\mu} \right|_{\mathcal{B}_s} = \exp\left\{ \int_0^s \langle V_u, dW_u \rangle - \frac{1}{2} \int_0^s |V_u|^2\, du \right\},$$
where
$$V_s = \nabla_y \log p(1 - s, w_s, x).$$
By Girsanov's theorem, under the probability $\mu^x$, the process
$$B_s = W_s - \int_0^s \nabla_y \log p(1 - \tau, W_\tau, x)\, d\tau, \qquad s < 1,$$
is a Brownian motion. The explicit formula for $p(t, y, x)$ gives
$$\nabla_y \log p(1 - \tau, y, x) = -\frac{y - x}{1 - \tau}.$$
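The resulting drift $(x - W_s)/(1 - s)$ pulls the path toward $x$ as $s \to 1$, which can be seen by integrating the SDE $dW_s = dB_s - (W_s - x)/(1 - s)\,ds$ numerically. A minimal Euler-scheme sketch (the endpoint $x$, step count, and seed are arbitrary choices):

```python
import numpy as np

# Euler scheme for the Brownian bridge SDE dW_s = dB_s - (W_s - x)/(1 - s) ds.
# The drift forces W_1 to end near x.
rng = np.random.default_rng(42)
x, n = 1.5, 100_000
ds = 1.0 / n
W = 0.0
for i in range(n):
    s = i * ds                      # current time; 1 - s >= ds, no division by zero
    dB = rng.normal(0.0, np.sqrt(ds))
    W += dB - (W - x) / (1.0 - s) * ds
print(W)  # close to x = 1.5
```

At the last step the drift coefficient $ds/(1-s)$ equals one, so the scheme lands within a single noise increment of $x$.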

Under the probability $\mu^x$ the coordinate process $W$ is a Brownian bridge from $o$ to $x$ in time $1$. Therefore we have shown that the Brownian bridge is the solution of the following stochastic differential equation:
$$dW_s = dB_s - \frac{W_s - x}{1 - s}\, ds.$$
This simple equation can be solved explicitly. From the equation we have
$$d(W_s - x) + \frac{W_s - x}{1 - s}\, ds = dB_s.$$
The left side, after dividing by $1 - s$, is the differential of $(W_s - x)/(1 - s)$, hence
$$W_s = sx + (1 - s) \int_0^s \frac{dB_u}{1 - u}.$$
This formula shows that, like Brownian motion itself, the Brownian bridge is a Gaussian process.

The term Brownian bridge is often reserved specifically for the Brownian bridge which returns to its starting point at the terminal time. In this case we have
$$dW_s = dB_s - \frac{W_s}{1 - s}\, ds \qquad \text{and} \qquad W_s = (1 - s) \int_0^s \frac{dB_u}{1 - u}.$$
For dimension $n = 1$ it is easy to verify that the covariance function is given by
$$E\{W_s W_t\} = \min\{s, t\} - st.$$
Using this fact we obtain another representation of the Brownian bridge.

THEOREM 8.1. If $\{B_s\}$ is a Brownian motion, then the process
$$W_s = B_s - s B_1$$
is a Brownian bridge.

PROOF. ($n = 1$) Verify directly that $W$ defined above has the correct covariance function.

The following heuristic discussion of $W_s = B_s - s B_1$ is perhaps more instructive. Let $\mathcal{F}$ be the filtration of the Brownian motion $B$. We enlarge the filtration to $\mathcal{G}_s = \sigma\{\mathcal{F}_s, B_1\}$. We compute the Doob-Meyer decomposition of $W$ with respect to $\mathcal{G}$. Of course the martingale part is a Brownian motion because its quadratic variation process is the same as that of $W$. Denote this Brownian motion by $\beta$. Doob's explicit decomposition formula for a semimartingale suggests that
$$W_s = \beta_s + \int_0^s E\{dW_\tau \mid \mathcal{G}_\tau\}.$$

We have $dW_s = dB_s - B_1\, ds$. The differential $dW_s = W_{s+ds} - W_s$ is a forward differential. We need to project $dW_s$ onto the $L^2$-space generated by
$$\mathcal{G}_s = \sigma\{B_u,\, u \le s;\ B_1\} = \sigma\{B_u,\, u \le s;\ B_1 - B_s\}.$$
Note that $\sigma\{B_u, u \le s\}$ is orthogonal to $B_1 - B_s$. The differential $dB_s$ is orthogonal to the first part, and its projection onto the second part is
$$\frac{E[dB_s (B_1 - B_s)]}{E|B_1 - B_s|^2}\,(B_1 - B_s) = \frac{B_1 - B_s}{1 - s}\, ds.$$
The differential $B_1\, ds$ is of course in the target space already, hence
$$E\{dW_s \mid \mathcal{G}_s\} = \frac{B_1 - B_s}{1 - s}\, ds - B_1\, ds = -\frac{B_s - s B_1}{1 - s}\, ds = -\frac{W_s}{1 - s}\, ds.$$
It follows that
$$W_s = \beta_s - \int_0^s \frac{W_\tau}{1 - \tau}\, d\tau,$$
which is exactly the stochastic differential equation for a Brownian bridge.

9. Fourth assignment

EXERCISE 4.1. Let $\phi$ be a strictly convex function. If both $N$ and $\phi(N)$ are continuous local martingales, then $N$ is trivial, i.e., there is a constant $C$ such that $N_t = C$ with probability $1$ for all $t$.

EXERCISE 4.2. Let $M$ be a continuous local martingale. Show that there is a sequence of partitions $\Delta_1 \subseteq \Delta_2 \subseteq \Delta_3 \subseteq \cdots$ such that $|\Delta_n| \to 0$ and with probability $1$ the following holds: for all $t \ge 0$,
$$\lim_{n \to \infty} \sum_{i} \left( M_{t_i^n \wedge t} - M_{t_{i-1}^n \wedge t} \right)^2 = \langle M, M \rangle_t.$$

EXERCISE 4.3. Let $B$ be the standard Brownian motion. Then the reflecting Brownian motion $X_t = |B_t|$ is a Markov process. This means
$$P\left\{ X_{t+s} \in C \mid \mathcal{F}_s^X \right\} = P\{X_{t+s} \in C \mid X_s\}.$$
What is its transition density function
$$q(t, x, y) = \frac{P\{X_{t+s} \in dy \mid X_s = x\}}{dy}?$$

EXERCISE 4.4. Let $L_t$ be the local time of Brownian motion at $x = 0$. Show that
$$E L_t = \sqrt{\frac{8t}{\pi}}.$$

EXERCISE 4.5. Show that the Brownian bridge from $o$ to $x$ in time $1$ is a Markov process whose transition density is
$$q(s_1, y; s_2, z) = \frac{p(s_2 - s_1, y, z)\, p(1 - s_2, z, x)}{p(1 - s_1, y, x)}.$$

EXERCISE 4.6. The finite-dimensional marginal density for the Brownian bridge from $o$ to $x$ in time $1$ is
$$p(1, o, x)^{-1} \prod_{i=0}^{l} p(s_{i+1} - s_i, x_i, x_{i+1}).$$
Convention: $x_0 = o$, $s_0 = 0$, $s_{l+1} = 1$, $x_{l+1} = x$.

EXERCISE 4.7. Let $\{W_s,\, 0 \le s \le 1\}$ be a Brownian bridge at $o$. Then the reversed process $\{W_{1-s},\, 0 \le s \le 1\}$ is also a Brownian bridge.

EXERCISE 4.8. For an $a > 0$ define the stopping time
$$\sigma_a = \inf\{t : B_t - t = a\}.$$
Show that for $\mu \ge 0$,
$$E e^{-\mu \sigma_a} = e^{-(\sqrt{1 + 2\mu} + 1)a}.$$

EXERCISE 4.9. Use the result of the previous exercise to show that $E e^{\mu \sigma_a}$ is infinite if $\mu > 1/2$.

EXERCISE 4.10. By the martingale representation theorem there is a process $H$ such that
$$W_1^3 = \int_0^1 H_s\, dW_s.$$
Find an explicit expression for $H$.
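Exercise 4.4 can be sanity-checked by Monte Carlo: by (7.2), $L_1$ has the law of $-2\min_{s\le 1} W_s$ for a Brownian motion $W$, so the sample mean of that quantity should be close to $\sqrt{8/\pi} \approx 1.596$. A sketch (path count, step count, and seed are arbitrary choices):

```python
import numpy as np

# Monte Carlo check of E L_1 = sqrt(8/pi): by (7.2), L_1 has the law of
# -2 * min_{s <= 1} W_s for a Brownian motion W started at 0.
rng = np.random.default_rng(7)
paths, n = 5_000, 1_000
increments = rng.normal(0.0, np.sqrt(1.0 / n), size=(paths, n))
W = np.cumsum(increments, axis=1)
L = -2.0 * np.minimum(W.min(axis=1), 0.0)  # min over the path, including W_0 = 0
print(L.mean(), np.sqrt(8.0 / np.pi))
```

The discrete-time minimum slightly undershoots the continuous one, so the sample mean sits a little below $\sqrt{8/\pi}$; the bias shrinks as the number of time steps grows.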