Lecture 21: Representations of Martingales


Course: Theory of Probability II. Term: Spring 2015. Instructor: Gordan Zitkovic. Last Updated: April 3, 2015.

Right-continuous inverses

Let A denote the set of all càdlàg, non-decreasing functions f defined on [0, ∞) and taking values in [0, ∞], with f(0) = 0. For f ∈ A, we also set f(∞) = lim_{t→∞} f(t). Even though it does not need to be invertible, a function f in A admits a right-continuous inverse, namely the function g : [0, ∞) → [0, ∞], denoted by g = f⁻¹ and given by

  g(s) = inf{t ≥ 0 : f(t) > s}.

[Figure: a portion of the graph of a typical function in A (black), together with its right-continuous inverse (blue); the two are right-continuous inverses of each other.] Note how jumps correspond to flat stretches and vice versa, and how the non-injectivity is resolved to yield right continuity. As always, h(t−) = lim_{t′↑t} h(t′), for t > 0, and h(0−) = 0, for any càdlàg function h on [0, ∞).

Here is a precise statement of some of the properties of right-continuous inverses. It is best understood by looking at the picture above, so no proof is given (to practice real analysis, supply the proof yourself).

Proposition 21.1. For f ∈ A and its right-continuous inverse g = f⁻¹, we have:

1. g ∈ A;
2. f is the right-continuous inverse of g;
3. g(f(t)) = sup Flat_f(t), for each t ∈ [0, ∞), where Flat_f(t) = {t′ : f(t′) = f(t)};
4. similarly, f(g(s)) = sup Vert_f(s), for each s ∈ f([0, ∞)), where Vert_f(s) is the set of all values in the (unique) interval of the form [f(t−), f(t)] which contains s. In particular, if f is continuous, then f(g(s)) = s, for all s.
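Proposition 21.1 is easy to experiment with numerically. The sketch below is a toy illustration only (the function `rc_inverse` and the grid-scanning approach are ours, not from the notes): it evaluates g(s) = inf{t ≥ 0 : f(t) > s} for a simple step-and-slope example of f, and shows how a jump of f becomes a flat stretch of g and vice versa.

```python
import math

def rc_inverse(f, s, t_max=10.0, step=1e-4):
    """g(s) = inf{t >= 0 : f(t) > s}, located by scanning a fine grid."""
    for i in range(int(t_max / step) + 1):
        t = i * step
        if f(t) > s:
            return t
    return math.inf   # f never exceeds s on the scanned range

def f(t):
    """A typical element of A: linear on [0,1), a jump from 1 to 2 at t = 1,
    flat on [1,2), linear again afterwards (cadlag, non-decreasing, f(0) = 0)."""
    if t < 1:
        return t
    if t < 2:
        return 2.0
    return t

print(rc_inverse(f, 0.5))   # ~0.5: f is strictly increasing there
print(rc_inverse(f, 1.5))   # ~1.0: s = 1.5 lies inside the jump of f at t = 1
print(rc_inverse(f, 2.0))   # ~2.0: the flat stretch [1,2) of f collapses to a jump of g
```

Parts 1 and 2 of the proposition can be explored the same way, by tabulating g on a grid and feeding it back through `rc_inverse`.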

Even though we will have no further use for it in these notes, the result of the following problem (a change-of-variable formula) comes in handy from time to time. It is a mild generalization of the formula we use to compute expectations of functions of random variables (how?).

Problem 21.1. Show that, for f ∈ A and a non-negative Borel function ϕ on [0, ∞), we have

  ∫_0^∞ ϕ(t) df(t) = ∫_0^∞ ϕ(g(s)) 1_{g(s)<∞} ds,

where g = f⁻¹ is the right-continuous inverse of f. Deduce that, for 0 ≤ a < b < ∞ and a non-decreasing and continuous function u : [a, b] → [0, ∞), we have

  ∫_a^b ϕ(u(s)) df(u(s)) = ∫_{u(a)}^{u(b)} ϕ(t) df(t).

Time changes

Definition 21.2. A (not-necessarily adapted) stochastic process {τ_s}_{s∈[0,∞)} with trajectories in A is called a change of time, or time change, if the random variable τ_s is a stopping time for each s ≥ 0.

Given a filtration {F_t}_{t∈[0,∞)} and a progressively measurable process {X_t}_{t∈[0,∞)}, the composition X_{τ_s} defines a random variable for each s ≥ 0; the stochastic process {X_{τ_s}}_{s∈[0,∞)} is called the time change of {X_t}_{t∈[0,∞)} by {τ_s}_{s∈[0,∞)}. Similarly, we may define the time-changed filtration {G_s}_{s∈[0,∞)} by G_s = F_{τ_s}. By Problem 16.4, we have the following:

- {G_s}_{s∈[0,∞)} is right-continuous, since {F_t}_{t∈[0,∞)} is (we always assume that, remember); the same holds for completeness;
- the time-changed process {X_{τ_s}}_{s∈[0,∞)} is {G_s}_{s∈[0,∞)}-adapted;
- if {X_t}_{t∈[0,∞)} happens to be càdlàg, then so is {X_{τ_s}}_{s∈[0,∞)}. Is the same true for continuous (or càglàd) processes?

Even though we are asking only for τ_s to be a stopping time for deterministic s, this property extends to the class of all {G_s}_{s∈[0,∞)}-stopping times:

Proposition 21.3. If σ is a {G_s}_{s∈[0,∞)}-stopping time, then the random variable τ_σ is an {F_t}_{t∈[0,∞)}-stopping time.

Proof. It is enough to deal with countable-valued stopping times. Indeed, a general stopping time σ can be approximated from the right

by a non-increasing sequence {σ_n}_{n∈N} of countable-valued stopping times such that σ_n ↓ σ, so that, by right continuity of τ, we have τ_{σ_n} ↓ τ_σ, a.s. The right continuity of the filtration {F_t}_{t∈[0,∞)} takes care of the rest. We turn to a countable-valued stopping time σ of the form σ = Σ_k s_k 1_{A_k}, where A_k ∈ G_{s_k} = F_{τ_{s_k}}, for k ∈ N, and A_1, A_2, ... form a partition of Ω. For t ≥ 0, we have

  {τ_σ ≤ t} = ∪_{k∈N} ({τ_{s_k} ≤ t} ∩ A_k).

Since A_k ∈ F_{τ_{s_k}}, we have A_k ∩ {τ_{s_k} ≤ t} ∈ F_t, for each k ∈ N, and so {τ_σ ≤ t} ∈ F_t.

Definition 21.4. Let {τ_s}_{s∈[0,∞)} be a time change. A process {X_t}_{t∈[0,∞)} is said to be τ-continuous if it is continuous and X is constant on [τ_{s−}, τ_s], for all s ≥ 0, a.s.

It is clear that {X_{τ_s}}_{s∈[0,∞)} is a continuous process if {τ_s}_{s∈[0,∞)} is a time change and {X_t}_{t∈[0,∞)} is a τ-continuous process. A deeper property of τ-continuity is that it preserves the martingale property. Before we state the precise result, let us show what can go wrong:

Example 21.5. Let {B_t}_{t∈[0,∞)} be a Brownian motion, and let {F_t}_{t∈[0,∞)} be the right-continuous augmentation of its natural filtration. We define the family {τ_s}_{s∈[0,∞)} of stopping times by

  τ_s = inf{t ≥ 0 : B_t > s}, s ≥ 0.

To show that {τ_s}_{s∈[0,∞)} is a time change, we simply express its paths as right-continuous inverses of the paths of the continuous and non-decreasing process {M_t}_{t∈[0,∞)}, where M_t = sup_{s≤t} B_s. By the continuity of M and part 4 of Proposition 21.1, we have B_{τ_s} = s, for all s ≥ 0, a.s. Therefore, we have managed to time-change the martingale B into a deterministic, finite-variation process. Note that the time change τ is by no means continuous, as all flat stretches of M correspond to jumps in τ. In fact, it can be shown that, in some sense, τ grows by jumps only.

Proposition 21.6. Let {τ_s}_{s∈[0,∞)} be a time change, and let M ∈ M^{loc,c} be a τ-continuous local martingale.
Then the time-changed process {M_{τ_s}}_{s∈[0,∞)} is a continuous local martingale for the time-changed filtration {G_s}_{s∈[0,∞)} = {F_{τ_s}}_{s∈[0,∞)}.

Proof. By τ-continuity, the process {N_s}_{s∈[0,∞)}, given by N_s = M_{τ_s}, is a continuous process adapted to the filtration {G_s}_{s∈[0,∞)}. Given an {F_t}_{t∈[0,∞)}-stopping time T such that M^T is bounded, we set

  S = inf{s ≥ 0 : τ_s ≥ T},

and show that N^S is a {G_s}_{s∈[0,∞)}-martingale. By τ-continuity of M, we have N_{s∧S} = M_{τ_{s∧S}}, and, since τ_{(s∧S)−} ≤ T and M is constant on [τ_{(s∧S)−}, τ_{s∧S}], we conclude that N^S is bounded. Next, we pick a {G_s}_{s∈[0,∞)}-stopping time σ and note that the random variable τ_{σ∧S} is, according to Proposition 21.3, an {F_t}_{t∈[0,∞)}-stopping time. Also, even though τ_{σ∧S} is not necessarily bounded from above by T, the stopped martingale M^{τ_{σ∧S}} remains bounded by the same constant as M^T, so we can use the optional sampling theorem to get

  E[N^S_σ] = E[N_{σ∧S}] = E[M_{τ_{σ∧S}}] = E[M_{τ_0}] = E[N_0] = E[N^S_0],

which implies that N^S is a martingale. We can repeat the procedure for a reducing sequence {T_n}_{n∈N} with M^{T_n} bounded, to conclude that N is, indeed, a local martingale.

We have seen that the class of local martingales is closed under time changes only if additional conditions are fulfilled. The situation is quite different with semimartingales. We state two important results whose proofs are outside the scope of these notes. Both results deal with càdlàg semimartingales, i.e., with processes which can be decomposed into the sum of a càdlàg local martingale and a càdlàg, adapted process of finite variation.

Theorem 21.7. Let {X_t}_{t∈[0,∞)} be a càdlàg semimartingale, and {τ_s}_{s∈[0,∞)} a time change. Then the time-changed process {X_{τ_s}}_{s∈[0,∞)} is a càdlàg semimartingale (not necessarily continuous, even if X is).

Theorem 21.8 (Monroe). Let {X_s}_{s∈[0,∞)} be a càdlàg semimartingale. Then there exist a filtered probability space (Ω′, F′, {F′_t}_{t≥0}, P′) which supports an {F′_t}_{t∈[0,∞)}-Brownian motion {B′_t}_{t∈[0,∞)} and a time change {τ_s}_{s∈[0,∞)} such that {X_s}_{s∈[0,∞)} and {B′_{τ_s}}_{s∈[0,∞)} have the same (finite-dimensional) distributions.

A theorem of Dambis, Dubins and Schwarz

As we have seen in Example 21.5, a right-continuous inverse (computed ω-wise) of an adapted, càdlàg and non-decreasing process is clearly a time change. The most important example of such a time change for our purposes is obtained by taking the right-continuous inverse of the quadratic variation process of a continuous local martingale {M_t}_{t∈[0,∞)}:
The most important example of such a timechange for our purposes is obtained by taking a right-continuous inverse of the quadratic variation process of a continuous local martingale {M t } t [, ) : τ s = inf{t : M t > s}. Last Updated: April 3, 215

We note that the process τ will not take the value ∞ if the local martingale {M_t}_{t∈[0,∞)} is divergent, i.e., if ⟨M⟩_∞ = ∞, a.s.

Lemma 21.9. M is τ-continuous.

Proof. By stopping, we may assume that both M and ⟨M⟩ are bounded. We pick a rational number r ≥ 0 and define the process N_t = M_{r+t} − M_r, t ≥ 0, which is clearly a martingale with ⟨N⟩_t = ⟨M⟩_{r+t} − ⟨M⟩_r. The random variable T_r = inf{t ≥ 0 : ⟨N⟩_t > 0} is a stopping time, and, therefore, the stopped martingale N^{T_r} is in M^{2,c} with ⟨N^{T_r}⟩_t = 0, for all t ≥ 0. Therefore, N^{T_r}_t = 0, for all t ≥ 0, and so M is constant on [r, r + T_r]. It is not hard to see that any interval of constancy of ⟨M⟩ is the closure of a countable union of intervals of the form [r, r + T_r], and our claim follows.

Theorem 21.10 (Dambis, Dubins and Schwarz). For a divergent M ∈ M^{loc,c}, we define τ_s = inf{t ≥ 0 : ⟨M⟩_t > s} and G_s = F_{τ_s}, for s ≥ 0. Then the time-changed process {B_s}_{s∈[0,∞)}, given by B_s = M_{τ_s}, s ≥ 0, is a G-Brownian motion, and the local martingale M is a time change of B, i.e.,

  M_t = B_{⟨M⟩_t}, for t ≥ 0.

Proof. By Lemma 21.9 and Proposition 21.6, B is a continuous local martingale. To compute its quadratic variation, we start from the fact that M_t² − ⟨M⟩_t is a τ-continuous local martingale, and, therefore, its time change M_{τ_s}² − ⟨M⟩_{τ_s} is a continuous local martingale. Since ⟨M⟩_{τ_s} = s, the process B_s² − s is a {G_s}_{s∈[0,∞)}-local martingale, and so ⟨B⟩_s = s. Lévy's characterization implies that {B_s}_{s∈[0,∞)} is a Brownian motion.

The fact that M_t = B_{⟨M⟩_t} follows quite directly from part 3 of Proposition 21.1. Indeed, B_{⟨M⟩_t} = M_{τ_{⟨M⟩_t}}, and τ_{⟨M⟩_t} is the right edge of Flat_{⟨M⟩}(t). Since M is τ-continuous, we have M_t = M_{τ_{⟨M⟩_t}} = B_{⟨M⟩_t}.

Remark 21.11. The restriction that M be divergent is here mostly for convenience. It can be shown (try to prove it) that, even without it, M can still be written as M_t = B_{⟨M⟩_t}, t ≥ 0, where B is a Brownian motion. This time, however, B may have to be defined on an extension of the original probability space.
The reason is that we do not have enough of M to construct the whole path of B from it.
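Theorem 21.10 can also be illustrated numerically. The sketch below is our own toy example, not part of the notes: for M_t = 2B_t we have ⟨M⟩_t = 4t, so τ_s = s/4, and the time-changed process B̃_s = M_{s/4} should behave like a standard Brownian motion. We check one Lévy-characterization fingerprint: its pathwise quadratic variation over s ∈ [0, 4] should come out close to 4.

```python
import random

random.seed(1)
dt, n = 1e-4, 10_000            # time grid for t in [0, 1]

B = [0.0]                       # simulated Brownian path
for _ in range(n):
    B.append(B[-1] + random.gauss(0.0, dt ** 0.5))

M = [2.0 * x for x in B]        # M_t = 2 B_t, so <M>_t = 4t and tau_s = s / 4

# On the grid, s = 4 k dt corresponds to index k, so the time-changed
# process Btilde_s = M_{tau_s} is just M re-read on an s-grid of mesh 4 dt.
Btilde = M

qv = sum((Btilde[k + 1] - Btilde[k]) ** 2 for k in range(n))
print(qv)                       # pathwise quadratic variation over s in [0, 4]; ~4
```

Of course, matching quadratic variation on one path is far from a proof; the theorem asserts the full distributional statement.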

Martingales as stochastic integrals

Let {M_t}_{t∈[0,∞)} be a continuous local martingale. A continuous local martingale {Z_t}_{t∈[0,∞)} is called the stochastic (Doléans-Dade) exponential of M, denoted by Z = E(M), if

(21.1)  Z_t = 1 + ∫_0^t Z_u dM_u, for all t ≥ 0, a.s.

Itô's formula readily implies that the prescription

  E(M)_t = exp(M_t − (1/2)⟨M⟩_t), t ≥ 0,

defines a stochastic exponential of M. It can be shown that the process E(M) is the only such process, i.e., that the stochastic-integral equation (21.1) has a unique solution (in a specific sense).

Exponential martingales, together with the following simple lemma (whose easy proof we leave to the reader) and the proposition that follows it, play a central role in the proof of the celebrated martingale-representation theorem below. We remind the reader that a subset E of L²(F) is said to be total if its linear hull

  { Σ_{k=1}^n α_k X_k : n ∈ N, α_1, ..., α_n ∈ R, X_1, ..., X_n ∈ E }

is dense in L²(F). Also, for E ⊆ L²(F), its orthogonal complement E^⊥ is given by

  E^⊥ = { X ∈ L²(F) : E[XY] = 0 for all Y ∈ E }.

Lemma 21.12. Let (Ω, F, P) be a probability space, and let E be a subset of L²(F) such that E^⊥ = {0}. Then E is total in L²(F).

Proposition 21.13. Let {B_t}_{t∈[0,∞)} be a Brownian motion, and let {F_t}_{t∈[0,∞)} be the usual augmentation of its natural filtration. With I denoting the set of all (deterministic!) functions f : [0, ∞) → R of the form

(21.2)  f(t) = Σ_{k=1}^n λ_k 1_{(t_{k−1}, t_k]}(t),

for some n ∈ N, λ_1, ..., λ_n ∈ R and 0 = t_0 < t_1 < ... < t_n < ∞, the set

  E = { E(∫_0^· f(u) dB_u)_∞ : f ∈ I }

is total in L²(F_∞).

Proof. Suppose that Y ∈ L²(F_∞) is such that E[YX] = 0, for all X ∈ E. Then, given a finite partition 0 = t_0 < t_1 < ... < t_n < ∞ and f as

in (21.2), we have

  0 = E[ E(∫_0^· f(u) dB_u)_∞ Y ] = E[ exp( Σ_{k=1}^n λ_k (B_{t_k} − B_{t_{k−1}}) − (1/2) Σ_{k=1}^n λ_k² (t_k − t_{k−1}) ) Y ].

By rearrangement and conditioning, it follows that

  E[ exp(Σ_{k=1}^n α_k B_{t_k}) Z⁺ ] = E[ exp(Σ_{k=1}^n α_k B_{t_k}) Z⁻ ],

for all α_1, ..., α_n ∈ R, where Z = E[Y | σ(B_{t_1}, ..., B_{t_n})]. Given that Z is σ(B_{t_1}, ..., B_{t_n})-measurable, there exists a Borel function ζ such that Z = ζ(B_{t_1}, ..., B_{t_n}). Consequently,

  ∫_{R^n} e^{Σ_k α_k x_k} ζ⁺(x_1, ..., x_n) ϕ(x_1, ..., x_n) dx_1 ... dx_n = ∫_{R^n} e^{Σ_k α_k x_k} ζ⁻(x_1, ..., x_n) ϕ(x_1, ..., x_n) dx_1 ... dx_n,

for all α_1, ..., α_n, where ϕ is the density of (B_{t_1}, ..., B_{t_n}). Both integrals above are Laplace transforms, one of ζ⁺ϕ and the other of ζ⁻ϕ, and, since they agree, so must the transformed functions, i.e., ζ⁺ = ζ⁻, a.e. with respect to the Lebesgue measure on R^n. It follows that E[Y | σ(B_{t_1}, ..., B_{t_n})] = 0, i.e., that E[Y⁺ 1_A] = E[Y⁻ 1_A], for each A ∈ σ(B_{t_1}, ..., B_{t_n}). Since the union of all such finite-dimensional σ-algebras is a π-system which generates F_∞, we conclude that Y = 0, a.s., and Lemma 21.12 applies.

Proposition 21.14. Let {B_t}_{t∈[0,∞)} be a Brownian motion, let {F_t}_{t∈[0,∞)} be the usual augmentation of its natural filtration, and let F_∞ = σ(F_t ; t ≥ 0). For any random variable X ∈ L²(F_∞) there exists a λ ⊗ P-a.e. unique predictable process {H_t}_{t∈[0,∞)} such that E[∫_0^∞ H_u² du] < ∞ and

(21.3)  X = E[X] + ∫_0^∞ H_u dB_u, a.s.

Proof. Let us deal with uniqueness first. Suppose that (21.3) holds for two predictable processes H and K. Then

  0 = E[ ( ∫_0^∞ H_u dB_u − ∫_0^∞ K_u dB_u )² ] = E[ ∫_0^∞ (H_u − K_u)² du ],

and it follows that H = K, λ ⊗ P-a.e.

Next, we deal with existence, and start by defining the subset H of L²(F_∞) by

  H = { ∫_0^∞ H_u dB_u : H predictable and E[∫_0^∞ H_u² du] < ∞ }.

Let {Y_n}_{n∈N} be a sequence in H, with Y_n = ∫_0^∞ H^n_u dB_u, which converges to some Y ∈ L²(F_∞). Being convergent, {Y_n}_{n∈N} is Cauchy and, thanks to Itô's isometry, the sequence {H^n}_{n∈N} is Cauchy in L²([0, ∞) × Ω) = L²([0, ∞) × Ω, P, λ ⊗ P), where P here denotes the predictable σ-algebra. By completeness of L²([0, ∞) × Ω), we have H^n → H, for some H ∈ L²([0, ∞) × Ω). Consequently,

  ∫_0^∞ H_u dB_u = lim_n ∫_0^∞ H^n_u dB_u = lim_n Y_n = Y, a.s.

Therefore, H is a closed subspace of L²(F_∞). Moreover, the representation (21.1) of the random variables E(∫_0^· f(u) dB_u)_∞, f ∈ I (defined in the statement of Proposition 21.13), tells us that

  E(∫_0^· f(u) dB_u)_∞ − 1 ∈ H, for all f ∈ I.

It remains to use Proposition 21.13 to conclude that the linear span

  Span(H, 1) = { x + Y : x ∈ R, Y ∈ H },

generated by H and the constant 1, is dense in L²(F_∞), because it contains a total set. On the other hand, by the first part of the proof, Span(H, 1) is closed in L²(F_∞), and, so, Span(H, 1) = L²(F_∞). It follows that each X ∈ L²(F_∞) can be written as X = x + ∫_0^∞ H_u dB_u, for some predictable H with E[∫_0^∞ H_u² du] < ∞ and some constant x ∈ R. Taking expectations of both sides yields x = E[X].

Even though we have almost no use for non-continuous (RCLL) martingales in these notes, they sometimes show up naturally, even in the continuous-path framework. They are defined just like continuous martingales; the only difference is that their trajectories are assumed to be RCLL. A deeper difference comes with the possible choices of the reducing sequence for local martingales. In the continuous-path case this sequence can always be constructed so that the stopped processes are bounded. That is no longer the case in the RCLL setting, where only the uniform integrability required in the definition can be guaranteed.
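Returning for a moment to the stochastic exponential: the defining equation (21.1) lends itself to a quick numerical sanity check (a sketch under our own discretization choices, not part of the notes). For M = B we have ⟨M⟩_t = t, and an Euler scheme for dZ = Z dB with Z_0 = 1 should track the closed form E(B)_t = exp(B_t − t/2) along the same simulated path.

```python
import math
import random

random.seed(2)
dt, n = 1e-4, 10_000            # t in [0, 1]

B, Z_euler = 0.0, 1.0
for _ in range(n):
    dB = random.gauss(0.0, math.sqrt(dt))
    Z_euler *= 1.0 + dB         # Euler step for dZ = Z dB
    B += dB

Z_formula = math.exp(B - 0.5 * n * dt)   # exp(B_t - t/2) at t = 1
print(Z_euler, Z_formula)       # the two should nearly agree
```

The small residual discrepancy is the usual strong-order discretization error; refining dt shrinks it.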

Corollary 21.15. Let {F_t}_{t∈[0,∞)} be the usual augmentation of the Brownian filtration, and let M be an L²-bounded, càdlàg {F_t}_{t∈[0,∞)}-martingale. Then there exists a predictable process H with E[∫_0^∞ H_u² du] < ∞ such that

  M_t = M_0 + ∫_0^t H_u dB_u, for all t ≥ 0, a.s.

Proof. By Proposition 21.14, we can express the last element M_∞ of M as

  M_∞ = M_0 + ∫_0^∞ H_u dB_u,

for some predictable H such that E[∫_0^∞ H_u² du] < ∞. Let the continuous martingale N be defined by

  N_t = M_0 + ∫_0^t H_u dB_u,

so that N_∞ = M_∞. It follows that M_t = E[M_∞ | F_t] = E[N_∞ | F_t] = N_t, a.s., for all t ≥ 0, and right continuity implies that M and N are indistinguishable.

Proposition 21.16. Let {F_t}_{t∈[0,∞)} be the usual augmentation of the Brownian filtration, and let {M_t}_{t∈[0,∞)} be a càdlàg {F_t}_{t∈[0,∞)}-local martingale. Then there exists a predictable process H in L(B) such that

  M_t = M_0 + ∫_0^t H_u dB_u, for all t ≥ 0, a.s.

In particular, each local martingale in the augmented Brownian filtration is continuous.

Proof. By stopping, we can reduce the statement to the case of a uniformly integrable martingale (note that, as explained above, we cannot necessarily assume that M is a bounded martingale). Being UI, M admits a last element M_∞, which we can approximate in L². More precisely, given ε > 0, there exists M^ε_∞ ∈ L²(F_∞) such that ‖M^ε_∞ − M_∞‖_{L¹} < ε. Using M^ε_∞ as the last element, we define the square-integrable martingale M^ε by M^ε_t = E[M^ε_∞ | F_t], for t ≥ 0, and take a càdlàg modification (remember, we can do that for any martingale on a filtration satisfying the usual conditions). By the maximal inequality for martingales, we have

(21.4)  P[ sup_t |M_t − M^ε_t| ≥ δ ] ≤ (1/δ) E[ |M_∞ − M^ε_∞| ] ≤ ε/δ.

Corollary 21.15 implies that M^ε is continuous, and we can interpret equation (21.4) in the following way: with probability at least 1 − ε/δ, the trajectory of M lies in a δ-neighborhood of a continuous function. That means, in particular, that

(21.5)  P[ sup_t |ΔM_t| ≥ 2δ ] ≤ ε/δ,

where the jump process ΔM is defined by ΔM_t = M_t − M_{t−}. The left-hand side of (21.5) does not depend on ε, so we conclude that sup_t |ΔM_t| < 2δ, a.s., for each δ > 0, making M continuous. Now that we know that M is continuous, we can reduce it by stopping to a square-integrable martingale and finish the proof.

An explicit example

Martingale representations of random variables (and martingales) are seldom explicit. Here is an example of a nontrivial one where everything can be worked out in closed form.

Example 21.17. Let B be a Brownian motion and {F_t}_{t∈[0,∞)} its (augmented) natural filtration. With S_t = sup_{u≤t} B_u denoting the running-maximum process, the random variable S_1 is in L²(F_1), so Proposition 21.14 guarantees that there exists a predictable process {H_t}_{t∈[0,∞)} in L²(B) such that

  S_1 = E[S_1] + ∫_0^1 H_u dB_u, a.s.

(note that, necessarily, H_t = 0 for t > 1). Proposition 21.14 itself is silent about the exact form of H. We start by defining a martingale {M_t}_{t∈[0,1]}, in its continuous modification, by M_t = E[S_1 | F_t], a.s., so that

  M_t = E[S_1] + ∫_0^t H_u dB_u, for all t ∈ [0, 1], a.s.

On the other hand, we have

  E[S_1 | F_t] = E[ max( S_t, B_t + sup_{u∈[t,1]} (B_u − B_t) ) | F_t ], a.s.

The strong Markov property implies that sup_{u∈[t,1]} (B_u − B_t) is independent of F_t and distributed as S_{1−t}, so we have

  E[S_1 | F_t] = F(t, S_t, B_t), a.s., where, for s ≥ b, we define F(t, s, b) = E[ max(s, b + S_{1−t}) ].

By the reflection principle, S_{1−t} is distributed as √(1−t)|Z|, with Z a standard normal, so

  F(t, s, b) = ∫_0^∞ max(s, b + x√(1−t)) 2ϕ(x) dx,

with ϕ denoting the density of the standard normal. The function F is twice differentiable in b for t < 1 (this can be checked directly), and the process F(t, S_t, B_t) is a martingale on [0, 1). Therefore, Itô's formula implies that (why?)

(21.6)  E[S_1 | F_t] = E[S_1] + ∫_0^t ∂_b F(u, S_u, B_u) dB_u, for t < 1.

Since all local martingales in the Brownian filtration are continuous, we can let t → 1 and conclude that the equality in (21.6) holds, a.s., even for t = 1. It remains to compute the derivative of F; with Φ(x) = ∫_{−∞}^x ϕ(ξ) dξ, we have

  H_t = ∂_b F(t, S_t, B_t) = ∫_0^∞ 1_{{B_t + x√(1−t) ≥ S_t}} 2ϕ(x) dx = 2 ( 1 − Φ( (S_t − B_t)/√(1−t) ) ),

so that

  S_1 = √(2/π) + ∫_0^1 2 ( 1 − Φ( (S_u − B_u)/√(1−u) ) ) dB_u.

[Figures: a simulated path of a Brownian motion B (blue), its running maximum S (orange) and the martingale M (green); below it, the values of the process H corresponding to the same simulated path.]
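The closed form for H can be double-checked numerically. The sketch below uses our own quadrature and naming (none of it is from the notes): it computes F(t, s, b) by midpoint quadrature, differentiates in b by a central difference, and compares the result with 2(1 − Φ((s − b)/√(1 − t))); the same quadrature at t = 0, s = b = 0 recovers E[S_1] = √(2/π).

```python
import math

def phi(x):                     # standard normal density
    return math.exp(-0.5 * x * x) / math.sqrt(2.0 * math.pi)

def Phi(x):                     # standard normal cdf, via the error function
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def F(t, s, b, n=100_000, x_max=8.0):
    """F(t,s,b) = E[max(s, b + sqrt(1-t)|Z|)] by midpoint quadrature on [0, x_max]."""
    dx = x_max / n
    root = math.sqrt(1.0 - t)
    return sum(max(s, b + root * (k + 0.5) * dx) * 2.0 * phi((k + 0.5) * dx) * dx
               for k in range(n))

# E[S_1] = F(0, 0, 0) = sqrt(2/pi):
print(F(0.0, 0.0, 0.0), math.sqrt(2.0 / math.pi))

# d/db F versus the closed form for H at a sample point (t, s, b) with s >= b:
t, s, b, eps = 0.3, 1.2, 0.7, 1e-4
dFdb = (F(t, s, b + eps) - F(t, s, b - eps)) / (2.0 * eps)
H = 2.0 * (1.0 - Phi((s - b) / math.sqrt(1.0 - t)))
print(dFdb, H)                  # should agree to several decimals
```

The sample point (t, s, b) is arbitrary; any choice with 0 ≤ t < 1 and s ≥ b works the same way.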