Lecture 22: Girsanov's Theorem


Lecture 22: Girsanov's Theorem (1 of 8)
Course: Theory of Probability II
Term: Spring 2015
Instructor: Gordan Zitkovic
Last Updated: May 5, 2015

An example

Consider a finite Gaussian random walk
$$ X_n = \sum_{k=1}^n \xi_k, \quad n = 0, \dots, N, $$
where $\xi_1, \dots, \xi_N$ are independent $N(0,1)$ random variables. The random vector $(X_1, \dots, X_N)$ is then, itself, Gaussian, and admits the density
$$ f(x_1, \dots, x_N) = C_N e^{-\frac{1}{2}\left(x_1^2 + (x_2 - x_1)^2 + \dots + (x_N - x_{N-1})^2\right)} $$
with respect to the Lebesgue measure on $\mathbb{R}^N$, for some $C_N > 0$. Let us now repeat the whole construction, with the $n$-th step having the $N(\mu_n, 1)$-distribution, for some $\mu_1, \dots, \mu_N \in \mathbb{R}$. The resulting, Gaussian, distribution still admits a density with respect to the Lebesgue measure, and it is given by
$$ f_\mu(x_1, \dots, x_N) = C_N e^{-\frac{1}{2}\left((x_1 - \mu_1)^2 + (x_2 - x_1 - \mu_2)^2 + \dots + (x_N - x_{N-1} - \mu_N)^2\right)}. $$
The two densities are everywhere positive, so the two Gaussian measures are equivalent to each other, and the Radon-Nikodym derivative turns out to be
$$ \frac{dQ}{dP} = \frac{f_\mu(X_1, \dots, X_N)}{f(X_1, \dots, X_N)} = e^{\mu_1 X_1 + \mu_2 (X_2 - X_1) + \dots + \mu_N (X_N - X_{N-1}) - \frac{1}{2}\left(\mu_1^2 + \mu_2^2 + \dots + \mu_N^2\right)} = e^{\sum_{k=1}^N \mu_k (X_k - X_{k-1}) - \frac{1}{2} \sum_{k=1}^N \mu_k^2}. $$

Equivalent measure changes

Let $Q$ be a probability measure on $\mathcal{F}_\infty$, equivalent to $P$, i.e., for all $A \in \mathcal{F}_\infty$, $P[A] = 0$ if and only if $Q[A] = 0$. Its Radon-Nikodym derivative

$Z = \frac{dQ}{dP}$ is a non-negative random variable in $L^1$ with $E[Z] = 1$. The uniformly-integrable martingale
$$ Z_t = E[Z \mid \mathcal{F}_t], \quad t \geq 0, $$
is called the density of $Q$ with respect to $P$ (note that we can - and do - assume that $\{Z_t\}_{t \in [0,\infty)}$ is càdlàg). We will often use the shortcut $Q$-(local, semi-, etc.) martingale for a process which is a (local, semi-, etc.) martingale for $(\Omega, \mathcal{F}, \{\mathcal{F}_t\}_{t \in [0,\infty)}, Q)$.

Proposition 22.1. Let $\{X_t\}_{t \in [0,\infty)}$ be a càdlàg and adapted process. Then $X$ is a $Q$-local martingale if and only if the product $\{Z_t X_t\}_{t \in [0,\infty)}$ is a càdlàg $P$-local martingale.

Before we give a proof, here is a simple and useful lemma. Since the measures involved are equivalent, we are free to use the phrase "almost surely" without explicit mention of the probability.

Lemma 22.2. Let $(\Omega, \mathcal{H}, P)$ be a probability space, and let $\mathcal{G} \subseteq \mathcal{H}$ be a sub-$\sigma$-algebra of $\mathcal{H}$. Given a probability measure $Q$ on $\mathcal{H}$, equivalent to $P$, let $Z = \frac{dQ}{dP}$ be its Radon-Nikodym derivative with respect to $P$. For a random variable $X \in L^1(\mathcal{H}, Q)$ we have $XZ \in L^1(P)$ and
$$ E^Q[X \mid \mathcal{G}] = \frac{E[XZ \mid \mathcal{G}]}{E[Z \mid \mathcal{G}]}, \text{ a.s.,} $$
where $E^Q[\cdot \mid \mathcal{G}]$ denotes the conditional expectation on $(\Omega, \mathcal{H}, Q)$.

Proof. First of all, note that the Radon-Nikodym theorem implies that $XZ \in L^1(P)$ and that the set $\{E[Z \mid \mathcal{G}] = 0\}$ has $Q$-probability $0$ (and, therefore, $P$-probability $0$). Indeed,
$$ Q[E[Z \mid \mathcal{G}] = 0] = E^Q[\mathbf{1}_{\{E[Z \mid \mathcal{G}] = 0\}}] = E[Z \mathbf{1}_{\{E[Z \mid \mathcal{G}] = 0\}}] = E\big[E[Z \mathbf{1}_{\{E[Z \mid \mathcal{G}] = 0\}} \mid \mathcal{G}]\big] = E\big[\mathbf{1}_{\{E[Z \mid \mathcal{G}] = 0\}} E[Z \mid \mathcal{G}]\big] = 0. $$
Therefore, the expression on the right-hand side is well-defined almost surely, and is clearly $\mathcal{G}$-measurable. Next, we pick $A \in \mathcal{G}$, observe that
$$ E^Q\Big[\mathbf{1}_A \tfrac{E[XZ \mid \mathcal{G}]}{E[Z \mid \mathcal{G}]}\Big] = E\Big[Z \mathbf{1}_A \tfrac{E[XZ \mid \mathcal{G}]}{E[Z \mid \mathcal{G}]}\Big] = E\Big[E[Z \mid \mathcal{G}] \mathbf{1}_A \tfrac{E[XZ \mid \mathcal{G}]}{E[Z \mid \mathcal{G}]}\Big] = E\big[E[ZX \mathbf{1}_A \mid \mathcal{G}]\big] = E^Q[X \mathbf{1}_A], $$
and remember the definition of conditional expectation.

Proof of Proposition 22.1. Suppose, first, that $X$ is a $Q$-martingale. Then $E^Q[X_t \mid \mathcal{F}_s] = X_s$, $Q$-a.s. By the tower property of conditional expectation, the random variable $Z_t$ is the Radon-Nikodym derivative of (the

restriction of) $Q$ with respect to (the restriction of) $P$ on the probability space $(\Omega, \mathcal{F}_t, P)$ (prove this yourself!). Therefore, we can use Lemma 22.2 with $\mathcal{F}_t$ playing the role of $\mathcal{H}$ and $\mathcal{F}_s$ the role of $\mathcal{G}$, and rewrite the $Q$-martingale property of $X$ as
$$ \frac{1}{Z_s} E[X_t Z_t \mid \mathcal{F}_s] = X_s, \ Q\text{-a.s., i.e., } E[X_t Z_t \mid \mathcal{F}_s] = Z_s X_s, \ P\text{-a.s.} \tag{22.1} $$
We leave the other direction, as well as the case of a local martingale, to the reader.

Proposition 22.3. Suppose that the density process $\{Z_t\}_{t \in [0,\infty)}$ is continuous. Let $X$ be a continuous semimartingale under $P$ with decomposition $X = X_0 + M + A$. Then $X$ is also a $Q$-semimartingale, and its $Q$-semimartingale decomposition is given by $X = X_0 + N + B$, where
$$ N = M - F, \quad B = A + F, \quad \text{where } F_t = \int_0^t \frac{1}{Z_u} \, d\langle M, Z \rangle_u. $$

Proof. The process $F$ is clearly well-defined, continuous, adapted and of finite variation, so it will be enough to show that $M - F$ is a $Q$-local martingale. Using Proposition 22.1, we only need to show that $Y = Z(M - F)$ is a $P$-local martingale. By Itô's formula (integration-by-parts), the finite-variation part of $Y$ is given by
$$ -\int_0^t Z_u \, dF_u + \langle Z, M \rangle_t, $$
and it is easily seen to vanish using the associative property of Stieltjes integration.

One of the most important applications of the above result is to the case of a Brownian motion.

[Figure: a cloud of simulated Brownian paths on $[0,3]$]

Theorem 22.4 (Girsanov; Cameron and Martin). Suppose that the filtration $\{\mathcal{F}_t\}_{t \in [0,\infty)}$ is the usual augmentation of the natural filtration generated by a Brownian motion $\{B_t\}_{t \in [0,\infty)}$.

1. Let $Q \sim P$ be a probability measure on $\mathcal{F}_\infty$ and let $\{Z_t\}_{t \in [0,\infty)}$ be the corresponding density process, i.e., $Z_t = E[\frac{dQ}{dP} \mid \mathcal{F}_t]$. Then there exists a predictable process $\{\theta_t\}_{t \in [0,\infty)}$ in $L(B)$ such that $Z = \mathcal{E}(\int_0^\cdot \theta_u \, dB_u)$ and
$$ B_t - \int_0^t \theta_u \, du \text{ is a } Q\text{-Brownian motion.} $$

2. Conversely, let $\{\theta_t\}_{t \in [0,\infty)} \in L(B)$ have the property that the process $Z = \mathcal{E}(\int_0^\cdot \theta_u \, dB_u)$ is a uniformly-integrable martingale with $Z_\infty > 0$, a.s.
For any probability measure $Q \sim P$ such that $E[\frac{dQ}{dP} \mid \mathcal{F}_\infty] = Z_\infty$,
$$ B_t - \int_0^t \theta_u \, du, \quad t \geq 0, \text{ is a } Q\text{-Brownian motion.} $$

[Figure: the same cloud with darker-colored paths corresponding to higher values of the Radon-Nikodym derivative $Z_3$]
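Before turning to the proof, part 2 of the theorem can be sanity-checked numerically in its simplest special case. The sketch below assumes the constant drift $\theta_t \equiv \mu$ (so that $Z_T = \exp(\mu B_T - \frac{1}{2}\mu^2 T)$; the parameter values are arbitrary) and verifies by importance sampling that $E^Q[B_T] \approx \mu T$, consistent with $B_t - \mu t$ being a $Q$-Brownian motion.

```python
import math
import random

# Sanity check of Girsanov's theorem for the special case theta_t = mu (constant).
# Under Q, with dQ/dP = Z_T = exp(mu*B_T - mu^2*T/2), the process B_t - mu*t is a
# Brownian motion, so E_Q[B_T] should equal mu*T.  We estimate E_Q[B_T] by
# importance sampling: E_Q[B_T] = E_P[Z_T * B_T].

random.seed(0)
mu, T, n_samples = 0.7, 2.0, 200_000

acc = 0.0
for _ in range(n_samples):
    b_T = random.gauss(0.0, math.sqrt(T))          # B_T ~ N(0, T) under P
    z_T = math.exp(mu * b_T - 0.5 * mu**2 * T)     # Radon-Nikodym density Z_T
    acc += z_T * b_T

estimate = acc / n_samples                         # Monte Carlo estimate of E_Q[B_T]
print(estimate, "vs", mu * T)
```

Reweighting by $Z_T$ is exactly how the measure change acts on expectations: $E^Q[\,\cdot\,] = E[Z_T\,\cdot\,]$ on $\mathcal{F}_T$.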

Proof. 1. We start with an application of the martingale representation theorem (Proposition 21.6). It implies that there exists a process $\rho \in L(B)$ such that
$$ Z_t = 1 + \int_0^t \rho_u \, dB_u. $$
Since $Z$ is continuous and bounded away from zero on each segment, the process $\{\theta_t\}_{t \in [0,\infty)}$, given by $\theta_t = \rho_t / Z_t$, is in $L(B)$ and we have
$$ Z_t = 1 + \int_0^t Z_u \theta_u \, dB_u. $$
Hence, $Z = \mathcal{E}(\int_0^\cdot \theta_u \, dB_u)$. Proposition 22.3 states that $B$ is a $Q$-semimartingale with decomposition $B = (B - F) + F$, where the continuous FV-process $F$ is given by
$$ F_t = \int_0^t \frac{1}{Z_u} \, d\langle B, Z \rangle_u = \int_0^t \frac{1}{Z_u} Z_u \theta_u \, du = \int_0^t \theta_u \, du. $$
In particular, $B - F$ is a $Q$-local martingale. On the other hand, its quadratic variation (as a limit in $P$-, and therefore in $Q$-probability) is that of $B$, so, by Lévy's characterization, $B - F$ is a $Q$-Brownian motion.

2. We only need to realize that any measure $Q \sim P$ with $E[\frac{dQ}{dP} \mid \mathcal{F}_\infty] = Z_\infty$ will have $Z$ as its density process. The rest follows from (1).

Even though we stated it on $[0,\infty)$, most applications of Girsanov's theorem are on finite intervals $[0,T]$, with $T > 0$. The reason is that the condition that $\mathcal{E}(\int_0^\cdot \theta_u \, dB_u)$ be uniformly integrable on the entire $[0,\infty)$ is either hard to check or even not satisfied for most practically relevant $\theta$. The simplest conceivable example, $\theta_t = \mu$ for all $t \geq 0$ and some $\mu \in \mathbb{R} \setminus \{0\}$, gives rise to the exponential martingale
$$ Z_t = e^{\mu B_t - \frac{1}{2} \mu^2 t}, $$
which is not uniformly integrable on $[0,\infty)$ (why?). On any finite horizon $[0,T]$, the (deterministic) process $\mu \mathbf{1}_{\{t \leq T\}}$ satisfies the conditions of Girsanov's theorem, and there exists a probability measure $P^{\mu,T}$ on $\mathcal{F}_T$ with the property that $\hat{B}_t = B_t - \mu t$ is a $P^{\mu,T}$-Brownian motion on $[0,T]$. It is clear, furthermore, that for $T_1 < T_2$, $P^{\mu,T_1}$ coincides with the restriction of $P^{\mu,T_2}$ onto $\mathcal{F}_{T_1}$. Our life would be easier if this consistency property could be extended all the way up to $\mathcal{F}_\infty$. It can be shown that this can, indeed, be done in the canonical setting, but not in the same equivalence class.
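The failure of uniform integrability of $Z_t = e^{\mu B_t - \frac{1}{2}\mu^2 t}$ can be seen in a quick simulation. The sketch below (illustrative only; the parameter values are arbitrary) checks that $E[Z_t] = 1$ at a moderate time, while at a large time almost every sampled value of $Z_t$ is already negligible, the behavior that forces $Z_\infty = 0$ a.s.

```python
import math
import random

# The exponential martingale Z_t = exp(mu*B_t - mu^2*t/2) has E[Z_t] = 1 for every
# t, yet Z_t -> 0 almost surely as t -> infinity -- so it cannot be uniformly
# integrable on [0, infinity).

random.seed(1)
mu = 0.3

def sample_Z(t, n):
    """Draw n independent samples of Z_t = exp(mu*B_t - mu^2*t/2)."""
    return [math.exp(mu * random.gauss(0.0, math.sqrt(t)) - 0.5 * mu**2 * t)
            for _ in range(n)]

# At a moderate time, the martingale property E[Z_t] = 1 is visible...
z_mod = sample_Z(10.0, 200_000)
mean_mod = sum(z_mod) / len(z_mod)

# ...while at a large time, almost every sampled value is already close to 0.
z_large = sample_Z(400.0, 5_000)
frac_tiny = sum(z < 0.01 for z in z_large) / len(z_large)

print(mean_mod, frac_tiny)
```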
Indeed, suppose that there exists a probability measure $P^\mu$ on $\mathcal{F}_\infty$, equivalent to $P$, such that $P^\mu$, restricted to $\mathcal{F}_T$, coincides with $P^{\mu,T}$, for each $T > 0$. Let $\{Z_t\}_{t \in [0,\infty)}$ be the density process of $P^\mu$ with respect to $P$. It follows that
$$ Z_T = \exp(\mu B_T - \tfrac{1}{2} \mu^2 T), \text{ for all } T > 0. $$

Since $\mu \neq 0$, we have $\mu B_T - \frac{1}{2}\mu^2 T \to -\infty$, a.s., as $T \to \infty$, and, so,
$$ Z_\infty = \lim_{T \to \infty} Z_T = 0, \text{ a.s.} $$
On the other hand, $Z_\infty$ is the Radon-Nikodym derivative of $P^\mu$ with respect to $P$ on $\mathcal{F}_\infty$, and we conclude that $P^\mu$ must be singular with respect to $P$, contradicting the assumed equivalence.

Here is a slightly different perspective on the fact that $P$ and $P^\mu$ must be mutually singular: for the event $A \in \mathcal{F}_\infty$, given by
$$ A = \Big\{ \lim_{t \to \infty} \tfrac{B_t}{t} = 0 \Big\}, $$
we have $P[A] = 1$, by the Law of Large Numbers for the Brownian motion. On the other hand, with $\hat{B}_t$ being a $P^\mu$-Brownian motion, we have
$$ P^\mu[A] = P^\mu\Big[ \lim_{t \to \infty} \tfrac{B_t}{t} = 0 \Big] = P^\mu\Big[ \lim_{t \to \infty} \tfrac{\hat{B}_t}{t} = -\mu \Big] = 0, $$
because $\hat{B}_t / t \to 0$, $P^\mu$-a.s.

Not everything is lost, though, as we can still employ Girsanov's theorem in many practical situations. Here is one (where we take $P[X \in dx] = f(x)\,dx$ to mean that $f(x)$ is the density of the distribution of $X$).

Example 22.5 (Hitting times of the Brownian motion with drift). Define $\tau_a = \inf\{t : B_t = a\}$, for $a > 0$. By the formula derived in a homework, we have
$$ P[\tau_a \in dt] = \frac{a}{\sqrt{2\pi t^3}} e^{-\frac{a^2}{2t}} \, dt. $$
For $T \geq 0$, (the restriction of) $P$ and $P^{\mu,T}$ are equivalent on $\mathcal{F}_T$ with the Radon-Nikodym derivative
$$ \frac{dP^{\mu,T}}{dP} = \exp(\mu B_T - \tfrac{1}{2}\mu^2 T). $$
The optional sampling theorem (justified by the uniform integrability of the martingale $\exp(\mu B_t - \frac{1}{2}\mu^2 t)$ on $[0,T]$) and the fact that $\{\tau_a \leq T\} \in \mathcal{F}_{\tau_a \wedge T} \subseteq \mathcal{F}_T$ imply that
$$ E[\exp(\mu B_T - \tfrac{1}{2}\mu^2 T) \mid \mathcal{F}_{\tau_a \wedge T}] = \exp(\mu B_{\tau_a \wedge T} - \tfrac{1}{2}\mu^2 (\tau_a \wedge T)). $$
Therefore,
$$ \begin{aligned} P^{\mu,T}[\tau_a \leq T] &= E^{\mu,T}[\mathbf{1}_{\{\tau_a \leq T\}}] = E[\exp(\mu B_T - \tfrac{1}{2}\mu^2 T) \mathbf{1}_{\{\tau_a \leq T\}}] \\ &= E[\exp(\mu B_{\tau_a \wedge T} - \tfrac{1}{2}\mu^2 (\tau_a \wedge T)) \mathbf{1}_{\{\tau_a \leq T\}}] = E[\exp(\mu B_{\tau_a} - \tfrac{1}{2}\mu^2 \tau_a) \mathbf{1}_{\{\tau_a \leq T\}}] \\ &= E[\exp(\mu a - \tfrac{1}{2}\mu^2 \tau_a) \mathbf{1}_{\{\tau_a \leq T\}}] = \int_0^T e^{\mu a - \frac{1}{2}\mu^2 t} \, P[\tau_a \in dt] \\ &= \int_0^T e^{\mu a - \frac{1}{2}\mu^2 t} \frac{a}{\sqrt{2\pi t^3}} e^{-\frac{a^2}{2t}} \, dt. \end{aligned} $$
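The last integral can be evaluated numerically and compared with the classical closed form for the first-passage probability of a Brownian motion with drift (the inverse Gaussian distribution function), which is quoted here as an external cross-check rather than derived in these notes. A quick sketch, with arbitrary parameter values:

```python
import math

# Check the formula just derived: P^{mu,T}[tau_a <= T] equals the integral of
# exp(mu*a - mu^2*t/2) against the density of tau_a.  We compare with the standard
# closed form (quoted, not derived here):
#   P[tau_hat_a <= T] = Phi((mu*T - a)/sqrt(T)) + exp(2*mu*a)*Phi((-mu*T - a)/sqrt(T))

def Phi(x):
    """Standard normal distribution function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

a, mu, T = 1.0, 0.5, 4.0

# Trapezoidal integration of e^{mu*a - mu^2 t/2} * a/sqrt(2 pi t^3) * e^{-a^2/(2t)};
# the integrand vanishes at t = 0, so that endpoint is omitted.
n = 200_000
dt = T / n
integral = 0.0
for i in range(1, n + 1):
    t = i * dt
    f = math.exp(mu * a - 0.5 * mu**2 * t) * a / math.sqrt(2 * math.pi * t**3) \
        * math.exp(-a**2 / (2 * t))
    weight = 0.5 if i == n else 1.0
    integral += weight * f * dt

closed_form = Phi((mu * T - a) / math.sqrt(T)) \
    + math.exp(2 * mu * a) * Phi((-mu * T - a) / math.sqrt(T))
print(integral, closed_form)
```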

On the other hand, $\{B_t - \mu t\}_{t \in [0,T]}$ is a Brownian motion under $P^{\mu,T}$, so $P^{\mu,T}[\tau_a \leq T] = P[\hat{\tau}_a \leq T]$, where $\hat{\tau}_a$ is the first hitting time of the level $a$ of the Brownian motion with drift $\mu$. It follows immediately that the "density" of $\hat{\tau}_a$ is given by
$$ P[\hat{\tau}_a \in dt] = \frac{a}{\sqrt{2\pi t^3}} e^{-\frac{(a - \mu t)^2}{2t}} \, dt. $$
We quote the word "density" because, if one tries to integrate it over all $t$, one gets
$$ P[\hat{\tau}_a < \infty] = \int_0^\infty \frac{a}{\sqrt{2\pi t^3}} e^{-\frac{(a - \mu t)^2}{2t}} \, dt = \exp(\mu a - |\mu a|). $$
In words, if $\mu$ and $a$ have the same sign, the Brownian motion with drift $\mu$ will hit $a$ sooner or later. On the other hand, if they differ in sign, the probability that it will never get there is strictly positive and equal to $1 - e^{-2|\mu a|}$.

Kazamaki's and Novikov's criteria

The message of the second part of Theorem 22.4 is that, given a drift process $\{\theta_t\}_{t \in [0,\infty)}$, we can turn a Brownian motion into a Brownian motion with drift $\theta$, provided, essentially, that a certain exponential martingale is a UI martingale. Even though useful sufficient conditions for the martingality of stochastic integrals are known, the situation is much less pleasant in the case of stochastic exponentials. The most well-known criterion is the one of Novikov. Novikov's criterion is, in turn, implied by a slightly stronger criterion of Kazamaki. We start with an auxiliary integrability result. In addition to the role it plays in the proof of Kazamaki's criterion, it is useful when one needs $\mathcal{E}(M)$ to be a little more than just a martingale.

Lemma 22.6. Let $\mathcal{E}(M)$ be the stochastic exponential of $M \in \mathcal{M}_{loc,c}$. If
$$ \sup_{\tau \in \mathcal{S}_b} E[e^{a M_\tau}] < \infty, \text{ for some constant } a > \tfrac{1}{2}, $$
where the supremum is taken over the set $\mathcal{S}_b$ of all finite-valued stopping times $\tau$, then $\mathcal{E}(M)$ is an $L^p$-bounded martingale for $p = \frac{4a^2}{4a - 1} \in (1, \infty)$.

Proof. We pick a finite stopping time $\tau$ and start from the following identity, which is valid for all constants $p, s > 0$:
$$ \mathcal{E}(M)^p_\tau = \mathcal{E}\big(\sqrt{p/s}\, M\big)^s_\tau \, e^{(p - \sqrt{ps}) M_\tau}. $$

For $1 > s > 0$, we can use Hölder's inequality (note that $1/s$ and $1/(1-s)$ are conjugate exponents) to obtain
$$ E[\mathcal{E}(M)^p_\tau] \leq \Big( E\big[\mathcal{E}\big(\sqrt{p/s}\, M\big)_\tau\big] \Big)^s \Big( E\Big[\exp\Big(\tfrac{p - \sqrt{ps}}{1 - s}\, M_\tau\Big)\Big] \Big)^{1-s}. \tag{22.2} $$
The first term of the product is the $s$-th power of the expectation of a positive local martingale (and, therefore, supermartingale) sampled at a finite stopping time. By the optional sampling theorem, it is always finite (actually, it is not greater than $1$). As for the second term, one can easily check that the expression $\frac{p - \sqrt{ps}}{1 - s}$ attains its minimum in $s$ over $(0,1)$ at
$$ s = 2p - 1 - 2\sqrt{p^2 - p}, $$
and that this minimum value equals $f(p)$, where
$$ f(p) = \tfrac{1}{2}\big(p + \sqrt{p^2 - p}\big). $$
If we pick $p = \frac{4a^2}{4a-1}$, then $f(p) = a$ and both terms on the right-hand side of (22.2) are bounded, uniformly in $\tau$, so that $\mathcal{E}(M)$ is in fact a martingale and bounded in $L^p$ (why did we have to consider all stopping times $\tau$, and not only deterministic times?).

Proposition 22.7 (Kazamaki's criterion). Suppose that for $M \in \mathcal{M}_{loc,c}$ we have
$$ \sup_{\tau \in \mathcal{S}_b} E[e^{\frac{1}{2} M_\tau}] < \infty, $$
where the supremum is taken over the set $\mathcal{S}_b$ of all finite-valued stopping times. Then $\mathcal{E}(M)$ is a uniformly integrable martingale.

Proof. Note, first, that the function $x \mapsto \exp(\frac{1}{2} x)$ is a test function of uniform integrability, so that the local martingale $M$ is a uniformly integrable martingale and admits the last element $M_\infty$. For the continuous martingale $cM$, where $0 < c < 1$ is an arbitrary constant, Lemma 22.6 and the assumption imply that the local martingale $\mathcal{E}(cM)$ is, in fact, a martingale bounded in $L^p$, for $p = \frac{1}{2c - c^2}$. In particular, it is uniformly integrable. Therefore,
$$ \mathcal{E}(cM)_t = \exp(c M_t - \tfrac{1}{2} c^2 \langle M \rangle_t) = \mathcal{E}(M)^{c^2}_t \, e^{c(1-c) M_t}. \tag{22.3} $$
By letting $t \to \infty$ in (22.3), we conclude that $\mathcal{E}(M)$ has the last element $\mathcal{E}(M)_\infty$, and that the equality in (22.3) holds at $t = \infty$, as well. By Hölder's inequality with conjugate exponents $1/c^2$ and $1/(1-c^2)$, we have
$$ 1 = E[\mathcal{E}(cM)_\infty] \leq E[\mathcal{E}(M)_\infty]^{c^2} \, E\big[\exp\big(\tfrac{c}{1+c} M_\infty\big)\big]^{1-c^2}. $$
Jensen's inequality implies that
$$ E\big[\exp\big(\tfrac{c}{1+c} M_\infty\big)\big] \leq E\big[\exp\big(\tfrac{1}{2} M_\infty\big)\big]^{\frac{2c}{1+c}}, $$
and so
$$ 1 \leq E[\mathcal{E}(M)_\infty]^{c^2} \, E\big[\exp\big(\tfrac{1}{2} M_\infty\big)\big]^{2c(1-c)}. $$
We let $c \to 1$ to get $E[\mathcal{E}(M)_\infty] \geq 1$, which, together with the non-negative supermartingale property of $\mathcal{E}(M)$, implies that $\mathcal{E}(M)$ is a uniformly-integrable martingale.
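The calculus claims in the proof of Lemma 22.6, namely the location of the minimizer of $\frac{p - \sqrt{ps}}{1-s}$, the minimum value $f(p)$, and the fact that $p = \frac{4a^2}{4a-1}$ yields $f(p) = a$, are easy to confirm numerically. A small sketch (the chosen values of $a$ are arbitrary):

```python
import math

# Numerical check of the optimization in the proof of Lemma 22.6: for p > 1 the
# function g(s) = (p - sqrt(p*s))/(1 - s) on (0,1) is minimized at
# s* = 2p - 1 - 2*sqrt(p^2 - p), the minimum value is f(p) = (p + sqrt(p^2 - p))/2,
# and the choice p = 4a^2/(4a - 1) yields f(p) = a.

def g(p, s):
    return (p - math.sqrt(p * s)) / (1.0 - s)

results = {}
for a in (0.6, 1.0, 2.5, 10.0):          # arbitrary test values with a > 1/2
    p = 4 * a**2 / (4 * a - 1)
    s_star = 2 * p - 1 - 2 * math.sqrt(p**2 - p)
    f_p = 0.5 * (p + math.sqrt(p**2 - p))
    grid_min = min(g(p, i / 10_000) for i in range(1, 10_000))  # brute-force minimum
    results[a] = (abs(g(p, s_star) - a), abs(f_p - a), grid_min - a)
    print(a, results[a])
```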

Theorem 22.8 (Novikov's criterion). If $M \in \mathcal{M}_{loc,c}$ is such that $E[e^{\frac{1}{2} \langle M \rangle_\infty}] < \infty$, then $\mathcal{E}(M)$ is a uniformly integrable martingale.

Proof. Since $e^{\frac{1}{2} M_\tau} = \mathcal{E}(M)^{1/2}_\tau \, e^{\frac{1}{4} \langle M \rangle_\tau}$, the Cauchy-Schwarz inequality implies that
$$ E[e^{\frac{1}{2} M_\tau}] \leq E[\mathcal{E}(M)_\tau]^{1/2} \, E[e^{\frac{1}{2} \langle M \rangle_\tau}]^{1/2} \leq E[e^{\frac{1}{2} \langle M \rangle_\infty}]^{1/2}, $$
and Kazamaki's criterion can be applied.
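As an illustration of Novikov's criterion (a sketch with arbitrarily chosen parameters), take $M_t = \mu B_{t \wedge T}$ for a deterministic $T$. Then $\langle M \rangle_\infty = \mu^2 T$ is bounded, so $E[e^{\frac{1}{2}\langle M \rangle_\infty}] < \infty$ and the theorem guarantees that $\mathcal{E}(M)$ is a UI martingale; in particular $E[\mathcal{E}(M)_T] = 1$, which a quick Monte Carlo estimate confirms.

```python
import math
import random

# Novikov's criterion for M_t = mu * B_{t ∧ T}: here <M>_infty = mu^2 * T is
# bounded (deterministic), so E[exp(<M>_infty / 2)] < infinity and E(M) must be a
# uniformly integrable martingale.  In particular
# E[E(M)_T] = E[exp(mu*B_T - mu^2*T/2)] = 1, which we verify by simulation.

random.seed(2)
mu, T, n_samples = 0.5, 1.0, 200_000

total = 0.0
for _ in range(n_samples):
    b_T = random.gauss(0.0, math.sqrt(T))
    total += math.exp(mu * b_T - 0.5 * mu**2 * T)   # E(M)_T along this sample

mean_E = total / n_samples
print(mean_E)   # should be close to 1
```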