The Cameron-Martin-Girsanov (CMG) Theorem

There are many versions of the CMG Theorem. In some sense, there are many CMG Theorems. The first version appeared in [1] in 1944. Here we present a standard version, which gives the spirit of these theorems.

1 Introductory material

Let $(\Omega, \mathcal{F}, P)$ be a probability space. We can introduce other probability measures on $(\Omega, \mathcal{F})$ and obtain other probability spaces with the same sample space $\Omega$ and the same $\sigma$-algebra of events $\mathcal{F}$. One easy way to do that is to start with a random variable $\Theta$ on $(\Omega, \mathcal{F})$ such that

$$\Theta \geq 0 \ \ P\text{-a.s.} \quad \text{and} \quad E_P[\Theta] = 1, \qquad (1.1)$$

where $E_P[\,\cdot\,]$ denotes the expectation associated to the measure $P$. Then, we can obtain a probability measure $Q$ on $(\Omega, \mathcal{F})$ by setting

$$Q(A) = E_P[\Theta 1_A] \quad \text{for all } A \in \mathcal{F}, \qquad (1.2)$$

where $1_A$ is the indicator (function) of the event $A$. It is easy to see that $Q$ is a probability measure on $(\Omega, \mathcal{F})$ and, furthermore, that

$$Q(A) = 0 \quad \text{whenever } P(A) = 0. \qquad (1.3)$$

Recall that if two measures $P$ and $Q$ are related as in formula (1.3), we say that $Q$ is absolutely continuous with respect to $P$ and denote this as

$$Q \ll P. \qquad (1.4)$$

In fact, the Radon-Nikodym Theorem states that the above construction of $Q$ is the only way to obtain probability measures which are absolutely continuous with respect to $P$, namely: if $Q$ is a probability measure on $(\Omega, \mathcal{F})$ and $Q \ll P$, then there is a random variable $\Theta$ on $(\Omega, \mathcal{F})$ satisfying (1.1) for which $Q(A) = E_P[\Theta 1_A]$ for all $A \in \mathcal{F}$. The random variable $\Theta$ is called the Radon-Nikodym derivative of $Q$ with respect to $P$. Symbolically,

$$\Theta = \frac{dQ}{dP},$$

and it is not hard to see that $\Theta$ is unique $P$-a.s. Furthermore, if we also have that $P \ll Q$, then $dP/dQ = 1/\Theta$.

If $E_Q[\,\cdot\,]$ is the expectation associated to the measure $Q$ of (1.2), then

$$E_Q[1_A] = Q(A) = E_P[\Theta 1_A] \quad \text{for all } A \in \mathcal{F},$$

and hence

$$E_Q[X] = E_P[\Theta X] \quad \text{for any r.v. } X \text{ on } (\Omega, \mathcal{F}) \qquad (1.5)$$

(if $E_P[\Theta X]$ does not exist, then neither does $E_Q[X]$).
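To make the construction (1.1)-(1.3) concrete, here is a minimal numerical sketch (a Python illustration with arbitrary made-up numbers; it is not part of the original notes) on a finite sample space. It checks that $Q$ defined by (1.2) is a probability measure and that $Q \ll P$ need not imply $P \ll Q$.

    import numpy as np

    # Finite sample space with 6 outcomes; P is the uniform measure.
    p = np.full(6, 1.0 / 6.0)

    # A density Theta with Theta >= 0 and E_P[Theta] = 1, as in (1.1).
    theta = np.array([0.0, 0.5, 1.5, 2.0, 1.0, 1.0])
    assert np.isclose((theta * p).sum(), 1.0)

    # Q(A) = E_P[Theta 1_A], as in (1.2); on a finite space Q is just
    # the vector of point masses theta * p.
    q = theta * p
    print("Q is a probability measure:", np.isclose(q.sum(), 1.0))

    # Absolute continuity (1.3): Q vanishes wherever P does. The converse
    # fails here: Q gives mass 0 to the first outcome while P gives 1/6,
    # so Q << P but P is not << Q.
    print("Q({omega_0}) =", q[0], " P({omega_0}) =", p[0])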

The following proposition tells us how we can express conditional expectations associated to the measure $Q$ of (1.2) in terms of conditional expectations associated to $P$.

Proposition 1. Let $\mathcal{B}$ be a sub-$\sigma$-algebra of $\mathcal{F}$. Then

$$E_Q[X \mid \mathcal{B}] = \frac{E_P[\Theta X \mid \mathcal{B}]}{E_P[\Theta \mid \mathcal{B}]} \quad Q\text{-a.s.} \qquad (1.6)$$

Proof. For typographical convenience let us put

$$\Psi := E_P[\Theta \mid \mathcal{B}]. \qquad (1.7)$$

Of course, $\Psi \geq 0$ $P$-a.s., $\Psi \in \mathcal{M}(\mathcal{B})$ (i.e. $\Psi$ is $\mathcal{B}$-measurable), and

$$E_P[\Psi] = E_P\big[E_P[\Theta \mid \mathcal{B}]\big] = E_P[\Theta] = 1. \qquad (1.8)$$

Because of the above we can define another probability measure $\widetilde{Q}$ on $(\Omega, \mathcal{F})$ as

$$\widetilde{Q}(A) = E_P[\Psi 1_A]. \qquad (1.9)$$

Observe that, for $A \in \mathcal{B}$, we have

$$Q(A) = E_P[\Theta 1_A] = E_P\big[E_P[\Theta 1_A \mid \mathcal{B}]\big] = E_P\big[1_A \, E_P[\Theta \mid \mathcal{B}]\big] = E_P[\Psi 1_A],$$

that is (in view of (1.9))

$$\widetilde{Q}(A) = Q(A) \quad \text{for all } A \in \mathcal{B}, \qquad (1.10)$$

which also implies immediately that

$$E_{\widetilde{Q}}[X] = E_Q[X] \quad \text{for all } X \in \mathcal{M}(\mathcal{B}). \qquad (1.11)$$

We continue by setting

$$Y_1 := E_P[\Theta X \mid \mathcal{B}] \quad \text{and} \quad Y_2 := \Psi \, E_Q[X \mid \mathcal{B}]. \qquad (1.12)$$

Then, formula (1.6) is equivalent to

$$Y_1 = Y_2 \quad Q\text{-a.s.} \qquad (1.13)$$

Notice that $Y_1, Y_2 \in \mathcal{M}(\mathcal{B})$. Hence, in view of (1.10), to establish (1.13) it suffices to show that

$$E_{\widetilde{Q}}[Y_1 1_A] = E_{\widetilde{Q}}[Y_2 1_A] \quad \text{for all } A \in \mathcal{B}. \qquad (1.14)$$

Now

$$E_{\widetilde{Q}}[Y_1 1_A] = E_P[\Psi Y_1 1_A] = E_P\big[\Psi \, E_P[\Theta X \mid \mathcal{B}] \, 1_A\big] = E_P\big[E_P[\Psi \Theta X 1_A \mid \mathcal{B}]\big] = E_P[\Psi \Theta X 1_A] = E_Q[\Psi X 1_A]$$

and (with the help of (1.11))

$$E_{\widetilde{Q}}[Y_2 1_A] = E_{\widetilde{Q}}\big[\Psi \, E_Q[X \mid \mathcal{B}] \, 1_A\big] = E_Q\big[\Psi \, E_Q[X \mid \mathcal{B}] \, 1_A\big] = E_Q\big[E_Q[\Psi X 1_A \mid \mathcal{B}]\big] = E_Q[\Psi X 1_A].$$

Therefore (1.14) is true and the proof is completed.
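As a numerical sanity check of (1.6), here is a small Python sketch (the same toy finite-space setup as above, with arbitrary numbers; not part of the original notes). It takes $\mathcal{B}$ generated by a two-block partition, on which conditional expectations are constant block by block, and compares both sides of (1.6).

    import numpy as np

    p = np.full(6, 1.0 / 6.0)                         # P uniform on 6 points
    theta = np.array([0.0, 0.5, 1.5, 2.0, 1.0, 1.0])  # density dQ/dP
    q = theta * p                                     # point masses of Q
    x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])      # a random variable X

    # B is generated by the partition {0,1,2} u {3,4,5}; a conditional
    # expectation given B is constant on each block.
    for blk in [np.array([0, 1, 2]), np.array([3, 4, 5])]:
        lhs = (x[blk] * q[blk]).sum() / q[blk].sum()              # E_Q[X | B]
        rhs = (theta * x * p)[blk].sum() / (theta * p)[blk].sum() # RHS of (1.6)
        print(np.isclose(lhs, rhs))                               # True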

Why do we need to consider more than one probability measure? First of all, for educational purposes, since it helps to clarify certain concepts and issues. But also because such situations arise in applications. For example, two persons may want to divide a pie (or a property) into two pieces so that each of them feels (s)he got a fair share; however, each person may have a different measure of fairness.

2 An example

A random variable $X$ on $(\Omega, \mathcal{F})$ is an $\mathcal{F}$-measurable function $X : \Omega \to \mathbb{R}$. Thus, $X$ depends on $\Omega$ and $\mathcal{F}$. However, $X$ does not depend on the probability measure put on $(\Omega, \mathcal{F})$. It is the distribution of $X$ which depends on the measure. For instance, if $P$ and $Q$ are two probability measures on $(\Omega, \mathcal{F})$, then the distribution functions of $X$ with respect to $P$ and $Q$ respectively are

$$F_P(x) = P\{X \leq x\} \quad \text{and} \quad F_Q(x) = Q\{X \leq x\}.$$

Of course, in general $F_P(x) \neq F_Q(x)$. Thus, e.g., the phrase "$X$ is a normal random variable" (i.e. "$X$ is a normally distributed random variable") may be misleading (or, at least, equivocal) if we work with more than one probability measure, since the distribution of $X$ depends on the measure. The same remarks apply, of course, to random vectors and stochastic processes. For example, a process which is a Brownian motion with respect to a measure $P$ will, in general, not be a Brownian motion with respect to another measure $Q$. Furthermore, if $X$ and $Y$ are random variables which are independent with respect to $P$, they may not be independent with respect to another measure $Q$. These issues do not come up if there is only one measure, but if more than one measure is around, they may become very important.

Let us look at a specific example (taken from [3]). Suppose we have a probability space $(\Omega, \mathcal{F}, P)$ and a random variable $Z$ on this space which has the standard normal distribution (with respect to $P$). That is,

$$P\{Z \leq z\} = \Phi(z) = \frac{1}{\sqrt{2\pi}} \int_{-\infty}^{z} e^{-\xi^2/2} \, d\xi, \quad z \in \mathbb{R}. \qquad (2.1)$$

For typographical convenience we write $Z \sim_P N(0, 1)$. Of course, there are many random variables on $(\Omega, \mathcal{F}, P)$ having the standard normal distribution (e.g., due to the symmetry of $\Phi(z)$, if $Z$ is such a variable, then so is $-Z$). Recall that

$$E_P[Z] = 0, \quad V_P[Z] = 1 \qquad (2.2)$$

(where $V_P[\,\cdot\,]$ is the variance with respect to $P$), and

$$E_P[e^{\mu Z}] = e^{\mu^2/2} \quad \text{for any } \mu \in \mathbb{C}. \qquad (2.3)$$

Equation (2.3) can be written as

$$E_P\big[e^{\mu Z - \frac{1}{2}\mu^2}\big] = 1. \qquad (2.4)$$

Thus, for a $Z$ as above and a $\mu \in \mathbb{R}$ we can introduce the probability measure $Q$ defined by

$$Q(A) = E_P\big[e^{\mu Z - \frac{1}{2}\mu^2} 1_A\big], \quad A \in \mathcal{F}. \qquad (2.5)$$

A question arising here is: what is the distribution of $Z$ with respect to $Q$? We have

$$Q\{Z \leq x\} = E_P\big[e^{\mu Z - \frac{1}{2}\mu^2} 1_{\{Z \leq x\}}\big] = \int_{-\infty}^{\infty} e^{\mu z - \frac{1}{2}\mu^2} 1_{\{z \leq x\}} \frac{1}{\sqrt{2\pi}} e^{-z^2/2} \, dz = \int_{-\infty}^{x} \frac{1}{\sqrt{2\pi}} e^{-\frac{(z - \mu)^2}{2}} \, dz,$$

i.e., with respect to $Q$, $Z$ is a normal random variable with mean $\mu$ and variance $1$; symbolically, $Z \sim_Q N(\mu, 1)$, which can be written equivalently as $(Z - \mu) \sim_Q N(0, 1)$, i.e. $Z - \mu$ has the standard normal distribution with respect to $Q$.
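A quick Monte Carlo sketch (a Python illustration with an arbitrary choice of $\mu$; not part of the original notes) confirms this: weighting samples of $Z \sim_P N(0,1)$ by the density $e^{\mu Z - \mu^2/2}$ of (2.5) reproduces the moments of $N(\mu, 1)$.

    import numpy as np

    rng = np.random.default_rng(0)
    mu = 1.3                                  # arbitrary drift parameter
    z = rng.standard_normal(1_000_000)        # samples of Z ~ N(0,1) under P

    # Radon-Nikodym density dQ/dP = exp(mu*Z - mu^2/2), cf. (2.5).
    theta = np.exp(mu * z - 0.5 * mu**2)

    # By (1.5), E_Q[h(Z)] = E_P[Theta h(Z)], estimated by weighted averages.
    mean_q = np.mean(theta * z)
    var_q = np.mean(theta * z**2) - mean_q**2
    print(mean_q, var_q)   # approximately mu and 1: Z ~ N(mu, 1) under Q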

To extend this example, let the random variables $Z_1, \ldots, Z_n$ be independent and identically distributed with respect to $P$, all having the standard normal distribution. We now set

$$Q(A) = E_P\Big[e^{\sum_{j=1}^{n} \mu_j Z_j - \frac{1}{2}\sum_{j=1}^{n} \mu_j^2} \, 1_A\Big], \quad A \in \mathcal{F}, \qquad (2.6)$$

where $\mu_1, \ldots, \mu_n \in \mathbb{R}$ are given constants. It is not hard to see that $Q$ is a probability measure on $(\Omega, \mathcal{F})$. Again, we would like to find the (joint) distribution of $Z_1, \ldots, Z_n$ with respect to $Q$.

For a reasonable (say bounded and continuous) function $g : \mathbb{R}^n \to \mathbb{R}$ we have (recall (1.5))

$$E_Q[g(Z_1, \ldots, Z_n)] = E_P\Big[e^{\sum_{j=1}^{n} \mu_j Z_j - \frac{1}{2}\sum_{j=1}^{n} \mu_j^2} \, g(Z_1, \ldots, Z_n)\Big] = \int_{\mathbb{R}^n} g(z_1, \ldots, z_n) \, e^{\sum_{j=1}^{n} \mu_j z_j - \frac{1}{2}\sum_{j=1}^{n} \mu_j^2} \, \frac{1}{(2\pi)^{n/2}} e^{-\frac{z_1^2 + \cdots + z_n^2}{2}} \, dz_1 \cdots dz_n = \int_{\mathbb{R}^n} g(z_1, \ldots, z_n) \, \frac{1}{(2\pi)^{n/2}} e^{-\frac{(z_1 - \mu_1)^2}{2}} \cdots e^{-\frac{(z_n - \mu_n)^2}{2}} \, dz_1 \cdots dz_n.$$

It follows that the joint probability density function of $Z_1, \ldots, Z_n$ (with respect to $Q$) is

$$f_Q(z_1, \ldots, z_n) = \frac{1}{(2\pi)^{n/2}} e^{-\frac{(z_1 - \mu_1)^2}{2}} \cdots e^{-\frac{(z_n - \mu_n)^2}{2}}, \qquad (2.7)$$

which tells us that, with respect to $Q$, the variables $Z_1, \ldots, Z_n$ are independent and, furthermore, $Z_j \sim_Q N(\mu_j, 1)$, i.e. $(Z_j - \mu_j) \sim_Q N(0, 1)$, for $j = 1, \ldots, n$. Thus, the random variable

$$\sum_{j=1}^{n} Z_j - \sum_{j=1}^{n} \mu_j \qquad (2.8)$$

is, with respect to $Q$, normally distributed with mean $0$ and variance $n$.

Roughly speaking, the Cameron-Martin-Girsanov Theorem is a continuous version of the above simple example. In fact, having this example in mind, one can guess the statement of the CMG Theorem (see the remark after Theorem 1 in the next section).

3 The Cameron-Martin-Girsanov Theorem

3.1 CMG Theorem in $\mathbb{R}^1$

Consider a probability space $(\Omega, \mathcal{F}, P)$ and a one-dimensional Brownian motion $\{B_t = B(t)\}_{t \geq 0}$ (with respect to $P$) with associated filtration $\{\mathcal{F}_t\}_{t \geq 0}$. Of course, $\mathcal{F}_t$ is a sub-$\sigma$-algebra of $\mathcal{F}$ for every $t$. Next, let $\{X_t = X(t)\}_{t \geq 0}$ be a (real-valued) stochastic process, adapted to the filtration $\{\mathcal{F}_t\}_{t \geq 0}$. We assume that for some constant $T > 0$ we have

$$\int_0^T X_t^2 \, dt < \infty \quad P\text{-a.s.} \qquad (3.1)$$

Having $X_t$, we introduce

$$M_t := e^{Y_t}, \quad \text{where} \quad Y_t := -\frac{1}{2} \int_0^t X_s^2 \, ds + \int_0^t X_s \, dB_s \qquad (3.2)$$

(notice that $M_t > 0$ $P$-a.s.).
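Before proceeding, here is a small simulation sketch of the exponential (3.2) (a Python illustration with a toy deterministic integrand $X_t = \sin t$ and an Euler discretization; none of these choices are in the original notes). It previews the martingale property $E_P[M_t] = 1$ established in (3.4)-(3.6) below.

    import numpy as np

    rng = np.random.default_rng(1)
    n_paths, n_steps, T = 100_000, 100, 1.0
    dt = T / n_steps
    t = np.linspace(0.0, T, n_steps + 1)

    x = np.sin(t[:-1])   # toy adapted integrand X_t = sin(t) (deterministic)

    # Brownian increments over each subinterval, one row per path.
    dB = rng.standard_normal((n_paths, n_steps)) * np.sqrt(dt)

    # Y_t = -(1/2) int_0^t X_s^2 ds + int_0^t X_s dB_s, discretized as in (3.2).
    y = np.cumsum(-0.5 * x**2 * dt + x * dB, axis=1)
    m = np.exp(y)        # M_t = e^{Y_t}, path by path

    print(m[:, -1].mean())   # approximately 1, previewing E_P[M_T] = 1 in (3.6)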

In differential notation we have

$$dY_t = -\frac{1}{2} X_t^2 \, dt + X_t \, dB_t, \quad Y_0 = 0. \qquad (3.3)$$

Applying Itô calculus to (3.2) yields

$$dM_t = e^{Y_t} \, dY_t + \frac{1}{2} e^{Y_t} X_t^2 \, dt = M_t X_t \, dB_t, \quad M_0 = 1. \qquad (3.4)$$

Of course, (3.4) can be written equivalently as the integral equation

$$M_t = 1 + \int_0^t M_s X_s \, dB_s. \qquad (3.5)$$

It follows that the Itô process $M_t$ (being a stochastic integral) is an $\mathcal{F}_t$-martingale (with respect to $P$). (Strictly speaking, (3.5) only makes $M_t$ a local martingale; a standard sufficient condition for it to be a true martingale is Novikov's condition $E_P\big[e^{\frac{1}{2}\int_0^T X_t^2 \, dt}\big] < \infty$; see [3].) In particular,

$$E_P[M_t] = E_P[M_0] = 1 \quad \text{for all } t \in [0, T]. \qquad (3.6)$$

Having $M_t$, and in view of (3.6), we introduce the probability measures on $(\Omega, \mathcal{F})$

$$Q(A) = E_P[M_T 1_A]; \qquad Q_t(A) = E_P[M_t 1_A], \quad 0 < t < T. \qquad (3.7)$$

Observe that for $0 \leq t \leq T$ we have $M_t = E_P[M_T \mid \mathcal{F}_t]$ (since $M_t$ is a martingale). Hence, as in (1.10), we have

$$Q_t(A) = Q(A) \quad \text{for all } A \in \mathcal{F}_t. \qquad (3.8)$$

Theorem 1 (CMG in $\mathbb{R}^1$). Let $(\Omega, \mathcal{F}, P)$, $B_t$, $X_t$, $\mathcal{F}_t$, and $T$ be as above. Set

$$W_t := B_t - \int_0^t X_s \, ds, \quad t \in [0, T]. \qquad (3.9)$$

Then, for any fixed $T > 0$, the process $W_t$, $0 \leq t \leq T$, is an $\mathcal{F}_t$-Brownian motion on $(\Omega, \mathcal{F}, Q)$ (i.e. with respect to $Q$).

Proof. We will prove the theorem with the help of the Lévy characterization of Brownian motion in $\mathbb{R}^1$: a continuous stochastic process $W = \{W_t\}$ on a probability space $(\Omega, \mathcal{F}, Q)$ is a one-dimensional Brownian motion if and only if (i) $W_t$ is a martingale with respect to $\mathcal{F}_t^W$ and (ii) $W_t^2 - t$ is a martingale with respect to $\mathcal{F}_t^W$. The proof of this characterization will not be given here (it can be found, e.g., in [3]). If we accept this characterization of Brownian motion, then, in order to prove Theorem 1, we only need to check that $W_t$ of (3.9) satisfies (i) and (ii).

Let us check (i). Set

$$K_t := M_t W_t, \qquad (3.10)$$

where $M_t$ is given by (3.2). Then Itô calculus, (3.4), and (3.9) give

$$dK_t = W_t \, dM_t + M_t \, dW_t + dW_t \, dM_t = \Big(B_t - \int_0^t X_s \, ds\Big) M_t X_t \, dB_t + M_t (dB_t - X_t \, dt) + M_t X_t \, dt = \Big[\Big(B_t - \int_0^t X_s \, ds\Big) M_t X_t + M_t\Big] \, dB_t, \qquad (3.11)$$

which implies that $K_t = M_t W_t$ is an $\mathcal{F}_t$-martingale with respect to $P$. Now, for $0 \leq s \leq t \leq T$, by (3.8) and Proposition 1 we get

$$E_Q[W_t \mid \mathcal{F}_s] = E_{Q_t}[W_t \mid \mathcal{F}_s] = \frac{E_P[M_t W_t \mid \mathcal{F}_s]}{E_P[M_t \mid \mathcal{F}_s]} = \frac{M_s W_s}{M_s} = W_s \quad Q\text{-a.s.},$$

which says that $W_t$ is an $\mathcal{F}_t$-martingale with respect to $Q$. Checking (ii), namely that $W_t^2 - t$ is an $\mathcal{F}_t$-martingale with respect to $Q$, can be done in a similar way and is left as an exercise (EXERCISE 1).

Remark. The quantity $M_t$ of (3.2) can be viewed (at least in the case where $X_t$ is a deterministic function of $t$) as a continuous analog of the quantity $e^{\sum_{j=1}^{n} \mu_j Z_j - \frac{1}{2}\sum_{j=1}^{n} \mu_j^2}$, which appears in (2.6). Likewise, by writing $B_t = \int_0^t dB_s$ (assuming $B_0 = 0$), we can view the quantity $W_t$ of (3.9) as a continuous analog of the quantity displayed in (2.8).
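The theorem can also be probed numerically. The following Monte Carlo sketch (a Python illustration assuming the constant integrand $X_t \equiv \mu$, so that (3.2) reduces to $M_T = e^{\mu B_T - \mu^2 T/2}$, simulated on a time grid; not part of the original notes) reweights simulated paths by $M_T$ and checks the two Lévy moments of $W_t = B_t - \mu t$ under $Q$.

    import numpy as np

    rng = np.random.default_rng(2)
    n_paths, n_steps, T, mu = 100_000, 100, 1.0, 0.8
    dt = T / n_steps
    t = np.linspace(dt, T, n_steps)

    dB = rng.standard_normal((n_paths, n_steps)) * np.sqrt(dt)
    B = np.cumsum(dB, axis=1)                 # Brownian paths under P

    # Constant integrand X_t = mu, so (3.2) gives M_T = exp(mu*B_T - mu^2*T/2).
    M_T = np.exp(mu * B[:, -1] - 0.5 * mu**2 * T)

    W = B - mu * t              # W_t = B_t - int_0^t X_s ds, cf. (3.9)

    # Under Q, expectations are M_T-weighted P-expectations, cf. (3.7) and (1.5):
    # E_Q[W_t] should be close to 0 and E_Q[W_t^2] close to t for every t.
    w_mean = (M_T[:, None] * W).mean(axis=0)
    w_var = (M_T[:, None] * W**2).mean(axis=0) - w_mean**2
    print(np.abs(w_mean).max(), np.abs(w_var - t).max())   # both close to 0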

Theorem 2 (CMG in $\mathbb{R}^d$). Let $\{B(t)\}_{t \geq 0}$ be a $d$-dimensional Brownian motion on $(\Omega, \mathcal{F}, P)$, with associated filtration $\{\mathcal{F}_t\}_{t \geq 0}$. Also, let $\{X(t)\}_{t \geq 0}$ be a stochastic process with values in $\mathbb{R}^d$, adapted to the filtration $\{\mathcal{F}_t\}_{t \geq 0}$. We assume that for some constant $T > 0$ we have

$$\int_0^T |X_t|^2 \, dt < \infty \quad P\text{-a.s.} \qquad (3.12)$$

Set

$$M_t := e^{Y_t}, \quad \text{where} \quad Y_t := -\frac{1}{2} \int_0^t |X_s|^2 \, ds + \int_0^t X_s \cdot dB_s \qquad (3.13)$$

(notice that $M_t > 0$ $P$-a.s.). It is not hard to see that $M_t$ is an $\mathcal{F}_t$-martingale (with respect to $P$) and $E_P[M_t] = 1$ for all $t \in [0, T]$ (EXERCISE 2). If $Q$ is the probability measure on $(\Omega, \mathcal{F})$ defined by

$$Q(A) = E_P[M_T 1_A], \qquad (3.14)$$

then, for any fixed $T > 0$, the process

$$W_t := B_t - \int_0^t X_s \, ds, \quad t \in [0, T] \qquad (3.15)$$

is a $d$-dimensional $\mathcal{F}_t$-Brownian motion on $(\Omega, \mathcal{F}, Q)$.

Proof. The proof is similar to the proof of Theorem 1 and is left as an exercise (EXERCISE 3). You may use (without proof) the Lévy characterization of Brownian motion in $\mathbb{R}^d$ [3]: a continuous stochastic process $W = \{W(t)\}$ on a probability space $(\Omega, \mathcal{F}, Q)$ is a $d$-dimensional Brownian motion if and only if (i) $W(t)$ is a martingale with respect to $\mathcal{F}_t^W$ (meaning that each component $W_j(t)$, $j = 1, \ldots, d$, is a martingale) and (ii) for any $j, k \in \{1, \ldots, d\}$ we have that $W_j(t) W_k(t) - \delta_{jk} t$ is a martingale with respect to $\mathcal{F}_t^W$ (here $\delta_{jk}$ is the Kronecker delta).

References

[1] R.H. Cameron and W.T. Martin, Transformation of Wiener integrals under translations, Annals of Mathematics, 45, 386-396, 1944.

[2] R. Durrett, Brownian Motion and Martingales in Analysis, Wadsworth, Belmont, CA, 1984.

[3] I. Karatzas and S.E. Shreve, Brownian Motion and Stochastic Calculus, Second Edition, Springer, New York, 1991.

[4] B. Øksendal, Stochastic Differential Equations: An Introduction with Applications, Sixth Edition, Springer-Verlag, Berlin, 2003.