Lecture 22: Girsanov's Theorem


Course: Theory of Probability II
Term: Spring 2015
Instructor: Gordan Zitkovic
Last Updated: May 5, 2015

An example

Consider a finite Gaussian random walk
$$X_n = \sum_{k=1}^{n} \xi_k,\quad n = 1,\dots,N,$$
where $\xi_1,\dots,\xi_N$ are independent $N(0,1)$ random variables. The random vector $(X_1,\dots,X_N)$ is then, itself, Gaussian, and admits the density
$$f(x_1,\dots,x_N) = C_N\, e^{-\frac{1}{2}\left(x_1^2 + (x_2-x_1)^2 + \dots + (x_N-x_{N-1})^2\right)}$$
with respect to the Lebesgue measure on $\mathbb{R}^N$, for some $C_N > 0$. Let us now repeat the whole construction, with the $n$-th step having the $N(\mu_n,1)$-distribution, for some $\mu_1,\dots,\mu_N \in \mathbb{R}$. The resulting Gaussian distribution still admits a density with respect to the Lebesgue measure, and it is given by
$$f^{\mu}(x_1,\dots,x_N) = C_N\, e^{-\frac{1}{2}\left((x_1-\mu_1)^2 + (x_2-x_1-\mu_2)^2 + \dots + (x_N-x_{N-1}-\mu_N)^2\right)}.$$
The two densities are everywhere positive, so the two Gaussian measures are equivalent to each other, and the Radon-Nikodym derivative turns out to be
$$\frac{dQ}{dP} = \frac{f^{\mu}(X_1,\dots,X_N)}{f(X_1,\dots,X_N)} = e^{\mu_1 X_1 + \mu_2 (X_2 - X_1) + \dots + \mu_N (X_N - X_{N-1}) - \frac{1}{2}\left(\mu_1^2 + \mu_2^2 + \dots + \mu_N^2\right)} = e^{\sum_{k=1}^{N} \mu_k (X_k - X_{k-1}) - \frac{1}{2}\sum_{k=1}^{N} \mu_k^2}.$$

Equivalent measure changes

Let $Q$ be a probability measure on $\mathcal{F}$, equivalent to $P$, i.e., for $A \in \mathcal{F}$, $P[A] = 0$ if and only if $Q[A] = 0$. Its Radon-Nikodym derivative
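As a quick sanity check of the discrete example above, one can verify by Monte Carlo that reweighting samples drawn under $P$ by $\frac{dQ}{dP}$ reproduces expectations under $Q$. This is an added illustrative sketch, not part of the original notes; the drift vector, sample size, and seed are arbitrary choices.

```python
import numpy as np

# Monte Carlo check of the discrete Girsanov example: reweighting P-samples by
# Z = dQ/dP = exp(sum_k mu_k (X_k - X_{k-1}) - (1/2) sum_k mu_k^2)
# should reproduce Q-expectations, e.g. E_Q[X_N] = mu_1 + ... + mu_N.
rng = np.random.default_rng(0)
mu = np.array([0.1, 0.2, 0.3])              # arbitrary drifts mu_1, ..., mu_N
N, n_samples = len(mu), 400_000

xi = rng.standard_normal((n_samples, N))    # steps X_k - X_{k-1} under P: N(0,1)
X_N = xi.sum(axis=1)                        # terminal value of the random walk
Z = np.exp(xi @ mu - 0.5 * np.sum(mu**2))   # Radon-Nikodym derivative dQ/dP

print(Z.mean())            # should be close to E_P[Z] = 1
print((Z * X_N).mean())    # should be close to E_Q[X_N] = sum(mu) = 0.6
```

Both estimates carry Monte Carlo error of order $n^{-1/2}$, so only approximate agreement is expected.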

$Z = \frac{dQ}{dP}$ is a non-negative random variable in $\mathbb{L}^1$ with $E[Z] = 1$. The uniformly-integrable martingale
$$Z_t = E[Z\,|\,\mathcal{F}_t],\quad t \geq 0,$$
is called the density of $Q$ with respect to $P$ (note that we can - and do - assume that $\{Z_t\}_{t\in[0,\infty)}$ is càdlàg). We will often use the shortcut $Q$-(local, semi-, etc.) martingale for a process which is a (local, semi-, etc.) martingale for $(\Omega, \mathcal{F}, \{\mathcal{F}_t\}_{t\in[0,\infty)}, Q)$.

Proposition 22.1. Let $\{X_t\}_{t\in[0,\infty)}$ be a càdlàg and adapted process. Then $X$ is a $Q$-local martingale if and only if the product $\{Z_t X_t\}_{t\in[0,\infty)}$ is a càdlàg $P$-local martingale.

Before we give a proof, here is a simple and useful lemma. Since the measures involved are equivalent, we are free to use the phrase "almost surely" without explicit mention of the probability.

Lemma 22.2. Let $(\Omega, \mathcal{H}, P)$ be a probability space, and let $\mathcal{G} \subseteq \mathcal{H}$ be a sub-$\sigma$-algebra of $\mathcal{H}$. Given a probability measure $Q$ on $\mathcal{H}$, equivalent to $P$, let $Z = \frac{dQ}{dP}$ be its Radon-Nikodym derivative with respect to $P$. For a random variable $X \in \mathbb{L}^1(\mathcal{H}, Q)$ we have $XZ \in \mathbb{L}^1(P)$ and
$$E^Q[X\,|\,\mathcal{G}] = \frac{E[XZ\,|\,\mathcal{G}]}{E[Z\,|\,\mathcal{G}]},\ \text{a.s.},$$
where $E^Q[\,\cdot\,|\,\mathcal{G}]$ denotes the conditional expectation on $(\Omega, \mathcal{H}, Q)$.

Proof. First of all, note that the Radon-Nikodym theorem implies that $XZ \in \mathbb{L}^1(P)$ and that the set $\{E[Z\,|\,\mathcal{G}] = 0\}$ has $Q$-probability $0$ (and, therefore, $P$-probability $0$). Indeed,
$$Q\big[E[Z\,|\,\mathcal{G}] = 0\big] = E^Q[\mathbf{1}_{\{E[Z|\mathcal{G}]=0\}}] = E[Z \mathbf{1}_{\{E[Z|\mathcal{G}]=0\}}] = E\big[E[Z \mathbf{1}_{\{E[Z|\mathcal{G}]=0\}}\,|\,\mathcal{G}]\big] = E\big[\mathbf{1}_{\{E[Z|\mathcal{G}]=0\}} E[Z\,|\,\mathcal{G}]\big] = 0.$$
Therefore, the expression on the right-hand side is well-defined almost surely, and is clearly $\mathcal{G}$-measurable. Next, we pick $A \in \mathcal{G}$, observe that
$$E^Q\Big[\mathbf{1}_A \tfrac{E[XZ|\mathcal{G}]}{E[Z|\mathcal{G}]}\Big] = E\Big[Z \mathbf{1}_A \tfrac{E[XZ|\mathcal{G}]}{E[Z|\mathcal{G}]}\Big] = E\Big[E[Z\,|\,\mathcal{G}]\, \mathbf{1}_A \tfrac{E[XZ|\mathcal{G}]}{E[Z|\mathcal{G}]}\Big] = E\big[E[ZX \mathbf{1}_A\,|\,\mathcal{G}]\big] = E^Q[X \mathbf{1}_A],$$
and remember the definition of conditional expectation.

Proof of Proposition 22.1. Suppose, first, that $X$ is a $Q$-martingale. Then $E^Q[X_t\,|\,\mathcal{F}_s] = X_s$, $Q$-a.s. By the tower property of conditional expectation, the random variable $Z_t$ is the Radon-Nikodym derivative of (the
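Lemma 22.2 (the "abstract Bayes rule") can be verified exactly on a finite probability space, where a conditional expectation is just a weighted average over each atom of $\mathcal{G}$. The space, the measures, and the random variable below are hypothetical values chosen for illustration; they are not from the notes.

```python
from fractions import Fraction as F

# Exact verification of Lemma 22.2 on the finite space Omega = {0, 1, 2, 3},
# with G generated by the partition {0, 1} | {2, 3}.
P = [F(1, 4)] * 4                         # uniform reference measure P
Q = [F(1, 8), F(1, 8), F(3, 8), F(3, 8)]  # an equivalent measure Q
Z = [q / p for q, p in zip(Q, P)]         # Z = dQ/dP, computed pointwise
X = [1, 2, 3, 4]                          # an integrable random variable

for block in [[0, 1], [2, 3]]:            # the atoms of G
    # on an atom, a conditional expectation is a weighted average
    lhs = sum(Q[w] * X[w] for w in block) / sum(Q[w] for w in block)         # E_Q[X|G]
    exz = sum(P[w] * X[w] * Z[w] for w in block) / sum(P[w] for w in block)  # E[XZ|G]
    ez = sum(P[w] * Z[w] for w in block) / sum(P[w] for w in block)          # E[Z|G]
    assert lhs == exz / ez                # the abstract Bayes rule, exactly
print("Lemma 22.2 holds on every atom of G")
```

Using `Fraction` makes the identity hold exactly, with no floating-point tolerance needed.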

restriction of) $Q$ with respect to (the restriction of) $P$ on the probability space $(\Omega, \mathcal{F}_t, P)$ (prove this yourself!). Therefore, we can use Lemma 22.2 with $\mathcal{F}_t$ playing the role of $\mathcal{H}$ and $\mathcal{F}_s$ the role of $\mathcal{G}$, and rewrite the $Q$-martingale property of $X$ as
$$\tfrac{1}{Z_s}\, E[X_t Z_t\,|\,\mathcal{F}_s] = X_s,\ Q\text{-a.s.},\quad \text{i.e.,}\quad E[X_t Z_t\,|\,\mathcal{F}_s] = Z_s X_s,\ P\text{-a.s.} \tag{22.1}$$
We leave the other direction, as well as the case of a local martingale, to the reader.

Proposition 22.3. Suppose that the density process $\{Z_t\}_{t\in[0,\infty)}$ is continuous. Let $X$ be a continuous semimartingale under $P$ with decomposition $X = X_0 + M + A$. Then $X$ is also a $Q$-semimartingale, and its $Q$-semimartingale decomposition is given by $X = X_0 + N + B$, where
$$N = M - F,\quad B = A + F,\quad \text{where } F_t = \int_0^t \tfrac{1}{Z_u}\, d\langle M, Z\rangle_u.$$

Proof. The process $F$ is clearly well-defined, continuous, adapted and of finite variation, so it will be enough to show that $M - F$ is a $Q$-local martingale. Using Proposition 22.1, we only need to show that $Y = Z(M - F)$ is a $P$-local martingale. By Itô's formula (integration-by-parts), the finite-variation part of $Y$ is given by
$$-\int_0^t Z_u\, dF_u + \langle Z, M\rangle_t,$$
and it is easily seen to vanish using the associative property of Stieltjes integration.

One of the most important applications of the above result is to the case of a Brownian motion.

[Marginal figure: a cloud of simulated Brownian paths on $[0,3]$.]

Theorem 22.4 (Girsanov; Cameron and Martin). Suppose that the filtration $\{\mathcal{F}_t\}_{t\in[0,\infty)}$ is the usual augmentation of the natural filtration generated by a Brownian motion $\{B_t\}_{t\in[0,\infty)}$.

1. Let $Q \sim P$ be a probability measure on $\mathcal{F}$ and let $\{Z_t\}_{t\in[0,\infty)}$ be the corresponding density process, i.e., $Z_t = E[\tfrac{dQ}{dP}\,|\,\mathcal{F}_t]$. Then there exists a predictable process $\{\theta_t\}_{t\in[0,\infty)}$ in $L(B)$ such that $Z = \mathcal{E}(\int \theta_u\, dB_u)$ and
$$B_t - \int_0^t \theta_u\, du \text{ is a } Q\text{-Brownian motion.}$$

2. Conversely, let $\{\theta_t\}_{t\in[0,\infty)} \in L(B)$ have the property that the process $Z = \mathcal{E}(\int \theta_u\, dB_u)$ is a uniformly-integrable martingale with $Z_\infty > 0$, a.s.
For any probability measure $Q \sim P$ such that $E[\tfrac{dQ}{dP}\,|\,\mathcal{F}_\infty] = Z_\infty$,
$$B_t - \int_0^t \theta_u\, du,\quad t \geq 0,$$
is a $Q$-Brownian motion.

[Marginal figure: the same cloud of paths, with darker-colored paths corresponding to higher values of the Radon-Nikodym derivative $Z_3$.]
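Before turning to the proof, here is a small numerical illustration of part 2 (an added sketch, not in the original notes). Take the constant drift $\theta \equiv \mu$ on a finite horizon, so that $Z_T = \mathcal{E}(\int \mu\, dB_u)_T = \exp(\mu B_T - \tfrac{1}{2}\mu^2 T)$. Reweighting by $Z_T$ should preserve total mass and shift the mean of $B_T$ from $0$ to $\mu T = \int_0^T \theta_u\, du$; the quadrature below checks both facts, with arbitrary illustration values $\mu = 0.5$, $T = 2$.

```python
import math

# Quadrature check of the measure change on [0, T]: with B_T ~ N(0, T) under P
# and Z_T = exp(mu*B_T - mu^2*T/2), we should find E_P[Z_T] = 1 and
# E_P[Z_T * B_T] = mu*T, i.e. B_T acquires mean mu*T under the new measure.
mu, T = 0.5, 2.0

def gauss_expect(g, var, n=200_000, halfwidth=12.0):
    """Trapezoid-rule approximation of E[g(B)] for B ~ N(0, var)."""
    s = math.sqrt(var)
    lo, hi = -halfwidth * s, halfwidth * s
    h = (hi - lo) / n
    total = 0.0
    for i in range(n + 1):
        b = lo + i * h
        w = 1.0 if 0 < i < n else 0.5      # trapezoid endpoint weights
        total += w * g(b) * math.exp(-b * b / (2 * var))
    return total * h / math.sqrt(2 * math.pi * var)

Z = lambda b: math.exp(mu * b - 0.5 * mu * mu * T)
print(gauss_expect(Z, T))                    # E_P[Z_T], theoretically 1
print(gauss_expect(lambda b: Z(b) * b, T))   # E_P[Z_T * B_T], theoretically mu*T
```

The grid is wide and fine enough that both values agree with the theory to many decimal places.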

Proof. 1. We start with an application of the martingale representation theorem (Proposition 21.6). It implies that there exists a process $\rho \in L(B)$ such that
$$Z_t = 1 + \int_0^t \rho_u\, dB_u.$$
Since $Z$ is continuous and bounded away from zero on each segment, the process $\{\theta_t\}_{t\in[0,\infty)}$, given by $\theta_t = \rho_t / Z_t$, is in $L(B)$ and we have
$$Z_t = 1 + \int_0^t Z_u \theta_u\, dB_u.$$
Hence, $Z = \mathcal{E}(\int \theta_u\, dB_u)$. Proposition 22.3 states that $B$ is a $Q$-semimartingale with decomposition $B = (B - F) + F$, where the continuous FV-process $F$ is given by
$$F_t = \int_0^t \tfrac{1}{Z_u}\, d\langle B, Z\rangle_u = \int_0^t \tfrac{1}{Z_u} Z_u \theta_u\, du = \int_0^t \theta_u\, du.$$
In particular, $B - F$ is a $Q$-local martingale. On the other hand, its quadratic variation (as a limit in $P$-, and therefore in $Q$-probability) is that of $B$, so, by Lévy's characterization, $B - F$ is a $Q$-Brownian motion.

2. We only need to realize that any measure $Q \sim P$ with $E[\tfrac{dQ}{dP}\,|\,\mathcal{F}_\infty] = Z_\infty$ will have $Z$ as its density process. The rest follows from (1).

Even though we stated it on $[0,\infty)$, most applications of Girsanov's theorem are on finite intervals $[0,T]$, with $T > 0$. The reason is that the condition that $\mathcal{E}(\int \theta_u\, dB_u)$ be uniformly integrable on the entire $[0,\infty)$ is either hard to check or simply not satisfied for most practically relevant $\theta$. The simplest conceivable example, $\theta_t = \mu$ for all $t \geq 0$ and some $\mu \in \mathbb{R}\setminus\{0\}$, gives rise to the exponential martingale
$$Z_t = e^{\mu B_t - \frac{1}{2}\mu^2 t},$$
which is not uniformly integrable on $[0,\infty)$ (why?). On any finite horizon $[0,T]$, the (deterministic) process $\mu \mathbf{1}_{\{t \leq T\}}$ satisfies the conditions of Girsanov's theorem, and there exists a probability measure $P^{\mu,T}$ on $\mathcal{F}_T$ with the property that $\hat{B}_t = B_t - \mu t$ is a $P^{\mu,T}$-Brownian motion on $[0,T]$. It is clear, furthermore, that for $T_1 < T_2$, $P^{\mu,T_1}$ coincides with the restriction of $P^{\mu,T_2}$ onto $\mathcal{F}_{T_1}$. Our life would be easier if this consistency property could be extended all the way up to $\mathcal{F}_\infty$. It can be shown that this can, indeed, be done in the canonical setting, but not in the same equivalence class.
Indeed, suppose that there exists a probability measure $P^\mu$ on $\mathcal{F}$, equivalent to $P$, such that $P^\mu$, restricted to $\mathcal{F}_T$, coincides with $P^{\mu,T}$, for each $T > 0$. Let $\{Z_t\}_{t\in[0,\infty)}$ be the density process of $P^\mu$ with respect to $P$. It follows that
$$Z_T = \exp(\mu B_T - \tfrac{1}{2}\mu^2 T),\quad \text{for all } T \geq 0.$$

Since $\mu \neq 0$, we have $\mu B_T - \tfrac{1}{2}\mu^2 T \to -\infty$, a.s., as $T \to \infty$ and, so, $Z_\infty = \lim_{T\to\infty} Z_T = 0$, a.s. On the other hand, $Z_\infty$ is the Radon-Nikodym derivative of $P^\mu$ with respect to $P$ on $\mathcal{F}$, and we conclude that $P^\mu$ must, in fact, be singular with respect to $P$.

Here is a slightly different perspective on the fact that $P$ and $P^\mu$ must be mutually singular: for the event $A \in \mathcal{F}$, given by
$$A = \Big\{ \lim_{t\to\infty} \tfrac{B_t}{t} = 0 \Big\},$$
we have $P[A] = 1$, by the Law of Large Numbers for the Brownian motion. On the other hand, with $\hat{B}_t$ being a $P^\mu$-Brownian motion, we have
$$P^\mu[A] = P^\mu\Big[\lim_{t\to\infty} \tfrac{B_t}{t} = 0\Big] = P^\mu\Big[\lim_{t\to\infty} \tfrac{\hat{B}_t}{t} = -\mu\Big] = 0,$$
because $\hat{B}_t / t \to 0$, $P^\mu$-a.s.

Not everything is lost, though, as we can still employ Girsanov's theorem in many practical situations. Here is one (where we take $P[X \in dx] = f(x)\, dx$ to mean that $f(x)$ is the density of the distribution of $X$).

Example 22.5 (Hitting times of the Brownian motion with drift). Define $\tau_a = \inf\{t \geq 0 : B_t = a\}$, for $a > 0$. By the formula derived in a homework, we have
$$P[\tau_a \in dt] = \tfrac{a}{\sqrt{2\pi t^3}}\, e^{-\frac{a^2}{2t}}\, dt.$$
For $T \geq 0$, (the restriction of) $P$ and $P^{\mu,T}$ are equivalent on $\mathcal{F}_T$, with the Radon-Nikodym derivative
$$\tfrac{dP^{\mu,T}}{dP} = \exp(\mu B_T - \tfrac{1}{2}\mu^2 T).$$
The optional sampling theorem (justified by the uniform integrability of the martingale $\exp(\mu B_t - \tfrac{1}{2}\mu^2 t)$ on $[0,T]$) and the fact that $\{\tau_a \leq T\} \in \mathcal{F}_{\tau_a \wedge T} \subseteq \mathcal{F}_T$ imply that
$$E\big[\exp(\mu B_T - \tfrac{1}{2}\mu^2 T)\,\big|\,\mathcal{F}_{\tau_a \wedge T}\big] = \exp\big(\mu B_{\tau_a \wedge T} - \tfrac{1}{2}\mu^2 (\tau_a \wedge T)\big).$$
Therefore,
$$\begin{aligned}
P^{\mu,T}[\tau_a \leq T] &= E^{\mu,T}[\mathbf{1}_{\{\tau_a \leq T\}}] = E\big[\exp(\mu B_T - \tfrac{1}{2}\mu^2 T)\mathbf{1}_{\{\tau_a \leq T\}}\big] \\
&= E\big[\exp(\mu B_{\tau_a \wedge T} - \tfrac{1}{2}\mu^2 (\tau_a \wedge T))\mathbf{1}_{\{\tau_a \leq T\}}\big] = E\big[\exp(\mu B_{\tau_a} - \tfrac{1}{2}\mu^2 \tau_a)\mathbf{1}_{\{\tau_a \leq T\}}\big] \\
&= E\big[\exp(\mu a - \tfrac{1}{2}\mu^2 \tau_a)\mathbf{1}_{\{\tau_a \leq T\}}\big] = \int_0^T e^{\mu a - \frac{1}{2}\mu^2 t}\, P[\tau_a \in dt] \\
&= \int_0^T e^{\mu a - \frac{1}{2}\mu^2 t}\, \tfrac{a}{\sqrt{2\pi t^3}}\, e^{-\frac{a^2}{2t}}\, dt.
\end{aligned}$$
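The homework formula for the law of $\tau_a$ quoted above can be cross-checked numerically against the reflection-principle identity $P[\tau_a \leq T] = 2 P[B_T > a]$ (a standard fact, not derived in this lecture). The values of $a$ and $T$ below are arbitrary test choices.

```python
import math

# Cross-check of P[tau_a in dt] = a / sqrt(2 pi t^3) * exp(-a^2/(2t)) dt:
# its integral over (0, T] must agree with the reflection-principle value
# P[tau_a <= T] = 2 P[B_T > a] = 2 (1 - Phi(a / sqrt(T))).
a, T = 1.0, 5.0

def density(t):
    return a / math.sqrt(2 * math.pi * t**3) * math.exp(-a * a / (2 * t))

n = 500_000                  # trapezoid rule on (0, T]; density(t) -> 0 as t -> 0+
h = T / n
integral = (sum(density(i * h) for i in range(1, n)) + 0.5 * density(T)) * h

Phi = lambda x: 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))
print(integral, 2.0 * (1.0 - Phi(a / math.sqrt(T))))   # the two should agree
```

The density vanishes extremely fast as $t \to 0+$, so the trapezoid rule needs no special treatment at the left endpoint.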

On the other hand, $\{B_t - \mu t\}_{t\in[0,T]}$ is a Brownian motion under $P^{\mu,T}$, so $P^{\mu,T}[\tau_a \leq T] = P[\hat{\tau}_a \leq T]$, where $\hat{\tau}_a$ is the first hitting time of the level $a$ for the Brownian motion with drift $\mu$. It follows immediately that the "density" of $\hat{\tau}_a$ is given by
$$P[\hat{\tau}_a \in dt] = \tfrac{a}{\sqrt{2\pi t^3}}\, e^{-\frac{(a-\mu t)^2}{2t}}\, dt.$$
We quote the word "density" because, if one tries to integrate it over all $t$, one gets
$$P[\hat{\tau}_a < \infty] = \int_0^\infty \tfrac{a}{\sqrt{2\pi t^3}}\, e^{-\frac{(a-\mu t)^2}{2t}}\, dt = \exp(\mu a - |\mu a|).$$
In words, if $\mu$ and $a$ have the same sign, the Brownian motion with drift $\mu$ will hit $a$ sooner or later. On the other hand, if they differ in sign, the probability that it will never get there is strictly positive and equal to $1 - e^{2\mu a}$.

Kazamaki's and Novikov's criteria

The message of the second part of Theorem 22.4 is that, given a drift process $\{\theta_t\}_{t\in[0,\infty)}$, we can turn a Brownian motion into a Brownian motion with drift $\theta$, provided, essentially, that a certain exponential martingale is a UI martingale. Even though useful sufficient conditions for the martingale property of stochastic integrals are known, the situation is much less pleasant in the case of stochastic exponentials. The most well-known criterion is the one of Novikov. Novikov's criterion is, in turn, implied by a slightly stronger criterion of Kazamaki. We start with an auxiliary integrability result. In addition to the role it plays in the proof of Kazamaki's criterion, it is useful when one needs $\mathcal{E}(M)$ to be a little more than just a martingale.

Lemma 22.6. Let $\mathcal{E}(M)$ be the stochastic exponential of $M \in \mathcal{M}^{loc,c}$. If
$$\sup_{\tau \in \mathcal{S}_b} E[e^{a M_\tau}] < \infty,\quad \text{for some constant } a > \tfrac{1}{2},$$
where the supremum is taken over the set $\mathcal{S}_b$ of all finite-valued stopping times $\tau$, then $\mathcal{E}(M)$ is an $\mathbb{L}^p$-bounded martingale for $p = \frac{4a^2}{4a-1} \in (1,\infty)$.

Proof. We pick a finite stopping time $\tau$ and start from the following identity, which is valid for all constants $p, s > 0$:
$$\mathcal{E}(M)^p_\tau = \mathcal{E}\big(\sqrt{p/s}\, M\big)^s_\tau\, e^{(p - \sqrt{ps}) M_\tau}.$$

For $0 < s < 1$, we can use Hölder's inequality (note that $1/s$ and $1/(1-s)$ are conjugate exponents) to obtain
$$E[\mathcal{E}(M)^p_\tau] \leq \Big(E\big[\mathcal{E}\big(\sqrt{p/s}\, M\big)_\tau\big]\Big)^s \Big(E\big[\exp\big(\tfrac{p - \sqrt{ps}}{1-s}\, M_\tau\big)\big]\Big)^{1-s}. \tag{22.2}$$
The first term of the product is the $s$-th power of the expectation of a positive local martingale (and, therefore, supermartingale) sampled at a finite stopping time. By the optional sampling theorem it is always finite (actually, it is at most $1$). As for the second term, one can easily check that the expression $\frac{p - \sqrt{ps}}{1-s}$ attains its minimum in $s$ over $(0,1)$ at
$$s = 2p - 1 - 2\sqrt{p^2 - p},$$
and that this minimal value equals $f(p)$, where
$$f(p) = \tfrac{1}{2}\big(p + \sqrt{p^2 - p}\big).$$
If we pick $p = \frac{4a^2}{4a-1}$, then $f(p) = a$ and both terms on the right-hand side of (22.2) are bounded, uniformly in $\tau$, so that $\mathcal{E}(M)$ is in fact a martingale and bounded in $\mathbb{L}^p$ (why did we have to consider all stopping times $\tau$, and not only deterministic times?).

Proposition 22.7 (Kazamaki's criterion). Suppose that, for $M \in \mathcal{M}^{loc,c}$, we have
$$\sup_{\tau \in \mathcal{S}_b} E[e^{\frac{1}{2} M_\tau}] < \infty,$$
where the supremum is taken over the set $\mathcal{S}_b$ of all finite-valued stopping times. Then $\mathcal{E}(M)$ is a uniformly integrable martingale.

Proof. Note, first, that the function $x \mapsto \exp(\tfrac{1}{2} x)$ is a test function of uniform integrability, so that the local martingale $M$ is a uniformly integrable martingale and admits the last element $M_\infty$. For the continuous martingale $cM$, where $0 < c < 1$ is an arbitrary constant, Lemma 22.6 and the assumption imply that the local martingale $\mathcal{E}(cM)$ is, in fact, a martingale bounded in $\mathbb{L}^p$, for $p = \frac{1}{2c - c^2}$. In particular, it is uniformly integrable. Therefore,
$$\mathcal{E}(cM)_t = \exp(c M_t - \tfrac{1}{2}c^2 \langle M\rangle_t) = \mathcal{E}(M)^{c^2}_t\, e^{c(1-c) M_t}. \tag{22.3}$$
By letting $t \to \infty$ in (22.3), we conclude that $\mathcal{E}(M)$ has the last element $\mathcal{E}(M)_\infty$, and that the equality in (22.3) holds at $t = \infty$, as well. By Hölder's inequality with conjugate exponents $1/c^2$ and $1/(1-c^2)$, we have
$$1 = E[\mathcal{E}(cM)_\infty] \leq E[\mathcal{E}(M)_\infty]^{c^2}\, E\big[\exp\big(\tfrac{c}{1+c}\, M_\infty\big)\big]^{1-c^2}.$$
Jensen's inequality implies that
$$E\big[\exp\big(\tfrac{c}{1+c}\, M_\infty\big)\big] \leq E\big[\exp\big(\tfrac{1}{2} M_\infty\big)\big]^{\frac{2c}{1+c}},$$
and so
$$1 \leq E[\mathcal{E}(M)_\infty]^{c^2}\, E\big[\exp\big(\tfrac{1}{2} M_\infty\big)\big]^{2c(1-c)}.$$
We let $c \to 1$ to get $E[\mathcal{E}(M)_\infty] \geq 1$, which, together with the non-negative supermartingale property of $\mathcal{E}(M)$, implies that $\mathcal{E}(M)$ is a uniformly-integrable martingale.
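The exponent bookkeeping in Lemma 22.6 and in the proof above is easy to get wrong, so here is a small numerical check of the identities involved; it is an added sketch, and the test values of $a$ and $c$ are arbitrary.

```python
import math

# Checks: (i) g(s) = (p - sqrt(p s)) / (1 - s) is minimized over s in (0, 1)
#     at s* = 2p - 1 - 2 sqrt(p^2 - p), with minimal value
#     f(p) = (p + sqrt(p^2 - p)) / 2;
#     (ii) p = 4 a^2 / (4a - 1) solves f(p) = a for a > 1/2;
#     (iii) for a = 1/(2c), this p equals 1/(2c - c^2), as used for E(cM).
def f(p):
    return 0.5 * (p + math.sqrt(p * p - p))

for a in [0.51, 0.75, 1.0, 2.0, 10.0]:          # arbitrary values a > 1/2
    p = 4 * a * a / (4 * a - 1)
    s_star = 2 * p - 1 - 2 * math.sqrt(p * p - p)
    g = lambda u: (p - math.sqrt(p * u)) / (1 - u)
    assert p > 1 and 0 < s_star < 1
    assert abs(f(p) - a) < 1e-9                 # f(p) = a
    assert abs(g(s_star) - f(p)) < 1e-9         # the minimum is attained at s*
    assert all(g(s_star) <= g(k / 100) + 1e-12 for k in range(1, 100))

for c in [0.25, 0.5, 0.9]:                      # the substitution in Prop. 22.7
    a = 1 / (2 * c)
    assert abs(4 * a * a / (4 * a - 1) - 1 / (2 * c - c * c)) < 1e-12
print("all exponent identities check out")
```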

Theorem 22.8 (Novikov's criterion). If $M \in \mathcal{M}^{loc,c}$ is such that $E[e^{\frac{1}{2}\langle M\rangle_\infty}] < \infty$, then $\mathcal{E}(M)$ is a uniformly integrable martingale.

Proof. Since $e^{\frac{1}{2} M_\tau} = \mathcal{E}(M)^{1/2}_\tau\, e^{\frac{1}{4}\langle M\rangle_\tau}$, the Cauchy-Schwarz inequality implies that
$$E[e^{\frac{1}{2} M_\tau}] \leq E[\mathcal{E}(M)_\tau]^{1/2}\, E[e^{\frac{1}{2}\langle M\rangle_\tau}]^{1/2} \leq E[e^{\frac{1}{2}\langle M\rangle_\infty}]^{1/2},$$
and Kazamaki's criterion can be applied.
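As a worked instance of Novikov's criterion (an added illustration, not part of the original notes), consider the deterministic integrand $\theta_t = \mu \mathbf{1}_{\{t \leq T\}}$ used earlier to construct $P^{\mu,T}$. Here $M_t = \int_0^t \theta_u\, dB_u = \mu B_{t \wedge T}$, and

```latex
\langle M \rangle_\infty = \int_0^\infty \theta_u^2 \, du = \mu^2 T
\quad\Longrightarrow\quad
E\big[e^{\frac{1}{2}\langle M \rangle_\infty}\big] = e^{\frac{1}{2}\mu^2 T} < \infty .
```

Since the quadratic variation is deterministic and finite, the criterion applies trivially, and $\mathcal{E}(M)_t = e^{\mu B_{t\wedge T} - \frac{1}{2}\mu^2 (t\wedge T)}$ is a uniformly integrable martingale; this is exactly the justification needed for the measure $P^{\mu,T}$ of Example 22.5.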


More information

The Azéma-Yor Embedding in Non-Singular Diffusions

The Azéma-Yor Embedding in Non-Singular Diffusions Stochastic Process. Appl. Vol. 96, No. 2, 2001, 305-312 Research Report No. 406, 1999, Dept. Theoret. Statist. Aarhus The Azéma-Yor Embedding in Non-Singular Diffusions J. L. Pedersen and G. Peskir Let

More information

Optimal Stopping Problems and American Options

Optimal Stopping Problems and American Options Optimal Stopping Problems and American Options Nadia Uys A dissertation submitted to the Faculty of Science, University of the Witwatersrand, in fulfilment of the requirements for the degree of Master

More information

Harmonic Functions and Brownian motion

Harmonic Functions and Brownian motion Harmonic Functions and Brownian motion Steven P. Lalley April 25, 211 1 Dynkin s Formula Denote by W t = (W 1 t, W 2 t,..., W d t ) a standard d dimensional Wiener process on (Ω, F, P ), and let F = (F

More information

(A n + B n + 1) A n + B n

(A n + B n + 1) A n + B n 344 Problem Hints and Solutions Solution for Problem 2.10. To calculate E(M n+1 F n ), first note that M n+1 is equal to (A n +1)/(A n +B n +1) with probability M n = A n /(A n +B n ) and M n+1 equals

More information

16.1. Signal Process Observation Process The Filtering Problem Change of Measure

16.1. Signal Process Observation Process The Filtering Problem Change of Measure 1. Introduction The following notes aim to provide a very informal introduction to Stochastic Calculus, and especially to the Itô integral. They owe a great deal to Dan Crisan s Stochastic Calculus and

More information

Skorokhod Embeddings In Bounded Time

Skorokhod Embeddings In Bounded Time Skorokhod Embeddings In Bounded Time Stefan Ankirchner Institute for Applied Mathematics University of Bonn Endenicher Allee 6 D-535 Bonn ankirchner@hcm.uni-bonn.de Philipp Strack Bonn Graduate School

More information

Brownian Motion and Conditional Probability

Brownian Motion and Conditional Probability Math 561: Theory of Probability (Spring 2018) Week 10 Brownian Motion and Conditional Probability 10.1 Standard Brownian Motion (SBM) Brownian motion is a stochastic process with both practical and theoretical

More information

Exercises Measure Theoretic Probability

Exercises Measure Theoretic Probability Exercises Measure Theoretic Probability 2002-2003 Week 1 1. Prove the folloing statements. (a) The intersection of an arbitrary family of d-systems is again a d- system. (b) The intersection of an arbitrary

More information

Risk Measures in non-dominated Models

Risk Measures in non-dominated Models Purpose: Study Risk Measures taking into account the model uncertainty in mathematical finance. Plan 1 Non-dominated Models Model Uncertainty Fundamental topological Properties 2 Risk Measures on L p (c)

More information

SUPERMARTINGALES AS RADON-NIKODYM DENSITIES AND RELATED MEASURE EXTENSIONS

SUPERMARTINGALES AS RADON-NIKODYM DENSITIES AND RELATED MEASURE EXTENSIONS Submitted to the Annals of Probability SUPERMARTINGALES AS RADON-NIKODYM DENSITIES AND RELATED MEASURE EXTENSIONS By Nicolas Perkowski and Johannes Ruf Université Paris Dauphine and University College

More information

Supermartingales as Radon-Nikodym Densities and Related Measure Extensions

Supermartingales as Radon-Nikodym Densities and Related Measure Extensions Supermartingales as Radon-Nikodym Densities and Related Measure Extensions Nicolas Perkowski Institut für Mathematik Humboldt-Universität zu Berlin Johannes Ruf Oxford-Man Institute of Quantitative Finance

More information

r=1 I r of intervals I r should be the sum of their lengths:

r=1 I r of intervals I r should be the sum of their lengths: m3f33chiii Chapter III: MEASURE-THEORETIC PROBABILITY 1. Measure The language of option pricing involves that of probability, which in turn involves that of measure theory. This originated with Henri LEBESGUE

More information

Stochastic Integration and Continuous Time Models

Stochastic Integration and Continuous Time Models Chapter 3 Stochastic Integration and Continuous Time Models 3.1 Brownian Motion The single most important continuous time process in the construction of financial models is the Brownian motion process.

More information

Stochastic Calculus (Lecture #3)

Stochastic Calculus (Lecture #3) Stochastic Calculus (Lecture #3) Siegfried Hörmann Université libre de Bruxelles (ULB) Spring 2014 Outline of the course 1. Stochastic processes in continuous time. 2. Brownian motion. 3. Itô integral:

More information

ELEMENTS OF STOCHASTIC CALCULUS VIA REGULARISATION. A la mémoire de Paul-André Meyer

ELEMENTS OF STOCHASTIC CALCULUS VIA REGULARISATION. A la mémoire de Paul-André Meyer ELEMENTS OF STOCHASTIC CALCULUS VIA REGULARISATION A la mémoire de Paul-André Meyer Francesco Russo (1 and Pierre Vallois (2 (1 Université Paris 13 Institut Galilée, Mathématiques 99 avenue J.B. Clément

More information

Lecture 5. 1 Chung-Fuchs Theorem. Tel Aviv University Spring 2011

Lecture 5. 1 Chung-Fuchs Theorem. Tel Aviv University Spring 2011 Random Walks and Brownian Motion Tel Aviv University Spring 20 Instructor: Ron Peled Lecture 5 Lecture date: Feb 28, 20 Scribe: Yishai Kohn In today's lecture we return to the Chung-Fuchs theorem regarding

More information

MARTINGALES: VARIANCE. EY = 0 Var (Y ) = σ 2

MARTINGALES: VARIANCE. EY = 0 Var (Y ) = σ 2 MARTINGALES: VARIANCE SIMPLEST CASE: X t = Y 1 +... + Y t IID SUM V t = X 2 t σ 2 t IS A MG: EY = 0 Var (Y ) = σ 2 V t+1 = (X t + Y t+1 ) 2 σ 2 (t + 1) = V t + 2X t Y t+1 + Y 2 t+1 σ 2 E(V t+1 F t ) =

More information

Optimal Stopping under Adverse Nonlinear Expectation and Related Games

Optimal Stopping under Adverse Nonlinear Expectation and Related Games Optimal Stopping under Adverse Nonlinear Expectation and Related Games Marcel Nutz Jianfeng Zhang First version: December 7, 2012. This version: July 21, 2014 Abstract We study the existence of optimal

More information

Poisson random measure: motivation

Poisson random measure: motivation : motivation The Lévy measure provides the expected number of jumps by time unit, i.e. in a time interval of the form: [t, t + 1], and of a certain size Example: ν([1, )) is the expected number of jumps

More information

δ xj β n = 1 n Theorem 1.1. The sequence {P n } satisfies a large deviation principle on M(X) with the rate function I(β) given by

δ xj β n = 1 n Theorem 1.1. The sequence {P n } satisfies a large deviation principle on M(X) with the rate function I(β) given by . Sanov s Theorem Here we consider a sequence of i.i.d. random variables with values in some complete separable metric space X with a common distribution α. Then the sample distribution β n = n maps X

More information

On the martingales obtained by an extension due to Saisho, Tanemura and Yor of Pitman s theorem

On the martingales obtained by an extension due to Saisho, Tanemura and Yor of Pitman s theorem On the martingales obtained by an extension due to Saisho, Tanemura and Yor of Pitman s theorem Koichiro TAKAOKA Dept of Applied Physics, Tokyo Institute of Technology Abstract M Yor constructed a family

More information

4. Conditional risk measures and their robust representation

4. Conditional risk measures and their robust representation 4. Conditional risk measures and their robust representation We consider a discrete-time information structure given by a filtration (F t ) t=0,...,t on our probability space (Ω, F, P ). The time horizon

More information

Supplement A: Construction and properties of a reected diusion

Supplement A: Construction and properties of a reected diusion Supplement A: Construction and properties of a reected diusion Jakub Chorowski 1 Construction Assumption 1. For given constants < d < D let the pair (σ, b) Θ, where Θ : Θ(d, D) {(σ, b) C 1 (, 1]) C 1 (,

More information

MTH 404: Measure and Integration

MTH 404: Measure and Integration MTH 404: Measure and Integration Semester 2, 2012-2013 Dr. Prahlad Vaidyanathan Contents I. Introduction....................................... 3 1. Motivation................................... 3 2. The

More information

Selected Exercises on Expectations and Some Probability Inequalities

Selected Exercises on Expectations and Some Probability Inequalities Selected Exercises on Expectations and Some Probability Inequalities # If E(X 2 ) = and E X a > 0, then P( X λa) ( λ) 2 a 2 for 0 < λ

More information

Solutions to the Exercises in Stochastic Analysis

Solutions to the Exercises in Stochastic Analysis Solutions to the Exercises in Stochastic Analysis Lecturer: Xue-Mei Li 1 Problem Sheet 1 In these solution I avoid using conditional expectations. But do try to give alternative proofs once we learnt conditional

More information

MATH 6605: SUMMARY LECTURE NOTES

MATH 6605: SUMMARY LECTURE NOTES MATH 6605: SUMMARY LECTURE NOTES These notes summarize the lectures on weak convergence of stochastic processes. If you see any typos, please let me know. 1. Construction of Stochastic rocesses A stochastic

More information

ELEMENTS OF PROBABILITY THEORY

ELEMENTS OF PROBABILITY THEORY ELEMENTS OF PROBABILITY THEORY Elements of Probability Theory A collection of subsets of a set Ω is called a σ algebra if it contains Ω and is closed under the operations of taking complements and countable

More information

arxiv: v1 [math.pr] 24 Sep 2018

arxiv: v1 [math.pr] 24 Sep 2018 A short note on Anticipative portfolio optimization B. D Auria a,b,1,, J.-A. Salmerón a,1 a Dpto. Estadística, Universidad Carlos III de Madrid. Avda. de la Universidad 3, 8911, Leganés (Madrid Spain b

More information

FOUNDATIONS OF MARTINGALE THEORY AND STOCHASTIC CALCULUS FROM A FINANCE PERSPECTIVE

FOUNDATIONS OF MARTINGALE THEORY AND STOCHASTIC CALCULUS FROM A FINANCE PERSPECTIVE FOUNDATIONS OF MARTINGALE THEORY AND STOCHASTIC CALCULUS FROM A FINANCE PERSPECTIVE JOSEF TEICHMANN 1. Introduction The language of mathematical Finance allows to express many results of martingale theory

More information

Stochastic Analysis I S.Kotani April 2006

Stochastic Analysis I S.Kotani April 2006 Stochastic Analysis I S.Kotani April 6 To describe time evolution of randomly developing phenomena such as motion of particles in random media, variation of stock prices and so on, we have to treat stochastic

More information

ONLINE APPENDIX TO: NONPARAMETRIC IDENTIFICATION OF THE MIXED HAZARD MODEL USING MARTINGALE-BASED MOMENTS

ONLINE APPENDIX TO: NONPARAMETRIC IDENTIFICATION OF THE MIXED HAZARD MODEL USING MARTINGALE-BASED MOMENTS ONLINE APPENDIX TO: NONPARAMETRIC IDENTIFICATION OF THE MIXED HAZARD MODEL USING MARTINGALE-BASED MOMENTS JOHANNES RUF AND JAMES LEWIS WOLTER Appendix B. The Proofs of Theorem. and Proposition.3 The proof

More information

Basics of Stochastic Analysis

Basics of Stochastic Analysis Basics of Stochastic Analysis c Timo Seppäläinen This version November 16, 214 Department of Mathematics, University of Wisconsin Madison, Madison, Wisconsin 5376 E-mail address: seppalai@math.wisc.edu

More information

1 Basics of probability theory

1 Basics of probability theory Examples of Stochastic Optimization Problems In this chapter, we will give examples of three types of stochastic optimization problems, that is, optimal stopping, total expected (discounted) cost problem,

More information

Change of Measure formula and the Hellinger Distance of two Lévy Processes

Change of Measure formula and the Hellinger Distance of two Lévy Processes Change of Measure formula and the Hellinger Distance of two Lévy Processes Erika Hausenblas University of Salzburg, Austria Change of Measure formula and the Hellinger Distance of two Lévy Processes p.1

More information

Lecture 22: Variance and Covariance

Lecture 22: Variance and Covariance EE5110 : Probability Foundations for Electrical Engineers July-November 2015 Lecture 22: Variance and Covariance Lecturer: Dr. Krishna Jagannathan Scribes: R.Ravi Kiran In this lecture we will introduce

More information

On the submartingale / supermartingale property of diffusions in natural scale

On the submartingale / supermartingale property of diffusions in natural scale On the submartingale / supermartingale property of diffusions in natural scale Alexander Gushchin Mikhail Urusov Mihail Zervos November 13, 214 Abstract Kotani 5 has characterised the martingale property

More information

4 Sums of Independent Random Variables

4 Sums of Independent Random Variables 4 Sums of Independent Random Variables Standing Assumptions: Assume throughout this section that (,F,P) is a fixed probability space and that X 1, X 2, X 3,... are independent real-valued random variables

More information

Modern Discrete Probability Branching processes

Modern Discrete Probability Branching processes Modern Discrete Probability IV - Branching processes Review Sébastien Roch UW Madison Mathematics November 15, 2014 1 Basic definitions 2 3 4 Galton-Watson branching processes I Definition A Galton-Watson

More information

Randomization in the First Hitting Time Problem

Randomization in the First Hitting Time Problem Randomization in the First Hitting Time Problem Ken Jacson Alexander Kreinin Wanhe Zhang February 1, 29 Abstract In this paper we consider the following inverse problem for the first hitting time distribution:

More information