Applications of Itô's Formula

CHAPTER 4

Applications of Itô's Formula

In this chapter, we discuss several basic theorems in stochastic analysis. Their proofs are good examples of applications of Itô's formula.

1. Lévy's martingale characterization of Brownian motion

Recall that $B$ is a Brownian motion with respect to a filtration $\mathcal F_*$ if the increment $B_t - B_s$ has the normal distribution $N(0, t-s)$ and is independent of $\mathcal F_s$. A consequence of this definition is that $B$ is a continuous martingale with quadratic variation process $\langle B, B\rangle_t = t$. The following theorem shows that this property characterizes Brownian motion completely.

THEOREM 1.1 (Lévy's characterization of Brownian motion). Suppose that $B$ is a continuous local martingale with respect to a filtration $\mathcal F_*$ whose quadratic variation process is $\langle B, B\rangle_t = t$ (or, equivalently, the process $\{B_t^2 - t,\ t \ge 0\}$ is also a local martingale). Then $B$ is a Brownian motion with respect to $\mathcal F_*$.

PROOF. We will establish the following: for any $s < t$ and any bounded $\mathcal F_s$-measurable random variable $Z$,

$$ (1.1)\qquad \mathbb E\bigl[Z \exp\{ia(B_t - B_s)\}\bigr] = \exp\Bigl\{-\frac{a^2(t-s)}{2}\Bigr\}\,\mathbb E Z. $$

Letting $Z = 1$, we see that $B_t - B_s$ is distributed as $N(0, t-s)$. Letting $Z = e^{ibY}$ for $Y \in \mathcal F_s$, we infer by the uniqueness of two-dimensional characteristic functions that $B_t - B_s$ is independent of any $Y \in \mathcal F_s$, which shows that $B_t - B_s$ is independent of $\mathcal F_s$. Therefore it is sufficient to show (1.1).

Denote the left side of (1.1) by $F(t)$. We first use Itô's formula to obtain

$$ \exp\{ia(B_t - B_s)\} = 1 + ia\int_s^t \exp\{ia(B_u - B_s)\}\,dB_u - \frac{a^2}{2}\int_s^t \exp\{ia(B_u - B_s)\}\,du. $$

The stochastic integral is uniformly bounded because the other terms in this equality are uniformly bounded. Therefore the stochastic integral is not only a local martingale, but a martingale. Multiply both sides by $Z$ and take the expectation. Noting that $Z$ is $\mathcal F_s$-measurable, we have

$$ F(t) = \mathbb E Z - \frac{a^2}{2}\int_s^t F(u)\,du. $$

Solving this equation for $F(t)$, we obtain

$$ F(t) = \exp\Bigl\{-\frac{a^2(t-s)}{2}\Bigr\}\,\mathbb E Z. $$

This is what we wanted to prove.

A very useful corollary of Lévy's criterion is that every continuous local martingale $M$ is a time change of Brownian motion. This can be seen as follows. We know that the quadratic variation process $\langle M, M\rangle$ is a continuous increasing process. Let us assume that $\langle M, M\rangle_t \to \infty$ as $t \to \infty$ with probability 1. Let $\tau = \{\tau_t\}$ be the right inverse of $\langle M, M\rangle$ defined by

$$ \tau_t = \inf\{s \ge 0 : \langle M, M\rangle_s > t\}. $$

It is easy to show that $t \mapsto \tau_t$ is a right-continuous, increasing process and that $\langle M, M\rangle_{\tau_t} = t$ and $\tau_{\langle M, M\rangle_t} = t$. Furthermore, $\tau_t$ is a stopping time for each fixed $t$. Consider the time-changed process $B_t = M_{\tau_t}$. Let $\sigma_n = \inf\{t : |M_t| \ge n\}$. Since $M$ is a continuous local martingale, we have $\sigma_n \uparrow \infty$ and each stopped process $M^{\sigma_n}$ is a square-integrable martingale. From $\mathbb E\,(M^{\sigma_n}_{\tau_t})^2 = \mathbb E\,\langle M, M\rangle_{\sigma_n\wedge\tau_t}$ we have, by Fatou's lemma,

$$ \mathbb E\,B_t^2 \le \liminf_{n\to\infty} \mathbb E\,(M^{\sigma_n}_{\tau_t})^2 = \liminf_{n\to\infty} \mathbb E\,\langle M, M\rangle_{\sigma_n\wedge\tau_t} \le \mathbb E\,\langle M, M\rangle_{\tau_t} = t. $$

By the optional sampling theorem, $B$ is a martingale. We now show that $B$ is continuous. There exists a sequence of partitions $\{\Delta_n\}$ of the time set $\mathbb R_+$ such that $|\Delta_n| \to 0$ and, with probability 1,

$$ \sum_{j}\bigl(M_{t^n_j\wedge t} - M_{t^n_{j-1}\wedge t}\bigr)^2 \to \langle M, M\rangle_t \qquad\text{for all } t. $$

Now for any fixed $t$, the jump $B_t - B_{t-} = M_{\tau_t} - M_{\tau_{t-}}$ is bounded by the quadratic variation of $M$ in the time interval $[\tau_{t-\delta}, \tau_t]$ for any $\delta > 0$. Hence

$$ |B_t - B_{t-}|^2 \le \lim_{n\to\infty} \sum_{\tau_{t-\delta}\le t^n_j\le\tau_t}\bigl(M_{t^n_j\wedge\tau_t} - M_{t^n_{j-1}\wedge\tau_t}\bigr)^2 = \langle M, M\rangle_{\tau_t} - \langle M, M\rangle_{\tau_{t-\delta}} = \delta. $$

Since $\delta$ is arbitrary, we see that $B_{t-} = B_t$. This shows that $B$ is continuous with probability 1.

So far we have proved that $B$ is a continuous local martingale. We compute the quadratic variation of $B$. The process $M^2 - \langle M, M\rangle$ is a continuous local martingale. By the optional sampling theorem, we see that

$$ M_{\tau_t}^2 - \langle M, M\rangle_{\tau_t} = B_t^2 - t $$

is a continuous local martingale. It follows that the quadratic variation process of $B$ is just $\langle B, B\rangle_t = t$. Now we can use Lévy's characterization to conclude that $B_t = M_{\tau_t}$ is a Brownian motion. Therefore we have shown that every continuous local martingale can be transformed into a Brownian motion by a time change. From $B_t = M_{\tau_t}$ and $\tau_{\langle M, M\rangle_t} = t$ we have $M_t = B_{\langle M, M\rangle_t}$. In this sense we say that every continuous local martingale is a time change of a Brownian motion.
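As a numerical illustration of this time-change construction (a sketch added here for concreteness, not part of the original notes), one can simulate $M_t = \int_0^t \sigma_s\,dB_s$ for a simple deterministic integrand $\sigma$ (a hypothetical choice), build the right inverse $\tau$ of $\langle M, M\rangle$ on a grid, and check that the increments of $B_u = M_{\tau_u}$ behave like $N(0, \Delta u)$ increments.

```python
import numpy as np

rng = np.random.default_rng(0)
n_paths, n_steps, T = 500, 20000, 1.0
dt = T / n_steps
t = np.linspace(0.0, T, n_steps + 1)
sigma = 1.0 + 0.5 * np.sin(2 * np.pi * t[:-1])      # hypothetical deterministic integrand

dB = rng.normal(0.0, np.sqrt(dt), size=(n_paths, n_steps))
M = np.hstack([np.zeros((n_paths, 1)), np.cumsum(sigma * dB, axis=1)])   # M_t = int sigma dB
qv = np.concatenate([[0.0], np.cumsum(sigma**2 * dt)])                   # <M,M>_t (same for all paths)

# right inverse tau_u = inf{s : <M,M>_s > u}, evaluated on a grid of u-values
u = np.linspace(0.0, 0.99 * qv[-1], 200)
tau_idx = np.searchsorted(qv, u, side="right")
B_tc = M[:, tau_idx]                                  # time-changed process B_u = M_{tau_u}

z = np.diff(B_tc, axis=1) / np.sqrt(np.diff(u))       # normalized increments
print("mean, variance of normalized increments:", z.mean(), z.var())
# both should be close to 0 and 1 if B_u = M_{tau_u} is a Brownian motion
```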

2. Exponential martingale

We want to find an analog of the exponential function $e^x$. The defining property of the exponential function is the differential equation

$$ \frac{df(x)}{dx} = f(x), \qquad f(0) = 1, $$

or equivalently

$$ f(x) = 1 + \int_0^x f(t)\,dt. $$

So we define an exponential martingale $\mathcal E_t$ by the stochastic integral equation

$$ \mathcal E_t = 1 + \int_0^t \mathcal E_s\,dM_s, $$

where $M$ is a continuous local martingale. This equation can be solved explicitly. Instead of writing down the formula and verifying it, let us discover the formula. Since $\mathcal E_0 = 1$, we can take the logarithm of $\mathcal E_t$, at least for small time $t$. Let $C_t = \log\mathcal E_t$. By Itô's formula we have

$$ C_t = \int_0^t \mathcal E_s^{-1}\,d\mathcal E_s - \frac12\int_0^t \mathcal E_s^{-2}\,d\langle\mathcal E, \mathcal E\rangle_s. $$

Since $d\mathcal E_s = \mathcal E_s\,dM_s$, we have $d\langle\mathcal E, \mathcal E\rangle_s = \mathcal E_s^2\,d\langle M, M\rangle_s$. Hence

$$ C_t = M_t - \tfrac12\langle M, M\rangle_t. $$

Therefore the formula for the exponential martingale is

$$ \mathcal E_t = \exp\Bigl\{M_t - \tfrac12\langle M, M\rangle_t\Bigr\}. $$

Now it is easy to verify directly that this process satisfies the defining equation for $\mathcal E_t$. Every strictly positive continuous local martingale can be written in the form of an exponential martingale $\mathcal E_t = \exp\{M_t - \tfrac12\langle M, M\rangle_t\}$, where the local martingale $M$ can be expressed in terms of $\mathcal E$ by

$$ M_t = \int_0^t \mathcal E_s^{-1}\,d\mathcal E_s. $$
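As a quick sanity check (an illustration added here, not part of the notes), take $M = \lambda B$ for a Brownian motion $B$, so that $\langle M, M\rangle_t = \lambda^2 t$. The sketch below verifies numerically that $\mathbb E[\mathcal E_T] \approx 1$ and that the closed form agrees with the defining integral equation along a discretized path.

```python
import numpy as np

rng = np.random.default_rng(1)
n_paths, n_steps, T, lam = 20000, 500, 1.0, 0.8
dt = T / n_steps
t = dt * np.arange(1, n_steps + 1)

dB = rng.normal(0.0, np.sqrt(dt), size=(n_paths, n_steps))
B = np.cumsum(dB, axis=1)

# closed form E_t = exp(M_t - <M,M>_t/2) with M = lam*B and <M,M>_t = lam^2 t
E = np.exp(lam * B - 0.5 * lam**2 * t)
print("E[E_T] (should be close to 1):", E[:, -1].mean())

# compare with the integral equation E_t = 1 + int_0^t E_s dM_s on one path (Euler sum)
E_path = np.concatenate([[1.0], E[0]])
euler = 1.0 + np.cumsum(E_path[:-1] * lam * dB[0])
print("max |closed form - Euler solution| on one path:", np.abs(E[0] - euler).max())
```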

We can say that an exponential martingale is nothing but a positive continuous local martingale. The exponential martingale is related to the iterated stochastic integrals defined by $I_0(t) = 1$ and

$$ I_n(t) = \int_0^t I_{n-1}(s)\,dM_s. $$

By iterating the defining equation of the exponential martingale we have the expansion

$$ \mathcal E_t = \exp\Bigl\{M_t - \tfrac12\langle M, M\rangle_t\Bigr\} = \sum_{n=0}^\infty I_n(t). $$

This formula can be verified rigorously; namely, it can be shown that the infinite series converges and that the remainder from the iteration tends to zero. To continue our discussion, let us introduce a parameter $\lambda$ by replacing $M$ with $\lambda M$, and obtain

$$ (2.1)\qquad \exp\Bigl\{\lambda M_t - \tfrac{\lambda^2}{2}\langle M, M\rangle_t\Bigr\} = \sum_{n=0}^\infty \lambda^n I_n(t). $$

If we set

$$ x = \frac{M_t}{\sqrt{\langle M, M\rangle_t}}, \qquad \theta = \lambda\sqrt{\langle M, M\rangle_t}, $$

then the left side of (2.1) becomes $\exp\{x\theta - \theta^2/2\}$. The coefficients of its Taylor expansion in $\theta$ are called Hermite polynomials:

$$ e^{x\theta - \theta^2/2} = \sum_{n=0}^\infty H_n(x)\,\frac{\theta^n}{n!}. $$

It can be shown that

$$ H_n(x) = (-1)^n e^{x^2/2}\,\frac{d^n}{dx^n} e^{-x^2/2}, $$

and that $\{H_n,\ n \ge 0\}$ is the orthogonal basis of $L^2(\mathbb R, e^{-x^2/2}\,dx/\sqrt{2\pi})$ obtained by the Gram–Schmidt procedure from the complete system $\{x^n,\ n \ge 0\}$. Now we can write

$$ \exp\Bigl\{\lambda M_t - \tfrac{\lambda^2}{2}\langle M, M\rangle_t\Bigr\} = \sum_{n=0}^\infty \frac{\lambda^n}{n!}\,\langle M, M\rangle_t^{n/2}\,H_n\Bigl(\frac{M_t}{\sqrt{\langle M, M\rangle_t}}\Bigr). $$

It follows that the iterated stochastic integrals can be expressed in terms of $M_t$ and $\langle M, M\rangle_t$ via Hermite polynomials as follows:

$$ I_n(t) = \frac{1}{n!}\,\langle M, M\rangle_t^{n/2}\,H_n\Bigl(\frac{M_t}{\sqrt{\langle M, M\rangle_t}}\Bigr). $$

Here are the first few iterated stochastic integrals:

$$ I_0(t) = 1, \quad I_1(t) = M_t, \quad I_2(t) = \tfrac12\bigl(M_t^2 - \langle M, M\rangle_t\bigr), \quad I_3(t) = \tfrac16\bigl(M_t^3 - 3M_t\langle M, M\rangle_t\bigr). $$

Further discussion on this topic can be found in Hida.
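These identities can be checked numerically for $M = B$, a standard Brownian motion, where $\langle M, M\rangle_t = t$. The sketch below (an illustration, not part of the notes) builds the first few iterated integrals by Euler iteration and compares them with the Hermite-polynomial formula; the probabilists' Hermite polynomials used here are available in numpy.polynomial.hermite_e.

```python
import numpy as np
from math import factorial
from numpy.polynomial.hermite_e import hermeval

rng = np.random.default_rng(2)
n_steps, T = 100000, 1.0
dt = T / n_steps
dB = rng.normal(0.0, np.sqrt(dt), n_steps)
B_T = dB.sum()

# iterated integrals I_n(T) for M = B, by Euler iteration of I_n = int_0^t I_{n-1} dB
I_prev = np.ones(n_steps + 1)                        # I_0 = 1
euler_values = []
for n in range(1, 4):
    I_next = np.concatenate([[0.0], np.cumsum(I_prev[:-1] * dB)])
    euler_values.append(I_next[-1])
    I_prev = I_next

# Hermite formula: I_n(T) = T^{n/2} H_n(B_T / sqrt(T)) / n!
def hermite_formula(n):
    He_n = hermeval(B_T / np.sqrt(T), [0.0] * n + [1.0])   # probabilists' H_n
    return T ** (n / 2) * He_n / factorial(n)

for n in range(1, 4):
    print(f"I_{n}(T): Euler = {euler_values[n-1]:.4f}, Hermite formula = {hermite_formula(n):.4f}")
```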

3. Uniformly integrable exponential martingales

In stochastic analysis, exponential martingales (positive martingales) often appear in the following context. Suppose that $(\Omega, \mathcal F_*, P)$ is a filtered probability space. Let $Q$ be another probability measure which is absolutely continuous with respect to $P$ on the $\sigma$-algebra $\mathcal F_T$. Then $Q$ is also absolutely continuous with respect to $P$ on $\mathcal F_t$ for all $t \le T$. Let $\mathcal E_t$ be the Radon–Nikodym derivative of $Q$ with respect to $P$ on $\mathcal F_t$:

$$ \mathcal E_t = \frac{dQ}{dP}\Big|_{\mathcal F_t}. $$

Then it is easy to verify that $\mathcal E_t = \mathbb E\{\mathcal E_T \mid \mathcal F_t\}$. This shows that $\{\mathcal E_t,\ 0 \le t \le T\}$ is a positive martingale. If it is continuous, then it is an exponential martingale. Not only that, it is a uniformly integrable martingale. Conversely, if $\{\mathcal E_t,\ 0 \le t \le T\}$ is a uniformly integrable exponential martingale, then it defines a change of measure on the filtered probability space $(\Omega, \mathcal F_*, P)$ up to time $T$.

It often happens that we know the local martingale $M$ and wish that the exponential local martingale

$$ \mathcal E_t = \exp\Bigl\{M_t - \tfrac12\langle M, M\rangle_t\Bigr\} $$

defines a change of probability measures by $dQ/dP = \mathcal E_T$. For this it is necessary that $\mathbb E\,\mathcal E_T = 1$. However, in general we only know that $\{\mathcal E_t\}$ is a positive local martingale. It is therefore a supermartingale, and $\mathbb E\,\mathcal E_T \le 1$. Therefore the requirement $\mathbb E\,\mathcal E_T = 1$ is not automatic and has to be proved by imposing further conditions on the local martingale $M$.

PROPOSITION 3.1. Let $M$ be a continuous local martingale and $\mathcal E_t = \exp\{M_t - \tfrac12\langle M, M\rangle_t\}$. Then $\mathbb E\,\mathcal E_T \le 1$, and $\{\mathcal E_t,\ 0 \le t \le T\}$ is a uniformly integrable martingale if and only if $\mathbb E\,\mathcal E_T = 1$.

PROOF. We know that $\mathcal E$ is a local martingale, hence there is a sequence of stopping times $\tau_n \uparrow \infty$ such that $\mathcal E_{t\wedge\tau_n}$ is a martingale, hence $\mathbb E\,\mathcal E_{T\wedge\tau_n} = \mathbb E\,\mathcal E_0 = 1$. Letting $n \to \infty$ and using Fatou's lemma, we have $\mathbb E\,\mathcal E_T \le 1$. This inequality also follows from the fact that a nonnegative local martingale is always a supermartingale.

Suppose that $\mathbb E\,\mathcal E_T = 1$. Since $\mathcal E$ is a supermartingale, we have $\mathcal E_t \ge \mathbb E\{\mathcal E_T \mid \mathcal F_t\}$. Taking the expected value gives $1 \ge \mathbb E\,\mathcal E_t \ge \mathbb E\,\mathcal E_T = 1$. This shows that the equality must hold throughout, and we have $\mathcal E_t = \mathbb E\{\mathcal E_T \mid \mathcal F_t\}$, which means that $\{\mathcal E_t,\ 0 \le t \le T\}$ is a uniformly integrable martingale.

We need to impose conditions on the local martingale $M$ to ensure that the exponential local martingale $\mathcal E$ is uniformly integrable. We have the following result due to Kazamaki.

THEOREM 3.2 (Kazamaki's criterion). Suppose that $M$ is a martingale and $\mathbb E\,e^{M_T/2}$ is finite. Then

$$ \Bigl\{\exp\Bigl(M_t - \tfrac12\langle M\rangle_t\Bigr),\ 0 \le t \le T\Bigr\} $$

is uniformly integrable.

PROOF. This is a very interesting proof. Let

$$ \mathcal E(\lambda)_t = \exp\Bigl\{\lambda M_t - \tfrac{\lambda^2}{2}\langle M\rangle_t\Bigr\}. $$

We need to show that $\mathcal E(1)$ is uniformly integrable, but first we take a slight step back and prove that $\mathcal E(\lambda)$ is uniformly integrable for every $0 < \lambda < 1$. We achieve this by proving that there are $r > 1$ and $C$ such that $\mathbb E\,\mathcal E(\lambda)_\sigma^r \le C$ for all stopping times $\sigma \le T$. We have

$$ \mathcal E(\lambda)_\sigma^r = \exp\bigl\{(\lambda r - \lambda^2\sqrt r)\,M_\sigma\bigr\}\cdot \exp\Bigl\{\lambda^2\sqrt r\,M_\sigma - \tfrac{\lambda^2 r}{2}\langle M\rangle_\sigma\Bigr\}. $$

Using Hölder's inequality with the conjugate exponents determined by $(1-\lambda^2) + \lambda^2 = 1$, we see that $\mathbb E\,\mathcal E(\lambda)_\sigma^r$ is bounded by

$$ \Bigl\{\mathbb E\exp\Bigl(\frac{\lambda r - \lambda^2\sqrt r}{1-\lambda^2}\,M_\sigma\Bigr)\Bigr\}^{1-\lambda^2}\, \Bigl\{\mathbb E\exp\Bigl(\sqrt r\,M_\sigma - \tfrac{r}{2}\langle M\rangle_\sigma\Bigr)\Bigr\}^{\lambda^2}. $$

The second expectation on the right side does not exceed 1 (apply PROPOSITION 3.1 to the local martingale $\sqrt r\,M$). For the first factor we claim that $\sigma$ can be replaced by $T$ and the coefficient can be replaced by $1/2$ if $r > 1$ is sufficiently close to 1. When $r = 1$ the coefficient equals

$$ \frac{\lambda - \lambda^2}{1 - \lambda^2} = \frac{\lambda}{1+\lambda} < \frac12, $$

because $\lambda < 1$.

Hence the coefficient, call it $c = (\lambda r - \lambda^2\sqrt r)/(1-\lambda^2)$, is still less than $1/2$ if $r > 1$ is sufficiently close to 1. On the other hand, we have assumed that $M$ is a martingale, hence $M_\sigma = \mathbb E\{M_T \mid \mathcal F_\sigma\}$, and by Jensen's inequality $e^{M_\sigma/2} \le \mathbb E\{e^{M_T/2} \mid \mathcal F_\sigma\}$. Since $2c \le 1$ and $\mathbb E\,e^{M_\sigma/2} \ge 1$, Jensen's inequality also gives $\mathbb E\,e^{cM_\sigma} = \mathbb E\,(e^{M_\sigma/2})^{2c} \le (\mathbb E\,e^{M_\sigma/2})^{2c} \le \mathbb E\,e^{M_\sigma/2}$. It follows that

$$ \mathbb E\exp\Bigl(\frac{\lambda r - \lambda^2\sqrt r}{1-\lambda^2}\,M_\sigma\Bigr) \le \mathbb E\,e^{M_\sigma/2} \le \mathbb E\,e^{M_T/2}. $$

We have therefore shown that for any $\lambda < 1$ there is an $r > 1$ such that $\mathbb E\,\mathcal E(\lambda)_\sigma^r \le \mathbb E\,e^{M_T/2}$ for all stopping times $\sigma \le T$. This shows that $\mathcal E(\lambda)$ is a uniformly integrable martingale, which implies that $\mathbb E\,\mathcal E(\lambda)_T = 1$ for all $\lambda < 1$.

We now use the same trick again to show that $\mathbb E\,\mathcal E(1)_T = 1$. We have

$$ \mathcal E(\lambda)_T = \exp\Bigl\{\lambda^2 M_T - \tfrac{\lambda^2}{2}\langle M\rangle_T\Bigr\}\cdot \exp\bigl\{(\lambda - \lambda^2)M_T\bigr\}. $$

Using Hölder's inequality with the conjugate exponents determined by $\lambda^2 + (1-\lambda^2) = 1$, we have

$$ 1 = \mathbb E\,\mathcal E(\lambda)_T \le \Bigl\{\mathbb E\exp\Bigl(M_T - \tfrac12\langle M\rangle_T\Bigr)\Bigr\}^{\lambda^2}\, \Bigl\{\mathbb E\exp\Bigl(\frac{\lambda}{1+\lambda}\,M_T\Bigr)\Bigr\}^{1-\lambda^2}. $$

Because $\lambda/(1+\lambda) \le 1/2$, the second expectation on the right side can be replaced by $\mathbb E\exp\{M_T/2\}$, by the same argument as above. Letting $\lambda \uparrow 1$ we obtain $\mathbb E\exp\{M_T - \tfrac12\langle M\rangle_T\} \ge 1$, hence by Proposition 3.1

$$ \mathbb E\exp\Bigl\{M_T - \tfrac12\langle M\rangle_T\Bigr\} = 1, $$

and $\mathcal E(1)$ is uniformly integrable.

The condition $\mathbb E\exp\{M_T/2\} < \infty$ is not easy to verify, because we usually know the quadratic variation $\langle M\rangle_T$ much better than $M_T$ itself. The following weaker criterion can often be used directly.

COROLLARY 3.3 (Novikov's criterion). Suppose that $M$ is a martingale. If $\mathbb E\exp\{\tfrac12\langle M\rangle_T\}$ is finite, then

$$ \mathbb E\exp\Bigl\{M_T - \tfrac12\langle M\rangle_T\Bigr\} = 1. $$

PROOF. We have

$$ \exp\Bigl\{\tfrac12 M_T\Bigr\} = \exp\Bigl\{\tfrac12 M_T - \tfrac14\langle M\rangle_T\Bigr\}\cdot \exp\Bigl\{\tfrac14\langle M\rangle_T\Bigr\}. $$

By the Cauchy–Schwarz inequality we have

$$ \mathbb E\exp\Bigl\{\tfrac12 M_T\Bigr\} \le \Bigl\{\mathbb E\exp\Bigl(M_T - \tfrac12\langle M\rangle_T\Bigr)\Bigr\}^{1/2}\, \Bigl\{\mathbb E\exp\Bigl(\tfrac12\langle M\rangle_T\Bigr)\Bigr\}^{1/2}. $$

The first factor on the right side does not exceed 1. Therefore

$$ \mathbb E\exp\Bigl\{\tfrac12 M_T\Bigr\} \le \Bigl\{\mathbb E\exp\Bigl(\tfrac12\langle M\rangle_T\Bigr)\Bigr\}^{1/2} < \infty. $$

Therefore Novikov's condition implies Kazamaki's condition, and the conclusion follows from Theorem 3.2.
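The practical content of these criteria is the identity $\mathbb E\,\mathcal E_T = 1$. The following Monte Carlo sketch (an illustration under a hypothetical bounded integrand, not part of the notes) takes $M_t = \int_0^t V_s\,dB_s$ with $|V| \le 1$, so that Novikov's condition holds trivially since $\langle M\rangle_T \le T$, and checks the identity.

```python
import numpy as np

rng = np.random.default_rng(3)
n_paths, n_steps, T = 50000, 200, 1.0
dt = T / n_steps

dB = rng.normal(0.0, np.sqrt(dt), size=(n_paths, n_steps))
B = np.hstack([np.zeros((n_paths, 1)), np.cumsum(dB, axis=1)])

V = np.tanh(B[:, :-1])                      # bounded adapted integrand, |V| <= 1
M_T = (V * dB).sum(axis=1)                  # Euler sum for M_T = int_0^T V dB
qv_T = (V**2).sum(axis=1) * dt              # <M,M>_T = int_0^T V^2 ds

E_T = np.exp(M_T - 0.5 * qv_T)
print("E[E_T] =", E_T.mean(), "+/-", E_T.std() / np.sqrt(n_paths))
# Novikov: E exp(<M,M>_T/2) <= e^{T/2} < infinity, so the mean should be close to 1
```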

4. Girsanov and Cameron–Martin–Maruyama theorems

In stochastic analysis we often need to change the base probability measure from a given $P$ to another measure $Q$. Suppose that $B$ is a Brownian motion under $P$. In general it will no longer be a Brownian motion under $Q$. The Girsanov theorem describes the decomposition of $B$ as the sum of a martingale (necessarily a Brownian motion) and a process of bounded variation, under a class of changes of measure.

We assume that $B$ is an $\mathcal F_*$-Brownian motion and $V$ a progressively measurable process such that

$$ \Bigl\{\exp\Bigl(\int_0^t V_s\,dB_s - \tfrac12\int_0^t V_s^2\,ds\Bigr),\ 0 \le t \le T\Bigr\} $$

is a uniformly integrable martingale. This exponential martingale therefore defines a change of measure on $\mathcal F_T$ by

$$ \frac{dQ}{dP} = \exp\Bigl(\int_0^T V_s\,dB_s - \tfrac12\int_0^T V_s^2\,ds\Bigr). $$

This is the class of changes of measure we will consider.

THEOREM 4.1. Suppose that $Q$ is the new measure described above. Consider the Brownian motion with a drift

$$ (4.1)\qquad X_t = B_t - \int_0^t V_\tau\,d\tau, \qquad 0 \le t \le T. $$

Then $X$ is a Brownian motion under the probability measure $Q$.

PROOF. Let $\{e_s\}$ be the exponential martingale

$$ e_s = \exp\Bigl\{\int_0^s V_\tau\,dB_\tau - \tfrac12\int_0^s V_\tau^2\,d\tau\Bigr\}. $$

Then the density of $Q$ with respect to $P$ on $\mathcal F_s$ is just $e_s$. From this fact it is easy to verify that if $Y$ is an adapted process, then

$$ \mathbb E_Q\{Y_t \mid \mathcal F_s\} = e_s^{-1}\,\mathbb E_P\{Y_t e_t \mid \mathcal F_s\}. $$

This means that $Y$ is a (local) martingale under $Q$ if and only if $eY = \{e_s Y_s,\ s \ge 0\}$ is a (local) martingale under $P$. Now $e$ is a martingale and $de_s = e_s V_s\,dB_s$. On the other hand, using Itô's formula we have

$$ d(e_s X_s) = e_s\,dB_s + e_s X_s V_s\,dB_s. $$

This shows that $eX$ is a local martingale under $P$, hence $X$ is a local martingale under $Q$. Since $Q$ and $P$ are mutually absolutely continuous, it should be clear that the quadratic variation process of $X$ under $P$ and under $Q$ is the same, i.e., $\langle X\rangle_t = t$. We can also verify this fact directly by applying Itô's formula to $Z_s = e_s(X_s^2 - s)$. We have

$$ dZ_s = e_s\,d(X_s^2 - s) + (X_s^2 - s)\,de_s + 2X_s\,d\langle e, X\rangle_s. $$

The second term on the right side is a local martingale. We have

$$ d(X_s^2 - s) = 2X_s\,dX_s + d\langle X, X\rangle_s - ds = 2X_s\,dB_s - 2X_s V_s\,ds. $$

On the other hand, from $de_s = e_s V_s\,dB_s$ we have

$$ d\langle e, X\rangle_s = e_s V_s\,ds. $$

It follows that

$$ dZ_s = 2e_s X_s\,dB_s + (X_s^2 - s)\,de_s, $$

which shows that $Z$ is a local martingale. Now we have shown that both $X_s$ and $X_s^2 - s$ are local martingales under $Q$. By Lévy's criterion we conclude that $X$ is a Brownian motion under $Q$.
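The theorem can be illustrated by Monte Carlo (a sketch with a hypothetical drift $V$, not part of the notes): simulate $B$ and the density $e_T$ under $P$, form $X_t = B_t - \int_0^t V_s\,ds$, and check that expectations of a functional of $X_T$ reweighted by $e_T$ agree with the corresponding Brownian expectations.

```python
import numpy as np

rng = np.random.default_rng(4)
n_paths, n_steps, T = 30000, 200, 1.0
dt = T / n_steps

dB = rng.normal(0.0, np.sqrt(dt), size=(n_paths, n_steps))
B = np.hstack([np.zeros((n_paths, 1)), np.cumsum(dB, axis=1)])

V = np.sin(B[:, :-1])                                     # hypothetical bounded adapted drift
e_T = np.exp((V * dB).sum(axis=1) - 0.5 * (V**2).sum(axis=1) * dt)   # dQ/dP on F_T
X_T = B[:, -1] - (V * dt).sum(axis=1)                     # X_T = B_T - int_0^T V ds

f = lambda x: np.maximum(x - 0.5, 0.0)                    # an arbitrary test functional
print("E_Q[f(X_T)] via reweighting:", (e_T * f(X_T)).mean())
print("E[f(B_T)]   for Brownian motion:", f(B[:, -1]).mean())
# the two numbers agree within Monte Carlo error, since X is a Q-Brownian motion
```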

The classical Cameron–Martin–Maruyama theorem is a special case of the Girsanov theorem, but stated in a slightly different form. Let $\mu$ be the Wiener measure on the path space $W(\mathbb R)$. Let $h \in W(\mathbb R)$ and consider the shift in the path space $\xi_h w = w + h$. The shifted Wiener measure is $\mu_h = \mu\circ\xi_h^{-1}$. If $X$ is the coordinate process on $W(\mathbb R)$, then $\mu_h$ is just the law of $X + h$. We will prove the following dichotomy: either $\mu_h$ and $\mu$ are mutually absolutely continuous, or they are mutually singular. In fact, we have an explicit criterion for this dichotomy.

DEFINITION 4.2. A path (function) $h \in W(\mathbb R)$ is called a Cameron–Martin path (function) if it is absolutely continuous and its derivative is square integrable. The Cameron–Martin norm is defined by

$$ |h|_{\mathbf H}^2 = \int_0^1 \dot h_s^2\,ds. $$

The space of Cameron–Martin paths is denoted by $\mathbf H$. It is clear that $\mathbf H$ is a Hilbert space.

THEOREM 4.3 (Cameron–Martin–Maruyama theorem). Suppose that $h \in \mathbf H$. Then the shifted Wiener measure $\mu_h$ and $\mu$ are mutually absolutely continuous, and

$$ \frac{d\mu_h}{d\mu}(w) = \exp\Bigl\{\int_0^1 \dot h_s\,dw_s - \frac12\int_0^1 \dot h_s^2\,ds\Bigr\}. $$

PROOF. Denote the exponential martingale on the right side by $e_1(w)$. We need to show that for every nonnegative measurable function $F$ on $W(\mathbb R)$,

$$ \mathbb E_{\mu_h}(F) = \mathbb E_\mu(F\,e_1). $$

Let $X$ be the coordinate process on $W(\mathbb R)$. Then the left side is simply $\mathbb E_\mu F(X + h)$. Introduce the measure $\nu$ by

$$ \frac{d\nu}{d\mu} = \exp\Bigl\{-\int_0^1 \dot h_s\,dw_s - \frac12\int_0^1 \dot h_s^2\,ds\Bigr\}. $$

We have

$$ \frac{d\mu}{d\nu} = \exp\Bigl\{\int_0^1 \dot h_s\,dw_s + \frac12\int_0^1 \dot h_s^2\,ds\Bigr\} = \exp\Bigl\{\int_0^1 \dot h_s\,d(w_s + h_s) - \frac12\int_0^1 \dot h_s^2\,ds\Bigr\}. $$

Hence we can write $d\mu/d\nu = e_1(X + h)$. It follows that

$$ \mathbb E_{\mu_h}F = \mathbb E_\mu F(X+h) = \mathbb E_\nu\Bigl[F(X+h)\,\frac{d\mu}{d\nu}\Bigr] = \mathbb E_\nu\bigl[F(X+h)\,e_1(X+h)\bigr]. $$

By Girsanov's theorem, $X + h$ is a Brownian motion under $\nu$. On the other hand, $X$ is a Brownian motion under $\mu$. Therefore on the right side of the above equality we can replace $\nu$ by $\mu$ and at the same time replace $X + h$ by $X$; hence

$$ \mathbb E_{\mu_h}F = \mathbb E_\mu\bigl[F(X)\,e_1(X)\bigr] = \mathbb E_\mu(F\,e_1). $$

THEOREM 4.4. Let $h \in W(\mathbb R)$. The shifted Wiener measure $\mu_h$ is mutually absolutely continuous or mutually singular with respect to $\mu$ according as $h \in \mathbf H$ or $h \notin \mathbf H$.

PROOF. We only need to show that if $h \notin \mathbf H$, then $\mu_h$ is singular with respect to $\mu$. First we convert the condition $h \notin \mathbf H$ into a more convenient one. A function $h$ belongs to $\mathbf H$ if and only if there is a constant $C$ such that

$$ \Bigl|\int_0^1 f_s\,dh_s\Bigr| \le C\,|f| $$

for all step functions $f$, where $|f|$ is the $L^2[0,1]$ norm of $f$. Therefore it is conceivable that if $h \notin \mathbf H$, then for any $C$ there is a step function $f$ with

$$ |f|^2 = \sum_{i=1}^n f_i^2\,(s_i - s_{i-1}) = 1 \qquad\text{and}\qquad \int_0^1 f_s\,dh_s = \sum_{i=1}^n f_i\,(h_{s_i} - h_{s_{i-1}}) \ge C. $$

This characterization of $h \notin \mathbf H$ can indeed be verified rigorously. Second, a convenient characterization of the fact that $\mu_h$ is singular with respect to $\mu$ is the following: for any positive $\epsilon$, there is a set $A$ such that $\mu_h(A) \ge 1 - \epsilon$ and $\mu(A) \le \epsilon$.

Consider the random variable

$$ Z(w) = \int_0^1 f_s\,dw_s = \sum_{i=1}^n f_i\,(w_{s_i} - w_{s_{i-1}}). $$

It is a Gaussian random variable with mean zero and variance $|f|^2 = 1$; therefore it has the standard Gaussian distribution $N(0,1)$. Let $A = \{Z \ge C/2\}$. We have $\mu(A) \le \epsilon$ for sufficiently large $C$. On the other hand,

$$ \mu_h(A) = \mu\Bigl\{w \in W(\mathbb R) : Z(w+h) \ge \frac C2\Bigr\}. $$

By the definition of $Z(w)$ we have

$$ Z(w+h) = Z(w) + \sum_{i=1}^n f_i\,(h_{s_i} - h_{s_{i-1}}) \ge Z(w) + C. $$

Therefore

$$ \mu_h(A) \ge \mu\Bigl\{Z + C \ge \frac C2\Bigr\} = \mu\Bigl\{Z \ge -\frac C2\Bigr\}, $$

and $\mu_h(A) \ge 1 - \epsilon$ for sufficiently large $C$. Thus we have shown that $\mu_h$ and $\mu$ are mutually singular.

5. Moment inequalities for martingales

Let $M$ be a continuous martingale and

$$ M_t^* = \max_{s\le t} |M_s|. $$

The moment $\mathbb E\,(M_t^*)^p$ can be bounded both from above and from below by $\mathbb E\,\langle M, M\rangle_t^{p/2}$.

THEOREM 5.1. Let $M$ be a continuous local martingale. For any $p > 0$ there are positive constants $c_p$, $C_p$ such that

$$ c_p\,\mathbb E\,\langle M, M\rangle_t^{p/2} \le \mathbb E\,(M_t^*)^p \le C_p\,\mathbb E\,\langle M, M\rangle_t^{p/2}. $$

PROOF. The case $p = 2$ is obvious. We only prove the case $p > 2$; the case $0 < p < 2$ is slightly more complicated, see Ikeda and Watanabe. By the usual stopping time argument we may assume without loss of generality that $M$ is uniformly bounded, so there is no problem of integrability. We prove the upper bound first. We start with Doob's submartingale inequality

$$ (5.1)\qquad \mathbb E\,(M_t^*)^p \le \Bigl(\frac{p}{p-1}\Bigr)^p\,\mathbb E\,|M_t|^p. $$

We apply Itô's formula to $|M_t|^p$. Note that $x \mapsto |x|^p$ is twice continuously differentiable because $p > 2$. This gives

$$ |M_t|^p = p\int_0^t |M_s|^{p-1}\,\mathrm{sgn}(M_s)\,dM_s + \frac{p(p-1)}{2}\int_0^t |M_s|^{p-2}\,d\langle M, M\rangle_s. $$

Taking expectations and using the obvious bound $|M_s| \le M_t^*$, we have

$$ \mathbb E\,|M_t|^p \le \frac{p(p-1)}{2}\,\mathbb E\bigl[(M_t^*)^{p-2}\,\langle M, M\rangle_t\bigr]. $$

We use Hölder's inequality on the right side and (5.1) on the left side to obtain

$$ \mathbb E\,(M_t^*)^p \le C\,\bigl\{\mathbb E\,(M_t^*)^p\bigr\}^{(p-2)/p}\,\bigl\{\mathbb E\,\langle M, M\rangle_t^{p/2}\bigr\}^{2/p}, $$

where $C$ is a constant depending only on $p$. The upper bound follows immediately.

The lower bound is slightly trickier. Using Itô's formula we have

$$ M_t\,\langle M, M\rangle_t^{(p-2)/4} = \int_0^t \langle M, M\rangle_s^{(p-2)/4}\,dM_s + \int_0^t M_s\,d\langle M, M\rangle_s^{(p-2)/4}. $$

The first term on the right side is the term we are aiming at, because its second moment,

$$ \mathbb E\int_0^t \langle M, M\rangle_s^{(p-2)/2}\,d\langle M, M\rangle_s = \frac{2}{p}\,\mathbb E\,\langle M, M\rangle_t^{p/2}, $$

is, up to a constant, the left side of the inequality we want to prove. This is the reason for choosing the exponent $(p-2)/4$. From the identity above we have

$$ \Bigl|\int_0^t \langle M, M\rangle_s^{(p-2)/4}\,dM_s\Bigr| \le 2\,M_t^*\,\langle M, M\rangle_t^{(p-2)/4}. $$

Squaring this inequality and taking expectations, we have, after using Hölder's inequality,

$$ \frac{2}{p}\,\mathbb E\,\langle M, M\rangle_t^{p/2} \le 4\,\bigl\{\mathbb E\,(M_t^*)^p\bigr\}^{2/p}\,\bigl\{\mathbb E\,\langle M, M\rangle_t^{p/2}\bigr\}^{(p-2)/p}. $$

The lower bound follows immediately.
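For $M = B$, a standard Brownian motion, $\langle M, M\rangle_t = t$, so the theorem says that $\mathbb E\,(B_t^*)^p$ is comparable to $t^{p/2}$. A short Monte Carlo sketch of the ratio (an illustration, not part of the notes):

```python
import numpy as np

rng = np.random.default_rng(5)
n_paths, n_steps, t = 20000, 1000, 1.0
dt = t / n_steps

dB = rng.normal(0.0, np.sqrt(dt), size=(n_paths, n_steps))
B = np.hstack([np.zeros((n_paths, 1)), np.cumsum(dB, axis=1)])
B_star = np.abs(B).max(axis=1)                       # B*_t = max_{s<=t} |B_s|

for p in (2.0, 3.0, 4.0, 6.0):
    ratio = (B_star**p).mean() / t ** (p / 2)        # E[(B*_t)^p] / E<B,B>_t^{p/2}
    print(f"p = {p}: ratio = {ratio:.3f}")
# the ratios stay bounded away from 0 and infinity, as the theorem asserts
```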

6. Martingale representation theorem

Let $B$ be a standard Brownian motion and $\mathcal F^B_t = \sigma\{B_s,\ s \le t\}$ its associated filtration of $\sigma$-fields, properly completed so that it satisfies the usual conditions. Now suppose $M$ is a square-integrable martingale with respect to this filtration. We will show that it can always be represented as a stochastic integral with respect to the Brownian motion; namely, there exists a progressively measurable process $H$ such that

$$ M_t = M_0 + \int_0^t H_s\,dB_s. $$

Thus every martingale with respect to the filtration generated by a Brownian motion is a stochastic integral with respect to this Brownian motion. This is a very important result in stochastic analysis. In this section we give a proof of this result by an approach we believe is direct, short, and well motivated. It uses nothing more than Itô's formula in a very elementary way. Later in this section we also give a general formula (the Clark–Ocone formula) for the integrand.

We make a few remarks before the proof. First of all, the martingale representation theorem is equivalent to the following: if $X$ is a square-integrable random variable measurable with respect to $\mathcal F^B_T$, then it can be represented in the form

$$ X = \mathbb E X + \int_0^T H_s\,dB_s; $$

for if we take $X = M_T$, then $M_T = \mathbb E M_T + \int_0^T H_s\,dB_s$, and since both sides are martingales, the equality must also hold when $T$ is replaced by any $t \le T$.

Second, the representation is unique, because if

$$ \int_0^T H_s\,dB_s = \int_0^T G_s\,dB_s, $$

then from

$$ \mathbb E\Bigl|\int_0^T H_s\,dB_s - \int_0^T G_s\,dB_s\Bigr|^2 = \mathbb E\int_0^T |H_s - G_s|^2\,ds = 0 $$

we immediately have $H = G$ on $[0,T]\times\Omega$ with respect to the measure $\mathcal L\otimes P$, where $\mathcal L$ is the Lebesgue measure.

Third, if $X_n \to X$ in $L^2(\Omega, \mathcal F, P)$ and $X_n = \mathbb E X_n + \int_0^T H^n_s\,dB_s$, then $\mathbb E X_n \to \mathbb E X$ and

$$ \mathbb E\int_0^T |H^m_s - H^n_s|^2\,ds \le \mathbb E\,|X_m - X_n|^2. $$

This shows that $\{H^n\}$ is a Cauchy sequence in $L^2([0,T]\times\Omega, \mathcal L\otimes P)$. It therefore converges to a process $H$, and $X = \mathbb E X + \int_0^T H_s\,dB_s$. The point here is that it is enough to show the representation theorem for a dense subset of random variables.

Finally, we observe that a square-integrable martingale with respect to the filtration generated by a Brownian motion is necessarily continuous, because every stochastic integral is continuous with respect to its upper time limit.

We may assume without loss of generality that $T = 1$. From now on we write $W$ for the Brownian motion. Our method starts with the simple case $X = f(W_1)$ for a smooth bounded function $f$. It is easy to prove the theorem in this case and to find an explicit formula for the process $H$.

PROPOSITION 6.1. For any bounded smooth function $f$,

$$ f(W_1) = \mathbb E f(W_1) + \int_0^1 \mathbb E\{f'(W_1) \mid \mathcal F_s\}\,dW_s. $$

PROOF. We have $f(W_1) = f(W_t + W_1 - W_t)$. We know that $W_1 - W_t$ has the normal distribution $N(0, 1-t)$ and is independent of $\mathcal F_t$. Hence

$$ (6.1)\qquad \mathbb E\{f(W_1) \mid \mathcal F_t\} = \frac{1}{\sqrt{2\pi(1-t)}}\int_{\mathbb R} f(W_t + x)\,e^{-x^2/2(1-t)}\,dx. $$

Now we regard the right side as a function of $W_t$ and $t$ and apply Itô's formula. Since we know that it is a martingale, we only need to find its martingale part, which is very easy: just differentiate with respect to the space variable and integrate against $dW_t$. We have

$$ f(W_1) = \mathbb E f(W_1) + \int_0^1\Bigl[\frac{1}{\sqrt{2\pi(1-t)}}\int_{\mathbb R} f'(W_t + x)\,e^{-x^2/2(1-t)}\,dx\Bigr]\,dW_t. $$

The difference between the integrand and the right side of (6.1) is simply that $f$ is replaced by its derivative $f'$. This shows that

$$ f(W_1) = \mathbb E f(W_1) + \int_0^1 \mathbb E\{f'(W_1) \mid \mathcal F_t\}\,dW_t. $$

The general case can be handled by an induction argument.
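Proposition 6.1 can be checked numerically for a concrete choice of $f$. For $f(x) = \sin x$ the conditional expectation has the closed form $\mathbb E\{f'(W_1)\mid\mathcal F_t\} = \cos(W_t)\,e^{-(1-t)/2}$ (a standard Gaussian computation; the example is an added illustration, not part of the notes). The sketch compares $\sin(W_1)$ with $\mathbb E\sin(W_1) + \int_0^1 \mathbb E\{\cos(W_1)\mid\mathcal F_t\}\,dW_t$ along simulated paths; here $\mathbb E\sin(W_1) = 0$.

```python
import numpy as np

rng = np.random.default_rng(6)
n_paths, n_steps = 5000, 2000
dt = 1.0 / n_steps
t = dt * np.arange(n_steps)                      # left endpoint of each time step

dW = rng.normal(0.0, np.sqrt(dt), size=(n_paths, n_steps))
W = np.hstack([np.zeros((n_paths, 1)), np.cumsum(dW, axis=1)])

# integrand H_t = E{f'(W_1) | F_t} = cos(W_t) exp(-(1-t)/2) for f(x) = sin(x)
H = np.cos(W[:, :-1]) * np.exp(-(1.0 - t) / 2.0)
stoch_int = (H * dW).sum(axis=1)                 # Euler approximation of int_0^1 H dW

lhs = np.sin(W[:, -1])                           # f(W_1)
rhs = 0.0 + stoch_int                            # E f(W_1) + int H dW, with E sin(W_1) = 0
print("mean |f(W_1) - representation|:", np.abs(lhs - rhs).mean())
# small, up to the Euler discretization error
```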

THEOREM 6.2. Let $X \in L^2(\Omega, \mathcal F_1, P)$. Then there is a progressively measurable process $H$ such that

$$ X = \mathbb E X + \int_0^1 H_s\,dW_s. $$

PROOF. It is enough to show the representation theorem for random variables of the form

$$ X = f(W_{s_1}, W_{s_2} - W_{s_1}, \ldots, W_{s_n} - W_{s_1}), $$

where $f$ is a bounded smooth function with bounded first derivatives, because random variables of this type form a dense subset of $L^2(\Omega, \mathcal F_1, P)$ (see the remarks at the beginning of this section). By the induction hypothesis applied to the Brownian motion $\{W_s - W_{s_1},\ s_1 \le s \le 1\}$, we have

$$ f(x, W_{s_2} - W_{s_1}, \ldots, W_{s_n} - W_{s_1}) = h(x) + \int_{s_1}^1 H^x_s\,dW_s, $$

where

$$ h(x) = \mathbb E f(x, W_{s_2} - W_{s_1}, \ldots, W_{s_n} - W_{s_1}) $$

and the dependence of $H^x$ on $x$ is smooth. Hence, replacing $x$ by $W_{s_1}$, we have

$$ X = h(W_{s_1}) + \int_{s_1}^1 H_s\,dW_s, \qquad\text{where } H_s = H^{W_{s_1}}_s. $$

We also have (by Proposition 6.1, applied on $[0, s_1]$)

$$ h(W_{s_1}) = \mathbb E h(W_{s_1}) + \int_0^{s_1} H_s\,dW_s. $$

It is clear that $\mathbb E h(W_{s_1}) = \mathbb E X$, hence

$$ X = \mathbb E X + \int_0^1 H_s\,dW_s. $$

We now give a general formula for the integrand in the martingale representation theorem for a wide class of random variables. Recall the definition of the Cameron–Martin space

$$ \mathbf H = \bigl\{h \in C[0,1] : h(0) = 0,\ \dot h \in L^2[0,1]\bigr\} $$

and the Cameron–Martin norm $|h|_{\mathbf H}^2 = \int_0^1 \dot h_s^2\,ds$. A function $F : C[0,1] \to \mathbb R$ is called a cylinder function if it has the form

$$ F(w) = f(w_{s_1}, \ldots, w_{s_l}), $$

where $0 < s_1 < \cdots < s_l \le 1$ and $f : \mathbb R^l \to \mathbb R$ is a bounded smooth function. We also assume that $f$ has bounded first derivatives and use the notation $F_{x_i}(w) = f_{x_i}(w_{s_1}, \ldots, w_{s_l})$. For a cylinder function $F$ and $h \in W(\mathbb R)$, the directional derivative along $h$ is defined by

$$ D_h F(w) = \lim_{t\to 0}\frac{F(w + th) - F(w)}{t}. $$

If $h \in \mathbf H$, then it is easy to verify that

$$ D_h F(w) = \langle DF(w), h\rangle_{\mathbf H}, \qquad\text{where}\qquad DF(w)_s = \sum_{i=1}^l \min\{s, s_i\}\,F_{x_i}(w). $$

Note that the derivative

$$ D_s F(w) = \frac{d}{ds}\{DF(w)_s\} $$

is given by

$$ D_s F(w) = \sum_{i=1}^l F_{x_i}(w)\,I_{[0,s_i]}(s). $$

We have the following integration by parts formula.

THEOREM 6.3. Let $H : \Omega \to \mathbf H$ be such that $\dot H$ is $\mathcal F_*$-adapted and $\mathbb E\exp\{|H|_{\mathbf H}^2/2\}$ is finite. Then the following integration by parts formula holds for every cylinder function $F$:

$$ \mathbb E\,D_H F = \mathbb E\,\langle DF, H\rangle_{\mathbf H} = \mathbb E\Bigl[F\int_0^1 \dot H_s\,dW_s\Bigr]. $$

PROOF. By Girsanov's theorem, under the probability $Q_t$ defined by

$$ \frac{dQ_t}{dP} = \exp\Bigl\{t\int_0^1 \dot H_s\,dW_s - \frac{t^2}{2}\int_0^1 \dot H_s^2\,ds\Bigr\}, $$

the process $X_s = W_s - tH_s$, $0 \le s \le 1$, is a Brownian motion, hence $\mathbb E_{Q_t} F(X) = \mathbb E\,F(W)$. This means that

$$ \mathbb E_{Q_t} F(X) = \mathbb E\Bigl[F(W - tH)\exp\Bigl\{t\int_0^1 \dot H_s\,dW_s - \frac{t^2}{2}\int_0^1 \dot H_s^2\,ds\Bigr\}\Bigr] $$

is independent of $t$. Differentiating with respect to $t$ and setting $t = 0$, we obtain the integration by parts formula.

The explicit martingale representation theorem is given by the following Clark–Ocone formula.

THEOREM 6.4. Let $F$ be a cylinder function on the path space $W(\mathbb R)$. Then

$$ F(W) = \mathbb E F(W) + \int_0^1 \mathbb E\{D_s F(W) \mid \mathcal F_s\}\,dW_s. $$

PROOF. By the martingale representation theorem we have

$$ F(W) = \mathbb E F(W) + \int_0^1 \dot H_s\,dW_s, $$

where $\dot H$ is $\mathcal F_*$-adapted. Let $G$ be as in Theorem 6.3, so that $\dot G$ is an arbitrary $\mathcal F_*$-adapted process. By definition, $\mathbb E\,D_G F = \mathbb E\,\langle DF, G\rangle_{\mathbf H}$. By the integration by parts formula, the left side is

$$ \mathbb E\,D_G F = \mathbb E\Bigl[F\int_0^1 \dot G_s\,dW_s\Bigr] = \mathbb E\Bigl[\int_0^1 \dot H_s\,dW_s\int_0^1 \dot G_s\,dW_s\Bigr] = \mathbb E\int_0^1 \dot H_s\dot G_s\,ds. $$

The right side is

$$ \mathbb E\,\langle DF, G\rangle_{\mathbf H} = \mathbb E\int_0^1 \dot G_s\,D_s F\,ds = \mathbb E\int_0^1 \mathbb E\{D_s F \mid \mathcal F_s\}\,\dot G_s\,ds. $$

Hence for every $\mathcal F_*$-adapted process $\dot G$ we have

$$ \mathbb E\int_0^1 \mathbb E\{D_s F \mid \mathcal F_s\}\,\dot G_s\,ds = \mathbb E\int_0^1 \dot H_s\dot G_s\,ds. $$

Since $\mathbb E\{D_s F \mid \mathcal F_s\} - \dot H_s$ is also adapted, we must have $\dot H_s = \mathbb E\{D_s F \mid \mathcal F_s\}$.

7. Reflecting Brownian motion

Let $B$ be a one-dimensional Brownian motion starting from zero. The process $X_t = |B_t|$ is called reflecting Brownian motion. We have $X_t = F(B_t)$, where $F(x) = |x|$. We want to apply Itô's formula to $F(B_t)$, but unfortunately $F$ is not $C^2$. We approximate $F$ by

$$ F_\epsilon(x) = \int_0^x du_1\int_0^{u_1}\frac1\epsilon\,I_{[-\epsilon,\epsilon]}(u_2)\,du_2. $$

This function is still not $C^2$, because $F_\epsilon'' = I_{[-\epsilon,\epsilon]}/\epsilon$ is not continuous, but it is clear that $F_\epsilon(x) \to |x|$ and $F_\epsilon'(x) \to \mathrm{sgn}(x)$ as $\epsilon \to 0$. Now let $\phi$ be a continuous function and define

$$ F_\phi(x) = \int_0^x du_1\int_0^{u_1}\phi(u_2)\,du_2. $$

Itô's formula can be applied to $F_\phi(B_t)$, and we obtain

$$ F_\phi(B_t) = \int_0^t F_\phi'(B_s)\,dB_s + \frac12\int_0^t \phi(B_s)\,ds. $$

Now, for a fixed $\epsilon$, we let $\phi$ in the above formula be the continuous function

$$ \phi_n(x) = \begin{cases} \epsilon^{-1}, & |x| \le \epsilon,\\ 0, & |x| \ge \epsilon + n^{-1},\\ \text{linear}, & \text{in the two remaining intervals.} \end{cases} $$

Then $F_{\phi_n} \to F_\epsilon$, $F_{\phi_n}' \to F_\epsilon'$, and $\phi_n \to F_\epsilon''$ almost everywhere as $n \to \infty$. It follows that

$$ F_\epsilon(B_t) = \int_0^t F_\epsilon'(B_s)\,dB_s + \frac1{2\epsilon}\int_0^t I_{[-\epsilon,\epsilon]}(B_s)\,ds. $$

In the last term,

$$ \int_0^t I_{[-\epsilon,\epsilon]}(B_s)\,ds $$

is the amount of time the Brownian path spends in the interval $[-\epsilon, \epsilon]$ up to time $t$. Now let $\epsilon \to 0$ in the above identity for $F_\epsilon(B_t)$.

The left side converges to $|B_t|$ and the first term on the right side converges to the stochastic integral $\int_0^t \mathrm{sgn}(B_s)\,dB_s$. Hence the limit

$$ L_t = \lim_{\epsilon\to 0}\frac1\epsilon\int_0^t I_{(-\epsilon,\epsilon)}(B_s)\,ds $$

must exist, and we have

$$ |B_t| = \int_0^t \mathrm{sgn}(B_s)\,dB_s + \frac12 L_t. $$

We see that $L_t$ can be interpreted as the properly normalized amount of time Brownian motion spends in the interval $(-\epsilon, \epsilon)$. It is called the local time of Brownian motion $B$ at $x = 0$. Let

$$ W_t = \int_0^t \mathrm{sgn}(B_s)\,dB_s. $$

Then $W$ is a continuous martingale with quadratic variation process

$$ \langle W, W\rangle_t = \int_0^t \mathrm{sgn}(B_s)^2\,ds = t. $$

Note that Brownian motion spends zero amount of time at $x = 0$, because $\mathbb E\,I_{\{0\}}(B_s) = P\{B_s = 0\} = 0$ and

$$ \mathbb E\int_0^t I_{\{0\}}(B_s)\,ds = \int_0^t \mathbb E\,I_{\{0\}}(B_s)\,ds = 0. $$

We thus conclude that reflecting Brownian motion $|B_t|$ is a submartingale with the decomposition

$$ |B_t| = W_t + \frac12 L_t. $$

It is interesting to note that $W$ can be expressed in terms of the reflecting Brownian motion by

$$ W_t = X_t - \frac12 L_t, \qquad\text{where}\qquad (7.1)\quad L_t = \lim_{\epsilon\to 0}\frac1\epsilon\int_0^t I_{[-\epsilon,\epsilon]}(X_s)\,ds. $$
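The occupation-time description (7.1) and the identity $L_t = 2\bigl(|B_t| - \int_0^t \mathrm{sgn}(B_s)\,dB_s\bigr)$ obtained above can be compared on simulated paths. The sketch below (a crude illustration with a fixed small $\epsilon$, not part of the notes) shows that the two estimates have the same mean and agree path by path up to discretization error.

```python
import numpy as np

rng = np.random.default_rng(7)
n_paths, n_steps, T, eps = 500, 20000, 1.0, 0.02
dt = T / n_steps

dB = rng.normal(0.0, np.sqrt(dt), size=(n_paths, n_steps))
B = np.hstack([np.zeros((n_paths, 1)), np.cumsum(dB, axis=1)])

# occupation-time estimate (7.1): L_T ~ (1/eps) * time spent in (-eps, eps)
L_occupation = (np.abs(B[:, :-1]) < eps).sum(axis=1) * dt / eps

# estimate from the decomposition: L_T = 2(|B_T| - int_0^T sgn(B_s) dB_s)
L_tanaka = 2.0 * (np.abs(B[:, -1]) - (np.sign(B[:, :-1]) * dB).sum(axis=1))

print("mean occupation-time estimate:", L_occupation.mean())
print("mean stochastic-integral estimate:", L_tanaka.mean())
print("mean |difference| per path:", np.abs(L_occupation - L_tanaka).mean())
```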

We now pose the question: can $X$ and $L$ be expressed in terms of $W$? That the answer to this question is affirmative is the content of the so-called Skorokhod problem.

DEFINITION 7.1. Given a continuous path $f : \mathbb R_+ \to \mathbb R$ with $f(0) \ge 0$, a pair of functions $(g, h)$ is a solution of the Skorokhod problem if (1) $g(t) \ge 0$ for all $t \ge 0$; (2) $h$ is increasing, $h(0) = 0$, and $h$ increases only when $g = 0$; (3) $g = f + h$.

The main result is that the Skorokhod problem can be solved uniquely and explicitly.

THEOREM 7.2. There exists a unique solution to the Skorokhod problem.

PROOF. It is interesting that the solution can be written down explicitly:

$$ h(t) = -\min_{s\le t} f(s), \qquad g(t) = f(t) - \min_{s\le t} f(s). $$

Let us assume that $f(0) = 0$ for simplicity. If $f(0) > 0$, then $h(t) = 0$ and $g(t) = f(t)$ before the first time $f$ reaches $0$, and after this time it is as if the path starts from $0$; the explicit solution in this case is

$$ h(t) = \max\Bigl\{0,\ -\min_{s\le t} f(s)\Bigr\}, \qquad g(t) = f(t) + \max\Bigl\{0,\ -\min_{s\le t} f(s)\Bigr\}. $$

It is clear that $g(t) \ge 0$ for all $t \ge 0$ and that $h$ is increasing with $h(0) = 0$. The equation $g = f + h$ is also obvious. We only need to show that $h$ increases only when $g = 0$. This means that, as a Borel measure, $dh$ only charges the zero set $\{t : g(t) = 0\}$. This requirement is often written as

$$ h(t) = \int_0^t I_{\{0\}}(g(s))\,dh(s). $$

Equivalently, it is enough to show that for any $t$ such that $g(t) > 0$ there is a neighborhood $(t-\delta, t+\delta)$ of $t$ on which $h$ is constant. This should be clear, for if $g(t) > 0$, then $f(t) > \min_{s\le t} f(s)$, which means that the minimum must be achieved at a point $\xi \in [0, t)$ with $f(t) > f(\xi)$. By continuity, a small change of $t$ will not alter this situation, which means that $h = -f(\xi)$ in a neighborhood of $t$. More precisely, from $g(t) = f(t) - \min_{s\le t} f(s) > 0$ and the continuity of $f$, there is a positive $\delta$ such that

$$ \min_{t-\delta\le s\le t+\delta} f(s) > \min_{s\le t-\delta} f(s). $$

Hence

$$ \min_{s\le t+\delta} f(s) = \min\Bigl\{\min_{s\le t-\delta} f(s),\ \min_{t-\delta\le s\le t+\delta} f(s)\Bigr\} = \min_{s\le t-\delta} f(s). $$

This means that $h(t+\delta) = h(t-\delta)$, hence $h$ must be constant on $(t-\delta, t+\delta)$ because $h$ is increasing.

We now show that the solution to the Skorokhod problem is unique. Suppose that $(g, h)$ and $(g_1, h_1)$ are two solutions, and let $\xi = h - h_1$. It is continuous and of bounded variation, hence

$$ \xi(t)^2 = 2\int_0^t \xi(s)\,d\xi(s). $$

On the other hand, $\xi(s) = g(s) - g_1(s)$, hence

$$ \xi(t)^2 = 2\int_0^t \{g(s) - g_1(s)\}\,d\{h(s) - h_1(s)\}. $$

There are four terms on the right side: $\int_0^t g(s)\,dh(s) = \int_0^t g_1(s)\,dh_1(s) = 0$, because $h$ increases only when $g = 0$ and $h_1$ increases only when $g_1 = 0$; and $\int_0^t g(s)\,dh_1(s) \ge 0$ and $\int_0^t g_1(s)\,dh(s) \ge 0$, because $g \ge 0$ and $g_1 \ge 0$. Putting these observations together we have $\xi(t)^2 \le 0$, which means that $\xi(t) = 0$. This proves the uniqueness.
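The explicit solution translates directly into code. The following sketch (not part of the notes) implements the map $f \mapsto (g, h)$ for a sampled path with $f(0) \ge 0$ and checks the three defining properties on the grid, here for a discretized Brownian path.

```python
import numpy as np

def skorokhod(f):
    """Discrete Skorokhod map: for samples of a path f with f[0] >= 0, return (g, h)
    with g = f + h >= 0, h nondecreasing, h[0] = 0, and h increasing only where g = 0."""
    h = np.maximum(0.0, -np.minimum.accumulate(f))    # h(t) = max(0, -min_{s<=t} f(s))
    g = f + h
    return g, h

rng = np.random.default_rng(8)
n_steps = 100000
dt = 1.0 / n_steps
W = np.concatenate([[0.0], np.cumsum(rng.normal(0.0, np.sqrt(dt), n_steps))])

g, h = skorokhod(W)
print("property (1): min g =", g.min())
print("property (2): h nondecreasing =", bool(np.all(np.diff(h) >= 0)), ", h(0) =", h[0])
print("property (3): g = f + h holds =", bool(np.allclose(g, W + h)))
increases = np.diff(h) > 0
print("g at the points where h has just increased (should be 0):",
      g[1:][increases].max() if increases.any() else 0.0)
```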

If we apply the Skorokhod equation to Brownian motion, by replacing $f$ with a Brownian path, we obtain some interesting results. We have shown that $|B_t| = W_t + \tfrac12 L_t$, where $W$ is a Brownian motion. From the solution of the Skorokhod problem we conclude that $|B|$ and $L$ are determined by $W$:

$$ (7.2)\qquad |B_t| = W_t - \min_{s\le t} W_s, \qquad \frac12 L_t = -\min_{s\le t} W_s. $$

THEOREM 7.3. Let $W$ be a Brownian motion. (1) The processes

$$ \Bigl\{\max_{s\le t} W_s - W_t,\ t \ge 0\Bigr\} \qquad\text{and}\qquad \bigl\{|W_t|,\ t \ge 0\bigr\} $$

have the same law, i.e., that of a reflecting Brownian motion. (2) We have

$$ \max_{s\le 1} W_s = \lim_{\epsilon\to 0}\frac{1}{2\epsilon}\int_0^1 I_{[0,\epsilon)}\Bigl(\max_{u\le s} W_u - W_s\Bigr)\,ds. $$

PROOF. The assertions follow immediately from (7.2) by replacing $W$ with $-W$, which is also a Brownian motion; in the second assertion we use the fact that $L$ is the normalized occupation time of the reflecting Brownian motion, see (7.1).

REMARK 7.4. We have calculated the joint distribution of $\max_{s\le t} W_s$ and $W_t$ for a fixed $t$, and we know that $\max_{s\le t} W_s - W_t$ and $|W_t|$ have the same distribution for each fixed $t$. The above theorem claims much more: they have the same distribution as stochastic processes.

8. Brownian bridge

If we condition a Brownian motion to return to a fixed point $x$ at time $t = 1$, we obtain the Brownian bridge from $o$ to $x$ with time horizon 1. Let

$$ L_x(\mathbb R^n) = \{w \in W_o(\mathbb R^n) : w_1 = x\}. $$

The law of the Brownian bridge from $o$ to $x$ in time 1 is a probability measure $\mu^x$ on $L_x(\mathbb R^n)$, which we will call the Wiener measure on $L_x(\mathbb R^n)$. Note that $L_x(\mathbb R^n)$ is a subspace of $W_o(\mathbb R^n)$, thus $\mu^x$ is also a measure on $W_o(\mathbb R^n)$. By definition, we can write intuitively

$$ \mu^x(C) = \mu\{C \mid w_1 = x\}. $$

Here $\mu$ is the Wiener measure on $W_o(\mathbb R^n)$. The meaning of this suggestive formula is as follows. If $F$ is a nice function measurable with respect to $\mathcal F_s$ with $s < 1$ and $f$ a measurable function on $\mathbb R^n$, then

$$ \mathbb E_\mu\{F\,f(W_1)\} = \mathbb E_\mu\bigl\{\mathbb E_{\mu^{W_1}}(F)\,f(W_1)\bigr\}, $$

where $W$ denotes the coordinate process on $W_o(\mathbb R^n)$.

Using the Markov property at time $s$, we have

$$ \mathbb E_\mu\Bigl[F\int_{\mathbb R^n} p(1-s, W_s, y)\,f(y)\,dy\Bigr] = \int_{\mathbb R^n}\mathbb E_{\mu^y}(F)\,p(1, o, y)\,f(y)\,dy, $$

where

$$ p(t, y, x) = \Bigl(\frac{1}{2\pi t}\Bigr)^{n/2} e^{-|y-x|^2/2t} $$

is the transition density function of Brownian motion. This being true for all measurable $f$, we have for all $F \in \mathcal F_s$,

$$ (8.1)\qquad \mathbb E_{\mu^x} F = \mathbb E_\mu\Bigl[\frac{p(1-s, W_s, x)}{p(1, o, x)}\,F\Bigr]. $$

Therefore $\mu^x$ is absolutely continuous with respect to $\mu$ on $\mathcal F_s$ for any $s < 1$, and the Radon–Nikodym density is given by

$$ \frac{d\mu^x}{d\mu}\Big|_{\mathcal F_s}(w) = \frac{p(1-s, w_s, x)}{p(1, o, x)} = e_s. $$

The process $\{e_s,\ 0 \le s < 1\}$ is necessarily a positive (local) martingale under the probability $\mu$. It therefore must have the form of an exponential martingale, which can be found explicitly by computing the differential of $\log e_s$. The density function $p(t, y, x)$ satisfies the heat equation $\partial p/\partial t = \tfrac12\Delta_y p$ in $(t, y)$ for fixed $x$. This equation gives

$$ \frac{\partial}{\partial t}\log p = \frac12\Delta_y\log p + \frac12|\nabla_y\log p|^2. $$

Using this fact and Itô's formula, we find easily that

$$ d\log e_s = \bigl\langle\nabla_y\log p(1-s, w_s, x),\ dw_s\bigr\rangle - \frac12\bigl|\nabla_y\log p(1-s, w_s, x)\bigr|^2\,ds. $$

Hence $e_s$ is an exponential martingale of the form

$$ \frac{d\mu^x}{d\mu}\Big|_{\mathcal F_s} = \exp\Bigl\{\int_0^s\langle V_u, dw_u\rangle - \frac12\int_0^s |V_u|^2\,du\Bigr\}, \qquad\text{where}\qquad V_s = \nabla_y\log p(1-s, w_s, x). $$

By Girsanov's theorem, under the probability $\mu^x$, the process

$$ B_s = W_s - \int_0^s \nabla_y\log p(1-\tau, W_\tau, x)\,d\tau, \qquad 0 \le s < 1, $$

is a Brownian motion. The explicit formula for $p(t, y, x)$ gives

$$ \nabla_y\log p(1-\tau, y, x) = \frac{x - y}{1-\tau}. $$

Under the probability $\mu^x$ the coordinate process $W$ is a Brownian bridge from $o$ to $x$ in time 1. Therefore we have shown that the Brownian bridge is the solution of the stochastic differential equation

$$ dW_s = dB_s + \frac{x - W_s}{1-s}\,ds. $$

This simple equation can be solved explicitly. From the equation we have

$$ d(W_s - x) + \frac{W_s - x}{1-s}\,ds = dB_s. $$

The left side, after dividing by $1-s$, is the differential of $(W_s - x)/(1-s)$, hence

$$ W_s = sx + (1-s)\int_0^s\frac{dB_u}{1-u}. $$

This formula shows that, like Brownian motion itself, the Brownian bridge is a Gaussian process.

The term Brownian bridge is often reserved specifically for the Brownian bridge which returns to its starting point at the terminal time. In this case we have

$$ dW_s = dB_s - \frac{W_s}{1-s}\,ds \qquad\text{and}\qquad W_s = (1-s)\int_0^s\frac{dB_u}{1-u}. $$

For dimension $n = 1$ it is easy to verify that the covariance function is given by

$$ \mathbb E\{W_s W_t\} = \min\{s, t\} - st. $$

Using this fact we obtain another representation of the Brownian bridge.

THEOREM 8.1. If $\{B_s\}$ is a Brownian motion, then the process $W_s = B_s - sB_1$ is a Brownian bridge.

PROOF. ($n = 1$) Verify directly that $W$ defined above has the correct covariance function.
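The two descriptions of the standard bridge — the explicit representation $W_s = B_s - sB_1$ and the stochastic differential equation $dW_s = dB_s - W_s/(1-s)\,ds$ — can be compared numerically. The sketch below (an illustration, not part of the notes) checks the covariance $\min\{s,t\} - st$ for both constructions and that both return to $0$ at time 1.

```python
import numpy as np

rng = np.random.default_rng(9)
n_paths, n_steps = 10000, 1000
dt = 1.0 / n_steps
s_grid = np.linspace(0.0, 1.0, n_steps + 1)

dB = rng.normal(0.0, np.sqrt(dt), size=(n_paths, n_steps))
B = np.hstack([np.zeros((n_paths, 1)), np.cumsum(dB, axis=1)])

# representation 1: W_s = B_s - s B_1
W1 = B - s_grid * B[:, [-1]]

# representation 2: Euler scheme for dW_s = dB_s - W_s/(1-s) ds
W2 = np.zeros((n_paths, n_steps + 1))
for i in range(n_steps):
    W2[:, i + 1] = W2[:, i] - W2[:, i] / (1.0 - s_grid[i]) * dt + dB[:, i]

i, j = int(0.3 * n_steps), int(0.7 * n_steps)
print("target covariance min(s,t) - st =", 0.3 - 0.3 * 0.7)
print("covariance of B_s - s B_1       =", (W1[:, i] * W1[:, j]).mean())
print("covariance of the SDE solution  =", (W2[:, i] * W2[:, j]).mean())
print("terminal values:", np.abs(W1[:, -1]).max(), np.abs(W2[:, -1]).mean())
```

Note that the two constructions are driven by the same noise but are different processes pathwise; only their laws, and hence their covariances, coincide.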

The following heuristic discussion of $W_s = B_s - sB_1$ is perhaps more instructive. Let $\mathcal F_*$ be the filtration of the Brownian motion $B$. We enlarge the filtration to $\mathcal G_s = \sigma\{\mathcal F_s, B_1\}$. We compute the Doob–Meyer decomposition of $W$ with respect to $\mathcal G_*$. Of course the martingale part is a Brownian motion, because its quadratic variation process is the same as that of $W$. Denote this Brownian motion by $\Omega$. Doob's explicit decomposition formula for a semimartingale suggests that

$$ W_s = \Omega_s + \int_0^s \mathbb E\{dW_u \mid \mathcal G_u\}. $$

We have $dW_s = dB_s - B_1\,ds$. The differential $dW_s = W_{s+ds} - W_s$ is a forward differential. We need to project $dW_s$ onto the $L^2$-space generated by $\mathcal G_s = \sigma\{B_u, u \le s;\ B_1\} = \sigma\{B_u, u \le s;\ B_1 - B_s\}$. Note that $\sigma\{B_u, u \le s\}$ is orthogonal to $B_1 - B_s$. The differential $dB_s$ is orthogonal to the first part, and its projection onto the second part is

$$ \frac{\mathbb E[dB_s\,(B_1 - B_s)]}{\mathbb E[(B_1 - B_s)^2]}\,(B_1 - B_s) = \frac{B_1 - B_s}{1-s}\,ds. $$

The differential $B_1\,ds$ is of course in the target space already, hence

$$ \mathbb E\{dW_s \mid \mathcal G_s\} = \frac{B_1 - B_s}{1-s}\,ds - B_1\,ds = -\frac{B_s - sB_1}{1-s}\,ds = -\frac{W_s}{1-s}\,ds. $$

It follows that

$$ W_s = \Omega_s - \int_0^s\frac{W_\tau}{1-\tau}\,d\tau, $$

which is exactly the stochastic differential equation for a Brownian bridge.

9. Fourth assignment

EXERCISE 4.1. Let $\phi$ be a strictly convex function. If both $N$ and $\phi(N)$ are continuous local martingales, then $N$ is trivial, i.e., there is a constant $C$ such that $N_t = C$ with probability 1 for all $t \ge 0$.

EXERCISE 4.2. Let $M$ be a continuous local martingale. Show that there is a sequence of refining partitions $\Delta_1 \subseteq \Delta_2 \subseteq \Delta_3 \subseteq \cdots$ of $\mathbb R_+$ with $|\Delta_n| \to 0$ such that with probability 1 the following holds: for all $t \ge 0$,

$$ \lim_{n\to\infty}\sum_i\bigl(M_{t^n_i\wedge t} - M_{t^n_{i-1}\wedge t}\bigr)^2 = \langle M, M\rangle_t. $$

EXERCISE 4.3. Let $B$ be the standard Brownian motion. Then the reflecting Brownian motion $X_t = |B_t|$ is a Markov process. This means

$$ P\bigl\{X_{t+s} \in C \mid \mathcal F^X_s\bigr\} = P\{X_{t+s} \in C \mid X_s\}. $$

What is its transition density function

$$ q(t, x, y) = \frac{P\{X_{t+s} \in dy \mid X_s = x\}}{dy}\,? $$

EXERCISE 4.4. Let $L_t$ be the local time of Brownian motion at $x = 0$. Show that

$$ \mathbb E\,L_t = \sqrt{\frac{8t}{\pi}}. $$

EXERCISE 4.5. Show that the Brownian bridge from $o$ to $x$ in time 1 is a Markov process whose transition density is

$$ q(s_1, y;\ s_2, z) = \frac{p(s_2 - s_1, y, z)\,p(1 - s_2, z, x)}{p(1 - s_1, y, x)}. $$

EXERCISE 4.6. Show that the finite-dimensional marginal density for the Brownian bridge from $o$ to $x$ in time 1 is

$$ p(1, o, x)^{-1}\prod_{i=0}^{l} p(s_{i+1} - s_i, x_i, x_{i+1}). $$

Convention: $x_0 = o$, $s_0 = 0$.

EXERCISE 4.7. Let $\{W_s,\ 0 \le s \le 1\}$ be a Brownian bridge at $o$. Then the reversed process $\{W_{1-s},\ 0 \le s \le 1\}$ is also a Brownian bridge.

EXERCISE 4.8. For an $a > 0$ define the stopping time

$$ \sigma_a = \inf\{t : B_t + t = a\}. $$

Show that for $\mu \ge 0$,

$$ \mathbb E\,e^{-\mu\sigma_a} = e^{(1 - \sqrt{1+2\mu})\,a}. $$

EXERCISE 4.9. Use the result of the previous exercise to show that $\mathbb E\,e^{\mu\sigma_a}$ is infinite if $\mu > 1/2$.

EXERCISE 4.10. By the martingale representation theorem there is a process $H$ such that

$$ W_1^3 = \int_0^1 H_s\,dW_s. $$

Find an explicit expression for $H$.


Notions such as convergent sequence and Cauchy sequence make sense for any metric space. Convergent Sequences are Cauchy Banach Spaces These notes provide an introduction to Banach spaces, which are complete normed vector spaces. For the purposes of these notes, all vector spaces are assumed to be over the real numbers.

More information

OPTIMAL SOLUTIONS TO STOCHASTIC DIFFERENTIAL INCLUSIONS

OPTIMAL SOLUTIONS TO STOCHASTIC DIFFERENTIAL INCLUSIONS APPLICATIONES MATHEMATICAE 29,4 (22), pp. 387 398 Mariusz Michta (Zielona Góra) OPTIMAL SOLUTIONS TO STOCHASTIC DIFFERENTIAL INCLUSIONS Abstract. A martingale problem approach is used first to analyze

More information

Product measure and Fubini s theorem

Product measure and Fubini s theorem Chapter 7 Product measure and Fubini s theorem This is based on [Billingsley, Section 18]. 1. Product spaces Suppose (Ω 1, F 1 ) and (Ω 2, F 2 ) are two probability spaces. In a product space Ω = Ω 1 Ω

More information

Bernardo D Auria Stochastic Processes /12. Notes. March 29 th, 2012

Bernardo D Auria Stochastic Processes /12. Notes. March 29 th, 2012 1 Stochastic Calculus Notes March 9 th, 1 In 19, Bachelier proposed for the Paris stock exchange a model for the fluctuations affecting the price X(t) of an asset that was given by the Brownian motion.

More information

Filtrations, Markov Processes and Martingales. Lectures on Lévy Processes and Stochastic Calculus, Braunschweig, Lecture 3: The Lévy-Itô Decomposition

Filtrations, Markov Processes and Martingales. Lectures on Lévy Processes and Stochastic Calculus, Braunschweig, Lecture 3: The Lévy-Itô Decomposition Filtrations, Markov Processes and Martingales Lectures on Lévy Processes and Stochastic Calculus, Braunschweig, Lecture 3: The Lévy-Itô Decomposition David pplebaum Probability and Statistics Department,

More information

Brownian Motion and Stochastic Calculus

Brownian Motion and Stochastic Calculus ETHZ, Spring 17 D-MATH Prof Dr Martin Larsson Coordinator A Sepúlveda Brownian Motion and Stochastic Calculus Exercise sheet 6 Please hand in your solutions during exercise class or in your assistant s

More information

Exponential martingales: uniform integrability results and applications to point processes

Exponential martingales: uniform integrability results and applications to point processes Exponential martingales: uniform integrability results and applications to point processes Alexander Sokol Department of Mathematical Sciences, University of Copenhagen 26 September, 2012 1 / 39 Agenda

More information

5 Measure theory II. (or. lim. Prove the proposition. 5. For fixed F A and φ M define the restriction of φ on F by writing.

5 Measure theory II. (or. lim. Prove the proposition. 5. For fixed F A and φ M define the restriction of φ on F by writing. 5 Measure theory II 1. Charges (signed measures). Let (Ω, A) be a σ -algebra. A map φ: A R is called a charge, (or signed measure or σ -additive set function) if φ = φ(a j ) (5.1) A j for any disjoint

More information

Selected Exercises on Expectations and Some Probability Inequalities

Selected Exercises on Expectations and Some Probability Inequalities Selected Exercises on Expectations and Some Probability Inequalities # If E(X 2 ) = and E X a > 0, then P( X λa) ( λ) 2 a 2 for 0 < λ

More information

An Introduction to Malliavin calculus and its applications

An Introduction to Malliavin calculus and its applications An Introduction to Malliavin calculus and its applications Lecture 3: Clark-Ocone formula David Nualart Department of Mathematics Kansas University University of Wyoming Summer School 214 David Nualart

More information

Exercises Measure Theoretic Probability

Exercises Measure Theoretic Probability Exercises Measure Theoretic Probability Chapter 1 1. Prove the folloing statements. (a) The intersection of an arbitrary family of d-systems is again a d- system. (b) The intersection of an arbitrary family

More information

16.1. Signal Process Observation Process The Filtering Problem Change of Measure

16.1. Signal Process Observation Process The Filtering Problem Change of Measure 1. Introduction The following notes aim to provide a very informal introduction to Stochastic Calculus, and especially to the Itô integral. They owe a great deal to Dan Crisan s Stochastic Calculus and

More information

EULER MARUYAMA APPROXIMATION FOR SDES WITH JUMPS AND NON-LIPSCHITZ COEFFICIENTS

EULER MARUYAMA APPROXIMATION FOR SDES WITH JUMPS AND NON-LIPSCHITZ COEFFICIENTS Qiao, H. Osaka J. Math. 51 (14), 47 66 EULER MARUYAMA APPROXIMATION FOR SDES WITH JUMPS AND NON-LIPSCHITZ COEFFICIENTS HUIJIE QIAO (Received May 6, 11, revised May 1, 1) Abstract In this paper we show

More information

Gaussian Processes. 1. Basic Notions

Gaussian Processes. 1. Basic Notions Gaussian Processes 1. Basic Notions Let T be a set, and X : {X } T a stochastic process, defined on a suitable probability space (Ω P), that is indexed by T. Definition 1.1. We say that X is a Gaussian

More information

Introduction to Random Diffusions

Introduction to Random Diffusions Introduction to Random Diffusions The main reason to study random diffusions is that this class of processes combines two key features of modern probability theory. On the one hand they are semi-martingales

More information

Information and Credit Risk

Information and Credit Risk Information and Credit Risk M. L. Bedini Université de Bretagne Occidentale, Brest - Friedrich Schiller Universität, Jena Jena, March 2011 M. L. Bedini (Université de Bretagne Occidentale, Brest Information

More information

L p Functions. Given a measure space (X, µ) and a real number p [1, ), recall that the L p -norm of a measurable function f : X R is defined by

L p Functions. Given a measure space (X, µ) and a real number p [1, ), recall that the L p -norm of a measurable function f : X R is defined by L p Functions Given a measure space (, µ) and a real number p [, ), recall that the L p -norm of a measurable function f : R is defined by f p = ( ) /p f p dµ Note that the L p -norm of a function f may

More information

Stochastic Calculus (Lecture #3)

Stochastic Calculus (Lecture #3) Stochastic Calculus (Lecture #3) Siegfried Hörmann Université libre de Bruxelles (ULB) Spring 2014 Outline of the course 1. Stochastic processes in continuous time. 2. Brownian motion. 3. Itô integral:

More information

9 Brownian Motion: Construction

9 Brownian Motion: Construction 9 Brownian Motion: Construction 9.1 Definition and Heuristics The central limit theorem states that the standard Gaussian distribution arises as the weak limit of the rescaled partial sums S n / p n of

More information

Chapter 1. Poisson processes. 1.1 Definitions

Chapter 1. Poisson processes. 1.1 Definitions Chapter 1 Poisson processes 1.1 Definitions Let (, F, P) be a probability space. A filtration is a collection of -fields F t contained in F such that F s F t whenever s

More information

Stochastic Processes. Winter Term Paolo Di Tella Technische Universität Dresden Institut für Stochastik

Stochastic Processes. Winter Term Paolo Di Tella Technische Universität Dresden Institut für Stochastik Stochastic Processes Winter Term 2016-2017 Paolo Di Tella Technische Universität Dresden Institut für Stochastik Contents 1 Preliminaries 5 1.1 Uniform integrability.............................. 5 1.2

More information

Bernardo D Auria Stochastic Processes /10. Notes. Abril 13 th, 2010

Bernardo D Auria Stochastic Processes /10. Notes. Abril 13 th, 2010 1 Stochastic Calculus Notes Abril 13 th, 1 As we have seen in previous lessons, the stochastic integral with respect to the Brownian motion shows a behavior different from the classical Riemann-Stieltjes

More information

Stochastic Integration.

Stochastic Integration. Chapter Stochastic Integration..1 Brownian Motion as a Martingale P is the Wiener measure on (Ω, B) where Ω = C, T B is the Borel σ-field on Ω. In addition we denote by B t the σ-field generated by x(s)

More information

Fundamental Inequalities, Convergence and the Optional Stopping Theorem for Continuous-Time Martingales

Fundamental Inequalities, Convergence and the Optional Stopping Theorem for Continuous-Time Martingales Fundamental Inequalities, Convergence and the Optional Stopping Theorem for Continuous-Time Martingales Prakash Balachandran Department of Mathematics Duke University April 2, 2008 1 Review of Discrete-Time

More information

MA8109 Stochastic Processes in Systems Theory Autumn 2013

MA8109 Stochastic Processes in Systems Theory Autumn 2013 Norwegian University of Science and Technology Department of Mathematical Sciences MA819 Stochastic Processes in Systems Theory Autumn 213 1 MA819 Exam 23, problem 3b This is a linear equation of the form

More information

Universal examples. Chapter The Bernoulli process

Universal examples. Chapter The Bernoulli process Chapter 1 Universal examples 1.1 The Bernoulli process First description: Bernoulli random variables Y i for i = 1, 2, 3,... independent with P [Y i = 1] = p and P [Y i = ] = 1 p. Second description: Binomial

More information

REAL AND COMPLEX ANALYSIS

REAL AND COMPLEX ANALYSIS REAL AND COMPLE ANALYSIS Third Edition Walter Rudin Professor of Mathematics University of Wisconsin, Madison Version 1.1 No rights reserved. Any part of this work can be reproduced or transmitted in any

More information

Brownian Motion. An Undergraduate Introduction to Financial Mathematics. J. Robert Buchanan. J. Robert Buchanan Brownian Motion

Brownian Motion. An Undergraduate Introduction to Financial Mathematics. J. Robert Buchanan. J. Robert Buchanan Brownian Motion Brownian Motion An Undergraduate Introduction to Financial Mathematics J. Robert Buchanan 2010 Background We have already seen that the limiting behavior of a discrete random walk yields a derivation of

More information

δ xj β n = 1 n Theorem 1.1. The sequence {P n } satisfies a large deviation principle on M(X) with the rate function I(β) given by

δ xj β n = 1 n Theorem 1.1. The sequence {P n } satisfies a large deviation principle on M(X) with the rate function I(β) given by . Sanov s Theorem Here we consider a sequence of i.i.d. random variables with values in some complete separable metric space X with a common distribution α. Then the sample distribution β n = n maps X

More information