Math 288 Probability Theory and Stochastic Process


Math 288 Probability Theory and Stochastic Process

Taught by Horng-Tzer Yau
Notes by Dongryul Kim
Spring 2017

This course was taught by Horng-Tzer Yau. The lectures were given at MWF 12-1 in Science Center 310. The textbook was Brownian Motion, Martingales, and Stochastic Calculus by Jean-François Le Gall. There were 11 undergraduates and 12 graduate students enrolled. There were 5 problem sets and the course assistant was Robert Martinez.

Contents

1 January 23, 2017: Gaussians; Gaussian vectors
2 January 25, 2017: Gaussian processes; Conditional expectation for Gaussian vectors
3 January 27, 2017: Gaussian white noise; Pre-Brownian motion
4 January 30, 2017: Kolmogorov's theorem
5 February 1, 2017: Construction of a Brownian motion; Wiener measure
6 February 3, 2017: Blumenthal's zero-one law; Strong Markov property

Last Update: August 27, 2018

7 February 6, 2017: Reflection principle; Filtrations
8 February 8, 2017: Stopping times of filtrations; Martingales
9 February 10, 2017: Discrete martingales: Doob's inequality
10 February 13, 2017: More discrete martingales: the upcrossing inequality
11 February 15, 2017: Lévy's theorem; Optional stopping for continuous martingales
12 February 17, 2017
13 February 22, 2017: Submartingale decomposition
14 February 24, 2017: Backward martingales; Finite variation processes
15 February 27, 2017: Local martingales
16 March 1, 2017: Quadratic variation
17 March 3, 2017: Consequences of the quadratic variation
18 March 6, 2017: Bracket of local martingales; Stochastic integration
19 March 8, 2017: Elementary processes
20 March 20, 2017: Review

21 March 22, 2017: Stochastic integration with respect to a Brownian motion; Stochastic integration with respect to local martingales
22 March 24, 2017: Dominated convergence for semimartingales; Itô's formula
23 March 27, 2017: Applications of Itô calculus
24 March 29, 2017: Burkholder-Davis-Gundy inequality
25 March 31, 2017: Representation of martingales; Girsanov's theorem
26 April 3, 2017: Cameron-Martin formula
27 April 5, 2017: Application to the Dirichlet problem; Stochastic differential equations
28 April 7, 2017: Existence and uniqueness in the Lipschitz case; Kolmogorov forward and backward equations
29 April 10, 2017: Girsanov's formula as an SDE; Ornstein-Uhlenbeck process; Geometric Brownian motion
30 April 12, 2017: Bessel process
31 April 14, 2017: Maximum principle; 2-dimensional Brownian motion
32 April 17, 2017: More on 2-dimensional Brownian motion
33 April 19, 2017: Feynman-Kac formula; Girsanov formula again

34 April 21, 2017: Kipnis-Varadhan cutoff lemma
35 April 24, 2017: Final review

Math 288 Notes

1 January 23, 2017

The textbook is J.-F. Le Gall, Brownian Motion, Martingales, and Stochastic Calculus.

A Brownian motion is a random function X(t) : R₊ → Rⁿ. This is a model for the movement of one particle among many in billiard dynamics: for instance, there are many particles in a box and they bump into each other. Robert Brown gave a description in the 19th century. X(t) is supposed to describe the trajectory of one particle, and it is random. To simplify this, we approximate X(t) by a random walk. For instance, let

S_N = Σ_{i=1}^N X_i,   X_i = +1 with probability 1/2, −1 with probability 1/2.

For non-integer times, you interpolate linearly and let

S_N(t) = Σ_{i=1}^{⌊Nt⌋} X_i + (Nt − ⌊Nt⌋) X_{⌊Nt⌋+1}.

Then E[S_N²] = N. As we let N → ∞, we have three properties:

lim_{N→∞} E[(S_N(t)/√N)²] = t,
lim_{N→∞} E[S_N(t) S_N(s)]/N = t ∧ s = min(t, s),
S_N(t)/√N ⇒ N(0, t).

This is the model we want.

1.1 Gaussians

Definition 1.1. X is a standard Gaussian random variable if

P(X ∈ A) = (1/√(2π)) ∫_A e^{−x²/2} dx

for every Borel set A ⊂ R.

For a complex variable z ∈ C,

E[e^{zX}] = (1/√(2π)) ∫ e^{zx} e^{−x²/2} dx = e^{z²/2}.

This is the moment generating function. When z = iξ, we have

Σ_{k=0}^∞ (−ξ²/2)ᵏ/k! = e^{−ξ²/2} = 1 + iξ E[X] + ((iξ)²/2) E[X²] + ⋯.
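As a quick numerical aside (not from the lecture), the scaling E[S_N(t)²]/N → t can be checked by Monte Carlo. This is a minimal sketch using only the Python standard library; the function names are our own.

```python
import math
import random

def scaled_walk_value(N, t, rng):
    """S_N(t)/sqrt(N) for a +-1 random walk; we evaluate only at times t
    with N*t an integer, so the interpolation term is not needed."""
    steps = int(N * t)
    s = sum(rng.choice((-1, 1)) for _ in range(steps))
    return s / math.sqrt(N)

def empirical_second_moment(N, t, samples=2000, seed=0):
    """Monte Carlo estimate of E[(S_N(t)/sqrt(N))^2], which should be close to t."""
    rng = random.Random(seed)
    return sum(scaled_walk_value(N, t, rng) ** 2 for _ in range(samples)) / samples
```

For example, empirical_second_moment(400, 1.0) comes out close to 1 and empirical_second_moment(400, 0.25) close to 0.25, within Monte Carlo error.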

From this we can read off the moments E[Xⁿ] of the Gaussian. We have

E[X^{2n}] = (2n)!/(2ⁿ n!),   E[X^{2n+1}] = 0.

Given X ∼ N(0, 1), we have Y = σX + m ∼ N(m, σ²). Its characteristic function is

E[e^{iξY}] = e^{imξ − σ²ξ²/2}.

Also, if Y₁ ∼ N(m₁, σ₁²) and Y₂ ∼ N(m₂, σ₂²) and they are independent, then

E[e^{iξ(Y₁+Y₂)}] = E[e^{iξY₁}] E[e^{iξY₂}] = e^{i(m₁+m₂)ξ − (σ₁²+σ₂²)ξ²/2}.

So Y₁ + Y₂ ∼ N(m₁ + m₂, σ₁² + σ₂²).

Proposition 1.2. If X_n ∼ N(m_n, σ_n²) and X_n → X in L², then
(i) X is Gaussian with mean m = lim_n m_n and variance σ² = lim_n σ_n².
(ii) X_n → X in probability and X_n → X in Lᵖ for p ≥ 1.

Proof. (i) By definition, E[(X_n − X)²] → 0 and so |E[X_n] − E[X]| ≤ (E|X_n − X|²)^{1/2} → 0. So E[X] = m. Similarly, because L² is complete, Var X = σ². Now the characteristic function of X is

E[e^{iξX}] = lim_n E[e^{iξX_n}] = lim_n e^{im_nξ − σ_n²ξ²/2} = e^{imξ − σ²ξ²/2}.

So X ∼ N(m, σ²).

(ii) Because E[X_n^{2p}] can be expressed in terms of the mean and variance, we have sup_n E|X_n|^{2p} < ∞ and so sup_n E|X_n − X|^{2p} < ∞. Define Y_n = |X_n − X|ᵖ. Then Y_n is bounded in L² and hence uniformly integrable. Also it converges to 0 in probability. It follows that Y_n → 0 in L¹, which means that X_n → X in Lᵖ. It then follows that X_n → X in probability.

Definition 1.3. We say that (X_n) is uniformly integrable if for every ε > 0 there exists a K such that sup_n E[|X_n| 1_{|X_n|>K}] < ε.

Proposition 1.4. A sequence of random variables X_n converges to X in L¹ if and only if the X_n are uniformly integrable and X_n → X in probability.

1.2 Gaussian vectors

Definition 1.5. A vector X ∈ Rⁿ is a Gaussian vector if for every u ∈ Rⁿ, ⟨u, X⟩ is a Gaussian random variable.

Example 1.6. Define X₁ ∼ N(0, 1) and ε = ±1 with probability 1/2 each, independent of X₁. Let X₂ = εX₁. Then X = (X₁, X₂) is not a Gaussian vector.
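Example 1.6 can be seen numerically: the pair (X₁, X₂) has zero covariance, yet X₁ + X₂ = (1 + ε)X₁ vanishes with probability 1/2, which no genuinely Gaussian coordinate of a Gaussian vector allows. A small simulation sketch (our own, standard library only):

```python
import random

def sample_pair(rng):
    """X1 ~ N(0,1), eps = +-1 fair and independent, X2 = eps * X1."""
    x1 = rng.gauss(0.0, 1.0)
    return x1, rng.choice((-1.0, 1.0)) * x1

def uncorrelated_but_dependent(samples=4000, seed=1):
    """Return (empirical Cov(X1, X2), fraction of samples with X1 + X2 == 0)."""
    rng = random.Random(seed)
    cov, zeros = 0.0, 0
    for _ in range(samples):
        x1, x2 = sample_pair(rng)
        cov += x1 * x2
        if x1 + x2 == 0.0:  # happens exactly when eps = -1
            zeros += 1
    return cov / samples, zeros / samples
```

The covariance estimate is near 0 while roughly half the samples satisfy X₁ + X₂ = 0 exactly, so ⟨(1,1), X⟩ has an atom at 0 and cannot be Gaussian.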

Note that u ↦ E⟨u, X⟩ is a linear map and so E⟨u, X⟩ = ⟨u, m_X⟩ for some m_X ∈ Rⁿ. We call m_X the mean of X. If X = Σ_{i=1}^n X_i e_i for an orthonormal basis e_i of Rⁿ, then the mean of X is

m_X = Σ_{i=1}^n (E[X_i]) e_i.

Likewise, the map u ↦ Var⟨u, X⟩ is a quadratic form and so Var⟨u, X⟩ = ⟨u, q_X u⟩ for some matrix q_X. We call q_X the covariance matrix of X. We write q_X(u) = ⟨u, q_X u⟩. Then the characteristic function of X is

E[e^{i⟨u,X⟩}] = exp(i⟨u, m_X⟩ − q_X(u)/2).

Proposition 1.7. If (Cov(X_j, X_k))_{1≤j,k≤n} is diagonal, and X = (X₁, ..., X_n) is a Gaussian vector, then the X_i are independent.

Proof. First check that

q_X(u) = Σ_{i,j=1}^n u_i u_j Cov(X_i, X_j).

Then, for centered X,

E[exp(i u·X)] = exp(−q_X(u)/2) = exp(−(1/2) Σ_{i=1}^n u_i² Var(X_i)) = Π_{i=1}^n exp(−u_i² Var(X_i)/2).

This gives a full description of X.

Theorem 1.8. Let γ be a positive definite symmetric n × n matrix. Then there exists a Gaussian vector X with Cov X = γ.

Proof. Diagonalize γ and let λ_j be the eigenvalues. Choose an orthonormal basis v₁, ..., v_n of eigenvectors of Rⁿ and let w_j = √λ_j v_j = √γ v_j. Choose independent Y_j ∼ N(0, 1). Then the vector

X = Σ_{j=1}^n √λ_j v_j Y_j

is a Gaussian vector with the right covariance.
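The construction in Theorem 1.8 can be tried out numerically. For simplicity this sketch (ours) uses a Cholesky factor L with LLᵀ = γ in place of the eigenbasis √λ_j v_j; any square root of γ produces the right covariance.

```python
import random

def sample_gaussian_pair(gamma, rng):
    """Sample (X1, X2) with covariance gamma = [[a, b], [b, c]] (positive definite)
    via X = L*Y, where L is the Cholesky factor of gamma and Y is standard Gaussian."""
    a, b, c = gamma[0][0], gamma[0][1], gamma[1][1]
    l11 = a ** 0.5
    l21 = b / l11
    l22 = (c - l21 * l21) ** 0.5
    y1, y2 = rng.gauss(0.0, 1.0), rng.gauss(0.0, 1.0)
    return l11 * y1, l21 * y1 + l22 * y2

def empirical_cov(gamma, samples=5000, seed=2):
    """Empirical second moments (Var X1, Cov(X1, X2), Var X2)."""
    rng = random.Random(seed)
    sxx = sxy = syy = 0.0
    for _ in range(samples):
        x, y = sample_gaussian_pair(gamma, rng)
        sxx += x * x
        sxy += x * y
        syy += y * y
    return sxx / samples, sxy / samples, syy / samples
```

With γ = [[2, 0.8], [0.8, 1]] the empirical moments land near (2, 0.8, 1), as the theorem predicts.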

2 January 25, 2017

We start with a probability space (Ω, F, P) and jointly Gaussian X₁, ..., X_d (a Gaussian vector). We have

Cov(X_j, X_k) = E[X_j X_k] − E[X_j]E[X_k],   γ_X = (Cov(X_j, X_k))_{jk} = C.

Then, for centered X with C > 0, the probability distribution is given by

p_X(x) = (1/((2π)^{d/2} √(det C))) e^{−⟨x, C⁻¹x⟩/2}

and the characteristic function is

E[e^{i⟨t,X⟩}] = e^{−⟨t, Ct⟩/2}.

2.1 Gaussian processes

A (centered) Gaussian space is a closed subspace of L²(Ω, F, P) containing only Gaussian random variables.

Definition 2.1. A Gaussian process indexed by T (⊂ R₊) is one such that (X_{t₁}, ..., X_{t_k}) is always a Gaussian vector.

Definition 2.2. For a collection S of random variables, we denote by σ(S) the smallest σ-algebra on Ω such that all variables in S are measurable.

Proposition 2.3. Suppose H₁, H₂ ⊂ H where H is a centered Gaussian space. Then H₁ ⊥ H₂ if and only if H₁ and H₂ are independent, i.e., the generated σ-algebras σ(H₁) and σ(H₂) are independent.

Example 2.4. If we take X₁ = X to be a Gaussian and X₂ = εX where ε = ±1 with probability 1/2 each, independent of X, then E[X₁X₂] = E[ε]E[X²] = 0. That is, X₁ and X₂ are uncorrelated. But X₁ and X₂ are dependent. This is because (X₁, X₂) is not jointly Gaussian.

2.2 Conditional expectation for Gaussian vectors

Let X be a random variable in a Gaussian space H. If H is finite dimensional with (jointly Gaussian) basis X₁, ..., X_d, then we may write X = Σ_j a_j X_j. Let K ⊂ H be a closed subspace. We want to look at the conditional expectation E[X | σ(K)] = E[X | K], where σ(K) is the σ-algebra generated by elements of K.

Definition 2.5. The conditional expectation is defined as E[X | K] = φ(K), the measurable function with respect to σ(K) such that for any function g(K) we have E[X g(K)] = E[φ(K) g(K)], or alternatively, E[(X − φ(K)) g(K)] = 0, i.e., φ(K) is the L²-orthogonal projection of X onto L²(σ(K)).
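The orthogonality E[(X − φ(K)) g(K)] = 0 in Definition 2.5 can be illustrated with a concrete jointly Gaussian pair, where the projection is simply φ(K) = (Cov(X, K)/Var K)·K. This is a sketch of ours with an assumed toy model X = 2K + Z, not part of the lecture:

```python
import random

def projection_residuals(samples=20000, seed=3):
    """K ~ N(0,1), X = 2K + Z with Z ~ N(0,1) independent, so phi(K) = 2K.
    Return Monte Carlo estimates of E[(X - phi(K)) * g(K)] for g(k) = k and
    g(k) = k**3; both should be near 0, since the residual Z is independent of K."""
    rng = random.Random(seed)
    e1 = e3 = 0.0
    for _ in range(samples):
        k = rng.gauss(0.0, 1.0)
        x = 2.0 * k + rng.gauss(0.0, 1.0)
        resid = x - 2.0 * k
        e1 += resid * k
        e3 += resid * k ** 3
    return e1 / samples, e3 / samples
```

Both estimates come out near zero: the residual is orthogonal not just to K but to every (integrable) function of K, which is exactly the Gaussian-specific fact used on the next page.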

We also write E[X | σ(K)] = p_K(X). Let Y = X − p_K(X). Note that Y is a Gaussian that is orthogonal to K, hence independent of σ(K). Then

E[f(X) | σ(K)] = E[f(Y + p_K(X)) | σ(K)] = ∫ f(y + p_K(X)) (e^{−y²/2σ²}/√(2πσ²)) dy = ∫ f(y) (e^{−(y − p_K(X))²/2σ²}/√(2πσ²)) dy,

where σ² is the variance of Y = X − p_K(X).

We remark that if X is a Gaussian with E[X] = 0 and E[X²] = σ², then

E[X f(X)] = σ² E[f′(X)]

by integration by parts. It follows that E[X^{2n}] corresponds to the number of ways to pair elements in a set of size 2n. There is a more general interpretation. In general, we have

E[Y f(X₁, ..., X_d)] = Σ_j Cov(Y, X_j) E[∂_j f].

3 January 27, 2017

For a Gaussian process X_s indexed by s ∈ R₊, let us define Γ(s, t) = Cov(X_s, X_t). Here Γ being positive semidefinite means

Var(∫ α(s) X_s ds) = ∫∫ α(s) Γ(s, t) α(t) ds dt ≥ 0

for any α.

Theorem 3.1. Suppose Γ(s, t) is a positive semidefinite function. Then there exists a Gaussian process with Γ as the covariance.

Theorem 3.2 (Kolmogorov extension theorem). Given a family of consistent finite-dimensional distributions, there exists a measure with this family of finite-dimensional distributions as the finite-dimensional marginals.

Consistency means that if we are given the distribution of (X_{s₁}, X_{s₂}, X_{s₃}), then the distribution of (X_{s₁}, X_{s₃}) must be the marginal obtained by integrating out the middle variable. Kolmogorov's theorem gives an abstract construction of the measure, so we are not going to worry too much about the existence of these infinite-dimensional measure spaces.

Example 3.3. Suppose µ is a probability measure on R. Then Γ(s, t) = ∫ e^{iξ(s−t)} µ(dξ) is positive semidefinite. This is because

∫∫∫ α(s) e^{iξ(s−t)} α(t) ds dt µ(dξ) = ∫ (∫ α(s) e^{iξs} ds)(∫ α(t) e^{−iξt} dt) µ(dξ) = ∫ |∫ α(s) e^{iξs} ds|² µ(dξ) ≥ 0.

3.1 Gaussian white noise

Suppose µ is a σ-finite measure.

Definition 3.4. We call a Gaussian space G a Gaussian white noise with intensity µ if

E[G(f)G(g)] = ∫ fg dµ

for every f, g ∈ L²(µ).

Informally, this is the same as saying E[G(x)G(y)] = δ(x − y) with the definition G(x) = G(δ_x). Then

E[G(f)G(g)] = E[∫ G(x)f(x) dx ∫ G(y)g(y) dy] = ∫∫ f(x) E[G(x)G(y)] g(y) dx dy = ∫ f(x)g(x) dx.

Theorem 3.5. For every σ-finite measure µ there exists a Gaussian white noise with intensity µ.

Proof. Assume L²(µ) is separable. Then there exists an orthonormal basis (f_i) of L²(µ), and every f has a representation f = Σ_{i=1}^∞ α_i f_i. Now define

G(f) = Σ_{i=1}^∞ α_i X_i,

where X₁, X₂, ... are independent N(0, 1). Then we obviously have the desired condition.

3.2 Pre-Brownian motion

Definition 3.6. Let G be a Gaussian white noise on (R₊, dx). Define B_t = G(1_{[0,t]}). Then (B_t) is a pre-Brownian motion.

Proposition 3.7. The covariance is E[B_t B_s] = s ∧ t. (Here, s ∧ t denotes min(s, t).)

Proof. We have E[B_t B_s] = E[G(1_{[0,t]}) G(1_{[0,s]})] = ∫ 1_{[0,t]} 1_{[0,s]} dx = s ∧ t.

Because B_t = G(1_{[0,t]}), we may formally write dB_t = G(dt). Then we can also write G(f) = ∫ f(s) dB_s. We can also think of G(1_{[0,t]}) = ∫₀ᵗ dB_s = B_t − B₀ = B_t. Roughly speaking, dB_t is the white noise. This idea can be rigorously formulated as follows: if 0 = t₀ < t₁ < ⋯ < t_n then B_{t₁} − B_{t₀}, B_{t₂} − B_{t₁}, ..., B_{t_n} − B_{t_{n−1}} are independent Gaussians.

Proposition 3.8. The following statements are equivalent:
(i) (X_t) is a pre-Brownian motion.
(ii) (X_t) is a centered Gaussian process with covariance s ∧ t.
(iii) X_t − X_s is independent of σ(X_r, r ≤ s) and X_t − X_s ∼ N(0, t − s).
(iv) The increments X_{t_i} − X_{t_{i−1}} are independent Gaussians with variance t_i − t_{i−1}.

Proof. (ii) ⇒ (iii): We have E[(B_t − B_s)B_r] = t ∧ r − s ∧ r = r − r = 0 for r ≤ s. The other implications are straightforward.
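As a sanity check on Proposition 3.7 (ours, not from the lecture), one can sample B at finitely many times using the independent-increment description (iv) and estimate E[B_s B_t]:

```python
import math
import random

def brownian_values(times, rng):
    """Sample (B_t) at the given increasing times by accumulating
    independent N(0, t_i - t_{i-1}) increments, as in property (iv)."""
    b, prev, out = 0.0, 0.0, []
    for t in times:
        b += rng.gauss(0.0, math.sqrt(t - prev))
        prev = t
        out.append(b)
    return out

def empirical_covariance(s, t, samples=4000, seed=4):
    """Monte Carlo estimate of E[B_s B_t]; should be close to min(s, t)."""
    rng = random.Random(seed)
    acc = 0.0
    for _ in range(samples):
        bs, bt = brownian_values([s, t], rng)
        acc += bs * bt
    return acc / samples
```

For instance, empirical_covariance(0.5, 1.0) lands near 0.5 = min(0.5, 1).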

Note that the probability density of (B_{t₁}, ..., B_{t_n}) is given by

p(x₁, ..., x_n) = (1/((2π)^{n/2} √(t₁(t₂ − t₁) ⋯ (t_n − t_{n−1})))) Π_i e^{−(x_i − x_{i−1})²/2(t_i − t_{i−1})}.

It is important to understand this as a Wiener measure.

4 January 30, 2017

Let me remind you of some notions. A Gaussian space is a closed linear subspace of L²(Ω, P). A random process is a family (X_t)_{t≥0} of random variables. A Gaussian process is a process such that (X_{t₁}, ..., X_{t_k}) is Gaussian for any choice of t₁, ..., t_k. A Gaussian white noise with intensity µ is a map G : L²(µ) → L²(P) such that G(f) is Gaussian and E[G(f)G(g)] = ∫ fg dµ. A pre-Brownian motion is a process with B_t = G(1_{[0,t]}).

Sometimes people write B_t(ω). In this case, this ω is an element of the probability space Ω, so B_t(ω) is actually a concrete map.

4.1 Kolmogorov's theorem

Theorem 4.1 (Kolmogorov). Let I = [0, 1] and (X_t)_{t∈I} be a process. Suppose for all t, s ∈ I,

E[|X_t − X_s|^q] ≤ C|t − s|^{1+ε}.

Then for all 0 < α < ε/q there exists a modification X̃ of X such that

|X̃_t(ω) − X̃_s(ω)| ≤ C_α(ω)|s − t|^α

almost surely.

Here we say that X̃ is a modification of X if for every t ∈ I, P(X̃_t = X_t) = 1.

Definition 4.2. X and X̃ are indistinguishable if there exists a set N of probability zero such that for all t and ω ∉ N, X̃_t(ω) = X_t(ω).

There is a slight difference: in the case of indistinguishability, N is some set that does not depend on t. That is, indistinguishable implies being a modification. If X_t and X̃_t have continuous sample paths with probability 1, the converse holds. But nobody worries about these.

For a pre-Brownian motion, we have X_t − X_s ∼ N(0, t − s) and so E[|X_t − X_s|^q] = C_q|t − s|^{q/2}. Then letting ε = q/2 − 1 and α < ε/q = 1/2 − 1/q, we get α → 1/2 as q → ∞. That is, t ↦ X̃_t(ω) is Hölder continuous of any order α < 1/2.

Corollary 4.3. A Brownian motion sample path is Hölder of order α for every α < 1/2.

Proof of Theorem 4.1. Let D_n = {0/2ⁿ, 1/2ⁿ, ..., (2ⁿ − 1)/2ⁿ}. We have

P(⋃_{i=1}^{2ⁿ} {|X_{i/2ⁿ} − X_{(i−1)/2ⁿ}| ≥ 2^{−nα}}) ≤ 2ⁿ max_i E[|X_{i/2ⁿ} − X_{(i−1)/2ⁿ}|^q] 2^{nαq} ≤ C 2^{nqα} 2^{−(1+ε)n} 2ⁿ = C 2^{nqα − εn}.

So ε − qα > 0 implies

Σ_n P(⋃_i {|X_{i/2ⁿ} − X_{(i−1)/2ⁿ}| ≥ 2^{−nα}}) < ∞.

Borel-Cantelli then implies that there exists a set N with P(N) = 0 such that for every ω ∈ N^c there exists an n₀(ω) such that

|X_{i/2ⁿ}(ω) − X_{(i−1)/2ⁿ}(ω)| < 2^{−nα}

for all n > n₀(ω). Now

K_α(ω) = sup_{n≥1} sup_i (|X_{i/2ⁿ}(ω) − X_{(i−1)/2ⁿ}(ω)| / 2^{−nα}) < ∞

almost surely. We will finish the proof next time.

5 February 1, 2017

We start with Kolmogorov's theorem.

5.1 Construction of a Brownian motion

Proof of Theorem 4.1, continued. We have proved that

K_α(ω) = sup_{n≥1} sup_i (|X_{i/2ⁿ} − X_{(i−1)/2ⁿ}| / 2^{−nα}) < ∞

almost surely, using Markov's inequality and the Borel-Cantelli lemma. We look at the set of dyadic rationals

D = {i/2ⁿ : i ≤ 2ⁿ, n = 1, ...}.

Suppose we have a function f : D → R such that |f(i/2ⁿ) − f((i−1)/2ⁿ)| ≤ K 2^{−nα}. This condition implies

|f(s) − f(t)| ≤ (2K/(1 − 2^{−α})) |t − s|^α

for t, s ∈ D. You can prove this using successive approximations. Now define

X̃_t(ω) = lim_{s→t, s∈D} X_s(ω)

if K_α(ω) < ∞, and otherwise just assign an arbitrary value. You can check that |X̃_t(ω) − X̃_s(ω)| ≤ C_ω|t − s|^α for some constant C_ω depending only on ω, i.e., X̃ is Hölder continuous of order α. Finally we see that

P(|X̃_t − X_t| > ε) = lim_{s→t, s∈D} P(|X_t − X_s| > ε) = 0

and so P(X̃_t = X_t) = 1.

This gives a construction of a Brownian motion, which is the modification of the pre-Brownian motion. We first constructed a Gaussian white noise in an abstract space, then constructed a pre-Brownian motion by X_t = G(1_{[0,t]}), and then by Kolmogorov's theorem defined a Brownian motion. A sample path is then Hölder continuous up to order 1/2. There is another construction of a Brownian motion, due to Lévy. This construction is Exercise 1.18.
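Lévy's construction (Exercise 1.18) is easy to sketch in code: fix the endpoints, then repeatedly fill in midpoints with conditionally Gaussian corrections. The midpoint variance (t − s)/4 below is the standard choice; the implementation is our own illustration, not the exercise's solution.

```python
import math
import random

def levy_bridge(levels, rng):
    """One sample of Levy's dyadic construction on [0, 1]: B_0 = 0, B_1 ~ N(0,1),
    then for each known pair (B_s, B_t) set
    B_{(s+t)/2} = (B_s + B_t)/2 + N(0, (t - s)/4)."""
    path = {0.0: 0.0, 1.0: rng.gauss(0.0, 1.0)}
    step = 1.0
    for _ in range(levels):
        for s in [i * step for i in range(int(1 / step))]:
            t = s + step
            path[s + step / 2] = (0.5 * (path[s] + path[t])
                                  + rng.gauss(0.0, math.sqrt(step / 4)))
        step /= 2
    return path

def empirical_var_at_half(samples=3000, seed=5, levels=3):
    """Monte Carlo Var(B_{1/2}); for Brownian motion this should be 1/2."""
    rng = random.Random(seed)
    return sum(levy_bridge(levels, rng)[0.5] ** 2 for _ in range(samples)) / samples
```

Indeed Var(B_{1/2}) = Var(B₁)/4 + 1/4 = 1/2, matching the covariance s ∧ t.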

5.2 Wiener measure

The space of continuous paths C(R₊, R) can be given the structure of a σ-algebra by declaring that w ↦ w(t) is measurable for all t. Now there is a measurable map

Ω → C(R₊, R),   ω ↦ (t ↦ B_t(ω)).

The Wiener measure is the push-forward of P under this map. Don't think too hard about what this looks like.

Theorem 5.1 (Blumenthal's zero-one law). Let F_t = σ(B_s, s ≤ t) and let F₀₊ = ⋂_{t>0} F_t. Then F₀₊ is trivial, i.e., for any A ∈ F₀₊, either P(A) = 0 or P(A) = 1.

Example 5.2. Let A = ⋂_n {sup_{s≤1/n} B_s > 0}. Then P(A) = 0 or P(A) = 1. The right answer is P(A) = 1. This is because

P(A) = lim_n P(sup_{s≤1/n} B_s > 0) ≥ lim_n P(B_{1/n} > 0) = 1/2 > 0.

Sketch of proof. Take A ∈ F₀₊ and let g be a bounded measurable function. If I can show that

E[1_A g(B_{t₁}, ..., B_{t_k})] = E[1_A] E[g(B_{t₁}, ..., B_{t_k})],

then F₀₊ is independent of σ(B_t, t > 0). Taking the limit, we see that F₀₊ is independent of itself. Then we get the result.

6 February 3, 2017

Using Kolmogorov's theorem we defined a Brownian motion by modifying the pre-Brownian motion. A sample path of a Brownian motion is then Hölder continuous of order α for all α < 1/2. Using it we defined the Wiener measure on C(R₊, R).

6.1 Blumenthal's zero-one law

Let us define F₀₊ = ⋂_{t>0} F_t, where F_t = σ(B_s, s ≤ t).

Theorem 6.1 (Blumenthal). The σ-algebra F₀₊ is independent of itself. In other words, P(A) = 0 or 1 for every A ∈ F₀₊.

Proof. Take an arbitrary bounded continuous function g. For an A ∈ F₀₊, we claim that

E[1_A g(B_{t₁}, ..., B_{t_k})] = E[1_A] E[g(B_{t₁}, ..., B_{t_k})]

for 0 < t₁ < ⋯ < t_k. Let us only work with the case k = 1 for simplicity. Note that B_{t₁} − B_ε is independent of F_ε ⊇ F₀₊. So we have

E[1_A g(B_{t₁})] = lim_{ε→0} E[1_A g(B_{t₁} − B_ε)] = lim_{ε→0} P(A) E[g(B_{t₁} − B_ε)] = E[1_A] E[g(B_{t₁})].

This implies that F₀₊ is independent of σ(B_t, t > 0). But F₀₊ ⊂ σ(B_t, t > 0), so F₀₊ is independent of itself.

Proposition 6.2. Define T_a = inf{t : B_t = a}. Then almost surely, T_a < ∞ for all a ∈ R. In particular, lim sup_{t→∞} B_t = ∞ and lim inf_{t→∞} B_t = −∞.

Proof. We have

1 = P(sup_{s≤1} B_s > 0) = lim_{δ→0} P(sup_{s≤1} B_s > δ).

But by the scaling property of the Brownian motion, B_t ∼ λ^{−1} B_{λ²t}, we can write

P(sup_{s≤1} B_s > δ) = P(sup_{s≤δ^{−2}} B_s > 1).

It follows that P(sup_{s≥0} B_s > 1) = 1.

6.2 Strong Markov property

Definition 6.3. A random variable T ∈ [0, ∞] is a stopping time if the set {T ≤ t} is in F_t.

Intuitively, a stopping time cannot depend on the future.

Example 6.4. The random variable T_a = inf{t : B_t = a} is a stopping time.

Definition 6.5. We define the σ-field

F_T = {A : A ∩ {T ≤ t} ∈ F_t for all t ≥ 0}

generated by T. Roughly speaking, this is all the information collected up to the stopping time.

Theorem 6.6. Define B_t^{(T)} = 1_{T<∞}(B_{T+t} − B_T). Then under the measure P( · | T < ∞), the process B_t^{(T)} is a Brownian motion independent of F_T.

We will prove this next time.

7 February 6, 2017

Theorem 7.1. Suppose P(T < ∞) > 0. Define B_t^{(T)} = 1_{T<∞}(B_{T+t} − B_T). Then with respect to P( · | T < ∞), the process B_t^{(T)} is a Brownian motion independent of F_T. Here, A ∈ F_T is defined as A ∩ {T ≤ t} ∈ F_t.

Proof. Let A ∈ F_T. We want to show that for any continuous bounded function F,

E[1_A F(B_{t₁}^{(T)}, ..., B_{t_p}^{(T)})] = E[1_A] E[F(B_{t₁}^{(T)}, ..., B_{t_p}^{(T)})].

Define [t]_n = min{k/2ⁿ : k/2ⁿ ≥ t}. We have F(B_t^{(T)}) = lim_n F(B_t^{([T]_n)}). Then

E[1_A F(B_t^{(T)})] = lim_n E[1_A F(B_t^{([T]_n)})]
= lim_n Σ_{k≥0} E[1_A 1_{(k−1)2^{−n} < T ≤ k2^{−n}} F(B_t^{(k2^{−n})})]
= lim_n Σ_{k≥0} E[1_A 1_{(k−1)2^{−n} < T ≤ k2^{−n}}] E[F(B_t^{(k2^{−n})})]
= ⋯ = E[1_A] E[F(B_t^{(T)})].

Here we are using the fact that 1_A 1_{(k−1)2^{−n} < T ≤ k2^{−n}} is in F_{k2^{−n}} and so is independent of B_t^{(k2^{−n})} = B_{k2^{−n}+t} − B_{k2^{−n}}.

7.1 Reflection principle

Let S_t = sup_{s≤t} B_s.

Theorem 7.2 (Reflection principle). For a ≥ 0 and b ≤ a, we have P(S_t ≥ a, B_t ≤ b) = P(B_t ≥ 2a − b).

The idea is that you can reflect the path with respect to the first time you meet a.

Proof. Let T_a = the first time that S_t ≥ a. Then B_t^{(T_a)} is independent of F_{T_a} by the strong Markov property. This is why reflection works.

Proposition 7.3. The joint density of (S_t, B_t) at (a, b) is

(2(2a − b)/√(2πt³)) e^{−(2a−b)²/2t} 1_{a>0, b<a}.

The probability density of T_a is

f(t) = (a/√(2πt³)) e^{−a²/2t}.
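The reflection principle identity P(S_t ≥ a) = 2P(B_t ≥ a) (take b = a in Theorem 7.2) can be checked by simulation. This sketch is ours; it discretizes the path, so the grid maximum slightly undershoots the true supremum, but with a reasonably fine grid the two sides still agree to a few percent.

```python
import math
import random

def running_max_prob(a, t=1.0, steps=400, samples=3000, seed=6):
    """Monte Carlo estimate of P(S_t >= a) for S_t = sup_{s<=t} B_s,
    using a discretized Brownian path."""
    rng = random.Random(seed)
    dt = t / steps
    hits = 0
    for _ in range(samples):
        b, m = 0.0, 0.0
        for _ in range(steps):
            b += rng.gauss(0.0, math.sqrt(dt))
            m = max(m, b)
        if m >= a:
            hits += 1
    return hits / samples

def reflection_rhs(a, t=1.0):
    """2 P(B_t >= a) = P(|B_t| >= a), via the Gaussian tail: erfc(a / sqrt(2t))."""
    return math.erfc(a / math.sqrt(2.0 * t))
```

For a = 1, t = 1 the right-hand side is about 0.317, and the simulated running-maximum probability lands nearby.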

Proof. Exercise. We have

P(T_a ≤ t) = P(S_t ≥ a) = P(S_t ≥ a, B_t ≤ a) + P(S_t ≥ a, B_t ≥ a) = 2P(B_t ≥ a) = P(tB₁² ≥ a²).

Then compute the density.

Proposition 7.4. Let 0 = t₀ⁿ < ⋯ < t_{p_n}ⁿ = t. Then

Σ_{i=1}^{p_n} (B_{t_iⁿ} − B_{t_{i−1}ⁿ})² → t

in L² as n → ∞ and max_i |t_iⁿ − t_{i−1}ⁿ| → 0. This is saying that ∫₀ᵗ (dB_t)² = t.

Proof. We have

E[(Σ_{i=1}^{p_n} (B_{t_iⁿ} − B_{t_{i−1}ⁿ})² − t)²] = E[(Σ_{i=1}^{p_n} ((B_{t_iⁿ} − B_{t_{i−1}ⁿ})² − (t_iⁿ − t_{i−1}ⁿ)))²]
= Σ_{i=1}^{p_n} E[((B_{t_iⁿ} − B_{t_{i−1}ⁿ})² − (t_iⁿ − t_{i−1}ⁿ))²] ≤ C Σ_{i=1}^{p_n} (t_iⁿ − t_{i−1}ⁿ)² → 0,

since the (B_{t_iⁿ} − B_{t_{i−1}ⁿ})² − (t_iⁿ − t_{i−1}ⁿ) are independent with expectation 0.

7.2 Filtrations

Now we are going to talk about martingales.

Definition 7.5. A filtration is an increasing family (F_t). Define F_{t+} = ⋂_{s>t} F_s ⊇ F_t. If F_{t+} = F_t, then (F_t) is called right continuous. By definition, (F_{t+}) is right continuous. If you work with F_{t+}, then you are going to be fine.
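Proposition 7.4 is also visible numerically: on a uniform partition with n intervals, the L² error E[(Σ(ΔB)² − t)²] equals Σ 2(Δt_i)² = 2t²/n exactly, so it shrinks as the mesh is refined. A sketch of ours:

```python
import math
import random

def quadratic_variation_error(t=1.0, n=2000, samples=200, seed=7):
    """Monte Carlo estimate of E[(sum_i (B_{t_i} - B_{t_{i-1}})^2 - t)^2]
    on a uniform partition with n intervals; theoretically this is 2*t**2/n."""
    rng = random.Random(seed)
    dt = t / n
    acc = 0.0
    for _ in range(samples):
        qv = 0.0
        for _ in range(n):
            qv += rng.gauss(0.0, math.sqrt(dt)) ** 2
        acc += (qv - t) ** 2
    return acc / samples
```

Refining the partition from n = 100 to n = 2000 shrinks the error by roughly a factor of 20, consistent with the 2t²/n rate.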

8 February 8, 2017

The first homework is 1.18, 2.29, 2.31. Last time we talked about the strong Markov property for Brownian motion. This implies the reflection principle, and so we were able to effectively compute the distribution of max_{s≤t} B_s.

8.1 Stopping times of filtrations

A filtration consists of an increasing family (F_t)_{t≥0}, and we define F_{t+} = ⋂_{s>t} F_s = G_t.

Definition 8.1. X_t is adapted with respect to (F_t) if X_t is F_t-measurable. X_t is progressively measurable if (ω, s) ↦ X_s(ω) on Ω × [0, t] is measurable with respect to F_t ⊗ B([0, t]).

Proposition 8.2. If the sample paths are right continuous almost surely, then X_t is adapted if and only if X_t is progressively measurable.

Proof. It is obvious that progressively measurable implies adapted. Assume that it is adapted. Fix a time t. Define

X_sⁿ = X_{kt/n}   for s ∈ [(k−1)t/n, kt/n).

Then Xⁿ is now discrete, and lim_n X_sⁿ = X_s almost surely by right continuity. Clearly (ω, s) ↦ X_sⁿ(ω) is measurable with respect to F_t ⊗ B([0, t]).

A stopping time is a random variable T such that {T ≤ t} ∈ F_t. We similarly define F_T = {A : A ∩ {T ≤ t} ∈ F_t}.

Proposition 8.3. T is a stopping time with respect to G_t = F_{t+} if and only if {T < t} ∈ F_t for all t > 0.

Proof. We have {T ≤ t} = ⋂_{s>t} {T < s}. Likewise {T < t} = ⋃_{s<t} {T ≤ s}.

We define G_T = {A : A ∩ {T ≤ t} ∈ G_t} = F_{T+}. Here are some facts: If S and T are stopping times, then S ∧ T and S ∨ T are also stopping times. If S_n is an increasing sequence of stopping times, then S = lim_n S_n is also a stopping time. If S_n is a decreasing sequence of stopping times converging to S, then S is a stopping time with respect to (G_t).

8.2 Martingales

Definition 8.4. A process X_t is a martingale if E|X_t| < ∞ and E[X_t | F_s] = X_s for every t ≥ s. If E[X_t | F_s] ≥ X_s, then it is called a submartingale.

Example 8.5. B_t is a martingale. But B_t² is not a martingale, because E[B_t² | F_s] = (t − s) + B_s². But B_t² − t is a martingale.

Example 8.6. The process e^{θB_t − θ²t/2} is a martingale for all θ. This is because

E[e^{θB_t − θ²t/2} | F_s] = E[e^{θ(B_t − B_s)} | F_s] e^{θB_s − θ²t/2} = e^{θ²(t−s)/2} e^{θB_s − θ²t/2} = e^{θB_s − θ²s/2}.

Example 8.7. For any f ∈ L²([0, ∞)), the process

∫₀ᵗ f(s) dB_s = lim_n Σ_i f(s_{i−1})(B_{s_i} − B_{s_{i−1}})

is a martingale. Note that this is well-defined for simple functions f, and you can extend to general f ∈ L² because

E[(∫₀ᵗ f(s) dB_s)²] = ∫₀ᵗ f(s)² ds

if f is simple. Furthermore,

(∫₀ᵗ f(s) dB_s)² − ∫₀ᵗ f(s)² ds

is also a martingale.
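For Example 8.6, the martingale property at s = 0 says E[e^{θB_t − θ²t/2}] = 1 for every θ and t, which is easy to test by Monte Carlo (sketch ours):

```python
import math
import random

def exp_martingale_mean(theta, t, samples=20000, seed=8):
    """Monte Carlo estimate of E[exp(theta*B_t - theta**2 * t / 2)],
    sampling B_t ~ N(0, t) directly; should be close to 1 for any theta, t."""
    rng = random.Random(seed)
    acc = 0.0
    for _ in range(samples):
        b = rng.gauss(0.0, math.sqrt(t))
        acc += math.exp(theta * b - 0.5 * theta * theta * t)
    return acc / samples
```

The estimate stays near 1 for different (θ, t); note the Monte Carlo error grows like e^{θ²t/2}, so very large θ would need many more samples.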

9 February 10, 2017

Chapter VII of Shiryaev, Probability is a good reference for discrete martingales.

9.1 Discrete martingales: Doob's inequality

We are now going to look at martingales in discrete time.

Theorem 9.1 (Optional stopping theorem). Suppose X_n is a submartingale and σ ≤ τ < ∞ (almost surely) are two stopping times. If E[|X_n| 1_{τ>n}] → 0 and E[|X_τ| + |X_σ|] < ∞, then E[X_τ | F_σ] ≥ X_σ. If X_n is a martingale, then E[X_τ | F_σ] = X_σ.

Proof. To prove E[X_τ | F_σ] ≥ X_σ, it suffices to prove E[X_τ 1_A] ≥ E[X_σ 1_A] for all A ∈ F_σ. In this case, A will look like A = ⋃_k A_k, where A_k = A ∩ {σ = k} ∈ F_k. So it suffices to check for A_m where m is fixed. Define τ_n = τ ∧ n and σ_n = σ ∧ n. We automatically have

E[X_σ 1_{A_m}] = lim_n E[X_{σ_n} 1_{A_m}],

because the sequence on the right side is eventually constant. Also

E[X_τ 1_{A_m}] = E[X_τ 1_{τ>n} 1_{A_m}] + E[X_τ 1_{τ≤n} 1_{A_m}]
= E[X_τ 1_{τ>n} 1_{A_m}] + E[X_{τ_n} 1_{τ≤n} 1_{A_m}]
= E[X_τ 1_{τ>n} 1_{A_m}] − E[X_n 1_{τ>n} 1_{A_m}] + E[X_{τ_n} 1_{A_m}].

Here, as n → ∞, the first two terms go to 0 because of the conditions. It is also clear that E[X_{σ_n} 1_{A_m}] ≤ E[X_{τ_n} 1_{A_m}], because here we only care about finitely many outcomes of σ and τ. So

E[X_σ 1_{A_m}] = lim_n E[X_{σ_n} 1_{A_m}] ≤ lim_n E[X_{τ_n} 1_{A_m}] = E[X_τ 1_{A_m}].

Theorem 9.2 (Doob's inequality). Let X_j be a nonnegative submartingale, and let M_n = max_{j≤n} X_j.
(1) For a > 0, a P(M_n ≥ a) ≤ E[1_{M_n≥a} X_n].
(2) For p > 1, E[M_n^p] ≤ (p/(p−1))^p E[X_n^p].

Proof. (1) Let τ = min{j ≤ n : X_j ≥ a}, with τ = n if M_n < a. This is a stopping time. Then

E[X_n] ≥ E[X_τ] = E[1_{M_n≥a} X_τ] + E[1_{M_n<a} X_τ] ≥ a P(M_n ≥ a) + E[1_{M_n<a} X_n].

(2) It follows from the following lemma.

Lemma 9.3. If λ P(Y ≥ λ) ≤ E[X 1_{Y≥λ}] for X, Y ≥ 0, then

E[Y^p] ≤ (p/(p−1))^p E[X^p].

Proof. We have

E[Y^p] = ∫₀^∞ pλ^{p−1} P(Y ≥ λ) dλ ≤ E[X ∫₀^∞ pλ^{p−2} 1_{Y≥λ} dλ] = (p/(p−1)) E[X Y^{p−1}] ≤ (p/(p−1)) ‖X‖_p ‖Y‖_p^{p−1},

using Hölder's inequality at the last step.

Theorem 9.4 (Doob's upcrossing inequality). Suppose X_n is a submartingale. Let β(a, b) be the number of upcrossings, i.e., the number of times the process goes from below a to above b. Then

E[β(a, b)] ≤ E[max(X_n − a, 0)] / (b − a).

Note that if X is a submartingale, then max(X − a, 0) is a submartingale. This is because, for f convex and increasing,

E[f(X_n) | F_{n−1}] ≥ f(E[X_n | F_{n−1}]) ≥ f(X_{n−1}).
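Doob's L^p inequality with p = 2 gives E[M_n²] ≤ 4 E[X_n²]. One can watch the bound hold for the nonnegative submartingale X_j = |S_j|, with S a simple random walk (sketch ours):

```python
import random

def doob_l2_check(n=50, samples=3000, seed=9):
    """For X_j = |S_j| (S a simple random walk), return Monte Carlo estimates of
    (E[(max_{j<=n} X_j)^2], 4*E[X_n^2]); Doob's inequality says lhs <= rhs."""
    rng = random.Random(seed)
    lhs = rhs = 0.0
    for _ in range(samples):
        s, m = 0, 0
        for _ in range(n):
            s += rng.choice((-1, 1))
            m = max(m, abs(s))
        lhs += m * m
        rhs += s * s
    return lhs / samples, 4.0 * rhs / samples
```

With n = 50, the bound 4E[X_n²] ≈ 200 while the observed E[M_n²] is well below it; the constant 4 is not attained but is sharp in general.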

10 February 13, 2017

10.1 More discrete martingales: the upcrossing inequality

In the optional stopping theorem, we needed the stopping time τ to satisfy E[X_n 1_{τ>n}] → 0 as n → ∞.

Proposition 10.1. If X_n is uniformly integrable, i.e.,

lim_{M→∞} sup_n E[|X_n| 1_{|X_n|≥M}] = 0,

then E[X_n 1_{τ>n}] → 0.

Proof. We have

lim_n E[|X_n| 1_{τ>n}] ≤ lim_n E[M 1_{τ>n}] + lim_n E[|X_n| 1_{|X_n|≥M} 1_{τ>n}] ≤ sup_n E[|X_n| 1_{|X_n|≥M}] → 0

as M → ∞, using τ < ∞ almost surely.

Corollary 10.2. Let X_n be a submartingale. Suppose (1) E|X₀| < ∞, (2) E[τ] < ∞, and (3) sup_n E[|X_{n+1} − X_n| | F_n] ≤ C. Then E[X_τ] ≥ E[X₀] and E[X_τ | F₀] ≥ X₀.

Proof. We want to check the three conditions in the optional stopping theorem. We have

E[|X_τ − X_m|] ≤ Σ_{n=m}^∞ E[|X_{n+1} − X_n| 1_{τ≥n+1}] = Σ_{n=m}^∞ E[1_{τ≥n+1} E[|X_{n+1} − X_n| | F_n]] ≤ C Σ_{n=m}^∞ E[1_{τ≥n+1}] ≤ C E[τ] < ∞.

This shows E|X_τ| < ∞. We also have

lim_n E[|X_n| 1_{τ≥n+1}] ≤ lim_n E[|X_τ − X_n| 1_{τ≥n+1}] + lim_n E[|X_τ| 1_{τ≥n+1}] = 0.

Theorem 10.3 (Upcrossing inequality). For a < b, let β(a, b) be the number of upcrossings of [a, b] in (X_j)_{j=1}^n. Then

E[β(a, b)] ≤ E[max(X_n − a, 0)] / (b − a).

Proof. Replace X_j by max(X_j − a, 0). This is still a submartingale, and so we can assume X_i ≥ 0 and a = 0. Define a random variable φ_i such that φ_i = 1 if there exists a k ≤ i such that X_k = 0 and X_j < b for all k ≤ j ≤ i, and φ_i = 0 otherwise. Then we get

E[β(0, b)] ≤ b⁻¹ E[Σ_{i=1}^{n−1} φ_i(X_{i+1} − X_i)] = b⁻¹ Σ_i E[φ_i E[X_{i+1} − X_i | F_i]] ≤ b⁻¹ Σ_i E[X_{i+1} − X_i] ≤ b⁻¹ E[X_n],

because E[X_{i+1} − X_i | F_i] ≥ 0 and so we can replace φ_i by 1.

Theorem 10.4 (Doob's convergence theorem). If X_n is a submartingale with sup_i E|X_i| < ∞, then X_n converges almost surely to some X.

Proof. We have

P(lim sup X_n > lim inf X_n) = P(⋃_{a<b, a,b∈Q} {lim sup X_n > b > a > lim inf X_n}) = 0

by the upcrossing inequality.

Note that X_n → X in L¹ may fail. For instance, take Y_i independent and uniform on {0, 2}, and X_n = Y₁ ⋯ Y_n. Then E[X_n] = 1 but X_n → 0 almost surely.

Proposition 10.5. (1) Suppose X_n → X almost surely and {X_n} is uniformly integrable. Then X_n → X in L¹. (2) Suppose X_n ≥ 0, X_n → X almost surely, and E[X_n] → E[X]. Then X_n → X in L¹.

Theorem 10.6 (Lévy). Suppose X ∈ L¹ is a random variable and let F_n be a filtration with F_∞ = ⋁_n F_n. Then E[X | F_n] → E[X | F_∞] almost surely and in L¹. Conversely, if the martingale {X_n} is uniformly integrable, then there exists an X ∈ L¹ such that X_n = E[X | F_n].
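The counterexample X_n = Y₁ ⋯ Y_n deserves a numerical look (sketch ours): E[X_n] = 1 exactly for every n, yet X_n = 2ⁿ only with probability 2⁻ⁿ, so for moderate n every simulated path is already 0. The empirical mean completely misses the expectation, which is exactly the failure of uniform integrability and of L¹ convergence.

```python
import random

def product_martingale_finals(n=30, samples=1000, seed=10):
    """Final values of X_n = Y_1*...*Y_n with Y_i i.i.d. uniform on {0, 2}.
    E[X_n] = 1 for every n, but X_n is nonzero only with probability 2**-n."""
    rng = random.Random(seed)
    finals = []
    for _ in range(samples):
        x = 1.0
        for _ in range(n):
            x *= rng.choice((0.0, 2.0))
        finals.append(x)
    return finals
```

With n = 30, the chance that any of 1000 paths is nonzero is about 10⁻⁶, so the simulated sample is all zeros even though every E[X_n] is 1.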

11 February 15, 2017

So far we have looked at Doob's inequality, the optional stopping theorem, and Doob's upcrossing inequality. There is a supermartingale version of the upcrossing inequality:

Theorem 11.1. For a discrete supermartingale X, the number of upcrossings M_{ab}ⁿ of [a, b] by (X_j)_{j≤n} satisfies

E[M_{ab}ⁿ(X)] ≤ E[max(0, a − X_n)] / (b − a).

Proof. Let τ₁ = n ∧ min{j ≥ 0 : X_j ≤ a}, τ₂ = n ∧ min{j > τ₁ : X_j ≥ b}, and so forth. Then all the τ_k are stopping times. Let

D = (X_{τ₂} − X_{τ₁}) + (X_{τ₄} − X_{τ₃}) + ⋯.

Then (b − a)M_{ab}ⁿ + R ≤ D, where R = 0 if the tail is a down-crossing and R = X_n − a otherwise. Because X is a supermartingale, we have E[X_{τ₂} − X_{τ₁}] ≤ 0 and so

(b − a) E[M_{ab}ⁿ] ≤ −E[R] ≤ E[max(0, a − X_n)].

11.1 Lévy's theorem

Theorem 11.2 (Lévy). Suppose X ∈ L¹ and F_n is a filtration, with F_∞ = ⋁_n F_n. Then E[X | F_n] → E[X | F_∞] in L¹ and almost surely. Conversely, if a martingale X_n is uniformly integrable, then there is an X such that E[X | F_n] = X_n.

Proof. Let X_n = E[X | F_n]. We claim that X_n is uniformly integrable, and you can check this. Also X_n is a martingale. So the convergence theorem tells us that X_n → X_∞ almost surely and in L¹. Now the question is whether X_∞ = E[X | F_∞]. Because X_n is a martingale, for any A ∈ F_n and m ≥ n,

∫_A X_n dµ = ∫_A X_m dµ.

Taking m → ∞, we get

∫_A X_∞ dµ = ∫_A X_n dµ = ∫_A X dµ

because X_m → X_∞ in L¹. So this is true for all A ∈ ⋃_n F_n, hence for all A ∈ F_∞. So X_∞ = E[X | F_∞].

Now let us do the second half. Suppose X_n is uniformly integrable. Then X_n → X for some X almost surely and in L¹. So basically by the same argument,

∫_A X_n dµ = lim_m ∫_A X_m dµ = ∫_A X dµ.

This shows that X_n = E[X | F_n].

So essentially all martingales come from taking a filtration and looking at the conditional expectations.

11.2 Optional stopping for continuous martingales

Theorem 11.3. Suppose X_t is a submartingale with respect to a right continuous filtration, with right continuous sample paths. If σ ≤ τ ≤ C are bounded stopping times, then E[X_τ | F_σ] ≥ X_σ.

Proof. Let τ_n = (⌊nτ⌋ + 1)/n. Then τ_n is a stopping time taking discrete values, and F_{τ_n} ↓ F_τ as n → ∞, because the filtration is right continuous. By the discrete version of the optional stopping theorem, E[X_C | F_{τ_n}] ≥ X_{τ_n}. Then the X_{τ_n} are uniformly integrable and X_{τ_n} → X_τ almost surely and in L¹. Similarly X_{σ_n} → X_σ. So

E[X_{τ+ε} | F_σ] = lim_m E[X_{τ+ε} | F_{σ_m}] = lim_m lim_n E[X_{τ_n+ε} | F_{σ_m}] ≥ lim_m X_{σ_m} = X_σ.

12 February 17, 2017

We proved the upcrossing inequality and Doob's inequality. We also proved the optional stopping theorem. There is also the martingale convergence theorem, which says that if (X_j) is a submartingale or a supermartingale and sup_j E|X_j| ≤ C, then lim_j X_j = X for some X almost surely. If (X_j) is uniformly integrable, then also X_j → X in L¹.

Theorem 12.1 (3.18). Suppose (F_t) is right continuous and complete. Let X_t be a supermartingale and assume that t ↦ E[X_t] is right continuous. Then X has a modification whose sample paths are right continuous with left limits, and which is also a supermartingale.

Lemma 12.2 (3.16). Let D be a dense set and f be defined on D such that (i) f is bounded on D ∩ [0, T], and (ii) the number of upcrossings M_{ab}^f(D ∩ [0, T]) is finite for all a < b. Then

f₊(t) = lim_{s↓t, s∈D} f(s)   and   f₋(t) = lim_{s↑t, s∈D} f(s)

exist for all t ∈ [0, T]. Define g(t) = f₊(t). Then g(t) is right continuous with left limits.

Theorem 12.3 (3.21). Suppose a martingale (X_t) has right continuous sample paths. Then the following are equivalent:
(i) X_t is closed, i.e., there exists a Z ∈ L¹ such that E[Z | F_t] = X_t.
(ii) X_t is uniformly integrable.
(iii) X_t → W almost surely and in L¹ as t → ∞, for some W.

There is an application. Let T_a = inf{t : B_t = a} for a > 0 and λ > 0.

Proposition 12.4. E[e^{−(λ²/2)T_a}] = e^{−λa}.

13 February 22, 2017

Definition 13.1. Let (X_t) be a submartingale or a supermartingale with right continuous sample paths. Assume X_t → X_∞ almost surely. Suppose T is a stopping time which can take the value ∞ with positive probability. We define

X_T(ω) = 1_{T(ω)<∞} X_{T(ω)}(ω) + 1_{T(ω)=∞} X_∞(ω).

Theorem 13.2 (Optional stopping theorem). Let (X_t) be a submartingale. Let S ≤ T almost surely, with P(T < ∞) = 1.
(1) If the submartingale (X_t) is uniformly integrable, then E[|X_n| 1_{T>n}] → 0 and E[|X_T| + |X_S|] < ∞.
(2) If E[|X_n| 1_{T>n}] → 0 and E[|X_T| + |X_S|] < ∞, then E[X_T | F_S] ≥ X_S.
(3) If S ≤ T ≤ C almost surely, then E[X_T | F_S] ≥ X_S.
(4) Suppose a martingale (X_n) is uniformly integrable, and let S ≤ T be stopping times taking values in R₊ ∪ {∞}. Then E[|X_n| 1_{T>n}] → 0 and E[|X_T| + |X_S|] < ∞.

Theorem 13.3 (Lévy). Let (X_n) be a uniformly integrable martingale. Then there exists an X ∈ L¹ such that E[X | F_n] = X_n.

If X_n is a submartingale and T is a stopping time with values in R₊ ∪ {∞}, then X_{T∧n} is a submartingale. That is, for m < n, E[X_{T∧n} | F_m] ≥ X_{T∧m}.

Example 13.4. Let X₁ = 1 and X_n be a symmetric random walk. Let T be the first hitting time of 0 and let Y_n = X_{T∧n}. Then E[Y_n] = E[Y₁] = 1 but Y_n → 0 almost surely. This means that Y_n does not converge to 0 in L¹, and hence (Y_n) is not uniformly integrable.

13.1 Submartingale decomposition

Theorem 13.5 (Doob submartingale decomposition). Suppose (X_n) is a submartingale. Then there exists a martingale M_n and a predictable increasing sequence A_n such that X_n = M_n + A_n. Here we say that A_n is predictable if and only if A_n is measurable with respect to F_{n−1}. If X_n is a supermartingale, then A_n is decreasing. Furthermore, such a decomposition is unique.

Proof. Because we want X₂ = M₂ + A₂ with A₂ predictable, we get E[X₂ | F₁] = M₁ + A₂ = X₁ + A₂. So A₂ = E[X₂ | F₁] − X₁. Similarly we can define

A_k = E[X_k | F_{k−1}] − M_{k−1},   M_k = X_k − A_k.

You can check that A_k increases. For instance,

A₂ = E[X₂ | F₁] − M₁ = E[X₂ | F₁] − (X₁ − A₁) = E[X₂ | F₁] − X₁ + A₁ ≥ A₁.
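For a concrete instance of Theorem 13.5 (sketch ours): with S_n a simple random walk, X_n = S_n² is a submartingale and E[S_k² | F_{k−1}] = S_{k−1}² + 1, so the predictable part is A_k = k and the martingale part is M_k = S_k² − k. A Monte Carlo check that E[M_n] = E[M₀] = 0:

```python
import random

def martingale_part_mean(n=40, samples=4000, seed=11):
    """Monte Carlo estimate of E[S_n**2 - n] for a simple random walk S:
    the martingale part of the Doob decomposition of S_n**2, so the mean is 0."""
    rng = random.Random(seed)
    acc = 0.0
    for _ in range(samples):
        s = 0
        for _ in range(n):
            s += rng.choice((-1, 1))
        acc += s * s - n
    return acc / samples
```

The estimate hovers around 0 (the exact identity E[S_n²] = n is just the increasing part A_n = n showing up in expectation).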

14 February 24, 2017

There is another version of the optional stopping theorem.

Theorem 14.1 (Le Gall 3.25). Suppose $X$ is a nonnegative supermartingale with right continuous sample paths. Let $S \le T$ be two stopping times. Then $E[X_T \mid \mathcal{F}_S] \le X_S$.

Proof. Consider only the discrete case, since we can approximate the continuous case with the discrete case. A nonnegative supermartingale is bounded in $L^1$, in the sense that $E[X_j] \le C < \infty$. The upcrossing inequality then tells us that $X_j \to X_\infty$ almost surely (but maybe not in $L^1$). For any $M > 0$, we know that $S \wedge M \le T \wedge M$ are stopping times. You can check that $X_{S \wedge M} \to X_S$ almost surely. Then by Fatou's lemma,
$$\liminf_{M \to \infty} E[X_{S \wedge M}] \ge E[X_S].$$
So $X_S \in L^1$.

For any $A \in \mathcal{F}_S$, we want to prove $E[\mathbf{1}_A X_S] \ge E[\mathbf{1}_A X_T]$. This would immediately imply $E[X_T \mid \mathcal{F}_S] \le X_S$. Define the stopping time
$$S^A(\omega) = \begin{cases} S(\omega) & \text{if } \omega \in A, \\ \infty & \text{if } \omega \notin A, \end{cases}$$
and similarly $T^A(\omega)$. Because both $S^A \wedge M \le T^A \wedge M$ are bounded stopping times, $E[X_{S^A \wedge M}] \ge E[X_{T^A \wedge M}]$. Now
$$E[X_{S^A \wedge M}] = E[\mathbf{1}_{A \cap \{S \le M\}} X_S] + E[\mathbf{1}_{A \cap \{S > M\}} X_M] + E[\mathbf{1}_{A^c} X_M],$$
$$E[X_{T^A \wedge M}] = E[\mathbf{1}_{A \cap \{S \le M\}} X_{T \wedge M}] + E[\mathbf{1}_{A \cap \{S > M\}} X_M] + E[\mathbf{1}_{A^c} X_M].$$
Hence we get $E[\mathbf{1}_{A \cap \{S \le M\}} X_S] \ge E[\mathbf{1}_{A \cap \{S \le M\}} X_{T \wedge M}]$. Taking the limit $M \to \infty$, we get
$$E[\mathbf{1}_{A \cap \{S < \infty\}} X_S] = \lim_{M \to \infty} E[\mathbf{1}_{A \cap \{S \le M\}} X_S] \ge \liminf_{M \to \infty} E[\mathbf{1}_{A \cap \{S \le M\}} X_{T \wedge M}] \ge E[\mathbf{1}_{A \cap \{S < \infty\}} X_T]$$
by dominated convergence and Fatou's lemma. On the other hand, since $S = \infty$ forces $T = \infty$, we clearly have $E[\mathbf{1}_{A \cap \{S = \infty\}} X_S] = E[\mathbf{1}_{A \cap \{S = \infty\}} X_T]$. This implies that $E[\mathbf{1}_A X_S] \ge E[\mathbf{1}_A X_T]$.

14.1 Backward martingales

In this case, we have a filtration $\cdots \subset \mathcal{F}_3 \subset \mathcal{F}_2 \subset \mathcal{F}_1 \subset \mathcal{F}_0$.

Definition 14.2. A process $(X_n)$ is a backward supermartingale if
$$E[X_n \mid \mathcal{F}_m] \le X_m$$
for any $m \ge n$.

Theorem 14.3. If $\sup_n E[|X_n|] \le C < \infty$, then there exists an $X_\infty \in L^1$ such that $X_n \to X_\infty$ almost surely and in $L^1$.

Note that we also have $L^1$ convergence, which we did not have in the forward case.

Proof. The almost sure convergence is almost identical to the forward case, as a consequence of the upcrossing inequality. To show convergence in $L^1$, define
$$A_n = E[X_1 - X_2 \mid \mathcal{F}_2] + \cdots + E[X_{n-1} - X_n \mid \mathcal{F}_n].$$
Note that this is nonpositive. Then $M_n = X_n + A_n$ is a backward martingale. Now $A_n \to A_\infty$ almost surely and in $L^1$, which comes from the monotone convergence theorem. Also, $M_n$ being a backward martingale simply means $M_n = E[M_1 \mid \mathcal{F}_n]$. This is just the setting of Lévy's theorem, and so we immediately have $L^1$ convergence of $M_n$.

14.2 Finite variation processes

We assume that sample paths are continuous.

Definition 14.4. A function $a : [0, T] \to \mathbb{R}$ is of finite variation if there exists a signed measure $\mu$ with finite total variation $|\mu|$ such that $a(t) = \mu([0, t])$ for all $t \le T$.

Definition 14.5. A process $(A_t)$ is called a finite variation process if all its sample paths are of finite variation.

Definition 14.6. A process $(X_t)$ with $X_0 = 0$ is called a local martingale if there exist increasing stopping times $T_n \uparrow \infty$ such that $X_{t \wedge T_n}$ is a uniformly integrable martingale for all $n$. If $X_0 \ne 0$, then $X_t$ is a local martingale if $X_t - X_0$ is a local martingale.
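Integration against a finite variation function is ordinary Lebesgue–Stieltjes integration, so it can be sketched with a Riemann–Stieltjes sum. In this toy example (our own choice of $A$ and $H$, not from the lecture), $A(t) = t^2 - t$ on $[0, 1]$, so $dA = (2s - 1)\,ds$ is a signed measure, and $\int_0^1 s \, dA(s) = 2/3 - 1/2 = 1/6$.

```python
def stieltjes(H, A, t, n=100_000):
    """Left-point Riemann–Stieltjes sum approximating ∫_0^t H(s) dA(s)."""
    h = t / n
    return sum(H(i * h) * (A((i + 1) * h) - A(i * h)) for i in range(n))

# A(t) = t^2 - t has finite variation on [0, 1]; its "dA" is the signed
# measure (2s - 1) ds, which changes sign at s = 1/2.
# For H(s) = s the exact value is ∫_0^1 s (2s - 1) ds = 1/6.
val = stieltjes(lambda s: s, lambda t: t * t - t, 1.0)
```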

15 February 27, 2017

For a finite variation process $A_s(\omega)$, we can write $A_s(\omega) = \mu([0, s])$ for some finite signed measure $\mu$. So we can define
$$\int_0^t H_s(\omega) \, dA_s(\omega) = \int_0^t H_s(\omega) \, \mu(ds)(\omega).$$
Here we require $\int_0^t |H_s(\omega)| \, |dA_s(\omega)| < \infty$. This is Section 4.1 of Le Gall, and you can read about it there.

15.1 Local martingales

We are always going to assume that everything is adapted to $\mathcal{F}_t$ and sample paths are continuous.

Definition 15.1. A process $(M_t)$ is a local martingale with $M_0 = 0$ if there exists an increasing sequence of stopping times $T_n \uparrow \infty$ such that $M_{t \wedge T_n}$ is a uniformly integrable martingale for each $n$. If $M_0 \ne 0$, then $M_t$ is called a local martingale if $M_t - M_0 = N_t$ is a local martingale.

Proposition 15.2. (i) If $M$ is a nonnegative local martingale with $M_0 \in L^1$, then it is a supermartingale.
(ii) If there exists a $Z \in L^1$ such that $|M_t| \le Z$ for all $t$, then the local martingale $M_t$ is a uniformly integrable martingale.
(iii) If $M_0 = 0$, then $T_n = \inf\{t : |M_t| \ge n\}$ reduces the martingale, i.e., satisfies the condition in the definition.

Proof. (i) Let us write $M_s = M_0 + N_s$. By definition, there exist $T_n$ such that
$$\lim_n E[M_0 + N_{t \wedge T_n} \mid \mathcal{F}_s] = \lim_n (M_0 + N_{s \wedge T_n}) = M_s.$$
Then by Fatou's lemma, we get $E[M_t \mid \mathcal{F}_s] \le M_s$.
(ii) This is clear because $M_t$ is then uniformly integrable.
(iii) This is basically truncating the local martingale.

Proposition 15.3. A finite variation local martingale with $M_0 = 0$ is identically $0$. (Again, we are assuming continuous sample paths.)

Proof. Define a stopping time
$$\tau_n = \inf\Big\{ t : \int_0^t |dM_s| \ge n \Big\},$$

and let $N_t = M_{\tau_n \wedge t}$. Then we have $\int_0^t |dN_s| \le n$. For any $t$ and $0 = t_0 < \cdots < t_p = t$, writing $\delta N_i = N_{t_i} - N_{t_{i-1}}$,
$$E[N_t^2] = E\Big[\Big(\sum_i \delta N_i\Big)^2\Big] = E\Big[\sum_i (\delta N_i)^2\Big] \le E\Big[\sup_i |\delta N_i| \sum_j |\delta N_j|\Big] \le n E\Big[\sup_i |\delta N_i|\Big],$$
where the middle equality uses the orthogonality of martingale increments. Taking the mesh to $0$, the right hand side goes to $0$ by continuity, so $N \equiv 0$, and letting $n \to \infty$ gives $M \equiv 0$.

Theorem 15.4. Suppose that $M_t$ is a local martingale. Then there exists a unique increasing process $\langle M, M \rangle_t$ such that $M_t^2 - \langle M, M \rangle_t$ is a local martingale. This $\langle M, M \rangle_t$ is called the quadratic variation of $M$.

Example 15.5. If $M_t = B_t$ then $\langle M, M \rangle_t = t$. Then $B_t^2 - t$ is a martingale.

Proof. Let us first prove uniqueness. Suppose $M_t^2 - A_t$ and $M_t^2 - A_t'$ are local martingales. Then $A_t - A_t'$ is also a local martingale. By definition, $A_t$ and $A_t'$ are increasing and finite almost surely. So $A_t - A_t'$ is of finite variation. By Proposition 15.3, this shows that $A_t - A_t' = 0$ for all $t$.

Proving existence is rather difficult. Suppose $M_0 = 0$ and $M_t$ is bounded, i.e., $\sup_t |M_t| \le C$ almost surely. Fix an interval $[0, K]$ and subdivide it into $0 = t_0 < \cdots < t_n = K$. Define
$$X_t^n = \sum_i M_{t_{i-1}} (M_{t_i \wedge t} - M_{t_{i-1} \wedge t}).$$
You can check that this is a local martingale, and you can also check
$$M_{t_j}^2 - 2 X_{t_j}^n = \sum_{i=1}^j (M_{t_i} - M_{t_{i-1}})^2.$$
Now define
$$\langle M, M \rangle_t = \lim_n \sum_i (M_{t_i} - M_{t_{i-1}})^2.$$
We are going to continue next time.

16 March 1, 2017

16.1 Quadratic variation

As always, we assume continuous sample paths. This is the theorem we are trying to prove:

Theorem 16.1. Let $M$ be a local martingale. Then there exists an increasing process $\langle M, M \rangle_t$ such that $M_t^2 - \langle M, M \rangle_t$ is a local martingale.

We proved uniqueness using the fact that if there are two such processes $A, A'$ then $A - A'$ is a finite variation process that is also a local martingale. For existence, fix $K$ and subdivide $[0, K]$ into $0 = t_0 < \cdots < t_n = K$. We define
$$X_t^n = M_{t_0}(M_{t_1 \wedge t} - M_{t_0 \wedge t}) + \cdots + M_{t_{n-1}}(M_{t_n \wedge t} - M_{t_{n-1} \wedge t}).$$
Then we immediately have
$$M_{t_j}^2 - 2 X_{t_j}^n = \sum_{i=1}^j (M_{t_i} - M_{t_{i-1}})^2.$$

Lemma 16.2. Assuming that $|M_t| \le C$ (so that $M$ is actually a martingale),
$$\lim_{m, n, \, \mathrm{mesh} \to 0} E[(X_K^n - X_K^m)^2] = 0.$$

Proof. Let us consider the simple case where $n = 5$ and $m = 3$, the coarser subdivision being $\{t_0, t_1, t_3, t_5\}$. Abbreviating $M_j = M_{t_j}$, the difference of the two sums telescopes to
$$X_K^n - X_K^m = (M_2 - M_1)(M_3 - M_2) + (M_4 - M_3)(M_5 - M_4),$$
and hence, the cross term vanishing by the martingale property,
$$E[(X_K^n - X_K^m)^2] = E\big[(M_2 - M_1)^2 (M_3 - M_2)^2 + (M_4 - M_3)^2 (M_5 - M_4)^2\big]$$
$$\le E\Big[\max_j |M_j - M_{j-1}|^2 \sum_j (M_j - M_{j-1})^2\Big] \le E\big[\max_j (M_j - M_{j-1})^4\big]^{1/2} \, E\Big[\Big(\sum_j (M_j - M_{j-1})^2\Big)^2\Big]^{1/2}$$
by the Cauchy–Schwarz inequality.

Here, we see that
$$E\Big[\Big(\sum_j (M_j - M_{j-1})^2\Big)^2\Big] = \sum_j E[(M_j - M_{j-1})^4] + 2 \sum_j E\Big[(M_j - M_{j-1})^2 \, E\Big[\sum_{k = j+1}^n (M_k - M_{k-1})^2 \,\Big|\, \mathcal{F}_j\Big]\Big]$$
$$\le 4C^2 \sum_j E[(M_j - M_{j-1})^2] + 2 \sum_j E\big[(M_j - M_{j-1})^2 \, E[(M_K - M_j)^2 \mid \mathcal{F}_j]\big]$$
$$\le 12 C^2 \, E\Big[\sum_j (M_j - M_{j-1})^2\Big] = 12 C^2 \, E[(M_K - M_0)^2] \le 48 C^4,$$
using $E[\sum_{k > j} (M_k - M_{k-1})^2 \mid \mathcal{F}_j] = E[(M_K - M_j)^2 \mid \mathcal{F}_j] \le 4C^2$. So taking $n, m \to \infty$ with the mesh going to $0$, we get $E[\max_j (M_j - M_{j-1})^4] \to 0$ by continuity and dominated convergence, and hence $E[(X_K^n - X_K^m)^2] \to 0$.

Now let us go back to the original theorem. We check that $M_{t_j}^2 - 2 X_{t_j}^n$ is an increasing process along the subdivision points. Take $n \to \infty$. By the lemma, $X_{t \wedge K}^n \to Y_{t \wedge K}$ in $L^2$. You can also check that $Y_{t \wedge K}$ has continuous sample paths. Define
$$\langle M, M \rangle_{t \wedge K} = M_{t \wedge K}^2 - 2 Y_{t \wedge K}.$$
If $\{t_i^n\}$ is a subdivision of $[0, t]$, we have
$$\sum_{i=1}^n (M_{t_i^n} - M_{t_{i-1}^n})^2 \to \langle M, M \rangle_t$$
in $L^2$. Also, $Y_{t \wedge K}$ is a local martingale.

This construction works for all fixed $K$. For every $K$, we get an $A_t^K$ such that $M_{t \wedge K}^2 - A_{t \wedge K}^K$ is a martingale. Then by uniqueness, $A_t^{K+1} = A_t^K$ for $t \le K$ almost surely. This means that we can just take the union of all the $A_t^K$.
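The defining limit $\sum_i (M_{t_i} - M_{t_{i-1}})^2 \to \langle M, M \rangle_t$ is easy to see numerically for Brownian motion, where $\langle B, B \rangle_t = t$ (Example 15.5). A minimal stdlib sketch (the seed and discretization are our own choices):

```python
import math
import random

rng = random.Random(42)
t, n = 2.0, 200_000
dt = t / n

# Gaussian increments of one Brownian path on [0, t]
increments = [rng.gauss(0.0, math.sqrt(dt)) for _ in range(n)]

# sum of squared increments over the subdivision: ≈ <B, B>_t = t
qv = sum(dx * dx for dx in increments)
```

The fluctuation of `qv` around $t$ has variance $2 t^2 / n$, so it shrinks as the mesh is refined, consistent with the $L^2$ convergence in the proof.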

17 March 3, 2017

Last class we almost finished proving the following theorem.

Theorem 17.1. Let $M$ be a local martingale. Then there exists a unique increasing process $\langle M, M \rangle_t$ such that $M_t^2 - \langle M, M \rangle_t$ is a local martingale.

In the proof, we haven't checked that the $L^2$ limit $X_t^n \to X_t$ has continuous sample paths. To show this, it suffices to check that
$$E\Big[\sup_{s \le t} |X_s^m - X_s^n|^2\Big] \to 0$$
as $n, m \to \infty$. By Doob's inequality,
$$E\Big[\sup_{s \le t} |X_s^m - X_s^n|^2\Big] \le 4 E[|X_t^m - X_t^n|^2].$$
Then this is clear from the $L^2$ convergence.

For the general case, define $T_n = \inf\{t : |M_t| \ge n\}$. Then $M^{T_n}$ is a bounded martingale and so there exists an increasing process $A_t^n$ such that $(M^{T_n})^2 - A_t^n$ is a martingale. By uniqueness, $A_{t \wedge T_n}^{n+1} = A_t^n$. Now we can take the limit $A_t = \lim_n A_t^n$. Then
$$(M_{T_n \wedge t})^2 - A_{T_n \wedge t} = (M^{T_n})_t^2 - A_{t \wedge T_n}^n$$
is a martingale. This means that $M_t^2 - A_t$ is a local martingale.

17.1 Consequences of the quadratic variation

Corollary 17.2. Let $M$ be a local martingale with $M_0 = 0$. Then $M \equiv 0$ if and only if $\langle M, M \rangle \equiv 0$.

Proof. Suppose $\langle M, M \rangle \equiv 0$. Then $M^2$ is a nonnegative local martingale, hence $M_t^2$ is a supermartingale. Since $M_0 = 0$, we immediately get $M \equiv 0$.

Corollary 17.3. Assume $M$ is a local martingale with $M_0 \in L^2$. Then $E\langle M, M \rangle_\infty < \infty$ if and only if $M$ is a true martingale with $E[M_t^2] \le C$ for all $t$.

Proof. Suppose $M$ is a martingale with $M_0 = 0$ and $E[M_t^2] \le C$. Then by Doob's inequality,
$$E\Big[\sup_{t \le T} M_t^2\Big] \le 4 E[M_T^2] \le 4C.$$
Let $S_n = \inf\{t : \langle M, M \rangle_t \ge n\}$. Then
$$|M_{S_n \wedge t}^2 - \langle M, M \rangle_{S_n \wedge t}| \le \sup_t M_t^2 + n.$$
But $\sup_t M_t^2$ is in $L^1$. Then
$$\lim_n E[M_{S_n \wedge t}^2 - \langle M, M \rangle_{S_n \wedge t}] = E[M_t^2 - \langle M, M \rangle_t] = 0.$$

By monotone convergence, we have $E\langle M, M \rangle_t = E[M_t^2] \le C$ for all $t$, and hence $E\langle M, M \rangle_\infty \le C < \infty$.

Conversely, suppose $E\langle M, M \rangle_\infty < \infty$. Let $T_n = \inf\{t : |M_t| \ge n\}$. We have
$$|M_{T_n \wedge t}^2 - \langle M, M \rangle_{T_n \wedge t}| \le n^2 + \langle M, M \rangle_\infty \in L^1.$$
Then $M_{T_n \wedge t}^2 - \langle M, M \rangle_{T_n \wedge t}$ is a uniformly integrable martingale, and so
$$\lim_n E[M_{T_n \wedge t}^2 - \langle M, M \rangle_{T_n \wedge t}] = E[M_t^2 - \langle M, M \rangle_t] = 0,$$
and so $E[M_t^2] = E[\langle M, M \rangle_t] \le E\langle M, M \rangle_\infty < \infty$. This also implies that $M$ is a true martingale.

18 March 6, 2017

Let us try to finish this chapter.

Theorem 18.1. If $\sup_t E[M_t^2] < \infty$, then $E\langle M, M \rangle_\infty < \infty$ and $M_t^2 - \langle M, M \rangle_t$ is a uniformly integrable martingale. Conversely, if $M$ is a local martingale and $E\langle M, M \rangle_\infty < \infty$, then $\sup_t E[M_t^2] < \infty$.

18.1 Bracket of local martingales

Definition 18.2. If $M$ and $N$ are two local martingales, we define their bracket as
$$\langle M, N \rangle_t = \frac{1}{2}\big[\langle M + N, M + N \rangle_t - \langle M, M \rangle_t - \langle N, N \rangle_t\big].$$

Proposition 18.3 (Le Gall 4.15). In probability,
$$\lim \sum_i (M_{t_i} - M_{t_{i-1}})(N_{t_i} - N_{t_{i-1}}) = \langle M, N \rangle_t.$$
Furthermore, if $M$ and $N$ are two $L^2$ martingales, then $E\langle M, N \rangle_\infty = E[M_\infty N_\infty]$.

Definition 18.4. We say that $M$ is orthogonal to $N$ if $\langle M, N \rangle \equiv 0$.

Theorem 18.5 (Kunita–Watanabe). If $M$ and $N$ are two local martingales, then
$$\int_0^\infty |H_s(\omega)| \, |K_s(\omega)| \, |d\langle M, N \rangle_s| \le \Big(\int_0^\infty H_s^2 \, d\langle M, M \rangle_s\Big)^{1/2} \Big(\int_0^\infty K_s^2 \, d\langle N, N \rangle_s\Big)^{1/2}.$$

Proof. You can just subdivide the intervals finely and use the approximation formula for $\langle M, N \rangle_t$.

Definition 18.6. A process $X_t$ is called a semimartingale if $X_t = M_t + A_t$ where $M_t$ is a local martingale and $A_t$ is a finite variation process. Note that such a decomposition is unique, since a finite variation local martingale is zero.

Proposition 18.7. For two semimartingales $X = M + A$ and $Y = N + B$, we define $\langle X, Y \rangle = \langle M, N \rangle$. Then
$$\sum_i (X_{t_i} - X_{t_{i-1}})(Y_{t_i} - Y_{t_{i-1}}) \to \langle X, Y \rangle = \langle M, N \rangle.$$

Proof. We only have to check that the cross terms such as
$$\sum_i (N_{t_i} - N_{t_{i-1}})(A_{t_i} - A_{t_{i-1}})$$
go to zero. We use the Cauchy–Schwarz inequality. We have
$$\sum_i (A_{t_i} - A_{t_{i-1}})^2 \le \sup_i |A_{t_i} - A_{t_{i-1}}| \sum_i |A_{t_i} - A_{t_{i-1}}|.$$
The first factor goes to zero by continuity, and the second factor is bounded by the total variation.
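Definition 18.2 and Proposition 18.3 admit a quick numerical sanity check (the setup below is our own, with discretized paths standing in for the continuous ones): for two independent Brownian motions, the cross-variation sum $\sum_i (M_{t_i} - M_{t_{i-1}})(N_{t_i} - N_{t_{i-1}})$ is close to $\langle M, N \rangle_t = 0$, and the polarization formula reproduces the same quantity term by term.

```python
import math
import random

rng = random.Random(7)
t, n = 1.0, 100_000
dt = t / n

# increments of two independent Brownian motions M and N on [0, t]
dM = [rng.gauss(0.0, math.sqrt(dt)) for _ in range(n)]
dN = [rng.gauss(0.0, math.sqrt(dt)) for _ in range(n)]

# approximation formula for the bracket (Proposition 18.3): ≈ <M, N>_t = 0
cross = sum(m * k for m, k in zip(dM, dN))

# polarization formula (Definition 18.2); algebraically identical to `cross`
polar = 0.5 * (sum((m + k) ** 2 for m, k in zip(dM, dN))
               - sum(m * m for m in dM)
               - sum(k * k for k in dN))
```

Since $\tfrac{1}{2}[(m + k)^2 - m^2 - k^2] = mk$, the two computations agree up to floating-point error; orthogonality of the two martingales shows up as `cross` being small.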

18.2 Stochastic integration

We define the space
$$\mathcal{H}^2 = \Big\{ M \text{ a continuous martingale with } M_0 = 0 \text{ and } \sup_t E[M_t^2] < \infty \Big\}.$$
We then have $E\langle M, M \rangle_\infty = E[M_\infty^2]$ and $E[M_\infty \mid \mathcal{F}_t] = M_t$. We define the inner product as
$$(M, N) = E[\langle M, N \rangle_\infty] = E[M_\infty N_\infty].$$
Also define the space of elementary processes as
$$\mathcal{E} = \Big\{ H_s(\omega) = \sum_i H_i(\omega) \mathbf{1}_{(t_i, t_{i+1}]}(s) : H_i \in \mathcal{F}_{t_i} \Big\}.$$
We can then define the integral as
$$\int_0^\infty H_s(\omega) \, dM_s = \sum_i H_i(\omega)(M_{t_{i+1}} - M_{t_i}).$$
This is just like Riemann integration.

Theorem 18.8. $\mathcal{H}^2$ is a Hilbert space.

Proof. Suppose there is a Cauchy sequence, i.e., $E[(M_\infty^n - M_\infty^m)^2] \to 0$. Then $M_\infty^n \to Z$ in $L^2$. Our goal is to find an $M_t$ with continuous sample paths such that $M_t^n \to M_t$ in some sense.

19 March 8, 2017

We defined $\mathcal{H}^2$ to be the space of continuous martingales with $M_0 = 0$ and $\sup_t E[M_t^2] < \infty$, with inner product $(M, N) = E[M_\infty N_\infty] = E\langle M, N \rangle_\infty$.

Proposition 19.1. $\mathcal{H}^2$ is a Hilbert space.

Proof. Suppose $\lim_{m, n} E[(M_\infty^m - M_\infty^n)^2] = 0$, so that $(M^n)$ is a Cauchy sequence. Then there exists a $Z$ such that $M_\infty^n \to Z$ in $L^2(P)$. We also have, for any fixed $t$, $E[(M_t^m - M_t^n)^2] \to 0$ and so $M_t^n \to M_t$ in $L^2$. What we now need to show is that $M$ has continuous sample paths. By Doob's inequality,
$$E[\sup_t (M_t^n - M_t^m)^2] \le 4 E[(M_\infty^n - M_\infty^m)^2].$$
This shows that you can find a subsequence such that
$$\sum_{k=1}^\infty E\Big[\sup_t |M_t^{n_{k+1}} - M_t^{n_k}|\Big] \le \sum_{k=1}^\infty 2 \, E[(M_\infty^{n_{k+1}} - M_\infty^{n_k})^2]^{1/2} < \infty.$$
In particular,
$$Y_t = M_t^{n_1} + \sum_k (M_t^{n_{k+1}} - M_t^{n_k})$$
converges uniformly almost surely, and is a martingale with continuous sample paths. Also $Y_\infty = Z$, and so $M^n \to Y$ in $\mathcal{H}^2$.

19.1 Elementary process

Definition 19.2. We define the space $\mathcal{E}$ of elementary processes as
$$\mathcal{E} = \Big\{ \sum_{i=0}^{p-1} H_i(\omega) \mathbf{1}_{(t_i, t_{i+1}]} : H_i \in \mathcal{F}_{t_i}, \text{ with } \sum_i H_i^2 (\langle M, M \rangle_{t_{i+1}} - \langle M, M \rangle_{t_i}) < \infty \Big\}.$$
We also define
$$L^2(M) = \Big\{ H_s \text{ such that } E \int_0^\infty H_s^2 \, d\langle M, M \rangle_s < \infty \Big\}.$$

Proposition 19.3. If $M \in \mathcal{H}^2$, then the space $\mathcal{E}$ is dense in $L^2(M)$.

Theorem 19.4. For any $M \in \mathcal{H}^2$, we can define
$$(H \cdot M)_t = \sum_{i=0}^{p-1} H_i(\omega)(M_{t_{i+1} \wedge t} - M_{t_i \wedge t}) = \int_0^t H \, dM \in \mathcal{H}^2.$$

This map $\mathcal{E} \to \mathcal{H}^2$ given by $H \mapsto H \cdot M$ is an isometry:
$$\|H \cdot M\|_{\mathcal{H}^2}^2 = E \int_0^\infty H_s^2 \, d\langle M, M \rangle_s.$$
So there exists a unique extension $L^2(M) \to \mathcal{H}^2$, denoted by $\int H \, dM$. Furthermore,
(1) $\langle H \cdot M, N \rangle = H \cdot \langle M, N \rangle$,
(2) $(\mathbf{1}_{[0,T]} H) \cdot M = (H \cdot M)^T = H \cdot M^T$.

Roughly speaking, stochastic integration is just Riemann integration on elementary functions, which has a unique extension. You can check that
$$\langle H \cdot M, H \cdot M \rangle_t = \sum_i H_i^2 (\langle M, M \rangle_{t_{i+1} \wedge t} - \langle M, M \rangle_{t_i \wedge t}).$$

Proof. Since $(H \cdot M)^2 - H^2 \cdot \langle M, M \rangle$ is a martingale, we have
$$E[(H \cdot M)_\infty^2] = E \int_0^\infty H_s^2 \, d\langle M, M \rangle_s.$$
This shows that stochastic integration is an $L^2$ isometry. Then there exists a unique extension.

Now we want to check that $\langle H \cdot M, N \rangle = H \cdot \langle M, N \rangle$. Note that $\langle M, N \rangle$ is a finite variation process, since $\langle M, M \rangle$, $\langle N, N \rangle$, and $\langle M + N, M + N \rangle$ are all increasing, and also $E|\langle M, N \rangle_\infty| < \infty$. For elementary processes, we have
$$\Big\langle \int_0^\cdot H_s \, dM_s, \, N \Big\rangle_t = \int_0^t H_s \, d\langle M, N \rangle_s.$$
Then it extends to all processes.
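For an elementary integrand, $(H \cdot B)_\infty = \sum_i H_i (B_{t_{i+1}} - B_{t_i})$, and the isometry for $M = B$ a Brownian motion reads $E[(H \cdot B)_\infty^2] = \int H_s^2 \, ds$, since $d\langle B, B \rangle_s = ds$. A Monte Carlo sketch with a deterministic step function $H$ (the values of $H$ and the partition are arbitrary choices of ours):

```python
import random

def elementary_integral(H_vals, B):
    """(H·B)_∞ = Σ_i H_i (B_{t_{i+1}} - B_{t_i}) for an elementary
    integrand H = Σ_i H_i 1_{(t_i, t_{i+1}]}."""
    return sum(h * (b1 - b0) for h, b0, b1 in zip(H_vals, B, B[1:]))

rng = random.Random(1)
H_vals = [1.0, -2.0, 3.0]            # constant on (0,1], (1,2], (2,3]
exact = sum(h * h for h in H_vals)   # ∫ H² ds = 1 + 4 + 9 = 14

n_paths = 200_000
acc = 0.0
for _ in range(n_paths):
    B = [0.0]
    for _ in H_vals:                 # unit-length time steps of a BM path
        B.append(B[-1] + rng.gauss(0.0, 1.0))
    acc += elementary_integral(H_vals, B) ** 2

second_moment = acc / n_paths        # ≈ E[(H·B)²] = 14 by the isometry
```

The empirical second moment of the integral matches $\int H_s^2\, ds$, which is exactly the $L^2$ isometry that drives the extension from $\mathcal{E}$ to $L^2(M)$.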

20 March 20, 2017

20.1 Review

Today we will review, because all the properties are hard to remember. All processes have continuous sample paths and are adapted.

1. Finite variation processes: $A_t$ is a finite variation process if its sample paths have finite variation almost surely. Classical theory of integration applies here: $\int_0^t H_s(\omega) \, dA_s(\omega)$ is defined by Lebesgue–Stieltjes integration. (Note that $dA_s$ need not be positive.)

2. Continuous local martingales: Usually we assume $M_0 = 0$. $M_t$ is a local martingale if there exists an increasing sequence $T_n \uparrow \infty$ almost surely such that $(M^{T_n})_t = M_{T_n \wedge t}$ is a uniformly integrable martingale. In this case, we say that $T_n$ reduces the local martingale. Why do we look at these objects? If $X_n$ is a sub/supermartingale and $\sup_n E|X_n|^p < \infty$ for some $p \ge 1$, then $X_n \to X_\infty$ almost surely. Moreover, if $p > 1$ then $X_n \to X_\infty$ in $L^p$ (by Doob's inequality). Furthermore, if $X_n$ is uniformly integrable then $X_n \to X_\infty$ also in $L^1$. We really need uniform integrability to run the theory, and so we have to stick in a stopping time to make things uniformly integrable.

3. Quadratic variation: If $M$ is a local martingale, then there exists a unique increasing process $\langle M, M \rangle_t$ such that $M_t^2 - \langle M, M \rangle_t$ is a local martingale. This is constructed as
$$\langle M, M \rangle_t = \lim \sum_i (M_{t_{i+1}} - M_{t_i})^2.$$
For $M = B$ a Brownian motion, we have $\langle B, B \rangle_t = t$.

4. Some properties of local martingales: A nonnegative local martingale $M$ with $M_0 \in L^1$ is a supermartingale. (This follows from Fatou's lemma.) If there exists a $Z \in L^1$ such that $|M_t| \le Z$ for all $t$, then $M$ is a uniformly integrable martingale. If $M$ with $M_0 = 0$ is both a local martingale and a finite variation process, then $M \equiv 0$. For $M$ a local martingale with $M_0 = 0$, $\langle M, M \rangle \equiv 0$ if and only if $M \equiv 0$.


More information

ELEMENTS OF PROBABILITY THEORY

ELEMENTS OF PROBABILITY THEORY ELEMENTS OF PROBABILITY THEORY Elements of Probability Theory A collection of subsets of a set Ω is called a σ algebra if it contains Ω and is closed under the operations of taking complements and countable

More information

9 Brownian Motion: Construction

9 Brownian Motion: Construction 9 Brownian Motion: Construction 9.1 Definition and Heuristics The central limit theorem states that the standard Gaussian distribution arises as the weak limit of the rescaled partial sums S n / p n of

More information

MA8109 Stochastic Processes in Systems Theory Autumn 2013

MA8109 Stochastic Processes in Systems Theory Autumn 2013 Norwegian University of Science and Technology Department of Mathematical Sciences MA819 Stochastic Processes in Systems Theory Autumn 213 1 MA819 Exam 23, problem 3b This is a linear equation of the form

More information

Dynkin (λ-) and π-systems; monotone classes of sets, and of functions with some examples of application (mainly of a probabilistic flavor)

Dynkin (λ-) and π-systems; monotone classes of sets, and of functions with some examples of application (mainly of a probabilistic flavor) Dynkin (λ-) and π-systems; monotone classes of sets, and of functions with some examples of application (mainly of a probabilistic flavor) Matija Vidmar February 7, 2018 1 Dynkin and π-systems Some basic

More information

Hardy-Stein identity and Square functions

Hardy-Stein identity and Square functions Hardy-Stein identity and Square functions Daesung Kim (joint work with Rodrigo Bañuelos) Department of Mathematics Purdue University March 28, 217 Daesung Kim (Purdue) Hardy-Stein identity UIUC 217 1 /

More information

1.1 Definition of BM and its finite-dimensional distributions

1.1 Definition of BM and its finite-dimensional distributions 1 Brownian motion Brownian motion as a physical phenomenon was discovered by botanist Robert Brown as he observed a chaotic motion of particles suspended in water. The rigorous mathematical model of BM

More information

Stochastic analysis. Paul Bourgade

Stochastic analysis. Paul Bourgade Stochastic analysis Paul Bourgade These are lecture notes from the lessons given in the fall 21 at Harvard University, and fall 216 at New York University s Courant Institute. These notes are based on

More information

ECON 2530b: International Finance

ECON 2530b: International Finance ECON 2530b: International Finance Compiled by Vu T. Chau 1 Non-neutrality of Nominal Exchange Rate 1.1 Mussa s Puzzle(1986) RER can be defined as the relative price of a country s consumption basket in

More information

Lecture 19 L 2 -Stochastic integration

Lecture 19 L 2 -Stochastic integration Lecture 19: L 2 -Stochastic integration 1 of 12 Course: Theory of Probability II Term: Spring 215 Instructor: Gordan Zitkovic Lecture 19 L 2 -Stochastic integration The stochastic integral for processes

More information

An essay on the general theory of stochastic processes

An essay on the general theory of stochastic processes Probability Surveys Vol. 3 (26) 345 412 ISSN: 1549-5787 DOI: 1.1214/1549578614 An essay on the general theory of stochastic processes Ashkan Nikeghbali ETHZ Departement Mathematik, Rämistrasse 11, HG G16

More information

Introduction to Random Diffusions

Introduction to Random Diffusions Introduction to Random Diffusions The main reason to study random diffusions is that this class of processes combines two key features of modern probability theory. On the one hand they are semi-martingales

More information

Stochastic Processes. Winter Term Paolo Di Tella Technische Universität Dresden Institut für Stochastik

Stochastic Processes. Winter Term Paolo Di Tella Technische Universität Dresden Institut für Stochastik Stochastic Processes Winter Term 2016-2017 Paolo Di Tella Technische Universität Dresden Institut für Stochastik Contents 1 Preliminaries 5 1.1 Uniform integrability.............................. 5 1.2

More information

Stochastic Calculus. Alan Bain

Stochastic Calculus. Alan Bain Stochastic Calculus Alan Bain 1. Introduction The following notes aim to provide a very informal introduction to Stochastic Calculus, and especially to the Itô integral and some of its applications. They

More information

STATISTICS 385: STOCHASTIC CALCULUS HOMEWORK ASSIGNMENT 4 DUE NOVEMBER 23, = (2n 1)(2n 3) 3 1.

STATISTICS 385: STOCHASTIC CALCULUS HOMEWORK ASSIGNMENT 4 DUE NOVEMBER 23, = (2n 1)(2n 3) 3 1. STATISTICS 385: STOCHASTIC CALCULUS HOMEWORK ASSIGNMENT 4 DUE NOVEMBER 23, 26 Problem Normal Moments (A) Use the Itô formula and Brownian scaling to check that the even moments of the normal distribution

More information

Theorem 2.1 (Caratheodory). A (countably additive) probability measure on a field has an extension. n=1

Theorem 2.1 (Caratheodory). A (countably additive) probability measure on a field has an extension. n=1 Chapter 2 Probability measures 1. Existence Theorem 2.1 (Caratheodory). A (countably additive) probability measure on a field has an extension to the generated σ-field Proof of Theorem 2.1. Let F 0 be

More information

The Pedestrian s Guide to Local Time

The Pedestrian s Guide to Local Time The Pedestrian s Guide to Local Time Tomas Björk, Department of Finance, Stockholm School of Economics, Box 651, SE-113 83 Stockholm, SWEDEN tomas.bjork@hhs.se November 19, 213 Preliminary version Comments

More information

STAT 331. Martingale Central Limit Theorem and Related Results

STAT 331. Martingale Central Limit Theorem and Related Results STAT 331 Martingale Central Limit Theorem and Related Results In this unit we discuss a version of the martingale central limit theorem, which states that under certain conditions, a sum of orthogonal

More information

Martingales. Chapter Definition and examples

Martingales. Chapter Definition and examples Chapter 2 Martingales 2.1 Definition and examples In this chapter we introduce and study a very important class of stochastic processes: the socalled martingales. Martingales arise naturally in many branches

More information

Stochastic Processes

Stochastic Processes Stochastic Processes A very simple introduction Péter Medvegyev 2009, January Medvegyev (CEU) Stochastic Processes 2009, January 1 / 54 Summary from measure theory De nition (X, A) is a measurable space

More information

1 Brownian Local Time

1 Brownian Local Time 1 Brownian Local Time We first begin by defining the space and variables for Brownian local time. Let W t be a standard 1-D Wiener process. We know that for the set, {t : W t = } P (µ{t : W t = } = ) =

More information

Preliminary Exam: Probability 9:00am 2:00pm, Friday, January 6, 2012

Preliminary Exam: Probability 9:00am 2:00pm, Friday, January 6, 2012 Preliminary Exam: Probability 9:00am 2:00pm, Friday, January 6, 202 The exam lasts from 9:00am until 2:00pm, with a walking break every hour. Your goal on this exam should be to demonstrate mastery of

More information

IEOR 4701: Stochastic Models in Financial Engineering. Summer 2007, Professor Whitt. SOLUTIONS to Homework Assignment 9: Brownian motion

IEOR 4701: Stochastic Models in Financial Engineering. Summer 2007, Professor Whitt. SOLUTIONS to Homework Assignment 9: Brownian motion IEOR 471: Stochastic Models in Financial Engineering Summer 27, Professor Whitt SOLUTIONS to Homework Assignment 9: Brownian motion In Ross, read Sections 1.1-1.3 and 1.6. (The total required reading there

More information

Stochastic Calculus (Lecture #3)

Stochastic Calculus (Lecture #3) Stochastic Calculus (Lecture #3) Siegfried Hörmann Université libre de Bruxelles (ULB) Spring 2014 Outline of the course 1. Stochastic processes in continuous time. 2. Brownian motion. 3. Itô integral:

More information

Introduction to stochastic analysis

Introduction to stochastic analysis Introduction to stochastic analysis A. Guionnet 1 2 Department of Mathematics, MIT, 77 Massachusetts Avenue, Cambridge, MA 2139-437, USA. Abstract These lectures notes are notes in progress designed for

More information

3 (Due ). Let A X consist of points (x, y) such that either x or y is a rational number. Is A measurable? What is its Lebesgue measure?

3 (Due ). Let A X consist of points (x, y) such that either x or y is a rational number. Is A measurable? What is its Lebesgue measure? MA 645-4A (Real Analysis), Dr. Chernov Homework assignment 1 (Due ). Show that the open disk x 2 + y 2 < 1 is a countable union of planar elementary sets. Show that the closed disk x 2 + y 2 1 is a countable

More information

4th Preparation Sheet - Solutions

4th Preparation Sheet - Solutions Prof. Dr. Rainer Dahlhaus Probability Theory Summer term 017 4th Preparation Sheet - Solutions Remark: Throughout the exercise sheet we use the two equivalent definitions of separability of a metric space

More information

Representations of Gaussian measures that are equivalent to Wiener measure

Representations of Gaussian measures that are equivalent to Wiener measure Representations of Gaussian measures that are equivalent to Wiener measure Patrick Cheridito Departement für Mathematik, ETHZ, 89 Zürich, Switzerland. E-mail: dito@math.ethz.ch Summary. We summarize results

More information

Doléans measures. Appendix C. C.1 Introduction

Doléans measures. Appendix C. C.1 Introduction Appendix C Doléans measures C.1 Introduction Once again all random processes will live on a fixed probability space (Ω, F, P equipped with a filtration {F t : 0 t 1}. We should probably assume the filtration

More information

Chapter 1. Poisson processes. 1.1 Definitions

Chapter 1. Poisson processes. 1.1 Definitions Chapter 1 Poisson processes 1.1 Definitions Let (, F, P) be a probability space. A filtration is a collection of -fields F t contained in F such that F s F t whenever s

More information

The Heine-Borel and Arzela-Ascoli Theorems

The Heine-Borel and Arzela-Ascoli Theorems The Heine-Borel and Arzela-Ascoli Theorems David Jekel October 29, 2016 This paper explains two important results about compactness, the Heine- Borel theorem and the Arzela-Ascoli theorem. We prove them

More information

Martingale Theory for Finance

Martingale Theory for Finance Martingale Theory for Finance Tusheng Zhang October 27, 2015 1 Introduction 2 Probability spaces and σ-fields 3 Integration with respect to a probability measure. 4 Conditional expectation. 5 Martingales.

More information

Probability and Measure

Probability and Measure Part II Year 2018 2017 2016 2015 2014 2013 2012 2011 2010 2009 2008 2007 2006 2005 2018 84 Paper 4, Section II 26J Let (X, A) be a measurable space. Let T : X X be a measurable map, and µ a probability

More information

Reflected Brownian Motion

Reflected Brownian Motion Chapter 6 Reflected Brownian Motion Often we encounter Diffusions in regions with boundary. If the process can reach the boundary from the interior in finite time with positive probability we need to decide

More information

STAT 7032 Probability Spring Wlodek Bryc

STAT 7032 Probability Spring Wlodek Bryc STAT 7032 Probability Spring 2018 Wlodek Bryc Created: Friday, Jan 2, 2014 Revised for Spring 2018 Printed: January 9, 2018 File: Grad-Prob-2018.TEX Department of Mathematical Sciences, University of Cincinnati,

More information

ELEMENTS OF STOCHASTIC CALCULUS VIA REGULARISATION. A la mémoire de Paul-André Meyer

ELEMENTS OF STOCHASTIC CALCULUS VIA REGULARISATION. A la mémoire de Paul-André Meyer ELEMENTS OF STOCHASTIC CALCULUS VIA REGULARISATION A la mémoire de Paul-André Meyer Francesco Russo (1 and Pierre Vallois (2 (1 Université Paris 13 Institut Galilée, Mathématiques 99 avenue J.B. Clément

More information

FOUNDATIONS OF MARTINGALE THEORY AND STOCHASTIC CALCULUS FROM A FINANCE PERSPECTIVE

FOUNDATIONS OF MARTINGALE THEORY AND STOCHASTIC CALCULUS FROM A FINANCE PERSPECTIVE FOUNDATIONS OF MARTINGALE THEORY AND STOCHASTIC CALCULUS FROM A FINANCE PERSPECTIVE JOSEF TEICHMANN 1. Introduction The language of mathematical Finance allows to express many results of martingale theory

More information

Exercises Measure Theoretic Probability

Exercises Measure Theoretic Probability Exercises Measure Theoretic Probability Chapter 1 1. Prove the folloing statements. (a) The intersection of an arbitrary family of d-systems is again a d- system. (b) The intersection of an arbitrary family

More information

University of Regina. Lecture Notes. Michael Kozdron

University of Regina. Lecture Notes. Michael Kozdron University of Regina Statistics 252 Mathematical Statistics Lecture Notes Winter 2005 Michael Kozdron kozdron@math.uregina.ca www.math.uregina.ca/ kozdron Contents 1 The Basic Idea of Statistics: Estimating

More information

Probability and Measure

Probability and Measure Chapter 4 Probability and Measure 4.1 Introduction In this chapter we will examine probability theory from the measure theoretic perspective. The realisation that measure theory is the foundation of probability

More information

Stochastic Calculus. Michael R. Tehranchi

Stochastic Calculus. Michael R. Tehranchi Stochastic Calculus Michael R. Tehranchi Contents Chapter 1. A possible motivation: diffusions 5 1. Markov chains 5 2. Continuous-time Markov processes 6 3. Stochastic differential equations 6 4. Markov

More information

Bernardo D Auria Stochastic Processes /10. Notes. Abril 13 th, 2010

Bernardo D Auria Stochastic Processes /10. Notes. Abril 13 th, 2010 1 Stochastic Calculus Notes Abril 13 th, 1 As we have seen in previous lessons, the stochastic integral with respect to the Brownian motion shows a behavior different from the classical Riemann-Stieltjes

More information