Brownian motion

Samy Tindel, Purdue University
Probability Theory 2 - MA 539

Mostly taken from Brownian Motion and Stochastic Calculus by I. Karatzas and S. Shreve
Outline
1. Stochastic processes
2. Definition and construction of the Wiener process
3. First properties
4. Martingale property
5. Markov property
6. Pathwise properties
Stochastic processes

Definition 1. Let:
- (Ω, F, P) a probability space
- I ⊂ R_+ an interval
- {X_t; t ∈ I} a family of R^n-valued random variables

Then:
1. If ω ↦ X_t(ω) is measurable for each t ∈ I, X is called a stochastic process
2. t ↦ X_t(ω) is called a path
3. X is continuous if its paths are continuous almost surely
Modifications of processes

Definition 2. Let X and Y be two processes on (Ω, F, P).
1. X is a modification of Y if P(X_t = Y_t) = 1, for all t ∈ I
2. X and Y are indistinguishable if P(X_t = Y_t for all t ∈ I) = 1

Remarks:
(i) Relation (2) implicitly means that (X_t = Y_t for all t ∈ I) ∈ F
(ii) (2) is much stronger than (1)
(iii) If X and Y are continuous, (1) ⇒ (2)
Filtrations

Filtration: an increasing family of σ-algebras, i.e. if s < t, then F_s ⊂ F_t ⊂ F.

Interpretation: F_t summarizes the information available at time t.

Negligible sets: N = {F ∈ F; P(F) = 0}

Complete filtration: whenever N ⊂ F_t for all t ∈ I

Stochastic basis: (Ω, F, (F_t)_{t ∈ I}, P) with (F_t)_{t ∈ I} complete

Remark: a filtration (F_t)_{t ∈ I} can always be assumed complete: one replaces F_t by F̂_t = σ(F_t, N).
Adaptation

Definition 3. Let:
- (Ω, F, (F_t)_{t ∈ I}, P) a stochastic basis
- {X_t; t ∈ I} a stochastic process

We say that X is F_t-adapted if for all t ∈ I:
X_t : (Ω, F_t) → (R^m, B(R^m)) is measurable.

Remarks:
(i) Let F_t^X = σ{X_s; s ≤ t} be the natural filtration of X. The process X is always F_t^X-adapted.
(ii) A process X is F_t-adapted iff F_t^X ⊂ F_t.
Section 2: Definition and construction of the Wiener process
Definition of the Wiener process

Notation: for a function f, δf_st ≡ f_t - f_s.

Definition 4. Let:
- (Ω, F, P) a probability space
- {W_t; t ≥ 0} an R-valued stochastic process

We say that W is a Wiener process if:
1. W_0 = 0 almost surely
2. For n ≥ 1 and 0 = t_0 < t_1 < ... < t_n, the increments δW_{t_0 t_1}, δW_{t_1 t_2}, ..., δW_{t_{n-1} t_n} are independent
3. For 0 ≤ s < t we have δW_st ~ N(0, t - s)
4. W has continuous paths almost surely
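The increment law in item 3 can be checked numerically. The following sketch (not from the slides; it assumes NumPy) simulates discretized Wiener paths by summing independent Gaussian increments and verifies empirically that δW_st has mean 0 and variance t - s.

```python
import numpy as np

# Simulate Wiener paths on [0, 1] by cumulating independent Gaussian
# increments, then check Definition 4's law delta W_{st} ~ N(0, t - s).
rng = np.random.default_rng(0)

n_paths, n_steps, T = 20_000, 100, 1.0
dt = T / n_steps
# Each row is one discretized path W_{t_0}, ..., W_{t_n} with W_0 = 0.
increments = rng.normal(0.0, np.sqrt(dt), size=(n_paths, n_steps))
W = np.concatenate([np.zeros((n_paths, 1)), np.cumsum(increments, axis=1)], axis=1)

# delta W_{st} for s = 0.25, t = 0.75: mean 0, variance t - s = 0.5.
s_idx, t_idx = 25, 75
dW = W[:, t_idx] - W[:, s_idx]
mean_dW, var_dW = dW.mean(), dW.var()
```

With 20,000 sample paths, the empirical mean and variance land within a few standard errors of 0 and 0.5.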
Illustration: chaotic path

[Figure: a simulated Brownian motion path on [0, 1], values roughly in [-1.5, 1.5]]

Illustration: random path

[Figure: another simulated Brownian motion path on [0, 1]]
Existence of the Wiener process

Theorem 5. There exists a probability space (Ω, F, P) on which one can construct a Wiener process.

Classical constructions:
- Kolmogorov's extension theorem
- Limit of a renormalized random walk
- Lévy-Ciesielski's construction
Haar functions

Definition 6. We define a family of functions {h_k : [0, 1] → R; k ≥ 0}:
- h_0(t) = 1
- h_1(t) = 1_{[0,1/2]}(t) - 1_{(1/2,1]}(t)
- for n ≥ 1 and 2^n ≤ k < 2^{n+1}:
  h_k(t) = 2^{n/2} 1_{[(k-2^n)/2^n, (k-2^n+1/2)/2^n]}(t) - 2^{n/2} 1_{((k-2^n+1/2)/2^n, (k-2^n+1)/2^n]}(t)

Lemma 7. The functions {h_k : [0, 1] → R; k ≥ 0} form an orthonormal basis of L²([0, 1]).
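Lemma 7's orthonormality can be verified numerically. The sketch below (an illustration assuming NumPy, with the same indexing as Definition 6) builds h_0, ..., h_15 on a dyadic midpoint grid, on which these step functions are integrated exactly.

```python
import numpy as np

# Haar family of Definition 6: h_0 = 1, h_1 = two-step function, and
# h_k supported on a dyadic interval for 2^n <= k < 2^{n+1}.
def haar(k, t):
    t = np.asarray(t, dtype=float)
    if k == 0:
        return np.ones_like(t)
    if k == 1:
        return np.where(t <= 0.5, 1.0, -1.0)
    n = int(np.floor(np.log2(k)))            # 2^n <= k < 2^{n+1}
    left = (k - 2**n) / 2**n                 # support [left, left + 2^{-n}]
    mid = left + 0.5 / 2**n
    right = left + 1.0 / 2**n
    out = np.zeros_like(t)
    out[(t >= left) & (t < mid)] = 2 ** (n / 2)
    out[(t >= mid) & (t < right)] = -(2 ** (n / 2))
    return out

# Midpoint rule on 2^12 dyadic cells integrates these step functions exactly.
N = 2**12
t = (np.arange(N) + 0.5) / N
H = np.stack([haar(k, t) for k in range(16)])   # h_0, ..., h_15
gram = H @ H.T / N                              # approximates <h_j, h_k>
```

The Gram matrix comes out as the 16 x 16 identity, matching the orthonormality claim.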
Haar functions: illustration

[Figure: the Haar functions h_1, ..., h_7 on [0, 1], taking values ±1, ±√2, ±2 on dyadic intervals]
Proof

Norm: for 2^n ≤ k < 2^{n+1}, we have
∫_0^1 h_k(t)² dt = 2^n ∫_{(k-2^n)/2^n}^{(k-2^n+1)/2^n} dt = 1.

Orthogonality: if k < l, we have two situations:
(i) Supp(h_k) ∩ Supp(h_l) = ∅. Then trivially ⟨h_k, h_l⟩_{L²([0,1])} = 0.
(ii) Supp(h_l) ⊂ Supp(h_k). Then h_k is constant on Supp(h_l), so if 2^n ≤ k < 2^{n+1} we have:
⟨h_k, h_l⟩_{L²([0,1])} = ±2^{n/2} ∫_0^1 h_l(t) dt = 0.
Proof (2)

Complete system: let f ∈ L²([0, 1]) be such that ⟨f, h_k⟩ = 0 for all k. We will show that f = 0 almost everywhere.

Step 1: analyzing the relations ⟨f, h_k⟩ = 0, we show that ∫_s^t f(u) du = 0 for all dyadic s < t.

Step 2: since ∫_s^t f(u) du = 0 for dyadic s, t, the function F(t) = ∫_0^t f(u) du vanishes identically. Hence f = F′ = 0 almost everywhere, according to Lebesgue's differentiation theorem.
Schauder functions

Definition 8. We define a family of functions {s_k : [0, 1] → R; k ≥ 0}:
s_k(t) = ∫_0^t h_k(u) du.

Lemma 9. The functions {s_k : [0, 1] → R; k ≥ 0} satisfy, for 2^n ≤ k < 2^{n+1}:
1. Supp(s_k) = Supp(h_k) = [(k-2^n)/2^n, (k-2^n+1)/2^n]
2. ‖s_k‖_∞ = 2^{-(n/2+1)}
Schauder functions: illustration

[Figure: the Schauder functions s_1, s_2, s_3 on [0, 1]; triangular bumps of heights 1/2 and 2^{-3/2}]
Gaussian supremum

Lemma 10. Let {X_k; k ≥ 1} be an i.i.d. sequence of N(0, 1) random variables. We set
M_n ≡ sup{|X_k|; 1 ≤ k ≤ n}.
Then M_n = O(√(ln n)) almost surely.
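A quick numerical illustration of Lemma 10 (a sketch assuming NumPy; the bound 4√(ln n) is the constant from the Borel-Cantelli argument of the proof below, and √(2 ln n) is the sharp growth rate):

```python
import numpy as np

# For i.i.d. standard Gaussians, M_n = max_{k<=n} |X_k| grows like
# sqrt(ln n); in fact M_n / sqrt(2 ln n) is close to 1 for large n.
rng = np.random.default_rng(5)

n = 200_000
X = rng.normal(size=n)
M_n = np.abs(X).max()
ratio = M_n / np.sqrt(2 * np.log(n))        # should be of order 1
```

For n = 200,000 the maximum sits between √(ln n) ≈ 3.5 and 4√(ln n) ≈ 14, consistent with the lemma.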
Proof

Gaussian tail: let x > 0. Writing e^{-z²/2} = e^{-z²/4} e^{-z²/4} ≤ e^{-x²/4} e^{-z²/4} for z ≥ x, we have:
P(|X_k| > x) = 2 (2π)^{-1/2} ∫_x^∞ e^{-z²/2} dz ≤ c_1 e^{-x²/4} ∫_0^∞ e^{-z²/4} dz ≤ c_2 e^{-x²/4}.

Application of Borel-Cantelli: let A_k = (|X_k| ≥ 4 (ln k)^{1/2}). According to the previous step we have:
P(A_k) ≤ c k^{-4}  ⇒  Σ_{k≥1} P(A_k) < ∞  ⇒  P(lim sup A_k) = 0.

Conclusion: ω-a.s. there exists k_0 = k_0(ω) such that |X_k(ω)| ≤ 4 [ln(k)]^{1/2} for k ≥ k_0.
Concrete construction on [0, 1]

Proposition 11. Let:
- {s_k; k ≥ 0} the Schauder family
- {X_k; k ≥ 0} an i.i.d. sequence of N(0, 1) random variables

We set W_t = Σ_{k≥0} X_k s_k(t). Then W is a Wiener process on [0, 1], in the sense of Definition 4.
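The series of Proposition 11 is easy to implement by truncation. The sketch below (illustrative only, assuming NumPy; truncation level K = 2^8 is an arbitrary choice) builds the Schauder functions by integrating the Haar steps exactly on a dyadic grid, samples truncated paths, and checks Var(W_t) ≈ t.

```python
import numpy as np

# Levy-Ciesielski construction, truncated: W_t = sum_{k<K} X_k s_k(t).
rng = np.random.default_rng(1)

N = 2**10                                   # dyadic time grid on [0, 1]
def haar(k, u):
    if k == 0:
        return np.ones_like(u)
    if k == 1:
        return np.where(u <= 0.5, 1.0, -1.0)
    n = int(np.floor(np.log2(k)))
    left, mid, right = (k - 2**n) / 2**n, (k - 2**n + 0.5) / 2**n, (k - 2**n + 1) / 2**n
    out = np.zeros_like(u)
    out[(u >= left) & (u < mid)] = 2 ** (n / 2)
    out[(u >= mid) & (u < right)] = -(2 ** (n / 2))
    return out

# Schauder functions s_k(t) = int_0^t h_k(u) du, via exact cumulative
# sums of the step functions over the dyadic cells.
K = 2**8
mid_u = (np.arange(N) + 0.5) / N
S = np.stack([np.concatenate([[0.0], np.cumsum(haar(k, mid_u)) / N]) for k in range(K)])

n_paths = 5_000
X = rng.normal(size=(n_paths, K))
W = X @ S                                   # each row: one truncated path
var_half, var_one = W[:, N // 2].var(), W[:, N].var()
```

At the dyadic times t = 1/2 and t = 1 the truncated series has exactly the right variance, so the sample variances should be close to 0.5 and 1.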
Proof: uniform convergence

Step 1: show that Σ_{k≥0} X_k s_k(t) converges uniformly on [0, 1], almost surely. This also implies that W is continuous a.s.

Problem reduction: see that for all ε > 0 there exists n_0 = n_0(ω) such that for all n_0 ≤ m < n we have:
‖Σ_{k=2^m}^{2^n-1} X_k s_k‖_∞ ≤ ε.
Proof: uniform convergence (2)

Useful bounds:
1. Let η > 0. We have (Lemma 10): |X_k| ≤ c k^η, with c = c(ω).
2. For 2^p ≤ k < 2^{p+1}, the functions s_k have disjoint supports. Thus
‖Σ_{k=2^p}^{2^{p+1}-1} s_k‖_∞ = 2^{-(p/2+1)}.
Proof: uniform convergence (3)

Uniform convergence: taking η < 1/2, for all t ∈ [0, 1] we have:
|Σ_{k=2^m}^{2^n} X_k s_k(t)| ≤ Σ_{p≥m} (sup_{2^p ≤ l ≤ 2^{p+1}-1} |X_l|) Σ_{k=2^p}^{2^{p+1}-1} s_k(t)
≤ c_1 Σ_{p≥m} 2^{pη} 2^{-(p/2+1)} ≤ c_2 2^{-m(1/2-η)},
which shows uniform convergence.
Proof: law of δw rt Step 2: Show that δw rt N (0, t s) for 0 r < t. Problem reduction: See that for all λ R, E [ e ıλ δwrt ] = e (t r)λ2 2. Recall: δw rt = k 0 X k (s k (t) s k (r)) Computation of a characteristic function: Invoking independence of X k s and dominated convergence, E [ e ıλ δwrt ] = E [ e ] ıλ X k(s k (t) s k (r)) k 0 = e λ 2 (sk (t) s k (r)) 2 2 = e λ2 2 k 0 (s k(t) s k (r)) 2 k 0 Samy T. Brownian motion Probability Theory 25 / 81
Proof: law of δw rt (2) Inner product computation: For 0 r < t we have hk, 1 [0,r] hk, 1 [0,t] = 1[0,r], 1 [0,t] = r. k 0 s k (r) s k (t) = k 0 Thus: [s k (t) s k (r)] 2 = t r. k 0 Computation of a characteristic function (2): We get E [ e ıλ δwrt ] = e λ2 2 k 0 (s k(t) s k (r)) 2 = e (t r)λ2 2. Samy T. Brownian motion Probability Theory 26 / 81
Proof: increment independence

Simple case: for 0 ≤ r < t, we show that W_r ⊥ δW_rt.

Computation of a characteristic function: for λ_1, λ_2 ∈ R,
E[e^{i(λ_1 W_r + λ_2 δW_rt)}] = Π_{k≥0} E[e^{i X_k [λ_1 s_k(r) + λ_2 (s_k(t) - s_k(r))]}]
= e^{-(1/2) Σ_{k≥0} [λ_1 s_k(r) + λ_2 (s_k(t) - s_k(r))]²} = e^{-(1/2)[λ_1² r + λ_2² (t-r)]}.

Conclusion: we have, for all λ_1, λ_2 ∈ R,
E[e^{i(λ_1 W_r + λ_2 δW_rt)}] = E[e^{iλ_1 W_r}] E[e^{iλ_2 δW_rt}],
and thus W_r ⊥ δW_rt.
Effective construction on [0, ∞)

Proposition 12. Let:
- for k ≥ 1, a space (Ω^k, F^k, P^k) on which a Wiener process W^k on [0, 1] is defined
- Ω̄ = Π_{k≥1} Ω^k, F̄ = ⊗_{k≥1} F^k, P̄ = ⊗_{k≥1} P^k

We set W_0 = 0 and recursively:
W_t = W_n + W^{n+1}_{t-n}, if t ∈ [n, n + 1].

Then W is a Wiener process on R_+, defined on (Ω̄, F̄, P̄).
Partial proof

Aim: see that δW_st ~ N(0, t - s) with m ≤ s < m + 1 ≤ n ≤ t < n + 1.

Decomposition of δW_st: we have
δW_st = (Σ_{k=1}^n W_1^k + W^{n+1}_{t-n}) - (Σ_{k=1}^m W_1^k + W^{m+1}_{s-m}) = Z_1 + Z_2 + Z_3,
with
Z_1 = Σ_{k=m+2}^n W_1^k,  Z_2 = W_1^{m+1} - W^{m+1}_{s-m},  Z_3 = W^{n+1}_{t-n}.

Law of δW_st: the Z_j's are independent centered Gaussian. Thus δW_st ~ N(0, σ²), with:
σ² = (n - (m + 1)) + (1 - (s - m)) + (t - n) = t - s.
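The gluing of Proposition 12 can be checked numerically across the seam. The sketch below (illustrative, assuming NumPy; the unit-interval processes are themselves simulated by summing Gaussian increments) glues two independent Wiener processes on [0, 1] and verifies Var(W_t) = t at t = 1.5.

```python
import numpy as np

# Given independent Wiener processes W^1, W^2 on [0, 1], set
# W_t = W^1_1 + W^2_{t-1} for t in [1, 2], and check Var(W_t) = t.
rng = np.random.default_rng(2)

n_paths = 20_000
dt = 1.0 / 100
inc1 = rng.normal(0, np.sqrt(dt), (n_paths, 100))   # drives W^1 on [0, 1]
inc2 = rng.normal(0, np.sqrt(dt), (n_paths, 100))   # drives W^2 on [0, 1]
W1 = np.cumsum(inc1, axis=1)
W2 = np.cumsum(inc2, axis=1)
# Glued path on (0, 2]: the W^1 part, then the shifted W^2 part.
W = np.concatenate([W1, W1[:, -1:] + W2], axis=1)

var_at_15 = W[:, 149].var()                 # time t = 1.5, target 1.5
```

The variance at t = 1.5 straddles the seam at t = 1 and still comes out as t, as the σ² computation above predicts.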
Wiener process in R^n

Definition 13. Let:
- (Ω, F, P) a probability space
- {W_t; t ≥ 0} an R^n-valued stochastic process

We say that W is a Wiener process if:
1. W_0 = 0 almost surely
2. For n ≥ 1 and 0 = t_0 < t_1 < ... < t_n, the increments δW_{t_0 t_1}, ..., δW_{t_{n-1} t_n} are independent
3. For 0 ≤ s < t we have δW_st ~ N(0, (t - s) Id_{R^n})
4. W has continuous paths, almost surely

Remark: one can construct W from n independent real-valued Brownian motions.
Illustration: 2-d Brownian motion

[Figure: a simulated path of a 2-dimensional Brownian motion]
Wiener process in a filtration

Definition 14. Let:
- (Ω, F, (F_t)_{t≥0}, P) a stochastic basis
- {W_t; t ≥ 0} a stochastic process with values in R^n

We say that W is a Wiener process with respect to (Ω, F, (F_t)_{t≥0}, P) if:
1. W_0 = 0 almost surely
2. For 0 ≤ s < t, the increment δW_st is independent of F_s
3. For 0 ≤ s < t we have δW_st ~ N(0, (t - s) Id_{R^n})
4. W has continuous paths almost surely

Remark: a Wiener process according to Definition 13 is a Wiener process in its natural filtration.
Section 3: First properties
Gaussian property

Definition 15. Let:
- (Ω, F, P) a probability space
- {X_t; t ≥ 0} a stochastic process with values in R

We say that X is a Gaussian process if for all 0 ≤ t_1 < ... < t_n, (X_{t_1}, ..., X_{t_n}) is a Gaussian vector.

Proposition 16. Let W be a real Brownian motion. Then W is a Gaussian process.
Proof

Notation: for 0 = t_0 ≤ t_1 < ... < t_n we set
X_n = (W_{t_1}, ..., W_{t_n}),  Y_n = (δW_{t_0 t_1}, ..., δW_{t_{n-1} t_n}).

Vector Y_n: thanks to the independence of the increments of W, Y_n is a Gaussian vector.

Vector X_n: there exists M ∈ R^{n,n} such that X_n = M Y_n ⇒ X_n is a Gaussian vector.

Covariance matrix: we have E[W_s W_t] = s ∧ t. Thus (W_{t_1}, ..., W_{t_n}) ~ N(0, Γ_n), with Γ_n^{ij} = t_i ∧ t_j.
Consequence of Gaussian property

Characterization of a Gaussian process: let X be a Gaussian process. The law of X is characterized by:
μ_t = E[X_t], and ρ_{s,t} = Cov(X_s, X_t).

Another characterization of Brownian motion: let W be a real-valued Gaussian process with continuous paths and
μ_t = 0, and ρ_{s,t} = s ∧ t.
Then W is a Brownian motion.
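The covariance ρ_{s,t} = s ∧ t is easy to check by simulation. A minimal sketch (assuming NumPy; the times s = 0.3, t = 0.8 are arbitrary choices):

```python
import numpy as np

# Empirical check of Cov(W_s, W_t) = s ∧ t for a simulated Wiener process.
rng = np.random.default_rng(3)

n_paths, n_steps = 50_000, 100
dt = 1.0 / n_steps
W = np.cumsum(rng.normal(0, np.sqrt(dt), (n_paths, n_steps)), axis=1)

s_idx, t_idx = 29, 79                       # times s = 0.3, t = 0.8
cov_st = np.mean(W[:, s_idx] * W[:, t_idx])  # target: s ∧ t = 0.3
```

The empirical covariance lands on min(s, t) = 0.3, not on s·t or t, which distinguishes Brownian motion from other centered Gaussian processes.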
Brownian scaling

Proposition 17. Let:
- W a real-valued Brownian motion
- a constant a > 0

We define a process W^a by W^a_t = a^{-1/2} W_{at}, for t ≥ 0. Then W^a is a Brownian motion.

Proof: Gaussian characterization of Brownian motion.
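Proposition 17 can also be illustrated numerically. The sketch below (assuming NumPy; a = 4 is an arbitrary choice) rescales a simulated path and checks that the variance of W^a_t is again t.

```python
import numpy as np

# Brownian scaling: with a = 4, W^a_t = a^{-1/2} W_{at} has Var = t.
rng = np.random.default_rng(4)

a = 4.0
n_paths, n_steps, T = 30_000, 400, 4.0      # simulate W on [0, 4]
dt = T / n_steps
W = np.cumsum(rng.normal(0, np.sqrt(dt), (n_paths, n_steps)), axis=1)

# W^a at t = 0.5 uses W at a*t = 2.0, i.e. grid index 199.
Wa_half = W[:, 199] / np.sqrt(a)
var_scaled = Wa_half.var()                  # target: 0.5
```

The same check at any other time t would give Var(W^a_t) ≈ t, consistent with the Gaussian characterization used in the proof.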
Canonical space

Proposition 18. Let E = C([0, ∞); R^n). We set:
d(f, g) = Σ_{k≥1} 2^{-k} ‖f - g‖_{∞,k} / (1 + ‖f - g‖_{∞,k}),
where ‖f - g‖_{∞,k} = sup{|f_t - g_t|; t ∈ [0, k]}.
Then (E, d) is a separable complete metric space.
Borel σ-algebra on E

Proposition 19. Let E = C([0, ∞); R^n). For m ≥ 1 we consider:
- 0 ≤ t_1 < ... < t_m
- A_1, ..., A_m ∈ B(R^n)

Let A be the σ-algebra generated by the rectangles:
R_{t_1,...,t_m}(A_1, ..., A_m) = {x ∈ E; x_{t_1} ∈ A_1, ..., x_{t_m} ∈ A_m}.
Then A = B(E), the Borel σ-algebra on E.
Wiener measure

Proposition 20. Let:
- W an R^n-valued Wiener process, defined on (Ω, F, P)
- T : (Ω, F) → (E, A), such that T(ω) = {W_t(ω); t ≥ 0}

Then:
1. The map T is measurable.
2. Let P_0 = P ∘ T^{-1}, a measure on (E, A). P_0 is called the Wiener measure.
3. Under P_0, the canonical process ω can be written as ω_t = W_t, where W is a Brownian motion.
Proof

Inverse image of rectangles: we have
T^{-1}(R_{t_1,...,t_m}(A_1, ..., A_m)) = (W_{t_1} ∈ A_1, ..., W_{t_m} ∈ A_m) ∈ F.

Conclusion: T is measurable, since A is generated by the rectangles.
Section 4: Martingale property
Martingale property

Definition 21. Let:
- (Ω, F, (F_t)_{t≥0}, P) a stochastic basis
- {X_t; t ≥ 0} a stochastic process with values in R^n

We say that X is an F_t-martingale if:
1. X_t ∈ L¹(Ω) for all t ≥ 0
2. X is F_t-adapted
3. E[δX_st | F_s] = 0 for all 0 ≤ s < t

Proposition 22. Let W be an F_t-Brownian motion. Then W is an F_t-martingale.
Stopping time

Definition 23. Let:
- (Ω, F, (F_t)_{t≥0}, P) a stochastic basis
- S a random variable with values in [0, ∞]

We say that S is a stopping time if for all t ≥ 0 we have (S ≤ t) ∈ F_t.

Interpretation 1: if we know X on [0, t], we also know whether S ≤ t or S > t.

Interpretation 2: S is an instant at which one stops playing; it only depends on the information up to the current time.
Typical examples of stopping times

Proposition 24. Let:
- X a process with values in R^d, F_t-adapted and continuous
- G an open set in R^d
- F a closed set in R^d

We set:
T_G = inf{t ≥ 0; X_t ∈ G},  T_F = inf{t ≥ 0; X_t ∈ F}.

Then:
- T_F is a stopping time.
- T_G is a stopping time when X is a Brownian motion.
Proof for T_G

First aim: prove that for t > 0 we have
(T_G < t) ∈ F_t.   (1)

Problem reduction for (1): we show that
(T_G < t) = ∪_{s ∈ Q ∩ [0,t)} (X_s ∈ G).   (2)
Since ∪_{s ∈ Q ∩ [0,t)} (X_s ∈ G) ∈ F_t, this proves our claim.

First inclusion for (2): ∪_{s ∈ Q ∩ [0,t)} (X_s ∈ G) ⊂ (T_G < t): trivial.
Proof for T_G (2)

Second inclusion for (2): if T_G < t, then there exists s < t such that X_s ∈ G. We set X_s ≡ x. Then:
- Let ε > 0 such that B(x, ε) ⊂ G.
- There exists δ > 0 such that X_r ∈ B(x, ε) for all r ∈ (s - δ, s + δ).
- In particular, there exists q ∈ Q ∩ (s - δ, s] such that X_q ∈ G.

Since (Q ∩ (s - δ, s]) ⊂ (Q ∩ [0, t)), we have the second inclusion.
Proof for T_G (3)

Optional times: we say that T : Ω → [0, ∞] is an optional time if (T < t) ∈ F_t.

Remark: relation (1) proves that T_G is optional.

Optional times and stopping times:
- A stopping time is an optional time.
- An optional time satisfies (T ≤ t) ∈ F_{t+} ≡ ∩_{ε>0} F_{t+ε}.
- When X is a Brownian motion, F_{t+} = F_t (by the Markov property).
- When X is a Brownian motion, optional time = stopping time.

Conclusion: when X is a Brownian motion, T_G is a stopping time.
Simple properties of stopping times

Proposition 25. Let S and T be two stopping times. Then:
1. S ∧ T
2. S ∨ T
are stopping times.

Proposition 26. If T is a deterministic time (T = n almost surely), then T is a stopping time.
Information at time S

Definition 27. Let:
- (Ω, F, (F_t)_{t≥0}, P) a stochastic basis
- S a stopping time

The σ-algebra F_S is defined by:
F_S = {A ∈ F; [A ∩ (S ≤ t)] ∈ F_t for all t ≥ 0}.

Interpretation: F_S ≡ information up to time S.
Optional sampling theorem

Theorem 28. Let:
- (Ω, F, (F_t)_{t≥0}, P) a stochastic basis
- S, T two stopping times, with S ≤ T
- X a continuous martingale

Hypothesis: {X_{t∧T}; t ≥ 0} is a uniformly integrable martingale.

Then: E[X_T | F_S] = X_S. In particular: E[X_T] = E[X_S] = E[X_0].
Remarks

Strategy of proof: one starts from the known discrete-time result. X is approximated by a discrete-time martingale Y_m ≡ X_{t_m}, with t_m = m 2^{-n}.

Checking the assumption: set Y_t = X_{t∧T}. Then {Y_t; t ≥ 0} is a uniformly integrable martingale in the following cases:
- |Y_t| ≤ M with M a deterministic constant independent of t.
- sup_{t≥0} E[|Y_t|²] ≤ M.
- sup_{t≥0} E[|Y_t|^p] ≤ M with p > 1.
Example of stopping time computation

Proposition 29. Let:
- B a standard Brownian motion, with B_0 = 0
- a < 0 < b
- T_a = inf{t ≥ 0 : B_t = a} and T_b = inf{t ≥ 0 : B_t = b}
- T = T_a ∧ T_b

Then:
P(T_a < T_b) = b / (b + |a|), and E[T] = |a| b.
Proof

Optional sampling for M_t = B_t: yields
P(T_a < T_b) = b / (b + |a|).

Optional sampling for M_t = B_t² - t: yields, for a constant τ > 0,
E[B²_{T∧τ}] = E[T ∧ τ].

Limiting procedure: by dominated and monotone convergence,
E[B_T²] = E[T].

Conclusion: we get
E[T] = E[B_T²] = a² P(T_a < T_b) + b² P(T_b < T_a) = (a² b + b² |a|) / (b + |a|) = |a| b.
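Both formulas of Proposition 29 can be checked by Monte Carlo. The sketch below (illustrative, assuming NumPy; Euler discretization of the paths introduces a small bias, hence the loose tolerances) takes a = -1, b = 2, so the targets are P(T_a < T_b) = 2/3 and E[T] = 2.

```python
import numpy as np

# Monte Carlo check of the exit probability and mean exit time of
# Brownian motion from the interval (a, b) with a = -1, b = 2.
rng = np.random.default_rng(6)

a, b = -1.0, 2.0
n_paths, dt, max_steps = 4_000, 1e-3, 40_000
pos = np.zeros(n_paths)
time_exit = np.full(n_paths, np.nan)
hit_a = np.zeros(n_paths, dtype=bool)
active = np.ones(n_paths, dtype=bool)
sqdt = np.sqrt(dt)

for step in range(1, max_steps + 1):
    idx = np.flatnonzero(active)
    if idx.size == 0:
        break
    pos[idx] += sqdt * rng.normal(size=idx.size)
    below, above = pos[idx] <= a, pos[idx] >= b
    done = below | above
    exited = idx[done]
    hit_a[idx[below]] = True
    time_exit[exited] = step * dt
    active[exited] = False

p_hit_a = hit_a.mean()                      # estimate of P(T_a < T_b), target 2/3
mean_T = np.nanmean(time_exit)              # estimate of E[T], target |a| b = 2
```

The time horizon max_steps * dt = 40 is far beyond E[T] = 2, so essentially every path exits before truncation.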
Section 5: Markov property
Wiener measure indexed by R^d

Proposition 30. Let x ∈ R^d. There exists a probability measure P_x on (E, A) such that, under P_x, the canonical process ω can be written as ω_t = x + W_t, where W is a Brownian motion.

Notation: we consider the family {P_x; x ∈ R^d}. The expected value under P_x is denoted E_x.
Shift on paths

Definition 31. Let:
- E = C([0, ∞); R^d), equipped with the Borel σ-algebra A
- t ≥ 0

We set:
θ_t : E → E, {ω_s; s ≥ 0} ↦ {ω_{t+s}; s ≥ 0}.

Shift and future: let Y : E → R be measurable. Then Y ∘ θ_s depends on the future after s.

Example: for n ≥ 1, f measurable and 0 ≤ t_1 < ... < t_n,
Y(ω) = f(ω_{t_1}, ..., ω_{t_n})  ⇒  Y ∘ θ_s = f(W_{s+t_1}, ..., W_{s+t_n}).
Markov property

Theorem 32. Let:
- W a Wiener process with values in R^d
- Y : E → R bounded measurable
- s ≥ 0

Then:
E_x[Y ∘ θ_s | F_s] = E_{W_s}[Y].

Interpretation: the future after s can be predicted from the value of W_s only.
Pseudo-proof

Very simple function: consider Y ≡ f(W_t), so that Y ∘ θ_s = f(W_{s+t}). For 0 ≤ s < t, the independence of the increments of W gives
E_x[Y ∘ θ_s | F_s] = E_x[f(W_{s+t}) | F_s] = p_t f(W_s) = E_{W_s}[f(W_t)] = E_{W_s}[Y],
with p_h : C_b(R^d) → C_b(R^d),
p_h f(x) ≡ ∫_{R^d} f(y) exp(-|y - x|²/(2h)) / (2πh)^{d/2} dy.

Extension:
1. Random variables Y = f(W_{t_1}, ..., W_{t_n}).
2. General random variable Y: by π-λ systems.
Links with analysis

Heat semi-group: we have set p_t f(x) = E_x[f(W_t)]. Then the family {p_t; t ≥ 0} is a semi-group of operators.

Generator of the semi-group: (1/2)Δ, with Δ the Laplace operator.

Feynman-Kac formula: let f ∈ C_b(R^d) and consider the PDE on R^d:
∂_t u(t, x) = (1/2) Δu(t, x),  u(0, x) = f(x).
Then u(t, x) = E_x[f(W_t)] = p_t f(x).
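The semi-group formula p_t f(x) = E_x[f(W_t)] can be tested directly. In the sketch below (illustrative, assuming NumPy) we take f(y) = y², for which p_t f(x) = x² + t in closed form (the second moment of N(x, t)), and compare with a Monte Carlo average.

```python
import numpy as np

# Monte Carlo evaluation of p_t f(x) = E_x[f(W_t)] for f(y) = y^2,
# against the exact value x^2 + t.
rng = np.random.default_rng(7)

x, t = 0.7, 0.5
samples = x + np.sqrt(t) * rng.normal(size=200_000)   # W_t under P_x ~ N(x, t)
mc = np.mean(samples**2)                              # Monte Carlo p_t f(x)
exact = x**2 + t                                      # = 0.99
```

In PDE terms, mc approximates the solution u(t, x) of the heat equation above at (t, x) = (0.5, 0.7) with initial datum f(y) = y².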
Strong Markov property

Theorem 33. Let:
- W a Wiener process in R^d
- Y : R_+ × E → R bounded measurable
- S a stopping time

Then:
E_x[Y_S ∘ θ_S | F_S] 1_{(S<∞)} = E_{W_S}[Y_S] 1_{(S<∞)}.

Particular case: if S is an a.s. finite stopping time, we have
E_x[Y_S ∘ θ_S | F_S] = E_{W_S}[Y_S].
Reflection principle

Theorem 34. Let:
- W a real-valued Brownian motion
- a > 0
- T_a = inf{t ≥ 0; W_t = a}

Then:
P_0(T_a < t) = 2 P_0(W_t > a).
Intuitive proof

Independence: if W reaches a at some time s < t, then W_t - W_{T_a} is independent of F_{T_a}.

Consequence: by symmetry of this increment,
P_0(T_a < t, W_t > a) = (1/2) P_0(T_a < t).

Furthermore, (W_t > a) ⊂ (T_a < t), so
P_0(T_a < t, W_t > a) = P_0(W_t > a).

Thus: P_0(T_a < t) = 2 P_0(W_t > a).
Rigorous proof

Reduction: we have to show
P_0(T_a < t, W_t > a) = (1/2) P_0(T_a < t).

Functional: we set (with inf ∅ = ∞)
S = inf{s < t; W_s = a},  Y_s(ω) = 1_{(s<t, ω(t-s)>a)}.

Then:
1. (S < ∞) = (T_a < t).
2. Y_S ∘ θ_S = 1_{(S<t)} 1_{(W_t>a)} = 1_{(T_a<t)} 1_{(W_t>a)}.
Rigorous proof (2)

Application of strong Markov:
E_0[Y_S ∘ θ_S | F_S] 1_{(S<∞)} = E_{W_S}[Y_S] 1_{(S<∞)} = ϕ(W_S, S),   (3)
with ϕ(x, s) = E_x[1_{(W_{t-s} > a)}] 1_{(s<t)}.

Conclusion: since W_S = a if S < ∞ and E_a[1_{(W_{t-s} > a)}] = 1/2,
ϕ(W_S, S) = (1/2) 1_{(S<t)}.
Taking expectations in (3) we end up with:
P_0(T_a < t, W_t > a) = (1/2) P_0(T_a < t).
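The reflection principle can also be tested by Monte Carlo. The sketch below (illustrative, assuming NumPy; the discrete-time maximum slightly underestimates the true running maximum, hence the loose tolerance) compares the two sides of Theorem 34 for a = 1 and t = 1.

```python
import numpy as np

# P_0(T_a < t) = P_0(max_{s<=t} W_s >= a) should be close to 2 P_0(W_t > a).
rng = np.random.default_rng(8)

a, t = 1.0, 1.0
n_paths, n_steps = 20_000, 500
dt = t / n_steps
W = np.cumsum(rng.normal(0, np.sqrt(dt), (n_paths, n_steps)), axis=1)

p_hit = np.mean(W.max(axis=1) >= a)         # estimates P_0(T_a < t)
p_end = 2 * np.mean(W[:, -1] > a)           # 2 P_0(W_t > a), exact value ~0.317
```

Both estimates cluster around 2(1 - Φ(1)) ≈ 0.317, with p_hit biased slightly low by the time discretization.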
Section 6: Pathwise properties
Hölder continuity

Definition 35. Let f : [a, b] → R^n. We say that f is γ-Hölder if ‖f‖_γ < ∞, with:
‖f‖_γ = sup_{s,t ∈ [a,b], s ≠ t} |δf_st| / |t - s|^γ.

Remarks:
1. ‖·‖_γ is a semi-norm.
2. Notation: C^γ.
Regularity of Brownian motion

Theorem 36. Let:
- τ > 0
- W a Wiener process on [0, τ]
- γ ∈ (0, 1/2)

Then there exists a version Ŵ of W such that, almost surely, the paths of Ŵ are γ-Hölder.

Remark: Ŵ and W are usually denoted in the same way.
Kolmogorov's criterion

Theorem 37. Let X = {X_t; t ∈ [0, τ]} be a process defined on (Ω, F, P), such that:
E[|δX_st|^α] ≤ c |t - s|^{1+β}, for s, t ∈ [0, τ], with c, α, β > 0.

Then there exists a modification X̂ of X such that, almost surely, X̂ ∈ C^γ for all γ < β/α, i.e.:
P(ω; ‖X̂(ω)‖_γ < ∞) = 1.
Proof of Theorem 36 Law of δb st : We have δb st N (0, t s). Moments of δb st : Using expression for the moments of N (0, 1) for m 1, we have E [ δb st 2m] = c m t s m i.e E [ δb st 2m] = c m t s 1+(m 1) Application of Kolmogorov s criterion: B is γ-hölder for γ < m 1 = 1 1 2m 2 2m Taking limits m, the proof is finished. Samy T. Brownian motion Probability Theory 70 / 81
Lévy's modulus of continuity

Theorem 38. Let τ > 0 and W a Wiener process on [0, τ]. Then almost surely W satisfies:
lim sup_{δ→0+} sup_{0 ≤ s < t ≤ τ, t-s ≤ δ} |δW_st| / (2δ ln(1/δ))^{1/2} = 1.

Interpretation: W has Hölder regularity exactly 1/2, up to a logarithmic factor.
Variations of a function

Interval partitions: let a < b be two real numbers. We denote by π a set {t_0, ..., t_m} with a = t_0 < ... < t_m = b, and say that π is a partition of [a, b]. Write Π_{a,b} for the set of partitions of [a, b].

Definition 39. Let a < b and f : [a, b] → R. The variation of f on [a, b] is:
V_{a,b}(f) = lim_{π ∈ Π_{a,b}, |π| → 0} Σ_{(t_i, t_{i+1}) ∈ π} |δf_{t_i t_{i+1}}|.
If V_{a,b}(f) < ∞, we say that f has finite variation on [a, b].
Quadratic variation

Definition 40. Let a < b and f : [a, b] → R. The quadratic variation of f on [a, b] is:
V²_{a,b}(f) = lim_{π ∈ Π_{a,b}, |π| → 0} Σ_{(t_i, t_{i+1}) ∈ π} |δf_{t_i t_{i+1}}|².
If V²_{a,b}(f) < ∞, we say that f has finite quadratic variation on [a, b].

Remark: the limits defining V_{a,b}(f) and V²_{a,b}(f) do not depend on the sequence of partitions.
Variations of Brownian motion

Theorem 41. Let W be a Wiener process. Then almost surely W satisfies:
1. For 0 ≤ a < b < ∞ we have V²_{a,b}(W) = b - a.
2. For 0 ≤ a < b < ∞ we have V_{a,b}(W) = ∞.

Interpretation: the paths of W have:
- infinite variation
- finite quadratic variation
on any interval of R_+.
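Both statements of Theorem 41 show up clearly in simulation. The sketch below (illustrative, assuming NumPy) discretizes one path on [0, 1] with a fine dyadic partition: the quadratic variation sum concentrates near b - a = 1, while the variation sum is of order |π|^{-1/2} and blows up as the mesh shrinks.

```python
import numpy as np

# One fine discretized Brownian path on [0, 1]: sum of squared increments
# approximates b - a = 1; sum of absolute increments is large (~ sqrt(n)).
rng = np.random.default_rng(9)

n_steps = 2**16
dt = 1.0 / n_steps
dW = rng.normal(0, np.sqrt(dt), n_steps)    # increments over the partition

quad_var = np.sum(dW**2)                    # target: b - a = 1
total_var = np.sum(np.abs(dW))              # ~ sqrt(2/pi) * sqrt(n_steps), large
```

Halving the mesh roughly multiplies total_var by √2 while leaving quad_var near 1, mirroring the dichotomy in the theorem.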
Proof

Notation: let π = {t_0, ..., t_m} ∈ Π_{a,b}. We set:
S_π = Σ_{k=0}^{m-1} |δW_{t_k t_{k+1}}|²,
X_k = |δW_{t_k t_{k+1}}|² - (t_{k+1} - t_k),
Y_k = X_k / (t_{k+1} - t_k).

Step 1: show that
L²(Ω)-lim_{|π|→0} S_π = b - a.

Decomposition: we have
S_π - (b - a) = Σ_{k=0}^{m-1} X_k.
Proof (2)

Variance computation: the r.v. X_k are centered and independent. Thus
E[(S_π - (b - a))²] = Var(Σ_{k=0}^{m-1} X_k) = Σ_{k=0}^{m-1} Var(X_k).

Since δW_{t_k t_{k+1}} / (t_{k+1} - t_k)^{1/2} ~ N(0, 1), we get Var(X_k) = (t_{k+1} - t_k)² Var(Y_k) with Var(Y_k) = 2, and hence
E[(S_π - (b - a))²] = 2 Σ_{k=0}^{m-1} (t_{k+1} - t_k)² ≤ 2 |π| (b - a).

Conclusion: we have
L²(Ω)-lim_{|π|→0} S_π = b - a  ⇒  a.s.-lim_{n→∞} S_{π_n} = b - a for a subsequence (π_n).
Proof (3)

Step 2: verify V_{a,b}(W) = ∞ for fixed a < b. We proceed by contradiction. Let:
- ω ∈ Ω_0 such that P(Ω_0) = 1 and lim_n S_{π_n}(ω) = b - a > 0.
- Assume V_{a,b}(W(ω)) < ∞.

Bound on increments: thanks to the continuity of W:
1. sup_{n≥1} Σ_{k=0}^{m_n-1} |δW_{t_k t_{k+1}}(ω)| ≤ c(ω)
2. lim_n max_{0 ≤ k ≤ m_n-1} |δW_{t_k t_{k+1}}(ω)| = 0

Thus
S_{π_n}(ω) ≤ (max_{0 ≤ k ≤ m_n-1} |δW_{t_k t_{k+1}}(ω)|) Σ_{k=0}^{m_n-1} |δW_{t_k t_{k+1}}(ω)| → 0.

Contradiction: with lim_n S_{π_n}(ω) = b - a > 0.
Proof (4)

Step 3: verify V_{a,b}(W) = ∞ for every couple (a, b) ∈ R²_+. It is enough to check V_{a,b}(W) = ∞ for every couple (a, b) ∈ Q²_+.

Recall: for each couple (a, b) ∈ Q²_+, we have found Ω_{a,b} such that P(Ω_{a,b}) = 1 and V_{a,b}(W(ω)) = ∞ for all ω ∈ Ω_{a,b}.

Full probability set: let
Ω_0 = ∩_{(a,b) ∈ Q²_+} Ω_{a,b}.
Then P(Ω_0) = 1, and if ω ∈ Ω_0, for every couple (a, b) ∈ Q²_+ we have V_{a,b}(W(ω)) = ∞.
Irregularity of W

Proposition 42. Let:
- W a Wiener process
- γ > 1/2 and 0 ≤ a < b

Then almost surely W does not belong to C^γ([a, b]).
Proof

Strategy: proceed by contradiction. Let:
- ω ∈ Ω_0 such that P(Ω_0) = 1 and lim_n S_{π_n}(ω) = b - a > 0.
- Suppose W ∈ C^γ with γ > 1/2, i.e. |δW_st| ≤ L |t - s|^γ, with L a random variable.

Bound on the quadratic variation: we have:
S_{π_n}(ω) ≤ L² Σ_{k=0}^{m_n-1} |t_{k+1} - t_k|^{2γ} ≤ L² |π_n|^{2γ-1} (b - a) → 0.

Contradiction: with lim_n S_{π_n}(ω) = b - a > 0.
Irregularity of W at each point

Theorem 43. Let:
- W a Wiener process
- γ > 1/2 and τ > 0

Then:
1. Almost surely, the paths of W are not γ-Hölder continuous at any point s ∈ [0, τ].
2. In particular, W is nowhere differentiable.