
Brownian Motion

Contents

1 Definition
  1.1 Brownian Motion
  1.2 Wiener measure
2 Construction
  2.1 Gaussian process
    2.1.1 Abstract argument
    2.1.2 Isonormal Gaussian process
  2.2 The continuity property
    2.2.1 Abstract argument
    2.2.2 Constructive argument
3 Donsker's invariance principle
4 Some selected properties of Brownian Motion
  4.1 Sample path properties
    4.1.1 Law of Large Numbers
    4.1.2 Hitting time
    4.1.3 Supremum and infimum
    4.1.4 Zeros
  4.2 Quadratic variation
  4.3 Some applications of the Markov property
    4.3.1 Blumenthal 0-1 law
    4.3.2 Reflection principle
    4.3.3 Zeros
5 PDEs and Brownian Motion
  5.1 The heat equation on $\mathbb{R}^d$
  5.2 A martingale
  5.3 Some consequences
    5.3.1 The heat equation on $\mathbb{R}^d$ returns
    5.3.2 Harmonic functions

    5.3.3 The Dirichlet problem
    5.3.4 Exit problem

1 Definition

1.1 Brownian Motion

Definition 1. A (real-valued) Brownian Motion is a stochastic process $(t, \omega) \in \mathbb{R}_+ \times \Omega \mapsto B_t(\omega)$ such that:

1. $B_0 = 0$ a.s.;
2. the increments are stationary: for any $0 \le s \le t$, $B_t - B_s \overset{d}{=} B_{t-s}$;
3. the increments are independent: for any $n \in \mathbb{N}$ and any $0 = t_0 \le t_1 \le \dots \le t_n$, the random variables $(B_{t_{i+1}} - B_{t_i})_{0 \le i \le n-1}$ are independent;
4. for any $t \ge 0$, $B_t \sim \mathcal{N}(0, t)$;
5. the trajectories are almost surely continuous: for $\mathbb{P}$-almost every $\omega \in \Omega$, the mapping $t \in \mathbb{R}_+ \mapsto B_t(\omega)$ is continuous.

One may define Brownian Motion with respect to a given filtration, which is useful in some situations. To simplify, in these notes we only consider natural filtrations.

Proposition 1. A Brownian Motion is a Gaussian process: for any $n \in \mathbb{N}$, any $0 \le t_1 \le \dots \le t_n$ and any $\lambda_1, \dots, \lambda_n \in \mathbb{R}$, the sum $\lambda_1 B_{t_1} + \dots + \lambda_n B_{t_n}$ is a real Gaussian random variable. As a Gaussian process, it is characterized by its mean and covariance: for all $s, t \in \mathbb{R}_+$,
$$\mathbb{E}[B_t] = 0, \qquad \mathrm{Cov}(B_t, B_s) = \min(s, t).$$

Theorem 1. Let $(B_t)_{t \in \mathbb{R}_+}$ be a Brownian Motion. It is both a martingale and a Markov process: for all $0 \le s \le t$ and any bounded measurable function $f : \mathbb{R} \to \mathbb{R}$,
$$\mathbb{E}[B_t \mid \mathcal{F}_s] = B_s, \qquad \mathbb{E}[f(B_t) \mid \mathcal{F}_s] = \mathbb{E}[f(B_t) \mid B_s],$$
with $\mathcal{F}_t = \sigma(B_r,\ 0 \le r \le t)$. In addition, $(B_{s+t} - B_s)_{t \in \mathbb{R}_+}$ is a Brownian Motion, independent of $\mathcal{F}_s$. Moreover, $(B_t^2 - t)_{t \in \mathbb{R}_+}$ is a martingale.

Definition 2. Let $(B_t)_{t \in \mathbb{R}_+}$ be a Brownian Motion, and let $(\mathcal{F}_t)_{t \in \mathbb{R}_+}$ denote the natural filtration: $\mathcal{F}_t = \sigma(B_r,\ 0 \le r \le t)$ for all $t \ge 0$.
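The defining properties lend themselves to a quick numerical sanity check. The following sketch is not part of the original notes (names such as `brownian_path` are ours): it simulates a discretized path from independent $\mathcal{N}(0, T/n)$ increments, as in items 1-4 of Definition 1, and checks empirically that $B_1 \sim \mathcal{N}(0, 1)$.

```python
import random

def brownian_path(T=1.0, n=200, rng=random):
    """Discretized Brownian path: B_0 = 0, then independent N(0, T/n) increments."""
    dt = T / n
    path = [0.0]
    for _ in range(n):
        path.append(path[-1] + rng.gauss(0.0, dt ** 0.5))
    return path

random.seed(0)
endpoints = [brownian_path()[-1] for _ in range(2000)]
# B_1 should be N(0, 1): empirical mean close to 0, empirical variance close to 1.
mean = sum(endpoints) / len(endpoints)
var = sum((x - mean) ** 2 for x in endpoints) / len(endpoints)
```

Only the endpoint distribution is tested here; independence and stationarity of increments could be checked the same way on sub-intervals.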

Let $T$ be a $[0, \infty]$-valued random variable. It is called a stopping time if for every $t \in \mathbb{R}_+$,
$$\{T \le t\} \in \mathcal{F}_t.$$
Let
$$\mathcal{F}_T = \{ A \in \mathcal{F}\,;\ A \cap \{T \le t\} \in \mathcal{F}_t \ \forall t \in \mathbb{R}_+ \}$$
denote the associated $\sigma$-field.

Theorem 2 (Strong Markov property). Let $(B_t)_{t \in \mathbb{R}_+}$ be a Brownian Motion, and let $T$ be a stopping time. Assume that $T$ is almost surely finite: $\mathbb{P}(T = \infty) = 0$. Then $(B_{t+T} - B_T)_{t \in \mathbb{R}_+}$ is a Brownian Motion, independent of $\mathcal{F}_T$.

Proposition 2 (Scaling). Let $(B_t)_{t \in \mathbb{R}_+}$ be a Brownian Motion, and let $a \in (0, \infty)$. Define $B_t^{(a)} = a^{-1} B_{a^2 t}$ for all $t \in \mathbb{R}_+$. Then $(B_t^{(a)})_{t \in \mathbb{R}_+}$ is a Brownian Motion.

Definition 3. Let $d \in \mathbb{N}$. A $d$-dimensional Brownian Motion is an $\mathbb{R}^d$-valued process, with the notation $B_t = (B_t^1, \dots, B_t^d)$, such that the components $(B_t^l)_{t \in \mathbb{R}_+}$, $l \in \{1, \dots, d\}$, are independent Brownian Motions.

Proposition 3 (Isotropy). Let $d \in \mathbb{N}$, and let $R \in O(d)$ be an orthogonal matrix. If $(B_t)_{t \in \mathbb{R}_+}$ is a $d$-dimensional Brownian Motion, then $(R B_t)_{t \in \mathbb{R}_+}$ is also a $d$-dimensional Brownian Motion.

Theorem 3 (Lévy's characterization). Let $M = (M(t))_{t \in \mathbb{R}_+}$ be a continuous stochastic process. The following statements are equivalent:

(i) $M$ is a Brownian Motion;
(ii) $(M(t))_{t \in \mathbb{R}_+}$ and $(M(t)^2 - t)_{t \in \mathbb{R}_+}$ are martingales.

1.2 Wiener measure

Definition 4. Let $C = C(\mathbb{R}_+, \mathbb{R})$ denote the space of continuous functions from $\mathbb{R}_+$ to $\mathbb{R}$. Define the distance
$$d(f, g) = \sum_{n=1}^{\infty} \frac{1}{2^n} \min\Big(1, \sup_{0 \le t \le n} |f(t) - g(t)|\Big).$$

Subtlety. On the one hand, a stochastic process $(X_t)_{t \in \mathbb{R}_+}$ is a random variable with values in the space $\mathbb{R}^{\mathbb{R}_+}$ of all functions from $\mathbb{R}_+$ to $\mathbb{R}$, equipped with the product $\sigma$-field.

On the other hand, $C = C(\mathbb{R}_+, \mathbb{R})$ is not a measurable subset of $\mathbb{R}^{\mathbb{R}_+}$ for this product $\sigma$-field. Care is therefore needed to define the law of Brownian Motion on the space of continuous functions.

Proposition 4. The mapping $d$ is a distance on the set $C$; it corresponds to uniform convergence on compact sets. Moreover, $(C, d)$ is a Polish space: it is complete and separable. The associated Borel $\sigma$-field $\mathcal{B}(C)$ coincides with the $\sigma$-field $\sigma(w \in C \mapsto w(t),\ t \in \mathbb{R}_+)$. As a consequence, the mapping
$$\Phi : \omega \in (\Omega, \mathcal{F}) \mapsto (B_t(\omega))_{t \in \mathbb{R}_+} \in (C, \mathcal{B}(C))$$
is measurable.

Definition 5. The Wiener measure $W_0$ is the image of the probability distribution $\mathbb{P}$ by the mapping $\Phi$. In this context, $C$ is called the Wiener space.

The Wiener measure only depends on the finite-dimensional marginals of the process.

Alternative construction: define $\widetilde{W}_0$ as the law of the process $(B_t)_{t \in \mathbb{R}_+}$, considered as an $\mathbb{R}^{\mathbb{R}_+}$-valued random variable. Observe that $\widetilde{W}_0(\mathbb{R}^{\mathbb{R}_+} \setminus C) = 0$. For any Borel set $\Gamma \in \mathcal{B}(C)$, define $W_0(\Gamma) = \widetilde{W}_0(\widetilde{\Gamma})$, where $\widetilde{\Gamma} \subset \mathbb{R}^{\mathbb{R}_+}$ is any measurable set such that $\Gamma = \widetilde{\Gamma} \cap C$. One then checks that the probability measure $W_0$ is well-defined on $C$.

Definition 6 (Canonical process). Let $(\Omega, \mathcal{F}, \mathbb{P}) = (C, \mathcal{B}(C), W_0)$. Set $B_t(\omega) = \omega(t)$ for all $t \in \mathbb{R}_+$ and $\omega \in C$. Then $(B_t)_{t \in \mathbb{R}_+}$ defines a Brownian Motion, referred to as the canonical version of Brownian Motion.

Brownian Motion is often called the Wiener process. For any $x \in \mathbb{R}$, Brownian Motion starting at $x$ is the process $(x + B_t)_{t \in \mathbb{R}_+}$. The associated Wiener measure, starting at $x$, is denoted by $W_x$.

2 Construction

Let $I = [0, 1]$ or $I = \mathbb{R}_+$.

2.1 Gaussian process

2.1.1 Abstract argument

Use the Kolmogorov extension theorem. Indeed, the family of marginals
$$\mu_{t_1, \dots, t_n} = \mathbb{P}(B_{t_1} \in dx_1, \dots, B_{t_n} \in dx_n),$$
for arbitrary $0 \le t_1 \le \dots \le t_n$, is consistent.

This ensures the existence of a unique probability distribution $\mu$ on the space $\mathbb{R}^{\mathbb{R}_+}$, endowed with the product $\sigma$-field, with these marginals. Consider $B_t = \omega(t)$ (the canonical process). All the properties of Brownian Motion, except the continuity of trajectories, are then satisfied.

2.1.2 Isonormal Gaussian process

Note that $H = L^2(I)$ is a separable, infinite-dimensional Hilbert space, with scalar product
$$\langle f, g \rangle = \int_I f(t) g(t)\, dt.$$
Observe the key identity: for all $s, t \in I$,
$$\mathrm{Cov}(B_s, B_t) = \min(s, t) = \langle \mathbf{1}_{[0,s]}, \mathbf{1}_{[0,t]} \rangle.$$

Let $(e_n)_{n \in \mathbb{N}}$ be any complete orthonormal system of $H$, and let $(\xi_n)_{n \in \mathbb{N}}$ be a family of independent standard real-valued Gaussian random variables, i.e. $\xi_n \sim \mathcal{N}(0, 1)$. Then set, for all $t \in I$,
$$B_t = \sum_{n \in \mathbb{N}} \xi_n \langle \mathbf{1}_{[0,t]}, e_n \rangle.$$
These random variables are well-defined, as limits in the $L^2(\Omega)$ sense of Gaussian random variables. All the properties of Brownian Motion, except the continuity of trajectories, are satisfied. Note that the definition above does not depend on the choice of the complete orthonormal system.

At a formal level, for every $t \ge 0$, $B_t = \langle \mathbf{1}_{[0,t]}, \xi \rangle$, where $\xi = \sum_{n \in \mathbb{N}} \xi_n e_n$. However, almost surely, $\|\xi\|_{L^2(I)} = \infty$: this object is not an element of $L^2(I)$ (it is only a distribution, with negative regularity). The quantity $\xi$ is often interpreted as White Noise. It may be seen as the derivative of Brownian Motion; conversely, Brownian Motion may be seen as the antiderivative of White Noise. This interpretation suggests that Brownian Motion is not differentiable...

More generally, let $H$ be any separable, infinite-dimensional Hilbert space.

Definition 7. An $H$-isonormal Gaussian process is a mapping $\mathcal{W} : H \to L^2(\Omega)$ (or equivalently a family of random variables $(\mathcal{W}(h))_{h \in H}$) such that:

- for any $n \in \mathbb{N}$ and any $(h_1, \dots, h_n) \in H^n$, $(\mathcal{W}(h_1), \dots, \mathcal{W}(h_n))$ is a Gaussian random vector, i.e. for any $(\lambda_1, \dots, \lambda_n) \in \mathbb{R}^n$, the real random variable $\lambda_1 \mathcal{W}(h_1) + \dots + \lambda_n \mathcal{W}(h_n)$ has a (possibly degenerate) Gaussian law and is centered;

- for any $h_1, h_2 \in H$, the covariance of $\mathcal{W}(h_1)$ and $\mathcal{W}(h_2)$ is given by $\mathbb{E}[\mathcal{W}(h_1)\mathcal{W}(h_2)] = \langle h_1, h_2 \rangle_H$.

At a formal level, this mapping is constructed as $\mathcal{W}(h) = \langle h, \xi \rangle$, with $\xi$ given as above. In our context, $B_t = \mathcal{W}(\mathbf{1}_{[0,t]})$.

For an arbitrary function $h \in H = L^2([0,1])$, the random variable $\mathcal{W}(h)$ is often called the Wiener integral, and denoted by $\int_0^1 h(t)\, dB_t$. Indeed, one checks that
$$\sum_{i=0}^{N-1} h\big(\tfrac{i}{N}\big)\big(B_{\frac{i+1}{N}} - B_{\frac{i}{N}}\big) \underset{N \to \infty}{\longrightarrow} \mathcal{W}(h)$$
in distribution, say for a continuous function $h : [0, 1] \to \mathbb{R}$.

In particular, if $H = L^2(I \times D)$ for some domain $D \subset \mathbb{R}^d$, this definition provides a construction of space-time white noise, which is useful for the study of Stochastic Partial Differential Equations (SPDEs).

2.2 The continuity property

2.2.1 Abstract argument

Use the Kolmogorov-Centsov regularity criterion.

Definition 8. Let $X = (X_t)_{t \in I}$ and $\widetilde{X} = (\widetilde{X}_t)_{t \in I}$ denote two stochastic processes. The process $\widetilde{X}$ is a modification of $X$ if for every $t \in I$, $\mathbb{P}(X_t = \widetilde{X}_t) = 1$.

Theorem 4. Assume there exist $\alpha, \beta, C \in (0, \infty)$ such that for all $s, t \in I$,
$$\mathbb{E}\big[|X_t - X_s|^\alpha\big] \le C |t - s|^{1+\beta}.$$
Then $X$ admits a modification $\widetilde{X}$ with almost surely continuous trajectories. In addition, the trajectories of $\widetilde{X}$ are almost surely Hölder continuous with exponent $\gamma$, for all $\gamma \in (0, \frac{\beta}{\alpha})$.

Note that the assumption $\beta > 0$ is essential: the inequality with $\beta = 0$ is satisfied by a Poisson process, for instance, which of course does not admit a continuous modification.

Application to Brownian Motion: $\mathbb{E}|B_t - B_s|^2 = |t - s|$, which implies (for Gaussian random variables) $\mathbb{E}|B_t - B_s|^{2p} \le C_p |t - s|^p$ for every $p \in \mathbb{N}$. Thus there exists a modification $\widetilde{B}$ of $B$ with almost surely continuous trajectories; more precisely, its trajectories are Hölder continuous with exponent $\gamma$ for all $\gamma \in (0, \frac{1}{2})$. Indeed, $\frac{p-1}{2p} \underset{p \to \infty}{\longrightarrow} \frac{1}{2}$.
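The Riemann-sum approximation of the Wiener integral can be checked numerically. The sketch below is ours, not from the notes: it simulates the sum $\sum_i h(i/n)(B_{(i+1)/n} - B_{i/n})$ by drawing the increments directly, and verifies the isonormal property $\mathrm{Var}\, \mathcal{W}(h) = \|h\|^2$ for $h(t) = t$ on $[0, 1]$.

```python
import random

def wiener_integral(h, n, rng):
    """Riemann-type sum  sum_i h(i/n) (B_{(i+1)/n} - B_{i/n})  approximating W(h);
    the Brownian increments are simulated as independent N(0, 1/n) variables."""
    dt = 1.0 / n
    return sum(h(i * dt) * rng.gauss(0.0, dt ** 0.5) for i in range(n))

rng = random.Random(6)
h = lambda t: t  # test function on [0, 1]
samples = [wiener_integral(h, 200, rng) for _ in range(3000)]
mean = sum(samples) / len(samples)
var = sum((x - mean) ** 2 for x in samples) / len(samples)
# Isonormal property: W(h) is centered Gaussian with variance
# ||h||^2 = integral of t^2 over [0, 1] = 1/3.
```

The empirical variance converges to $\|h\|^2 = 1/3$ as the partition is refined and the sample size grows.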

2.2.2 Constructive argument

In fact, it is possible to directly prove the almost sure continuity of trajectories, thanks to an appropriate choice of the complete orthonormal system of $H = L^2([0,1])$. Indeed, for all $t \in [0,1]$, $n \in \mathbb{N}$ and $0 \le k \le 2^n - 1$, set
$$h_0(t) = 1, \qquad h_n^k(t) = 2^{n/2} \varphi(2^n t - k), \qquad \text{with } \varphi = \mathbf{1}_{(0, \frac{1}{2}]} - \mathbf{1}_{(\frac{1}{2}, 1]}.$$
The family $(h_0, h_n^k)_{n \in \mathbb{N},\, 0 \le k \le 2^n - 1}$ is a complete orthonormal system of $L^2([0,1])$: it is called the Haar basis. Define the antiderivatives $H_0(t) = \langle h_0, \mathbf{1}_{[0,t]} \rangle$ and $H_n^k(t) = \langle h_n^k, \mathbf{1}_{[0,t]} \rangle$: the Schauder functions. Using the associated isonormal Gaussian process construction,
$$B_t = \eta H_0(t) + \sum_{n=0}^{\infty} \Big( \sum_{k=0}^{2^n - 1} \xi_{n,k} H_n^k(t) \Big),$$
as an equality of random variables for fixed $t \ge 0$, with independent standard Gaussian random variables $(\eta, \xi_{n,k})_{n \in \mathbb{N},\, 0 \le k \le 2^n - 1}$.

Proposition 5. Almost surely, the series converges uniformly for $t \in [0,1]$.

Hence, as uniform limits of continuous functions, the trajectories $t \mapsto B_t$ are continuous, almost surely.

3 Donsker's invariance principle

Let $(X_n)_{n \in \mathbb{N}}$ be a sequence of independent and identically distributed, real-valued, square-integrable random variables. Assume $\mathbb{E}[X_n] = 0$, and let $\sigma^2 = \mathrm{Var}(X_n)$. Assume $\sigma \ne 0$. For any $N \in \mathbb{N}$, define
$$S_t^{(N)} = \frac{1}{\sigma \sqrt{N}} \Big( (1 - \{Nt\}) \sum_{n=1}^{\lfloor Nt \rfloor} X_n + \{Nt\} \sum_{n=1}^{\lfloor Nt \rfloor + 1} X_n \Big),$$
where $\lfloor s \rfloor \in \mathbb{N}_0$ is the integer part of $s$ and $\{s\} = s - \lfloor s \rfloor \in [0, 1)$. For each $N \in \mathbb{N}$, $S^{(N)}$ is a rescaled version of the piecewise linear interpolation of the standard random walk.

Theorem 5. When $N \to \infty$, the $C$-valued random variable $S^{(N)}$ converges to Brownian Motion, in distribution in $C$. This means that for any bounded continuous function $F : C \to \mathbb{R}$,
$$\mathbb{E}\big[F(S^{(N)})\big] \underset{N \to \infty}{\longrightarrow} \mathbb{E}\big[F(B)\big] = \int F\, dW_0.$$
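Donsker's principle can be illustrated at the single time $t = 1$ (where it reduces to the central limit theorem). The sketch below is ours, not from the notes: it uses Rademacher steps $X_n = \pm 1$ (so $\sigma = 1$) and checks that $S_1^{(N)}$ is approximately $\mathcal{N}(0, 1)$.

```python
import random

def rescaled_walk_endpoint(N, rng):
    """S^(N) at t = 1 for a Rademacher walk: (X_1 + ... + X_N) / sqrt(N)."""
    return sum(rng.choice((-1, 1)) for _ in range(N)) / N ** 0.5

rng = random.Random(1)
samples = [rescaled_walk_endpoint(256, rng) for _ in range(2000)]
# Theorem 5 at the fixed time t = 1: S^(N)_1 is approximately N(0, 1),
# so the empirical mean should be near 0 and the empirical variance near 1.
mean = sum(samples) / len(samples)
var = sum((x - mean) ** 2 for x in samples) / len(samples)
```

The full statement of Theorem 5 is stronger: it concerns functionals of the whole path, not just one marginal.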

4 Some selected properties of Brownian Motion

4.1 Sample path properties

In other words: almost sure properties of the trajectories of Brownian Motion.

4.1.1 Law of Large Numbers

Theorem 6. Let $(B_t)_{t \in \mathbb{R}_+}$ be a Brownian Motion. Then almost surely
$$\frac{B_t}{t} \underset{t \to \infty}{\longrightarrow} 0.$$

Corollary 7 (Time-inversion). Set $\widetilde{B}_0 = 0$ and, for $t \in (0, \infty)$, $\widetilde{B}_t = t B_{1/t}$. Then $(\widetilde{B}_t)_{t \in \mathbb{R}_+}$ is a Brownian Motion.

4.1.2 Hitting time

For all $a \in (0, \infty)$, set $T_a = \inf\{t \ge 0\,;\ B_t = a\}$.

Proposition 6. For any $a \in (0, \infty)$, $T_a$ is a stopping time. Moreover, $T_a < \infty$ almost surely. Finally, for every $\lambda \in \mathbb{R}_+$,
$$\mathbb{E}[e^{-\lambda T_a}] = e^{-\sqrt{2\lambda}\, a}.$$

Sketch of proof: since trajectories are almost surely continuous, for every $t \ge 0$,
$$\{T_a \le t\} = \bigcap_{\epsilon \in \mathbb{Q} \cap (0, \infty)}\ \bigcup_{\tau \in \mathbb{Q} \cap [0, t]} \{B_\tau \ge a - \epsilon\} \in \mathcal{F}_t.$$
Then apply the optional stopping theorem (at time $t \wedge T_a$) to the martingale $\big(\exp(\theta B_t - \frac{\theta^2}{2} t)\big)_{t \ge 0}$, for every $\theta \in (0, \infty)$. Let first $t \to \infty$, then $\theta \to 0$.

4.1.3 Supremum and infimum

For every $t \in \mathbb{R}_+$, define $S_t = \sup_{0 \le s \le t} B_s$.

Proposition 7. For every $t > 0$, $\mathbb{P}(S_t > 0) = 1$.

Sketch of proof: consider $\mathbb{P}(S_t > \frac{1}{n})$. Use the scaling property of Brownian Motion, the fact that $T_1 < \infty$ almost surely, and let $n \to \infty$.

Corollary 8. Almost surely, for every $t \in (0, \infty)$, $\sup_{0 \le s \le t} B_s > 0$ and $\inf_{0 \le s \le t} B_s < 0$.

Corollary 9. Almost surely, $t \mapsto B_t$ is not monotonic on any interval.

Proposition 8. Almost surely,
$$\limsup_{t \to \infty} B_t = +\infty \quad \text{and} \quad \liminf_{t \to \infty} B_t = -\infty.$$

Corollary 10. Let $t_0 \in \mathbb{R}_+$. Almost surely,
$$\limsup_{h \to 0} \frac{|B_{t_0 + h} - B_{t_0}|}{|h|} = \infty.$$
In particular, almost surely the trajectories are not differentiable at time $t_0$.

The statement that almost surely the trajectories are nowhere differentiable also holds true, but the proof requires more subtle arguments.

4.1.4 Zeros

Proposition 9. Define the random set $\chi = \{t \ge 0\,;\ B_t = 0\}$. Almost surely, $\chi$ is closed and unbounded, $\chi$ has zero Lebesgue measure, and $0$ is an accumulation point of $\chi$.

4.2 Quadratic variation

For a locally finite subdivision $\pi = \{t_0 = 0 < t_1 < \dots < t_i < \dots\}$, define
$$V_t^\pi = \sum_{t_i \in \pi \cap [0, t]} \big(B_{t_{i+1}} - B_{t_i}\big)^2.$$

Theorem 11. When $|\pi| = \sup_i |t_{i+1} - t_i| \to 0$, then $V_t^\pi \to t$, in probability and in $L^2$, for every $t \ge 0$. Moreover, if $(\pi_k)_{k \in \mathbb{N}}$ is a sequence of subdivisions such that $\sum_{k \in \mathbb{N}} |\pi_k| < \infty$, then the convergence holds in the almost sure sense.

Corollary 12. Let $\gamma > \frac{1}{2}$ and $0 \le T_1 < T_2$. Almost surely, the trajectories of Brownian Motion are not $\gamma$-Hölder continuous on $[T_1, T_2]$.

More generally:

Definition 9. Let $(X(t))_{t \ge 0}$ be a stochastic process. It is of finite quadratic variation if there exists a finite process $(\langle X \rangle_t)_{t \ge 0}$ such that for every $t \ge 0$,
$$V_t^\pi(X) \underset{|\pi| \to 0}{\longrightarrow} \langle X \rangle_t \quad \text{in probability, with } V_t^\pi(X) = \sum_{t_i \in \pi \cap [0, t]} \big(X_{t_{i+1}} - X_{t_i}\big)^2.$$

Hence Brownian Motion is of finite quadratic variation, with $\langle B \rangle_t = t$.
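Theorem 11 is easy to observe numerically. The sketch below is ours, not from the notes: on a fine uniform partition of $[0, 1]$, the sum of squared increments of a simulated path should be close to $t = 1$.

```python
import random

def quadratic_variation(path):
    """V^pi_t: sum of squared increments along a discrete path."""
    return sum((b - a) ** 2 for a, b in zip(path, path[1:]))

def brownian_path(T, n, rng):
    """Discretized Brownian path with n independent N(0, T/n) increments."""
    dt = T / n
    path = [0.0]
    for _ in range(n):
        path.append(path[-1] + rng.gauss(0.0, dt ** 0.5))
    return path

rng = random.Random(2)
# Theorem 11: as the mesh |pi| tends to 0, V^pi_1 tends to 1 in probability.
qv_fine = quadratic_variation(brownian_path(1.0, 4096, rng))
```

For $n$ increments, $V_1^\pi$ has mean $1$ and variance $2/n$, so the concentration around $t = 1$ is visible already for a few thousand points.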

Proposition 10. For every $t \ge 0$, one has the following convergence, in $L^2$:
$$\sum_{t_i \in \pi \cap [0, t]} 2 B_{t_i} \big(B_{t_{i+1}} - B_{t_i}\big) \underset{|\pi| \to 0}{\longrightarrow} B_t^2 - t.$$
However, observe that
$$\sum_{t_i \in \pi \cap [0, t]} \big[B_{t_i} + B_{t_{i+1}}\big] \big(B_{t_{i+1}} - B_{t_i}\big) \underset{|\pi| \to 0}{\longrightarrow} B_t^2.$$
The stochastic integral with respect to Brownian Motion is thus a subtle object.

4.3 Some applications of the Markov property

4.3.1 Blumenthal 0-1 law

Theorem 13 (Blumenthal). Let $\mathcal{F}_t = \sigma(B_s,\ 0 \le s \le t)$, and $\mathcal{F}_0^+ = \bigcap_{t > 0} \mathcal{F}_t$. The $\sigma$-field $\mathcal{F}_0^+$ is trivial: for every $A \in \mathcal{F}_0^+$, $\mathbb{P}(A) \in \{0, 1\}$.

Sketch of proof: one shows that $A$ is independent of $(B_{t_1}, \dots, B_{t_n})$, for all $0 < t_1 < \dots < t_n$. For all $\epsilon < t_1$, $A \in \mathcal{F}_\epsilon$, and $(B_{t_1} - B_\epsilon, \dots, B_{t_n} - B_\epsilon)$ is independent of $\mathcal{F}_\epsilon$. Pass to the limit $\epsilon \to 0$. One obtains $\mathbb{P}(A) = \mathbb{P}(A)^2$, thus $\mathbb{P}(A) \in \{0, 1\}$.

4.3.2 Reflection principle

Proposition 11. For any $a \in (0, \infty)$ and $t \in \mathbb{R}_+$,
$$\mathbb{P}(S_t \ge a) = \mathbb{P}(T_a \le t) = 2\, \mathbb{P}(B_t \ge a) = \mathbb{P}(|B_t| \ge a),$$
with $S_t = \sup_{0 \le s \le t} B_s$.

Remark 14. The equality $S_t \overset{d}{=} |B_t|$ holds only for a single time $t$; it is not an equality in law for the processes: the trajectories $t \mapsto S_t$ are almost surely non-decreasing.

Sketch of proof:
$$\mathbb{P}(S_t \ge a) = \mathbb{P}(S_t \ge a, B_t > a) + \mathbb{P}(S_t \ge a, B_t < a).$$
Since $\{B_t > a\} \subset \{S_t \ge a\}$, the first term equals $\mathbb{P}(B_t > a)$. For the second,
$$\mathbb{P}(S_t \ge a, B_t < a) = \mathbb{P}\big(T_a \le t,\ B_{T_a + (t - T_a)} - B_{T_a} < 0\big) = \frac{\mathbb{P}(T_a \le t)}{2},$$
thanks to the strong Markov property at $T_a < \infty$.
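Proposition 11 can be checked by Monte Carlo simulation. The sketch below is ours, not from the notes; note that the maximum over a discrete grid slightly undershoots the true supremum, so the identity only holds approximately at a finite step size.

```python
import random

def sup_and_end(n, rng):
    """Running maximum and final value of a discretized Brownian path on [0, 1]."""
    dt = 1.0 / n
    b = s = 0.0
    for _ in range(n):
        b += rng.gauss(0.0, dt ** 0.5)
        s = max(s, b)
    return s, b

rng = random.Random(3)
a = 0.5
samples = [sup_and_end(1000, rng) for _ in range(3000)]
p_sup = sum(s >= a for s, _ in samples) / len(samples)
p_end = sum(b >= a for _, b in samples) / len(samples)
# Proposition 11: P(S_1 >= a) = 2 P(B_1 >= a), so p_sup should be close to 2 * p_end.
```

Here $2\,\mathbb{P}(B_1 \ge 0.5) = 2(1 - \Phi(0.5)) \approx 0.617$, which both estimates approach as the grid is refined.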

Corollary 15. For every $a \in (0, \infty)$, $T_a$ admits the density
$$f_a(t) = \frac{a}{\sqrt{2\pi t^3}} \exp\Big(-\frac{a^2}{2t}\Big) \mathbf{1}_{t > 0}.$$
In particular, $\mathbb{E}[T_a] = \infty$.

Remark 16. Be careful: by the martingale property, $\mathbb{E}[B_{t \wedge T_a}^2 - (t \wedge T_a)] = 0$, but taking the limit $t \to \infty$ is not useful here. On the contrary, one may prove $\mathbb{E}[T_a \wedge T_{-a}] = a^2$ with this strategy, where $T_{-a} = \inf\{t \ge 0\,;\ B_t = -a\}$, so that $T_a \wedge T_{-a} = \inf\{t \ge 0\,;\ |B_t| = a\}$.

4.3.3 Zeros

Proposition 12. The set $\chi = \{t \ge 0\,;\ B_t = 0\}$ has no isolated point, almost surely. In particular, $\chi$ is not countable.

Sketch of proof: for every $q \in \mathbb{Q}_+$, set $d_q = \inf\{t > q\,;\ B_t = 0\}$. Note that $d_q \in \chi$, and that $d_q$ is a stopping time. Thanks to the strong Markov property, $d_q$ is almost surely an accumulation point of $\chi$ (i.e. a limit of points in $\chi \setminus \{d_q\}$). Set
$$N = \bigcup_{q \in \mathbb{Q}_+} \{d_q \text{ is not an accumulation point of } \chi\}.$$
Then $\mathbb{P}(N) = 0$. On $\Omega \setminus N$, any point $h \in \chi$ is a limit of points in $\chi \setminus \{h\}$: indeed, consider a non-decreasing sequence $q_n \in \mathbb{Q}_+$ with $q_n \uparrow h$. Then either $h = d_{q_n}$ for some $n$, and $d_{q_n}$ is an accumulation point; or $d_{q_n} < h$ for all $n$, with $d_{q_n} \in \chi$ and $d_{q_n} \to h$.

5 PDEs and Brownian Motion

5.1 The heat equation on $\mathbb{R}^d$

Consider the PDE
$$\begin{cases} \dfrac{\partial u(t, x)}{\partial t} = \dfrac{1}{2} \Delta u(t, x) = \dfrac{1}{2} \displaystyle\sum_{i=1}^{d} \dfrac{\partial^2 u(t, x)}{\partial x_i^2}, & t > 0,\ x \in \mathbb{R}^d, \\ u(0, x) = f(x), & x \in \mathbb{R}^d. \end{cases}$$
Assume that the initial condition $f : \mathbb{R}^d \to \mathbb{R}$ is bounded and continuous. The initial condition is interpreted in the following sense: $u(t, \cdot) \underset{t \to 0}{\longrightarrow} f(\cdot)$, uniformly on compact sets.

Theorem 17. Define
$$u(t, x) = \int_{\mathbb{R}^d} f(y)\, p(t, x, y)\, dy = \frac{1}{(2\pi t)^{d/2}} \int_{\mathbb{R}^d} f(y)\, e^{-\frac{|y - x|^2}{2t}}\, dy.$$

Then $u$ is of class $C^\infty$ on $(0, \infty) \times \mathbb{R}^d$. Moreover, it is a solution of the heat equation with initial condition $f$.

This solution admits a probabilistic interpretation:
$$u(t, x) = \mathbb{E}[f(x + B_t)] = \mathbb{E}_x[f(B_t)].$$

5.2 A martingale

Theorem 18. Let $v : (t, x) \in \mathbb{R}_+ \times \mathbb{R}^d \mapsto v(t, x) \in \mathbb{R}$ be of class $C_b^{1,2}$; this means:

- $v$ is continuously differentiable with respect to the time variable $t$,
- $v$ is twice continuously differentiable with respect to the space variable $x$,
- $v$ and the associated derivatives $\frac{\partial v}{\partial t}$, $\frac{\partial v}{\partial x_i}$, $\frac{\partial^2 v}{\partial x_i \partial x_j}$ are bounded.

Define, for all $t \ge 0$,
$$M_t = v(t, B_t) - v(0, B_0) - \int_0^t \Big( \frac{\partial}{\partial s} + \frac{1}{2} \Delta \Big) v(s, B_s)\, ds.$$
Then $(M_t)_{t \ge 0}$ is a continuous martingale.

A probabilistic proof will be obtained using the so-called Itô formula. An analytic proof, using the Markov property of Brownian Motion and properties of the heat kernel, can also be carried out.

5.3 Some consequences

5.3.1 The heat equation on $\mathbb{R}^d$ returns

Let $u$ be a solution of the heat equation, of class $C_b^{1,2}$. Let $T > 0$ be given, and define $v(t, \cdot) = u(T - t, \cdot)$ for $t \in [0, T]$. Then $v$ satisfies $\frac{\partial v}{\partial t} + \frac{1}{2} \Delta v = 0$. Thus $\big(v(t, B_t) - v(0, B_0)\big)_{t \in [0, T]}$ is a martingale; in particular,
$$u(T, x) = v(0, x) = \mathbb{E}_x[v(0, B_0)] = \mathbb{E}_x[v(T, B_T)] = \mathbb{E}_x[u(0, B_T)],$$
where $\mathbb{E}_x$ means that $B_0 = x$ almost surely.

5.3.2 Harmonic functions

Definition 10. Let $D \subset \mathbb{R}^d$ be an open domain. A function $u : D \to \mathbb{R}$ is called harmonic if it is of class $C^2$ and satisfies $\Delta u = 0$ on $D$.
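The probabilistic representation $u(t, x) = \mathbb{E}[f(x + B_t)]$ suggests a Monte Carlo solver for the heat equation. The sketch below is ours, not from the notes: for $f(y) = y^2$ the solution is explicit, $u(t, x) = x^2 + t$, so the estimate can be compared against it.

```python
import random

def heat_solution_mc(f, t, x, n_samples, rng):
    """Monte Carlo estimate of u(t, x) = E[f(x + B_t)], sampling B_t ~ N(0, t)."""
    total = 0.0
    for _ in range(n_samples):
        total += f(x + rng.gauss(0.0, t ** 0.5))
    return total / n_samples

rng = random.Random(4)
# For f(y) = y^2:  u(t, x) = E[(x + B_t)^2] = x^2 + t.
u_mc = heat_solution_mc(lambda y: y * y, t=0.5, x=1.0, n_samples=20000, rng=rng)
u_exact = 1.0 + 0.5
```

The error decreases like $1/\sqrt{n_{\text{samples}}}$, as usual for Monte Carlo estimators.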

Definition 11. Let $D \subset \mathbb{R}^d$ be an open domain. A function $u : D \to \mathbb{R}$ satisfies the mean value property if
$$u(a) = \int_{\partial B(a, r)} u(x)\, d\mu_{a,r}(x)$$
for every $a \in D$ and $r > 0$ such that the closed ball $\bar{B}(a, r) \subset D$, where $\mu_{a,r}$ is the uniform distribution on the sphere $\partial B(a, r)$.

Theorem 19. Assume that $u : D \to \mathbb{R}$ is harmonic. Then $\big(u(B_t) - u(B_0)\big)_{t \ge 0}$ is a centered continuous martingale.

Theorem 20. Let $\bar{B}(a, r) \subset D$. Assume $B_0 = a$, and let $\tau_{a,r} = \inf\{t > 0\,;\ B_t \notin B(a, r)\}$. Then $\tau_{a,r} < \infty$ almost surely, and $B_{\tau_{a,r}} \sim \mu_{a,r}$ is uniformly distributed on the sphere.

Corollary 21. If a function is harmonic, then it satisfies the mean value property.

5.3.3 The Dirichlet problem

Let $D \subset \mathbb{R}^d$ be an open bounded domain, and let $f : \partial D \to \mathbb{R}$ be a continuous function.

Definition 12. A function $u : \bar{D} \to \mathbb{R}$ is a solution of the Dirichlet problem $(D, f)$ if $u$ is continuous on $\bar{D}$, $u$ is of class $C^2$ on $D$, and $u$ solves the PDE
$$\begin{cases} \Delta u(x) = 0, & x \in D, \\ u(x) = f(x), & x \in \partial D. \end{cases}$$

Theorem 22. Assume that $u$ is a solution of the Dirichlet problem $(D, f)$. Then for every $x \in D$,
$$u(x) = \mathbb{E}_x[f(B_{\tau_D})], \quad \text{with } \tau_D = \inf\{t \ge 0\,;\ B_t \notin D\}.$$
In addition, $\tau_D < \infty$ almost surely. Conversely, under an additional regularity assumption on the domain $D$, the function $x \mapsto \mathbb{E}_x[f(B_{\tau_D})]$ is a solution of the Dirichlet problem $(D, f)$.

5.3.4 Exit problem

Let $(B_t)_{t \in \mathbb{R}_+}$ be a one-dimensional Brownian Motion, and let $a, b > 0$. Let $T_{a,b} = \inf\{t \ge 0\,;\ B_t \notin (-a, b)\}$. Recall that almost surely, $T_{a,b} < \infty$.

Proposition 13. One has
$$\mathbb{P}(B_{T_{a,b}} = b) = \frac{a}{a + b}, \qquad \mathbb{P}(B_{T_{a,b}} = -a) = \frac{b}{a + b}, \qquad \mathbb{E}[T_{a,b}] = ab.$$
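Theorem 20 and Corollary 21 can be illustrated numerically in dimension $d = 2$. The sketch below is ours, not from the notes: $u(x, y) = x^2 - y^2$ is harmonic on $\mathbb{R}^2$, so averaging it over simulated exit points of the unit disk (starting from the center) should recover $u(0, 0) = 0$.

```python
import random

def exit_point_unit_disk(dt, rng):
    """2-d Brownian motion started at the origin, run until it leaves the
    unit disk; returns the (slightly overshot) exit point."""
    x = y = 0.0
    while x * x + y * y < 1.0:
        x += rng.gauss(0.0, dt ** 0.5)
        y += rng.gauss(0.0, dt ** 0.5)
    return x, y

rng = random.Random(7)
u = lambda x, y: x * x - y * y  # harmonic on R^2
vals = [u(*exit_point_unit_disk(1e-2, rng)) for _ in range(2000)]
# Theorem 20 + Corollary 21: the exit point is uniform on the circle, so the
# average of u over exit points should be close to u(0, 0) = 0.
mean_val = sum(vals) / len(vals)
```

By isotropy the estimator is centered regardless of the small radial overshoot of the discretized exit point.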

Sketch of proof: the function $u(x) = \frac{x + a}{b + a}$ solves the PDE
$$\begin{cases} u''(x) = 0, & x \in (-a, b), \\ u(-a) = 0,\ u(b) = 1, \end{cases}$$
and the function $v(x) = (x + a)(b - x)$ solves the PDE
$$\begin{cases} \frac{1}{2} v''(x) = -1, & x \in (-a, b), \\ v(-a) = v(b) = 0. \end{cases}$$
Applying the optional stopping theorem to the associated martingales (namely $u(B_t)$ and $v(B_t) + t$) yields the results.
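Proposition 13 is also easy to verify by simulation. The sketch below is ours, not from the notes; the Euler discretization slightly overestimates both quantities, since the discrete walk can overshoot the boundary and misses intermediate crossings.

```python
import random

def exit_time_and_side(a, b, dt, rng):
    """Euler simulation of B until it leaves (-a, b); returns (exit time, hit b?)."""
    t, x = 0.0, 0.0
    while -a < x < b:
        x += rng.gauss(0.0, dt ** 0.5)
        t += dt
    return t, x >= b

rng = random.Random(5)
a, b = 1.0, 2.0
runs = [exit_time_and_side(a, b, 1e-2, rng) for _ in range(2000)]
p_hit_b = sum(hit for _, hit in runs) / len(runs)
mean_T = sum(t for t, _ in runs) / len(runs)
# Proposition 13: P(B_T = b) = a/(a+b) = 1/3 and E[T] = ab = 2
# (up to discretization bias).
```

Reducing `dt` shrinks the bias at the cost of longer runs, the usual trade-off for stopped diffusions.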