Brownian Motion
Albert Cain
Contents

1 Definition
  1.1 Brownian Motion
  1.2 Wiener measure
2 Construction
  2.1 Gaussian process
    2.1.1 Abstract argument
    2.1.2 Isonormal Gaussian process
  2.2 The continuity property
    2.2.1 Abstract argument
    2.2.2 Constructive argument
3 Donsker's invariance principle
4 Some selected properties of Brownian Motion
  4.1 Sample path properties
    4.1.1 Law of Large Numbers
    4.1.2 Hitting time
    4.1.3 Supremum and infimum
    4.1.4 Zeros
  4.2 Quadratic variation
  4.3 Some applications of the Markov property
    4.3.1 Blumenthal 0-1 law
    4.3.2 Reflection principle
    4.3.3 Zeros
5 PDEs and Brownian Motion
  5.1 The heat equation on R^d
  5.2 A martingale
  5.3 Some consequences
    5.3.1 The heat equation on R^d returns
    5.3.2 Harmonic functions
    5.3.3 The Dirichlet problem
    5.3.4 Exit problem
1 Definition

1.1 Brownian Motion

Definition 1. A (real-valued) Brownian Motion is a stochastic process $(t,\omega) \in \mathbb{R}_+ \times \Omega \mapsto B_t(\omega)$, such that:
1. $B_0 = 0$ a.s.;
2. the increments are stationary: for any $0 \le s \le t$, $B_t - B_s \overset{(d)}{=} B_{t-s}$;
3. the increments are independent: for any $n \in \mathbb{N}$ and any $0 = t_0 \le t_1 \le \dots \le t_n$, the random variables $(B_{t_{i+1}} - B_{t_i})_{0 \le i \le n-1}$ are independent;
4. for any $t \ge 0$, $B_t \sim \mathcal{N}(0,t)$;
5. the trajectories are almost surely continuous: for $\mathbb{P}$-almost every $\omega \in \Omega$, the mapping $t \in \mathbb{R}_+ \mapsto B_t(\omega)$ is continuous.

One may define Brownian Motion with respect to a given filtration; this may be useful in some situations. To simplify, in these notes we only consider natural filtrations.

Proposition 1. A Brownian Motion is a Gaussian process: for any $n \in \mathbb{N}$, any $0 \le t_1 \le \dots \le t_n$ and any $\lambda_1, \dots, \lambda_n \in \mathbb{R}$, $\lambda_1 B_{t_1} + \dots + \lambda_n B_{t_n}$ is a real Gaussian random variable. As a Gaussian process, it is characterized by its mean and covariance: for all $t, s \in \mathbb{R}_+$,
$$\mathbb{E}[B_t] = 0, \qquad \mathrm{Cov}(B_t, B_s) = \min(s,t).$$

Theorem 1. Let $(B_t)_{t \in \mathbb{R}_+}$ be a Brownian Motion. It is both a martingale and a Markov process: for all $0 \le s \le t$ and any bounded measurable function $f : \mathbb{R} \to \mathbb{R}$,
$$\mathbb{E}[B_t \mid \mathcal{F}_s] = B_s, \qquad \mathbb{E}[f(B_t) \mid \mathcal{F}_s] = \mathbb{E}[f(B_t) \mid B_s],$$
with $\mathcal{F}_t = \sigma(B_r,\ 0 \le r \le t)$. In addition, $(B_{s+t} - B_s)_{t \in \mathbb{R}_+}$ is a Brownian Motion, independent of $\mathcal{F}_s$. Moreover, $(B_t^2 - t)_{t \in \mathbb{R}_+}$ is a martingale.

Definition 2. Let $(B_t)_{t \in \mathbb{R}_+}$ be a Brownian Motion, and let $(\mathcal{F}_t)_{t \in \mathbb{R}_+}$ denote the natural filtration: $\mathcal{F}_t = \sigma(B_r,\ 0 \le r \le t)$ for all $t \ge 0$.
Let $T$ be a $[0,\infty]$-valued random variable. It is called a stopping time if for every $t \in \mathbb{R}_+$,
$$\{T \le t\} \in \mathcal{F}_t.$$
Let
$$\mathcal{F}_T = \bigl\{ A \in \mathcal{F};\ A \cap \{T \le t\} \in \mathcal{F}_t \ \forall t \in \mathbb{R}_+ \bigr\}$$
denote the associated σ-field.

Theorem 2 (Strong Markov property). Let $(B_t)_{t \in \mathbb{R}_+}$ be a Brownian Motion, and let $T$ be a stopping time. Assume that $T$ is almost surely finite: $\mathbb{P}(T = \infty) = 0$. Then $(B_{t+T} - B_T)_{t \in \mathbb{R}_+}$ is a Brownian Motion, independent of $\mathcal{F}_T$.

Proposition 2 (Scaling). Let $(B_t)_{t \in \mathbb{R}_+}$ be a Brownian Motion, and let $a \in (0,\infty)$. Define $B_t^{(a)} = a^{-1} B_{a^2 t}$ for all $t \in \mathbb{R}_+$. Then $(B_t^{(a)})_{t \in \mathbb{R}_+}$ is a Brownian Motion.

Definition 3. Let $d \in \mathbb{N}$. A d-dimensional Brownian Motion is an $\mathbb{R}^d$-valued process, with the notation $B_t = (B_t^1, \dots, B_t^d)$, such that the processes $(B_t^l)_{t \in \mathbb{R}_+}$, for $l \in \{1, \dots, d\}$, are independent Brownian Motions.

Proposition 3 (Isotropy). Let $d \in \mathbb{N}$, and let $R \in O(d)$ be an orthogonal matrix. If $(B_t)_{t \in \mathbb{R}_+}$ is a d-dimensional Brownian Motion, then $(R B_t)_{t \in \mathbb{R}_+}$ is also a d-dimensional Brownian Motion.

Theorem 3 (Lévy's characterization). Let $M = (M(t))_{t \in \mathbb{R}_+}$ be a continuous stochastic process. The following statements are equivalent:
(i) $M$ is a Brownian Motion;
(ii) $(M(t))_{t \in \mathbb{R}_+}$ and $(M(t)^2 - t)_{t \in \mathbb{R}_+}$ are martingales.

1.2 Wiener measure

Definition 4. Let $C = C(\mathbb{R}_+, \mathbb{R})$ denote the space of continuous functions from $\mathbb{R}_+$ to $\mathbb{R}$. Define the distance
$$d(f,g) = \sum_{n=1}^{\infty} 2^{-n} \min\Bigl(1,\ \sup_{0 \le t \le n} |f(t) - g(t)|\Bigr).$$

Subtlety. On the one hand, a stochastic process $(X_t)_{t \in \mathbb{R}_+}$ is a random variable with values in the space $\mathbb{R}^{\mathbb{R}_+}$ of all functions from $\mathbb{R}_+$ to $\mathbb{R}$, equipped with the product σ-field.
On the other hand, $C = C(\mathbb{R}_+, \mathbb{R})$ is not a measurable subset of $\mathbb{R}^{\mathbb{R}_+}$ for this product σ-field. Care is needed to define the law of Brownian Motion in the space of continuous functions.

Proposition 4. The mapping $d$ is a distance on the set $C$; it corresponds to uniform convergence on compact sets. Moreover, $(C,d)$ is a Polish space: it is complete and separable. The associated Borel σ-field $\mathcal{B}(C)$ coincides with the σ-field $\sigma\bigl(w \mapsto w(t);\ t \in \mathbb{R}_+\bigr)$. As a consequence, the mapping
$$\Phi : \omega \in (\Omega, \mathcal{F}) \mapsto \bigl(B_t(\omega)\bigr)_{t \in \mathbb{R}_+} \in (C, \mathcal{B}(C))$$
is measurable.

Definition 5. The Wiener measure $W_0$ is the image of the probability distribution $\mathbb{P}$ by the mapping $\Phi$. In this context, $C$ is called the Wiener space.

The Wiener measure only depends on the finite-dimensional marginals of the process. Alternative construction: define $\widetilde{W}_0$ as the law of the process $(B_t)_{t \in \mathbb{R}_+}$, considered as an $\mathbb{R}^{\mathbb{R}_+}$-valued random variable. Observe that $\widetilde{W}_0\bigl(\mathbb{R}^{\mathbb{R}_+} \setminus C\bigr) = 0$. For any Borel set $\Gamma \in \mathcal{B}(C)$, define $W_0(\Gamma) = \widetilde{W}_0(\widetilde{\Gamma})$, where $\widetilde{\Gamma} \subset \mathbb{R}^{\mathbb{R}_+}$ is any measurable set such that $\Gamma = \widetilde{\Gamma} \cap C$. One then checks that the probability measure $W_0$ is well-defined on $C$.

Definition 6 (Canonical process). Let $(\Omega, \mathcal{F}, \mathbb{P}) = (C, \mathcal{B}(C), W_0)$. Set $B_t(\omega) = \omega(t)$, for all $t \in \mathbb{R}_+$ and $\omega \in C$. Then $(B_t)_{t \in \mathbb{R}_+}$ defines a Brownian Motion, referred to as the canonical version of Brownian Motion.

Brownian Motion is often called the Wiener process. For any $x \in \mathbb{R}$, Brownian Motion starting at $x$ is the process $(x + B_t)_{t \in \mathbb{R}_+}$. The associated Wiener measure, starting at $x$, is denoted by $W_x$.

2 Construction

Let $I = [0,1]$ or $I = \mathbb{R}_+$.

2.1 Gaussian process

2.1.1 Abstract argument

Use of the Kolmogorov extension theorem. Indeed, the family of marginals
$$\mu_{t_1, \dots, t_n} = \mathbb{P}\bigl(B_{t_1} \in dx_1, \dots, B_{t_n} \in dx_n\bigr),$$
for arbitrary $0 \le t_1 \le \dots \le t_n$, is consistent.
This ensures the existence of a unique probability distribution $\mu$ on the space $\mathbb{R}^{\mathbb{R}_+}$, endowed with the product σ-field, with these marginals. Consider $B_t(\omega) = \omega(t)$ (the canonical process). All the properties of Brownian Motion, except the continuity of trajectories, are satisfied.

2.1.2 Isonormal Gaussian process

Note that $H = L^2(I)$ is a separable, infinite-dimensional Hilbert space, with scalar product $\langle f, g \rangle = \int_I f(t) g(t)\, dt$. Observe the key identity: for all $s, t \in I$,
$$\mathrm{Cov}(B_s, B_t) = \min(s,t) = \bigl\langle 1_{[0,s]}, 1_{[0,t]} \bigr\rangle.$$
Let $(e_n)_{n \in \mathbb{N}}$ be (any) complete orthonormal system of $H$, and $(\xi_n)_{n \in \mathbb{N}}$ a family of independent standard real-valued Gaussian random variables, i.e. $\xi_n \sim \mathcal{N}(0,1)$. Then set, for all $t \in I$,
$$B_t = \sum_{n \in \mathbb{N}} \xi_n \bigl\langle 1_{[0,t]}, e_n \bigr\rangle.$$
These random variables are well-defined, as limits in the $L^2(\Omega)$ sense of Gaussian random variables. All the properties of Brownian Motion, except the continuity of trajectories, are satisfied. Note that the definition above does not depend on the choice of the complete orthonormal system.

At a formal level, for every $t \ge 0$, $B_t = \langle 1_{[0,t]}, \xi \rangle$, where $\xi = \sum_{n \in \mathbb{N}} \xi_n e_n$. However, almost surely, $\|\xi\|_{L^2(I)} = \infty$: this object is not an element of $L^2(I)$ (it is only a distribution with negative regularity). The quantity $\xi$ is often interpreted as White Noise. It may be seen as the derivative of Brownian Motion; conversely, Brownian Motion may be seen as the antiderivative of White Noise. This interpretation suggests that Brownian Motion is not differentiable...

More generally, let $H$ be any separable, infinite-dimensional Hilbert space.

Definition 7. An H-isonormal Gaussian process is a mapping $\mathcal{W} : H \to L^2(\Omega)$ (or equivalently a family of random variables $(\mathcal{W}(h))_{h \in H}$) such that:
- for any $n \in \mathbb{N}$ and any $(h_1, \dots, h_n) \in H^n$, $(\mathcal{W}(h_1), \dots, \mathcal{W}(h_n))$ is a Gaussian random vector, i.e. for any $(\lambda_1, \dots, \lambda_n) \in \mathbb{R}^n$, the real random variable $\lambda_1 \mathcal{W}(h_1) + \dots + \lambda_n \mathcal{W}(h_n)$ has a (possibly degenerate) Gaussian law and is centered;
- for any $h_1, h_2 \in H$, the covariance of $\mathcal{W}(h_1)$ and $\mathcal{W}(h_2)$ is given by $\mathbb{E}[\mathcal{W}(h_1)\mathcal{W}(h_2)] = \langle h_1, h_2 \rangle_H$.

At a formal level, this mapping is constructed as $\mathcal{W}(h) = \langle h, \xi \rangle$, with $\xi$ given as above. In our context, $B_t = \mathcal{W}(1_{[0,t]})$.

For an arbitrary function $h \in H = L^2([0,1])$, the random variable $\mathcal{W}(h)$ is often called the Wiener integral, and denoted by $\int_0^1 h(t)\, dB_t$. Indeed, check that
$$\sum_{i=0}^{N-1} h\bigl(\tfrac{i}{N}\bigr)\bigl(B_{\frac{i+1}{N}} - B_{\frac{i}{N}}\bigr) \underset{N \to \infty}{\longrightarrow} \mathcal{W}(h),$$
in distribution, say for a continuous function $h : [0,1] \to \mathbb{R}$.

In particular, if $H = L^2(I \times D)$, for some domain $D \subset \mathbb{R}^d$, this definition provides a construction of space-time white noise, which is useful for the study of Stochastic Partial Differential Equations (SPDEs).

2.2 The continuity property

2.2.1 Abstract argument

Use of the Kolmogorov-Centsov regularity criterion.

Definition 8. Let $X = (X_t)_{t \in I}$ and $\widetilde{X} = (\widetilde{X}_t)_{t \in I}$ denote two stochastic processes. The process $\widetilde{X}$ is a modification of $X$ if for every $t \in I$, $\mathbb{P}(X_t = \widetilde{X}_t) = 1$.

Theorem 4. Assume there exist $\alpha, \beta, C \in (0,\infty)$ such that for all $t, s \in I$,
$$\mathbb{E}\bigl[|X_t - X_s|^{\alpha}\bigr] \le C |t-s|^{1+\beta}.$$
Then $X$ admits a modification $\widetilde{X}$ with almost surely continuous trajectories. In addition, the trajectories of $\widetilde{X}$ are almost surely Hölder continuous with exponent $\gamma$, for all $\gamma \in (0, \frac{\beta}{\alpha})$.

Note that the assumption $\beta > 0$ is essential: with $\beta = 0$ the inequality is satisfied by a Poisson process, for instance, which of course does not admit a continuous modification.

Application for Brownian Motion: $\mathbb{E}|B_t - B_s|^2 = |t-s|$, which implies (Gaussian random variables) $\mathbb{E}|B_t - B_s|^{2p} \le C_p |t-s|^p$ for every $p \in \mathbb{N}$. Thus there exists a modification $\widetilde{B}$ of $B$, with almost surely continuous trajectories; more precisely, its trajectories are Hölder continuous with exponent $\gamma$ for all $\gamma \in (0, \frac{1}{2})$. Indeed, $\frac{p-1}{2p} \underset{p \to \infty}{\longrightarrow} \frac{1}{2}$.
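As a quick numerical illustration (not part of the original notes), the identities $\mathbb{E}|B_t - B_s|^2 = |t-s|$ and $\mathrm{Cov}(B_s, B_t) = \min(s,t)$ can be checked by simulating Brownian paths as cumulative sums of independent Gaussian increments; all parameter values below are arbitrary choices:

```python
import numpy as np

# Simulate Brownian paths on [0, 1] by cumulating independent N(0, dt) increments.
rng = np.random.default_rng(0)
n_paths, n_steps = 50_000, 200
dt = 1.0 / n_steps
increments = rng.normal(0.0, np.sqrt(dt), size=(n_paths, n_steps))
B = np.concatenate([np.zeros((n_paths, 1)), np.cumsum(increments, axis=1)], axis=1)

s, t = 0.3, 0.8                    # grid times: indices 60 and 160
Bs, Bt = B[:, 60], B[:, 160]
print(np.mean((Bt - Bs) ** 2))     # ~ |t - s| = 0.5
print(np.mean(Bs * Bt))            # ~ min(s, t) = 0.3
```

The same simulated paths can also be reused to visualize the rough (Hölder-$\gamma$, $\gamma < 1/2$) behaviour of trajectories.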
2.2.2 Constructive argument

In fact, it is possible to directly prove the almost sure continuity of trajectories, thanks to an appropriate choice of the complete orthonormal system of $H = L^2([0,1])$. Indeed, for all $t \in [0,1]$, $n \in \mathbb{N}$ and $0 \le k \le 2^n - 1$, set
$$h_0(t) = 1, \qquad h_n^k(t) = 2^{n/2} \varphi(2^n t - k), \quad \text{with } \varphi = 1_{(0,\frac{1}{2}]} - 1_{(\frac{1}{2},1]}.$$
The family $\bigl(h_0, h_n^k\bigr)_{n \in \mathbb{N},\, 0 \le k \le 2^n - 1}$ is a complete orthonormal system of $L^2([0,1])$: it is called the Haar basis. Define the antiderivatives $H_0(t) = \langle h_0, 1_{[0,t]} \rangle$ and $H_n^k(t) = \langle h_n^k, 1_{[0,t]} \rangle$: the Schauder functions. Using the associated isonormal Gaussian process construction,
$$B_t = \eta H_0(t) + \sum_{n=0}^{\infty} \Bigl( \sum_{k=0}^{2^n - 1} \xi_{n,k} H_n^k(t) \Bigr),$$
as an equality of random variables, for fixed $t \ge 0$, with independent standard Gaussian random variables $(\eta, \xi_{n,k})_{n \in \mathbb{N},\, 0 \le k \le 2^n - 1}$.

Proposition 5. Almost surely, the series converges uniformly for $t \in [0,1]$.

Hence, as uniform limits of continuous functions, the trajectories $t \mapsto B_t$ are continuous, almost surely.

3 Donsker's invariance principle

Let $(X_n)_{n \in \mathbb{N}}$ be a sequence of independent and identically distributed, real-valued, square-integrable random variables. Assume $\mathbb{E}[X_n] = 0$, and let $\sigma^2 = \mathrm{Var}(X_n)$. Assume $\sigma \ne 0$. For any $N \in \mathbb{N}$, define
$$S_t^{(N)} = \frac{1}{\sigma \sqrt{N}} \Bigl( \sum_{n=1}^{\lfloor Nt \rfloor} X_n + \{Nt\}\, X_{\lfloor Nt \rfloor + 1} \Bigr),$$
where $\lfloor s \rfloor \in \mathbb{N}_0$ is the integer part of $s$ and $\{s\} = s - \lfloor s \rfloor \in [0,1)$. For each $N \in \mathbb{N}$, $S^{(N)}$ is a rescaled version of the piecewise linear interpolation of the standard random walk.

Theorem 5. When $N \to \infty$, the $C$-valued random variable $S^{(N)}$ converges to Brownian Motion, in distribution in $C$. This means that for any bounded continuous function $F : C \to \mathbb{R}$,
$$\mathbb{E}\bigl[F\bigl(S^{(N)}\bigr)\bigr] \underset{N \to \infty}{\longrightarrow} \mathbb{E}[F(B)] = \int F\, dW_0.$$
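Theorem 5 can be observed numerically. The sketch below (an illustration, not part of the notes) uses iid steps uniform on $[-\sqrt{3}, \sqrt{3}]$, so that $\mathbb{E}[X_n] = 0$ and $\sigma = 1$, and compares the law of $S_1^{(N)}$ with that of $B_1 \sim \mathcal{N}(0,1)$:

```python
import numpy as np

# Rescaled random walk S^(N) of Theorem 5, with iid steps uniform on
# [-sqrt(3), sqrt(3)], an illustrative choice giving mean 0 and sigma = 1.
rng = np.random.default_rng(1)
n_paths, N = 50_000, 256
steps = rng.uniform(-np.sqrt(3.0), np.sqrt(3.0), size=(n_paths, N))
S1 = steps.sum(axis=1) / np.sqrt(N)   # S^(N)_1; no interpolation needed at t = 1

print(S1.mean(), S1.var())            # ~ 0 and ~ 1, matching B_1 ~ N(0, 1)
print(np.mean(S1 <= 1.0))             # ~ Phi(1) = 0.8413
```

Convergence in distribution in $C$ is of course stronger than this one-dimensional check; path functionals such as $\sup_{t \le 1} S_t^{(N)}$ converge as well.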
4 Some selected properties of Brownian Motion

4.1 Sample path properties

In other words: almost sure properties of the trajectories of Brownian Motion.

4.1.1 Law of Large Numbers

Theorem 6. Let $(B_t)_{t \in \mathbb{R}_+}$ be a Brownian Motion. Then almost surely
$$\frac{B_t}{t} \underset{t \to \infty}{\longrightarrow} 0.$$

Corollary 7 (Time-inversion). Set $\widetilde{B}_0 = 0$ and, for $t \in (0,\infty)$, $\widetilde{B}_t = t B_{1/t}$. Then $(\widetilde{B}_t)_{t \in \mathbb{R}_+}$ is a Brownian Motion.

4.1.2 Hitting time

For all $a \in (0,\infty)$, set $T_a = \inf\{t \ge 0;\ B_t = a\}$.

Proposition 6. For any $a \in (0,\infty)$, $T_a$ is a stopping time. Moreover, $T_a < \infty$ almost surely. Finally, for every $\lambda \in \mathbb{R}_+$,
$$\mathbb{E}\bigl[e^{-\lambda T_a}\bigr] = e^{-\sqrt{2\lambda}\, a}.$$

Sketch of proof: since trajectories are almost surely continuous, for every $t \ge 0$,
$$\{T_a \le t\} = \bigcap_{\epsilon \in \mathbb{Q} \cap (0,\infty)} \ \bigcup_{\tau \in \mathbb{Q} \cap [0,t]} \{B_\tau \ge a - \epsilon\} \in \mathcal{F}_t.$$
Then apply the optional stopping theorem (at time $t \wedge T_a$) to the martingale $\bigl(\exp(\theta B_t - \frac{\theta^2}{2} t)\bigr)_{t \ge 0}$, for every $\theta \in (0,\infty)$. Let first $t \to \infty$, then $\theta \to 0$: this yields $\mathbb{P}(T_a < \infty) = 1$, and the choice $\theta = \sqrt{2\lambda}$ gives the Laplace transform.

4.1.3 Supremum and infimum

For every $t \in \mathbb{R}_+$, define
$$S_t = \sup_{0 \le s \le t} B_s.$$

Proposition 7. For every $t > 0$, $\mathbb{P}(S_t > 0) = 1$.

Sketch of proof: consider $\mathbb{P}(S_t > \frac{1}{n})$. Use the scaling property of Brownian Motion and $T_1 < \infty$ almost surely, and let $n \to \infty$.

Corollary 8. Almost surely, for every $t \in (0,\infty)$, $\sup_{0 \le s \le t} B_s > 0$ and $\inf_{0 \le s \le t} B_s < 0$.

Corollary 9. Almost surely, on any interval, $t \mapsto B_t$ is not monotonic.
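The Laplace transform of Proposition 6 can be checked numerically without simulating whole paths: the reflection principle of Section 4.3.2 below yields $\mathbb{P}(T_a \le t) = \mathbb{P}(|B_t| \ge a)$, i.e. $T_a$ has the same law as $a^2/Z^2$ with $Z \sim \mathcal{N}(0,1)$. A sketch (illustrative, not part of the notes):

```python
import numpy as np

# T_a has the law of a^2 / Z^2, Z standard normal (a consequence of the
# reflection principle); check E[exp(-lambda T_a)] = exp(-sqrt(2 lambda) a).
rng = np.random.default_rng(2)
a, lam = 1.0, 1.0
Ta = a**2 / rng.normal(size=1_000_000)**2
print(np.mean(np.exp(-lam * Ta)))   # ~ exp(-sqrt(2)) = 0.2431
print(Ta.mean())                    # very large: E[T_a] = +infinity (heavy tail)
```

The unstable sample mean in the last line reflects the heavy tail of $T_a$ (see Corollary 15 below: the density decays like $t^{-3/2}$).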
Proposition 8. Almost surely, $\limsup_{t \to \infty} B_t = +\infty$ and $\liminf_{t \to \infty} B_t = -\infty$.

Corollary 10. Let $t_0 \in \mathbb{R}_+$. Almost surely,
$$\limsup_{h \to 0} \frac{|B_{t_0 + h} - B_{t_0}|}{h} = \infty.$$
In particular, almost surely the trajectories are not differentiable at time $t_0$.

The statement that almost surely trajectories are nowhere differentiable also holds true, but the proof requires more subtle arguments.

4.1.4 Zeros

Proposition 9. Define the random set $\chi = \{t \ge 0;\ B_t = 0\}$. Almost surely, $\chi$ is closed and unbounded, $\chi$ has 0 Lebesgue measure, and 0 is an accumulation point of $\chi$.

4.2 Quadratic variation

For a locally finite subdivision $\pi = \{t_0 = 0 < t_1 < \dots < t_i < \dots\}$, define
$$V_t^{\pi} = \sum_{t_i \in \pi \cap [0,t]} \bigl(B_{t_{i+1}} - B_{t_i}\bigr)^2.$$

Theorem 11. When $|\pi| = \sup_i |t_{i+1} - t_i| \to 0$, then $V_t^{\pi} \to t$, in probability and in $L^2$, for every $t \ge 0$. Moreover, if $(\pi_k)_{k \in \mathbb{N}}$ is a sequence of subdivisions such that $\sum_{k \in \mathbb{N}} |\pi_k| < \infty$, then the convergence holds in the almost sure sense.

Corollary 12. Let $\gamma > \frac{1}{2}$ and $0 \le T_1 < T_2$. Almost surely, trajectories of Brownian Motion are not $\gamma$-Hölder continuous on $[T_1, T_2]$.

More generally:

Definition 9. Let $(X(t))_{t \ge 0}$ be a stochastic process. It is of finite quadratic variation if there exists a finite process $(\langle X \rangle_t)_{t \ge 0}$ such that for every $t \ge 0$,
$$V_t^{\pi}(X) \underset{|\pi| \to 0}{\longrightarrow} \langle X \rangle_t, \quad \text{in probability, with } V_t^{\pi}(X) = \sum_{t_i \in \pi \cap [0,t]} \bigl(X_{t_{i+1}} - X_{t_i}\bigr)^2.$$

Hence Brownian Motion is of finite quadratic variation, with $\langle B \rangle_t = t$.
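Theorem 11 is easy to observe in simulation. A sketch (illustrative, not part of the notes) computing $V_1^{\pi}$ for one simulated path along dyadic partitions of $[0,1]$:

```python
import numpy as np

# Quadratic variation of one simulated path along dyadic partitions of [0, 1]:
# by Theorem 11, V^pi_1 approaches t = 1 as the mesh |pi| goes to 0.
rng = np.random.default_rng(3)
n = 2**16
dB = rng.normal(0.0, np.sqrt(1.0 / n), size=n)   # increments on the finest grid
B = np.concatenate([[0.0], np.cumsum(dB)])

for k in (4, 8, 12, 16):                         # partition into 2^k intervals
    V = np.sum(np.diff(B[:: 2 ** (16 - k)]) ** 2)
    print(2**k, V)                               # V gets close to 1 as k grows
```

For coarse partitions the estimate is noisy (its variance is of order $2/2^k$); refining the mesh makes it concentrate around $t = 1$.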
Proposition 10. For every $t \ge 0$, one has the following convergence, in $L^2$:
$$\sum_{t_i \in \pi \cap [0,t]} 2 B_{t_i} \bigl(B_{t_{i+1}} - B_{t_i}\bigr) \underset{|\pi| \to 0}{\longrightarrow} B_t^2 - t.$$
However, observe that
$$\sum_{t_i \in \pi \cap [0,t]} \bigl[B_{t_i} + B_{t_{i+1}}\bigr] \bigl(B_{t_{i+1}} - B_{t_i}\bigr) \underset{|\pi| \to 0}{\longrightarrow} B_t^2.$$
The stochastic integral with respect to Brownian Motion is a subtle object.

4.3 Some applications of the Markov property

4.3.1 Blumenthal 0-1 law

Theorem 13 (Blumenthal). Let $\mathcal{F}_t = \sigma(B_s,\ 0 \le s \le t)$, and $\mathcal{F}_0^+ = \bigcap_{t > 0} \mathcal{F}_t$. The σ-field $\mathcal{F}_0^+$ is trivial: for every $A \in \mathcal{F}_0^+$, $\mathbb{P}(A) \in \{0,1\}$.

Sketch of proof. One shows that $A$ is independent of $(B_{t_1}, \dots, B_{t_N})$, for all $0 < t_1 < \dots < t_N$. For all $\epsilon < t_1$, $A \in \mathcal{F}_\epsilon$, and $(B_{t_1} - B_\epsilon, \dots, B_{t_N} - B_\epsilon)$ is independent of $\mathcal{F}_\epsilon$. Pass to the limit $\epsilon \to 0$. One obtains $\mathbb{P}(A) = \mathbb{P}(A)^2$, thus $\mathbb{P}(A) \in \{0,1\}$.

4.3.2 Reflection principle

Proposition 11. For any $a \in (0,\infty)$ and $t \in \mathbb{R}_+$,
$$\mathbb{P}(S_t \ge a) = \mathbb{P}(T_a \le t) = 2\,\mathbb{P}(B_t \ge a) = \mathbb{P}(|B_t| \ge a),$$
with $S_t = \sup_{0 \le s \le t} B_s$.

Remark 14. The equality $S_t \overset{(d)}{=} |B_t|$ holds only for a single time $t$; it is not an equality in law for the processes: trajectories $t \mapsto S_t$ are almost surely non-decreasing.

Sketch of proof. Since $\mathbb{P}(B_t = a) = 0$,
$$\mathbb{P}(S_t \ge a) = \mathbb{P}(S_t \ge a,\ B_t > a) + \mathbb{P}(S_t \ge a,\ B_t < a),$$
and
$$\mathbb{P}(S_t \ge a,\ B_t < a) = \mathbb{P}\bigl(T_a \le t,\ B_{T_a + (t - T_a)} - B_{T_a} < 0\bigr) = \frac{\mathbb{P}(T_a \le t)}{2},$$
thanks to the strong Markov property at $T_a < \infty$.
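A Monte Carlo sketch of Proposition 11 (illustrative, not part of the notes; discrete monitoring slightly underestimates the supremum $S_t$, so the first estimate sits a little below the second):

```python
import numpy as np

# Check P(S_t >= a) = 2 P(B_t >= a) (Proposition 11), with t = a = 1.
rng = np.random.default_rng(4)
n_paths, n_steps = 20_000, 1_000
dt = 1.0 / n_steps
B = np.zeros(n_paths)   # current position of each path
S = np.zeros(n_paths)   # running maximum of each path
for _ in range(n_steps):
    B += rng.normal(0.0, np.sqrt(dt), size=n_paths)
    np.maximum(S, B, out=S)

print(np.mean(S >= 1.0))       # ~ 0.317 (slightly below, from discretization)
print(2 * np.mean(B >= 1.0))   # ~ 2 (1 - Phi(1)) = 0.317
```

The running maximum is updated step by step rather than storing whole paths, which keeps the memory footprint constant in the number of time steps.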
Corollary 15. For every $a \in (0,\infty)$, $T_a$ admits the density
$$f_a(t) = \frac{a}{\sqrt{2\pi t^3}} \exp\Bigl(-\frac{a^2}{2t}\Bigr) 1_{t > 0}.$$
In particular, $\mathbb{E}[T_a] = \infty$.

Remark 16. Be careful: by the martingale property, $\mathbb{E}\bigl[B_{t \wedge T_a}^2 - (t \wedge T_a)\bigr] = 0$, but taking the limit $t \to \infty$ is not useful. On the contrary, one may prove $\mathbb{E}[T_a \wedge T_{-a}] = a^2$ with this strategy, where $T_{-a} = \inf\{t \ge 0;\ B_t = -a\}$, so that $T_a \wedge T_{-a} = \inf\{t \ge 0;\ |B_t| = a\}$.

4.3.3 Zeros

Proposition 12. The set $\chi = \{t \ge 0;\ B_t = 0\}$ has no isolated point, almost surely. In particular, $\chi$ is not countable.

Sketch of proof. For every $q \in \mathbb{Q}_+$, set $d_q = \inf\{t > q;\ B_t = 0\}$. Note that $d_q \in \chi$, and that $d_q$ is a stopping time. Thanks to the strong Markov property, $d_q$ is an accumulation point of $\chi$ (i.e. a limit of points in $\chi \setminus \{d_q\}$). Set $N = \bigcup_{q \in \mathbb{Q}_+} \{d_q \text{ is not an accumulation point of } \chi\}$. Then $\mathbb{P}(N) = 0$.

On $\Omega \setminus N$, any point $h \in \chi$ is a limit of points in $\chi \setminus \{h\}$: indeed, consider a non-decreasing sequence $q_n \in \mathbb{Q}_+$ with $q_n \uparrow h$. Then either $h = d_{q_n}$ for some $n$, and $d_{q_n}$ is an accumulation point; or $d_{q_n} < h$ for all $n$, with $d_{q_n} \in \chi$ and $d_{q_n} \to h$.

5 PDEs and Brownian Motion

5.1 The heat equation on $\mathbb{R}^d$

Consider the PDE
$$\begin{cases} \dfrac{\partial u(t,x)}{\partial t} = \dfrac{1}{2} \Delta u(t,x) = \dfrac{1}{2} \displaystyle\sum_{i=1}^{d} \dfrac{\partial^2 u(t,x)}{\partial x_i^2}, & t > 0,\ x \in \mathbb{R}^d, \\ u(0,x) = f(x), & x \in \mathbb{R}^d. \end{cases}$$
Assume that the initial condition $f : \mathbb{R}^d \to \mathbb{R}$ is bounded and continuous. The initial condition is interpreted in the following sense: $u(t, \cdot) \underset{t \to 0}{\longrightarrow} f(\cdot)$, uniformly on compact sets.

Theorem 17. Define
$$u(t,x) = \int_{\mathbb{R}^d} f(y)\, p(t,x,y)\, dy = \frac{1}{(2\pi t)^{d/2}} \int_{\mathbb{R}^d} f(y)\, e^{-\frac{|y-x|^2}{2t}}\, dy.$$
Then $u$ is of class $C^{\infty}$ on $(0,\infty) \times \mathbb{R}^d$. Moreover, it is a solution of the heat equation, with initial condition $f$. This solution admits a probabilistic interpretation:
$$u(t,x) = \mathbb{E}[f(x + B_t)] = \mathbb{E}_x[f(B_t)].$$

5.2 A martingale

Theorem 18. Let $v : (t,x) \in \mathbb{R}_+ \times \mathbb{R}^d \mapsto v(t,x) \in \mathbb{R}$ be of class $C_b^{1,2}$; this means:
- $v$ is continuously differentiable with respect to the time variable $t$,
- $v$ is twice continuously differentiable with respect to the space variable $x$,
- $v$ and the associated derivatives $\frac{\partial v}{\partial t}, \frac{\partial v}{\partial x_i}, \frac{\partial^2 v}{\partial x_i \partial x_j}$ are bounded.

Define, for all $t \ge 0$,
$$M_t = v(t, B_t) - v(0, B_0) - \int_0^t \Bigl( \frac{\partial}{\partial s} + \frac{1}{2} \Delta \Bigr) v(s, B_s)\, ds.$$
Then $(M_t)_{t \ge 0}$ is a continuous martingale.

A probabilistic proof will be obtained using the so-called Itô formula. An analytic proof, using the Markov property of Brownian Motion and properties of the heat kernel, can also be performed.

5.3 Some consequences

5.3.1 The heat equation on $\mathbb{R}^d$ returns

Let $u$ be a solution of the heat equation, of class $C_b^{1,2}$. Let $T > 0$ be given, and define $v(t, \cdot) = u(T - t, \cdot)$ for $t \in [0,T]$. Then $v$ satisfies $\frac{\partial v}{\partial t} + \frac{1}{2} \Delta v = 0$. Thus $\bigl(v(t, B_t) - v(0, B_0)\bigr)_{t \in [0,T]}$ is a martingale; in particular,
$$u(T,x) = v(0,x) = \mathbb{E}_x[v(0, B_0)] = \mathbb{E}_x[v(T, B_T)] = \mathbb{E}_x[u(0, B_T)],$$
where $\mathbb{E}_x$ means that $B_0 = x$ almost surely.

5.3.2 Harmonic functions

Definition 10. Let $D \subset \mathbb{R}^d$ be an open domain. A function $u : D \to \mathbb{R}$ is called harmonic if it is of class $C^2$ and satisfies $\Delta u = 0$ on $D$.
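The probabilistic representation of Theorem 17 can be checked directly in dimension one: with $f(x) = \cos(x)$, the exact solution of $\partial_t u = \frac12 u''$ is $u(t,x) = e^{-t/2} \cos(x)$. An illustrative Monte Carlo sketch (not part of the notes; $t$, $x$ and the sample size are arbitrary):

```python
import numpy as np

# One-dimensional check of u(t, x) = E[f(x + B_t)] (Theorem 17) with
# f(x) = cos(x), for which the exact solution is u(t, x) = exp(-t/2) cos(x).
rng = np.random.default_rng(5)
t, x = 0.7, 0.4
mc = np.mean(np.cos(x + rng.normal(0.0, np.sqrt(t), size=1_000_000)))
print(mc)                          # Monte Carlo estimate
print(np.exp(-t / 2) * np.cos(x))  # exact value, ~ 0.6491
```

The same representation underlies Monte Carlo solvers for the heat equation in high dimension, where grid-based methods become impractical.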
Definition 11. Let $D \subset \mathbb{R}^d$ be an open domain. A function $u : D \to \mathbb{R}$ satisfies the mean value property if
$$u(a) = \int_{\partial B(a,r)} u(x)\, d\mu_{a,r}(x)$$
for every $a \in D$ and $r > 0$ such that the closed ball $\overline{B}(a,r) \subset D$, where $\mu_{a,r}$ is the uniform distribution on the sphere $\partial B(a,r)$.

Theorem 19. Assume that $u : D \to \mathbb{R}$ is harmonic. Then $\bigl(u(B_t) - u(B_0)\bigr)_{t \ge 0}$ is a centered continuous martingale.

Theorem 20. Let $\overline{B}(a,r) \subset D$. Assume $B_0 = a$, and let $\tau_{a,r} = \inf\{t > 0;\ B_t \notin B(a,r)\}$. Then $\tau_{a,r} < \infty$ almost surely, and $B_{\tau_{a,r}} \sim \mu_{a,r}$ is uniformly distributed on the sphere.

Corollary 21. If a function is harmonic, then it satisfies the mean value property.

5.3.3 The Dirichlet problem

Let $D \subset \mathbb{R}^d$ be an open bounded domain, and let $f : \partial D \to \mathbb{R}$ be a continuous function.

Definition 12. A function $u : \overline{D} \to \mathbb{R}$ is a solution of the Dirichlet problem $(D, f)$ if $u$ is continuous on $\overline{D}$, $u$ is of class $C^2$ on $D$, and $u$ is a solution of the PDE
$$\begin{cases} \Delta u(x) = 0, & x \in D, \\ u(x) = f(x), & x \in \partial D. \end{cases}$$

Theorem 22. Assume that $u$ is a solution of the Dirichlet problem $(D, f)$. Then for every $x \in D$,
$$u(x) = \mathbb{E}_x\bigl[f(B_{\tau_D})\bigr], \quad \text{with } \tau_D = \inf\{t \ge 0;\ B_t \notin D\}.$$
In addition, $\tau_D < \infty$ almost surely. Conversely, under an additional regularity assumption on the domain $D$, the function $x \mapsto \mathbb{E}_x[f(B_{\tau_D})]$ is a solution of the Dirichlet problem $(D, f)$.

5.3.4 Exit problem

Let $(B_t)_{t \in \mathbb{R}_+}$ be a one-dimensional Brownian Motion, and let $a, b > 0$. Let $T_{a,b} = \inf\{t \ge 0;\ B_t \notin (-a, b)\}$. Recall that almost surely, $T_{a,b} < \infty$.

Proposition 13. One has
$$\mathbb{P}(B_{T_{a,b}} = b) = \frac{a}{a+b}, \qquad \mathbb{P}(B_{T_{a,b}} = -a) = \frac{b}{a+b}, \qquad \mathbb{E}[T_{a,b}] = ab.$$
Sketch of proof: the function $u(x) = \frac{x+a}{b+a}$ solves the PDE
$$\begin{cases} \Delta u(x) = 0, & x \in (-a, b), \\ u(-a) = 0,\ u(b) = 1, \end{cases}$$
and the function $v(x) = \frac{(x+a)(b-x)}{2}$ solves the PDE
$$\begin{cases} \Delta v(x) = -1, & x \in (-a, b), \\ v(-a) = v(b) = 0. \end{cases}$$
Applying the optional stopping theorem to the associated martingales, $\bigl(u(B_t)\bigr)$ and $\bigl(v(B_t) + \frac{t}{2}\bigr)$, yields the results.
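The exit probabilities and mean exit time of Proposition 13 can be estimated by simulating paths on a fine grid (illustrative sketch, not part of the notes; the time step and sample size are arbitrary, and discrete monitoring introduces a small overshoot bias):

```python
import numpy as np

# Monte Carlo check of Proposition 13 with a = 1, b = 2: the upper boundary
# is hit first with probability a/(a+b) = 1/3, and E[T_{a,b}] = ab = 2.
rng = np.random.default_rng(6)
n_paths, dt = 10_000, 1e-3
a, b = 1.0, 2.0
B = np.zeros(n_paths)                    # current position of each path
T = np.zeros(n_paths)                    # elapsed time of each path
alive = np.ones(n_paths, dtype=bool)     # paths still inside (-a, b)
while alive.any():
    B[alive] += rng.normal(0.0, np.sqrt(dt), size=int(alive.sum()))
    T[alive] += dt
    alive &= (B > -a) & (B < b)          # freeze paths at their exit position

print(np.mean(B >= b))   # ~ a/(a+b) = 0.333
print(T.mean())          # ~ ab = 2.0
```

Only the paths still inside the interval are advanced at each step, so the loop speeds up as paths exit; the estimates are slightly biased upward because the discrete path can overshoot the boundaries.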
More informationA Short Introduction to Diffusion Processes and Ito Calculus
A Short Introduction to Diffusion Processes and Ito Calculus Cédric Archambeau University College, London Center for Computational Statistics and Machine Learning c.archambeau@cs.ucl.ac.uk January 24,
More informationI forgot to mention last time: in the Ito formula for two standard processes, putting
I forgot to mention last time: in the Ito formula for two standard processes, putting dx t = a t dt + b t db t dy t = α t dt + β t db t, and taking f(x, y = xy, one has f x = y, f y = x, and f xx = f yy
More informationBrownian Motion and Conditional Probability
Math 561: Theory of Probability (Spring 2018) Week 10 Brownian Motion and Conditional Probability 10.1 Standard Brownian Motion (SBM) Brownian motion is a stochastic process with both practical and theoretical
More informationConvergence at first and second order of some approximations of stochastic integrals
Convergence at first and second order of some approximations of stochastic integrals Bérard Bergery Blandine, Vallois Pierre IECN, Nancy-Université, CNRS, INRIA, Boulevard des Aiguillettes B.P. 239 F-5456
More informationContinuous martingales and stochastic calculus
Continuous martingales and stochastic calculus Alison Etheridge January 8, 218 Contents 1 Introduction 3 2 An overview of Gaussian variables and processes 5 2.1 Gaussian variables.........................
More informationStochastic Integration.
Chapter Stochastic Integration..1 Brownian Motion as a Martingale P is the Wiener measure on (Ω, B) where Ω = C, T B is the Borel σ-field on Ω. In addition we denote by B t the σ-field generated by x(s)
More information1.1 Definition of BM and its finite-dimensional distributions
1 Brownian motion Brownian motion as a physical phenomenon was discovered by botanist Robert Brown as he observed a chaotic motion of particles suspended in water. The rigorous mathematical model of BM
More informationChapter 1. Poisson processes. 1.1 Definitions
Chapter 1 Poisson processes 1.1 Definitions Let (, F, P) be a probability space. A filtration is a collection of -fields F t contained in F such that F s F t whenever s
More informationThe Lévy-Itô decomposition and the Lévy-Khintchine formula in31 themarch dual of 2014 a nuclear 1 space. / 20
The Lévy-Itô decomposition and the Lévy-Khintchine formula in the dual of a nuclear space. Christian Fonseca-Mora School of Mathematics and Statistics, University of Sheffield, UK Talk at "Stochastic Processes
More informationStochastic Calculus. Michael R. Tehranchi
Stochastic Calculus Michael R. Tehranchi Contents Chapter 1. A possible motivation: diffusions 5 1. Markov chains 5 2. Continuous-time Markov processes 6 3. Stochastic differential equations 6 4. Markov
More informationOPTIMAL SOLUTIONS TO STOCHASTIC DIFFERENTIAL INCLUSIONS
APPLICATIONES MATHEMATICAE 29,4 (22), pp. 387 398 Mariusz Michta (Zielona Góra) OPTIMAL SOLUTIONS TO STOCHASTIC DIFFERENTIAL INCLUSIONS Abstract. A martingale problem approach is used first to analyze
More informationDoléans measures. Appendix C. C.1 Introduction
Appendix C Doléans measures C.1 Introduction Once again all random processes will live on a fixed probability space (Ω, F, P equipped with a filtration {F t : 0 t 1}. We should probably assume the filtration
More informationA D VA N C E D P R O B A B I L - I T Y
A N D R E W T U L L O C H A D VA N C E D P R O B A B I L - I T Y T R I N I T Y C O L L E G E T H E U N I V E R S I T Y O F C A M B R I D G E Contents 1 Conditional Expectation 5 1.1 Discrete Case 6 1.2
More informationJump Processes. Richard F. Bass
Jump Processes Richard F. Bass ii c Copyright 214 Richard F. Bass Contents 1 Poisson processes 1 1.1 Definitions............................. 1 1.2 Stopping times.......................... 3 1.3 Markov
More informationBernardo D Auria Stochastic Processes /12. Notes. March 29 th, 2012
1 Stochastic Calculus Notes March 9 th, 1 In 19, Bachelier proposed for the Paris stock exchange a model for the fluctuations affecting the price X(t) of an asset that was given by the Brownian motion.
More informationSolution for Problem 7.1. We argue by contradiction. If the limit were not infinite, then since τ M (ω) is nondecreasing we would have
362 Problem Hints and Solutions sup g n (ω, t) g(ω, t) sup g(ω, s) g(ω, t) µ n (ω). t T s,t: s t 1/n By the uniform continuity of t g(ω, t) on [, T], one has for each ω that µ n (ω) as n. Two applications
More information4 Sums of Independent Random Variables
4 Sums of Independent Random Variables Standing Assumptions: Assume throughout this section that (,F,P) is a fixed probability space and that X 1, X 2, X 3,... are independent real-valued random variables
More informationConvergence of Feller Processes
Chapter 15 Convergence of Feller Processes This chapter looks at the convergence of sequences of Feller processes to a iting process. Section 15.1 lays some ground work concerning weak convergence of processes
More informationERRATA: Probabilistic Techniques in Analysis
ERRATA: Probabilistic Techniques in Analysis ERRATA 1 Updated April 25, 26 Page 3, line 13. A 1,..., A n are independent if P(A i1 A ij ) = P(A 1 ) P(A ij ) for every subset {i 1,..., i j } of {1,...,
More informationExponential martingales: uniform integrability results and applications to point processes
Exponential martingales: uniform integrability results and applications to point processes Alexander Sokol Department of Mathematical Sciences, University of Copenhagen 26 September, 2012 1 / 39 Agenda
More informationMetric Spaces and Topology
Chapter 2 Metric Spaces and Topology From an engineering perspective, the most important way to construct a topology on a set is to define the topology in terms of a metric on the set. This approach underlies
More informationRandom Process Lecture 1. Fundamentals of Probability
Random Process Lecture 1. Fundamentals of Probability Husheng Li Min Kao Department of Electrical Engineering and Computer Science University of Tennessee, Knoxville Spring, 2016 1/43 Outline 2/43 1 Syllabus
More information(A n + B n + 1) A n + B n
344 Problem Hints and Solutions Solution for Problem 2.10. To calculate E(M n+1 F n ), first note that M n+1 is equal to (A n +1)/(A n +B n +1) with probability M n = A n /(A n +B n ) and M n+1 equals
More informationPart II Probability and Measure
Part II Probability and Measure Theorems Based on lectures by J. Miller Notes taken by Dexter Chua Michaelmas 2016 These notes are not endorsed by the lecturers, and I have modified them (often significantly)
More informationStrong uniqueness for stochastic evolution equations with possibly unbounded measurable drift term
1 Strong uniqueness for stochastic evolution equations with possibly unbounded measurable drift term Enrico Priola Torino (Italy) Joint work with G. Da Prato, F. Flandoli and M. Röckner Stochastic Processes
More informationBROWNIAN MOTION AND LIOUVILLE S THEOREM
BROWNIAN MOTION AND LIOUVILLE S THEOREM CHEN HUI GEORGE TEO Abstract. Probability theory has many deep and surprising connections with the theory of partial differential equations. We explore one such
More informationBernardo D Auria Stochastic Processes /10. Notes. Abril 13 th, 2010
1 Stochastic Calculus Notes Abril 13 th, 1 As we have seen in previous lessons, the stochastic integral with respect to the Brownian motion shows a behavior different from the classical Riemann-Stieltjes
More informationlim n C1/n n := ρ. [f(y) f(x)], y x =1 [f(x) f(y)] [g(x) g(y)]. (x,y) E A E(f, f),
1 Part I Exercise 1.1. Let C n denote the number of self-avoiding random walks starting at the origin in Z of length n. 1. Show that (Hint: Use C n+m C n C m.) lim n C1/n n = inf n C1/n n := ρ.. Show that
More informationEmpirical Processes: General Weak Convergence Theory
Empirical Processes: General Weak Convergence Theory Moulinath Banerjee May 18, 2010 1 Extended Weak Convergence The lack of measurability of the empirical process with respect to the sigma-field generated
More informationA Fourier analysis based approach of rough integration
A Fourier analysis based approach of rough integration Massimiliano Gubinelli Peter Imkeller Nicolas Perkowski Université Paris-Dauphine Humboldt-Universität zu Berlin Le Mans, October 7, 215 Conference
More informationFE 5204 Stochastic Differential Equations
Instructor: Jim Zhu e-mail:zhu@wmich.edu http://homepages.wmich.edu/ zhu/ January 20, 2009 Preliminaries for dealing with continuous random processes. Brownian motions. Our main reference for this lecture
More informationRegularity of the density for the stochastic heat equation
Regularity of the density for the stochastic heat equation Carl Mueller 1 Department of Mathematics University of Rochester Rochester, NY 15627 USA email: cmlr@math.rochester.edu David Nualart 2 Department
More informationLecture 19 L 2 -Stochastic integration
Lecture 19: L 2 -Stochastic integration 1 of 12 Course: Theory of Probability II Term: Spring 215 Instructor: Gordan Zitkovic Lecture 19 L 2 -Stochastic integration The stochastic integral for processes
More informationDefinition: Lévy Process. Lectures on Lévy Processes and Stochastic Calculus, Braunschweig, Lecture 2: Lévy Processes. Theorem
Definition: Lévy Process Lectures on Lévy Processes and Stochastic Calculus, Braunschweig, Lecture 2: Lévy Processes David Applebaum Probability and Statistics Department, University of Sheffield, UK July
More informationLecture 19 : Brownian motion: Path properties I
Lecture 19 : Brownian motion: Path properties I MATH275B - Winter 2012 Lecturer: Sebastien Roch References: [Dur10, Section 8.1], [Lig10, Section 1.5, 1.6], [MP10, Section 1.1, 1.2]. 1 Invariance We begin
More informationLecture 2. We now introduce some fundamental tools in martingale theory, which are useful in controlling the fluctuation of martingales.
Lecture 2 1 Martingales We now introduce some fundamental tools in martingale theory, which are useful in controlling the fluctuation of martingales. 1.1 Doob s inequality We have the following maximal
More informationThe concentration of a drug in blood. Exponential decay. Different realizations. Exponential decay with noise. dc(t) dt.
The concentration of a drug in blood Exponential decay C12 concentration 2 4 6 8 1 C12 concentration 2 4 6 8 1 dc(t) dt = µc(t) C(t) = C()e µt 2 4 6 8 1 12 time in minutes 2 4 6 8 1 12 time in minutes
More informationStochastic Analysis I S.Kotani April 2006
Stochastic Analysis I S.Kotani April 6 To describe time evolution of randomly developing phenomena such as motion of particles in random media, variation of stock prices and so on, we have to treat stochastic
More informationBrownian Motion. Chapter Stochastic Process
Chapter 1 Brownian Motion 1.1 Stochastic Process A stochastic process can be thought of in one of many equivalent ways. We can begin with an underlying probability space (Ω, Σ,P and a real valued stochastic
More informationPart III Stochastic Calculus and Applications
Part III Stochastic Calculus and Applications Based on lectures by R. Bauerschmidt Notes taken by Dexter Chua Lent 218 These notes are not endorsed by the lecturers, and I have modified them often significantly
More informationPoisson random measure: motivation
: motivation The Lévy measure provides the expected number of jumps by time unit, i.e. in a time interval of the form: [t, t + 1], and of a certain size Example: ν([1, )) is the expected number of jumps
More informationTHEOREMS, ETC., FOR MATH 515
THEOREMS, ETC., FOR MATH 515 Proposition 1 (=comment on page 17). If A is an algebra, then any finite union or finite intersection of sets in A is also in A. Proposition 2 (=Proposition 1.1). For every
More informationProbability and Measure
Part II Year 2018 2017 2016 2015 2014 2013 2012 2011 2010 2009 2008 2007 2006 2005 2018 84 Paper 4, Section II 26J Let (X, A) be a measurable space. Let T : X X be a measurable map, and µ a probability
More informationLecture 4: Introduction to stochastic processes and stochastic calculus
Lecture 4: Introduction to stochastic processes and stochastic calculus Cédric Archambeau Centre for Computational Statistics and Machine Learning Department of Computer Science University College London
More information