Introduction to stochastic analysis


A. Guionnet
Department of Mathematics, MIT, 77 Massachusetts Avenue, Cambridge, MA, USA.
guionnet@math.mit.edu

Abstract. These lecture notes are a work in progress, designed for a course giving an introduction to stochastic analysis. They are meant to reflect exactly the content of the course, rather than to propose original material. I am very much indebted to Nadine Guillotin, who lent me the LaTeX files of her own (French) lecture notes, which I have used thoroughly. I also used Nathanael Berestycki's lecture notes, as well as the books by Daniel Revuz and Marc Yor (Continuous Martingales and Brownian Motion), Jean-Francois Le Gall (Mouvement brownien, martingales et calcul stochastique), Ioannis Karatzas and Steven E. Shreve (Brownian Motion and Stochastic Calculus), and Bernt Oksendal (Stochastic Differential Equations).

Contents
Notations, classical (admitted) notions
1. Brownian motion and stochastic processes
2. Processes with independent increments
3. Martingales
4. Stopping times
5. Finite variation processes and the Stieltjes integral
6. Continuous local martingales
7. Stochastic integral
8. Stochastic differential equations (SDE)

9. Appendix

Notations, classical (admitted) notions

A measurable space (Ω, G) is given by
- the sample space Ω, an arbitrary non-empty set,
- the σ-algebra G (also called a σ-field), a set of subsets of Ω such that: G contains the empty set, G is closed under complements (if A ∈ G, then Ω\A ∈ G), and G is closed under countable unions (if A_i ∈ G for all i, then ∪_i A_i ∈ G).

A function f : X → Y between two measurable spaces (X, G) and (Y, G′) is measurable iff for all B ∈ G′, f^{-1}(B) ∈ G. We will use that pointwise limits of measurable functions are measurable (exercise).

Convergence in law: a sequence (µ_n)_{n≥0} of probability measures on a measurable space (Ω, G) converges in law towards a probability measure µ iff for every bounded continuous function F on (Ω, G),
  lim_{n→∞} ∫ F dµ_n = ∫ F dµ.

The monotone convergence theorem asserts that if f_n ≥ 0, f_n ≤ f_{n+1}, grows P-a.s. to f, then
  lim_{n→∞} ∫ f_n dP = ∫ f dP.

The bounded convergence theorem asserts that if (f_n) is a sequence of uniformly bounded functions converging P-a.s. to f, then
  lim_{n→∞} ∫ f_n dP = ∫ f dP.

The Borel–Cantelli lemma states that if (A_n) is a sequence of measurable sets of a measurable space (Ω, G) equipped with a probability measure P such that ∑_n P(A_n^c) < ∞, then P(lim sup_n A_n) = 1, where
  lim sup_n A_n = ∩_n ∪_{p≥n} A_p.

≃ denotes asymptotic equality (in general, A_n ≃ B_n iff A_n − B_n goes to zero, but it can also mean that A_n/B_n goes to one).

1. Brownian motion and stochastic processes

Stochastic process theory is the study of random phenomena depending on a time variable. Perhaps the most famous example is Brownian motion, first described by R. Brown, who observed around 1827 that tiny particles of pollen in water have an extremely erratic motion. Physicists observed that this was due to the large number of random shocks undergone by the particles from the

(much smaller) water molecules in motion in the liquid. A. Einstein established in 1905 the first mathematical basis for Brownian motion, by showing that it must be an isotropic Gaussian process. The first rigorous mathematical construction of Brownian motion is due to N. Wiener in 1923, after the work of L. Bachelier in 1900, who is considered to be the first to have introduced this notion.

1.1. Microscopic approach. In order to motivate the introduction of this object, we first begin with a microscopic depiction of Brownian motion. Suppose (X_n, n ≥ 0) is a sequence of R^d-valued random variables with mean 0 and covariance matrix σ²I, where I is the identity matrix in d dimensions, for some σ² > 0. Namely, if X_1 = (X_1^1, ..., X_1^d), we have
  E[X_1^i] = 0,  E[X_1^i X_1^j] = σ² δ_{ij},  1 ≤ i, j ≤ d.
We interpret X_n as the spatial displacement resulting from the shocks due to water molecules during the n-th time interval, and the fact that the covariance matrix is scalar stands for an isotropy assumption (no direction of space is privileged). From this, we let S_n = X_1 + ... + X_n and we embed this discrete-time process into continuous time by letting
  B_t^{(n)} := (1/√n) S_{[nt]},  t ≥ 0.
Let ‖.‖_2 be the Euclidean norm on R^d and for t > 0 and x ∈ R^d, define
  p_t(x) = (2πt)^{-d/2} exp(−‖x‖_2²/(2t)),
which is the density of the Gaussian distribution N(0, t Id) with mean 0 and covariance matrix t Id. By convention, the Gaussian law N(m, 0) is the Dirac mass at m.

Proposition 1.1. Let 0 < t_1 ≤ t_2 ≤ ... ≤ t_k. Then the finite marginal distributions of B^{(n)} with respect to the times t_1, ..., t_k converge weakly as n goes to infinity. More precisely, if F is a bounded continuous function, and letting x_0 = 0, t_0 = 0,
  lim_{n→∞} E[F(B_{t_1}^{(n)}, ..., B_{t_k}^{(n)})] = ∫ F(x_1, ..., x_k) ∏_{1≤i≤k} p_{σ²(t_i − t_{i−1})}(x_i − x_{i−1}) dx_i.

Proof.
The proof boils down to the central limit theorem: the increments
  B_{t_i}^{(n)} − B_{t_{i−1}}^{(n)} = (1/√n) ∑_{j=[n t_{i−1}]+1}^{[n t_i]} X_j
are independent and converge in law towards centered Gaussian vectors with covariance σ²(t_i − t_{i−1}) Id by the central limit theorem. The latter can be checked by

computing the Fourier transform: for any real parameters ξ_j ∈ R^d,
  E[exp(i ∑_{j=1}^k ξ_j·(B_{t_j}^{(n)} − B_{t_{j−1}}^{(n)}))] = ∏_{j=1}^k E[exp(i ξ_j·(B_{t_j}^{(n)} − B_{t_{j−1}}^{(n)}))],
while
  lim_{n→∞} E[exp(i ξ_j·(B_{t_j}^{(n)} − B_{t_{j−1}}^{(n)}))] = exp(−σ² ‖ξ_j‖_2² (t_j − t_{j−1})/2),
as can easily be checked (at least if the X_i's have a moment of order 2 + ε for some ε > 0), since
  E[exp(i ξ_j·(B_{t_j}^{(n)} − B_{t_{j−1}}^{(n)}))] = ∏_{l=[n t_{j−1}]+1}^{[n t_j]} E[exp(i ξ_j·X_l/√n)] ≈ (1 − σ² ‖ξ_j‖_2²/(2n))^{[n t_j]−[n t_{j−1}]} → exp(−σ² ‖ξ_j‖_2² (t_j − t_{j−1})/2).

This suggests that B^{(n)} should converge to a process B whose increments are independent and Gaussian with covariances dictated by the above formula. The precise sense of this convergence, as well as the state space in which the limit should live, is the object of the next subsections. The limit of B^{(n)} should be described as follows:

Definition 1.2. An R^d-valued stochastic process (B_t, t ≥ 0) is called a standard Brownian motion if it is a continuous process that satisfies the following conditions:
(1) B_0 = 0 a.s.,
(2) for every 0 = t_0 ≤ t_1 ≤ t_2 ≤ ... ≤ t_k, the increments (B_{t_1} − B_{t_0}, B_{t_2} − B_{t_1}, ..., B_{t_k} − B_{t_{k−1}}) are independent,
(3) for every t, s ≥ 0, the law of B_{t+s} − B_t is Gaussian with mean 0 and covariance s Id.

Properties (1), (2), (3) exactly amount to saying that the finite-dimensional marginals of a Brownian motion are given by the formula of Proposition 1.1. Therefore the law of Brownian motion is uniquely determined.

1.2. Equivalent processes, indistinguishable processes. The previous section raises several questions: how can we construct a random continuous process with given marginals? How does it compare to other constructions? How can we speak about the law of Brownian motion? In this section we make all these definitions precise. We will denote throughout by (Ω, G, P) a probability space, T will be the time space, often T = R_+, and (E, E) the measurable state space.
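Definition 1.2 lends itself to direct simulation: sampling independent N(0, dt) increments on a fine grid and cumulating them gives a discrete approximation of a Brownian path. The following sketch (an illustration, not part of the original notes) checks empirically that B_0 = 0 and that B_1 has mean 0 and variance 1, as properties (1) and (3) require.

```python
import numpy as np

# Illustrative sketch: approximate standard 1-d Brownian paths on [0, 1]
# by cumulative sums of independent N(0, dt) increments (Definition 1.2).
rng = np.random.default_rng(0)
n_steps, n_paths = 1_000, 20_000
dt = 1.0 / n_steps

# Each row holds one path's increments B_{(k+1)dt} - B_{k dt} ~ N(0, dt).
inc = rng.normal(0.0, np.sqrt(dt), size=(n_paths, n_steps))
paths = np.concatenate([np.zeros((n_paths, 1)), np.cumsum(inc, axis=1)], axis=1)

B1 = paths[:, -1]                 # samples of B_1, which should be N(0, 1)
print(B1.mean(), B1.var())        # close to the theoretical 0 and 1
```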

Definition 1.3. A stochastic process with values in (E, E) based on (Ω, G, P) is a family (X_t)_{t∈T} of random variables from (Ω, G, P) into (E, E). To any ω ∈ Ω, we associate the map t ∈ T ↦ X_t(ω) ∈ E, called the trajectory of (X_t)_{t∈T} associated with ω.

To simplify, we will hereafter restrict ourselves to the case T = R_+, E = R^d and E = B(R^d). We say that (X_t)_{t∈T} is P-a.s. right (resp. P-a.s. left, resp. P-a.s.) continuous if for almost all ω ∈ Ω, the trajectory of (X_t)_{t∈T} associated with ω is right (resp. left, resp.) continuous.

We will say that two stochastic processes describe the same random phenomenon if they are equivalent in the following sense.

Definition 1.4. Let (X_t)_{t∈T} and (X′_t)_{t∈T} be two processes with values in the same state space (E, E), with (X_t)_{t∈T} (resp. (X′_t)_{t∈T}) based on (Ω, G, P) (resp. (Ω′, G′, P′)). We say that (X_t)_{t∈T} and (X′_t)_{t∈T} are equivalent if for all n ≥ 1, all t_1, ..., t_n ∈ T and all B_1, ..., B_n ∈ E,
  P(X_{t_1} ∈ B_1, ..., X_{t_n} ∈ B_n) = P′(X′_{t_1} ∈ B_1, ..., X′_{t_n} ∈ B_n).
We also say that these processes are versions of each other, or versions of the same process. Note that this defines an equivalence relation.

The family of the random variables (X_{t_1}, ..., X_{t_n}) for t_i ∈ T is called the family of the finite-dimensional marginals of (X_t)_{t∈T}. Two processes are equivalent iff they have the same finite-dimensional marginal distributions. Note however that this does not imply in general that X_t = X′_t for all t simultaneously almost surely, as the set of parameters T is not countable, unless the processes under study possess some regularity. The latter property refers to indistinguishable processes.

Definition 1.5. Two processes (X_t)_{t∈T} and (X′_t)_{t∈T} defined on the same probability space (Ω, G, P) are indistinguishable if
  P(X_t(ω) = X′_t(ω) ∀ t ∈ T) = 1.
Note that, up to indistinguishability, there exists at most one continuous modification of a given process (X_t, t ≥ 0).
We will say that a process X′ is a modification of another process X if, for every t ∈ T, P(X_t = X′_t) = 1. Two indistinguishable processes are modifications of each other, but the converse fails in general.
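The gap between the two notions can be seen on a standard counterexample (not taken from these notes, with modification understood in the usual sense: P(X_t = X′_t) = 1 for each fixed t). On Ω = [0, 1] equipped with Lebesgue measure, let U(ω) = ω and set

```latex
X_t(\omega) = 0, \qquad X'_t(\omega) = \mathbf{1}_{\{U(\omega) = t\}}, \qquad t \in [0,1].
```

For each fixed t, P(X_t = X′_t) = P(U ≠ t) = 1, so X′ is a modification of X; yet every ω satisfies X′_{U(ω)}(ω) = 1 ≠ 0, so P(X_t = X′_t ∀ t) = 0 and the two processes are not indistinguishable. Note that X′ has discontinuous trajectories, consistently with the remark that continuous modifications are unique up to indistinguishability.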

1.3. Kolmogorov's criterion. Kolmogorov's criterion is a fundamental result which guarantees the existence of a continuous version (but not necessarily an indistinguishable one) based solely on an L^p control of the two-dimensional distributions. We will apply it to Brownian motion below, but it is useful in many other contexts.

Theorem 1.6 (Kolmogorov's continuity criterion). Let (X_t)_{t∈R_+} be a stochastic process with values in R^d. Suppose there exist α > 0, β > 0, C > 0 so that
  E[‖X_t − X_s‖^α] ≤ C |t − s|^{1+β}
for some norm ‖.‖ on R^d. Then there exists a modification (X′_t)_{t∈R_+} of (X_t)_{t∈R_+} which is almost surely continuous, and even ε-Hölder for every ε < β/α.

As a direct application we deduce:

Corollary 1.7. If (X_t)_{t∈R_+} is a d-dimensional Brownian motion as in Definition 1.2, there exists a modification (X′_t)_{t∈R_+} of (X_t)_{t∈R_+} with continuous (and even ε-Hölder for every ε < 1/2) trajectories.

Indeed, this follows from the fact that for every integer n,
  E[‖X_t − X_s‖_2^{2n}] = C_n (t − s)^n,
with C_n the 2n-th moment of the norm of a centered d-dimensional Gaussian variable with identity covariance, so that Kolmogorov's theorem applies with α = 2n, β = n − 1, hence with any ε < (n−1)/(2n), which tends to 1/2.

Proof. It is enough to restrict ourselves to T = [0, 1], up to setting
  B_t = ∑_{i=1}^{[t]} B_1^i + B_{t−[t]}^{[t]+1}
with (B^i) independent copies of the Brownian motion on [0, 1]. Let D_n = {k 2^{−n}, 0 ≤ k ≤ 2^n} denote the dyadic numbers of [0, 1] at level n, so that D_n increases with n. Then, letting ε < β/α, Chebyshev's inequality gives for k < 2^n
  P(‖X_{k2^{−n}} − X_{(k+1)2^{−n}}‖ ≥ 2^{−nε}) ≤ 2^{nεα} E[‖X_{k2^{−n}} − X_{(k+1)2^{−n}}‖^α] ≤ C 2^{nεα − n(1+β)}.
Summing over k we deduce that
  P(max_{k < 2^n} ‖X_{k2^{−n}} − X_{(k+1)2^{−n}}‖ ≥ 2^{−nε}) ≤ C 2^n 2^{nεα − n(1+β)} = C 2^{−n(β − εα)},
which is summable. Therefore, the Borel–Cantelli lemma implies that there exists N(ω), almost surely finite, so that for n ≥ N(ω), max_{k < 2^n} ‖X_{k2^{−n}} − X_{(k+1)2^{−n}}‖ ≤ 2^{−nε}. We claim that this implies that for every s, t ∈ D = ∪_n D_n,
  ‖X_s − X_t‖ ≤ M(ω) |s − t|^ε

for some almost surely finite constant M(ω). Indeed, take s, t ∈ D with s ≤ t, so that 2^{−r−1} ≤ t − s ≤ 2^{−r} for some r ≥ N(ω). We can always write the dyadic decompositions of t and s as
  t = k 2^{−r} + ∑_{i=1}^m 2^{−r−i} η_i,   s = k 2^{−r} − ∑_{i=1}^l 2^{−r−i} η′_i,
for some η_i, η′_i ∈ {0, 1}, and set
  t_j = k 2^{−r} + ∑_{i=1}^j 2^{−r−i} η_i,   s_j = k 2^{−r} − ∑_{i=1}^j 2^{−r−i} η′_i.
Since X_t = X_{t_0} + ∑_{i=1}^m (X_{t_i} − X_{t_{i−1}}) with t_0 = s_0 = k 2^{−r}, and similarly for X_s, we deduce from the triangle inequality that
  ‖X_t − X_s‖ ≤ ∑_{i=1}^m ‖X_{t_i} − X_{t_{i−1}}‖ + ∑_{i=1}^l ‖X_{s_i} − X_{s_{i−1}}‖ ≤ ∑_{i=1}^m 2^{−(r+i)ε} + ∑_{i=1}^l 2^{−(r+i)ε} ≤ C(ε) 2^{−rε} ≤ C(ε) |t − s|^ε,
as t − s ≥ 2^{−r−1}. Therefore the process (X_t(ω), t ∈ D) is uniformly continuous, and even ε-Hölder, for all ω such that N(ω) < ∞. Since D is everywhere dense in [0, 1], this process admits a unique continuous extension X′(ω) on [0, 1], which is also ε-Hölder. It is defined by X′_t(ω) = lim_{n→∞} X_{t_n}(ω), where (t_n, n ≥ 0) is any D-valued sequence converging to t. On the exceptional set where (X_d, d ∈ D) is not uniformly continuous (that is, N(ω) = +∞), we let X′_t(ω) = 0, so that X′(ω) is continuous. It remains to show that X′ is a version of X. But by Fatou's lemma, if (t_n) is a sequence of dyadic numbers converging to t, we have
  E[‖X′_t − X_t‖^α] ≤ lim inf E[‖X_{t_n} − X_t‖^α] = 0,
so that X′_t = X_t a.s. for every t, and indeed the finite marginals of X′ coincide with those of X.

From now on we will consider exclusively a continuous modification of Brownian motion, which is unique up to indistinguishability. Hence we have constructed a Brownian motion B which can be seen as a map from a probability space (Ω, P) into the space C(R_+, R) of continuous functions from R_+ into R. The Wiener measure, or law of Brownian motion, is by definition the image of P by this map; it is therefore a probability measure on C(R_+, R). In the next part we study this measure, as a warm-up to what we will soon develop for more general processes.
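Corollary 1.7 rests on the Gaussian moment identity E[|B_t − B_s|^{2n}] = C_n (t − s)^n. A quick Monte Carlo sanity check of the n = 2 case (an illustration, not part of the notes), where C_2 = 3 in dimension one, corresponding to α = 4, β = 1 in Theorem 1.6:

```python
import numpy as np

# Check E[|B_t - B_s|^4] = 3 (t - s)^2 for 1-d Brownian motion: the
# alpha = 4, beta = 1 case of Kolmogorov's criterion (Hölder < 1/4 already;
# taking higher moments pushes the exponent toward 1/2).
rng = np.random.default_rng(1)
s, t, n_samples = 0.3, 0.8, 200_000
inc = rng.normal(0.0, np.sqrt(t - s), size=n_samples)  # B_t - B_s ~ N(0, t - s)
m4 = np.mean(inc**4)
print(m4, 3 * (t - s) ** 2)       # the two values should be close
```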

1.4. Behaviour of Brownian motion trajectories. In this paragraph we are given a Brownian motion (B_t)_{t≥0} : Ω → C(R_+, R) and study the properties of its trajectories.

1.4.1. Generic properties. Here we derive some information on the shape of the trajectories. A very useful result is the so-called Blumenthal 0-1 law, which states as follows.

Lemma 1.8. For all t ≥ 0, let F_t be the σ-algebra generated by {B_s, s ≤ t}, that is, the smallest σ-algebra on Ω making B_s measurable for all s ≤ t. Let F_{0+} = ∩_{s>0} F_s. Then any A ∈ F_{0+} is such that P(A) = 0 or 1.

Proof. Take A ∈ F_{0+}, 0 < t_1 < ... < t_n, and f : R^n → R a bounded continuous function. Then, by continuity, as B_ε goes to zero with ε,
  E[1_A f(B_{t_1}, ..., B_{t_n})] = lim_{ε→0} E[1_A f(B_{t_1} − B_ε, ..., B_{t_n} − B_ε)]
  = lim_{ε→0} P(A) E[f(B_{t_1} − B_ε, ..., B_{t_n} − B_ε)] = P(A) E[f(B_{t_1}, ..., B_{t_n})],
where we used the Markov property (see the second point in Definition 1.2): for ε < t_1, A ∈ F_ε is independent of the increments (B_{t_i} − B_ε). Hence F_{0+} is independent of σ(B_{t_1}, ..., B_{t_n}), and thus of σ(B_s, s > 0). Finally, σ(B_s, s > 0) = σ(B_s, s ≥ 0), as B_0 is the limit of B_t as t goes to zero. On the other hand, F_{0+} ⊂ σ(B_s, s ≥ 0), and therefore we have proved that F_{0+} is independent of itself, which yields the result.

As a corollary, we derive the following property.

Proposition 1.9. We almost surely have, for all ε > 0,
  sup_{s≤ε} B_s > 0,   inf_{s≤ε} B_s < 0.
For all a ∈ R, let T_a = inf{t ≥ 0 : B_t = a} (with the convention that this is infinite if {t : B_t = a} = ∅). Then almost surely T_a is finite for all a ∈ R. As a consequence,
  lim inf_{s→∞} B_s = −∞,   lim sup_{s→∞} B_s = +∞.

Proof. Note that sup_{0≤s≤ε} B_s is measurable: as B_s is continuous, sup_{s∈[0,ε]} B_s = sup_{s∈[0,ε]∩Q} B_s. This type of argument will be repeated in many places hereafter. We put, for some sequence ε_p going to zero with p,
  A = ∩_p {sup_{s≤ε_p} B_s > 0}.
A belongs to F_{0+} as a decreasing intersection of events in F_{ε_p}, and
  P(A) = lim_p P(sup_{s≤ε_p} B_s > 0) ≥ lim_p P(B_{ε_p} > 0) ≥ 1/2,

implying by Blumenthal's law that P(A) = 1. By changing B into −B we obtain the statement for the infimum. To prove the last result, observe that we have proved
  1 = P(sup_{s≤1} B_s > 0) = lim_{δ↓0} P(sup_{s≤1} B_s > δ) = lim_{δ↓0} P(sup_{s≤1} B_{sδ^{−4}} > δ^{−1}),
where we used that (c B_{t/c²}, t ≥ 0) has the law of Brownian motion for any c > 0 (see Exercise 2.19), here with c = δ². But
  P(sup_{s≤1} B_{sδ^{−4}} > δ^{−1}) = P(sup_{s≤δ^{−4}} B_s > δ^{−1}) ≤ P(sup_{s≥0} B_s > δ^{−1}),
and hence we conclude that
  P(sup_{s≥0} B_s > δ^{−1}) = 1
for all δ > 0. The same result for the infimum is derived by replacing B by −B. The fact that T_a is almost surely finite follows from the continuity of the trajectories: B takes all values in (lim inf B_s, lim sup B_s) = (−∞, +∞).

1.4.2. Regularity. Note that Corollary 1.7 is in fact optimal, in the sense that:

Theorem 1.10. Let B be a continuous modification of Brownian motion, and let γ > 1/2. Then
  P(∀ t ≥ 0 : lim sup_{h→0+} |B_{t+h} − B_t|/h^γ = +∞) = 1.

Proof. We first observe that
  {∃ t ≥ 0 : lim sup_{h→0+} |B_{t+h} − B_t|/h^γ < +∞} ⊂ ∪_{p,k,m=1}^∞ {∃ t ∈ [0, m] : |B_{t+h} − B_t| ≤ p h^γ ∀ h ∈ (0, 1/k)},
so that it is enough to show that for any δ > 0 and any p, m,
  P(∃ t ∈ [0, m] : |B_{t+h} − B_t| ≤ p h^γ ∀ h ∈ (0, δ)) = 0,
and in turn, setting A_{i,n} = {∃ t ∈ [i/n, (i+1)/n] : |B_{t+h} − B_t| ≤ p h^γ ∀ h ∈ (0, δ)}, that
  lim_{n→∞} ∑_{i=0}^{mn−1} P(A_{i,n}) = 0.
Fix a large constant K > 0, to be chosen suitably later. We wish to exploit the fact that on the event A_{i,n} many increments must be small. The trick is to be able to fix in advance the times at which these increments will be small. More precisely, on A_{i,n}, as long as n ≥ (K+1)/δ, we have for all 1 ≤ j ≤ K, since 0 < (i+j)/n − t ≤ (K+1)/n ≤ δ,
  |B_{(i+j)/n} − B_t| ≤ p((K+1)/n)^γ,

and therefore, by the triangle inequality,
  |B_{(i+j)/n} − B_{(i+j−1)/n}| ≤ 2p((K+1)/n)^γ.
Hence
  P(A_{i,n}) ≤ P(∩_{j=2}^K {|B_{(i+j)/n} − B_{(i+j−1)/n}| ≤ 2p((K+1)/n)^γ}) = P(|B_{(i+j)/n} − B_{(i+j−1)/n}| ≤ 2p((K+1)/n)^γ)^{K−1}
by independence of the increments, with B_{(i+j)/n} − B_{(i+j−1)/n} distributed as N/√n for a standard Gaussian variable N, so that
  P(|B_{(i+j)/n} − B_{(i+j−1)/n}| ≤ 2p((K+1)/n)^γ) = P(|N| ≤ √n 2p((K+1)/n)^γ) ≤ C √n 2p((K+1)/n)^γ
for some finite constant C, as long as √n 2p((K+1)/n)^γ is small, which holds for n large since γ > 1/2. Hence, keeping K fixed, we find a finite constant C_K so that
  P(A_{i,n}) ≤ C_K n^{(1/2 − γ)(K−1)},
and therefore
  ∑_{i=0}^{mn−1} P(A_{i,n}) ≤ C_K n^{(1/2 − γ)(K−1)} mn,
which goes to zero when n goes to infinity as soon as K is chosen big enough, namely (γ − 1/2)(K − 1) > 1.

We will later spend a lot of time giving a precise and rigorous construction of the stochastic integral, for as large a class of processes as possible, subject to continuity. This level of generality has a price, which is that the construction can appear quite technical without shedding any light on the sort of processes we are talking about. The real difficulty in the construction of the integral is how to make sense of an integral against Brownian motion, denoted ∫ H_s dB_s, as B is at best Hölder 1/2. To do that we will need to use randomness and martingale theory. We will soon enlarge our scope and consider more general processes than Brownian motion. Before doing so, we introduce (and hopefully motivate) some notions that we will discuss later in a wider scope, namely the strong Markov property and stopping times.
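One concrete manifestation of this roughness, and a preview (not developed in the notes at this stage) of why ∫ H_s dB_s cannot be a pathwise Stieltjes integral, is the non-vanishing quadratic variation: over a mesh of size t/n, the sum of squared Brownian increments converges to t rather than to 0, as it would for a path of finite variation. A numerical illustration:

```python
import numpy as np

# Sum of squared Brownian increments over a fine mesh of [0, t]:
# for a finite-variation path this would vanish as the mesh shrinks,
# but for Brownian motion it converges to t (here t = 1).
rng = np.random.default_rng(2)
t, n = 1.0, 1_000_000
inc = rng.normal(0.0, np.sqrt(t / n), size=n)   # increments over mesh t/n
qv = np.sum(inc**2)
print(qv)                                        # close to t = 1, not to 0
```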

1.5. Strong Markov property. We have already seen that the Wiener law satisfies the Markov property: for all s ≥ 0, the process (B_{t+s} − B_s, t ≥ 0) is a Brownian motion independent of σ(B_r, r ≤ s). The goal of this paragraph is to extend this result to the case where s is itself a random variable. To do so, we need to restrict ourselves to the so-called stopping times: a random variable T with values in [0, ∞] is a stopping time if, for all t ≥ 0, {T ≤ t} ∈ F_t = σ(B_s, s ≤ t). We define
  F_T = {A ∈ F : ∀ t ≥ 0, A ∩ {T ≤ t} ∈ F_t}.
We set
  1_{T<∞} B_T(ω) = B_{T(ω)}(ω) if T(ω) < ∞, and 0 otherwise.
1_{T<∞} B_T is F_T-measurable. Indeed, by continuity of the trajectories,
  1_{T<∞} B_T = lim_{n→∞} ∑_{k≥0} 1_{k2^{−n} ≤ T < (k+1)2^{−n}} B_{k2^{−n}},
where each summand 1_{k2^{−n} ≤ T < (k+1)2^{−n}} B_{k2^{−n}} is F_T-measurable.

Theorem (Strong Markov Property). Let T be a stopping time such that P(T < ∞) > 0. Then the process
  B_t^{(T)} = 1_{T<∞}(B_{T+t} − B_T), t ≥ 0,
is a Brownian motion independent of F_T under P(· | T < ∞).

Proof. We first assume T < ∞ a.s. and show that if A ∈ F_T, then for every bounded continuous function f,
  E[1_A f(B_{t_1}^{(T)}, ..., B_{t_k}^{(T)})] = P(A) E[f(B_{t_1}, ..., B_{t_k})],
which is enough to prove the statement. We denote [T]_n = ([2^n T] + 1) 2^{−n}, with [a] the integer part of a real number a. Observe that by continuity of the trajectories,
  f(B_{t_1}^{(T)}, ..., B_{t_k}^{(T)}) = lim_{n→∞} f(B_{t_1}^{([T]_n)}, ..., B_{t_k}^{([T]_n)}),
so that by the dominated convergence theorem, for every bounded continuous function f,
  E[1_A f(B_{t_1}^{(T)}, ..., B_{t_k}^{(T)})] = lim_{n→∞} E[1_A f(B_{t_1}^{([T]_n)}, ..., B_{t_k}^{([T]_n)})]
  = lim_{n→∞} ∑_{k'≥0} E[1_A 1_{(k'−1)2^{−n} < T ≤ k'2^{−n}} f(B_{t_1+k'2^{−n}} − B_{k'2^{−n}}, ..., B_{t_k+k'2^{−n}} − B_{k'2^{−n}})].

For A ∈ F_T,
  A ∩ {(k−1)2^{−n} < T ≤ k2^{−n}} = (A ∩ {T ≤ k2^{−n}}) ∩ {T ≤ (k−1)2^{−n}}^c
is F_{k2^{−n}}-measurable. Hence the usual Markov property implies that
  E[1_A 1_{(k−1)2^{−n} < T ≤ k2^{−n}} f(B_{t_1+k2^{−n}} − B_{k2^{−n}}, ..., B_{t_k+k2^{−n}} − B_{k2^{−n}})]
  = E[1_A 1_{(k−1)2^{−n} < T ≤ k2^{−n}}] E[f(B_{t_1+k2^{−n}} − B_{k2^{−n}}, ..., B_{t_k+k2^{−n}} − B_{k2^{−n}})]
  = E[1_A 1_{(k−1)2^{−n} < T ≤ k2^{−n}}] E[f(B_{t_1}, ..., B_{t_k})],
from which the result follows. The same arguments apply when T < ∞ holds only with positive probability.

A nice application of the strong Markov property is the reflection principle:

Theorem (Reflection principle). For all t > 0, denote S_t = sup_{s≤t} B_s. Then, if a ≥ 0 and b ≤ a, we have
  P(S_t ≥ a, B_t ≤ b) = P(B_t ≥ 2a − b).
In particular, S_t has the same law as |B_t|.

Proof. We apply the strong Markov property with the stopping time
  T_a = inf{t ≥ 0 : B_t = a}.
We have already seen that T_a is finite almost surely. Moreover,
  P(S_t ≥ a, B_t ≤ b) = P(T_a ≤ t, B^{(T_a)}_{t−T_a} ≤ b − a).
By the strong Markov property, B^{(T_a)} is a Brownian motion independent of T_a, and −B^{(T_a)} has the same law as B^{(T_a)}. Hence, as B_t = B_{T_a} + B^{(T_a)}_{t−T_a} = a + B^{(T_a)}_{t−T_a} on {T_a ≤ t}, we get
  P(T_a ≤ t, B^{(T_a)}_{t−T_a} ≤ b − a) = P(T_a ≤ t, −B^{(T_a)}_{t−T_a} ≤ b − a) = P(T_a ≤ t, B_t ≥ 2a − b) = P(B_t ≥ 2a − b),
where we used in the last equality that 2a − b ≥ a, so that {B_t ≥ 2a − b} ⊂ {T_a ≤ t}. For the last point, we notice that for a ≥ 0,
  P(S_t ≥ a) = P(S_t ≥ a, B_t ≤ a) + P(S_t ≥ a, B_t ≥ a) = P(B_t ≥ a) + P(B_t ≥ a) = P(|B_t| ≥ a).

2. Processes with independent increments

We will often consider stochastic processes with independent increments.

Definition 2.1. A stochastic process (X_t)_{t∈T} based on (Ω, G, P) with values in (R^d, B(R^d)) is a process with independent increments (abbreviated I.I.P.) iff
(1) X_0 = 0 a.s.,

(2) for all n ≥ 2 and all t_1, ..., t_n ∈ R_+ with t_1 < t_2 < ... < t_n, the random variables
  X_{t_1}, X_{t_2} − X_{t_1}, ..., X_{t_n} − X_{t_{n−1}}
are independent.

A stochastic process is a stationary process with independent increments (abbreviated S.I.I.P.) if it is an I.I.P. such that for all s, t ∈ R_+ with s < t, X_t − X_s has the same law as X_{t−s}. When T = N, stationarity is described by the fact that there exists a sequence (Z_i)_{i∈N} of i.i.d. variables so that
  S_n = Z_1 + ... + Z_n.
In this case, (S_n) is also called a random walk.

A family (µ_t)_{t∈R_+} of probability measures on (R^d, B(R^d)) is called a convolution semi-group if for all s, t ∈ [0, +∞), µ_{s+t} = µ_s * µ_t.

Proposition 2.2. If (X_t)_{t∈R_+} is an S.I.I.P. and µ_t is the law of X_t, then (µ_t)_{t∈R_+} is a convolution semi-group. It is called the convolution semi-group of (X_t)_{t∈R_+}. More generally, if (X_t)_{t∈R_+} is an I.I.P. such that for all s < t, X_t − X_s has law µ_{s,t}, then for any s < t < u we have µ_{s,u} = µ_{s,t} * µ_{t,u}.

Proof. We write X_u − X_s = (X_u − X_t) + (X_t − X_s) and use the independence of X_u − X_t and X_t − X_s to conclude.

For S.I.I.P.'s we have an easier way to characterize the equivalence relation defined in Definition 1.4.

Proposition 2.3. a) If X and X′ are two S.I.I.P.'s with the same convolution semi-group, they are equivalent.
b) More generally, if X and X′ are two I.I.P.'s such that for all s < t, X_t − X_s and X′_t − X′_s have the same distribution, then they are equivalent.

Let us give some examples:
1) µ_t = δ_{at}.
2) µ_t is the Poisson law P_{λt} with parameter λt for all t > 0 (P_x(k) = e^{−x} x^k/k!). We will call Poisson process with parameter λ > 0 the S.I.I.P. with such a convolution semi-group.

3) µ_t is the centered Gaussian law with covariance t Id. Check that the S.I.I.P. with such a convolution semi-group is the Brownian motion.

More generally, we will say that a stochastic process is a real Gaussian process iff
- it takes its values in (R, B(R)),
- for all n ≥ 1 and all t_1, ..., t_n ∈ R_+, the random variable (X_{t_1}, ..., X_{t_n}) is Gaussian.
Note that in this case the law is determined by the mean and the covariance,
  m(t) = E[X_t],   C(t, s) = E[X_t X_s] − m(t)m(s),
as is any Gaussian law. Note that any covariance C(t, s) is symmetric and positive semi-definite, namely C(s, t) = C(t, s) for all s, t ∈ T, and for all n ≥ 1, t_1, ..., t_n ∈ T, λ_1, ..., λ_n ∈ R,
  ∑_{i,j=1}^n C(t_i, t_j) λ_i λ_j ≥ 0.

Proposition 2.4. A real stochastic process is a real Brownian motion iff it is a centered real Gaussian process with covariance E[X_t X_s] = t ∧ s.

Proof. If X is a real Brownian motion, then
- X_0 = 0,
- for all t > 0, X_t follows N(0, t),
- for all n ≥ 2 and all t_1, ..., t_n with t_1 < t_2 < ... < t_n, the variables (X_{t_1}, X_{t_2} − X_{t_1}, ..., X_{t_n} − X_{t_{n−1}}) are independent Gaussian variables.
Hence X is a centered Gaussian S.I.I.P. Finally, for s ≤ t,
  E[X_s X_t] = E[X_s²] + E[(X_t − X_s) X_s] = E[X_s²] = s = s ∧ t
by independence and centering.

Conversely, let X be a centered Gaussian process with covariance s ∧ t. The first thing we need to check is that the increments are independent; since the vector of increments is Gaussian, this follows from the vanishing of the covariances:
  E[(X_{t_2} − X_{t_1})(X_{t_4} − X_{t_3})] = 0 if t_1 < t_2 ≤ t_3 < t_4.
Hence X is an I.I.P. To check stationarity, it is enough to check that the covariances are stationary. But
  E[(X_t − X_s)²] = t + s − 2(s ∧ t) = |t − s|,

which completes the argument.

2.1. Law of a stochastic process, canonical process. Let (E, E) be a measurable space and let T be a non-empty set. We denote
  E^T = {x = (x_t)_{t∈T} : x_t ∈ E, ∀ t ∈ T}.
We call product σ-algebra on E^T (associated with the σ-algebra E and T) the smallest σ-algebra on E^T making the coordinate mappings
  γ_t : x = (x_s)_{s∈T} ↦ x_t
measurable for all t ∈ T. It is denoted by E^{⊗T}. We call measurable product space associated with (E, E) and T the space (E^T, E^{⊗T}) = (E, E)^T.

Proposition 2.5. Let (Ω, F) be a measurable space and let U be a map from Ω into E^T. Then U is measurable from (Ω, F) into (E, E)^T iff for all t ∈ T, γ_t ◦ U is measurable from (Ω, F) into (E, E).

Proof. If U is measurable, then γ_t ◦ U is measurable as the composition of measurable maps. Conversely, if γ_t ◦ U is measurable for all t ∈ T, then (γ_t ◦ U)^{−1}(A) ∈ F for all A ∈ E. But (γ_t ◦ U)^{−1}(A) = U^{−1}((γ_t)^{−1}(A)). Thus U^{−1}(B) ∈ F for all B ∈ D := {(γ_t)^{−1}(A) : t ∈ T, A ∈ E}. Since E^{⊗T} is the σ-algebra generated by D, this implies that U^{−1}(B) ∈ F for all B ∈ E^{⊗T}.

Let (X_t)_{t∈T} be a family of maps from Ω into E. We denote by X the map
  X : Ω → E^T, ω ↦ (X_t(ω))_{t∈T}.
Recalling Definition 1.3 of stochastic processes, we then have:

Corollary 2.6. (X_t)_{t∈T} is a stochastic process with values in (E, E) iff the map X is measurable from (Ω, F) into (E, E)^T.

Proof. Follows from the previous proposition.

According to this corollary, we can identify the stochastic process (X_t)_{t∈T} with the measurable map X. In the following we will set X = (X_t)_{t∈T}. We call law of X, on (E, E)^T, the push-forward of the probability measure P by the measurable map X. We denote it P_X.

Proposition 2.7. Two stochastic processes (X_t)_{t∈T} and (X′_t)_{t∈T} (based respectively on (Ω, F, P) and (Ω′, F′, P′)) with values in (E, E) are equivalent iff they have the same law on (E, E)^T.

Proof. If P_X = P′_{X′}, then the processes are equivalent: for any t_1, ..., t_n ∈ T, if A = ∏_{t∈T} A_t with
  A_t = B_i if t = t_i (1 ≤ i ≤ n), A_t = E if t ∉ {t_1, ..., t_n},
we have
  X^{−1}(A) = {X_{t_1} ∈ B_1, ..., X_{t_n} ∈ B_n},   X′^{−1}(A) = {X′_{t_1} ∈ B_1, ..., X′_{t_n} ∈ B_n}.
Therefore
  P′(X′_{t_1} ∈ B_1, ..., X′_{t_n} ∈ B_n) = P′_{X′}(A) = P_X(A) = P(X^{−1}(A)) = P(X_{t_1} ∈ B_1, ..., X_{t_n} ∈ B_n).
Conversely, if (X_t)_{t∈T} and (X′_t)_{t∈T} are equivalent, we have P_X(A) = P′_{X′}(A) for any cylinder A in E^{⊗T}. But E^{⊗T} = σ(C), with C the family of cylinders (see Exercise 2.23). Since C is stable under finite intersection, it follows from Exercise 2.16 (the monotone class theorem), see also Lemma 9.2, that P_X = P′_{X′}.

Let X = (X_t)_{t∈T} be a stochastic process based on (Ω, F, P) with values in (E, E). The canonical process (Y_t)_{t∈T} on (E, E)^T associated with (X_t)_{t∈T} is the stochastic process based on (E^T, E^{⊗T}, P_X) defined by
  Y_t(x) = γ_t(x) = x_t,   x = (x_s)_{s∈T} ∈ E^T.

Proposition 2.8. (X_t)_{t∈T} and its canonical process (Y_t)_{t∈T} are equivalent.

Proof. Let t_1, ..., t_n ∈ T and B_1, ..., B_n ∈ E. If A = ∏_{t∈T} A_t with
  A_t = B_i if t = t_i (1 ≤ i ≤ n), A_t = E if t ∉ {t_1, ..., t_n},
we have
  X^{−1}(A) = {X_{t_1} ∈ B_1, ..., X_{t_n} ∈ B_n}.
We also have
  A = {x : Y_{t_1}(x) ∈ B_1, ..., Y_{t_n}(x) ∈ B_n}.
Therefore
  P(X_{t_1} ∈ B_1, ..., X_{t_n} ∈ B_n) = P_X(A) = P_X(Y_{t_1} ∈ B_1, ..., Y_{t_n} ∈ B_n).

2.2. Canonical process with given finite-dimensional marginals. Let T be an infinite set. We denote by I the set of finite (non-empty) subsets of T. Let (E, E) be a measurable space. Let (P_I)_{I∈I} be a family of probability measures indexed by I, so that for all I ∈ I, P_I is a probability measure on (E, E)^{card(I)} = (E^{card(I)}, E^{⊗card(I)}). We say that (P_I)_{I∈I} is a compatible system (or a projective system) if for all I ∈ I and any J ∈ I with J ⊂ I, P_J is the push-forward of P_I by the map
  Π_{I,J} : (x_t)_{t∈I} ↦ (x_s)_{s∈J}.

Let X = (X_t)_{t∈T} be a stochastic process based on (Ω, F, P) with values in (E, E). If, for I = {t_1, ..., t_n} ∈ I, we denote by P_I the law of the random variable (X_{t_1}, ..., X_{t_n}), then (P_I)_{I∈I} is the family of the finite-dimensional marginals of the stochastic process X = (X_t)_{t∈T}. Moreover, we have:

Proposition 2.9. The finite-dimensional marginals of X = (X_t)_{t∈T} form a compatible system.

Proof. Let I = {t_1, ..., t_n} ⊃ J = {t_{i_1}, ..., t_{i_k}}, with t_i ∈ T for i = 1, ..., n, n ≥ 2, 1 ≤ k < n, 1 ≤ i_1 < i_2 < ... < i_k ≤ n. We have
  P_J(B_1 × ... × B_k) = P(X_{t_{i_1}} ∈ B_1, ..., X_{t_{i_k}} ∈ B_k)
  = P(X_{t_j} ∈ E ∀ j ∈ {1, ..., n} \ {i_1, ..., i_k}, X_{t_{i_1}} ∈ B_1, ..., X_{t_{i_k}} ∈ B_k)
  = P_I(Π_{I,J}^{−1}(B_1 × ... × B_k)).

Conversely, let us be given a compatible system of probability measures (P_I)_{I∈I} (with, for all I ∈ I, P_I a probability measure on (E, E)^{card(I)}). We set
  Ω = E^T = {ω = (ω_t)_{t∈T} : ω_t ∈ E, ∀ t ∈ T},   F = E^{⊗T},   Y_t(ω) = ω_t.
Given a probability measure P on (Ω, F), (Y_t)_{t∈T} can be seen as a stochastic process based on (Ω, F, P). It is called the canonical process associated with P (on (E^T, E^{⊗T}, P)).

Theorem 2.10 (Kolmogorov, admitted). If E is a Polish space and E is the Borel σ-algebra of E, there exists a unique probability measure P on (Ω, F) := (E, E)^T so that the canonical process (Y_t)_{t∈T} on (Ω, F, P) has (P_I)_{I∈I} as its family of finite-dimensional marginals.

Applications:

Corollary. a)
To any convolution semi-group (µ_t)_{t∈(0,+∞)} on (R^d, B(R^d)) corresponds an S.I.I.P. (Y_t)_{t∈R_+}, unique up to equivalence, such that for all

t ∈ (0, +∞), µ_t is the law of Y_t.
b) To any family (µ_{s,t})_{s<t} of probability measures on (R^d, B(R^d)) satisfying µ_{s,u} = µ_{s,t} * µ_{t,u} for all s < t < u, corresponds an I.I.P. (Y_t)_{t∈R_+}, unique up to equivalence, such that for all s, t ∈ R_+ with s < t, Y_t − Y_s has law µ_{s,t}.

Remark: a) allows in particular to show the existence of the homogeneous Poisson process and of the Brownian motion, and b) that of the inhomogeneous Poisson process.

Proof. It is enough to prove b). Let I = {t_1, ..., t_n} with t_1 < t_2 < ... < t_n, and let P_I be the push-forward of µ_{0,t_1} ⊗ µ_{t_1,t_2} ⊗ ... ⊗ µ_{t_{n−1},t_n} by the map
  φ_n : (x_1, ..., x_n) ↦ (x_1, x_1 + x_2, ..., x_1 + x_2 + ... + x_n).
Let us show that (P_I)_I is compatible. Let J = {t_{i_1}, ..., t_{i_k}} with 1 ≤ i_1 < i_2 < ... < i_k ≤ n (and k < n). We have
  P_J = (φ_k)_#(µ_{0,t_{i_1}} ⊗ µ_{t_{i_1},t_{i_2}} ⊗ ... ⊗ µ_{t_{i_{k−1}},t_{i_k}}),
that is, the push-forward by φ_k of the probability measure µ_{0,t_{i_1}} ⊗ µ_{t_{i_1},t_{i_2}} ⊗ ... ⊗ µ_{t_{i_{k−1}},t_{i_k}}. But
  µ_{0,t_{i_1}} ⊗ µ_{t_{i_1},t_{i_2}} ⊗ ... ⊗ µ_{t_{i_{k−1}},t_{i_k}} = γ_#(µ_{0,t_1} ⊗ µ_{t_1,t_2} ⊗ ... ⊗ µ_{t_{n−1},t_n}),
with
  γ(x_1, ..., x_n) = (∑_{0<i≤i_1} x_i, ∑_{i_1<i≤i_2} x_i, ..., ∑_{i_{k−1}<i≤i_k} x_i).
Therefore
  P_J = (φ_k ◦ γ)_#(µ_{0,t_1} ⊗ µ_{t_1,t_2} ⊗ ... ⊗ µ_{t_{n−1},t_n}).
It is easy to see that φ_k ◦ γ = Π ◦ φ_n, where Π := Π_{I,J}. We hence deduce that
  P_J = Π_#((φ_n)_#(µ_{0,t_1} ⊗ ... ⊗ µ_{t_{n−1},t_n})) = (Π_{I,J})_# P_I.

Corollary. Let m be a map from T into R and let c be a positive semi-definite function from T² into R. There exists a real Gaussian process, unique up to equivalence, with mean m and covariance c.

Proof. For I = {t_1, ..., t_n} with t_1 < t_2 < ... < t_n, let P_I be the Gaussian law on (R^n, B(R^n)) with mean (m(t_1), ..., m(t_n)) and covariance matrix (c(t_i, t_j))_{i,j=1,...,n}. For J ⊂ I, (Π_{I,J})_# P_I and P_J are two Gaussian probability measures with the same mean and covariance, hence equal. Therefore (P_I)_{I∈I} is compatible, and we can conclude by Kolmogorov's theorem.
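For finitely many times, this existence statement is constructive in practice: a centered Gaussian vector with covariance matrix (c(t_i, t_j)) can be sampled by multiplying i.i.d. standard Gaussians by a Cholesky factor of the covariance matrix. A sketch (an illustration, not part of the notes), taking c(s, t) = s ∧ t so that by Proposition 2.4 the result has the finite-dimensional marginals of Brownian motion:

```python
import numpy as np

# Sample a centered Gaussian vector with covariance C_ij = c(t_i, t_j),
# here c(s, t) = min(s, t), via the Cholesky factorization C = L L^T.
rng = np.random.default_rng(4)
times = np.array([0.5, 1.0, 2.0, 3.5])
C = np.minimum.outer(times, times)          # Brownian covariance s ∧ t
L = np.linalg.cholesky(C)                   # works since C is positive definite

n_paths = 200_000
Z = rng.normal(size=(n_paths, len(times)))  # i.i.d. N(0, 1) entries
samples = Z @ L.T                           # each row ~ N(0, C)
emp_cov = np.cov(samples, rowvar=False)
print(np.abs(emp_cov - C).max())            # small sampling error
```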

Example: take T = R_+. The function c defined on R_+ × R_+ by c(s, t) = s ∧ t is a covariance function: for all s_1 < s_2 < ... < s_n, the matrix (s_i ∧ s_j)_{i,j=1,...,n} is positive semi-definite, being the covariance matrix of (U_1, U_1 + U_2, ..., U_1 + ... + U_n), where the variables (U_1, ..., U_n) are independent with centered Gaussian laws N(0, s_1) for U_1 and N(0, s_k − s_{k−1}) for U_k.

2.3. Point processes, Poisson processes. In this last paragraph, we consider random distributions of points in (0, +∞). A point process on (0, +∞) is a sequence (S_k)_{k≥1} of random variables defined on the same probability space (Ω, F, P) such that
  0 < S_1(ω) < S_2(ω) < ... < S_k(ω) < ... and lim_{k→+∞} S_k(ω) = +∞ for all ω ∈ Ω.
The S_k represent the arrival times of a random phenomenon (cf. the arrival times of clients, etc.). We set Z_1 = S_1 and, for all k ≥ 2, Z_k = S_k − S_{k−1} (the delay between two successive arrivals), so that for all n ≥ 1, S_n = ∑_{k=1}^n Z_k.

To any point process (S_k)_{k≥1} on (0, +∞), we associate a stochastic process called its counting function (N_t)_{t∈R_+}, given by N_0(ω) = 0 for all ω ∈ Ω and
  N_t(ω) = ∑_{n=1}^{+∞} 1_{{S_n(ω) ≤ t}},
the number of arrivals during the time interval (0, t]. As lim_{k→+∞} S_k(ω) = +∞, we have N_t(ω) < +∞ for all t > 0 and all ω ∈ Ω. Moreover, (N_t)_{t∈R_+} takes values in N and has non-decreasing, right-continuous trajectories, following a staircase shape with jumps no larger than one unit. The data of the point process (S_k)_{k≥1} is equivalent to that of (N_t)_{t∈R_+}, since (with S_0 = 0)
  {N_t = n} = {S_n ≤ t < S_{n+1}}.
We also have, for all n ∈ N, {N_t < n} = {S_n > t}.

Theorem. If the random variables Z_k are independent and exponentially distributed with parameter λ, then:
a) For all t > 0, the random variable N_t has Poisson distribution with parameter λt.
b) (N_t)_{t∈R_+} is an S.I.I.P.

Proof. a)
The random variable S_n is the sum of n independent exponential random variables with parameter λ, and therefore follows a Gamma distribution with density
g_n(x) = λ^n x^{n−1}/(n−1)! e^{−λx} 1_{R_+}(x).
Therefore, since {N_t < n} = {S_n > t},
P(N_t < n) = λ^n/(n−1)! ∫_t^{+∞} x^{n−1} e^{−λx} dx = e^{−λt} ( 1 + λt + (λt)²/2! + ... + (λt)^{n−1}/(n−1)! ).
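The identity just derived is easy to test by simulation: sum i.i.d. exponential delays, count the arrivals before time t, and compare the counts with a Poisson(λt) distribution. A short sketch (λ = 2 and t = 3 are arbitrary illustrative values):

```python
import numpy as np

rng = np.random.default_rng(1)
lam, t, n_paths = 2.0, 3.0, 200_000

# Interarrival delays Z_k ~ Exp(lam); 60 delays per path is far more than
# N_t ever needs here (lam * t = 6 expected arrivals).
Z = rng.exponential(scale=1.0 / lam, size=(n_paths, 60))
S = np.cumsum(Z, axis=1)       # arrival times S_n = Z_1 + ... + Z_n
N_t = (S <= t).sum(axis=1)     # counting function N_t = #{n : S_n <= t}

emp_mean = N_t.mean()          # a Poisson(lam * t) variable has mean lam * t = 6
emp_var = N_t.var()            # ... and variance lam * t as well
emp_p0 = (N_t == 0).mean()     # P(N_t = 0) should be close to exp(-lam * t)
```

Mean, variance and P(N_t = 0) should all match the Poisson(λt) values within Monte Carlo error.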

Hence, P(N_t = 0) = P(N_t < 1) = e^{−λt} and, for all n ≥ 1,
P(N_t = n) = P(N_t < n + 1) − P(N_t < n) = e^{−λt} (λt)^n / n!.
Moreover, N is a SIIP, as we can write
N_t − N_s = Σ_{n=N_s+1}^{+∞} 1_{{S_n ≤ t}},
where, conditionally on N_s, the shifted arrival times (S_{N_s+j} − s)_{j≥1} have the same law as (S_j)_{j≥1} by the memoryless property of the exponential law, and are independent from N_s. Hence, the law of N_t − N_s conditionally on N_s is the same as the law of N_{t−s}; in other words, N is a SIIP.
We call a standard Poisson process any real stochastic process (X_t)_{t∈R_+} which is a Poisson process such that X_0 = 0 and all trajectories are non-decreasing, right-continuous and with jumps bounded by one. As a converse to the previous theorem, we have:
Theorem. Assume that the counting function (N_t)_{t∈R_+} of the point process (S_k)_{k≥1} is a SIIP. Then:
a) There exists λ > 0 such that the random variable Z_1 is exponential with parameter λ.
b) For any t > 0, the random variable N_t has Poisson distribution with parameter λt.
c) The sequence (Z_k)_{k≥1} is i.i.d. with exponential distribution with parameter λ.
Proof. a). Noticing that {Z_1 > t} = {N_t = 0}, we deduce
P(Z_1 > t + s) = P(N_{t+s} = 0) = P(N_{t+s} − N_s = 0, N_s = 0)
= P(N_{t+s} − N_s = 0) P(N_s = 0) = P(N_t = 0) P(N_s = 0) = P(Z_1 > t) P(Z_1 > s).
Hence, the function t ↦ P(Z_1 > t) being non-increasing, with values in [0, 1], and such that P(Z_1 > 0) = 1, there exists λ > 0 such that for all t ∈ R_+,
P(Z_1 > t) = e^{−λt} = ∫_t^{+∞} λ e^{−λx} dx.
Moreover, let ε ∈ (0, t). As N is a SIIP,
(2.1) P(N_t = n) − P(N_{t−ε} = n) = (P(N_ε = 0) − 1) P(N_{t−ε} = n) + Σ_{y=1}^{n} P(N_ε = y) P(N_{t−ε} = n − y).
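The functional equation P(Z_1 > t + s) = P(Z_1 > t) P(Z_1 > s) obtained in the proof of a) is exactly the memoryless property characterizing the exponential law, and it is easy to observe on simulated data (the parameter values below are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(2)
lam, s, t = 1.5, 0.4, 0.7
Z = rng.exponential(scale=1.0 / lam, size=500_000)

# Memorylessness: P(Z > t + s | Z > s) = P(Z > t) = exp(-lam * t).
p_cond = (Z > t + s).sum() / (Z > s).sum()
p_uncond = (Z > t).mean()
p_theory = np.exp(-lam * t)
```

Both empirical probabilities should agree with e^{−λt} within Monte Carlo error.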

We next claim that, again by the SIIP property, P(N_ε ≥ 2) ≤ Cε². Indeed, writing p_ε = P(N_ε ≥ 2) and noticing that N_ε ≥ 2 requires at least two arrivals in one of the two halves of ]0, ε], or at least one in each, we can estimate
p_ε ≤ 2 p_{ε/2} + P(N_{ε/2} ≥ 1)² ≤ 2 p_{ε/2} + λ²ε²/4,
where we used P(N_{ε/2} ≥ 1) = 1 − e^{−ελ/2} ≤ ελ/2. The result follows by iteration. We therefore deduce from (2.1) that
(2.2) | ε^{−1}(P(N_t = n) − P(N_{t−ε} = n)) − ε^{−1}(P(N_ε = 0) − 1) P(N_{t−ε} = n) − ε^{−1} P(N_ε = 1) P(N_{t−ε} = n − 1) | ≤ Cε.
We thus see that t ↦ P(N_t = n) is continuous and even differentiable with, letting ε go to zero,
∂_t P(N_t = n) = −λ P(N_t = n) + λ P(N_t = n − 1).
It follows by induction over n that P(N_t = n) = e^{−λt} (λt)^n / n! for every integer n, which proves b). Hence the law of (N_t)_{t≥0} is uniquely determined, and so is the law of (S_n)_{n≥1}. As it corresponds to the sum of i.i.d. exponential variables, we are done.
EXERCISES
Exercise. Prove the following monotone class theorem: let C be a non-empty family of subsets of Ω, stable under finite intersection. Then the σ-algebra σ(C) generated by C coincides with the smallest family D of subsets of Ω containing C, with Ω ∈ D, which is stable under difference and increasing limit.
Exercise. Prove the following corollary to the previous monotone class theorem: let (Ω, F) be a measurable space and let C be a family of subsets of Ω contained in F, stable under finite intersection and such that σ(C) = F. Let P and Q be two probability measures on (Ω, F) which coincide on C (i.e. such that P(A) = Q(A) for all A ∈ C). Then P = Q.

Exercise. Show that a d-dimensional stochastic process (X_t)_{t∈R_+} is an IIP iff, setting F^0_t = σ(X_s ; s ≤ t), we have X_0 = 0 P-a.s. and, for all s, t ∈ R_+ with s < t, the random variable X_t − X_s is independent from the σ-algebra F^0_s. (Hint: use the monotone class theorem to show that if (X_t)_{t∈R_+} is an IIP, the random variable X_t − X_s is independent from F^0_s.)
Exercise. A stochastic process (X_t)_{t∈R_+} is self-similar (of order 1) if, for all λ > 0, the stochastic processes (X_{λt})_{t∈R_+} and (λ X_t)_{t∈R_+} are equivalent. Show that if (B_t)_{t∈R_+} is a real Brownian motion, the process (B_{t²})_{t∈R_+} is self-similar of order one.
Exercise. Show that if (B_t)_{t∈R_+} is a real Brownian motion, the following stochastic processes are real Brownian motions:
a) (−B_t)_{t∈R_+};
b) (c B_{t/c²})_{t∈R_+}, c > 0;
c) (X_t)_{t∈R_+} defined by X_0 = 0 and X_t = t B_{1/t}, t > 0.
Exercise 2.2. Let (B_t)_{t∈R_+} be a real Brownian motion on (Ω, F, P). We set, for t ∈ [0, 1], Y_t = B_t − t B_1 and Z_t = Y_{1−t}.
a) Show that (Y_t)_{t∈[0,1]} and (Z_t)_{t∈[0,1]} are centered Gaussian processes and compare their finite-dimensional laws.
b) We set, for t ∈ R_+, W_t = (t + 1) Y_{t/(1+t)}. Show that (W_t)_{t∈R_+} is a real Brownian motion.
Exercise. Let (B_t)_{t∈R_+} be a real Brownian motion on (Ω, F, P) and let λ > 0. We set, for t ∈ R_+, U_t = e^{−λt} B_{e^{2λt}}.
a) Show that (U_t)_{t∈R_+} is a centered Gaussian process and determine its covariance c.
b) Deduce from the form of c that (U_t)_{t∈R_+} is stationary, i.e. that for all n ≥ 1, t_1, ..., t_n ∈ R_+ with t_1 < t_2 < ... < t_n and all s > 0, the random variables (U_{t_1}, ..., U_{t_n}) and (U_{t_1+s}, ..., U_{t_n+s}) have the same law.
Exercise. Let d ≥ 2 and denote by ⟨·, ·⟩ the scalar product on R^d and by ‖·‖ the Euclidean norm. On a probability space (Ω, F, P), we consider d independent real Brownian motions (B¹_t)_{t∈R_+}, (B²_t)_{t∈R_+}, ..., (B^d_t)_{t∈R_+} and we set, for

t ∈ R_+, B_t = (B¹_t, ..., B^d_t); (B_t)_{t∈R_+} is a d-dimensional Brownian motion.
a) Show that for all x = (x_1, ..., x_d) ∈ R^d such that ‖x‖² = Σ x_i² = 1, the real stochastic process (⟨B_t, x⟩)_{t∈R_+} is a real Brownian motion.
b) Take d = 2 and set X_t = (X¹_t, X²_t) with
X¹_t = B¹_{2t/3} − B²_{t/3} and X²_t = B²_{2t/3} + B¹_{t/3}.
If x = (x_1, x_2) ∈ R² has norm one, what can we say about the process (⟨X_t, x⟩)_{t∈R_+}? Are the stochastic processes (X¹_t)_{t∈R_+} and (X²_t)_{t∈R_+} independent? Are they real Brownian motions?
c) If (X_t)_{t∈R_+} = (X¹_t, ..., X^d_t)_{t∈R_+} is a d-dimensional stochastic process such that for every x ∈ R^d of norm one, (⟨X_t, x⟩)_{t∈R_+} is a real Brownian motion, are (X¹_t)_{t∈R_+}, ..., (X^d_t)_{t∈R_+} independent Brownian motions?
Exercise. Show that the product σ-algebra E^{⊗T} on E^T coincides with the σ-algebra generated by the cylinders of E^T, that is by the sets B = Π_{t∈T} A_t, where A_t ∈ E for all t ∈ T and A_t = E except for a finite number of times t.

3. Martingales

3.1. Filtration. Adapted process. Martingale. A filtration (F_t)_{t∈R_+} on a probability space (Ω, F, P) is an increasing family of sub-σ-algebras of F (i.e. s < t implies F_s ⊂ F_t). A measurable space (Ω, G) endowed with a filtration (G_t)_t is said to be filtered. The filtration (F_t)_{t∈R_+} is said to be right-continuous if for all t ∈ R_+ we have F_t = ∩_{s>t} F_s. To any filtration (F_t)_{t∈R_+}, we can associate a right-continuous filtration, denoted (F_{t+})_{t∈R_+}, given by F_{t+} := ∩_{s>t} F_s. We say that (F_t)_{t∈R_+} is complete (for P) if F_0 contains all the P-negligible sets of F. If (F_t)_{t∈R_+} is a filtration on (Ω, F, P), we associate to it its completed filtration (for P), (F̄_t)_{t∈R_+}, by adding to each F_t the P-negligible sets of F. We assume in general that (F_t)_{t∈R_+} is complete, up to replacing it by its completed filtration.
Let (X_t)_{t∈R_+} be a stochastic process on (Ω, F, P), with values in (E, E).
The natural filtration (F^0_t)_{t∈R_+} associated to (X_t)_{t∈R_+} is given by F^0_t = σ(X_s : s ≤ t), t ∈ R_+.

Let (F_t)_{t∈R_+} be a filtration on (Ω, F, P). A stochastic process (X_t)_{t∈R_+} is said to be (F_t)_{t∈R_+}-adapted if for all t ∈ R_+, the random variable X_t is F_t-measurable. Any stochastic process is clearly adapted to its natural filtration, and (X_t)_{t∈R_+} is (F_t)_{t∈R_+}-adapted iff for all t ∈ R_+, F^0_t ⊂ F_t. If (F_t)_{t∈R_+} is complete for P, if (X_t)_{t∈R_+} is (F_t)_{t∈R_+}-adapted and (Y_t)_{t∈R_+} is a modification of (X_t)_{t∈R_+}, then (Y_t)_{t∈R_+} is also adapted.
A real stochastic process (X_t)_{t∈R_+} is a supermartingale with respect to (F_t)_{t∈R_+} if:
i) (X_t)_{t∈R_+} is (F_t)_{t∈R_+}-adapted;
ii) for all t ∈ R_+, the random variable X_t is integrable;
iii) for all s, t ∈ R_+ with s ≤ t, we have X_s ≥ E(X_t | F_s), P-a.s.
A real stochastic process (X_t)_{t∈R_+} is a submartingale with respect to (F_t)_{t∈R_+} if (−X_t)_{t∈R_+} is a supermartingale, that is:
i) (X_t)_{t∈R_+} is (F_t)_{t∈R_+}-adapted;
ii) for all t ∈ R_+, the random variable X_t is integrable;
iii) for all s, t ∈ R_+ with s ≤ t, we have X_s ≤ E(X_t | F_s), P-a.s.
A real stochastic process (X_t)_{t∈R_+} is a martingale with respect to (F_t)_{t∈R_+} if it is both a supermartingale and a submartingale, that is:
i) (X_t)_{t∈R_+} is (F_t)_{t∈R_+}-adapted;
ii) for all t ∈ R_+, the random variable X_t is integrable;
iii) for all s, t ∈ R_+ with s ≤ t, we have X_s = E(X_t | F_s), P-a.s.
Remark: a). If (X_t)_{t∈R_+} is a supermartingale (resp. submartingale, resp. martingale), the function t ↦ E(X_t) is non-increasing (resp. non-decreasing, resp. constant). b). If (X_t)_{t∈R_+} is a sub- (resp. super-) martingale such that the function t ↦ E(X_t) is constant, then (X_t)_{t∈R_+} is a martingale. (Exercise!)
Example: Let U ∈ L¹ and set M_t = E(U | F_t), t ∈ R_+; then (M_t)_{t∈R_+} is a martingale.

A d-dimensional stochastic process (X_t)_{t∈R_+} is an (F_t)_{t∈R_+}-IIP iff:
i). X_0 = 0, P-a.s.;
ii). (X_t)_{t∈R_+} is (F_t)_{t∈R_+}-adapted;
iii). for all s, t ∈ R_+ with s ≤ t, the random variable X_t − X_s is independent from F_s.
A d-dimensional stochastic process (X_t)_{t∈R_+} is an (F_t)_{t∈R_+}-SIIP if it is an (F_t)_{t∈R_+}-IIP such that:
iv). for all s, t ∈ R_+ with s ≤ t, the random variable X_t − X_s has the same law as X_{t−s}.
Theorem 3.1. If (X_t)_{t∈R_+} is a real (F_t)_{t∈R_+}-IIP and if, for all t ∈ R_+, the random variable X_t is integrable and centered, then (X_t)_{t∈R_+} is an (F_t)_{t∈R_+}-martingale.
Proof. If s ≤ t with s, t ∈ R_+, we have
E(X_t − X_s | F_s) = E(X_t − X_s) = E(X_t) − E(X_s) = 0, P-a.s.
But E(X_t − X_s | F_s) = E(X_t | F_s) − X_s, P-a.s. Therefore E(X_t | F_s) = X_s, P-a.s.
As a corollary, one easily deduces the following.
Corollary 3.2. a). If (B_t)_{t∈R_+} is a real Brownian motion, then (B_t)_{t∈R_+} is a martingale with respect to its natural filtration (F^0_t)_{t∈R_+} and also for the completed natural filtration (F̄_t)_{t∈R_+}.
b). If (N_t)_{t∈R_+} is a Poisson process with parameter λ > 0, then (N_t − λt)_{t∈R_+} is a martingale with respect to the natural filtration of (N_t)_{t∈R_+}.
We also have:
Proposition 3.3. a). If (B_t)_{t∈R_+} is a real Brownian motion, then (B_t² − t)_{t∈R_+} is a martingale for the completed natural filtration of (B_t)_{t∈R_+}. For all α, (exp(αB_t − (α²/2) t))_{t∈R_+} is a martingale for the completed natural filtration (F̄_t)_{t∈R_+}.
b). If (N_t)_{t∈R_+} is a Poisson process with parameter λ > 0, then ((N_t − λt)² − λt)_{t∈R_+} is a martingale with respect to the natural filtration (F^0_t)_{t∈R_+} of (N_t)_{t∈R_+}. For all α, (exp(αN_t − (e^α − 1)λt))_{t∈R_+} is as well a martingale for (F^0_t)_{t∈R_+}.
Proof. a). Clearly B_t² ∈ L¹ and moreover
E[B_t² − B_s² | F_s] = E[(B_t − B_s)² + 2B_s(B_t − B_s) | F_s].
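Since a martingale has constant expectation, each of the martingales in Corollary 3.2 and Proposition 3.3 can be sanity-checked at a single time by Monte Carlo, using only the marginal laws B_t ~ N(0, t) and N_t ~ Poisson(λt). A sketch (the values of t, α, λ below are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(3)
t, alpha, lam, n = 2.0, 0.5, 1.0, 400_000

B_t = rng.normal(0.0, np.sqrt(t), size=n)   # marginal law of Brownian motion at time t
N_t = rng.poisson(lam * t, size=n)          # marginal law of a Poisson process at time t

# Each quantity below is the Monte Carlo mean of a martingale evaluated at time t,
# so it should match that martingale's value at time 0 (namely 0, 1, 0, 1).
m_quad = (B_t**2 - t).mean()
m_exp = np.exp(alpha * B_t - 0.5 * alpha**2 * t).mean()
m_comp = (N_t - lam * t).mean()
m_pexp = np.exp(alpha * N_t - (np.exp(alpha) - 1.0) * lam * t).mean()
```

Agreement within Monte Carlo error is of course only a necessary condition for the martingale property, which also involves the conditional expectations.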

As B_t − B_s is independent of F_s and centered, we deduce that
E[B_t² − B_s² | F_s] = E[(B_t − B_s)²] = t − s,
from which the result follows. Similarly, for all α, exp{αB_t} ∈ L¹ and
E[e^{αB_t} | F_s] = e^{αB_s} E[e^{α(B_t − B_s)}] = e^{αB_s} e^{(α²/2)(t−s)}.
The proof of b) is similar.
We next prove an important inequality due to Doob.
Theorem 3.4 [Doob's inequality]. Let (X_t)_{t∈R_+} be a right-continuous non-negative submartingale or a right-continuous martingale with respect to the filtration (F_t)_{t∈R_+}.
a). Then for all t > 0 and c > 0,
P( sup_{s∈[0,t]} |X_s| ≥ c ) ≤ E(|X_t|)/c.
b). Assume as well that for all t ∈ R_+, X_t ∈ L^p, with p > 1 given. Then for all t > 0 and all c > 0,
P( sup_{s∈[0,t]} |X_s| ≥ c ) ≤ E(|X_t|^p)/c^p.
c). Under the hypotheses of b), we deduce that sup_{s∈[0,t]} |X_s| ∈ L^p and
‖ sup_{s∈[0,t]} |X_s| ‖_p ≤ C ‖X_t‖_p,
where C = p/(p − 1).
Proof. Point a) will be deduced from the following lemma.
Lemma 3.5. Let (Y_k)_{k∈N} be a submartingale with respect to the filtration (G_k)_{k∈N} on (Ω, F, P). Then for all m ≥ 1 and all c > 0,
P( max_{k≤m} Y_k ≥ c ) ≤ E( |Y_m| 1_{{max_{k≤m} Y_k ≥ c}} )/c ≤ E(|Y_m|)/c.
Proof. Let, for k ≥ 1,
A_k = {Y_0 < c} ∩ ... ∩ {Y_{k−1} < c} ∩ {Y_k ≥ c}, and A_0 = {Y_0 ≥ c}.
Let A = {max_{k≤m} Y_k ≥ c}. As A is the disjoint union of the A_k's, we get
c P(A) = c Σ_{k=0}^m P(A_k) ≤ Σ_{k=0}^m E(Y_k 1_{A_k}).

Fix k ≤ m. As A_k ∈ G_k,
E(Y_k 1_{A_k}) ≤ E(E(Y_m | G_k) 1_{A_k}) = E(E(Y_m 1_{A_k} | G_k)) ≤ E(E(|Y_m| 1_{A_k} | G_k)) = E(|Y_m| 1_{A_k}).
Hence
c P(A) ≤ Σ_{k=0}^m E(|Y_m| 1_{A_k}) = E(|Y_m| 1_A) ≤ E(|Y_m|).
We apply the lemma, for n ≥ 1, to the submartingale Y^{(n)}_k = |X_{k2^{−n}}| with G_k = F_{k2^{−n}}, k ∈ N.
– If t ∈ D, the set of dyadic numbers of R_+, we obtain, letting n go to +∞, that
P( sup_{s∈[0,t]∩D} |X_s| ≥ c ) ≤ E(|X_t|)/c.
But as (X_t)_{t∈R_+} is right-continuous, sup_{s∈[0,t]} |X_s| = sup_{s∈[0,t]∩D} |X_s|, so that point a) follows.
– If t ∉ D, we use a sequence (t_n)_{n≥1} in D such that t_n ↓ t, so that
sup_{s∈[0,t]} |X_s| = lim_n sup_{s∈[0,t_n]} |X_s|.
Letting n go to infinity in
P( sup_{s∈[0,t_n]} |X_s| ≥ c ) ≤ E(|X_{t_n}|)/c,
we obtain
P( sup_{s∈[0,t]} |X_s| ≥ c ) ≤ lim inf_n E(|X_{t_n}|)/c,
and in fact lim inf_n E(|X_{t_n}|) = E[|X_t|]. Indeed, we may assume without loss of generality that t_n ≤ t + 1. Then, for any ε > 0,
E(|X_{t_n}|) ≤ E[|X_t|] + ε + E[(|X_{t_n}| − |X_t|) 1_{{|X_{t_n}| ≥ |X_t| + ε}}],
whereas, as |X| is a submartingale and t_n ≤ t + 1,
E[(|X_{t_n}| − |X_t|) 1_{{|X_{t_n}| ≥ |X_t| + ε}}] ≤ E[(|X_{t+1}| − |X_t|) 1_{{|X_{t_n}| ≥ |X_t| + ε}}].
Since |X_{t+1}| − |X_t| ∈ L¹ and P(|X_{t_n}| ≥ |X_t| + ε) goes to zero as n goes to infinity by right-continuity, the conclusion follows.

Point b) is a direct application of point a): if X is a martingale or a non-negative submartingale, then in both cases (|X_t|^p)_{t∈R_+} is a non-negative submartingale, since by conditional Jensen's inequality, for s ≤ t,
|X_s|^p ≤ |E[X_t | F_s]|^p ≤ E[|X_t|^p | F_s],
so that b) follows by applying a) to (|X_t|^p)_{t∈R_+}.
To deduce c), observe that for any fixed k, with S = sup_{s∈[0,t]} |X_s| ∧ k,
(3.1) E[S^p] = E[ ∫_0^S p x^{p−1} dx ] = ∫_0^k p P(S ≥ x) x^{p−1} dx,
so that b) implies that for all p′ < p there exists a finite constant c such that
E[S^{p′}] ≤ c ‖X_t‖_p^{p′}.
Letting k go to infinity and invoking the monotone convergence theorem yields the estimate with p′ < p. To derive it as announced, we need to show the bound
(3.2) x P(S ≥ x) ≤ E[|X_t| 1_{{S ≥ x}}].
This inequality was proved in Lemma 3.5 in the discrete case. To show that it extends to the continuous setting, we can proceed by discrete approximation exactly as in the previous proof to deduce that (3.2) holds if t ∈ D, and then for all t by density: if t_n is a sequence of dyadic numbers decreasing to t, then for any ε > 0,
E[|X_{t_n}| 1_{{sup_{s∈[0,t_n]} |X_s| ∧ k ≥ x}}] ≤ E[(|X_t| + ε) 1_{{sup_{s∈[0,t_n]} |X_s| ∧ k ≥ x}}] + E[(|X_{t+1}| − |X_t|) 1_{{|X_{t_n}| ≥ |X_t| + ε}}].
As 1_{{sup_{s∈[0,t_n]} |X_s| ∧ k ≥ x}} and 1_{{|X_{t_n}| ≥ |X_t| + ε}} go to 1_{{sup_{s∈[0,t]} |X_s| ∧ k ≥ x}} and 0 respectively, while |X_t| and |X_{t+1}| − |X_t| are in L¹, we conclude that
lim inf_n E[|X_{t_n}| 1_{{sup_{s∈[0,t_n]} |X_s| ∧ k ≥ x}}] ≤ E[|X_t| 1_{{sup_{s∈[0,t]} |X_s| ∧ k ≥ x}}],
so that (3.2) extends to all t ∈ R_+. Plugging (3.2) into (3.1), we deduce
E[S^p] ≤ ∫_0^k p E[|X_t| 1_{{S ≥ x}}] x^{p−2} dx = (p/(p−1)) E[|X_t| S^{p−1}] ≤ (p/(p−1)) E[|X_t|^p]^{1/p} E[S^p]^{(p−1)/p}
by Hölder's inequality. Hence we conclude that
E[S^p] ≤ (p/(p−1))^p E[|X_t|^p].
Letting k go to infinity and applying the monotone convergence theorem concludes the argument.
Application of Theorem 3.4 a):
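Doob's L^p inequality (point c) can be observed numerically on the simplest discrete-time martingale, a symmetric random walk; with p = 2 the constant is (p/(p−1))^p = 4. A sketch (the step and path counts are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(4)
n_steps, n_paths, p = 200, 50_000, 2

# Simple symmetric random walk X_k = e_1 + ... + e_k: a martingale started at 0.
steps = rng.choice([-1.0, 1.0], size=(n_paths, n_steps))
X = np.cumsum(steps, axis=1)

S = np.abs(X).max(axis=1)      # running maximum max_{k<=n} |X_k|
lhs = (S**p).mean()            # estimates E[(max |X_k|)^p]
rhs = (p / (p - 1))**p * (np.abs(X[:, -1])**p).mean()   # 4 * E[|X_n|^p]
```

Here E[X_n²] = n exactly, and the empirical lhs typically sits well below the bound rhs, reflecting that the constant 4 is not attained by this walk.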

Proposition 3.6. Let (B_t)_{t∈R_+} be a real Brownian motion and set S_t = sup_{s∈[0,t]} B_s. Then for all a > 0,
P(S_t ≥ a t) ≤ exp(−a²t/2).
Proof. We may assume without loss of generality that the trajectories of B are continuous, by Corollary 1.7. Let us use the martingales (M^{(α)}_t)_{t∈R_+} given, for α > 0, by
M^{(α)}_t = exp(αB_t − (α²/2) t).
We have
exp(α S_t − (α²/2) t) = exp(α ( sup_{s∈[0,t]} B_s ) − (α²/2) t) ≤ sup_{s∈[0,t]} M^{(α)}_s.
As, for α > 0, x ↦ exp(αx) is increasing, we have
P(S_t ≥ at) = P( exp(αS_t − (α²/2) t) ≥ exp(αat − (α²/2) t) )
≤ P( sup_{s∈[0,t]} M^{(α)}_s ≥ exp(αat − (α²/2) t) )
≤ exp(−αat + (α²/2) t) E(M^{(α)}_t)   (by the first Doob inequality)
= exp(−αat + (α²/2) t) E(M^{(α)}_0) = exp(−αat + (α²/2) t).
But inf_{α>0}(−αat + (α²/2) t) = −a²t/2, so that the result follows.

4. Stopping time.

Let (F_t)_{t∈R_+} be a filtration on (Ω, F, P). A stopping time with respect to (F_t)_{t∈R_+} is a map T : Ω → [0, +∞] such that for all t ∈ R_+, {T ≤ t} ∈ F_t. We denote by T the family of stopping times. We set F_∞ = σ( ∪_{t∈R_+} F_t ).
Let T be a stopping time for (F_t)_{t∈R_+}. We call the σ-algebra of the events anterior to T, denoted F_T, the family of elements A in F_∞ such that for all t ∈ R_+, A ∩ {T ≤ t} ∈ F_t.
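Proposition 3.6 can be compared by simulation both with the empirical probability and with the exact value 2 P(B_t ≥ at) given by the reflection principle (a result not proved at this point of the notes). The discretization and the values a = 1.5, t = 1 below are arbitrary choices; note that a discrete-time maximum slightly underestimates the true supremum S_t:

```python
import numpy as np
from math import erfc

rng = np.random.default_rng(5)
n_paths, n_steps, t, a = 20_000, 500, 1.0, 1.5

dt = t / n_steps
# Discretized Brownian paths on [0, t].
B = np.cumsum(rng.normal(0.0, np.sqrt(dt), size=(n_paths, n_steps)), axis=1)
S_t = B.max(axis=1)

emp = (S_t >= a * t).mean()               # empirical P(S_t >= a t)
bound = np.exp(-(a**2) * t / 2.0)         # the proposition's bound exp(-a^2 t / 2)
exact = erfc(a * t / np.sqrt(2.0 * t))    # reflection principle: 2 P(B_t >= a t)
```

The empirical frequency should land near the exact value and safely below the exponential bound, which is not tight.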

We verify that F_T is indeed a σ-algebra.
Properties:
a). If T ≡ t is constant, T is a stopping time.
b). If T ∈ T and if S = T + t with t ∈ R_+, then S ∈ T.
c). If T ∈ T, then T is F_T-measurable.
d). If S, T ∈ T and if S ≤ T, then F_S ⊂ F_T.
e). If S, T ∈ T, then S ∧ T ∈ T and S ∨ T ∈ T.
Remark: We have the following result: T : Ω → [0, +∞] is an (F_{t+})_{t∈R_+}-stopping time iff for all t ∈ ]0, +∞[, {T < t} ∈ F_t.
Examples of stopping times: Let A ∈ B(R^d) and let (X_t)_{t∈R_+} be a d-dimensional stochastic process. We set
T_A(ω) = inf{t > 0 : X_t(ω) ∈ A}
(with inf ∅ = +∞). T_A is called the hitting time of A.
Proposition 4.1. Let A be open. If (X_t)_{t∈R_+} is right-continuous, then T_A is a stopping time for the filtration (F^0_{t+})_{t∈R_+}.
Proposition 4.2. Let A be closed. If (X_t)_{t∈R_+} is continuous, then the random variable D_A defined by D_A(ω) = inf{t ≥ 0 : X_t(ω) ∈ A} is a stopping time for (F^0_t)_{t∈R_+}. D_A is called the entry time in A.
The proofs of these propositions follow from writing, by right-continuity,
{T_A < t} = ∪_{s∈[0,t[∩Q} {X_s ∈ A} ∈ F^0_t,
whereas, by continuity,
{D_A ≤ t} = ∩_{n≥1} ∪_{s∈([0,t]∩Q)∪{t}} {d(X_s, A) < 1/n} ∈ F^0_t.
Let (X_t)_{t∈R_+} be a d-dimensional stochastic process on (Ω, F, P) and let (F_t)_{t∈R_+} be a filtration on (Ω, F, P). We say that (X_t)_{t∈R_+} is strongly adapted if for all T ∈ T, the map ω ↦ X_{T(ω)}(ω) 1_{{T(ω)<∞}} is F_T-measurable. If (X_t)_{t∈R_+} is strongly adapted, then (X_t)_{t∈R_+} is adapted. We next give conditions ensuring that (X_t)_{t∈R_+} is strongly adapted. We say that (X_t)_{t∈R_+} is progressively measurable if for all t > 0, the map (s, ω) ↦ X_s(ω) is measurable from ([0, t] × Ω, B([0, t]) ⊗ F_t) into (R^d, B(R^d)). If (X_t)_{t∈R_+} is progressively measurable, then (X_t)_{t∈R_+} is adapted.
Theorem 4.3. If (X_t)_{t∈R_+} is progressively measurable, then it is strongly adapted.
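Hitting times are easy to visualize on simulated paths: for a discretized Brownian motion and A = ]a, +∞[, the hitting time T_A is the first passage above the level a, and the identity of events {T_A ≤ t} = {sup_{s≤t} X_s ≥ a} holds on the discrete grid as well. A sketch with arbitrary level, horizon and grid (using ≥ a rather than > a, an immaterial change for Brownian motion):

```python
import numpy as np

rng = np.random.default_rng(6)
n_paths, n_steps, t, a = 20_000, 500, 1.0, 1.0

dt = t / n_steps
B = np.cumsum(rng.normal(0.0, np.sqrt(dt), size=(n_paths, n_steps)), axis=1)

hit = B >= a                                # grid points at or above the level a
# Index of the first passage; n_steps encodes "not hit by time t".
first_idx = np.where(hit.any(axis=1), hit.argmax(axis=1), n_steps)
T_a = (first_idx + 1) * dt                  # discretized hitting time

p_hit = (T_a <= t).mean()
p_sup = (B.max(axis=1) >= a).mean()         # {T_a <= t} = {sup B >= a} on the grid
```

The two probabilities coincide exactly by construction, and both approximate P(T_a ≤ t) for the continuous path up to a discretization bias.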


More information

(A n + B n + 1) A n + B n

(A n + B n + 1) A n + B n 344 Problem Hints and Solutions Solution for Problem 2.10. To calculate E(M n+1 F n ), first note that M n+1 is equal to (A n +1)/(A n +B n +1) with probability M n = A n /(A n +B n ) and M n+1 equals

More information

X n D X lim n F n (x) = F (x) for all x C F. lim n F n(u) = F (u) for all u C F. (2)

X n D X lim n F n (x) = F (x) for all x C F. lim n F n(u) = F (u) for all u C F. (2) 14:17 11/16/2 TOPIC. Convergence in distribution and related notions. This section studies the notion of the so-called convergence in distribution of real random variables. This is the kind of convergence

More information

In terms of measures: Exercise 1. Existence of a Gaussian process: Theorem 2. Remark 3.

In terms of measures: Exercise 1. Existence of a Gaussian process: Theorem 2. Remark 3. 1. GAUSSIAN PROCESSES A Gaussian process on a set T is a collection of random variables X =(X t ) t T on a common probability space such that for any n 1 and any t 1,...,t n T, the vector (X(t 1 ),...,X(t

More information

9 Brownian Motion: Construction

9 Brownian Motion: Construction 9 Brownian Motion: Construction 9.1 Definition and Heuristics The central limit theorem states that the standard Gaussian distribution arises as the weak limit of the rescaled partial sums S n / p n of

More information

Let (Ω, F) be a measureable space. A filtration in discrete time is a sequence of. F s F t

Let (Ω, F) be a measureable space. A filtration in discrete time is a sequence of. F s F t 2.2 Filtrations Let (Ω, F) be a measureable space. A filtration in discrete time is a sequence of σ algebras {F t } such that F t F and F t F t+1 for all t = 0, 1,.... In continuous time, the second condition

More information

1 Independent increments

1 Independent increments Tel Aviv University, 2008 Brownian motion 1 1 Independent increments 1a Three convolution semigroups........... 1 1b Independent increments.............. 2 1c Continuous time................... 3 1d Bad

More information

Math 6810 (Probability) Fall Lecture notes

Math 6810 (Probability) Fall Lecture notes Math 6810 (Probability) Fall 2012 Lecture notes Pieter Allaart University of North Texas September 23, 2012 2 Text: Introduction to Stochastic Calculus with Applications, by Fima C. Klebaner (3rd edition),

More information

Applications of Ito s Formula

Applications of Ito s Formula CHAPTER 4 Applications of Ito s Formula In this chapter, we discuss several basic theorems in stochastic analysis. Their proofs are good examples of applications of Itô s formula. 1. Lévy s martingale

More information

3 Integration and Expectation

3 Integration and Expectation 3 Integration and Expectation 3.1 Construction of the Lebesgue Integral Let (, F, µ) be a measure space (not necessarily a probability space). Our objective will be to define the Lebesgue integral R fdµ

More information

Chapter 1. Measure Spaces. 1.1 Algebras and σ algebras of sets Notation and preliminaries

Chapter 1. Measure Spaces. 1.1 Algebras and σ algebras of sets Notation and preliminaries Chapter 1 Measure Spaces 1.1 Algebras and σ algebras of sets 1.1.1 Notation and preliminaries We shall denote by X a nonempty set, by P(X) the set of all parts (i.e., subsets) of X, and by the empty set.

More information

STAT 7032 Probability Spring Wlodek Bryc

STAT 7032 Probability Spring Wlodek Bryc STAT 7032 Probability Spring 2018 Wlodek Bryc Created: Friday, Jan 2, 2014 Revised for Spring 2018 Printed: January 9, 2018 File: Grad-Prob-2018.TEX Department of Mathematical Sciences, University of Cincinnati,

More information

STOCHASTIC CALCULUS JASON MILLER AND VITTORIA SILVESTRI

STOCHASTIC CALCULUS JASON MILLER AND VITTORIA SILVESTRI STOCHASTIC CALCULUS JASON MILLER AND VITTORIA SILVESTRI Contents Preface 1 1. Introduction 1 2. Preliminaries 4 3. Local martingales 1 4. The stochastic integral 16 5. Stochastic calculus 36 6. Applications

More information

P (A G) dp G P (A G)

P (A G) dp G P (A G) First homework assignment. Due at 12:15 on 22 September 2016. Homework 1. We roll two dices. X is the result of one of them and Z the sum of the results. Find E [X Z. Homework 2. Let X be a r.v.. Assume

More information

PROBABILITY THEORY II

PROBABILITY THEORY II Ruprecht-Karls-Universität Heidelberg Institut für Angewandte Mathematik Prof. Dr. Jan JOHANNES Outline of the lecture course PROBABILITY THEORY II Summer semester 2016 Preliminary version: April 21, 2016

More information

17. Convergence of Random Variables

17. Convergence of Random Variables 7. Convergence of Random Variables In elementary mathematics courses (such as Calculus) one speaks of the convergence of functions: f n : R R, then lim f n = f if lim f n (x) = f(x) for all x in R. This

More information

MATH 6605: SUMMARY LECTURE NOTES

MATH 6605: SUMMARY LECTURE NOTES MATH 6605: SUMMARY LECTURE NOTES These notes summarize the lectures on weak convergence of stochastic processes. If you see any typos, please let me know. 1. Construction of Stochastic rocesses A stochastic

More information

Empirical Processes: General Weak Convergence Theory

Empirical Processes: General Weak Convergence Theory Empirical Processes: General Weak Convergence Theory Moulinath Banerjee May 18, 2010 1 Extended Weak Convergence The lack of measurability of the empirical process with respect to the sigma-field generated

More information

Verona Course April Lecture 1. Review of probability

Verona Course April Lecture 1. Review of probability Verona Course April 215. Lecture 1. Review of probability Viorel Barbu Al.I. Cuza University of Iaşi and the Romanian Academy A probability space is a triple (Ω, F, P) where Ω is an abstract set, F is

More information

LECTURE 2: LOCAL TIME FOR BROWNIAN MOTION

LECTURE 2: LOCAL TIME FOR BROWNIAN MOTION LECTURE 2: LOCAL TIME FOR BROWNIAN MOTION We will define local time for one-dimensional Brownian motion, and deduce some of its properties. We will then use the generalized Ray-Knight theorem proved in

More information

n E(X t T n = lim X s Tn = X s

n E(X t T n = lim X s Tn = X s Stochastic Calculus Example sheet - Lent 15 Michael Tehranchi Problem 1. Let X be a local martingale. Prove that X is a uniformly integrable martingale if and only X is of class D. Solution 1. If If direction:

More information

1. Probability Measure and Integration Theory in a Nutshell

1. Probability Measure and Integration Theory in a Nutshell 1. Probability Measure and Integration Theory in a Nutshell 1.1. Measurable Space and Measurable Functions Definition 1.1. A measurable space is a tuple (Ω, F) where Ω is a set and F a σ-algebra on Ω,

More information

Independent random variables

Independent random variables CHAPTER 2 Independent random variables 2.1. Product measures Definition 2.1. Let µ i be measures on (Ω i,f i ), 1 i n. Let F F 1... F n be the sigma algebra of subsets of Ω : Ω 1... Ω n generated by all

More information

Figure 10.1: Recording when the event E occurs

Figure 10.1: Recording when the event E occurs 10 Poisson Processes Let T R be an interval. A family of random variables {X(t) ; t T} is called a continuous time stochastic process. We often consider T = [0, 1] and T = [0, ). As X(t) is a random variable

More information

Stochastic Processes

Stochastic Processes Introduction and Techniques Lecture 4 in Financial Mathematics UiO-STK4510 Autumn 2015 Teacher: S. Ortiz-Latorre Stochastic Processes 1 Stochastic Processes De nition 1 Let (E; E) be a measurable space

More information

Probability and Measure

Probability and Measure Chapter 4 Probability and Measure 4.1 Introduction In this chapter we will examine probability theory from the measure theoretic perspective. The realisation that measure theory is the foundation of probability

More information

Poisson random measure: motivation

Poisson random measure: motivation : motivation The Lévy measure provides the expected number of jumps by time unit, i.e. in a time interval of the form: [t, t + 1], and of a certain size Example: ν([1, )) is the expected number of jumps

More information

Stochastic integral. Introduction. Ito integral. References. Appendices Stochastic Calculus I. Geneviève Gauthier.

Stochastic integral. Introduction. Ito integral. References. Appendices Stochastic Calculus I. Geneviève Gauthier. Ito 8-646-8 Calculus I Geneviève Gauthier HEC Montréal Riemann Ito The Ito The theories of stochastic and stochastic di erential equations have initially been developed by Kiyosi Ito around 194 (one of

More information

Some basic elements of Probability Theory

Some basic elements of Probability Theory Chapter I Some basic elements of Probability Theory 1 Terminology (and elementary observations Probability theory and the material covered in a basic Real Variables course have much in common. However

More information

Brownian Motion and Stochastic Calculus

Brownian Motion and Stochastic Calculus ETHZ, Spring 17 D-MATH Prof Dr Martin Larsson Coordinator A Sepúlveda Brownian Motion and Stochastic Calculus Exercise sheet 6 Please hand in your solutions during exercise class or in your assistant s

More information

Lecture 6 Basic Probability

Lecture 6 Basic Probability Lecture 6: Basic Probability 1 of 17 Course: Theory of Probability I Term: Fall 2013 Instructor: Gordan Zitkovic Lecture 6 Basic Probability Probability spaces A mathematical setup behind a probabilistic

More information

Markov processes Course note 2. Martingale problems, recurrence properties of discrete time chains.

Markov processes Course note 2. Martingale problems, recurrence properties of discrete time chains. Institute for Applied Mathematics WS17/18 Massimiliano Gubinelli Markov processes Course note 2. Martingale problems, recurrence properties of discrete time chains. [version 1, 2017.11.1] We introduce

More information

4th Preparation Sheet - Solutions

4th Preparation Sheet - Solutions Prof. Dr. Rainer Dahlhaus Probability Theory Summer term 017 4th Preparation Sheet - Solutions Remark: Throughout the exercise sheet we use the two equivalent definitions of separability of a metric space

More information

MATHS 730 FC Lecture Notes March 5, Introduction

MATHS 730 FC Lecture Notes March 5, Introduction 1 INTRODUCTION MATHS 730 FC Lecture Notes March 5, 2014 1 Introduction Definition. If A, B are sets and there exists a bijection A B, they have the same cardinality, which we write as A, #A. If there exists

More information

Notes on Measure, Probability and Stochastic Processes. João Lopes Dias

Notes on Measure, Probability and Stochastic Processes. João Lopes Dias Notes on Measure, Probability and Stochastic Processes João Lopes Dias Departamento de Matemática, ISEG, Universidade de Lisboa, Rua do Quelhas 6, 1200-781 Lisboa, Portugal E-mail address: jldias@iseg.ulisboa.pt

More information

Continuous martingales and stochastic calculus

Continuous martingales and stochastic calculus Continuous martingales and stochastic calculus Alison Etheridge January 8, 218 Contents 1 Introduction 3 2 An overview of Gaussian variables and processes 5 2.1 Gaussian variables.........................

More information

Building Infinite Processes from Finite-Dimensional Distributions

Building Infinite Processes from Finite-Dimensional Distributions Chapter 2 Building Infinite Processes from Finite-Dimensional Distributions Section 2.1 introduces the finite-dimensional distributions of a stochastic process, and shows how they determine its infinite-dimensional

More information

A Short Introduction to Diffusion Processes and Ito Calculus

A Short Introduction to Diffusion Processes and Ito Calculus A Short Introduction to Diffusion Processes and Ito Calculus Cédric Archambeau University College, London Center for Computational Statistics and Machine Learning c.archambeau@cs.ucl.ac.uk January 24,

More information

Math212a1413 The Lebesgue integral.

Math212a1413 The Lebesgue integral. Math212a1413 The Lebesgue integral. October 28, 2014 Simple functions. In what follows, (X, F, m) is a space with a σ-field of sets, and m a measure on F. The purpose of today s lecture is to develop the

More information

Brownian Motion. Chapter Definition of Brownian motion

Brownian Motion. Chapter Definition of Brownian motion Chapter 5 Brownian Motion Brownian motion originated as a model proposed by Robert Brown in 1828 for the phenomenon of continual swarming motion of pollen grains suspended in water. In 1900, Bachelier

More information

A Concise Course on Stochastic Partial Differential Equations

A Concise Course on Stochastic Partial Differential Equations A Concise Course on Stochastic Partial Differential Equations Michael Röckner Reference: C. Prevot, M. Röckner: Springer LN in Math. 1905, Berlin (2007) And see the references therein for the original

More information

An almost sure invariance principle for additive functionals of Markov chains

An almost sure invariance principle for additive functionals of Markov chains Statistics and Probability Letters 78 2008 854 860 www.elsevier.com/locate/stapro An almost sure invariance principle for additive functionals of Markov chains F. Rassoul-Agha a, T. Seppäläinen b, a Department

More information

At t = T the investors learn the true state.

At t = T the investors learn the true state. 1. Martingales A discrete time introduction Model of the market with the following submodels (i) T + 1 trading dates, t =0, 1,...,T. Mathematics of Financial Derivatives II Seppo Pynnönen Professor of

More information

18.175: Lecture 2 Extension theorems, random variables, distributions

18.175: Lecture 2 Extension theorems, random variables, distributions 18.175: Lecture 2 Extension theorems, random variables, distributions Scott Sheffield MIT Outline Extension theorems Characterizing measures on R d Random variables Outline Extension theorems Characterizing

More information

Stochastic Differential Equations.

Stochastic Differential Equations. Chapter 3 Stochastic Differential Equations. 3.1 Existence and Uniqueness. One of the ways of constructing a Diffusion process is to solve the stochastic differential equation dx(t) = σ(t, x(t)) dβ(t)

More information

MA8109 Stochastic Processes in Systems Theory Autumn 2013

MA8109 Stochastic Processes in Systems Theory Autumn 2013 Norwegian University of Science and Technology Department of Mathematical Sciences MA819 Stochastic Processes in Systems Theory Autumn 213 1 MA819 Exam 23, problem 3b This is a linear equation of the form

More information

Jointly measurable and progressively measurable stochastic processes

Jointly measurable and progressively measurable stochastic processes Jointly measurable and progressively measurable stochastic processes Jordan Bell jordan.bell@gmail.com Department of Mathematics, University of Toronto June 18, 2015 1 Jointly measurable stochastic processes

More information

I forgot to mention last time: in the Ito formula for two standard processes, putting

I forgot to mention last time: in the Ito formula for two standard processes, putting I forgot to mention last time: in the Ito formula for two standard processes, putting dx t = a t dt + b t db t dy t = α t dt + β t db t, and taking f(x, y = xy, one has f x = y, f y = x, and f xx = f yy

More information

Lecture 22 Girsanov s Theorem

Lecture 22 Girsanov s Theorem Lecture 22: Girsanov s Theorem of 8 Course: Theory of Probability II Term: Spring 25 Instructor: Gordan Zitkovic Lecture 22 Girsanov s Theorem An example Consider a finite Gaussian random walk X n = n

More information

Multiple points of the Brownian sheet in critical dimensions

Multiple points of the Brownian sheet in critical dimensions Multiple points of the Brownian sheet in critical dimensions Robert C. Dalang Ecole Polytechnique Fédérale de Lausanne Based on joint work with: Carl Mueller Multiple points of the Brownian sheet in critical

More information

Stochastic Analysis. King s College London Version 1.4 October Markus Riedle

Stochastic Analysis. King s College London Version 1.4 October Markus Riedle Stochastic Analysis King s College London Version 1.4 October 215 Markus Riedle Preface These notes are for a course on Stochastic Analysis at King s College London. Given the limited time and diverse

More information

Properties of an infinite dimensional EDS system : the Muller s ratchet

Properties of an infinite dimensional EDS system : the Muller s ratchet Properties of an infinite dimensional EDS system : the Muller s ratchet LATP June 5, 2011 A ratchet source : wikipedia Plan 1 Introduction : The model of Haigh 2 3 Hypothesis (Biological) : The population

More information

Exercises Measure Theoretic Probability

Exercises Measure Theoretic Probability Exercises Measure Theoretic Probability Chapter 1 1. Prove the folloing statements. (a) The intersection of an arbitrary family of d-systems is again a d- system. (b) The intersection of an arbitrary family

More information

Metric Spaces and Topology

Metric Spaces and Topology Chapter 2 Metric Spaces and Topology From an engineering perspective, the most important way to construct a topology on a set is to define the topology in terms of a metric on the set. This approach underlies

More information