Lecture 12

1 Brownian motion: the Markov property

Let $C := C([0,\infty), \mathbb{R})$ be the space of continuous functions from $[0,\infty)$ to $\mathbb{R}$, in which a Brownian motion $(B_t)_{t\ge 0}$ almost surely takes its values. Let $\mathcal{F}$ be the Borel $\sigma$-algebra on $C$ generated by the topology of uniform convergence on compact sets. On the measurable space $(C, \mathcal{F})$, we can impose a one-parameter family of probability measures $(P_x)_{x\in\mathbb{R}}$ (with expectations $(E_x)_{x\in\mathbb{R}}$), which are the laws of a Brownian motion starting with $B_0 = x$ almost surely.

A natural filtration on $C$ is to let $\mathcal{F}^o_t = \sigma(\omega_s : \omega\in C,\ 0\le s\le t)$, where $\omega_s : C\to\mathbb{R}$ is the coordinate map at time $s$. However, it turns out to be more convenient to define the right-continuous filtration

    \mathcal{F}_t := \bigcap_{s>t} \mathcal{F}^o_s,    (1.1)

which allows an infinitesimal peek into the future. Note that $\mathcal{F}_t$ is right-continuous, since $\mathcal{F}_t = \bigcap_{s>t} \mathcal{F}_s$. One important advantage of right-continuity is that a time of the form $\tau = \inf\{t\ge 0 : B_t\in O\}$ for an open set $O\subset\mathbb{R}$ is a stopping time w.r.t. $\mathcal{F}_t$, but not w.r.t. $\mathcal{F}^o_t$. We shall see that for the Wiener measure (i.e., the law of a standard Brownian motion), $\mathcal{F}_t$ and $\mathcal{F}^o_t$ differ only by sets of measure zero. Instead of the filtration $\mathcal{F}_t$, quite often people also use the so-called augmented filtration for Brownian motion, which is obtained by completing $\mathcal{F}^o_t$ with the sets of measure zero under the Wiener measure.

We now show that Brownian motion satisfies the Markov property, which intuitively means that conditional on $(B_t)_{0\le t\le s}$, the law of $(B_{s+t})_{t\ge 0}$ is the same as that of a Brownian motion starting at $B_s$ at time $0$. A slight complication arises because, with respect to the filtration $\mathcal{F}_t$, the above statement should really be modified to say that we condition on $(B_t)_{0\le t\le s+}$. However, the theorem below will imply that this makes no difference.

Theorem 1.1 [Markov property] If $s\ge 0$ and $f : C\to\mathbb{R}$ is bounded and measurable, then for all $x\in\mathbb{R}$,

    E_x\big[ f((B_{s+t})_{t\ge 0}) \,\big|\, \mathcal{F}_s \big] = E_{B_s}\big[ f((\tilde B_t)_{t\ge 0}) \big] \quad a.s.,    (1.2)

where $B$ is a Brownian motion starting at $x$ with $E_x$ denoting its expectation, and given $B_s$, $\tilde B$ is an independent Brownian motion starting at $B_s$ at time $0$.

Proof. We sketch the proof. To verify (1.2), it suffices to show that

    E_x\big[ 1_A\, f((B_{s+t})_{t\ge 0}) \big] = E_x\big[ 1_A\, E_{B_s}[ f((\tilde B_t)_{t\ge 0}) ] \big] \qquad \text{for all } A\in\mathcal{F}_s.    (1.3)

For $\omega\in C$, we first fix $f$ to be of the form

    f((\omega_t)_{t\ge 0}) = \prod_{i=1}^n f_i(\omega_{t_i}),    (1.4)

where $0 < t_1 < \cdots < t_n$ and $f_i : \mathbb{R}\to\mathbb{R}$ are bounded and measurable. Fix $h\in(0, t_1)$. Then for $A\in\mathcal{F}^o_{s+h}$ of the form $A = \{\omega\in C : \omega_{s_i}\in A_i,\ 1\le i\le m\}$, where $0 < s_1 < \cdots < s_m \le s+h$ and the $A_i$ are Borel sets, we can easily verify that

    E_x\Big[ 1_A\, f((B_{s+t})_{t\ge 0}) \Big] = E_x\Big[ 1_A \prod_{i=1}^n f_i(B_{s+t_i}) \Big] = E_x\Big[ 1_A\, E_{B_{s+h}}\Big[ \prod_{i=1}^n f_i(\tilde B_{t_i - h}) \Big] \Big].    (1.5)

By the $\pi$-$\lambda$ theorem, (1.5) holds for all $A\in\mathcal{F}^o_{s+h} \supset \mathcal{F}_s$. Given any $A\in\mathcal{F}_s$, we can let $h\downarrow 0$ in (1.5). By writing $E_{B_s}$ and $E_{B_{s+h}}$ explicitly as integration w.r.t. Gaussian densities, and using the fact that a.s. $\lim_{h\downarrow 0} B_{s+h} = B_s$, we can then apply the dominated convergence theorem to deduce that

    \lim_{h\downarrow 0} E_{B_{s+h}}\Big[ \prod_{i=1}^n f_i(\tilde B_{t_i - h}) \Big] = E_{B_s}\Big[ \prod_{i=1}^n f_i(\tilde B_{t_i}) \Big].

Therefore (1.2) holds for all $f$ of the form (1.4). We can then use the monotone class theorem to deduce (1.2) for all bounded measurable $f$.

One consequence of the Markov property of $(B_t)_{t\ge 0}$ w.r.t. the filtration $\mathcal{F}_t$ is the following.

Theorem 1.2 For any bounded measurable function $f : C\to\mathbb{R}$ and for all $s\ge 0$ and $x\in\mathbb{R}$,

    E_x\big[ f((B_t)_{t\ge 0}) \,\big|\, \mathcal{F}_s \big] = E_x\big[ f((B_t)_{t\ge 0}) \,\big|\, \mathcal{F}^o_s \big].    (1.6)

Proof. By the monotone class theorem, it suffices to consider $f$ of the form (1.4). Separating the indices $t_i$ into $t_i\le s$ and $t_i > s$, we can write $f((B_t)_{t\ge 0}) = g_1((B_t)_{0\le t\le s})\, g_2((B_t)_{t>s})$. Then

    E_x\big[ f \,\big|\, \mathcal{F}_s \big] = g_1\big( (B_t)_{0\le t\le s} \big)\, E_x\big[ g_2 \,\big|\, \mathcal{F}_s \big].

By the Markov property (1.2), $E_x[ g_2 \,|\, \mathcal{F}_s ] = E_{B_s}[ g_2((\tilde B_t)_{t\ge 0}) ] \in \mathcal{F}^o_s$. Since $\mathcal{F}^o_s \subset \mathcal{F}_s$, this implies that $E_x[ g_2 \,|\, \mathcal{F}_s ] = E_x[ g_2 \,|\, \mathcal{F}^o_s ]$, and hence $E_x[ f \,|\, \mathcal{F}_s ] = E_x[ f \,|\, \mathcal{F}^o_s ]$.

Setting $f = 1_A$ for $A\in\mathcal{F}_s$ in Theorem 1.2 shows that $\mathcal{F}_s$ and $\mathcal{F}^o_s$ are equivalent up to sets of measure zero. Furthermore, if we set $s = 0$ and let $f = 1_A$ for $A\in\mathcal{F}_0$, then we obtain $1_A = P_x(A)$ a.s., so that $P_x(A)\in\{0,1\}$ for all $A\in\mathcal{F}_0$. This is known as Blumenthal's 0-1 law.

Theorem 1.3 [Blumenthal's 0-1 law] For any $x\in\mathbb{R}$, $P_x(A)\in\{0,1\}$ for all $A\in\mathcal{F}_0$.

The $\sigma$-field $\mathcal{F}_0$ is called the germ field, which is trivial by Blumenthal's 0-1 law. We remark that Blumenthal's 0-1 law is valid for more general Markov processes under suitable continuity assumptions. Here is an interesting application for Brownian motion.

Theorem 1.4 Let $\tau = \inf\{t > 0 : B_t > 0\}$. Then $P_0(\tau = 0) = 1$.

Proof. Note that $\{\tau = 0\}\in\mathcal{F}_0$. Therefore, by Blumenthal's 0-1 law, $P_0(\tau = 0)\in\{0,1\}$. On the other hand,

    P_0(\tau = 0) = \lim_{t\downarrow 0} P_0(\tau\le t) \ge \lim_{t\downarrow 0} P_0(B_t > 0) = \tfrac12.

Therefore we must have $P_0(\tau = 0) = 1$.

Theorem 1.4 shows that almost surely there is an infinite sequence of times $t_n\downarrow 0$ with $B_{t_n} > 0$, and by symmetry there also exists an infinite sequence of times $s_n\downarrow 0$ with $B_{s_n} < 0$. Thus, by the a.s. continuity of Brownian sample paths, $(B_t)_{t\ge 0}$ crosses level $0$ infinitely often in any neighborhood of $t = 0$. This is consistent with what we know from the law of the iterated logarithm for Brownian motion near $t = 0$.
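Remark. The following minimal simulation sketch (Python/numpy; not part of the original argument, and the function name, grid sizes and sample sizes are ad hoc choices) illustrates Theorem 1.4 numerically: it estimates the probability that a discretized Brownian path started at $0$ takes a positive value somewhere in $(0, \epsilon]$.

    import numpy as np

    rng = np.random.default_rng(0)

    def prob_positive_by(eps, n_steps, n_paths=20_000):
        """Estimate P( B takes a positive value at some grid point in (0, eps] ),
        for B_0 = 0, using n_steps equally spaced grid points."""
        dt = eps / n_steps
        b = np.zeros(n_paths)                 # current value of each path
        hit = np.zeros(n_paths, dtype=bool)   # has the path been > 0 so far?
        for _ in range(n_steps):
            b += rng.normal(0.0, np.sqrt(dt), size=n_paths)
            hit |= (b > 0)
        return hit.mean()

    eps = 1e-3
    for n_steps in (10, 100, 1_000, 10_000):
        p = prob_positive_by(eps, n_steps)
        print(f"n_steps = {n_steps:6d}   P(B > 0 somewhere in (0, {eps}]) ~ {p:.3f}")
    # The estimates increase towards 1 as the grid is refined (a finite grid
    # always misses some of the very early excursions), consistent with
    # P_0(tau = 0) = 1; by symmetry the same holds for negative values, so the
    # path changes sign in (0, eps] with probability close to 1 -- and this for
    # every eps > 0, since by Brownian scaling the estimate depends only on the
    # number of grid points.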

2 Brownian motion: the strong Markov property

The strong Markov property for Brownian motion basically says that, conditioned on $(B_s)_{0\le s\le\tau}$ for any stopping time $\tau$, $(B_{\tau+t})_{t\ge 0}$ is distributed as a Brownian motion starting from $B_\tau$ at $t = 0$. First let us define stopping times.

Definition 2.1 [Stopping times] A random variable $\tau : C([0,\infty), \mathbb{R}) \to [0,\infty]$ is called a stopping time if for all $t\ge 0$, $\{\omega\in C : \tau(\omega)\le t\}\in\mathcal{F}_t$. The corresponding stopped $\sigma$-field $\mathcal{F}_\tau$ is defined as $\{A\in\mathcal{F} : A\cap\{\tau\le t\}\in\mathcal{F}_t \ \forall\, t\ge 0\}$.

Note that because of the right continuity of $(\mathcal{F}_t)_{t\ge 0}$, $\{\tau\le t\}\in\mathcal{F}_t$ for all $t\ge 0$ if and only if $\{\tau < t\}\in\mathcal{F}_t$ for all $t\ge 0$.

Example 2.2 [Examples of stopping times]

(1) If $O\subset\mathbb{R}$ is an open set, then $\tau_O = \inf\{t\ge 0 : B_t\in O\}$ is a stopping time.

(2) If $K\subset\mathbb{R}$ is a closed set, then $\tau_K = \inf\{t\ge 0 : B_t\in K\}$ is a stopping time.

(3) If $T_n$ is a sequence of stopping times and $T_n\uparrow T$ (resp. $T_n\downarrow T$), then $T$ is a stopping time.

(4) If $S$ and $T$ are stopping times, then $S\wedge T$ and $S\vee T$ are stopping times.

We leave the verification of these assertions as an exercise. We now state formally the strong Markov property for Brownian motion.

Theorem 2.3 [Strong Markov property] Let $f_s(\omega) : [0,\infty)\times C\to\mathbb{R}$ be jointly measurable in $s\in[0,\infty)$ and $\omega\in C$. If $\tau$ is a stopping time, then for all $x\in\mathbb{R}$,

    E_x\big[ f_\tau((B_{\tau+t})_{t\ge 0}) \,\big|\, \mathcal{F}_\tau \big] = E_{B_\tau}\big[ f_\tau((\tilde B_t)_{t\ge 0}) \big] \quad a.s.,    (2.7)

where $B$ is a Brownian motion starting at $x$, and given $B_\tau$, $\tilde B$ is an independent Brownian motion starting at $B_\tau$.

The strong Markov property can be proved by first proving the statement for stopping times which take values in a countable discrete set, which follows from the Markov property. General stopping times can then be approximated by discrete stopping times. The only thing that we need to carry out this approximation is the continuity of the Brownian motion transition kernel in both space and time, which is known as the Feller property for Markov processes. See [1, Sec. 7.3] for a detailed proof.
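Remark. As a numerical illustration of (2.7) (a sketch, not part of the original notes): at the hitting time $\tau = \inf\{t\ge 0 : B_t\ge a\}$, the post-$\tau$ increments of a simulated discretized path should again look like a Brownian motion started afresh, independently of the past. The level $a$, the horizon $T$ and the step size below are ad hoc choices; paths that do not hit $a$ early enough are discarded, which does not bias the post-$\tau$ law since $\{\tau + t \le T\}\in\mathcal{F}_\tau$.

    import numpy as np

    rng = np.random.default_rng(1)

    # After the hitting time tau = inf{ t : B_t >= a }, the increments
    # B_{tau+t} - B_tau of the discretized path should be ~ N(0, t) and
    # symmetric about 0, independently of how the level a was reached.
    a, t_after, T, dt = 1.0, 0.25, 4.0, 0.01
    n_paths, n_steps, m = 20_000, int(T / dt), int(t_after / dt)

    incr = rng.normal(0.0, np.sqrt(dt), size=(n_paths, n_steps))
    paths = np.cumsum(incr, axis=1)

    hit = np.argmax(paths >= a, axis=1)                  # first index with B >= a (0 if never)
    rows = np.arange(n_paths)
    ok = (paths[rows, hit] >= a) & (hit + m < n_steps)   # really hit a, with room for t_after more
    post = paths[rows[ok], hit[ok] + m] - paths[rows[ok], hit[ok]]

    print("paths kept                :", ok.sum())
    print("mean of B_{tau+t} - B_tau :", round(post.mean(), 4), " (should be ~ 0)")
    print("var  of B_{tau+t} - B_tau :", round(post.var(), 4), f" (should be ~ t = {t_after})")
    print("P(B_{tau+t} > B_tau)      :", round(np.mean(post > 0), 4), " (should be ~ 1/2)")

Comparing against $B_\tau$ rather than $a$ avoids the small discretization overshoot at the hitting step; the last probability being close to $1/2$ is exactly the ingredient used in the proof of the reflection principle below.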

We now apply the strong Markov property to deduce some interesting properties for Brownian motion.

Theorem 2.4 [Zeros of Brownian motion] Let $(B_t)_{t\ge 0}$ be a standard Brownian motion with $B_0 = x\in\mathbb{R}$. Let $Z = \{t\ge 0 : B_t = 0\}$ be the zero set of $B$. Then almost surely $Z$ is a perfect set, i.e., $Z$ is closed and every point in $Z$ is an accumulation point. Furthermore, almost surely $Z$ has Lebesgue measure $0$.

Proof. Since $B_t$ is almost surely continuous in $t$, $Z$ is a closed set. If, with positive probability, $Z$ contains an isolated point $t_\omega$, so that $B_s\ne 0$ for all $s\in(t_\omega-\epsilon_\omega, t_\omega+\epsilon_\omega)\setminus\{t_\omega\}$ for some $\epsilon_\omega$ depending on $B$, then we can find $a, \delta\in\mathbb{Q}$ with $a > \delta$ such that, with positive probability, $B$ has a unique zero $t_\omega\in(a-\delta, a+\delta)$. Let $\tau := \inf\{t\ge a-\delta : B_t = 0\}$, which is a stopping time. Since $P_x(B_{a-\delta} = 0) = 0$, we have $P_x(\tau = a-\delta) = 0$. Therefore, by assumption, with positive probability $\tau = t_\omega$, and it is an isolated zero of $B$ in $(a-\delta, a+\delta)$. However, this is not possible by the strong Markov property, which implies that conditional on $(B_s)_{0\le s\le\tau}$, $(B_{\tau+s})_{s\ge 0}$ is distributed as a Brownian motion $\tilde B$ starting at $0$, and by Blumenthal's 0-1 law, $t = 0$ is a.s. an accumulation point of the zeros of $(\tilde B_t)_{t\ge 0}$.

The fact that $Z$ almost surely has Lebesgue measure zero follows from Fubini's theorem, since

    E\Big[ \int_0^T 1_{\{B_t = 0\}}\, dt \Big] = \int_0^T P(B_t = 0)\, dt = 0 \qquad \text{for any } T > 0.

Theorem 2.5 [Reflection principle] Let $(B_t)_{t\ge 0}$ be a standard Brownian motion with $B_0 = 0$. Let $M_t := \sup_{0\le s\le t} B_s$ be the running maximum of $B$. Then for any $a > 0$ and $t > 0$,

    P(M_t\ge a) = 2 P(B_t\ge a) = P(|B_t|\ge a).    (2.8)

Note that with $B_0 = 0$ and $a > 0$, $\{M_t\ge a\} = \{\tau_a\le t\}$, where $\tau_a := \inf\{t\ge 0 : B_t = a\}$.

Proof. Since $\tau_a$ is a stopping time, so is $\tau_a\wedge t$, and hence by the strong Markov property,

    E\big[ 1_{\{B_t\ge a\}} \,\big|\, \mathcal{F}_{t\wedge\tau_a} \big] = 1_{\{\tau_a\le t\}}\, P_a\big( \tilde B_{t-\tau_a}\ge a \big) = \tfrac12\, 1_{\{\tau_a\le t\}},

where $\tilde B$ is an independent Brownian motion starting from $a$. Taking expectations on both sides then yields (2.8).

3 Brownian motion as a martingale

Most of the results for discrete time martingales, such as Doob's inequality, the upcrossing inequality, the martingale convergence theorems and the optional stopping theorem, have analogues for continuous time martingales, provided we assume that the continuous time martingale has sample paths which are right continuous with left limits, known as càdlàg paths after the French abbreviation. In particular, the optional stopping theorem states the following.

Theorem 3.1 [Optional stopping theorem] Let $(X_t)_{t\ge 0}$ be a continuous time martingale with càdlàg sample paths, adapted to a right-continuous filtration $(\mathcal{F}_t)_{t\ge 0}$, i.e., for any $0\le s\le t<\infty$, $E[X_t \,|\, \mathcal{F}_s] = X_s$ a.s. Then for any two stopping times $0\le\sigma\le\tau$, if $(X_{t\wedge\tau})_{t\ge 0}$ is uniformly integrable, then

    E\big[ X_\tau \,\big|\, \mathcal{F}_\sigma \big] = X_\sigma \quad a.s.    (3.9)

In particular, if $\tau$ is a bounded stopping time, then $(X_{t\wedge\tau})_{t\ge 0}$ is uniformly integrable.

We collect here some of the most common functionals of Brownian motion which are martingales.

Theorem 3.2 [Martingales for Brownian motion] The following functionals of $(B_t)_{t\ge 0}$ are martingales w.r.t. $(\mathcal{F}_t)_{t\ge 0}$:

    B_t, \qquad B_t^2 - t, \qquad B_t^3 - 3tB_t, \qquad e^{\theta B_t - \theta^2 t/2}.
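Remark. The following minimal Monte Carlo sketch (not part of the original notes; the value of $\theta$, the time points and the sample size are arbitrary choices) checks a necessary consequence of Theorem 3.2: each listed functional has constant expectation in $t$, equal to its value $0$, $0$, $0$ or $1$ at $t = 0$, since $B_0 = 0$.

    import numpy as np

    rng = np.random.default_rng(2)

    # Sanity check for Theorem 3.2: each listed functional of (B_t) should have
    # constant expectation in t (a necessary, not sufficient, condition for
    # being a martingale).  With B_0 = 0 the expectations stay at 0, 0, 0 and 1.
    theta, n_paths = 0.7, 400_000
    times = np.array([0.5, 1.5, 3.0])
    incr = rng.normal(0.0, np.sqrt(np.diff(times, prepend=0.0)),
                      size=(n_paths, times.size))
    B = np.cumsum(incr, axis=1)              # B at the chosen times

    for j, t in enumerate(times):
        Bt = B[:, j]
        print(f"t = {t:3.1f}  E[B_t] = {Bt.mean():+.3f}  "
              f"E[B_t^2 - t] = {(Bt**2 - t).mean():+.3f}  "
              f"E[B_t^3 - 3 t B_t] = {(Bt**3 - 3*t*Bt).mean():+.3f}  "
              f"E[exp(theta B_t - theta^2 t / 2)] = "
              f"{np.exp(theta*Bt - theta**2*t/2).mean():.3f}")

Constancy of the expectations is of course only a necessary condition; the martingale property itself is the conditional statement $E[X_t \,|\, \mathcal{F}_s] = X_s$.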

Remark. Theorem 3.2 can be easily verified using the independent Gaussian increments of $(B_t)_{t\ge 0}$. In fact, for any function $f(t,x)$ with $\partial_t f + \frac12 \partial_x^2 f = 0$, $f(t\wedge\tau_L, B_{t\wedge\tau_L})$ is a martingale for any $L > 0$, where $\tau_L := \inf\{s\ge 0 : |B_s|\ge L\}$. This includes the case when $f(t,x) = f(x)$ is a harmonic function. The statement can be proved by Itô's formula.

Remark. Applying the optional stopping theorem to the martingale $B_t$, we can derive the probabilities with which $B_t$ exits an interval $[a,b]$ through each of its endpoints. Using $B_t^2 - t$, we can compute the expected exit time of $B_t$ from $[a,b]$.

4 Infinitesimal generator of a Brownian motion

For a discrete time Markov chain $X$ with state space $S$ and transition matrix $\Pi$, the operator $\Pi - I$ plays an important role. For any bounded measurable $f : S\to\mathbb{R}$,

    f(X_n) - f(X_0) - \sum_{i=0}^{n-1} (\Pi - I)f(X_i)

is in fact a martingale. Furthermore, if $(\Pi - I)f = 0$, in which case we call $f$ a harmonic function for the Markov chain with transition matrix $\Pi$, then $f(X_n)$ is a martingale. We now introduce the continuous time analogue of $\Pi - I$ for Brownian motion, called the infinitesimal generator of Brownian motion.

If $(X_t)_{t\ge 0}$ is a continuous time Markov process with state space $S$, a complete separable metric space, then the semigroup associated with $X$ is defined as the family of operators $(S_t)_{t\ge 0}$ acting on the class of bounded continuous functions $f\in C_b(S,\mathbb{R})$ by $(S_t f)(x) = E_x[f(X_t)]$ for all $x\in S$. By the Markov property, it is easy to check that $S_t S_s f = S_s S_t f = S_{t+s} f$, which, together with the lack of inverses, accounts for the name semigroup. The infinitesimal generator is then defined via

    Lf = \lim_{h\downarrow 0} \frac{S_h f - f}{h}.    (4.10)

The class of $f\in C_b(S,\mathbb{R})$ for which the limit $Lf\in C_b(S,\mathbb{R})$ exists, with the convergence taking place in $C_b(S,\mathbb{R})$ with the sup norm, is called the domain of the generator $L$. The generator $L$ together with its domain uniquely determines the Markov process. Although one could also consider the adjoint semigroup $(S_t^*)_{t\ge 0}$, which acts on probability measures on $S$, functions are easier to handle because measures can in general be singular.

Let us check that $\frac12\Delta$, where $\Delta$ denotes the Laplacian, is the infinitesimal generator of Brownian motion, and that the class $C_c^2$ of twice continuously differentiable functions with compact support belongs to the domain of $\frac12\Delta$. Let $f\in C_c^2$. Then by Taylor expansion,

    (S_h f)(x) - f(x) = E[f(x + B_h)] - f(x) = E\Big[ f'(x)B_h + \tfrac12 f''(x)B_h^2 + \int_x^{x+B_h}\!\int_x^y \big( f''(z) - f''(x) \big)\, dz\, dy \Big].

In particular,

    \Big| (S_h f)(x) - f(x) - \tfrac12 f''(x)\, h \Big| \le E\Big[ \int_x^{x+B_h}\!\int_x^y \big| f''(z) - f''(x) \big|\, dz\, dy \Big] \le E\big[ \varphi(|B_h|)\, B_h^2 \big],

where $\varphi(r) := \sup_{x\in\mathbb{R},\, |y-x|\le r} |f''(y) - f''(x)|$ is bounded with $\varphi(r)\to 0$ as $r\downarrow 0$, by the assumption that $f\in C_c^2$. Since

    \lim_{h\downarrow 0} \frac{E[\varphi(|B_h|) B_h^2]}{h} = \lim_{h\downarrow 0} E\big[ \varphi(\sqrt h\, |B_1|)\, B_1^2 \big] = 0,

it follows that $h^{-1}(S_h f - f)$ converges in sup norm to $\frac12 f''$ as $h\downarrow 0$. The same argument applies to higher-dimensional Brownian motion. Itô's formula is also based on such Taylor expansions. Similarly to the discrete time setting, if $f$ is bounded and $\Delta f = 0$, then $f(B_t)$ is a martingale.
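Remark. As a quick numerical illustration of (4.10) for Brownian motion (a sketch, not part of the original notes): the difference quotient $((S_h f)(x) - f(x))/h$ should approach $\frac12 f''(x)$ as $h\downarrow 0$. The test function $f(x) = e^{-x^2}$ below is not compactly supported, but it is smooth with rapidly decaying derivatives, which suffices for this pointwise check; the quadrature grid and the point $x$ are arbitrary choices.

    import numpy as np

    # Pointwise check of (4.10) for Brownian motion: ((S_h f)(x) - f(x)) / h
    # should approach (1/2) f''(x) as h -> 0.  The expectation
    # (S_h f)(x) = E[f(x + B_h)], with B_h ~ N(0, h), is computed by quadrature
    # against the standard normal density.
    f   = lambda x: np.exp(-x**2)
    d2f = lambda x: (4 * x**2 - 2) * np.exp(-x**2)     # second derivative of f

    z = np.linspace(-8.0, 8.0, 20_001)
    dz = z[1] - z[0]
    pdf = np.exp(-z**2 / 2) / np.sqrt(2 * np.pi)       # standard normal density

    def generator_quotient(x, h):
        """Approximate ((S_h f)(x) - f(x)) / h for Brownian motion."""
        S_h_f = np.sum(f(x + np.sqrt(h) * z) * pdf) * dz
        return (S_h_f - f(x)) / h

    x = 0.8
    print("target (1/2) f''(x) =", 0.5 * d2f(x))
    for h in (1.0, 0.1, 0.01, 0.001):
        print(f"h = {h:6.3f}   (S_h f - f)(x)/h = {generator_quotient(x, h):.4f}")
    # The difference quotients approach (1/2) f''(x) ~ 0.1476 as h decreases,
    # matching the identification of the generator with one half of the Laplacian.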

5 Transience and recurrence of a Brownian motion in $\mathbb{R}^d$

One-dimensional Brownian motion is clearly recurrent, in the sense that it visits every point in $\mathbb{R}$ infinitely often, as can be seen from the law of the iterated logarithm. A $d$-dimensional Brownian motion is simply an $\mathbb{R}^d$-valued process $B_t := (B^{(1)}_t, \dots, B^{(d)}_t)$, where the coordinates are independent one-dimensional Brownian motions. It turns out that in $d\ge 3$, $B_t$ is transient in the sense that $|B_t|\to\infty$ a.s.; and in $d = 2$, $B_t$ a.s. returns to each open set infinitely often as $t\to\infty$, but does not return to any fixed deterministic point.

The proof rests on finding a suitable martingale. Recall that the fundamental solution for the Laplacian is

    \varphi(x) = \begin{cases} \log |x| & d = 2, \\ \dfrac{1}{|x|^{d-2}} & d\ge 3. \end{cases}

In particular, $\Delta\varphi(x) = 0$ for all $x\ne 0$. Then for any $0 < r < R < \infty$, $\varphi(B_{t\wedge\tau_r\wedge\tau_R})$ is a bounded martingale, where $\tau_a := \inf\{s\ge 0 : |B_s| = a\}$. It is not difficult to show that if $B_0 = x$ with $r < |x| < R$, then $\tau_r\wedge\tau_R < \infty$ almost surely. Therefore, by the optional stopping theorem,

    \varphi(x) = P_x(\tau_r < \tau_R)\, \varphi(r) + P_x(\tau_R < \tau_r)\, \varphi(R),

which implies that

    P_x(\tau_r < \tau_R) = \frac{\varphi(R) - \varphi(x)}{\varphi(R) - \varphi(r)} = \begin{cases} \dfrac{\log R - \log |x|}{\log R - \log r} & d = 2, \\[2ex] \dfrac{R^{2-d} - |x|^{2-d}}{R^{2-d} - r^{2-d}} & d\ge 3. \end{cases}    (5.11)

For $d = 2$, as we let $R\to\infty$, we find that $P_x(\tau_r < \infty) = 1$. Therefore, by the strong Markov property, $B_t$ must visit the ball $\{z\in\mathbb{R}^2 : |z|\le r\}$ infinitely often a.s. as $t\to\infty$. Since $r > 0$ can be arbitrary, $B_t$ must visit each open set in $\mathbb{R}^2$ infinitely often a.s. If we fix $R$ and let $r\downarrow 0$, then we find that $P_x(\tau_R < \tau_0) = 1$. Since $R$ can be arbitrary, this implies that $B_t$ almost surely never visits the origin, or any other fixed deterministic point.

For $d\ge 3$, fixing $r$ and letting $R\to\infty$ gives $P_x(\tau_r < \infty) = \frac{r^{d-2}}{|x|^{d-2}} < 1$. Therefore $B_t$ is transient. It is easy to see that there is zero probability that $B_t$ stays confined in a finite ball for all time. Therefore, for $R > |B_0|$, $\tau_R < \infty$ almost surely. By the strong Markov property, $P_{B_{\tau_R}}(\tau_r < \infty) = \frac{r^{d-2}}{R^{d-2}}$, which tends to $0$ as $R\to\infty$. Therefore, almost surely, $|B_t| > r$ for all $t$ sufficiently large. Since $r > 0$ is also arbitrary, this implies that $|B_t|\to\infty$ as $t\to\infty$.

In fact, $|B_t|$ for a $d$-dimensional Brownian motion is a process on $[0,\infty)$ called the $d$-dimensional Bessel process, which can be constructed from a one-dimensional Brownian motion by adding a location-dependent drift.

References

[1] R. Durrett, Probability: Theory and Examples, 2nd edition, Duxbury Press, Belmont, California.
