Lecture 21 Representations of Martingales

Course: Theory of Probability II
Term: Spring 2015
Instructor: Gordan Zitkovic

Right-continuous inverses

Let A denote the set of all càdlàg and non-decreasing functions defined on [0, ∞) and taking values in [0, ∞], with f(0) = 0. For f ∈ A, we also set f(∞) = lim_{t→∞} f(t). Even though it does not need to be invertible, a function f in A admits a right-continuous inverse, namely the function g : [0, ∞) → [0, ∞], denoted by g = f^{-1} and given by

  g(s) = inf{ t ≥ 0 : f(t) > s }.

The picture on the right shows (a portion of) the graph of a typical function in A (black), as well as its right-continuous inverse (blue). Note how jumps correspond to flat stretches and vice versa, and how the non-injectivity is resolved to yield right continuity. As always, h(t−) = lim_{t'↑t} h(t') for t > 0, and h(0−) = 0, for any càdlàg function h on [0, ∞).

Here is a precise statement of some of the properties of right-continuous inverses. It is best understood by looking at the picture, so no proof is given (to practice real analysis, supply the proof yourself).

Proposition 21.1 (Right-continuous inverses of each other). For f ∈ A and its right-continuous inverse g = f^{-1}, we have:

1. g ∈ A,
2. f is the right-continuous inverse of g,
3. g(f(t)) = sup Flat_f(t), for each t ∈ [0, ∞), where Flat_f(t) = { t' ≥ 0 : f(t') = f(t) },
4. similarly, f(g(s)) = sup Vert_f(s), for each s ∈ f([0, ∞)), where Vert_f(s) is the set of all values in the (unique) interval of the form [f(t−), f(t)] which contains s. In particular, if f is continuous, then f(g(s)) = s, for all s ≥ 0.
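
The defining formula for the right-continuous inverse is easy to experiment with numerically. The following sketch is not part of the original notes; the particular function f and the helper right_cont_inverse are illustrative choices only. It approximates g(s) = inf{t : f(t) > s} on a grid for an f with one jump and one flat stretch, so the jump/flat-stretch duality described above can be observed directly.

```python
# Minimal numerical sketch (not from the notes): approximate the
# right-continuous inverse g(s) = inf{t >= 0 : f(t) > s} on a grid.
# The particular f below is a hand-picked element of A: it increases
# linearly on [0, 1), jumps from 1 to 2 at t = 1, stays flat at the
# value 2 on [1, 2], and then increases linearly again.
import numpy as np

def right_cont_inverse(f, t_grid, s):
    """Approximate inf{t in t_grid : f(t) > s}; np.inf if that set is empty."""
    mask = f(t_grid) > s
    return t_grid[np.argmax(mask)] if mask.any() else np.inf

f = lambda t: np.where(t < 1.0, t, np.maximum(t, 2.0))
t_grid = np.linspace(0.0, 4.0, 400_001)

for s in [0.5, 1.5, 2.0, 2.5]:
    print(f"g({s}) ~ {right_cont_inverse(f, t_grid, s):.4f}")
# The jump of f at t = 1 appears as the flat stretch g(s) = 1 for s in [1, 2),
# and the flat stretch of f on [1, 2] appears as the jump of g at s = 2.
```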

Even though we will have no further use for it in these notes, the result of the following problem (a change-of-variable formula) comes in handy from time to time. It is a mild generalization of the formula we use to compute expectations of functions of random variables (how?).

Problem. Show that, for f ∈ A and a non-negative Borel function ϕ on [0, ∞), we have

  ∫_{[0,∞)} ϕ(t) df(t) = ∫_{[0,∞)} ϕ(g(s)) 1_{ {g(s) < ∞} } ds,

where g = f^{-1} is the right-continuous inverse of f. Deduce that, for 0 ≤ a < b < ∞ and a non-decreasing and continuous function u : [a, b] → [0, ∞), we have

  ∫_a^b ϕ(u(s)) d f(u(s)) = ∫_{u(a)}^{u(b)} ϕ(t) df(t).
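
As a quick sanity check of the change-of-variable formula in the problem above, one can compare both sides numerically for a concrete f. The sketch below is not from the notes; the choices ϕ(t) = e^{-t} and f(t) = t + ⌊t⌋ (frozen at the value 6 after t = 3) are arbitrary, and the grids are coarse approximations.

```python
# Numerical sanity check (not from the notes) of the change-of-variable formula,
# for the hand-picked choices phi(t) = exp(-t) and f(t) = t + floor(t) on [0, 3],
# frozen at the value 6 afterwards; df then has density 1 on [0, 3) plus unit
# atoms at t = 1, 2, 3, so both sides should approximate
# (1 - e^{-3}) + e^{-1} + e^{-2} + e^{-3}, roughly 1.5032.
import numpy as np

phi = lambda t: np.exp(-t)
f = lambda t: np.where(t < 3.0, t + np.floor(t), 6.0)

t = np.linspace(0.0, 5.0, 500_001)
ft = f(t)

# left-hand side: Lebesgue-Stieltjes integral of phi against df
lhs = np.sum(phi(t[1:]) * np.diff(ft))

# right-hand side: integral of phi(g(s)) over {g(s) < infinity}, where
# g(s) = inf{t : f(t) > s} is found by searchsorted (ft is non-decreasing)
s = np.linspace(0.0, 10.0, 200_001)
idx = np.searchsorted(ft, s, side="right")
finite = idx < len(t)
rhs = np.sum(phi(t[idx[finite]])) * (s[1] - s[0])

print(lhs, rhs)
```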

Time-changes

Definition 21.2. A (not-necessarily adapted) stochastic process {τ_s}_{s ∈ [0,∞)} with trajectories in A is called a change of time, or a time change, if the random variable τ_s is a stopping time for each s ≥ 0.

Given a filtration {F_t}_{t ∈ [0,∞)} and a progressively measurable process {X_t}_{t ∈ [0,∞)}, the composition X_{τ_s} defines a random variable for each s ≥ 0; the stochastic process {X_{τ_s}}_{s ∈ [0,∞)} is called the time change of {X_t}_{t ∈ [0,∞)} by {τ_s}_{s ∈ [0,∞)}. Similarly, we may define the time-changed filtration {G_s}_{s ∈ [0,∞)} by G_s = F_{τ_s}. By Problem 16.4, we have the following:

- {G_s}_{s ∈ [0,∞)} is right-continuous, since {F_t}_{t ∈ [0,∞)} is (we always assume that, remember). The same goes for completeness.
- The time-changed process {X_{τ_s}}_{s ∈ [0,∞)} is {G_s}_{s ∈ [0,∞)}-adapted.
- If {X_t}_{t ∈ [0,∞)} happens to be càdlàg, then so is {X_{τ_s}}_{s ∈ [0,∞)}. Is the same true for continuous (or càglàd) processes?

Even though we are asking only for τ_s to be a stopping time for deterministic s, this property extends to the class of all {G_s}_{s ∈ [0,∞)}-stopping times:

Proposition 21.3. Let {τ_s}_{s ∈ [0,∞)} be a time change. Then the random variable τ_σ is an {F_t}_{t ∈ [0,∞)}-stopping time, as soon as σ is a {G_s}_{s ∈ [0,∞)}-stopping time.

Proof. It is enough to deal with countable-valued stopping times. Indeed, a general stopping time σ can be approximated from the right by a countable-valued sequence {σ_n}_{n ∈ N} of stopping times such that σ_n ↓ σ, so that, by right continuity of τ, we have τ_{σ_n} → τ_σ, a.s. The right continuity of the filtration {F_t}_{t ∈ [0,∞)} takes care of the rest.

We turn to a countable-valued stopping time σ of the form σ = Σ_k s_k 1_{A_k}, where A_k ∈ G_{s_k} = F_{τ_{s_k}}, for k ∈ N, and A_1, A_2, ... form a partition of Ω. For t ≥ 0, we have

  {τ_σ ≤ t} = ∪_k ( {τ_{s_k} ≤ t} ∩ A_k ).

Since A_k ∈ F_{τ_{s_k}}, we have A_k ∩ {τ_{s_k} ≤ t} ∈ F_t, for each k ∈ N, and so {τ_σ ≤ t} ∈ F_t.

Definition 21.4. Let {τ_s}_{s ∈ [0,∞)} be a time change. A process {X_t}_{t ∈ [0,∞)} is said to be τ-continuous if it is continuous and X is constant on [τ_{s−}, τ_s], for all s ≥ 0, a.s.

It is clear that {X_{τ_s}}_{s ∈ [0,∞)} is a continuous process if {τ_s}_{s ∈ [0,∞)} is a time change and {X_t}_{t ∈ [0,∞)} is a τ-continuous process. A deeper property of τ-continuity is that it preserves martingality. Before we state the precise result, let us show what can go wrong:

Example 21.5. Let {B_t}_{t ∈ [0,∞)} be a Brownian motion, and let {F_t}_{t ∈ [0,∞)} be the right-continuous augmentation of its natural filtration. We define the family {τ_s}_{s ∈ [0,∞)} of stopping times by

  τ_s = inf{ t ≥ 0 : B_t > s }, s ≥ 0.

To show that {τ_s}_{s ∈ [0,∞)} is a time change, we simply express its paths as right-continuous inverses of the continuous and non-decreasing process {M_t}_{t ∈ [0,∞)}, where M_t = sup_{s ≤ t} B_s. By the continuity of M and the last part of Proposition 21.1, we have B_{τ_s} = s, for all s ≥ 0, a.s. Therefore, we have managed to time-change the martingale B into the deterministic, finite-variation process s. Note that the time change τ is by no means continuous, as all flat stretches of M correspond to jumps in τ. In fact, it can be shown that, in some sense, τ grows by jumps only.
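
Example 21.5 is easy to visualize by simulation. The following sketch (not part of the notes; grid sizes and the tested levels s are arbitrary illustrative choices) builds a Brownian path on a fine grid, computes τ_s as the right-continuous inverse of the running maximum M, and checks that B_{τ_s} ≈ s.

```python
# Simulation sketch (not from the notes) of Example 21.5: tau_s, read off as the
# right-continuous inverse of the running maximum M, satisfies B_(tau_s) = s.
import numpy as np

rng = np.random.default_rng(0)
n, T = 1_000_000, 10.0
dt = T / n
B = np.concatenate([[0.0], np.cumsum(np.sqrt(dt) * rng.standard_normal(n))])
t = np.linspace(0.0, T, n + 1)
M = np.maximum.accumulate(B)                    # M_t = sup_{u <= t} B_u

# levels chosen strictly below the realized overall maximum, so each one is hit
for s in M[-1] * np.array([0.25, 0.5, 0.75, 0.9]):
    hit = np.searchsorted(M, s, side="right")   # first grid index with M_t > s
    print(f"s = {s:.4f}   tau_s ~ {t[hit]:.3f}   B at tau_s = {B[hit]:.4f}")
# B at tau_s exceeds s only by the overshoot of a single grid step (order sqrt(dt)).
```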

Proposition 21.6. Let {τ_s}_{s ∈ [0,∞)} be a time change, and let M ∈ M^{loc,c} be a τ-continuous local martingale. Then the time-changed process {M_{τ_s}}_{s ∈ [0,∞)} is a continuous local martingale for the time-changed filtration {G_s}_{s ∈ [0,∞)} = {F_{τ_s}}_{s ∈ [0,∞)}.

Proof. By τ-continuity, the process {N_s}_{s ∈ [0,∞)}, given by N_s = M_{τ_s}, is a continuous process adapted to the filtration {G_s}_{s ∈ [0,∞)}. Given an {F_t}_{t ∈ [0,∞)}-stopping time T such that M^T is bounded, we set

  S = inf{ s ≥ 0 : τ_s ≥ T },

and show that N^S is a {G_s}_{s ∈ [0,∞)}-martingale. By τ-continuity of M, we have N_{s∧S} = M_{τ_{s∧S}} = M_{τ_{(s∧S)−}}, and, since τ_{(s∧S)−} ≤ T, we conclude that N^S is bounded. Next, we pick a {G_s}_{s ∈ [0,∞)}-stopping time σ and note that the random variable τ_{σ∧S} is, according to Proposition 21.3, an {F_t}_{t ∈ [0,∞)}-stopping time. Also, even though τ_{σ∧S} is not necessarily bounded from above by T, the stopped martingale M^{τ_{σ∧S}} remains bounded by the same constant as M^T, so we can use the optional sampling theorem to get

  E[N^S_σ] = E[N_{σ∧S}] = E[M_{τ_{σ∧S}}] = E[M_{τ_0}] = E[N_0] = E[N^S_0],

which implies that N^S is a martingale. We can repeat the procedure for a reducing sequence {T_n}_{n ∈ N} with each M^{T_n} bounded, to conclude that N is, indeed, a local martingale.

We have seen that the class of local martingales is closed under time changes only if additional conditions are fulfilled. The situation is quite different with semimartingales. We state two important results, the proofs of which are outside the scope of these notes. Both results deal with càdlàg semimartingales, i.e., with processes which can be decomposed into sums of a càdlàg local martingale and a càdlàg, adapted process of finite variation.

Theorem 21.7. Let {X_t}_{t ∈ [0,∞)} be a càdlàg semimartingale, and {τ_s}_{s ∈ [0,∞)} a time change. Then the time-changed process {X_{τ_s}}_{s ∈ [0,∞)} is a càdlàg semimartingale (not necessarily continuous, even if X is).

Theorem 21.8 (Monroe). Let {X_s}_{s ∈ [0,∞)} be a càdlàg semimartingale. Then there exist a filtered probability space (Ω', F', {F'_t}_{t ∈ [0,∞)}, P') which supports an {F'_t}_{t ∈ [0,∞)}-Brownian motion {B'_t}_{t ∈ [0,∞)} and a time change {τ_s}_{s ∈ [0,∞)} such that {X_s}_{s ∈ [0,∞)} and {B'_{τ_s}}_{s ∈ [0,∞)} have the same (finite-dimensional) distributions.

A theorem of Dambis, Dubins and Schwarz

As we have seen in Example 21.5, a right-continuous inverse (computed ω-wise) of an adapted, càdlàg and non-decreasing process is clearly a time change. The most important example of such a time change for our purposes is obtained by taking the right-continuous inverse of the quadratic-variation process of a continuous local martingale {M_t}_{t ∈ [0,∞)}:

  τ_s = inf{ t ≥ 0 : ⟨M⟩_t > s }.

We note that the process τ will not take the value ∞ if the local martingale {M_t}_{t ∈ [0,∞)} is divergent, i.e., if ⟨M⟩_∞ = ∞, a.s.

Lemma 21.9. M is τ-continuous.

Proof. By stopping, we may assume that both M and ⟨M⟩ are bounded. We pick a rational number r ≥ 0 and define the process N_t = M_{r+t} − M_r, t ≥ 0, which is clearly a martingale with ⟨N⟩_t = ⟨M⟩_{r+t} − ⟨M⟩_r. The random variable T_r = inf{ t ≥ 0 : ⟨N⟩_t > 0 } is a stopping time, and, therefore, the stopped martingale N^{T_r} is in M^{2,c} with ⟨N^{T_r}⟩_t = 0, for all t ≥ 0. Therefore, N^{T_r}_t = 0, for all t ≥ 0, and so M is constant on [r, r + T_r]. It is not hard to see that any interval of constancy of ⟨M⟩ is the closure of a countable union of intervals of the form [r, r + T_r], and our claim follows.

Theorem 21.10 (Dambis, Dubins and Schwarz). For a divergent M ∈ M^{loc,c}, we define

  τ_s = inf{ t ≥ 0 : ⟨M⟩_t > s }  and  G_s = F_{τ_s}, for s ≥ 0.

Then the time-changed process {B_s}_{s ∈ [0,∞)}, given by B_s = M_{τ_s}, s ≥ 0, is a G-Brownian motion, and the local martingale M is a time change of B, i.e.,

  M_t = B_{⟨M⟩_t}, for t ≥ 0.

Proof. By Lemma 21.9 and Proposition 21.6, B is a continuous local martingale. To compute its quadratic variation, we start from the fact that M_t² − ⟨M⟩_t is a τ-continuous local martingale; therefore, its time change M_{τ_s}² − ⟨M⟩_{τ_s} is a continuous {G_s}_{s ∈ [0,∞)}-local martingale. Since ⟨M⟩_{τ_s} = s, the process B_s² − s is a {G_s}_{s ∈ [0,∞)}-local martingale, and so ⟨B⟩_s = s. Lévy's characterization implies that {B_s}_{s ∈ [0,∞)} is a Brownian motion.

The fact that M_t = B_{⟨M⟩_t} follows quite directly from Proposition 21.1, part 3. Indeed, B_{⟨M⟩_t} = M_{τ_{⟨M⟩_t}} and τ_{⟨M⟩_t} is the right edge of Flat_{⟨M⟩}(t). Since M is constant on every interval of constancy of ⟨M⟩ (Lemma 21.9), we have M_t = M_{τ_{⟨M⟩_t}} = B_{⟨M⟩_t}.

Remark 21.11. The restriction that M be divergent is here mostly for convenience. It can be shown (try to prove it) that, even without this assumption, M can still be written as M_t = B_{⟨M⟩_t}, t ≥ 0, where B is a Brownian motion. This time, however, B may have to be defined on an extension of the original probability space. The reason is that we do not have enough of M to construct the whole path of B from it.
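
Theorem 21.10 can be illustrated by a small Monte Carlo experiment. The sketch below is not part of the notes; the integrand σ_u = 1 + |B_u| is an arbitrary adapted choice made so that ⟨M⟩ grows at least linearly, which keeps the time change within the simulated horizon. Across many independent paths, the time-changed process evaluated at s = 1 should look approximately standard normal.

```python
# Monte Carlo sketch (not from the notes) of Theorem 21.10: for
# M_t = int_0^t sigma_u dB_u with the arbitrary adapted choice sigma_u = 1 + |B_u|
# (so that <M>_t >= t and the time change stays on the simulated horizon),
# the time-changed value at s = 1 should be approximately N(0, 1).
import numpy as np

rng = np.random.default_rng(1)
npaths, nsteps, T = 5000, 1000, 2.0
dt = T / nsteps

dB = np.sqrt(dt) * rng.standard_normal((npaths, nsteps))
B = np.cumsum(dB, axis=1)
B_left = np.hstack([np.zeros((npaths, 1)), B[:, :-1]])   # value before each increment

sigma = 1.0 + np.abs(B_left)                             # adapted integrand
M = np.cumsum(sigma * dB, axis=1)                        # M_t = int_0^t sigma dB
QV = np.cumsum(sigma**2 * dt, axis=1)                    # <M>_t = int_0^t sigma^2 du

idx = np.argmax(QV > 1.0, axis=1)                        # tau_1 = inf{t : <M>_t > 1}
B_hat_1 = M[np.arange(npaths), idx]                      # time-changed process at s = 1

print("mean:", B_hat_1.mean())                           # should be close to 0
print("variance:", B_hat_1.var())                        # should be close to 1
```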

Martingales as stochastic integrals

Let {M_t}_{t ∈ [0,∞)} be a continuous local martingale. A continuous local martingale {Z_t}_{t ∈ [0,∞)} is called the stochastic (Doléans-Dade) exponential of M, denoted by Z = E(M), if

(21.1)  Z_t = 1 + ∫_0^t Z_u dM_u, for all t ≥ 0, a.s.

Itô's formula readily implies that the prescription

  E(M)_t = exp( M_t − ½ ⟨M⟩_t ), t ≥ 0,

defines a stochastic exponential of M. It can be shown that the process E(M) is the only such process, i.e., that the stochastic-integral equation (21.1) has a unique solution (in a specific sense).
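
The formula E(M)_t = exp(M_t − ½⟨M⟩_t) can be checked pathwise against the integral equation (21.1), at least approximately on a grid. The sketch below is not part of the notes; it takes M = B, so that ⟨M⟩_t = t, and compares Z_T with 1 plus a left-point Riemann approximation of ∫_0^T Z_u dB_u.

```python
# Pathwise numerical check (not from the notes) of the integral equation (21.1)
# for M = B: with Z_t = exp(B_t - t/2), the left-point Riemann approximation of
# 1 + int_0^T Z_u dB_u should nearly reproduce Z_T (error of order sqrt(dt)).
import numpy as np

rng = np.random.default_rng(2)
n, T = 1_000_000, 1.0
dt = T / n
dB = np.sqrt(dt) * rng.standard_normal(n)
B = np.concatenate([[0.0], np.cumsum(dB)])
t = np.linspace(0.0, T, n + 1)

Z = np.exp(B - 0.5 * t)                # candidate stochastic exponential E(B)
stoch_int = np.sum(Z[:-1] * dB)        # Ito (left-point) approximation of int Z dB

print(Z[-1], 1.0 + stoch_int)          # the two numbers should nearly agree
```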

Exponential martingales, together with the following simple lemma (whose easy proof we leave to the reader) and the proposition that follows it, play a central role in the proof of the celebrated martingale-representation theorem below. We remind the reader that a subset E of L²(F) is said to be total if its linear hull

  { Σ_{k=1}^n α_k X_k : n ∈ N, α_1, ..., α_n ∈ R, X_1, ..., X_n ∈ E }

is dense in L²(F). Also, for E ⊆ L²(F), its orthogonal complement E^⊥ is given by

  E^⊥ = { X ∈ L²(F) : E[XY] = 0 for all Y ∈ E }.

Lemma 21.12. Let (Ω, F, P) be a probability space, and let E be a subset of L²(F) such that E^⊥ = {0}. Then E is total in L²(F).

Proposition 21.13. Let {B_t}_{t ∈ [0,∞)} be a Brownian motion, and let {F_t}_{t ∈ [0,∞)} be the usual augmentation of its natural filtration. With I denoting the set of all (deterministic!) functions f : [0, ∞) → R of the form

(21.2)  f(t) = Σ_{k=1}^n λ_k 1_{(t_{k−1}, t_k]}(t),

for some n ∈ N, λ_1, ..., λ_n ∈ R and 0 = t_0 < t_1 < ... < t_n < ∞, the set

  E = { E( ∫_0^· f(u) dB_u )_∞ : f ∈ I }

is total in L²(F_∞).

Proof. Suppose that Y ∈ L²(F_∞) is such that E[YX] = 0, for all X ∈ E. Then, given a finite partition 0 = t_0 < t_1 < ... < t_n < ∞ and f as in (21.2), we have

  0 = E[ E( ∫_0^· f(u) dB_u )_∞ Y ]
    = E[ exp( Σ_{k=1}^n λ_k (B_{t_k} − B_{t_{k−1}}) − ½ Σ_{k=1}^n λ_k² (t_k − t_{k−1}) ) Y ].

By rearrangement and conditioning, it follows that

  E[ exp( Σ_{k=1}^n α_k B_{t_k} ) Z⁺ ] = E[ exp( Σ_{k=1}^n α_k B_{t_k} ) Z⁻ ],

for all α_1, ..., α_n ∈ R, where Z = E[Y | σ(B_{t_1}, ..., B_{t_n})]. Given that Z is σ(B_{t_1}, ..., B_{t_n})-measurable, there exists a Borel function ζ such that Z = ζ(B_{t_1}, ..., B_{t_n}). Consequently,

  ∫_{R^n} e^{Σ_k α_k x_k} ζ⁺(x_1, ..., x_n) ϕ(x_1, ..., x_n) dx_1 ... dx_n = ∫_{R^n} e^{Σ_k α_k x_k} ζ⁻(x_1, ..., x_n) ϕ(x_1, ..., x_n) dx_1 ... dx_n,

for all α_1, ..., α_n, where ϕ is the density of (B_{t_1}, ..., B_{t_n}). Both integrals above are Laplace transforms, one of ζ⁻ϕ and the other of ζ⁺ϕ, and, since they agree, so must the transformed functions, i.e., ζ = 0, a.e. with respect to the Lebesgue measure on R^n. It follows that E[Y | σ(B_{t_1}, ..., B_{t_n})] = 0, i.e., that E[Y⁺ 1_A] = E[Y⁻ 1_A], for each A ∈ σ(B_{t_1}, ..., B_{t_n}). Since the union of all such finite-dimensional σ-algebras is a π-system which generates F_∞, we conclude that Y = 0, a.s., and Lemma 21.12 applies.

Proposition 21.14. Let {B_t}_{t ∈ [0,∞)} be a Brownian motion, let {F_t}_{t ∈ [0,∞)} be the usual augmentation of its natural filtration, and let F_∞ = σ(F_t ; t ≥ 0). For any random variable X ∈ L²(F_∞) there exists a (λ × P)-a.e. unique predictable process {H_t}_{t ∈ [0,∞)} such that E[∫_0^∞ H_u² du] < ∞ and

(21.3)  X = E[X] + ∫_0^∞ H_u dB_u, a.s.

Proof. Let us deal with uniqueness first. Suppose that (21.3) holds for two predictable processes H and K. Then

  0 = E[ ( ∫_0^∞ H_u dB_u − ∫_0^∞ K_u dB_u )² ] = E[ ∫_0^∞ (H_u − K_u)² du ],

and it follows that H = K, λ × P-a.e.
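
Before turning to existence, here is a quick Monte Carlo illustration (not from the notes) of the Itô isometry that drives the uniqueness argument above. The integrand H_u = B_u is an arbitrary predictable choice; for it, E[(∫_0^1 H_u dB_u)²] = E[∫_0^1 H_u² du] = ∫_0^1 u du = 1/2.

```python
# Monte Carlo illustration (not from the notes) of the Ito isometry used in the
# uniqueness step: for the arbitrary predictable integrand H_u = B_u,
# E[(int_0^1 H_u dB_u)^2] = E[int_0^1 H_u^2 du] = int_0^1 u du = 1/2.
import numpy as np

rng = np.random.default_rng(3)
npaths, nsteps = 10_000, 500
dt = 1.0 / nsteps

dB = np.sqrt(dt) * rng.standard_normal((npaths, nsteps))
B_left = np.hstack([np.zeros((npaths, 1)), np.cumsum(dB, axis=1)[:, :-1]])

stoch_int = np.sum(B_left * dB, axis=1)          # int_0^1 B_u dB_u, path by path
time_int = np.sum(B_left**2 * dt, axis=1)        # int_0^1 B_u^2 du, path by path

print(np.mean(stoch_int**2), np.mean(time_int))  # both should be close to 0.5
```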

Next, we deal with existence and start by defining the subset H of L²(F_∞) by

  H = { ∫_0^∞ H_u dB_u : H is predictable and E[∫_0^∞ H_u² du] < ∞ }.

Let {Y^n}_{n ∈ N} be a sequence in H - with Y^n = ∫_0^∞ H^n_u dB_u - which converges to some Y ∈ L²(F_∞). By completeness of L²(F_∞), {Y^n}_{n ∈ N} is Cauchy; thus, thanks to Itô's isometry, the sequence {H^n}_{n ∈ N} is Cauchy in L²([0,∞) × Ω) = L²([0,∞) × Ω, P, λ × P), where P denotes the predictable σ-algebra. By completeness of L²([0,∞) × Ω), H^n → H, for some H ∈ L²([0,∞) × Ω). Consequently,

  ∫_0^∞ H_u dB_u = lim_n ∫_0^∞ H^n_u dB_u = lim_n Y^n = Y, a.s.

Therefore, H is a closed subspace of L²(F_∞). Moreover, the representation (21.1) of the random variables E(∫_0^· f(u) dB_u)_∞, f ∈ I (defined in the statement of Proposition 21.13), tells us that

  E( ∫_0^· f(u) dB_u )_∞ − 1 ∈ H, for all f ∈ I.

It remains now to use Proposition 21.13 to conclude that the linear span

  Span(H, 1) = { x + Y : x ∈ R, Y ∈ H },

generated by H and the constant 1, is dense in L²(F_∞), because it contains a total set. On the other hand, by the first part of the proof, Span(H, 1) is closed in L²(F_∞), and, so, Span(H, 1) = L²(F_∞). It follows that each X ∈ L²(F_∞) can be written as X = x + ∫_0^∞ H_u dB_u, for some predictable H with E[∫_0^∞ H_u² du] < ∞ and some constant x ∈ R. Taking expectations of both sides yields x = E[X].

Even though we have almost no use for non-continuous (RCLL) martingales in these notes, they sometimes show up naturally, even in the continuous-path framework. They are defined just like the continuous martingales - the only difference is that their trajectories are assumed to be RCLL. A deeper difference comes with the possible choices of the reducing sequence. In the continuous-path case this sequence can always be constructed so that the stopped processes are bounded. That is no longer the case in the RCLL setting, where only uniform integrability, as required in the definition, can be guaranteed.

Corollary 21.15. Let {F_t}_{t ∈ [0,∞)} be the usual augmentation of the Brownian filtration, and let M be an L²-bounded, càdlàg {F_t}_{t ∈ [0,∞)}-martingale. Then there exists a predictable process H with E[∫_0^∞ H_u² du] < ∞ such that

  M_t = M_0 + ∫_0^t H_u dB_u, for all t ≥ 0, a.s.

Proof. By Proposition 21.14, we can express the last element M_∞ of M as

  M_∞ = M_0 + ∫_0^∞ H_u dB_u,

for some predictable H such that E[∫_0^∞ H_u² du] < ∞. Let the continuous martingale N be defined by

  N_t = M_0 + ∫_0^t H_u dB_u,

so that N_∞ = M_∞. It follows that M_t = E[M_∞ | F_t] = E[N_∞ | F_t] = N_t, a.s., for all t ≥ 0, and right continuity implies that M and N are indistinguishable.

Proposition 21.16. Let {F_t}_{t ∈ [0,∞)} be the usual augmentation of the Brownian filtration, and let {M_t}_{t ∈ [0,∞)} be a càdlàg {F_t}_{t ∈ [0,∞)}-local martingale. Then there exists a predictable process H in L(B) such that

  M_t = M_0 + ∫_0^t H_u dB_u, for all t ≥ 0, a.s.

In particular, each local martingale in the augmented Brownian filtration is continuous.

Proof. By stopping, we can reduce the statement to the case of a uniformly integrable martingale (note that, as explained above, we cannot necessarily assume that M is a bounded martingale). Being UI, M admits a last element M_∞, and we can approximate it in L². More precisely, given ε > 0, there exists M^ε_∞ ∈ L²(F_∞) such that ‖M^ε_∞ − M_∞‖_{L¹} < ε. Using M^ε_∞ as the last element, we define the square-integrable martingale M^ε by M^ε_t = E[M^ε_∞ | F_t], for t ≥ 0, and take a càdlàg modification (remember, we can do that for any martingale on a filtration satisfying the usual conditions). By the maximal inequality for martingales, we have

(21.4)  P[ sup_t |M_t − M^ε_t| ≥ δ ] ≤ (1/δ) E[ |M_∞ − M^ε_∞| ] ≤ ε/δ.

Corollary 21.15 implies that M^ε is continuous, and we can interpret the inequality (21.4) in the following way: with probability at least 1 − ε/δ, the trajectory of M lies in a δ-neighborhood of a continuous function. That means, in particular, that

(21.5)  P[ sup_t |ΔM_t| ≥ 2δ ] ≤ ε/δ,

where the jump process ΔM is defined by ΔM_t = M_t − M_{t−}. The left-hand side of (21.5) does not depend on ε, so we conclude that sup_t |ΔM_t| ≤ 2δ, a.s., for all δ > 0, making M continuous. Now that we know that M is continuous, we can reduce it by stopping to a square-integrable martingale and finish the proof (using Corollary 21.15).

An explicit example

Martingale representations of random variables (and martingales) are seldom explicit. Here is an example of a nontrivial one where everything can be worked out in closed form.

Example 21.17. Let B be a Brownian motion and {F_t}_{t ∈ [0,∞)} its (augmented) filtration. With S_t = sup_{u ≤ t} B_u denoting the running-maximum process, the random variable S_1 is in L²(F_1), so Proposition 21.14 guarantees that there exists a predictable process {H_t}_{t ∈ [0,∞)} in L²(B) such that (note that, necessarily, H_t = 0 for t > 1)

  S_1 = E[S_1] + ∫_0^1 H_u dB_u, a.s.

Proposition 21.14 itself is silent about the exact form of H. We start by defining a martingale {M_t}_{t ∈ [0,1]}, in its continuous modification, by M_t = E[S_1 | F_t], a.s., so that

  M_t = E[S_1] + ∫_0^t H_u dB_u, for all t ∈ [0, 1], a.s.

On the other hand, we have

  E[S_1 | F_t] = E[ max( S_t, B_t + sup_{u ∈ [t,1]} (B_u − B_t) ) | F_t ], a.s.

The strong Markov property implies that sup_{u ∈ [t,1]} (B_u − B_t) is independent of F_t and distributed as S_{1−t}, so we have

  E[S_1 | F_t] = F(t, S_t, B_t), a.s.,

where, for s ≥ b, we define

  F(t, s, b) = E[ max(s, b + S_{1−t}) ].

By the reflection principle, S_{1−t} has the same distribution as √(1−t) |Z| for a unit normal Z, so

  F(t, s, b) = ∫_0^∞ max( s, b + x √(1−t) ) 2ϕ(x) dx,

with ϕ denoting the density of the unit normal. The function F is twice differentiable in b for t < 1 (this can be checked directly), and the process F(t, S_t, B_t) is a martingale on [0, 1). Therefore, Itô's formula implies that (why?)

(21.6)  E[S_1 | F_t] = E[S_1] + ∫_0^t ∂F/∂b (u, S_u, B_u) dB_u, for t < 1.

Since all local martingales in the Brownian filtration are continuous, we can let t → 1 and conclude that the equality in (21.6) holds, a.s., even for t = 1. It remains to compute the derivative of F; with Φ(x) = ∫_{−∞}^x ϕ(ξ) dξ, we have

  H_t = ∂F/∂b (t, S_t, B_t) = ∫_0^∞ 1_{ {B_t + x √(1−t) ≥ S_t} } 2ϕ(x) dx = 2 ( 1 − Φ( (S_t − B_t) / √(1−t) ) ),

so that, using E[S_1] = E[|B_1|] = √(2/π),

  S_1 = √(2/π) + ∫_0^1 2 ( 1 − Φ( (S_u − B_u) / √(1−u) ) ) dB_u.

[Figure: a simulated path of a Brownian motion B (blue), its running maximum S (orange), and the martingale M (green).]

[Figure: values of the process H corresponding to the simulated path of the Brownian motion B above.]
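
The closed-form representation above can be verified path by path in a simulation. The sketch below is not part of the notes (apart from reproducing the formula just derived); the grid size and the number of displayed paths are arbitrary, and the running maximum is only computed on the grid, so the two printed columns agree only up to discretization error.

```python
# Simulation sketch (not from the notes, apart from its formula) checking the
# representation of Example 21.17 path by path:
#   S_1 ~ sqrt(2/pi) + sum_i H_(u_i) (B_(u_{i+1}) - B_(u_i)),
# with H_u = 2 (1 - Phi((S_u - B_u) / sqrt(1 - u))).
import math
import numpy as np

Phi = np.vectorize(lambda x: 0.5 * (1.0 + math.erf(x / math.sqrt(2.0))))

rng = np.random.default_rng(5)
n = 200_000
dt = 1.0 / n
u_left = np.arange(n) * dt                  # left endpoints u_i < 1

for path in range(5):
    dB = np.sqrt(dt) * rng.standard_normal(n)
    B = np.concatenate([[0.0], np.cumsum(dB)])
    S = np.maximum.accumulate(B)            # running maximum on the grid
    H = 2.0 * (1.0 - Phi((S[:-1] - B[:-1]) / np.sqrt(1.0 - u_left)))
    repr_value = math.sqrt(2.0 / math.pi) + np.sum(H * dB)
    print(f"S_1 = {S[-1]: .4f}    representation = {repr_value: .4f}")
```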
