An essay on the general theory of stochastic processes


Probability Surveys Vol. 3 (2006)

An essay on the general theory of stochastic processes

Ashkan Nikeghbali

ETHZ Departement Mathematik, Rämistrasse 101, HG G16, 8092 Zürich, Switzerland
e-mail: ashkan.nikeghbali@math.ethz.ch

Abstract: This text is a survey of the general theory of stochastic processes, with a view towards random times and enlargements of filtrations. The first five chapters present standard material, which was developed by the French probability school and which is usually written in French. The material presented in the last three chapters is less standard and takes into account some recent developments.

AMS 2000 subject classifications: Primary 05C38, 15A15; secondary 05A15, 15A18.
Keywords and phrases: General theory of stochastic processes, Enlargements of filtrations, Random times, Submartingales, Stopping times, Honest times, Pseudo-stopping times.

Received August 2005.

Contents

1 Introduction
Acknowledgements
2 Basic notions of the general theory
2.1 Stopping times
2.2 Progressive, Optional and Predictable σ-fields
2.3 Classification of stopping times
2.4 Début theorems
3 Section theorems
4 Projection theorems
4.1 The optional and predictable projections
4.2 Increasing processes and projections
4.3 Random measures on (R₊ × Ω) and the dual projections
The Doob-Meyer decomposition and multiplicative decompositions
Multiplicative decompositions
Some hidden martingales
General random times, their associated σ-fields and Azéma's supermartingales
Arbitrary random times and some associated sigma fields

This is an original survey paper.

A. Nikeghbali/The general theory of stochastic processes

Azéma's supermartingales and dual projections associated with random times
The case of honest times
The case of pseudo-stopping times
Honest times and Strong Brownian Filtrations
The enlargements of filtrations
Initial enlargements of filtrations
Progressive enlargements of filtrations
A description of predictable and optional processes in (G_t^ρ) and (F_t^L)
The decomposition formula before ρ
The decomposition formula for honest times
The (H) hypothesis
Concluding remarks on enlargements of filtrations
References

1. Introduction

P.A. Meyer and C. Dellacherie have created the so-called general theory of stochastic processes, which consists of a number of fundamental operations on either real valued stochastic processes indexed by [0, ∞), or random measures on [0, ∞), relative to a given filtered probability space (Ω, F, (F_t)_{t≥0}, P), where (F_t) is a right continuous filtration of (F, P) complete sub-σ-fields of F. This theory was gradually created from results which originated from the study of Markov processes, and martingales and additive functionals associated with them. A guiding principle for Meyer and Dellacherie was to understand to which extent the Markov property could be avoided; in fact, they were able to get rid of the Markov property in a radical way. At this point, we would like to emphasize that, perhaps to the astonishment of some readers, stochastic calculus was not thought of as a basic elementary tool in 1972, when C. Dellacherie's little book appeared. Thus it seemed interesting to view some important facts of the general theory in relation with stochastic calculus.

The present essay falls into two parts: the first part, consisting of sections 2 to 5, is a review of the General Theory of Stochastic Processes and is fairly well known. The second part is a review of more recent results, and is much less so.
Throughout this essay we try to illustrate the results with examples as much as possible. More precisely, the plan of the essay is as follows: in Section 2, we recall the basic notions of the theory: stopping times, the optional and predictable σ-fields and processes, etc.; in Section 3, we present the fundamental Section theorems; in Section 4, we present the fundamental Projection theorems;

in Section 5, we recall the Doob-Meyer decomposition of semimartingales; in Section 6, we present a small theory of multiplicative decompositions of nonnegative local submartingales; in Section 7, we highlight the role of certain hidden martingales in the general theory of stochastic processes; in Section 8, we illustrate the theory with the study of arbitrary random times; in Section 9, we study how the basic operations depend on the underlying filtration, which leads us in fact to some introduction of the theory of enlargements of filtrations.

Acknowledgements

I would like to thank an anonymous referee for his comments and suggestions which helped to improve the present text.

2. Basic notions of the general theory

Throughout this essay, we assume we are given a filtered probability space (Ω, F, (F_t)_{t≥0}, P) that satisfies the usual conditions, that is, (F_t) is a right continuous filtration of (F, P) complete sub-σ-fields of F. A stochastic process is said to be càdlàg if it almost surely has sample paths which are right continuous with left limits. A stochastic process is said to be càglàd if it almost surely has sample paths which are left continuous with right limits.

2.1. Stopping times

Definition 2.1. A stopping time is a mapping T : Ω → [0, ∞] such that {T ≤ t} ∈ F_t for all t ≥ 0.

To a given stopping time T, we associate the σ-field F_T defined by:

F_T = {A ∈ F : A ∩ {T ≤ t} ∈ F_t for all t ≥ 0}.

We can also associate with T the σ-field F_{T−}, generated by F_0 and sets of the form:

A ∩ {T > t}, with A ∈ F_t and t ≥ 0.

We recap here, without proof, some of the classical properties of stopping times.

Proposition 2.2. Let T be a stopping time. Then T is measurable with respect to F_{T−}, and F_{T−} ⊂ F_T.

Proposition 2.3. Let T be a stopping time. If A ∈ F_T, then

T_A(ω) = T(ω) if ω ∈ A,  T_A(ω) = +∞ if ω ∉ A,

is also a stopping time.

Proposition 2.4 ([26], Theorem 53, p.187). Let S and T be two stopping times.
1. For every A ∈ F_S, the set A ∩ {S ≤ T} ∈ F_T.
2. For every A ∈ F_S, the set A ∩ {S < T} ∈ F_{T−}.

Proposition 2.5 ([26], Theorem 56, p.189). Let S and T be two stopping times such that S ≤ T. Then F_S ⊂ F_T.

One of the most used properties of stopping times is the optional stopping theorem.

Theorem 2.6 ([69], Theorem 3.2, p.69). Let (M_t) be an (F_t) uniformly integrable martingale and let T be a stopping time. Then, one has:

E[M_∞ | F_T] = M_T  (2.1)

and hence:

E[M_∞] = E[M_T].  (2.2)

One can naturally ask whether there exist some other random times (i.e. nonnegative random variables) such that (2.1) or (2.2) hold. We will answer these questions in subsequent sections.

2.2. Progressive, Optional and Predictable σ-fields

Now, we shall define the three fundamental σ-algebras we always deal with in the theory of stochastic processes.

Definition 2.7. A process X = (X_t)_{t≥0} is called (F_t) progressive if for every t ≥ 0, the restriction of (t, ω) ↦ X_t(ω) to [0, t] × Ω is B([0, t]) ⊗ F_t measurable. A set A ⊂ R₊ × Ω is called progressive if the process 1_A(t, ω) is progressive. The set of all progressive sets is a σ-algebra, called the progressive σ-algebra, which we will denote M.

Proposition 2.8 ([69], Proposition 4.9, p.44). If X is an (F_t) progressive process and T is an (F_t) stopping time, then X_T 1_{T<∞} is F_T measurable.

Definition 2.9. The optional σ-algebra O is the σ-algebra, defined on R₊ × Ω, generated by all processes (X_t)_{t≥0}, adapted to (F_t), with càdlàg paths. A process X = (X_t)_{t≥0} is called (F_t) optional if the map (t, ω) ↦ X_t(ω) is measurable with respect to the optional σ-algebra O.

Proposition 2.10. Let X = (X_t)_{t≥0} be an optional process and T a stopping time. Then:
1. X_T 1_{T<∞} is F_T measurable;
2. the stopped process X^T = (X_{t∧T})_{t≥0} is optional.
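Identity (2.2) lends itself to a quick numerical sanity check. Below is a minimal Monte Carlo sketch (function name and parameters are ours, not from the text): the simple symmetric random walk is a martingale, and we stop it when it first leaves an interval, or when a fixed horizon is reached, so that the stopping time is bounded and uniform integrability is automatic.

```python
import random

def optional_stopping_demo(n_paths=20000, a=-5, b=3, horizon=200, seed=1):
    """Estimate E[M_T] for a simple symmetric random walk M (a martingale)
    and the bounded stopping time T = inf{n : M_n in {a, b}} ∧ horizon.
    By the optional stopping theorem the result should be close to M_0 = 0."""
    random.seed(seed)
    total = 0.0
    for _ in range(n_paths):
        m = 0
        for _ in range(horizon):
            if m in (a, b):      # the stopping time T has occurred
                break
            m += random.choice((-1, 1))
        total += m               # m equals M_T on this path
    return total / n_paths

print(optional_stopping_demo())  # close to 0
```

For an unbounded stopping time and a martingale that is not uniformly integrable the identity can fail, as Remark 3.4 below illustrates.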

Definition 2.11. The predictable σ-algebra P is the σ-algebra, defined on R₊ × Ω, generated by all processes (X_t)_{t≥0}, adapted to (F_t), with left continuous paths on ]0, ∞[. A process X = (X_t)_{t≥0} is called (F_t) predictable if the map (t, ω) ↦ X_t(ω) is measurable with respect to the predictable σ-algebra P.

Proposition 2.12 ([26], Theorem 67, p.200). Let X = (X_t)_{t≥0} be a predictable process and T a stopping time. Then:
1. X_T 1_{T<∞} is F_{T−} measurable;
2. the stopped process X^T = (X_{t∧T})_{t≥0} is predictable.

The following inclusions always hold: P ⊂ O ⊂ M. It is easy to show that every adapted càdlàg process is progressively measurable ([69]); hence O ⊂ M. We also have that every càg and adapted process is predictable (every such process can be written as a limit of càdlàg processes), thus P ⊂ O. The inclusions may be strict.

Example 2.13 (O ≠ M). The following example is due to Dellacherie and Meyer (see [24], p.128). Let B = (B_t)_{t≥0} be the standard Brownian motion and let (F_t) be the usual augmentation of its natural filtration. For each ω, the set {s : B_s(ω) ≠ 0} is the disjoint union of open intervals, the excursion intervals. Define the set E by:

E = {(s, ω) : s is the left-hand endpoint of an excursion interval}.

Then E is progressive but not optional.

Example 2.14 (P ≠ O). The standard Poisson process N = (N_t)_{t≥0} is optional, but not predictable, in its own filtration (F_t).

Now, we give a characterization of the optional and predictable σ-algebras in terms of stochastic intervals.

Definition 2.15. Let S and T be two positive random variables such that S ≤ T. We define the stochastic interval [S, T[ as the random subset of R₊ × Ω:

[S, T[ = {(t, ω) : S(ω) ≤ t < T(ω)}.

All the stochastic intervals we can form with a pair of stopping times (S, T), such that S ≤ T, are optional. Indeed, 1_{[0,T[} and 1_{[S,∞[} are càdlàg and adapted; hence 1_{[S,T[} is optional. Taking T = S + 1/n and letting n go to infinity yields that [S]

is optional. It is therefore immediate that the other types of stochastic intervals are optional. Now let Z be an F_S measurable random variable; then the process Z(ω) 1_{[S,T[}(t, ω) is optional since it is adapted and càdlàg. We can also prove an analogous result for the other types of stochastic intervals. In the same way as above, it can be shown that ]S, T] is predictable and that for Z an F_S measurable random variable, Z(ω) 1_{]S,T]}(t, ω) is predictable.

Now, we can give a characterization of the optional and predictable σ-algebras in terms of stochastic intervals.

Proposition 2.16. The optional σ-algebra is generated by the stochastic intervals of the form [T, ∞[, where T is a stopping time:

O = σ{[T, ∞[ ; T is a stopping time}.

Moreover, if Y is an F_T measurable random variable, then there exists an optional process (X_t)_{t∈R₊} such that Y = X_T.

Proof. From the remark above, it suffices to show that any càdlàg and adapted process X can be approximated by a sequence of elements of σ{[T, ∞[ ; T is a stopping time}. Fix ε > 0 and let T_0 = 0 and Z_0 = X_{T_0}. Now define inductively, for n ≥ 0,

T_{n+1} = inf{t > T_n : |X_t − Z_n| > ε},
Z_{n+1} = X_{T_{n+1}} if T_{n+1} < ∞.

Since X has left limits, T_n → ∞. Now set:

Y^ε = Σ_{n≥0} Z_n 1_{[T_n, T_{n+1}[}.

Then |X − Y^ε| ≤ ε and this completes the proof.

Remark 2.17. We also have: O = σ{[0, T[ ; T is a stopping time}.

Remark 2.18. It is useful to note that for a random time T, the interval [T, ∞[ is in the optional sigma field if and only if T is a stopping time. A similar result holds for the predictable σ-algebra (see [26], [39] or [69]).

Proposition 2.19 ([26], Theorem 67, p.200). The predictable σ-algebra is generated by either of the following collections of random sets:
1. A × {0} where A ∈ F_0, and [0, T] where T is a stopping time;
2. A × {0} where A ∈ F_0, and A × (s, t] where s < t, A ∈ F_s.

Now we give an easy result which is often used in martingale theory.

Proposition 2.20. Let X = (X_t)_{t≥0} be an optional process. Then:
1. the jump process ∆X = X − X_− is optional;
2. X_− = (X_{t−})_{t≥0} is predictable;
3. if moreover X is predictable, then ∆X is predictable.
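The inductive scheme in the proof above — record the current value, then stop at the first time the path drifts more than ε from it — is effectively an algorithm. A small sketch on a discretized path (all names are ours):

```python
import random

def epsilon_approximation(path, eps):
    """Piecewise-constant approximation of a sampled càdlàg path:
    keep the last recorded value Z_n until the first (discrete) time
    |X_t - Z_n| > eps, then record a new value, as in the proof."""
    approx, z = [], path[0]
    for x in path:
        if abs(x - z) > eps:  # a new stopping time T_{n+1} fires here
            z = x
        approx.append(z)
    return approx

# usage: a random-walk path sampled on a grid
random.seed(0)
path, x = [], 0.0
for _ in range(1000):
    x += random.gauss(0, 0.05)
    path.append(x)
y = epsilon_approximation(path, eps=0.3)
print(max(abs(a - b) for a, b in zip(path, y)))  # at most 0.3
```

By construction the sup-distance between the path and its approximation never exceeds ε, mirroring the bound |X − Y^ε| ≤ ε in the proof.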

2.3. Classification of stopping times

We shall now give finer results about stopping times. The notions developed here are very useful in the study of discontinuous semimartingales (see [39] for example). The proofs of the results presented here can be found in [24] or [26]. We first introduce the concept of predictable stopping times.

Definition 2.21. A predictable time is a mapping T : Ω → [0, ∞] such that the stochastic interval [0, T[ is predictable.

Every predictable time is a stopping time, since [T, ∞[ ∈ P ⊂ O. Moreover, as [T] = [0, T] \ [0, T[, we deduce that [T] ∈ P.

We also have the following characterization of predictable times:

Proposition 2.22 ([26], Theorem 71, p.204). A stopping time T is predictable if and only if there exists a sequence of stopping times (T_n) satisfying the following conditions:
1. (T_n) is increasing with limit T;
2. we have T_n < T for all n on the set {T > 0}.

The sequence (T_n) is called an announcing sequence for T.

Now we enumerate some important properties of predictable stopping times, which can be found in [24] p.54, or [26] p.205.

Theorem 2.23. Let S be a predictable stopping time and T be any stopping time. For all A ∈ F_{S−}, the set A ∩ {S ≤ T} ∈ F_{T−}. In particular, the sets {S ≤ T} and {S = T} are in F_{T−}.

Proposition 2.24. Let S and T be two predictable stopping times. Then the stopping times S ∧ T and S ∨ T are also predictable.

Proposition 2.25. Let T be a predictable stopping time and A ∈ F_{T−}. Then the time T_A is also predictable.

Proposition 2.26. Let (T_n) be an increasing sequence of predictable stopping times and T = lim_n T_n. Then T is predictable.

We recall that a random set A is called evanescent if the set {ω : ∃t ∈ R₊ with (t, ω) ∈ A} is P-null.

Definition 2.27. Let T be a stopping time.
1. We say that T is accessible if there exists a sequence (T_n) of predictable stopping times such that:

[T] ⊂ (∪_n [T_n]) up to an evanescent set,

or in other words: almost surely, T(ω) < ∞ implies T(ω) = T_n(ω) for some n.

2. We say that T is totally inaccessible if for all predictable stopping times S we have:

[T] ∩ [S] = ∅ up to an evanescent set,

or in other words:

P[{ω : T(ω) = S(ω) < ∞}] = 0.

Remark 2.28. It is obvious that predictable stopping times are accessible, and that stopping times which are both accessible and totally inaccessible are almost surely infinite.

Remark 2.29. There exist stopping times which are accessible but not predictable.

Theorem 2.30 ([26], Theorem 81, p.215). Let T be a stopping time. There exists a unique (up to a P-null set) partition of the set {T < ∞} into two sets A and B which belong to F_T, such that T_A is accessible and T_B is totally inaccessible. The stopping time T_A is called the accessible part of T, while T_B is called the totally inaccessible part of T.

Now let us examine a special case where the accessible times are predictable. For this, we need to define the concept of quasi-left continuous filtrations.

Definition 2.31. The filtration (F_t) is quasi-left continuous if F_{T−} = F_T for all predictable stopping times T.

Theorem 2.32 ([26], Theorem 83, p.217). The following assertions are equivalent:
1. the accessible stopping times are predictable;
2. the filtration (F_t) is quasi-left continuous;
3. the filtration (F_t) does not have any discontinuity time: ⋁_n F_{T_n} = F_{lim T_n} for all increasing sequences of stopping times (T_n).

Definition 2.33. A càdlàg process X is called quasi-left continuous if ∆X_T = 0 a.s. on the set {T < ∞} for every predictable time T.

Definition 2.34. A random set A is called thin if it is of the form A = ∪_n [T_n], where (T_n) is a sequence of stopping times; if moreover the sequence (T_n) satisfies [T_n] ∩ [T_m] = ∅ for all n ≠ m, it is called an exhausting sequence for A.

Proposition 2.35. Let X be a càdlàg adapted process. The following are equivalent:
1. X is quasi-left continuous;

2. there exists a sequence of totally inaccessible stopping times that exhausts the jumps of X;
3. for any increasing sequence of stopping times (T_n) with limit T, we have lim_n X_{T_n} = X_T a.s. on the set {T < ∞}.

2.4. Début theorems

In this section, we give a fundamental result for realizations of stopping times: the début theorem. Its proof is difficult and uses the same hard theory (capacity theory) as the section theorems which we shall state in the next section.

Definition 2.36. Let A be a subset of R₊ × Ω. The début of A is the function D_A defined as:

D_A(ω) = inf{t ∈ R₊ : (t, ω) ∈ A},

with D_A(ω) = +∞ if this set is empty.

It is a nice and difficult result that when the set A is progressive, then D_A is a stopping time ([24], [26]):

Theorem 2.37 ([24], Theorem 23, p.51). Let A be a progressive set; then D_A is a stopping time.

Conversely, every stopping time is the début of a progressive (in fact optional) set: indeed, it suffices to take A = [T, ∞[ or A = [T].

The proof of the début theorem is an easy consequence of the following difficult result from measure theory:

Theorem 2.38. If (E, E) is a locally compact space with a countable basis, endowed with its Borel σ-field, and (Ω, F, P) is a complete probability space, then for every set A ∈ E ⊗ F, the projection π(A) of A onto Ω belongs to F.

Proof of the début theorem. We apply Theorem 2.38 to the set A_t = A ∩ ([0, t[ × Ω), which belongs to B([0, t[) ⊗ F_t. As a result, {D_A < t} = π(A_t) belongs to F_t; since the filtration is right continuous, D_A is a stopping time.

We can define the n-début of a set A by

D_A^n(ω) = inf{t ∈ R₊ : [0, t] ∩ A contains at least n points};

we can also define the ∞-début of A by:

D_A^∞(ω) = inf{t ∈ R₊ : [0, t] ∩ A contains infinitely many points}.

Theorem 2.39. The n-début of a progressive set A is a stopping time for n = 1, 2, ..., ∞.

Proof. The proof is easy once we know that D_A^1 is a stopping time. Indeed, by induction on n, D_A^{n+1} is a stopping time as the début of the progressive set A_n = A ∩ ]D_A^n, ∞[. D_A^∞ is also a stopping time, as the increasing limit of the stopping times D_A^n.
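On a discretized path, the début and the n-début of a set can be computed directly from their definitions; a small illustration with hypothetical helper names:

```python
def debut(path, in_A):
    """Début D_A = inf{t : (t, ω) ∈ A}, computed along one path on a
    discrete time grid; returns None when the section of A is empty."""
    for t, x in enumerate(path):
        if in_A(t, x):
            return t
    return None

def n_debut(path, in_A, n):
    """n-début: first grid time t such that [0, t] ∩ A contains at
    least n points of this path."""
    count = 0
    for t, x in enumerate(path):
        if in_A(t, x):
            count += 1
            if count == n:
                return t
    return None

path = [0.0, 0.5, 1.2, 0.8, 1.5, 0.3, 2.0]
above_one = lambda t, x: x > 1.0   # A = {(t, ω) : X_t(ω) > 1}
print(debut(path, above_one))       # 2
print(n_debut(path, above_one, 3))  # 6
```

Measurability is trivial in this discrete setting; the depth of the début theorem lies entirely in the continuous-time projection argument above.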

It is also possible to show that the penetration time T of a progressive set A, defined by:

T(ω) = inf{t ∈ R₊ : [0, t] ∩ A contains uncountably many points},

is a stopping time.

We can naturally wonder whether the début of a predictable set is a predictable stopping time. One moment of reflection shows that the answer is negative: every stopping time T is the début of the predictable set ]T, ∞[, without being predictable itself in general. However, we have:

Proposition 2.40. Let D_A be the début of a predictable set A. If [D_A] ⊂ A, then D_A is a predictable stopping time.

Proof. If [D_A] ⊂ A, then [D_A] = A ∩ [0, D_A] is predictable, since A is predictable and D_A is a stopping time. Hence D_A is predictable.

One can deduce from there that:

Proposition 2.41. Let A be a predictable set which is closed for the right topology¹. Then its début D_A is a predictable stopping time.

Now we are going to link the above mentioned notions to the jumps of some stochastic processes. We will follow [39], chapter I.

Lemma 2.42. Any thin random set admits an exhausting sequence of stopping times.

Proposition 2.43. If X is a càdlàg adapted process, the random set U = {∆X ≠ 0} is thin; an exhausting sequence (T_n) for this set is called a sequence that exhausts the jumps of X. Moreover, if X is predictable, the stopping times (T_n) can be chosen predictable.

Proof. For every integer n ≥ 0, let

U_n = {(t, ω) : |X_t(ω) − X_{t−}(ω)| > 2^{−n}},

and set V_0 = U_0 and V_n = U_n \ U_{n−1} for n ≥ 1. The sets V_n are optional (resp. predictable if X is predictable) and are disjoint. Now, let us define the stopping times

D_n^1 = inf{t : (t, ω) ∈ V_n},
D_n^{k+1} = inf{t > D_n^k : (t, ω) ∈ V_n},

¹ We recall that the right topology on the real line is the topology whose basis is given by the intervals [s, t[.
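A concrete instance of Proposition 2.22: for a process with continuous paths, the hitting time T of a level is predictable, announced by the hitting times T_n of slightly lower levels. A discretized illustration (names are ours; on a grid the strict inequality T_n < T may occasionally be lost when a step jumps over both levels):

```python
import random

def hitting_time(path, level):
    """Index of the first grid time at which the path reaches `level`
    (len(path) if the level is never reached)."""
    for i, x in enumerate(path):
        if x >= level:
            return i
    return len(path)

# a discretized Brownian-like path
random.seed(2)
path, x = [], 0.0
for _ in range(100000):
    x += random.gauss(0, 0.01)
    path.append(x)

T = hitting_time(path, 1.0)
announcers = [hitting_time(path, 1.0 - 1.0 / n) for n in range(2, 7)]
print(announcers, T)  # a nondecreasing sequence, each term at most T
```

By contrast, the first jump of a Poisson process admits no such announcing sequence: it is totally inaccessible, which is why the Poisson process is optional but not predictable (Example 2.14).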

so that D_n^j represents the j-th jump of X whose size in absolute value lies between 2^{−n} and 2^{−n+1}. Since X is càdlàg, V_n does not have any accumulation point, and the stopping times (D_n^k)_{(k,n)∈N²} enumerate all the points of ∪_n V_n. Moreover, from Proposition 2.41, the stopping times (D_n^k) are predictable if X is predictable. To complete the proof, it suffices to reindex the doubly indexed sequence (D_n^k) into a simply indexed sequence (T_n).

In fact, we have the following characterization of predictable processes:

Proposition 2.44. If X is a càdlàg adapted process, then X is predictable if and only if the following two conditions are satisfied:
1. for all totally inaccessible stopping times T, ∆X_T = 0 a.s. on {T < ∞};
2. for every predictable stopping time T, X_T 1_{T<∞} is F_{T−} measurable.

Finally, we characterize F_{T−} and F_T measurable random variables:

Theorem 2.45. Let T be a stopping time. A random variable Z is F_{T−} measurable if and only if there exists a predictable process (X_t) such that Z = X_T on the set {T < ∞}. Similarly, a random variable Z is F_T measurable if and only if there exists an optional process (Y_t) such that Z = Y_T on the set {T < ∞}.

Proof. We only prove the first part, the proof of the second part being the same. We have just seen that the condition is sufficient. To show that the condition is necessary, by the monotone class theorem it suffices to check the theorem for indicators of sets that generate the sigma field F_{T−}, i.e. for Z = 1_A with A ∈ F_0, and for Z = 1_{B∩{s<T}} with B ∈ F_s. One can take X = 1_{A×[0,∞[} in the first case and X = 1_{B×]s,∞[} in the second case.

3. Section theorems

This section is devoted to a deep and very difficult result called the section theorem. The reader can refer to [26] or [24] for a proof. We will illustrate the theorem with some standard examples (here again the examples we deal with can be found in [26], [24] or [7]).
Theorem 3.1 (Optional and predictable section theorems). Let A be an optional (resp. predictable) set. For every ε > 0, there is a stopping time (resp. predictable stopping time) T such that:
1. [T] ⊂ A;
2. P[T < ∞] ≥ P(π(A)) − ε,

where π is the canonical projection of R₊ × Ω onto Ω.

Throughout the paper, we shall use the optional and predictable section theorems. For now, we give some classical applications.

Theorem 3.2. Let (X_t) and (Y_t) be two optional (resp. predictable) processes. If for every finite stopping time (resp. every finite predictable stopping time) T one has X_T = Y_T a.s., then the processes (X_t) and (Y_t) are indistinguishable.

Proof. It suffices to apply the section theorem to the optional (resp. predictable) set A = {(t, ω) : X_t(ω) ≠ Y_t(ω)}. Indeed, if the set A were not evanescent, there would exist a stopping time T whose graph would not be evanescent and which would be contained in A. This would imply the existence of some t ∈ R₊ such that X_{T∧t} would not be equal to Y_{T∧t} almost surely.

Theorem 3.3. Let (X_t) and (Y_t) be two optional (resp. predictable) processes. If for every stopping time (resp. every predictable stopping time) T one has:

E[X_T 1_{T<∞}] = E[Y_T 1_{T<∞}],  (3.1)

then the processes (X_t) and (Y_t) are indistinguishable².

Proof. It suffices to apply the section theorems to the sets:

A = {(t, ω) : X_t(ω) < Y_t(ω)},
A′ = {(t, ω) : X_t(ω) > Y_t(ω)}.

Remark 3.4. It is very important to check (3.1) for any stopping time, finite or not. Indeed, if (M_t) is a uniformly integrable martingale with M_0 = 0, then for every finite stopping time T, E[M_T 1_{T<∞}] = 0, but (M_t) is not usually indistinguishable from the null process.

To conclude this section, we give two other well known results as consequences of the section theorems. We first recall the definition of the class (D):

Definition 3.5 (class (D)). A process X is said to be of class (D) if the family {X_T 1_{T<∞}, T a stopping time} is uniformly integrable (T ranges through all stopping times).

² Two processes X and X′ defined on the same probability space are called indistinguishable if for almost all ω, X_t = X′_t for every t ≥ 0.

Proposition 3.6. Let (Z_t)_{t≥0} be an optional process. Assume that for all stopping times T, the random variable Z_T is in L¹ and E[Z_T] does not depend on T. Then (Z_t) is a uniformly integrable martingale which is right continuous (up to an evanescent set).

Proof. Let T be a stopping time and let A ∈ F_T. The assumptions of the proposition yield E[Z_{T_A}] = E[Z_∞], and hence:

∫_A Z_T dP + ∫_{A^c} Z_∞ dP = ∫_A Z_∞ dP + ∫_{A^c} Z_∞ dP.

Consequently, we have:

Z_T = E[Z_∞ | F_T] a.s.

Now define (X_t) as the càdlàg version of the martingale:

X_t = E[Z_∞ | F_t].

By the optional stopping theorem, we have X_T = E[Z_∞ | F_T], and with an application of the section theorem, we obtain that X and Z are indistinguishable; this completes the proof of the proposition.

4. Projection theorems

In this section we introduce the fundamental notions of optional (resp. predictable) projection and dual optional (resp. predictable) projection. These projections play a very important role in the general theory of stochastic processes. We shall give some nice applications in subsequent sections (in particular, we shall see how the knowledge of the dual predictable projection of some honest times may lead to quick proofs of multidimensional extensions of Paul Lévy's arc sine law). Here again, the material is standard in the general theory of stochastic processes, and the reader can refer to the books [24] or [26, 27] for more details and refinements.

4.1. The optional and predictable projections

By convention, we take F_{0−} = F_0.

Theorem 4.1. Let X be a measurable process, either positive or bounded. There exists a unique (up to indistinguishability) optional process Y such that:

E[X_T 1_{T<∞} | F_T] = Y_T 1_{T<∞} a.s.

for every stopping time T.

Definition 4.2. The process Y is called the optional projection of X, and is denoted by °X.

Proof. The uniqueness follows from the optional section theorem. The space of bounded processes X which admit an optional projection is a vector space. Moreover, let (X^n) be a uniformly bounded increasing sequence of processes with limit X, and suppose they admit projections (Y^n). The section theorem again shows that the sequence (Y^n) is a.s. increasing; it is easily checked that lim Y^n is a projection for X. By the monotone class theorem (as it is stated for example in [69], Theorem 2.2, p.3), it is now enough to prove the statement for a class of processes closed under pointwise multiplication and generating the sigma field F ⊗ B(R₊). Such a class is provided by the processes

X_t(ω) = 1_{[0,u[}(t) H(ω),  u ≥ 0, H ∈ L^∞(F).

Let (H_t) be a càdlàg version of the martingale E[H | F_t] (with the convention that H_∞ = H). The optional stopping theorem proves that Y_t = 1_{[0,u[}(t) H_t satisfies the condition of the statement. The proof is complete in the bounded case. For the general case we use the processes X ∧ n and pass to the limit.

Theorem 4.3. Let X be a measurable process, either positive or bounded. There exists a unique (up to indistinguishability) predictable process Z such that:

E[X_T 1_{T<∞} | F_{T−}] = Z_T 1_{T<∞} a.s.

for every predictable stopping time T.

Definition 4.4. The process Z is called the predictable projection of X, and is denoted by ᵖX.

Proof. The proof is exactly the same as the proof of Theorem 4.1, except that at the end one has to apply the predictable stopping theorem, which we give below.

Theorem 4.5 (Predictable stopping theorem). Let (X_t) be a right continuous and uniformly integrable martingale. Then for any predictable stopping time T, we have:

X_{T−} = E[X_T | F_{T−}] = E[X_∞ | F_{T−}].

Consequently, the predictable projection of X is the process (X_{t−}).

Proof. Let (T_n) be a sequence of stopping times that announces T; we then have:

X_{T−} = lim_n X_{T_n} = lim_n E[X_T | F_{T_n}] = lim_n E[X_∞ | F_{T_n}].

Since F_{T−} = ⋁_n F_{T_n}, the desired result follows easily.

Remark 4.6. In the proof of Theorem 4.1, we have in fact given some fundamental examples of projections: if H is an integrable random variable, and if (H_t) is a càdlàg version of the martingale E[H | F_t], then the optional projection of the constant process X_t(ω) = H(ω) is (H_t), and its predictable projection is (H_{t−}).

Corollary 4.7. If X is a predictable local martingale, then X is continuous.

Proof. By localization, it suffices to consider uniformly integrable martingales. Let T be a predictable stopping time. Since X is predictable, X_T is F_{T−} measurable and hence:

E[X_T | F_{T−}] = X_T,

and from the predictable stopping theorem, X_T = X_{T−}. As this holds for any predictable stopping time and X is predictable, we conclude that X has no jumps.

We can also mention another corollary of the predictable stopping theorem, which is not so well known:

Corollary 4.8 ([27], p.99). Let (X_t) be a right continuous local martingale. Then for any predictable stopping time T, we have:

E[|X_T| 1_{T<∞} | F_{T−}] < ∞ a.s.,

and

E[X_T 1_{T<∞} | F_{T−}] = X_{T−} 1_{T<∞}.

By considering T_Λ, where Λ ∈ F_T or Λ ∈ F_{T−}, and where T is a stopping time or a predictable stopping time, we may as well restate Theorems 4.1 and 4.3 as:

Theorem 4.9. Let X be a measurable process, either positive or bounded.
1. There exists a unique (up to indistinguishability) optional process Y such that:

E[X_T 1_{T<∞}] = E[Y_T 1_{T<∞}]

for any stopping time T.
2. There exists a unique (up to indistinguishability) predictable process Z such that:

E[X_T 1_{T<∞}] = E[Z_T 1_{T<∞}]

for any predictable stopping time T.

Remark 4.10. Optional and predictable projections share some common properties with the conditional expectation: if X is a measurable process and Y is an optional (resp. predictable) bounded process, then °(XY) = Y °X (resp. ᵖ(XY) = Y ᵖX).
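Remark 4.6 can be checked by simulation. Take H = B_1 for a Brownian motion B, so that H_t = E[H | F_t] = B_{t∧1}, and let T be the first time B reaches a level a before time 1 (T = ∞ otherwise). Theorem 4.9 then predicts E[H 1_{T<∞}] = E[H_T 1_{T<∞}]. A Monte Carlo sketch (function name and parameters are ours):

```python
import random

def projection_check(n_paths=10000, steps=200, a=0.8, seed=3):
    """Compare E[H 1_{T<∞}] with E[H_T 1_{T<∞}] for H = B_1,
    H_t = B_{t∧1}, and T = inf{t ≤ 1 : B_t ≥ a} (∞ if never hit)."""
    random.seed(seed)
    dt = 1.0 / steps
    lhs = rhs = 0.0
    for _ in range(n_paths):
        b, hit = 0.0, None
        for _ in range(steps):
            b += random.gauss(0, dt ** 0.5)
            if hit is None and b >= a:
                hit = b          # B_T, i.e. the projection H_T on {T < ∞}
        if hit is not None:
            lhs += b             # H = B_1 on this path
            rhs += hit           # H_T = B_T
    return lhs / n_paths, rhs / n_paths

print(projection_check())  # the two estimates nearly coincide
```

The identity holds exactly for the discretized walk as well (it is optional stopping for the martingale B), so the two estimates differ only by Monte Carlo noise.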
Now, we state and prove an important theorem about the difference between the optional and predictable sigma fields.

Theorem 4.11 ([69], p.174). Let I be the sigma field generated by the processes (M_t − M_{t−})_{t≥0}, where M ranges through the bounded (F_t) martingales; then:

O = P ∨ I.

In particular, if all (F_t) martingales are continuous, then O = P and every stopping time is predictable.

Proof. Since every optional process is its own optional projection, it is enough to show that every optional projection is measurable with respect to P ∨ I. But this is obvious for the processes 1_{[0,u[}(t) H considered in the proof of Theorem 4.1, and an application of the monotone class theorem completes the proof.

Example 4.12. In a Brownian filtration, every stopping time is predictable.

To conclude this paragraph, let us give an example from filtering theory.

Example 4.13 ([69], exercise 5.15, p.175). Let (F_t) be a filtration, B an (F_t) Brownian Motion and h a bounded optional process. Let

Y_t = ∫_0^t h_s ds + B_t;

if ĥ is the optional projection of h with respect to (F_t^Y), where F_t^Y = σ(Y_s, s ≤ t), then the process

N_t = Y_t − ∫_0^t ĥ_s ds

is an (F_t^Y) Brownian Motion. The process N is called the innovation process.

4.2. Increasing processes and projections³

Definition 4.14. We shall call an increasing process a process (A_t) which is nonnegative, (F_t) adapted, and whose paths are increasing and càdlàg.

Remark 4.15. A stochastic process which is nonnegative, and whose paths are increasing and càdlàg, but which is not (F_t) adapted, is called a raw increasing process.

Increasing processes play a central role in the general theory of stochastic processes: the main idea is to think of an increasing process as a random measure on R₊, dA_t(ω), whose distribution function is A_·(ω). That is why we shall make the convention that A_{0−} = 0, so that A_0 is the measure of {0}. The increasing process A is called integrable if E[A_∞] < ∞. A process which can be written as the difference of two increasing processes (resp. integrable increasing processes) is called a process of finite variation (resp. a process of integrable variation).

³ We use again the convention F_{0−} = F_0.
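Example 4.13 can be made concrete in the simplest filtering model: let h be a random constant equal to ±1 with probability 1/2 each, independent of B. A classical computation (not carried out in the text; we use it here as an assumption) gives the optional projection ĥ_t = tanh(Y_t), so the innovation N_t = Y_t − ∫_0^t tanh(Y_s) ds should behave like a Brownian motion. A Monte Carlo sketch with names of our own:

```python
import math
import random

def innovation_demo(n_paths=5000, steps=200, seed=4):
    """Simulate N_1 = Y_1 - ∫_0^1 tanh(Y_s) ds for Y_t = h t + B_t with
    h = ±1 equally likely; returns the sample mean and variance of N_1,
    which should be close to 0 and 1 if N is a Brownian motion."""
    random.seed(seed)
    dt = 1.0 / steps
    samples = []
    for _ in range(n_paths):
        h = random.choice((-1.0, 1.0))
        y = n = 0.0
        for _ in range(steps):
            dy = h * dt + random.gauss(0, math.sqrt(dt))
            n += dy - math.tanh(y) * dt  # dN = dY - ĥ_t dt (Euler step)
            y += dy
        samples.append(n)
    mean = sum(samples) / n_paths
    var = sum((s - mean) ** 2 for s in samples) / n_paths
    return mean, var

print(innovation_demo())  # roughly (0, 1)
```

The striking point of the example is that N is a Brownian motion in the smaller observation filtration (F_t^Y), even though the drift h is never observed directly.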

Now, let (A_t) be an increasing process and define its right continuous inverse:

C_t = inf{s : A_s > t}.

We also have its left continuous inverse:

C_{t−} = inf{s : A_s ≥ t}.

C_t is the début of an optional set and hence it is a stopping time. Similarly, C_{t−} is a stopping time, which is predictable if (A_t) is predictable (as the début of a right closed predictable set). Now, we recall the following time change formula ([27], p.132): for every nonnegative process Z, we have:

∫_{[0,∞[} Z_s dA_s = ∫_0^∞ Z_{C_{s−}} 1_{C_{s−}<∞} ds  (4.1)
              = ∫_0^∞ Z_{C_s} 1_{C_s<∞} ds.  (4.2)

With the time change formula, we can now prove the following theorem:

Theorem 4.16. Let X be a nonnegative measurable process and let (A_t) be an increasing process. Then we have:

E[∫_{[0,∞[} X_s dA_s] = E[∫_{[0,∞[} °X_s dA_s],

and, if (A_t) is predictable,

E[∫_{[0,∞[} X_s dA_s] = E[∫_{[0,∞[} ᵖX_s dA_s].

Proof. We shall prove the second equality, which is the more difficult one. From (4.1), we have:

E[∫_{[0,∞[} X_s dA_s] = E[∫_0^∞ X_{C_{s−}} 1_{C_{s−}<∞} ds] = ∫_0^∞ ds E[X_{C_{s−}} 1_{C_{s−}<∞}].

Similarly, we can show that:

E[∫_{[0,∞[} ᵖX_s dA_s] = ∫_0^∞ ds E[ᵖX_{C_{s−}} 1_{C_{s−}<∞}].

Now, since C_{s−} is a predictable stopping time, we have from the definition of predictable projections:

E[X_{C_{s−}} 1_{C_{s−}<∞}] = E[ᵖX_{C_{s−}} 1_{C_{s−}<∞}],

and this completes the proof of the theorem.
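For a pure-jump increasing process the time change formula (4.2) can be verified numerically: if A jumps by ∆A_i at times t_i, the left side is Σ_i Z_{t_i} ∆A_i, while C_s equals t_i for s in an interval of length ∆A_i. A sketch (all names are ours):

```python
def stieltjes_integral(jump_times, jump_sizes, Z):
    """∫_{[0,∞[} Z_s dA_s for a pure-jump increasing process A."""
    return sum(Z(t) * da for t, da in zip(jump_times, jump_sizes))

def C(s, jump_times, jump_sizes):
    """Right continuous inverse C_s = inf{t : A_t > s}."""
    a = 0.0
    for t, da in zip(jump_times, jump_sizes):
        a += da
        if a > s:
            return t
    return float("inf")

def time_changed_integral(jump_times, jump_sizes, Z, n=100000):
    """∫_0^∞ Z_{C_s} 1_{C_s<∞} ds, approximated by a Riemann sum in s."""
    total_mass = sum(jump_sizes)   # C_s = ∞ beyond A_∞, so integrate up to here
    ds = total_mass / n
    return sum(Z(C((k + 0.5) * ds, jump_times, jump_sizes)) * ds
               for k in range(n))

times, sizes = [0.5, 1.0, 2.5], [1.0, 2.0, 0.5]
Z = lambda t: t * t + 1.0
print(stieltjes_integral(times, sizes, Z))     # 8.875
print(time_changed_integral(times, sizes, Z))  # ≈ 8.875
```

The change of variables simply redistributes the mass dA over the flat stretches of the inverse C, which is what makes the two integrals agree.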

Example 4.17. One often uses the above theorem in the following way. Take $X_t(\omega) \equiv a(t)\,H(\omega)$, where $a(t)$ is a nonnegative Borel function and $H$ a nonnegative random variable. Let $(H_t)$ be a càdlàg version of the martingale $E[H \mid \mathcal{F}_t]$. An application of Theorem 4.16 yields:
$$E\left[H \int_{[0,\infty[} a(s)\,dA_s\right] = E\left[\int_{[0,\infty[} H_s\,a(s)\,dA_s\right],$$
if $A$ is optional, and
$$E\left[H \int_{[0,\infty[} a(s)\,dA_s\right] = E\left[\int_{[0,\infty[} H_{s-}\,a(s)\,dA_s\right],$$
if $A$ is predictable. In particular, for all uniformly integrable $(\mathcal{F}_t)$ martingales $(M_t)$, we have (we assume $A_0 = 0$):
$$E[M_\infty A_\infty] = E\left[\int_{[0,\infty[} M_s\,dA_s\right],$$
if $A$ is optional, and
$$E[M_\infty A_\infty] = E\left[\int_{[0,\infty[} M_{s-}\,dA_s\right],$$
if $A$ is predictable.

Now, we give a sort of converse to Theorem 4.16:

Theorem 4.18 ([27], p.136). Let $(A_t)$ be a raw increasing process which is further assumed to be integrable ($E[A_\infty] < \infty$).
1. If for all bounded and measurable processes $X$ we have
$$E\left[\int_{[0,\infty[} X_s\,dA_s\right] = E\left[\int_{[0,\infty[} {}^o X_s\,dA_s\right],$$
then $A$ is optional.
2. If for all bounded and measurable processes $X$ we have
$$E\left[\int_{[0,\infty[} X_s\,dA_s\right] = E\left[\int_{[0,\infty[} {}^p X_s\,dA_s\right],$$
then $A$ is predictable.

4.3. Random measures on $(\mathbb{R}_+ \times \Omega)$ and the dual projections

Definition 4.19. We call P-measure a bounded measure on the sigma field $\mathcal{B}(\mathbb{R}_+) \otimes \mathcal{F}$ (resp. $\mathcal{O}$, $\mathcal{P}$) which does not charge the P-evanescent sets of $\mathcal{B}(\mathbb{R}_+) \otimes \mathcal{F}$ (resp. $\mathcal{O}$, $\mathcal{P}$).

We can construct a P-measure $\mu$ in the following way: let $(A_t)$ be a raw increasing process which is integrable; for any bounded measurable process $X$, set
$$\mu(X) \equiv E\left[\int_{[0,\infty[} X_s\,dA_s\right].$$
In fact, quite remarkably, all P-measures are of this form:

Theorem 4.20 ([27], p.141). Let $\mu$ be a nonnegative P-measure (resp. a P-measure) on $\mathcal{B}(\mathbb{R}_+) \otimes \mathcal{F}$. Then, there exists a unique raw and integrable increasing process $(A_t)$ (resp. a raw process of integrable variation), up to indistinguishability, such that for any bounded measurable process $X$,
$$\mu(X) = E\left[\int_{[0,\infty[} X_s\,dA_s\right].$$
We say that $A$ is the integrable increasing process (resp. the process of integrable variation) associated with $\mu$. Furthermore, $(A_t)$ is optional (resp. predictable) if and only if for any bounded measurable process $X$:
$$\mu(X) = \mu({}^o X); \quad \text{resp. } \mu(X) = \mu({}^p X).$$

Definition 4.21. A P-measure is called optional (resp. predictable) if for any bounded measurable process $X$:
$$\mu(X) = \mu({}^o X); \quad \text{resp. } \mu(X) = \mu({}^p X).$$

Remark 4.22. From Theorem 4.18, the increasing process associated with an optional (resp. predictable) P-measure $\mu$ is optional (resp. predictable).

Now, we give some interesting consequences of Theorem 4.20. The first one is useful to prove the uniqueness of the Doob-Meyer decomposition of supermartingales.

Theorem 4.23. Let $(A_t)$ and $(B_t)$ be two optional processes of integrable variation. If for any stopping time $T$,
$$E[A_\infty - A_{T-} \mid \mathcal{F}_T] = E[B_\infty - B_{T-} \mid \mathcal{F}_T] \quad \text{a.s.} \qquad (A_{0-} = B_{0-} = 0),$$
then $A$ and $B$ are indistinguishable. Similarly, if $(A_t)$ and $(B_t)$ are two predictable processes of integrable variation and if for any $t \ge 0$,
$$E[A_\infty - A_{t-} \mid \mathcal{F}_t] = E[B_\infty - B_{t-} \mid \mathcal{F}_t] \quad \text{a.s.},$$
then $A$ and $B$ are indistinguishable.

Proof. We prove only the optional case, the proof for the predictable case being the same. Let $\mu$ be the P-measure associated with $A - B$. The condition of the theorem entails that for any stopping time $T$:
$$\mu([T, \infty[) = 0,$$
and consequently, $\mu(U) = 0$ whenever $U$ is a finite and disjoint union of stochastic intervals of the form $[S, T[$. Hence $\mu$ is null on a Boole algebra which generates the optional sigma field, and thus $\mu({}^o X) = 0$ for all bounded measurable processes $X$; from Theorem 4.20, $\mu$ is null on $\mathcal{B}(\mathbb{R}_+) \otimes \mathcal{F}$, and $A - B = 0$.

Remark 4.24. The above theorem states that the processes $(A_\infty - A_{t-})$ and $(B_\infty - B_{t-})$ have the same optional projection. Moreover, as already observed for the projection theorems, one can replace the conditions of the theorem, in the optional case, with the unconditional form:
$$E[A_\infty - A_{T-}] = E[B_\infty - B_{T-}] \quad \text{for any stopping time } T.$$
Indeed, it suffices to consider the stopping times $T_H$, with $H \in \mathcal{F}_T$ (where $T_H$ equals $T$ on $H$ and $+\infty$ otherwise).

Remark 4.25. For the predictable case, one must be more cautious. First, we note that the condition of the theorem, combined with the right continuity of the paths, entails:
$$E[A_\infty - A_{T-} \mid \mathcal{F}_T] = E[B_\infty - B_{T-} \mid \mathcal{F}_T] \quad \text{a.s.}$$
for any stopping time $T$. From this we now deduce the unconditional form
$$E[A_\infty - A_{T-}] = E[B_\infty - B_{T-}] \quad \text{for any stopping time } T.$$
In this case, $(A_\infty - A_{t-})$ and $(B_\infty - B_{t-})$ have the same optional projection.

Now, we define projections of P-measures:

Definition 4.26. Let $\mu$ be a P-measure on $\mathcal{B}(\mathbb{R}_+) \otimes \mathcal{F}$. We call optional (resp. predictable) projection of $\mu$ the P-measure $\mu^o$ (resp. $\mu^p$) defined on $\mathcal{B}(\mathbb{R}_+) \otimes \mathcal{F}$ by:
$$\mu^o(X) = \mu({}^o X) \quad (\text{resp. } \mu^p(X) = \mu({}^p X)),$$
for all measurable and bounded processes $X$.

Example 4.27. Let $\mu$ be a nonnegative P-measure with a density $f$:
$$\mu(X) = E\left[\int_0^\infty X_s(\omega)\,f(s, \omega)\,ds\right] \qquad \left(f \ge 0,\ E\left[\int_0^\infty f(s, \omega)\,ds\right] < \infty\right).$$

Then we have:
$$\mu^o(X) = \mu^p(X) = E\left[\int_0^\infty X_s(\omega)\,g(s, \omega)\,ds\right],$$
where $g$ is any nonnegative measurable process such that $g(s, \cdot) = E[f(s, \cdot) \mid \mathcal{F}_s]$ for almost all $s$. For example, $g$ can be the optional or predictable projection of $f$.

But usually, unlike the above example, $\mu^o$ (or $\mu^p$) is not associated with ${}^o A$ (or ${}^p A$); this leads us to the following fundamental definition of dual projections:

Definition 4.28. Let $(A_t)$ be an integrable raw increasing process. We call dual optional projection of $A$ the (optional) increasing process $(A^o_t)$ defined by:
$$E\left[\int_{[0,\infty[} X_s\,dA^o_s\right] = E\left[\int_{[0,\infty[} {}^o X_s\,dA_s\right],$$
for any bounded measurable $X$. We call dual predictable projection of $A$ the predictable increasing process $(A^p_t)$ defined by:
$$E\left[\int_{[0,\infty[} X_s\,dA^p_s\right] = E\left[\int_{[0,\infty[} {}^p X_s\,dA_s\right],$$
for any bounded measurable $X$.

Remark 4.29. The above definition extends in a straightforward way to processes of integrable variation.

Remark 4.30. Formally, the projection operation consists in defining conveniently the process $E[A_t \mid \mathcal{F}_t]$, whereas the dual projection operation consists in defining the symbolic integral $\int_0^t E[dA_s \mid \mathcal{F}_s]$. In particular, the dual projection of a bounded process need not be bounded (for example, the dual projection of the indicator process associated with an honest time, as will be explained in a subsequent section).

Now, we shall try to compute the jumps of the dual projections.

Proposition 4.31. Let $(A_t)$ and $(B_t)$ be two raw processes of integrable variation.
1. $A$ and $B$ have the same dual optional projection if and only if for any stopping time $T$:
$$E[A_\infty - A_{T-} \mid \mathcal{F}_T] = E[B_\infty - B_{T-} \mid \mathcal{F}_T] \quad \text{a.s.} \qquad (A_{0-} = B_{0-} = 0).$$
2. $A$ and $B$ have the same dual predictable projection if and only if for every $t \ge 0$:
$$E[A_\infty - A_{t-} \mid \mathcal{F}_t] = E[B_\infty - B_{t-} \mid \mathcal{F}_t] \quad \text{a.s.}$$
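Definition 4.28 has an elementary discrete-time analogue which can be verified exhaustively on a finite probability space: on the coin filtration, the dual predictable projection of a raw increasing process has increments $E[\Delta A_k \mid \mathcal{F}_{k-1}]$, and the defining duality then holds exactly. A sketch under these discrete conventions (the random time $\rho$ below, a last exit, is deliberately not a stopping time, so $A$ is raw):

```python
from itertools import product

# Discrete-time sketch of Definition 4.28 on Omega = {-1,+1}^3 with the
# uniform measure and the coin filtration.  The dual predictable projection
# of a raw increasing process A has increments E[dA_k | F_{k-1}], and the
# duality E[sum_k X_k dA^p_k] = E[sum_k (pX)_k dA_k] holds exactly, with
# (pX)_k = E[X_k | F_{k-1}].  The time rho (last toss equal to +1, rho = 0
# if none) is not a stopping time, so A_t = 1_{rho <= t} is raw.
paths = list(product([-1, 1], repeat=3))    # 8 equally likely paths

def rho(w):
    return max([k for k, x in enumerate(w, start=1) if x == 1], default=0)

def dA(w, k):                               # jump of 1_{rho <= t} at time k
    return 1.0 if rho(w) == k else 0.0

def cond_exp(f, k, w):                      # E[f | F_{k-1}] evaluated at w
    atom = [v for v in paths if v[:k - 1] == w[:k - 1]]
    return sum(f(v) for v in atom) / len(atom)

def dAp(w, k):                              # increment of the dual projection
    return cond_exp(lambda v: dA(v, k), k, w)

def X(w, k):                                # bounded, non-adapted test process
    return float(w[2]) * k

lhs = sum(sum(X(w, k) * dAp(w, k) for k in (1, 2, 3)) for w in paths) / 8
rhs = sum(sum(cond_exp(lambda v: X(v, k), k, w) * dA(w, k)
              for k in (1, 2, 3)) for w in paths) / 8
print(lhs, rhs)   # equal: both compute E[ sum_k E[X_k|F_{k-1}] dA_k ]
```

The equality is an instance of the tower property, which is exactly the mechanism behind the continuous-time definition.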

Proof. It is exactly the same as the proof of Theorem 4.23.

Theorem 4.32. Let $(A_t)$ be a raw process of integrable variation.
1. Let $T$ be a stopping time. Then the jump of $A^o$ at $T$ is given by:
$$\Delta A^o_T = E[\Delta A_T \mid \mathcal{F}_T] \quad \text{a.s.},$$
with the convention that $A^o_{0-} = 0$.
2. Let $T$ be a predictable stopping time. Then the jump of $A^p$ at $T$ is given by:
$$\Delta A^p_T = E[\Delta A_T \mid \mathcal{F}_{T-}] \quad \text{a.s.},$$
with the convention that $A^p_{0-} = 0$.

Proof. We only deal with the optional case (the predictable case can be dealt with similarly). From Proposition 4.31, we have:
$$E[A_\infty - A_{T-} \mid \mathcal{F}_T] = E[A^o_\infty - A^o_{T-} \mid \mathcal{F}_T].$$
Similarly, we have:
$$E[A_\infty - A_T \mid \mathcal{F}_T] = E[A^o_\infty - A^o_T \mid \mathcal{F}_T],$$
and consequently
$$E[\Delta A_T \mid \mathcal{F}_T] = E[\Delta A^o_T \mid \mathcal{F}_T].$$
Now, the result follows from the fact that $\Delta A^o_T$ is $\mathcal{F}_T$ measurable.

Now we define predictable compensators.

Definition 4.33. Let $(A_t)$ be an optional process of integrable variation. The dual predictable projection of $A$, which we shall denote by $\tilde{A}$, is also called the predictable compensator of $A$.

Why is $\tilde{A}$ called a compensator? From Proposition 4.31, we have:
$$A_t - \tilde{A}_t = E[A_\infty - \tilde{A}_\infty \mid \mathcal{F}_t] \quad \text{a.s., and } A_0 = \tilde{A}_0.$$
Consequently, $A - \tilde{A}$ is a martingale, and $\tilde{A}$ is the process that one has to subtract from $A$ to obtain a martingale which vanishes at $0$. This is for example what one does to go from the Poisson process $N_t$ to the compensated Poisson process $N_t - \lambda t$.

Another classical application is concerned with totally inaccessible stopping times:

Proposition 4.34. Let $T$ be a stopping time which is strictly positive. Then the following are equivalent:
1. $T$ is totally inaccessible;
2. there exists a uniformly integrable martingale $(M_t)$, with $M_0 = 0$, which is continuous outside $[T]$, and such that $\Delta M_T = 1$ on $\{T < \infty\}$.
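The compensated Poisson process mentioned above can be checked by simulation: with $\tilde{A}_t = \lambda t$, the process $M_t = N_t - \lambda t$ is centered and has uncorrelated increments. A Monte Carlo sketch (the tolerances are statistical, not exact):

```python
import numpy as np

# The predictable compensator of a Poisson process N of rate lam is
# A~_t = lam * t, and M_t = N_t - lam * t is a martingale vanishing at 0.
# Monte Carlo sanity check: M_t is centered, and its increment over (s, t]
# is uncorrelated with M_s.
rng = np.random.default_rng(1)
lam, s, t, n = 3.0, 1.0, 2.5, 200000

Ms = rng.poisson(lam * s, n) - lam * s                   # M_s
incr = rng.poisson(lam * (t - s), n) - lam * (t - s)     # M_t - M_s
Mt = Ms + incr

print(Mt.mean())                      # ~ 0 : E[M_t] = 0
print(np.cov(incr, Ms)[0, 1])         # ~ 0 : independent increments
```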

Proof. (2) ⇒ (1). We have to prove that $P[T = S < \infty] = 0$ for any predictable stopping time $S$. We have:
$$\Delta M_S = \Delta M_S\,1_{\{S < \infty\}} = \Delta M_T\,1_{\{S = T < \infty\}} = 1_{\{S = T < \infty\}}.$$
Now, since $S$ is predictable,
$$E[M_S] = E[M_0] = E[M_{S-}],$$
and hence
$$P[T = S < \infty] = E[\Delta M_S] = 0.$$
(1) ⇒ (2). Let $A_t = 1_{\{T \le t\}}$ and let $\tilde{A}$ denote its predictable compensator. From Theorem 4.32, for all predictable stopping times $S$, we have:
$$\Delta \tilde{A}_S = E[\Delta A_S \mid \mathcal{F}_{S-}] = 0 \quad \text{a.s.},$$
since $P[T = S < \infty] = 0$ ($T$ being totally inaccessible). Consequently, $\tilde{A}$ is continuous. Now, it is easy to check that $M \equiv A - \tilde{A}$ satisfies the required properties.

Now, we shall give the law of $\tilde{A}_T = \tilde{A}_\infty$.

Proposition 4.35 (Azéma [2]). Let $T$ be a finite, totally inaccessible stopping time. Let $(\tilde{A}_t)$ be the predictable compensator of $1_{\{T \le t\}}$. Then $\tilde{A}_T$ is exponentially distributed with parameter 1.

Proof. Let $M_t = 1_{\{T \le t\}} - \tilde{A}_t$ be the martingale introduced in the proof of Proposition 4.34. We associate with $f$, a Borel bounded function, the stochastic integral
$$M^f_t \equiv \int_0^t f(\tilde{A}_s)\,dM_s.$$
We can compute $M^f_t$ more explicitly:
$$M^f_t = f(\tilde{A}_T)\,1_{\{T \le t\}} - \int_0^t f(\tilde{A}_s)\,d\tilde{A}_s = f(\tilde{A}_T)\,1_{\{T \le t\}} - F(\tilde{A}_t),$$
where $F(t) = \int_0^t f(s)\,ds$. An application of the optional stopping theorem yields $E[M^f_\infty] = 0$, which implies:
$$E[f(\tilde{A}_T)] = E[F(\tilde{A}_T)].$$
As this last equality holds for every Borel bounded function, the law of $\tilde{A}_T$ must be the standard exponential law.

Remark 4.36. The above proposition has some nice applications to honest times that avoid stopping times, in the theory of progressive enlargements of filtrations, as we shall see later (see [42], [64]).

Now, we give two nice applications of dual projections. First, we refine Theorem 4.18:
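Proposition 4.35 can be illustrated with the first jump time $T$ of a Poisson process of rate $\lambda$: $T$ is totally inaccessible and the predictable compensator of $1_{\{T \le t\}}$ is $\tilde{A}_t = \lambda (t \wedge T)$ (a standard fact, assumed here rather than proved). Hence $\tilde{A}_T = \lambda T$ should be Exp(1), whatever the value of $\lambda$:

```python
import numpy as np

# Proposition 4.35 for the first jump time T of a Poisson process of rate
# lam: the predictable compensator of 1_{T <= t} is A~_t = lam * min(t, T)
# (standard fact, assumed here).  Hence A~_T = lam * T must be Exp(1).
rng = np.random.default_rng(2)
lam, n = 7.0, 200000
T = rng.exponential(1.0 / lam, n)     # first jump time, T ~ Exp(lam)
AT = lam * T                          # the compensator evaluated at T

print(AT.mean(), AT.var())            # Exp(1): mean 1, variance 1
x = 1.5
print((AT > x).mean(), np.exp(-x))    # tail P(A~_T > x) = e^{-x}
```

Note that the parameter $\lambda$ cancels out exactly, as the proposition predicts.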

Theorem 4.37. Let $(A_t)$ be a raw increasing process such that $E[A_t] < \infty$ for all $t$.
1. If for all bounded $(\mathcal{F}_t)$ martingales $X$ we have
$$E\left[\int_0^t X_s\,dA_s\right] = E[X_\infty A_t], \quad \text{for all } t \ge 0,$$
then $A$ is optional.
2. If for all bounded $(\mathcal{F}_t)$ martingales $X$ we have
$$E\left[\int_0^t X_{s-}\,dA_s\right] = E[X_\infty A_t], \quad \text{for all } t \ge 0,$$
then $A$ is predictable.

Proof. If $(X_t)$ is a bounded $(\mathcal{F}_t)$ martingale, then:
$$E\left[\int_0^t X_s\,dA_s\right] = E\left[\int_0^t X_s\,dA^o_s\right] = E[X_t A^o_t] = E[X_\infty A^o_t],$$
and similarly:
$$E\left[\int_0^t X_{s-}\,dA_s\right] = E\left[\int_0^t X_{s-}\,dA^p_s\right] = E[X_t A^p_t] = E[X_\infty A^p_t].$$
Consequently, under hypothesis (1), one obtains:
$$E[X_\infty A_t] = E[X_\infty A^o_t]$$
for every bounded random variable $X_\infty$, which implies $A_t = A^o_t$ a.s. Likewise, under hypothesis (2), $A_t = A^p_t$ a.s.

As a by-product of Theorem 4.37, we can show the characterization of stopping times by Knight and Maisonneuve. We introduce the sigma field $\mathcal{F}_\rho$ associated with an arbitrary random time $\rho$, i.e. a nonnegative random variable:
$$\mathcal{F}_\rho = \sigma\{z_\rho,\ (z_t) \text{ any } (\mathcal{F}_t) \text{ optional process}\}.$$

Theorem 4.38 (Knight-Maisonneuve [48]). If for all uniformly integrable $(\mathcal{F}_t)$ martingales $(M_t)$ one has
$$E[M_\infty \mid \mathcal{F}_\rho] = M_\rho \quad \text{on } \{\rho < \infty\},$$
then $\rho$ is an $(\mathcal{F}_t)$ stopping time (the converse is Doob's optional stopping theorem).

Proof. Let $(A^\rho_t)$ denote the dual optional projection of the raw increasing process $1_{\{\rho \le t\}}$. For $t \ge 0$ we have
$$E\left[M_\infty\,1_{\{\rho \le t\}}\right] = E\left[M_\rho\,1_{\{\rho \le t\}}\right] = E\left[\int_0^t M_s\,dA^\rho_s\right] = E[M_\infty A^\rho_t].$$
Comparing the two extreme terms, we get
$$1_{\{\rho \le t\}} = A^\rho_t \quad \text{a.s.},$$
i.e. $\rho$ is an $(\mathcal{F}_t)$ stopping time.

Remark 4.39. The result of Knight and Maisonneuve suggests the following questions:
- How may $E[M_\infty \mid \mathcal{F}_\rho]$ on the one hand, and $M_\rho$ on the other hand, differ for a non stopping time $\rho$? The reader can refer to [6], [81] or [64] for some answers.
- Given an arbitrary random time $\rho$, is it possible to characterize the set of martingales which satisfy (2.1)? (see [6], [81] or [64]).

Now we shall see how an application of dual projections and their simple properties gives a simple proof of a multidimensional extension of Lévy's arc sine law. The results that follow are borrowed from [62]. Consider $(R_t, L_t)$, where $(R_t)$ is a Bessel process of dimension $\delta \equiv 2(1 - \mu) \in (0, 2)$, starting from $0$, and $(L_t)$ a normalization of its local time at level zero (see [62] for more precision on this normalization).

Lemma 4.40 ([62]). Let $g_\mu \equiv \sup\{t \le 1 : R_t = 0\}$. Then the dual predictable projection $A^{g_\mu}_t$ of $1_{\{g_\mu \le t\}}$ is:
$$A^{g_\mu}_t = \frac{1}{2^\mu \Gamma(1 + \mu)} \int_0^{t \wedge 1} \frac{dL_u}{(1 - u)^\mu},$$
i.e. for every nonnegative predictable process $(x_t)$,
$$E[x_{g_\mu}] = \frac{1}{2^\mu \Gamma(1 + \mu)}\,E\left[\int_0^1 \frac{x_u\,dL_u}{(1 - u)^\mu}\right].$$

Remark 4.41. The random time $g_\mu$ is not a stopping time; it is a typical example of an honest time (we shall define these times in a subsequent section).

Now we give a result which was obtained by Barlow, Pitman and Yor using excursion theory. The proof we give is borrowed from [62].

Proposition 4.42 ([14], [62]). The variable $g_\mu$ follows the law:
$$P(g_\mu \in dt) = \frac{\sin(\mu\pi)}{\pi}\,\frac{dt}{t^{1-\mu}(1 - t)^\mu}, \quad 0 < t < 1,$$
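Proposition 4.42 with $\mu = 1/2$ (the arc sine law for the last zero of Brownian motion before time 1) can be approximated by a simple random walk: the last return to 0, rescaled by the number of steps, is approximately arc sine distributed. A Monte Carlo sketch:

```python
import numpy as np

# Proposition 4.42 with mu = 1/2: the last zero g of Brownian motion before
# time 1 is arc sine distributed, P(g <= x) = (2/pi) arcsin(sqrt(x)).
# Discrete sketch: last return to 0 of a simple random walk, rescaled.
rng = np.random.default_rng(3)
n_steps, n_paths = 1000, 20000
S = np.cumsum(rng.choice([-1, 1], size=(n_paths, n_steps)), axis=1)

zero = (S == 0)
last = np.where(zero.any(axis=1),
                n_steps - 1 - np.argmax(zero[:, ::-1], axis=1), 0)
g = last / n_steps                         # rescaled last zero in [0, 1]

x = 0.25
print((g <= x).mean())                     # empirical CDF at 1/4
print(2 / np.pi * np.arcsin(np.sqrt(x)))   # continuous limit: 1/3
```

The agreement is up to Monte Carlo error plus the $O(1/\sqrt{n})$ discretization of the walk; by symmetry of the law, the empirical mean of $g$ should also be close to $1/2$.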

i.e. the Beta law with parameters $(\mu, 1 - \mu)$. In particular, for $\mu = 1/2$,
$$P(g \in dt) = \frac{1}{\pi}\,\frac{dt}{\sqrt{t(1 - t)}},$$
i.e. $g$ is arc sine distributed ([49]).

Proof. From Lemma 4.40, for every Borel function $f : [0, 1] \to \mathbb{R}_+$, we have:
$$E[f(g_\mu)] = \frac{1}{2^\mu \mu \Gamma(\mu)}\,E\left[\int_0^1 f(u)\,\frac{dL_u}{(1 - u)^\mu}\right] = \frac{1}{2^\mu \mu \Gamma(\mu)} \int_0^1 f(u)\,\frac{d_u E[L_u]}{(1 - u)^\mu}. \qquad (4.3)$$
By the scaling property of $(L_t)$, $E[L_u] = u^\mu E[L_1]$. Moreover, by definition of $(L_t)$ (see [62] or [21]),
$$E[L_1] = E\left[R_1^{2\mu}\right];$$
since $R_1^2$ is distributed as $2\,\mathrm{gam}(1 - \mu)$, where $\mathrm{gam}(1 - \mu)$ denotes a random variable which follows the gamma law with parameter $1 - \mu$, we have
$$E\left[R_1^{2\mu}\right] = \frac{2^\mu}{\Gamma(1 - \mu)}.$$
Now, plugging this into (4.3) yields:
$$E[f(g_\mu)] = \frac{1}{\Gamma(\mu)\Gamma(1 - \mu)} \int_0^1 \frac{f(u)\,du}{u^{1-\mu}(1 - u)^\mu}.$$
To conclude, it suffices to use the reflection formula for the Gamma function ([1]):
$$\Gamma(\mu)\Gamma(1 - \mu) = \frac{\pi}{\sin(\mu\pi)}.$$

We shall conclude this section by giving a result which is useful in stochastic integration. First, we need to introduce the notion of locally integrable increasing process.

Definition 4.43. A raw increasing process, such that $A_0 = 0$, is said to be locally integrable if there exists an increasing sequence of stopping times $(T_n)$, such that $\lim_{n \to \infty} T_n = \infty$, and $E[A_{T_n}] < \infty$.

Theorem 4.44 ([27], Theorem 8, p.153).
1. Every predictable process of finite variation is locally integrable.
2. An optional process $A$ of finite variation is locally integrable if and only if there exists a predictable process $\tilde{A}$ of finite variation such that $(A - \tilde{A})$ is a local martingale which vanishes at $0$. When it exists, $\tilde{A}$ is unique. We say that $\tilde{A}$ is the predictable compensator of $A$.

Remark 4.45. We shall see an application of this result to the existence of the bracket $\langle M \rangle$ of a local martingale in the next section.

5. The Doob-Meyer decomposition and multiplicative decompositions

This section is devoted to a fundamental result in Probability Theory: the Doob-Meyer decomposition theorem. This result is now standard and a proof of it can be found for example in [27, 68, 70].

Theorem 5.1 (Doob-Meyer decomposition). An adapted càdlàg process $Y$ is a submartingale of class (D) null at $0$ if and only if $Y$ may be written:
$$Y_t = M_t + A_t \qquad (5.1)$$
where $M$ is a uniformly integrable martingale null at $0$ and $A$ a predictable integrable increasing process null at $0$. Moreover, the decomposition (5.1) is unique.

The following proposition gives a necessary and sufficient condition for $A$ to be continuous.

Proposition 5.2. Let $Y = M + A$ be a submartingale of class (D). The compensator $A$ of $Y$ is continuous if and only if for every predictable stopping time $T$:
$$E[Y_T] = E[Y_{T-}].$$

Proof. Let $(T_n)$ be a sequence of stopping times that announces $T$. Since $Y$ is of class (D), we have:
$$\lim_{n \to \infty} E[Y_\infty - Y_{T_n}] = \lim_{n \to \infty} E[A_\infty - A_{T_n}],$$
which is:
$$E[Y_\infty - Y_{T-}] = E[A_\infty - A_{T-}].$$
Now, we also have
$$E[Y_\infty - Y_T] = E[A_\infty - A_T].$$
Consequently, we have:
$$E[Y_T - Y_{T-}] = E[A_T - A_{T-}],$$
and the result of the proposition follows easily.

Now we give the local form of the Doob-Meyer decomposition; for this we need the following lemma:

Lemma 5.3 ([70], p.374). Every submartingale $Y$ is locally of class (D): there exists a sequence of stopping times $T_n$ such that $T_n \to \infty$ as $n \to \infty$ and such that the stopped process $(Y_{t \wedge T_n})$ is a submartingale of class (D) for every $n$.

Theorem 5.4. Let $Y$ be a local submartingale. Then $Y$ may be written uniquely in the form:
$$Y = Y_0 + M + A,$$
where $M$ is a local martingale null at $0$ and $A$ is a predictable increasing process null at $0$.
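The discrete-time ancestor of Theorem 5.1 (the Doob decomposition) can be verified exhaustively on a finite space: for the submartingale $X_n = S_n^2$ along a simple random walk $S$, the predictable increasing part is $A_n = \sum_{k \le n} E[X_k - X_{k-1} \mid \mathcal{F}_{k-1}] = n$ and $M = X - A$ is a martingale. A sketch over all paths of length 4:

```python
from itertools import product

# Discrete Doob decomposition of the submartingale X_n = S_n^2, S a simple
# random walk: A_n = sum_{k <= n} E[X_k - X_{k-1} | F_{k-1}] is predictable
# increasing and M = X - A is a martingale.  Here E[X_k - X_{k-1} | F_{k-1}]
# = E[2 S_{k-1} eps_k + 1 | F_{k-1}] = 1, so A_n = n exactly.
n = 4
paths = list(product([-1, 1], repeat=n))    # all 16 paths, equally likely

def X(w, k):                                # S_k^2 along the path w
    return sum(w[:k]) ** 2

def cond_exp(f, k, w):                      # E[f | F_{k-1}] evaluated at w
    atom = [v for v in paths if v[:k - 1] == w[:k - 1]]
    return sum(f(v) for v in atom) / len(atom)

def A(w, k):                                # predictable increasing part
    return sum(cond_exp(lambda v: X(v, j) - X(v, j - 1), j, w)
               for j in range(1, k + 1))

def M(w, k):                                # candidate martingale part
    return X(w, k) - A(w, k)

print([A(paths[0], k) for k in range(n + 1)])        # A_k = k
err = max(abs(cond_exp(lambda v: M(v, k) - M(v, k - 1), k, w))
          for w in paths for k in range(1, n + 1))
print(err)                                           # martingale check: 0
```

Uniqueness holds here for the same reason as in continuous time: a predictable martingale of finite variation starting at 0 is identically 0.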


More information

The Lebesgue Integral

The Lebesgue Integral The Lebesgue Integral Brent Nelson In these notes we give an introduction to the Lebesgue integral, assuming only a knowledge of metric spaces and the iemann integral. For more details see [1, Chapters

More information

Topology, Math 581, Fall 2017 last updated: November 24, Topology 1, Math 581, Fall 2017: Notes and homework Krzysztof Chris Ciesielski

Topology, Math 581, Fall 2017 last updated: November 24, Topology 1, Math 581, Fall 2017: Notes and homework Krzysztof Chris Ciesielski Topology, Math 581, Fall 2017 last updated: November 24, 2017 1 Topology 1, Math 581, Fall 2017: Notes and homework Krzysztof Chris Ciesielski Class of August 17: Course and syllabus overview. Topology

More information

Estimates for probabilities of independent events and infinite series

Estimates for probabilities of independent events and infinite series Estimates for probabilities of independent events and infinite series Jürgen Grahl and Shahar evo September 9, 06 arxiv:609.0894v [math.pr] 8 Sep 06 Abstract This paper deals with finite or infinite sequences

More information

3 Integration and Expectation

3 Integration and Expectation 3 Integration and Expectation 3.1 Construction of the Lebesgue Integral Let (, F, µ) be a measure space (not necessarily a probability space). Our objective will be to define the Lebesgue integral R fdµ

More information

3 (Due ). Let A X consist of points (x, y) such that either x or y is a rational number. Is A measurable? What is its Lebesgue measure?

3 (Due ). Let A X consist of points (x, y) such that either x or y is a rational number. Is A measurable? What is its Lebesgue measure? MA 645-4A (Real Analysis), Dr. Chernov Homework assignment 1 (Due ). Show that the open disk x 2 + y 2 < 1 is a countable union of planar elementary sets. Show that the closed disk x 2 + y 2 1 is a countable

More information

Convergence of Local Supermartingales

Convergence of Local Supermartingales Convergence of Local Supermartingales Martin Larsson Johannes Ruf August 20, 2018 Abstract We characterize the event of convergence of a local supermartingale. Conditions are given in terms of its predictable

More information

GENERALIZED COVARIATION FOR BANACH SPACE VALUED PROCESSES, ITÔ FORMULA AND APPLICATIONS

GENERALIZED COVARIATION FOR BANACH SPACE VALUED PROCESSES, ITÔ FORMULA AND APPLICATIONS Di Girolami, C. and Russo, F. Osaka J. Math. 51 (214), 729 783 GENERALIZED COVARIATION FOR BANACH SPACE VALUED PROCESSES, ITÔ FORMULA AND APPLICATIONS CRISTINA DI GIROLAMI and FRANCESCO RUSSO (Received

More information

Stochastic Calculus. Alan Bain

Stochastic Calculus. Alan Bain Stochastic Calculus Alan Bain 1. Introduction The following notes aim to provide a very informal introduction to Stochastic Calculus, and especially to the Itô integral and some of its applications. They

More information

We are going to discuss what it means for a sequence to converge in three stages: First, we define what it means for a sequence to converge to zero

We are going to discuss what it means for a sequence to converge in three stages: First, we define what it means for a sequence to converge to zero Chapter Limits of Sequences Calculus Student: lim s n = 0 means the s n are getting closer and closer to zero but never gets there. Instructor: ARGHHHHH! Exercise. Think of a better response for the instructor.

More information

Brownian Motion and Stochastic Calculus

Brownian Motion and Stochastic Calculus ETHZ, Spring 17 D-MATH Prof Dr Martin Larsson Coordinator A Sepúlveda Brownian Motion and Stochastic Calculus Exercise sheet 6 Please hand in your solutions during exercise class or in your assistant s

More information

REAL ANALYSIS LECTURE NOTES: 1.4 OUTER MEASURE

REAL ANALYSIS LECTURE NOTES: 1.4 OUTER MEASURE REAL ANALYSIS LECTURE NOTES: 1.4 OUTER MEASURE CHRISTOPHER HEIL 1.4.1 Introduction We will expand on Section 1.4 of Folland s text, which covers abstract outer measures also called exterior measures).

More information

Filters in Analysis and Topology

Filters in Analysis and Topology Filters in Analysis and Topology David MacIver July 1, 2004 Abstract The study of filters is a very natural way to talk about convergence in an arbitrary topological space, and carries over nicely into

More information

JUSTIN HARTMANN. F n Σ.

JUSTIN HARTMANN. F n Σ. BROWNIAN MOTION JUSTIN HARTMANN Abstract. This paper begins to explore a rigorous introduction to probability theory using ideas from algebra, measure theory, and other areas. We start with a basic explanation

More information

On the martingales obtained by an extension due to Saisho, Tanemura and Yor of Pitman s theorem

On the martingales obtained by an extension due to Saisho, Tanemura and Yor of Pitman s theorem On the martingales obtained by an extension due to Saisho, Tanemura and Yor of Pitman s theorem Koichiro TAKAOKA Dept of Applied Physics, Tokyo Institute of Technology Abstract M Yor constructed a family

More information

Measure theory and probability

Measure theory and probability Chapter 1 Measure theory and probability Aim and contents This chapter contains a number of exercises, aimed at familiarizing the reader with some important measure theoretic concepts, such as: Monotone

More information

2. The Concept of Convergence: Ultrafilters and Nets

2. The Concept of Convergence: Ultrafilters and Nets 2. The Concept of Convergence: Ultrafilters and Nets NOTE: AS OF 2008, SOME OF THIS STUFF IS A BIT OUT- DATED AND HAS A FEW TYPOS. I WILL REVISE THIS MATE- RIAL SOMETIME. In this lecture we discuss two

More information

A REMARK ON SLUTSKY S THEOREM. Freddy Delbaen Departement für Mathematik, ETH Zürich

A REMARK ON SLUTSKY S THEOREM. Freddy Delbaen Departement für Mathematik, ETH Zürich A REMARK ON SLUTSKY S THEOREM Freddy Delbaen Departement für Mathematik, ETH Zürich. Introduction and Notation. In Theorem of the paper by [BEKSY] a generalisation of a theorem of Slutsky is used. In this

More information

LECTURE 2: LOCAL TIME FOR BROWNIAN MOTION

LECTURE 2: LOCAL TIME FOR BROWNIAN MOTION LECTURE 2: LOCAL TIME FOR BROWNIAN MOTION We will define local time for one-dimensional Brownian motion, and deduce some of its properties. We will then use the generalized Ray-Knight theorem proved in

More information

Applications of Ito s Formula

Applications of Ito s Formula CHAPTER 4 Applications of Ito s Formula In this chapter, we discuss several basic theorems in stochastic analysis. Their proofs are good examples of applications of Itô s formula. 1. Lévy s martingale

More information

CHAPTER 1. Martingales

CHAPTER 1. Martingales CHAPTER 1 Martingales The basic limit theorems of probability, such as the elementary laws of large numbers and central limit theorems, establish that certain averages of independent variables converge

More information

Universal examples. Chapter The Bernoulli process

Universal examples. Chapter The Bernoulli process Chapter 1 Universal examples 1.1 The Bernoulli process First description: Bernoulli random variables Y i for i = 1, 2, 3,... independent with P [Y i = 1] = p and P [Y i = ] = 1 p. Second description: Binomial

More information

STAT 331. Martingale Central Limit Theorem and Related Results

STAT 331. Martingale Central Limit Theorem and Related Results STAT 331 Martingale Central Limit Theorem and Related Results In this unit we discuss a version of the martingale central limit theorem, which states that under certain conditions, a sum of orthogonal

More information

Measure and integration

Measure and integration Chapter 5 Measure and integration In calculus you have learned how to calculate the size of different kinds of sets: the length of a curve, the area of a region or a surface, the volume or mass of a solid.

More information

Lecture 17 Brownian motion as a Markov process

Lecture 17 Brownian motion as a Markov process Lecture 17: Brownian motion as a Markov process 1 of 14 Course: Theory of Probability II Term: Spring 2015 Instructor: Gordan Zitkovic Lecture 17 Brownian motion as a Markov process Brownian motion is

More information

Math 6810 (Probability) Fall Lecture notes

Math 6810 (Probability) Fall Lecture notes Math 6810 (Probability) Fall 2012 Lecture notes Pieter Allaart University of North Texas September 23, 2012 2 Text: Introduction to Stochastic Calculus with Applications, by Fima C. Klebaner (3rd edition),

More information

Introduction to Real Analysis Alternative Chapter 1

Introduction to Real Analysis Alternative Chapter 1 Christopher Heil Introduction to Real Analysis Alternative Chapter 1 A Primer on Norms and Banach Spaces Last Updated: March 10, 2018 c 2018 by Christopher Heil Chapter 1 A Primer on Norms and Banach Spaces

More information

Functional Limit theorems for the quadratic variation of a continuous time random walk and for certain stochastic integrals

Functional Limit theorems for the quadratic variation of a continuous time random walk and for certain stochastic integrals Functional Limit theorems for the quadratic variation of a continuous time random walk and for certain stochastic integrals Noèlia Viles Cuadros BCAM- Basque Center of Applied Mathematics with Prof. Enrico

More information

Sample Spaces, Random Variables

Sample Spaces, Random Variables Sample Spaces, Random Variables Moulinath Banerjee University of Michigan August 3, 22 Probabilities In talking about probabilities, the fundamental object is Ω, the sample space. (elements) in Ω are denoted

More information

Analysis Finite and Infinite Sets The Real Numbers The Cantor Set

Analysis Finite and Infinite Sets The Real Numbers The Cantor Set Analysis Finite and Infinite Sets Definition. An initial segment is {n N n n 0 }. Definition. A finite set can be put into one-to-one correspondence with an initial segment. The empty set is also considered

More information

SUPERMARTINGALES AS RADON-NIKODYM DENSITIES AND RELATED MEASURE EXTENSIONS

SUPERMARTINGALES AS RADON-NIKODYM DENSITIES AND RELATED MEASURE EXTENSIONS Submitted to the Annals of Probability SUPERMARTINGALES AS RADON-NIKODYM DENSITIES AND RELATED MEASURE EXTENSIONS By Nicolas Perkowski and Johannes Ruf Université Paris Dauphine and University College

More information

I forgot to mention last time: in the Ito formula for two standard processes, putting

I forgot to mention last time: in the Ito formula for two standard processes, putting I forgot to mention last time: in the Ito formula for two standard processes, putting dx t = a t dt + b t db t dy t = α t dt + β t db t, and taking f(x, y = xy, one has f x = y, f y = x, and f xx = f yy

More information

Supermartingales as Radon-Nikodym Densities and Related Measure Extensions

Supermartingales as Radon-Nikodym Densities and Related Measure Extensions Supermartingales as Radon-Nikodym Densities and Related Measure Extensions Nicolas Perkowski Institut für Mathematik Humboldt-Universität zu Berlin Johannes Ruf Oxford-Man Institute of Quantitative Finance

More information

STAT 7032 Probability Spring Wlodek Bryc

STAT 7032 Probability Spring Wlodek Bryc STAT 7032 Probability Spring 2018 Wlodek Bryc Created: Friday, Jan 2, 2014 Revised for Spring 2018 Printed: January 9, 2018 File: Grad-Prob-2018.TEX Department of Mathematical Sciences, University of Cincinnati,

More information

CHAPTER 7. Connectedness

CHAPTER 7. Connectedness CHAPTER 7 Connectedness 7.1. Connected topological spaces Definition 7.1. A topological space (X, T X ) is said to be connected if there is no continuous surjection f : X {0, 1} where the two point set

More information

On pathwise stochastic integration

On pathwise stochastic integration On pathwise stochastic integration Rafa l Marcin Lochowski Afican Institute for Mathematical Sciences, Warsaw School of Economics UWC seminar Rafa l Marcin Lochowski (AIMS, WSE) On pathwise stochastic

More information

1.4 Outer measures 10 CHAPTER 1. MEASURE

1.4 Outer measures 10 CHAPTER 1. MEASURE 10 CHAPTER 1. MEASURE 1.3.6. ( Almost everywhere and null sets If (X, A, µ is a measure space, then a set in A is called a null set (or µ-null if its measure is 0. Clearly a countable union of null sets

More information

A BOREL SOLUTION TO THE HORN-TARSKI PROBLEM. MSC 2000: 03E05, 03E20, 06A10 Keywords: Chain Conditions, Boolean Algebras.

A BOREL SOLUTION TO THE HORN-TARSKI PROBLEM. MSC 2000: 03E05, 03E20, 06A10 Keywords: Chain Conditions, Boolean Algebras. A BOREL SOLUTION TO THE HORN-TARSKI PROBLEM STEVO TODORCEVIC Abstract. We describe a Borel poset satisfying the σ-finite chain condition but failing to satisfy the σ-bounded chain condition. MSC 2000:

More information

Local times for functions with finite variation: two versions of Stieltjes change of variables formula

Local times for functions with finite variation: two versions of Stieltjes change of variables formula Local times for functions with finite variation: two versions of Stieltjes change of variables formula Jean Bertoin and Marc Yor hal-835685, version 2-4 Jul 213 Abstract We introduce two natural notions

More information

Quasi-sure Stochastic Analysis through Aggregation

Quasi-sure Stochastic Analysis through Aggregation Quasi-sure Stochastic Analysis through Aggregation H. Mete Soner Nizar Touzi Jianfeng Zhang March 7, 21 Abstract This paper is on developing stochastic analysis simultaneously under a general family of

More information

Metric Spaces and Topology

Metric Spaces and Topology Chapter 2 Metric Spaces and Topology From an engineering perspective, the most important way to construct a topology on a set is to define the topology in terms of a metric on the set. This approach underlies

More information

Problem List MATH 5143 Fall, 2013

Problem List MATH 5143 Fall, 2013 Problem List MATH 5143 Fall, 2013 On any problem you may use the result of any previous problem (even if you were not able to do it) and any information given in class up to the moment the problem was

More information

arxiv: v1 [math.pr] 19 Oct 2018

arxiv: v1 [math.pr] 19 Oct 2018 On a tochastic Representation Theorem for Meyer-measurable Processes and its Applications in tochastic Optimal Control and Optimal topping Peter Bank 1 David Besslich 2 October 22, 2018 arxiv:1810.08491v1

More information

Part III. 10 Topological Space Basics. Topological Spaces

Part III. 10 Topological Space Basics. Topological Spaces Part III 10 Topological Space Basics Topological Spaces Using the metric space results above as motivation we will axiomatize the notion of being an open set to more general settings. Definition 10.1.

More information

INDISTINGUISHABILITY OF ABSOLUTELY CONTINUOUS AND SINGULAR DISTRIBUTIONS

INDISTINGUISHABILITY OF ABSOLUTELY CONTINUOUS AND SINGULAR DISTRIBUTIONS INDISTINGUISHABILITY OF ABSOLUTELY CONTINUOUS AND SINGULAR DISTRIBUTIONS STEVEN P. LALLEY AND ANDREW NOBEL Abstract. It is shown that there are no consistent decision rules for the hypothesis testing problem

More information

Chapter 6 Expectation and Conditional Expectation. Lectures Definition 6.1. Two random variables defined on a probability space are said to be

Chapter 6 Expectation and Conditional Expectation. Lectures Definition 6.1. Two random variables defined on a probability space are said to be Chapter 6 Expectation and Conditional Expectation Lectures 24-30 In this chapter, we introduce expected value or the mean of a random variable. First we define expectation for discrete random variables

More information

MATH5011 Real Analysis I. Exercise 1 Suggested Solution

MATH5011 Real Analysis I. Exercise 1 Suggested Solution MATH5011 Real Analysis I Exercise 1 Suggested Solution Notations in the notes are used. (1) Show that every open set in R can be written as a countable union of mutually disjoint open intervals. Hint:

More information