Jump Processes. Richard F. Bass


© Copyright 2014 Richard F. Bass

Contents

1 Poisson processes
  1.1 Definitions
  1.2 Stopping times
  1.3 Markov properties
  1.4 A characterization
  1.5 Martingales

2 Lévy processes
  2.1 Examples
  2.2 Construction of Lévy processes
  2.3 Representation of Lévy processes
  2.4 Symmetric stable processes

3 Stochastic calculus
  3.1 Decomposition of martingales
  3.2 Stochastic integrals
  3.3 Itô's formula
  3.4 The reduction theorem
  3.5 Semimartingales
  3.6 The Girsanov theorem

4 Stochastic differential equations
  4.1 Poisson point processes
  4.2 The Lipschitz case
  4.3 Analogue of Yamada-Watanabe theorem

5 The space D[0, 1]
  5.1 Convergence of probability measures
  5.2 The portmanteau theorem
  5.3 The Prohorov theorem
  5.4 Metrics for D[0, 1]
  5.5 Compactness and completeness
  5.6 The Aldous criterion

6 Markov processes
  6.1 Introduction
  6.2 Definition of a Markov process
  6.3 Transition probabilities
  6.4 The canonical process and shift operators
  6.5 Enlarging the filtration
  6.6 The Markov property
  6.7 Strong Markov property

7 Stable-like processes
  7.1 Martingale problems
  7.2 Stable-like processes
  7.3 Some properties
  7.4 Harnack inequality
  7.5 Regularity

8 Symmetric jump processes
  8.1 Dirichlet forms
  8.2 Construction of the semigroup
  8.3 Symmetric jump processes
  8.4 The Poincaré and Nash inequalities
  8.5 Upper bounds on the transition densities


Chapter 1

Poisson processes

1.1 Definitions

Let $(\Omega, \mathcal F, \mathbb P)$ be a probability space. A filtration is a collection of $\sigma$-fields $\mathcal F_t$ contained in $\mathcal F$ such that $\mathcal F_s \subset \mathcal F_t$ whenever $s < t$. A filtration satisfies the usual conditions if it is complete ($N \in \mathcal F_t$ for all $t$ whenever $\mathbb P(N) = 0$) and it is right continuous ($\mathcal F_{t+} = \mathcal F_t$ for all $t$, where $\mathcal F_{t+} = \cap_{\varepsilon > 0} \mathcal F_{t+\varepsilon}$).

Definition 1.1 Let $\{\mathcal F_t\}$ be a filtration, not necessarily satisfying the usual conditions. A Poisson process with parameter $\lambda > 0$ is a stochastic process $X$ satisfying the following properties:
(1) $X_0 = 0$, a.s.
(2) The paths of $X_t$ are right continuous with left limits.
(3) If $s < t$, then $X_t - X_s$ is a Poisson random variable with parameter $\lambda(t-s)$.
(4) If $s < t$, then $X_t - X_s$ is independent of $\mathcal F_s$.

Define $X_{t-} = \lim_{s \to t, s < t} X_s$, the left hand limit at time $t$, and $\Delta X_t = X_t - X_{t-}$, the size of the jump at time $t$. We say a function $f$ is increasing if $s < t$ implies $f(s) \le f(t)$. We use strictly increasing when $s < t$ implies $f(s) < f(t)$. We have the following proposition.

Proposition 1.2 Let $X$ be a Poisson process. With probability one, the paths of $X_t$ are increasing and are constant except for jumps of size 1. There are only finitely many jumps in each finite time interval.

Proof. For any fixed $s < t$, we have that $X_t - X_s$ has the distribution of a Poisson random variable with parameter $\lambda(t-s)$, hence is non-negative, a.s.; let $N_{s,t}$ be the null set of $\omega$'s where $X_t(\omega) < X_s(\omega)$. The set of pairs $(s, t)$ with $s$ and $t$ rational is countable, and so $N = \cup_{s,t \in \mathbb Q^+} N_{s,t}$ is also a null set, where we write $\mathbb Q^+$ for the non-negative rationals. For $\omega \notin N$, $X_t \ge X_s$ whenever $s < t$ are rational. In view of the right continuity of the paths of $X$, this shows the paths of $X$ are increasing with probability one.

Similarly, since Poisson random variables only take values in the non-negative integers, $X_t$ is a non-negative integer, a.s. Using this fact for every $t$ rational shows that with probability one, $X_t$ takes values only in the non-negative integers when $t$ is rational, and the right continuity of the paths implies this is also the case for all $t$. Since the paths have left limits, there can only be finitely many jumps in finite time.

It remains to prove that $\Delta X_t$ is either 0 or 1 for all $t$. Let $t_0 > 0$. If there were a jump of size 2 or larger at some time $t$ strictly less than $t_0$, then for each $n$ sufficiently large there exists $k_n \le 2^n$ such that $X_{(k_n+1)t_0/2^n} - X_{k_n t_0/2^n} \ge 2$. Therefore
$$P(\exists s < t_0 : \Delta X_s \ge 2) \le P(\exists k \le 2^n : X_{(k+1)t_0/2^n} - X_{kt_0/2^n} \ge 2) \tag{1.1}$$
$$\le 2^n \sup_{k \le 2^n} P(X_{(k+1)t_0/2^n} - X_{kt_0/2^n} \ge 2) = 2^n P(X_{t_0/2^n} \ge 2)$$
$$= 2^n \big(1 - P(X_{t_0/2^n} = 0) - P(X_{t_0/2^n} = 1)\big) = 2^n \big(1 - e^{-\lambda t_0/2^n} - (\lambda t_0/2^n) e^{-\lambda t_0/2^n}\big).$$
We used Definition 1.1(3) for the two equalities. By l'Hôpital's rule, $(1 - e^{-x} - x e^{-x})/x \to 0$ as $x \to 0$. We apply this with $x = \lambda t_0/2^n$, and see that the last line of (1.1) tends to 0 as $n \to \infty$. Since the left hand side of (1.1) does not depend on $n$, it must be 0. This holds for each $t_0$.
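Proposition 1.2 can also be checked numerically. The following minimal sketch (added here, not from the text; it assumes numpy, and the parameter values are hypothetical) builds a Poisson path on a fine dyadic grid directly from Definition 1.1(3)-(4) and inspects the path properties:

```python
import numpy as np

rng = np.random.default_rng(0)
lam, t0, n = 2.0, 1.0, 20            # rate, horizon, dyadic level (hypothetical)
dt = t0 / 2**n

# Definition 1.1(3)-(4): increments over disjoint dyadic intervals are
# independent Poisson(lam * dt) random variables.
increments = rng.poisson(lam * dt, size=2**n)
X = np.concatenate(([0], np.cumsum(increments)))

print("paths increasing:", np.all(np.diff(X) >= 0))   # always True
print("largest grid jump:", increments.max())         # 1 with high probability
print("X_{t0} =", X[-1], "   E X_{t0} = lam*t0 =", lam * t0)
```

As in the proof, the probability of seeing a grid increment of size 2 or more is of order $2^n(\lambda t_0/2^n)^2$, which tends to 0 as $n \to \infty$.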

1.2 Stopping times

Throughout this section we suppose we have a filtration $\{\mathcal F_t\}$ satisfying the usual conditions.

Definition 1.3 A random variable $T : \Omega \to [0, \infty]$ is a stopping time if for all $t$, $(T < t) \in \mathcal F_t$. We say $T$ is a finite stopping time if $T < \infty$, a.s. We say $T$ is a bounded stopping time if there exists $K \in [0, \infty)$ such that $T \le K$, a.s.

Note that $T$ can take the value infinity. Stopping times are also known as optional times. Given a stochastic process $X$, we define $X_T(\omega)$ to be equal to $X(T(\omega), \omega)$; that is, for each $\omega$ we evaluate $t = T(\omega)$ and then look at $X(\cdot, \omega)$ at this time.

Proposition 1.4 Suppose $\{\mathcal F_t\}$ satisfies the usual conditions. Then
(1) $T$ is a stopping time if and only if $(T \le t) \in \mathcal F_t$ for all $t$.
(2) If $T = t$, a.s., then $T$ is a stopping time.
(3) If $S$ and $T$ are stopping times, then so are $S \vee T$ and $S \wedge T$.
(4) If $T_n$, $n = 1, 2, \ldots$, are stopping times with $T_1 \le T_2 \le \cdots$, then so is $\sup_n T_n$.
(5) If $T_n$, $n = 1, 2, \ldots$, are stopping times with $T_1 \ge T_2 \ge \cdots$, then so is $\inf_n T_n$.
(6) If $s \ge 0$ and $S$ is a stopping time, then so is $S + s$.

Proof. We will just prove part of (1), leaving the rest as an exercise. Note $(T \le t) = \cap_{n \ge N} (T < t + 1/n) \in \mathcal F_{t+1/N}$ for each $N$. Thus $(T \le t) \in \cap_N \mathcal F_{t+1/N} \subset \mathcal F_{t+} = \mathcal F_t$.

It is often useful to be able to approximate stopping times from the right. If $T$ is a finite stopping time, that is, $T < \infty$, a.s., define
$$T_n(\omega) = (k+1)/2^n \quad \text{if } k/2^n \le T(\omega) < (k+1)/2^n. \tag{1.2}$$

Define
$$\mathcal F_T = \{A \in \mathcal F : \text{for each } t > 0,\ A \cap (T \le t) \in \mathcal F_t\}. \tag{1.3}$$
This definition of $\mathcal F_T$, which is supposed to be the collection of events that are known by time $T$, is not very intuitive. But it turns out that this definition works well in applications.

Proposition 1.5 Suppose $\{\mathcal F_t\}$ is a filtration satisfying the usual conditions.
(1) $\mathcal F_T$ is a $\sigma$-field.
(2) If $S \le T$, then $\mathcal F_S \subset \mathcal F_T$.
(3) If $\mathcal F_{T+} = \cap_{\varepsilon > 0} \mathcal F_{T+\varepsilon}$, then $\mathcal F_{T+} = \mathcal F_T$.
(4) If $X_t$ has right continuous paths, then $X_T$ is $\mathcal F_T$-measurable.

Proof. If $A \in \mathcal F_T$, then $A^c \cap (T \le t) = (T \le t) \setminus [A \cap (T \le t)] \in \mathcal F_t$, so $A^c \in \mathcal F_T$. The rest of the proof of (1) is easy.

Suppose $A \in \mathcal F_S$ and $S \le T$. Then $A \cap (T \le t) = [A \cap (S \le t)] \cap (T \le t)$. We have $A \cap (S \le t) \in \mathcal F_t$ because $A \in \mathcal F_S$, while $(T \le t) \in \mathcal F_t$ because $T$ is a stopping time. Therefore $A \cap (T \le t) \in \mathcal F_t$, which proves (2).

For (3), if $A \in \mathcal F_{T+}$, then $A \in \mathcal F_{T+\varepsilon}$ for every $\varepsilon$, and so $A \cap (T + \varepsilon \le t) \in \mathcal F_t$ for all $t$. Hence $A \cap (T \le t - \varepsilon) \in \mathcal F_t$ for all $t$, or equivalently $A \cap (T \le t) \in \mathcal F_{t+\varepsilon}$ for all $t$. This is true for all $\varepsilon$, so $A \cap (T \le t) \in \mathcal F_{t+} = \mathcal F_t$. This says $A \in \mathcal F_T$.

(4) Define $T_n$ by (1.2). Note $(X_{T_n} \in B) \cap (T_n = k/2^n) = (X_{k/2^n} \in B) \cap (T_n = k/2^n) \in \mathcal F_{k/2^n}$. Since $T_n$ only takes values in $\{k/2^n : k \ge 0\}$, we conclude $(X_{T_n} \in B) \cap (T_n \le t) \in \mathcal F_t$, and so $(X_{T_n} \in B) \in \mathcal F_{T_n} \subset \mathcal F_{T+1/2^n}$. Hence $X_{T_n}$ is $\mathcal F_{T+1/2^n}$ measurable. If $n \ge m$, then $X_{T_n}$ is measurable with respect to $\mathcal F_{T+1/2^n} \subset \mathcal F_{T+1/2^m}$. Since $X_{T_n} \to X_T$, then $X_T$ is $\mathcal F_{T+1/2^m}$ measurable for each $m$. Therefore $X_T$ is measurable with respect to $\mathcal F_{T+} = \mathcal F_T$.

1.3 Markov properties

Let us begin with the Markov property.

Theorem 1.6 Let $\{\mathcal F_t\}$ be a filtration, not necessarily satisfying the usual conditions, and let $P$ be a Poisson process with respect to $\{\mathcal F_t\}$. If $u$ is a fixed time, then $Y_t = P_{t+u} - P_u$ is a Poisson process independent of $\mathcal F_u$.

Proof. Let $\mathcal G_t = \mathcal F_{t+u}$. It is clear that $Y$ has right continuous paths, is zero at time 0, has jumps of size one, and is adapted to $\{\mathcal G_t\}$. Since $Y_t - Y_s = P_{t+u} - P_{s+u}$, then $Y_t - Y_s$ is a Poisson random variable with mean $\lambda(t-s)$ that is independent of $\mathcal F_{s+u} = \mathcal G_s$.

The strong Markov property is the Markov property extended by replacing fixed times $u$ by finite stopping times.

Theorem 1.7 Let $\{\mathcal F_t\}$ be a filtration, not necessarily satisfying the usual conditions, and let $P$ be a Poisson process adapted to $\{\mathcal F_t\}$. If $T$ is a finite stopping time, then $Y_t = P_{T+t} - P_T$ is a Poisson process independent of $\mathcal F_T$.

Proof. We will first show that whenever $m \ge 1$, $t_1 < \cdots < t_m$, $f$ is a bounded continuous function on $\mathbb R^m$, and $A \in \mathcal F_T$, then
$$E[f(Y_{t_1}, \ldots, Y_{t_m}); A] = E[f(P_{t_1}, \ldots, P_{t_m})]\, P(A). \tag{1.4}$$
Once we have done this, we will then show how (1.4) implies our theorem.

To prove (1.4), define $T_n$ by (1.2). We have
$$E[f(P_{T_n+t_1} - P_{T_n}, \ldots, P_{T_n+t_m} - P_{T_n}); A] \tag{1.5}$$
$$= \sum_{k=1}^\infty E[f(P_{T_n+t_1} - P_{T_n}, \ldots, P_{T_n+t_m} - P_{T_n}); A, T_n = k/2^n]$$
$$= \sum_{k=1}^\infty E[f(P_{t_1+k/2^n} - P_{k/2^n}, \ldots, P_{t_m+k/2^n} - P_{k/2^n}); A, T_n = k/2^n].$$
Following the usual practice in probability that "," means "and," we use $E[\,\cdot\,; A, T_n = k/2^n]$ as an abbreviation for $E[\,\cdot\,; A \cap (T_n = k/2^n)]$.

Since $A \in \mathcal F_T$, then $A \cap (T_n = k/2^n) = A \cap ((T < k/2^n) \setminus (T < (k-1)/2^n)) \in \mathcal F_{k/2^n}$.

We use the independent increments property of the Poisson process and the fact that $P_t - P_s$ has the same law as $P_{t-s}$ to see that the sum in the last line of (1.5) is equal to
$$\sum_{k=1}^\infty E[f(P_{t_1+k/2^n} - P_{k/2^n}, \ldots, P_{t_m+k/2^n} - P_{k/2^n})]\, P(A, T_n = k/2^n)$$
$$= \sum_{k=1}^\infty E[f(P_{t_1}, \ldots, P_{t_m})]\, P(A, T_n = k/2^n) = E[f(P_{t_1}, \ldots, P_{t_m})]\, P(A),$$
which is the right hand side of (1.4). Thus
$$E[f(P_{T_n+t_1} - P_{T_n}, \ldots, P_{T_n+t_m} - P_{T_n}); A] = E[f(P_{t_1}, \ldots, P_{t_m})]\, P(A). \tag{1.6}$$
Now let $n \to \infty$. By the right continuity of the paths of $P$, the boundedness and continuity of $f$, and the dominated convergence theorem, the left hand side of (1.6) converges to the left hand side of (1.4).

If we take $A = \Omega$ in (1.4), we obtain $E[f(Y_{t_1}, \ldots, Y_{t_m})] = E[f(P_{t_1}, \ldots, P_{t_m})]$ whenever $m \ge 1$, $t_1, \ldots, t_m \in [0, \infty)$, and $f$ is a bounded continuous function on $\mathbb R^m$. This implies that the finite dimensional distributions of $Y$ and $P$ are the same. Since $Y$ has right continuous paths, $Y$ is a Poisson process.

Next take $A \in \mathcal F_T$. By using a limit argument, (1.4) holds whenever $f$ is the indicator of a Borel subset $B$ of $\mathbb R^m$; in other words,
$$P(Y \in B, A) = P(Y \in B)\, P(A) \tag{1.7}$$
whenever $B$ is a cylindrical set. When we discuss the Skorokhod topology, we will be able to be more precise about the independence argument.

Observe that what was needed for the above proof to work is not that $P$ be a Poisson process, but that the process $P$ have right continuous paths and that $P_t - P_s$ be independent of $\mathcal F_s$ and have the same distribution as $P_{t-s}$. We therefore have the following corollary.

Corollary 1.8 Let $\{\mathcal F_t\}$ be a filtration, not necessarily satisfying the usual conditions, and let $X$ be a process adapted to $\{\mathcal F_t\}$. Suppose $X$ has paths that are right continuous with left limits and suppose $X_t - X_s$ is independent of $\mathcal F_s$ and has the same law as $X_{t-s}$ whenever $s < t$. If $T$ is a finite stopping time, then $Y_t = X_{T+t} - X_T$ is a process that is independent of $\mathcal F_T$, and $X$ and $Y$ have the same law.

1.4 A characterization

Another characterization of the Poisson process is as follows. Let $T_1 = \inf\{t : \Delta X_t = 1\}$, the time of the first jump. Define $T_{i+1} = \inf\{t > T_i : \Delta X_t = 1\}$, so that $T_i$ is the time of the $i$th jump.

Proposition 1.9 The random variables $T_1, T_2 - T_1, \ldots, T_{i+1} - T_i, \ldots$ are independent exponential random variables with parameter $\lambda$.

Proof. In view of Corollary 1.8 it suffices to show that $T_1$ is an exponential random variable with parameter $\lambda$. If $T_1 > t$, then the first jump has not occurred by time $t$, so $X_t$ is still zero. Hence
$$P(T_1 > t) = P(X_t = 0) = e^{-\lambda t},$$
using the fact that $X_t$ is a Poisson random variable with parameter $\lambda t$.

We can reverse the characterization in Proposition 1.9 to construct a Poisson process; see also the simulation sketch following Proposition 1.10 below. We do one step of the construction, leaving the rest as an exercise. Let $U_1, U_2, \ldots$ be independent exponential random variables with parameter $\lambda$ and let $T_j = \sum_{i=1}^j U_i$. Define
$$X_t(\omega) = k \quad \text{if } T_k(\omega) \le t < T_{k+1}(\omega). \tag{1.8}$$
An examination of the densities shows that an exponential random variable has a gamma distribution with parameters $\lambda$ and $r = 1$, so $T_j$ is a gamma random variable with parameters $\lambda$ and $j$. Thus
$$P(X_t < k) = P(T_k > t) = \int_t^\infty \frac{\lambda e^{-\lambda x}(\lambda x)^{k-1}}{\Gamma(k)}\, dx.$$

Performing the integration by parts repeatedly shows that
$$P(X_t < k) = \sum_{i=0}^{k-1} e^{-\lambda t} \frac{(\lambda t)^i}{i!},$$
and so $X_t$ is a Poisson random variable with parameter $\lambda t$.

We will use the following proposition later.

Proposition 1.10 Let $\{\mathcal F_t\}$ be a filtration satisfying the usual conditions. Suppose $X_0 = 0$, a.s., $X$ has paths that are right continuous with left limits, $X_t - X_s$ is independent of $\mathcal F_s$ if $s < t$, and $X_t - X_s$ has the same law as $X_{t-s}$ whenever $s < t$. If the paths of $X$ are piecewise constant, increasing, all the jumps of $X$ are of size 1, and $X$ is not identically 0, then $X$ is a Poisson process.

Proof. Let $T_0 = 0$ and $T_{i+1} = \inf\{t > T_i : \Delta X_t = 1\}$, $i = 0, 1, 2, \ldots$. We will show that if we set $U_i = T_i - T_{i-1}$, then the $U_i$'s are i.i.d. exponential random variables. By Corollary 1.8, the $U_i$'s are independent and have the same law. Hence it suffices to show $U_1$ is an exponential random variable. We observe
$$P(U_1 > s + t) = P(X_{s+t} = 0) = P(X_{s+t} - X_s = 0,\, X_s = 0) = P(X_{s+t} - X_s = 0)\, P(X_s = 0) = P(X_t = 0)\, P(X_s = 0) = P(U_1 > t)\, P(U_1 > s).$$
Setting $f(t) = P(U_1 > t)$, we thus have $f(t+s) = f(t)f(s)$. Since $f(t)$ is decreasing and $0 < f(t) < 1$, we conclude $P(U_1 > t) = f(t) = e^{-\lambda t}$ for some $\lambda > 0$; that is, $U_1$ is an exponential random variable.
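Here is a minimal simulation of the construction (1.8) (an added sketch, not from the text; numpy assumed, parameter values hypothetical), checking the Poisson($\lambda t$) marginal through its mean and variance:

```python
import numpy as np

rng = np.random.default_rng(1)
lam, t, n_paths = 3.0, 2.0, 100_000
# (1.8): T_j = U_1 + ... + U_j with U_i i.i.d. exponential(lam), and
# X_t = k on {T_k <= t < T_{k+1}}, i.e. X_t counts the T_j in [0, t].
U = rng.exponential(1.0 / lam, size=(n_paths, 64))   # 64 jumps is ample for lam*t = 6
T = np.cumsum(U, axis=1)
X_t = (T <= t).sum(axis=1)
print("sample mean:", X_t.mean(), "  lam*t =", lam * t)
print("sample var: ", X_t.var(),  "  lam*t =", lam * t)
```

For a Poisson random variable the mean and the variance both equal $\lambda t$, which is what the simulation should reproduce up to Monte Carlo error.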

1.5 Martingales

We define continuous time martingales. Let $\{\mathcal F_t\}$ be a filtration, not necessarily satisfying the usual conditions.

Definition 1.11 $M_t$ is a continuous time martingale with respect to the filtration $\{\mathcal F_t\}$ and the probability measure $P$ if
(1) $E|M_t| < \infty$ for each $t$;
(2) $M_t$ is $\mathcal F_t$ measurable for each $t$;
(3) $E[M_t \mid \mathcal F_s] = M_s$, a.s., if $s < t$.

Part (2) of the definition can be rephrased as saying $M_t$ is adapted to $\mathcal F_t$. If in part (3) "$=$" is replaced by "$\ge$", then $M_t$ is a submartingale, and if it is replaced by "$\le$", then we have a supermartingale.

Taking expectations in Definition 1.11(3), we see that if $s < t$, then $E M_s \le E M_t$ if $M$ is a submartingale and $E M_s \ge E M_t$ if $M$ is a supermartingale. Thus submartingales tend to increase, on average, and supermartingales tend to decrease, on average.

If $P_t$ is a Poisson process with parameter $\lambda$, then $P_t - \lambda t$ is a continuous time martingale. To see this,
$$E[P_t - \lambda t \mid \mathcal F_s] = E[P_t - P_s \mid \mathcal F_s] - \lambda t + P_s = E[P_t - P_s] - \lambda t + P_s = \lambda(t-s) - \lambda t + P_s = P_s - \lambda s.$$

We give another example of a martingale.

Example 1.12 Recall that given a filtration $\{\mathcal F_t\}$, each $\mathcal F_t$ is contained in $\mathcal F$, where $(\Omega, \mathcal F, P)$ is our probability space. Let $X$ be an integrable $\mathcal F$ measurable random variable, and let $M_t = E[X \mid \mathcal F_t]$. Then
$$E[M_t \mid \mathcal F_s] = E[E[X \mid \mathcal F_t] \mid \mathcal F_s] = E[X \mid \mathcal F_s] = M_s,$$
and $M$ is a martingale.

We derive the analogs of Doob's inequalities in the stochastic process context.

Theorem 1.13 Suppose $M_t$ is a martingale or non-negative submartingale with paths that are right continuous with left limits. Then

(1) $P(\sup_{s \le t} M_s \ge \lambda) \le E|M_t|/\lambda$.
(2) If $1 < p < \infty$, then
$$E\big[\sup_{s \le t} |M_s|^p\big] \le \Big(\frac{p}{p-1}\Big)^p E|M_t|^p.$$

Proof. We will do the case where $M_t$ is a martingale, the submartingale case being nearly identical. Let $D_n = \{kt/2^n : k \le 2^n\}$. If we set $N^{(n)}_k = M_{kt/2^n}$ and $\mathcal G^{(n)}_k = \mathcal F_{kt/2^n}$, it is clear that $\{N^{(n)}_k\}$ is a discrete time martingale with respect to $\{\mathcal G^{(n)}_k\}$. Let $A_n = \{\sup_{s \le t, s \in D_n} M_s > \lambda\}$. By Doob's inequality for discrete time martingales,
$$P(A_n) = P\big(\max_{k \le 2^n} N^{(n)}_k > \lambda\big) \le \frac{E|N^{(n)}_{2^n}|}{\lambda} = \frac{E|M_t|}{\lambda}.$$
Note that the $A_n$ are increasing, and since $M_t$ is right continuous, $\cup_n A_n = \{\sup_{s \le t} M_s > \lambda\}$. Then
$$P(\sup_{s \le t} M_s > \lambda) = P(\cup_n A_n) = \lim_n P(A_n) \le E|M_t|/\lambda.$$
If we apply this with $\lambda$ replaced by $\lambda - \varepsilon$ and let $\varepsilon \to 0$, we obtain (1).

The proof of (2) is similar. By Doob's inequality for discrete time martingales,
$$E\big[\sup_{k \le 2^n} |N^{(n)}_k|^p\big] \le \Big(\frac{p}{p-1}\Big)^p E|N^{(n)}_{2^n}|^p = \Big(\frac{p}{p-1}\Big)^p E|M_t|^p.$$
Since $\sup_{k \le 2^n} |N^{(n)}_k|^p$ increases to $\sup_{s \le t} |M_s|^p$ by the right continuity of $M$, (2) follows by Fatou's lemma.

Here is an example. If $P_t$ is a Poisson process with parameter $\lambda$, then $P_t - \lambda t$ is a martingale, so $e^{a(P_t - \lambda t)}$ is a submartingale for any real number $a$. Then
$$P\big(\sup_{s \le t}(P_s - \lambda s) \ge A\big) = P\big(\sup_{s \le t} e^{a(P_s - \lambda s)} \ge e^{aA}\big) \le e^{-aA}\, E e^{aP_t}\, e^{-a\lambda t}.$$
We know $E e^{aP_t} = \exp\big((e^a - 1)\lambda t\big)$. We substitute this in the above and then optimize over $a$; the computation is sketched below.
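One way to carry out the optimization (an added sketch, for $A > 0$):

$$P\Big(\sup_{s \le t}(P_s - \lambda s) \ge A\Big) \le \inf_{a > 0} \exp\big(-aA + \lambda t(e^a - 1 - a)\big).$$

Setting the derivative of the exponent to zero gives $e^a = 1 + A/(\lambda t)$, i.e. $a = \log(1 + A/(\lambda t)) > 0$, and substituting this value yields

$$P\Big(\sup_{s \le t}(P_s - \lambda s) \ge A\Big) \le \exp\Big(A - (A + \lambda t)\log\Big(1 + \frac{A}{\lambda t}\Big)\Big),$$

a Poisson analogue of the Gaussian tail bound.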

We will need Doob's optional stopping theorem for continuous time martingales.

Theorem 1.14 Let $\{\mathcal F_t\}$ be a filtration satisfying the usual conditions. If $M_t$ is a martingale or non-negative submartingale whose paths are right continuous, $\sup_t E M_t^2 < \infty$, and $T$ is a finite stopping time, then $E M_T \ge E M_0$.

Proof. We do the submartingale case, the martingale case being very similar. By Doob's inequality (Theorem 1.13(2) with $p = 2$),
$$E\big[\sup_{s \le t} M_s^2\big] \le 4 E M_t^2.$$
Letting $t \to \infty$, we have $E[\sup_t M_t^2] < \infty$ by Fatou's lemma.

Let us first suppose that $T < K$, a.s., for some real number $K$. Define $T_n$ by (1.2). Let $N^{(n)}_k = M_{k/2^n}$, $\mathcal G^{(n)}_k = \mathcal F_{k/2^n}$, and $S_n = 2^n T_n$. By Doob's optional stopping theorem applied to the submartingale $N^{(n)}_k$, we have
$$E M_0 = E N^{(n)}_0 \le E N^{(n)}_{S_n} = E M_{T_n}.$$
Since $M$ is right continuous, $M_{T_n} \to M_T$, a.s. The random variables $M_{T_n}$ are bounded by $1 + \sup_t M_t^2$, so by dominated convergence, $E M_{T_n} \to E M_T$.

For general $T$, we apply the above to the stopping time $T \wedge K$ to get $E M_{T \wedge K} \ge E M_0$. The random variables $M_{T \wedge K}$ are bounded by $1 + \sup_t M_t^2$, so by dominated convergence, we get $E M_T \ge E M_0$ when we let $K \to \infty$.

We present the continuous time version of Doob's martingale convergence theorem. We will see that not only do we get limits as $t \to \infty$, but also a regularity result. Let $D_n = \{k/2^n : k \ge 0\}$ and $D = \cup_n D_n$.

Theorem 1.15 Let $\{M_t : t \in D\}$ be either a martingale, a submartingale, or a supermartingale with respect to $\{\mathcal F_t : t \in D\}$ and suppose $\sup_{t \in D} E|M_t| < \infty$. Then

(1) $\lim_{t \to \infty} M_t$ exists, a.s.
(2) With probability one $M_t$ has left and right limits along $D$.

The second conclusion says that except for a null set, if $t_0 \in [0, \infty)$, then both $\lim_{t \in D, t \uparrow t_0} M_t$ and $\lim_{t \in D, t \downarrow t_0} M_t$ exist and are finite. The null set does not depend on $t_0$.

Proof. Martingales are also submartingales, and if $M_t$ is a supermartingale, then $-M_t$ is a submartingale, so we may without loss of generality restrict our attention to submartingales. By Doob's inequality,
$$P\big(\sup_{t \in D_n, t \le n} |M_t| > \lambda\big) \le \frac{1}{\lambda} E|M_n|.$$
Letting $n \to \infty$ and using Fatou's lemma,
$$P\big(\sup_{t \in D} |M_t| > \lambda\big) \le \frac{1}{\lambda} \sup_t E|M_t|.$$
This is true for all $\lambda$, so with probability one, $\{|M_t| : t \in D\}$ is a bounded set. Therefore the only way either (1) or (2) can fail is if for some pair of rationals $a < b$ the number of upcrossings of $[a, b]$ by $\{M_t : t \in D\}$ is infinite.

Recall that we define upcrossings as follows. Given an interval $[a, b]$ and a submartingale $M$, if $S_1 = \inf\{t : M_t \le a\}$, $T_i = \inf\{t > S_i : M_t \ge b\}$, and $S_{i+1} = \inf\{t > T_i : M_t \le a\}$, then the number of upcrossings up to time $u$ is $\sup\{k : T_k \le u\}$. Doob's upcrossing lemma tells us that if $V_n$ is the number of upcrossings by $\{M_t : t \in D_n \cap [0, n]\}$, then
$$E V_n \le \frac{E[(M_n - a)^+]}{b - a}.$$
Letting $n \to \infty$ and using Fatou's lemma, the number of upcrossings of $[a, b]$ by $\{M_t : t \in D\}$ has finite expectation, hence is finite, a.s. If $N_{a,b}$ is the null set where the number of upcrossings of $[a, b]$ by $\{M_t : t \in D\}$ is infinite and $N = \cup_{a < b,\, a, b \in \mathbb Q^+} N_{a,b}$, where $\mathbb Q^+$ is the collection of non-negative rationals, then $P(N) = 0$. If $\omega \notin N$, then (1) and (2) hold.

As a corollary we have

Corollary 1.16 Let $\{\mathcal F_t\}$ be a filtration satisfying the usual conditions, and let $M_t$ be a martingale with respect to $\{\mathcal F_t\}$. Then $M$ has a version that is also a martingale and that in addition has paths that are right continuous with left limits.

Proof. Let $D$ be as in the above proof. For each integer $N \ge 1$, $E|M_t| \le E|M_N| < \infty$ for $t \le N$, since $|M_t|$ is a submartingale by the conditional expectation form of Jensen's inequality. Therefore $\{M_t : t \in D, t \le N\}$ has left and right limits when taking limits along $t \in D$. Since $N$ is arbitrary, $M_t$ has left and right limits when taking limits along $t \in D$, except for a set of $\omega$'s that form a null set. Let
$$\widetilde M_t = \lim_{u \in D,\, u > t,\, u \to t} M_u.$$
It is clear that $\widetilde M$ has paths that are right continuous with left limits. Since $\mathcal F_{t+} = \mathcal F_t$ and $\widetilde M_t$ is $\mathcal F_{t+}$ measurable, then $\widetilde M_t$ is $\mathcal F_t$ measurable.

Let $N$ be fixed. We will show $\{M_t : t \le N\}$ is a uniformly integrable family of random variables. Let $\varepsilon > 0$. Since $M_N$ is integrable, there exists $\delta$ such that if $P(A) < \delta$, then $E[|M_N|; A] < \varepsilon$. If $L$ is large enough, $P(|M_t| > L) \le E|M_t|/L \le E|M_N|/L < \delta$. Then
$$E[|M_t|; |M_t| > L] \le E[|M_N|; |M_t| > L] < \varepsilon,$$
since $|M_t|$ is a submartingale and $(|M_t| > L) \in \mathcal F_t$. Uniform integrability is proved.

Now let $t < N$. If $B \in \mathcal F_t$,
$$E[\widetilde M_t; B] = \lim_{u \in D,\, u > t,\, u \to t} E[M_u; B] = E[M_t; B].$$
Here we used the Vitali convergence theorem and the fact that $M_t$ is a martingale. Since $\widetilde M_t$ is $\mathcal F_t$ measurable, this proves that $\widetilde M_t = M_t$, a.s. Since $N$ was arbitrary, we have this for all $t$. We thus have found a version of $M$ that has paths that are right continuous with left limits. That $\widetilde M_t$ is a martingale is easy.

The following technical result will be used in the next chapter. A function $f$ is increasing if $s < t$ implies $f(s) \le f(t)$. A process $A_t$ has increasing paths if the function $t \to A_t(\omega)$ is increasing for almost every $\omega$.

Proposition 1.17 Suppose $\{\mathcal F_t\}$ is a filtration satisfying the usual conditions and suppose $A_t$ is an adapted process with paths that are increasing and right continuous with left limits, such that $A_\infty = \lim_{t \to \infty} A_t$ exists, a.s. Suppose $X$ is a non-negative integrable random variable, and $M_t$ is a version of the martingale $E[X \mid \mathcal F_t]$ which has paths that are right continuous with left limits. Suppose $E[X A_\infty] < \infty$. Then
$$E \int_0^\infty X\, dA_s = E \int_0^\infty M_s\, dA_s. \tag{1.9}$$

Proof. First suppose $X$ and $A_\infty$ are bounded. Let $n \ge 1$ and let us write $E\int_0^\infty X\, dA_s$ as
$$\sum_{k=1}^\infty E[X(A_{k/2^n} - A_{(k-1)/2^n})].$$
Conditioning the $k$th summand on $\mathcal F_{k/2^n}$, this is equal to
$$\sum_{k=1}^\infty E\big[E[X \mid \mathcal F_{k/2^n}](A_{k/2^n} - A_{(k-1)/2^n})\big].$$
Given $s$ and $n$, define $s_n$ to be that value of $k/2^n$ such that $(k-1)/2^n < s \le k/2^n$. We then have
$$E \int_0^\infty X\, dA_s = E \int_0^\infty M_{s_n}\, dA_s. \tag{1.10}$$
For any value of $s$, $s_n \downarrow s$ as $n \to \infty$, and since $M$ has right continuous paths, $M_{s_n} \to M_s$. Since $X$ is bounded, so is $M$. By dominated convergence, the right hand side of (1.10) converges to $E\int_0^\infty M_s\, dA_s$. This completes the proof when $X$ and $A_\infty$ are bounded. We apply this to $X \wedge N$ and $A_\cdot \wedge N$, let $N \to \infty$, and use monotone convergence for the general case.

The only reason we assume $X$ is non-negative is so that the integrals make sense. The equation (1.9) can be rewritten as
$$E \int_0^\infty X\, dA_s = E \int_0^\infty E[X \mid \mathcal F_s]\, dA_s. \tag{1.11}$$

We also have
$$E \int_0^t X\, dA_s = E \int_0^t E[X \mid \mathcal F_s]\, dA_s \tag{1.12}$$
for each $t$. This follows either by following the above proof or by applying Proposition 1.17 to $A_{s \wedge t}$.


Chapter 2

Lévy processes

A Lévy process is a process with stationary and independent increments whose paths are right continuous with left limits. Having stationary increments means that the law of $X_t - X_s$ is the same as the law of $X_{t-s} - X_0$ whenever $s < t$. Saying that $X$ has independent increments means that $X_t - X_s$ is independent of $\sigma(X_r;\, r \le s)$ whenever $s < t$.

We want to examine the structure of Lévy processes. We know three examples: the Poisson process, Brownian motion, and the deterministic process $X_t = t$. It turns out all Lévy processes can be built up out of these as building blocks. We will show how to construct Lévy processes and we will give a representation of an arbitrary Lévy process. Recall that we use $X_{t-} = \lim_{s < t, s \to t} X_s$ and $\Delta X_t = X_t - X_{t-}$.

2.1 Examples

Let us begin by looking at some simple Lévy processes. Let $P^j_t$, $j = 1, \ldots, J$, be a sequence of independent Poisson processes with parameters $\lambda_j$, resp. Each $P^j_t$ is a Lévy process, and the formula for the characteristic function of a Poisson random variable shows that the characteristic function of $P^j_t$ is
$$E e^{iuP^j_t} = \exp\big(t\lambda_j(e^{iu} - 1)\big).$$

Therefore the characteristic function of $a_j P^j_t$ is
$$E e^{iua_j P^j_t} = \exp\big(t\lambda_j(e^{iua_j} - 1)\big),$$
and the characteristic function of $a_j P^j_t - a_j \lambda_j t$ is
$$E e^{iu(a_j P^j_t - a_j \lambda_j t)} = \exp\big(t\lambda_j(e^{iua_j} - 1 - iua_j)\big).$$
If we let $m_j$ be the measure on $\mathbb R$ defined by $m_j(dx) = \lambda_j \delta_{a_j}(dx)$, where $\delta_{a_j}(dx)$ is point mass at $a_j$, then the characteristic function for $a_j P^j_t$ can be written as
$$\exp\Big(t \int_{\mathbb R} [e^{iux} - 1]\, m_j(dx)\Big) \tag{2.1}$$
and the one for $a_j P^j_t - a_j \lambda_j t$ as
$$\exp\Big(t \int_{\mathbb R} [e^{iux} - 1 - iux]\, m_j(dx)\Big). \tag{2.2}$$
Now let
$$X_t = \sum_{j=1}^J a_j P^j_t.$$
It is clear that the paths of $X_t$ are right continuous with left limits, and the fact that $X$ has stationary and independent increments follows from the corresponding property of the $P^j$'s. Moreover the characteristic function of a sum of independent random variables is the product of the characteristic functions, so the characteristic function of $X_t$ is given by
$$E e^{iuX_t} = \exp\Big(t \int_{\mathbb R} [e^{iux} - 1]\, m(dx)\Big) \tag{2.3}$$
with $m(dx) = \sum_{j=1}^J \lambda_j \delta_{a_j}(dx)$.

The process $Y_t = X_t - t\sum_{j=1}^J a_j\lambda_j$ is also a Lévy process, and its characteristic function is
$$E e^{iuY_t} = \exp\Big(t \int_{\mathbb R} [e^{iux} - 1 - iux]\, m(dx)\Big), \tag{2.4}$$
again with $m(dx) = \sum_{j=1}^J \lambda_j \delta_{a_j}(dx)$.
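A short numerical check of (2.3) (an added sketch, not from the text; numpy assumed, and the jump sizes and rates below are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(2)
a   = np.array([1.0, -0.5, 2.0])     # jump sizes a_j (hypothetical)
lam = np.array([0.7, 1.3, 0.4])      # rates lambda_j (hypothetical)
t, u, n_paths = 1.5, 0.8, 200_000

# X_t = sum_j a_j P^j_t with independent Poisson processes P^j;
# only the time-t marginals are needed for the characteristic function.
P = rng.poisson(lam * t, size=(n_paths, 3))
X_t = P @ a

empirical = np.exp(1j * u * X_t).mean()
# (2.3) with m = sum_j lambda_j * delta_{a_j}
theory = np.exp(t * np.sum(lam * (np.exp(1j * u * a) - 1.0)))
print(empirical, theory)
```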

Remark 2.1 Recall that if $\varphi$ is the characteristic function of a random variable $Z$, then $\varphi'(0) = iE Z$ and $\varphi''(0) = -E Z^2$. If $Y_t$ is as in the paragraph above, then clearly $E Y_t = 0$, and calculating the second derivative of $E e^{iuY_t}$ at 0, we obtain
$$E Y_t^2 = t \int x^2\, m(dx).$$

The following lemma is a restatement of Corollary 1.8.

Lemma 2.2 If $X_t$ is a Lévy process and $T$ is a finite stopping time, then $X_{T+t} - X_T$ is a Lévy process with the same law as $X_t - X_0$ and independent of $\mathcal F_T$.

We will need the following lemma; a numerical check is sketched after the proof.

Lemma 2.3 Suppose $X_1, \ldots, X_n$ are independent exponential random variables with parameters $a_1, \ldots, a_n$, resp.
(1) Then $\min(X_1, \ldots, X_n)$ is an exponential random variable with parameter $a_1 + \cdots + a_n$.
(2) The probability that $X_i$ is the smallest of the $n$ exponentials is $a_i/(a_1 + \cdots + a_n)$.

Proof. (1) Write
$$P(\min(X_1, \ldots, X_n) > t) = P(X_1 > t, \ldots, X_n > t) = P(X_1 > t) \cdots P(X_n > t) = e^{-a_1 t} \cdots e^{-a_n t} = e^{-(a_1 + \cdots + a_n)t}.$$
(2) Without loss of generality we may suppose $i = 1$. Let's first do the case $n = 2$. The joint density of $(X_1, X_2)$ is $a_1 e^{-a_1 x}\, a_2 e^{-a_2 y}$, and we want to integrate this over $x < y$. Doing the integration yields $a_1/(a_1 + a_2)$. For the general case of $n > 2$, apply the above to $X_1$ and $\min(X_2, \ldots, X_n)$.
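Both parts of Lemma 2.3 are easy to confirm by Monte Carlo (an added sketch; numpy assumed, the rates are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(3)
rates = np.array([1.0, 2.0, 3.0])                    # a_1, a_2, a_3
X = rng.exponential(1.0 / rates, size=(200_000, 3))  # independent exponentials
m = X.min(axis=1)

print("E[min]:", m.mean(), "  1/(a_1+a_2+a_3) =", 1.0 / rates.sum())
print("P(X_1 smallest):", (X.argmin(axis=1) == 0).mean(),
      "  a_1/(a_1+a_2+a_3) =", rates[0] / rates.sum())
```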

If $P^1, \ldots, P^n$ are independent Poisson processes with parameters $\lambda_1, \ldots, \lambda_n$, resp., let $X_t = \sum_{i=1}^n b_i P^i_t$. By the above lemma, the times between jumps of $X$ are independent exponentials with parameter $\lambda_1 + \cdots + \lambda_n$. At each jump, $X$ jumps by $b_i$ with probability $\lambda_i/(\lambda_1 + \cdots + \lambda_n)$. Thus another way to construct $X$ is to let $U_1, U_2, \ldots$ be independent exponentials with parameter $\lambda_1 + \cdots + \lambda_n$ and let $Y_1, Y_2, \ldots$ be a sequence of i.i.d. random variables independent of the $U_i$'s such that $P(Y_k = b_j) = \lambda_j/(\lambda_1 + \cdots + \lambda_n)$. We then let $X_0 = 0$, let $X_t$ be piecewise constant, and at time $\sum_{i=1}^m U_i$ we let $X$ jump by the amount $Y_m$.

2.2 Construction of Lévy processes

A process $X$ has bounded jumps if there exists a real number $K > 0$ such that $\sup_t |\Delta X_t| \le K$, a.s.

Lemma 2.4 If $X_t$ is a Lévy process with bounded jumps and with $X_0 = 0$, then $X_t$ has moments of all orders, that is, $E|X_t|^p < \infty$ for all positive integers $p$.

Proof. Suppose the jumps of $X_t$ are bounded in absolute value by $K$. Since $X_t$ is right continuous with left limits, there exists $M > K$ such that $P(\sup_{s \le t} |X_s| \ge M) \le 1/2$. Let $T_1 = \inf\{t : |X_t| \ge M\}$ and $T_{i+1} = \inf\{t > T_i : |X_t - X_{T_i}| \ge M\}$. For $s < T_1$, $|X_s| \le M$, and then $|X_{T_1}| \le |X_{T_1-}| + |\Delta X_{T_1}| \le M + K \le 2M$. We have
$$P(T_{i+1} < t) \le P(T_i < t,\ T_{i+1} - T_i < t) = P(T_{i+1} - T_i < t)\, P(T_i < t) = P(T_1 < t)\, P(T_i < t),$$
using Lemma 2.2. Now
$$P(T_1 < t) \le P\big(\sup_{s \le t} |X_s| \ge M\big) \le \tfrac12,$$
so $P(T_{i+1} < t) \le \tfrac12 P(T_i < t)$, and then by induction, $P(T_i < t) \le 2^{-i}$. Therefore
$$P\big(\sup_{s \le t} |X_s| \ge 2(i+1)M\big) \le P(T_i < t) \le 2^{-i},$$

and the lemma now follows immediately.

A key lemma is the following.

Lemma 2.5 Suppose $I$ is a finite interval of the form $(a, b)$, $[a, b)$, $(a, b]$, or $[a, b]$ with $a > 0$, and $m$ is a finite measure on $\mathbb R$ giving no mass to $I^c$. Then there exists a Lévy process $X_t$ satisfying (2.3).

Proof. First let us consider the case where $I = [a, b)$. We approximate $m$ by a discrete measure. If $n \ge 1$, let $z_j = a + j(b-a)/2^n$, $j = 0, \ldots, 2^n - 1$, and let
$$m_n(dx) = \sum_{j=0}^{2^n - 1} m([z_j, z_{j+1}))\, \delta_{z_j}(dx),$$
where $\delta_{z_j}$ is point mass at $z_j$. The measures $m_n$ converge weakly to $m$ as $n \to \infty$ in the sense that $\int f(x)\, m_n(dx) \to \int f(x)\, m(dx)$ whenever $f$ is a bounded continuous function on $\mathbb R$.

We let $U_1, U_2, \ldots$ be independent exponential random variables with parameter $m(I)$. Let $Y_1, Y_2, \ldots$ be i.i.d. random variables independent of the $U_i$'s with $P(Y_i \in dx) = m(dx)/m(I)$. We let $X_t$ start at 0 and be piecewise constant with jumps of size $Y_m$ at times $\sum_{i=1}^m U_i$. If we define $X^n_t$ in the exact same way, except that we replace $m$ by $m_n$ and we let $Y^n_i = z_j$ if $Y_i \in [z_j, z_{j+1})$, then we know from the previous section that $X^n_t$ is a Lévy process with Lévy measure $m_n$. Moreover each $Y^n_i$ differs from $Y_i$ by at most $(b-a)2^{-n}$, so
$$\sup_{s \le t} |X^n_s - X_s| \le (b-a)2^{-n} N,$$
where $N$ is the number of jumps of these processes before time $t$. $N$ is a Poisson random variable with parameter $m(I)t$, so has moments of all orders. It follows that $X^n_t$ converges uniformly to $X_t$ almost surely on each finite interval, and the difference goes to 0 in $L^p$ for each $p$.

We conclude that the law of $X_t - X_s$ is independent of $\mathcal F_s$ and is the same as that of $X_{t-s}$ because these hold for each $X^n$. Since $x \to e^{iux}$ is a bounded continuous function and $m_n$ converges weakly to $m$, starting with
$$E \exp(iuX^n_t) = \exp\Big(t \int [e^{iux} - 1]\, m_n(dx)\Big)$$
and passing to the limit, we obtain that the characteristic function of $X$ is given by (2.3).

If now the interval $I$ contains the point $b$, we follow the above proof, except that the jump rate for the rightmost subinterval is taken to be $m([z_{2^n-1}, b])$. Similarly, if $I$ does not contain the point $a$, we use the rate $m((a, z_1))$ for the leftmost subinterval. With these changes, the proof works for intervals $I$, whether or not they contain either of their endpoints.

Remark 2.6 If $X$ is the Lévy process constructed in Lemma 2.5, then $Y_t = X_t - E X_t$ will be a Lévy process satisfying (2.4).

Here is the main theorem of this section.

Theorem 2.7 Suppose $m$ is a measure on $\mathbb R$ with $m(\{0\}) = 0$ and $\int (1 \wedge x^2)\, m(dx) < \infty$. Suppose $b \in \mathbb R$ and $\sigma \ge 0$. There exists a Lévy process $X_t$ such that
$$E e^{iuX_t} = \exp\Big(t\Big\{iub - \frac{\sigma^2 u^2}{2} + \int_{\mathbb R} \big[e^{iux} - 1 - iux 1_{(|x| \le 1)}\big]\, m(dx)\Big\}\Big). \tag{2.5}$$
The above equation is called the Lévy-Khintchine formula. The measure $m$ is called the Lévy measure.

If we let
$$m'(dx) = \frac{x^2}{1 + x^2}\, m(dx) \quad\text{and}\quad b' = b - \int_{(|x| \le 1)} \frac{x^3}{1+x^2}\, m(dx) + \int_{(|x| > 1)} \frac{x}{1+x^2}\, m(dx),$$
then we can also write
$$E e^{iuX_t} = \exp\Big(t\Big\{iub' - \frac{\sigma^2 u^2}{2} + \int_{\mathbb R} \Big[e^{iux} - 1 - \frac{iux}{1+x^2}\Big]\frac{1+x^2}{x^2}\, m'(dx)\Big\}\Big).$$
Both expressions for the Lévy-Khintchine formula are in common use.

Proof. Let $m$ be a measure supported on $(0, 1]$ with $\int x^2\, m(dx) < \infty$. Let $m_n(dx)$ be the measure $m$ restricted to $(2^{-n}, 2^{-n+1}]$. Let $Y^n_t$ be independent Lévy processes whose characteristic functions are given by (2.4) with $m$ replaced by $m_n$; see Remark 2.6. Note $E Y^n_t = 0$ for all $n$ by Remark 2.1. By the independence of the $Y^n$'s, if $M < N$,
$$E\Big(\sum_{n=M}^N Y^n_t\Big)^2 = \sum_{n=M}^N E(Y^n_t)^2 = t \int_{2^{-N}}^{2^{-M}} x^2\, m(dx).$$
By our assumption on $m$, this goes to 0 as $M, N \to \infty$, and we conclude that $\sum_{n=0}^N Y^n_t$ converges in $L^2$ for each $t$. Call the limit $Y_t$. It is routine to check that $Y_t$ has independent and stationary increments.

Each $Y^n_t$ has independent increments and is mean 0, so $E[Y^n_t - Y^n_s \mid \mathcal F_s] = E[Y^n_t - Y^n_s] = 0$; that is, $Y^n$ is a martingale. By Doob's inequalities and the $L^2$ convergence,
$$E \sup_{s \le t}\Big|\sum_{n=M}^N Y^n_s\Big|^2 \to 0$$
as $M, N \to \infty$, and hence there exists a subsequence $M_k$ such that $\sum_{n=1}^{M_k} Y^n_s$ converges uniformly over $s \le t$, a.s. Therefore the limit $Y_t$ will have paths that are right continuous with left limits.

If $m$ is a measure supported in $(1, \infty)$ with $m(\mathbb R) < \infty$, we do a similar procedure starting with Lévy processes whose characteristic functions are of the form (2.3). We let $m_n(dx)$ be the restriction of $m$ to $(2^n, 2^{n+1}]$, let $X^n_t$ be independent Lévy processes corresponding to $m_n$, and form $X_t = \sum_{n=0}^\infty X^n_t$.

Since $m(\mathbb R) < \infty$, for each $t_0$, the number of times $t$ less than $t_0$ at which any one of the $X^n_t$ jumps is finite. This shows $X_t$ has paths that are right continuous with left limits, and it is easy to then see that $X_t$ is a Lévy process.

Finally, suppose $\int (1 \wedge x^2)\, m(dx) < \infty$. Let $X^1_t$, $X^2_t$ be Lévy processes with characteristic functions given by (2.3) with $m$ replaced by the restriction of $m$ to $(1, \infty)$ and $(-\infty, -1)$, resp., let $X^3_t$, $X^4_t$ be Lévy processes with characteristic functions given by (2.4) with $m$ replaced by the restriction of $m$ to $(0, 1]$ and $[-1, 0)$, resp., let $X^5_t = bt$, and let $X^6_t$ be $\sigma$ times a Brownian motion. Suppose the $X^i$'s are all independent. Then their sum will be a Lévy process whose characteristic function is given by (2.5).

A key step in the construction was the centering of the Poisson processes to get Lévy processes with characteristic functions given by (2.4). Without the centering one is forced to work only with characteristic functions given by (2.3).
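As a quick sanity check on (2.5) (an added remark, not from the text), the three building blocks correspond to simple triplets $(b, \sigma, m)$:

$$\text{Poisson process with rate } \lambda:\quad (b, \sigma, m) = (\lambda, 0, \lambda\delta_1),\qquad iu\lambda + \lambda\big(e^{iu} - 1 - iu\big) = \lambda(e^{iu} - 1);$$
$$\text{Brownian motion}:\quad (0, 1, 0);\qquad\qquad X_t = t:\quad (1, 0, 0),$$

where the Poisson computation uses the convention $1_{(|x| \le 1)}$ in (2.5), so that the unit jump falls inside the cutoff.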

2.3 Representation of Lévy processes

We now work towards showing that every Lévy process has a characteristic function of the form given by (2.5).

Lemma 2.8 If $X_t$ is a Lévy process and $A$ is a Borel subset of $\mathbb R$ that is a positive distance from 0, then $N_t(A) = \sum_{s \le t} 1_A(\Delta X_s)$ is a Poisson process.

Saying that $A$ is a positive distance from 0 means that $\inf\{|x| : x \in A\} > 0$.

Proof. Since $X_t$ has paths that are right continuous with left limits and $A$ is a positive distance from 0, there can only be finitely many jumps of $X$ that lie in $A$ in any finite time interval, and so $N_t(A)$ is finite and has paths that are right continuous with left limits. It follows from the fact that $X_t$ has stationary and independent increments that $N_t(A)$ also has stationary and independent increments. We now apply Proposition 1.10.

Our main result is that $N_t(A)$ and $N_t(B)$ are independent if $A$ and $B$ are disjoint.

Theorem 2.9 Let $\{\mathcal F_t\}$ be a filtration satisfying the usual conditions. Suppose that $N_t(A)$ is a Poisson point process with respect to the measure $\lambda$. If $A_1, \ldots, A_n$ are pairwise disjoint measurable subsets of $\mathbb R$ with $E N_t(A_k) < \infty$ for $k = 1, \ldots, n$, then the processes $N_t(A_1), \ldots, N_t(A_n)$ are mutually independent.

Proof. Define $\lambda(A) = E N_1(A)$. The previous lemma shows that if $\lambda(A) < \infty$, then $N_t(A)$ is a Poisson process, and clearly its parameter is $\lambda(A)$. We first make the observation that because $A_1, A_2, \ldots, A_n$ are disjoint, no two of the $N_t(A_k)$ have jumps at the same time.

To prove the theorem, it suffices to let $0 = r_0 < r_1 < \cdots < r_m$ and show that the random variables $\{N_{r_j}(A_k) - N_{r_{j-1}}(A_k) : 1 \le j \le m,\, 1 \le k \le n\}$ are independent. Since for each $j$ and each $k$, $N_{r_j}(A_k) - N_{r_{j-1}}(A_k)$ is independent of $\mathcal F_{r_{j-1}}$, it suffices to show that for each $j \le m$, the random variables $\{N_{r_j}(A_k) - N_{r_{j-1}}(A_k) : 1 \le k \le n\}$ are independent. We will do the case $j = m = 1$ and write $r$ for $r_j$ for simplicity; the case when $j, m > 1$ differs only in notation.

We will prove this using induction. We start with the case $n = 2$ and show the independence of $N_r(A_1)$ and $N_r(A_2)$. Each $N_t(A_k)$ is a Poisson process, and so $N_t(A_k)$ has moments of all orders. Let $u_1, u_2 \in \mathbb R$ and set $\varphi_k = \lambda(A_k)(e^{iu_k} - 1)$, $k = 1, 2$. Let
$$M^k_t = e^{iu_k N_t(A_k) - t\varphi_k}.$$
We see that $M^k_t$ is a martingale because $E e^{iu_k N_t(A_k)} = e^{t\varphi_k}$, and therefore
$$E[M^k_t \mid \mathcal F_s] = M^k_s\, E\big[e^{iu_k(N_t(A_k) - N_s(A_k)) - (t-s)\varphi_k} \mid \mathcal F_s\big] = M^k_s\, e^{-(t-s)\varphi_k}\, E\big[e^{iu_k(N_t(A_k) - N_s(A_k))}\big] = M^k_s,$$

using the independence and stationarity of the increments of a Poisson process. Now we can write
$$E[M^1_t M^2_t] = E[M^1_t] + E \int_0^t M^1_t\, dM^2_s = 1 + E \int_0^t M^1_s\, dM^2_s,$$
using that $M^2_0 = 1$, $M^1$ is a martingale, and Proposition 1.17. (Here $M^2$ is the difference of two increasing processes; the adjustments needed are easy.) Since we have argued that no two of the $N_t(A_k)$ jump at the same time, the same is true for the $M^k_t$, and so the above is equal to
$$1 + E \int_0^t M^1_{s-}\, dM^2_s.$$
It therefore remains to prove that the above integral is equal to 0. If $H_s$ is a process of the form
$$H_s(\omega) = K(\omega) 1_{(a,b]}(s),$$
where $K$ is $\mathcal F_a$ measurable, then
$$\int_0^t H_s\, dM^2_s = K\big(M^2_{t \wedge b} - M^2_{t \wedge a}\big),$$
and conditioning on $\mathcal F_a$, the expectation is zero:
$$E\big[K(M^2_{t \wedge b} - M^2_{t \wedge a})\big] = E\big[K\, E[M^2_{t \wedge b} - M^2_{t \wedge a} \mid \mathcal F_a]\big] = 0,$$
using that $M^2$ is a martingale. We are doing Lebesgue-Stieltjes integrals here, but the argument is similar to one used with stochastic integrals. The expectation is also 0 for linear combinations of such $H_s$. Since $M^1_{s-}$ is left continuous, we can approximate it by such $H_s$, and therefore the integral is 0 as required. We thus have
$$E\, M^1_r M^2_r = 1.$$

This implies
$$E\, e^{i(u_1 N_r(A_1) + u_2 N_r(A_2))} = e^{r\varphi_1}\, e^{r\varphi_2} = E\, e^{iu_1 N_r(A_1)}\; E\, e^{iu_2 N_r(A_2)}.$$
Since this holds for all $u_1, u_2$, then $N_r(A_1)$ and $N_r(A_2)$ are independent. We conclude that the processes $N_t(A_1)$ and $N_t(A_2)$ are independent.

To handle the case $n = 3$, we first show that $M^1_t M^2_t$ is a martingale. We write
$$E[M^1_t M^2_t \mid \mathcal F_s] = M^1_s M^2_s\, e^{-(t-s)(\varphi_1 + \varphi_2)}\, E\big[e^{i(u_1(N_t(A_1) - N_s(A_1)) + u_2(N_t(A_2) - N_s(A_2)))} \mid \mathcal F_s\big]$$
$$= M^1_s M^2_s\, e^{-(t-s)(\varphi_1 + \varphi_2)}\, E\big[e^{i(u_1(N_t(A_1) - N_s(A_1)) + u_2(N_t(A_2) - N_s(A_2)))}\big] = M^1_s M^2_s,$$
using the fact that $N_t(A_1)$ and $N_t(A_2)$ are independent of each other and each have stationary and independent increments. Note that $M^3_t = e^{iu_3 N_t(A_3) - t\varphi_3}$ has no jumps in common with $M^1_t$ or $M^2_t$. Therefore, if $\overline M{}^3_t = M^3_{t \wedge r}$, then as before
$$E \int_0^\infty (M^1 M^2)_{s-}\, d\overline M{}^3_s = 0,$$
and this leads to
$$E\big[M^3_r(M^1_r M^2_r)\big] = 1.$$
As above this implies that $N_r(A_1)$, $N_r(A_2)$, and $N_r(A_3)$ are independent. The proof of the general induction step is similar.

We will also need the following corollary.

Corollary 2.10 Let $\mathcal F_t$ and $N_t(A_k)$ be as in Theorem 2.9. Suppose $Y_t$ is a process with paths that are right continuous with left limits such that $Y_t - Y_s$ is independent of $\mathcal F_s$ whenever $s < t$ and $Y_t - Y_s$ has the same law as $Y_{t-s}$ for each $s < t$. Suppose moreover that $Y$ has no jumps in common with any of the $N_t(A_k)$. Then the processes $N_t(A_1), \ldots, N_t(A_n)$, and $Y_t$ are independent.

Proof. The law of $Y_t$ is the same as that of $Y_t - Y_0$, so $Y_0 = 0$, a.s. By the fact that $Y$ has stationary and independent increments,
$$E\, e^{iuY_{s+t}} = E\, e^{iuY_s}\, E\, e^{iu(Y_{s+t} - Y_s)} = E\, e^{iuY_s}\, E\, e^{iuY_t},$$
which implies that the characteristic function of $Y$ is of the form $E\, e^{iuY_t} = e^{t\psi(u)}$ for some function $\psi(u)$. We fix $u \in \mathbb R$ and define
$$M^Y_t = e^{iuY_t - t\psi(u)}.$$
As in the proof of Theorem 2.9, we see that $M^Y_t$ is a martingale. Since $M^Y$ has no jumps in common with any of the $M^k_t$, if $\overline M{}^Y_t = M^Y_{t \wedge r}$, we see as above that
$$E\big[M^Y_r\, M^1_r \cdots M^n_r\big] = 1.$$
This leads as above to the independence of $Y$ from all the $N_t(A_k)$'s.

Here is the representation theorem for Lévy processes.

Theorem 2.11 Suppose $X_t$ is a Lévy process with $X_0 = 0$. Then there exists a measure $m$ on $\mathbb R \setminus \{0\}$ with $\int (1 \wedge x^2)\, m(dx) < \infty$ and real numbers $b$ and $\sigma$ such that the characteristic function of $X_t$ is given by (2.5).

Proof. Define $m(A) = E N_1(A)$ if $A$ is a bounded Borel subset of $(0, \infty)$ that is a positive distance from 0. Since $N_1(\cup_{k=1}^\infty A_k) = \sum_{k=1}^\infty N_1(A_k)$ if the $A_k$ are pairwise disjoint and each is a positive distance from 0, we see that $m$ is a measure on $[a, b]$ for each $0 < a < b < \infty$, and $m$ extends uniquely to a measure on $(0, \infty)$.

First we want to show that $\sum_{s \le t} \Delta X_s 1_{(\Delta X_s > 1)}$ is a Lévy process with characteristic function
$$\exp\Big(t \int_1^\infty [e^{iux} - 1]\, m(dx)\Big).$$
Since the characteristic function of the sum of independent random variables is equal to the product of the characteristic functions, it suffices to suppose $0 < a < b$ and to show that
$$E\, e^{iuZ_t} = \exp\Big(t \int_{(a,b]} [e^{iux} - 1]\, m(dx)\Big),$$
where $Z_t = \sum_{s \le t} \Delta X_s 1_{(a,b]}(\Delta X_s)$.

Let $n \ge 1$ and $z_j = a + j(b-a)/n$. By Lemma 2.8, $N_t((z_j, z_{j+1}])$ is a Poisson process with parameter $l_j = E N_1((z_j, z_{j+1}]) = m((z_j, z_{j+1}])$. Thus $Z^n_t = \sum_{j=0}^{n-1} z_j N_t((z_j, z_{j+1}])$ has characteristic function
$$\prod_{j=0}^{n-1} \exp\big(t l_j(e^{iuz_j} - 1)\big) = \exp\Big(t \sum_{j=0}^{n-1} (e^{iuz_j} - 1) l_j\Big),$$
which is equal to
$$\exp\Big(t \int (e^{iux} - 1)\, m_n(dx)\Big), \tag{2.6}$$
where $m_n(dx) = \sum_{j=0}^{n-1} l_j \delta_{z_j}(dx)$. Since $Z^n_t$ converges to $Z_t$ as $n \to \infty$, passing to the limit shows that $Z_t$ has a characteristic function of the desired form.

Next we show that $m(1, \infty) < \infty$. (We write $m(1, \infty)$ instead of $m((1, \infty))$ for esthetic reasons.) If not, $m(1, K) \to \infty$ as $K \to \infty$. Then for each fixed $L$ and each fixed $t$,
$$\lim_{K \to \infty} P(N_t(1, K) \le L) = \lim_{K \to \infty} \sum_{j=0}^L e^{-t m(1,K)} \frac{(t\, m(1, K))^j}{j!} = 0.$$
This implies that $N_t(1, \infty) = \infty$ for each $t$. However, this contradicts the fact that $X_t$ has paths that are right continuous with left limits.

We define $m$ on $(-\infty, 0)$ similarly. We now look at
$$Y_t = X_t - \sum_{s \le t} \Delta X_s 1_{(|\Delta X_s| > 1)}.$$
This is again a Lévy process, and we need to examine its structure. This process has bounded jumps, hence has moments of all orders. By subtracting $c_1 t$ for an appropriate constant $c_1$, we may suppose $Y_t$ has mean 0.

Let $I_1, I_2, \ldots$ be an ordering of the intervals $\{[2^{-(m+1)}, 2^{-m}),\, (-2^{-m}, -2^{-(m+1)}] : m \ge 0\}$. Let
$$\overline X{}^k_t = \sum_{s \le t} \Delta X_s 1_{(\Delta X_s \in I_k)}$$
and let $X^k_t = \overline X{}^k_t - E \overline X{}^k_t$. By the fact that all the $X^k$ have mean 0 and are independent,
$$\sum_{k=1}^\infty E(X^k_t)^2 = E\Big[\Big(\sum_{k=1}^\infty X^k_t\Big)^2\Big] \le E\Big[\Big(Y_t - \sum_{k=1}^\infty X^k_t\Big)^2\Big] + E\Big[\Big(\sum_{k=1}^\infty X^k_t\Big)^2\Big] = E(Y_t)^2 < \infty.$$
Hence
$$E\Big[\Big(\sum_{k=M}^N X^k_t\Big)^2\Big] = \sum_{k=M}^N E(X^k_t)^2$$
tends to 0 as $M, N \to \infty$, and thus $Y_t - \sum_{k=1}^N X^k_t$ converges in $L^2$. The limit, $X^c_t$, say, will be a Lévy process independent of all the $X^k_t$. Moreover, $X^c$ has no jumps, i.e., it is continuous. Since all the $X^k$ have mean 0, then $E X^c_t = 0$. By the independence of the increments,
$$E[X^c_t - X^c_s \mid \mathcal F_s] = E[X^c_t - X^c_s] = 0,$$
and we see $X^c$ is a continuous martingale. Using the stationarity and independence of the increments,
$$E[(X^c_{s+t})^2] = E[(X^c_s)^2] + 2E[X^c_s(X^c_{s+t} - X^c_s)] + E[(X^c_{s+t} - X^c_s)^2] = E[(X^c_s)^2] + E[(X^c_t)^2],$$

which implies that there exists a constant $c_2$ such that $E(X^c_t)^2 = c_2 t$. We then have
$$E[(X^c_t)^2 - c_2 t \mid \mathcal F_s] = (X^c_s)^2 - c_2 s + E[(X^c_t - X^c_s)^2 \mid \mathcal F_s] - c_2(t-s) = (X^c_s)^2 - c_2 s + E[(X^c_t - X^c_s)^2] - c_2(t-s) = (X^c_s)^2 - c_2 s.$$
The quadratic variation process of $X^c$ is therefore $c_2 t$, and by Lévy's theorem, $X^c_t/\sqrt{c_2}$ is a Brownian motion; that is, $X^c$ is a constant multiple of Brownian motion.

To complete the proof, it remains to show that $\int_{-1}^1 x^2\, m(dx) < \infty$. But by Remark 2.1,
$$\int_{I_k} x^2\, m(dx) = E(X^k_1)^2,$$
and we have seen that $\sum_k E(X^k_1)^2 \le E Y_1^2 < \infty$. Combining gives the finiteness of $\int_{-1}^1 x^2\, m(dx)$.

2.4 Symmetric stable processes

Let $\alpha \in (0, 2)$. If
$$m(dx) = \frac{c}{|x|^{1+\alpha}}\, dx,$$
we have what is called a symmetric stable process of index $\alpha$. We see that $\int (1 \wedge x^2)\, m(dx)$ is finite. Because $|x|^{-1-\alpha}$ is symmetric, in the Lévy-Khintchine formula we can take $iux 1_{(|x| < a)}$ for any $a$ instead of $iux 1_{(|x| < 1)}$. Then
$$\int \big[e^{iux} - 1 - iux 1_{(|x| < 1)}\big] \frac{c}{|x|^{1+\alpha}}\, dx = \int \big[e^{iux} - 1 - iux 1_{(|x| < 1/|u|)}\big] \frac{c}{|x|^{1+\alpha}}\, dx = |u|^\alpha \int \big[e^{iy} - 1 - iy 1_{(|y| < 1)}\big] \frac{c}{|y|^{1+\alpha}}\, dy = -c'|u|^\alpha.$$

In the last line we have the negative sign because the imaginary part of $e^{iy} - 1 - iy 1_{(|y| < 1)}$ is odd, so integrates to zero, and the real part is negative since $\cos y \le 1$. Therefore if $X_t$ is a symmetric stable process of index $\alpha$,
$$E\, e^{iuX_t} = e^{-c' t |u|^\alpha}.$$
An exercise is to show that if $a > 0$ and $X_t$ is a symmetric stable process of index $\alpha$, then $X_{at}$ and $a^{1/\alpha} X_t$ have the same law; a simulation sketch related to this scaling appears at the end of this paragraph. By an exercise in Chung's book,
$$P(X_1 > A) \sim c A^{-\alpha}, \qquad A \to \infty, \tag{2.7}$$
where $f \sim g$ means the ratio of the two sides goes to 1. Since $e^{-c' t |u|^\alpha}$ is integrable, $X_t$ has a continuous density function $p_t(x)$. We have
$$p_t(0) = \frac{1}{2\pi} \int e^{-c' t |u|^\alpha}\, du, \tag{2.8}$$
and by a change of variables,
$$p_t(0) = c'' t^{-1/\alpha}. \tag{2.9}$$
If $x \ne 0$, then
$$p_t(x) = \frac{1}{2\pi} \int e^{-iux} e^{-c' t |u|^\alpha}\, du = \frac{1}{2\pi} \int (\cos ux - i \sin ux)\, e^{-c' t |u|^\alpha}\, du.$$
Since $\sin ux$ is an odd function of $u$, the imaginary term is 0. Since $\cos ux \le 1$, and in fact is strictly less than 1 except at countably many values of $u$, we see that
$$p_t(x) < p_t(0). \tag{2.10}$$
If $\beta < 1$, we can take $m(dx) = c/x^{1+\beta}\, dx$ for $x > 0$ and $0$ for $x < 0$. We can also take the Lévy-Khintchine exponent to be just $\int [e^{iux} - 1]\, m(dx)$ if we take the drift term to cancel out the $iux 1_{(|x| < 1)}$ term. This reflects that here we do not need to subtract the mean to get convergence of the compound Poisson processes. In this case we get the one-sided stable processes of index $\beta$. The paths of such a process only increase.
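A simulation sketch for the symmetric stable law (added here, not from the text). It uses the Chambers-Mallows-Stuck representation, which in the normalization below produces $E\, e^{iuX_1} = e^{-|u|^\alpha}$, i.e. $c' = 1$; numpy assumed, $\alpha \ne 1$:

```python
import numpy as np

rng = np.random.default_rng(4)

def sym_stable(alpha, size):
    # Chambers-Mallows-Stuck sampler for the standard symmetric
    # alpha-stable law: E exp(iuX) = exp(-|u|^alpha)   (alpha != 1).
    V = rng.uniform(-np.pi / 2, np.pi / 2, size)
    W = rng.exponential(1.0, size)
    return (np.sin(alpha * V) / np.cos(V) ** (1.0 / alpha)
            * (np.cos((1.0 - alpha) * V) / W) ** ((1.0 - alpha) / alpha))

alpha, t, u, n = 1.5, 2.0, 0.7, 200_000
# scaling property from the exercise: X_t has the same law as t^{1/alpha} X_1
X_t = t ** (1.0 / alpha) * sym_stable(alpha, n)
print(np.exp(1j * u * X_t).mean().real, np.exp(-t * abs(u) ** alpha))
```

The two printed values should agree up to Monte Carlo error; heavy tails make moments unreliable here, which is why the check is through the characteristic function rather than through sample moments.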

There is a notion of subordination which is very curious. Suppose that $T_t$ is a one-sided stable process of index $\beta$ with $\beta < 1$. Let $W_t$ be a Brownian motion independent of $T_t$. Then $Y_t = W_{T_t}$ is a symmetric stable process of index $2\beta$. Let's see why that is so. That $Y$ is a Lévy process is not hard to see. We must therefore identify its characteristic function. If $P_t$ is a Poisson process with parameter $\lambda$, then
$$E\, e^{-uP_t} = \sum_{k=0}^\infty e^{-\lambda t} \frac{(\lambda t)^k}{k!} e^{-uk} = e^{-\lambda t} e^{e^{-u}\lambda t} = e^{\lambda t(e^{-u} - 1)}.$$
Using that the moment generating function of a sum of independent random variables is the product of the moment generating functions and taking limits, we see that
$$E\, e^{-uT_t} = e^{-c t u^\beta}.$$
Then
$$E\, e^{iuY_t} = \int_0^\infty E\, e^{iuW_s}\, P(T_t \in ds) = \int_0^\infty e^{-u^2 s/2}\, P(T_t \in ds) = E\, e^{-u^2 T_t/2} = e^{-ct(u^2/2)^\beta} = e^{-c'' t |u|^{2\beta}}.$$


Chapter 3

Stochastic calculus

In this chapter we investigate the stochastic calculus for processes which may have jumps as well as a continuous component. If $X$ is not a continuous process, it is no longer true that $X_{t \wedge T_N}$ is a bounded process when $T_N = \inf\{t : |X_t| \ge N\}$, since there could be a large jump at time $T_N$. We investigate stochastic integrals with respect to square integrable (not necessarily continuous) martingales, Itô's formula, and the Girsanov transformation. We prove the reduction theorem that allows us to look at semimartingales that are not necessarily bounded.

We will need the Doob-Meyer decomposition, which can be found in Chapter 16 of Bass, Stochastic Processes. That in turn depends on the debut and section theorems. A simpler proof than the standard one for the debut and section theorems can be found on the arXiv.

3.1 Decomposition of martingales

We assume throughout this chapter that $\{\mathcal F_t\}$ is a filtration satisfying the usual conditions. This means that each $\mathcal F_t$ contains every $P$-null set and $\cap_{\varepsilon > 0} \mathcal F_{t+\varepsilon} = \mathcal F_t$ for each $t$.

Let us start with a few definitions and facts. The predictable $\sigma$-field is the $\sigma$-field of subsets of $[0, \infty) \times \Omega$ generated by the collection of bounded, left continuous processes that are adapted to $\{\mathcal F_t\}$.

A stopping time $T$ is predictable and predicted by the sequence of stopping times $T_n$ if $T_n \uparrow T$ and $T_n < T$ on the event $(T > 0)$. A stopping time $T$ is totally inaccessible if $P(T = S) = 0$ for every predictable stopping time $S$. The graph of a stopping time $T$ is $[T, T] = \{(t, \omega) : t = T(\omega) < \infty\}$.

If $X_t$ is a process that is right continuous with left limits, we set $X_{t-} = \lim_{s \to t, s < t} X_s$ and $\Delta X_t = X_t - X_{t-}$. Thus $\Delta X_t$ is the size of the jump of $X_t$ at time $t$.

Let's look at some examples. If $W_t$ is a Brownian motion and $T = \inf\{t : W_t = 1\}$, then $T_n = \inf\{t : W_t = 1 - (1/n)\}$ are stopping times that predict $T$. On the other hand, if $P_t$ is a Poisson process (with parameter 1, say, for convenience), then we claim that $T = \inf\{t : P_t = 1\}$ is totally inaccessible. To show this, suppose $S$ is a stopping time and $S_n \uparrow S$ are stopping times such that $S_n < S$ on $(S > 0)$. We will show that $P(S = T) = 0$. To do that, it suffices to show that $P(S \wedge N = T) = 0$ for each positive integer $N$. Since $P_t - t$ is a martingale, $E P_{S_n \wedge N} = E(S_n \wedge N)$. Letting $n \to \infty$, we obtain (by monotone convergence) that $E P_{(S \wedge N)-} = E(S \wedge N)$. We also know that $E P_{S \wedge N} = E(S \wedge N)$. Therefore $E P_{(S \wedge N)-} = E P_{S \wedge N}$. Since $P$ has increasing paths, this implies that $P_{(S \wedge N)-} = P_{S \wedge N}$, and we conclude $P(S \wedge N = T) = 0$.

In this chapter we will assume throughout, for simplicity, that every jump time of whichever process we are considering is totally inaccessible. The general case is not much harder, but the differences are only technical.

A supermartingale $Z$ is of class D if the family of random variables
$$\{Z_T : T \text{ a finite stopping time}\}$$
is uniformly integrable.

Theorem 3.1 (Doob-Meyer decomposition) Let $\{\mathcal F_t\}$ be a filtration satisfying the usual conditions and let $Z$ be a supermartingale of class D whose paths are right continuous with left limits. Then $Z$ can be written $Z_t = M_t - A_t$ in one and only one way, where $M$ and $A$ are adapted processes whose paths are right continuous with left limits, $A$ has continuous increasing paths and $A_\infty = \lim_{t \to \infty} A_t$ is integrable, and $M$ is a uniformly integrable martingale.

Suppose $A_t$ is a bounded increasing process whose paths are right continuous with left limits.

Recall that a function $f$ is increasing if $s < t$ implies $f(s) \le f(t)$. Then trivially $A_t$ is a submartingale, and by the Doob-Meyer decomposition, there exists a continuous increasing process $\widetilde A_t$ such that $A_t - \widetilde A_t$ is a martingale. We call $\widetilde A_t$ the compensator of $A_t$. If $A_t = B_t - C_t$ is the difference of two increasing processes $B_t$ and $C_t$, then we can use linearity to define $\widetilde A_t$ as $\widetilde B_t - \widetilde C_t$. We can even extend the notion of compensator to the case where $A_t$ is complex valued and has paths that are locally of bounded variation by looking at the real and imaginary parts.

We will use the following lemma. For any increasing process $A$ we let $A_\infty = \lim_{t \to \infty} A_t$.

Lemma 3.2 Suppose $A_t$ has increasing paths that are right continuous with left limits, $A_t \le K$ a.s. for each $t$, and let $B_t$ be its compensator. Then $E B_\infty^2 \le 2K^2$.

Proof. If $M_t = A_t - B_t$, then $M_t$ is a martingale, and then $E[M_\infty - M_t \mid \mathcal F_t] = 0$. We then write
$$E B_\infty^2 = 2E \int_0^\infty (B_\infty - B_t)\, dB_t = 2E \int_0^\infty E[B_\infty - B_t \mid \mathcal F_t]\, dB_t = 2E \int_0^\infty E[A_\infty - A_t \mid \mathcal F_t]\, dB_t \le 2K\, E \int_0^\infty dB_t = 2K\, E B_\infty = 2K\, E A_\infty \le 2K^2.$$

From the lemma we get the following corollary.

Corollary 3.3 If $A_t = B_t - C_t$, where $B_t$ and $C_t$ are increasing right continuous processes with $B_0 = C_0 = 0$, a.s., and in addition $B$ and $C$ are bounded, then
$$E \sup_t \widetilde A_t^2 < \infty.$$
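As a concrete illustration of the compensator (an added remark, not from the text; strictly, the boundedness assumption above is removed by stopping or localization): for a Poisson process $P_t$ with parameter $\lambda$, Section 1.5 shows $P_t - \lambda t$ is a martingale, so

$$\widetilde P_t = \lambda t.$$

The jumps of $P$ are totally inaccessible, as shown at the start of this section, and correspondingly the compensator is continuous: the unit jumps are "smoothed out" into a deterministic drift.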


More information

Brownian Motion and Conditional Probability

Brownian Motion and Conditional Probability Math 561: Theory of Probability (Spring 2018) Week 10 Brownian Motion and Conditional Probability 10.1 Standard Brownian Motion (SBM) Brownian motion is a stochastic process with both practical and theoretical

More information

Part III Stochastic Calculus and Applications

Part III Stochastic Calculus and Applications Part III Stochastic Calculus and Applications Based on lectures by R. Bauerschmidt Notes taken by Dexter Chua Lent 218 These notes are not endorsed by the lecturers, and I have modified them often significantly

More information

Selected Exercises on Expectations and Some Probability Inequalities

Selected Exercises on Expectations and Some Probability Inequalities Selected Exercises on Expectations and Some Probability Inequalities # If E(X 2 ) = and E X a > 0, then P( X λa) ( λ) 2 a 2 for 0 < λ

More information

Stochastic Processes

Stochastic Processes Stochastic Processes A very simple introduction Péter Medvegyev 2009, January Medvegyev (CEU) Stochastic Processes 2009, January 1 / 54 Summary from measure theory De nition (X, A) is a measurable space

More information

Doléans measures. Appendix C. C.1 Introduction

Doléans measures. Appendix C. C.1 Introduction Appendix C Doléans measures C.1 Introduction Once again all random processes will live on a fixed probability space (Ω, F, P equipped with a filtration {F t : 0 t 1}. We should probably assume the filtration

More information

Optional Stopping Theorem Let X be a martingale and T be a stopping time such

Optional Stopping Theorem Let X be a martingale and T be a stopping time such Plan Counting, Renewal, and Point Processes 0. Finish FDR Example 1. The Basic Renewal Process 2. The Poisson Process Revisited 3. Variants and Extensions 4. Point Processes Reading: G&S: 7.1 7.3, 7.10

More information

7 Poisson random measures

7 Poisson random measures Advanced Probability M03) 48 7 Poisson random measures 71 Construction and basic properties For λ 0, ) we say that a random variable X in Z + is Poisson of parameter λ and write X Poiλ) if PX n) e λ λ

More information

3 (Due ). Let A X consist of points (x, y) such that either x or y is a rational number. Is A measurable? What is its Lebesgue measure?

3 (Due ). Let A X consist of points (x, y) such that either x or y is a rational number. Is A measurable? What is its Lebesgue measure? MA 645-4A (Real Analysis), Dr. Chernov Homework assignment 1 (Due ). Show that the open disk x 2 + y 2 < 1 is a countable union of planar elementary sets. Show that the closed disk x 2 + y 2 1 is a countable

More information

ADVANCED PROBABILITY: SOLUTIONS TO SHEET 1

ADVANCED PROBABILITY: SOLUTIONS TO SHEET 1 ADVANCED PROBABILITY: SOLUTIONS TO SHEET 1 Last compiled: November 6, 213 1. Conditional expectation Exercise 1.1. To start with, note that P(X Y = P( c R : X > c, Y c or X c, Y > c = P( c Q : X > c, Y

More information

MAT 570 REAL ANALYSIS LECTURE NOTES. Contents. 1. Sets Functions Countability Axiom of choice Equivalence relations 9

MAT 570 REAL ANALYSIS LECTURE NOTES. Contents. 1. Sets Functions Countability Axiom of choice Equivalence relations 9 MAT 570 REAL ANALYSIS LECTURE NOTES PROFESSOR: JOHN QUIGG SEMESTER: FALL 204 Contents. Sets 2 2. Functions 5 3. Countability 7 4. Axiom of choice 8 5. Equivalence relations 9 6. Real numbers 9 7. Extended

More information

An Introduction to Stochastic Processes in Continuous Time

An Introduction to Stochastic Processes in Continuous Time An Introduction to Stochastic Processes in Continuous Time Flora Spieksma adaptation of the text by Harry van Zanten to be used at your own expense May 22, 212 Contents 1 Stochastic Processes 1 1.1 Introduction......................................

More information

PROBABILITY: LIMIT THEOREMS II, SPRING HOMEWORK PROBLEMS

PROBABILITY: LIMIT THEOREMS II, SPRING HOMEWORK PROBLEMS PROBABILITY: LIMIT THEOREMS II, SPRING 15. HOMEWORK PROBLEMS PROF. YURI BAKHTIN Instructions. You are allowed to work on solutions in groups, but you are required to write up solutions on your own. Please

More information

A D VA N C E D P R O B A B I L - I T Y

A D VA N C E D P R O B A B I L - I T Y A N D R E W T U L L O C H A D VA N C E D P R O B A B I L - I T Y T R I N I T Y C O L L E G E T H E U N I V E R S I T Y O F C A M B R I D G E Contents 1 Conditional Expectation 5 1.1 Discrete Case 6 1.2

More information

STOCHASTIC ANALYSIS FOR JUMP PROCESSES

STOCHASTIC ANALYSIS FOR JUMP PROCESSES STOCHASTIC ANALYSIS FOR JUMP PROCESSES ANTONIS PAPAPANTOLEON Abstract. Lecture notes from courses at TU Berlin in WS 29/1, WS 211/12 and WS 212/13. Contents 1. Introduction 2 2. Definition of Lévy processes

More information

Probability Theory II. Spring 2016 Peter Orbanz

Probability Theory II. Spring 2016 Peter Orbanz Probability Theory II Spring 2016 Peter Orbanz Contents Chapter 1. Martingales 1 1.1. Martingales indexed by partially ordered sets 1 1.2. Martingales from adapted processes 4 1.3. Stopping times and

More information

9 Brownian Motion: Construction

9 Brownian Motion: Construction 9 Brownian Motion: Construction 9.1 Definition and Heuristics The central limit theorem states that the standard Gaussian distribution arises as the weak limit of the rescaled partial sums S n / p n of

More information

4th Preparation Sheet - Solutions

4th Preparation Sheet - Solutions Prof. Dr. Rainer Dahlhaus Probability Theory Summer term 017 4th Preparation Sheet - Solutions Remark: Throughout the exercise sheet we use the two equivalent definitions of separability of a metric space

More information

Weak convergence and Brownian Motion. (telegram style notes) P.J.C. Spreij

Weak convergence and Brownian Motion. (telegram style notes) P.J.C. Spreij Weak convergence and Brownian Motion (telegram style notes) P.J.C. Spreij this version: December 8, 2006 1 The space C[0, ) In this section we summarize some facts concerning the space C[0, ) of real

More information

Martingale Problems. Abhay G. Bhatt Theoretical Statistics and Mathematics Unit Indian Statistical Institute, Delhi

Martingale Problems. Abhay G. Bhatt Theoretical Statistics and Mathematics Unit Indian Statistical Institute, Delhi s Abhay G. Bhatt Theoretical Statistics and Mathematics Unit Indian Statistical Institute, Delhi Lectures on Probability and Stochastic Processes III Indian Statistical Institute, Kolkata 20 24 November

More information

{σ x >t}p x. (σ x >t)=e at.

{σ x >t}p x. (σ x >t)=e at. 3.11. EXERCISES 121 3.11 Exercises Exercise 3.1 Consider the Ornstein Uhlenbeck process in example 3.1.7(B). Show that the defined process is a Markov process which converges in distribution to an N(0,σ

More information

Notes on uniform convergence

Notes on uniform convergence Notes on uniform convergence Erik Wahlén erik.wahlen@math.lu.se January 17, 2012 1 Numerical sequences We begin by recalling some properties of numerical sequences. By a numerical sequence we simply mean

More information

p 1 ( Y p dp) 1/p ( X p dp) 1 1 p

p 1 ( Y p dp) 1/p ( X p dp) 1 1 p Doob s inequality Let X(t) be a right continuous submartingale with respect to F(t), t 1 P(sup s t X(s) λ) 1 λ {sup s t X(s) λ} X + (t)dp 2 For 1 < p

More information

P (A G) dp G P (A G)

P (A G) dp G P (A G) First homework assignment. Due at 12:15 on 22 September 2016. Homework 1. We roll two dices. X is the result of one of them and Z the sum of the results. Find E [X Z. Homework 2. Let X be a r.v.. Assume

More information

THE SKOROKHOD OBLIQUE REFLECTION PROBLEM IN A CONVEX POLYHEDRON

THE SKOROKHOD OBLIQUE REFLECTION PROBLEM IN A CONVEX POLYHEDRON GEORGIAN MATHEMATICAL JOURNAL: Vol. 3, No. 2, 1996, 153-176 THE SKOROKHOD OBLIQUE REFLECTION PROBLEM IN A CONVEX POLYHEDRON M. SHASHIASHVILI Abstract. The Skorokhod oblique reflection problem is studied

More information

I forgot to mention last time: in the Ito formula for two standard processes, putting

I forgot to mention last time: in the Ito formula for two standard processes, putting I forgot to mention last time: in the Ito formula for two standard processes, putting dx t = a t dt + b t db t dy t = α t dt + β t db t, and taking f(x, y = xy, one has f x = y, f y = x, and f xx = f yy

More information

4 Sums of Independent Random Variables

4 Sums of Independent Random Variables 4 Sums of Independent Random Variables Standing Assumptions: Assume throughout this section that (,F,P) is a fixed probability space and that X 1, X 2, X 3,... are independent real-valued random variables

More information

LECTURE 2: LOCAL TIME FOR BROWNIAN MOTION

LECTURE 2: LOCAL TIME FOR BROWNIAN MOTION LECTURE 2: LOCAL TIME FOR BROWNIAN MOTION We will define local time for one-dimensional Brownian motion, and deduce some of its properties. We will then use the generalized Ray-Knight theorem proved in

More information

Solution for Problem 7.1. We argue by contradiction. If the limit were not infinite, then since τ M (ω) is nondecreasing we would have

Solution for Problem 7.1. We argue by contradiction. If the limit were not infinite, then since τ M (ω) is nondecreasing we would have 362 Problem Hints and Solutions sup g n (ω, t) g(ω, t) sup g(ω, s) g(ω, t) µ n (ω). t T s,t: s t 1/n By the uniform continuity of t g(ω, t) on [, T], one has for each ω that µ n (ω) as n. Two applications

More information

Lecture 19 L 2 -Stochastic integration

Lecture 19 L 2 -Stochastic integration Lecture 19: L 2 -Stochastic integration 1 of 12 Course: Theory of Probability II Term: Spring 215 Instructor: Gordan Zitkovic Lecture 19 L 2 -Stochastic integration The stochastic integral for processes

More information

Lecture 22 Girsanov s Theorem

Lecture 22 Girsanov s Theorem Lecture 22: Girsanov s Theorem of 8 Course: Theory of Probability II Term: Spring 25 Instructor: Gordan Zitkovic Lecture 22 Girsanov s Theorem An example Consider a finite Gaussian random walk X n = n

More information

1 Brownian Local Time

1 Brownian Local Time 1 Brownian Local Time We first begin by defining the space and variables for Brownian local time. Let W t be a standard 1-D Wiener process. We know that for the set, {t : W t = } P (µ{t : W t = } = ) =

More information

On the submartingale / supermartingale property of diffusions in natural scale

On the submartingale / supermartingale property of diffusions in natural scale On the submartingale / supermartingale property of diffusions in natural scale Alexander Gushchin Mikhail Urusov Mihail Zervos November 13, 214 Abstract Kotani 5 has characterised the martingale property

More information

Lecture Characterization of Infinitely Divisible Distributions

Lecture Characterization of Infinitely Divisible Distributions Lecture 10 1 Characterization of Infinitely Divisible Distributions We have shown that a distribution µ is infinitely divisible if and only if it is the weak limit of S n := X n,1 + + X n,n for a uniformly

More information

ERRATA: Probabilistic Techniques in Analysis

ERRATA: Probabilistic Techniques in Analysis ERRATA: Probabilistic Techniques in Analysis ERRATA 1 Updated April 25, 26 Page 3, line 13. A 1,..., A n are independent if P(A i1 A ij ) = P(A 1 ) P(A ij ) for every subset {i 1,..., i j } of {1,...,

More information

Hardy-Stein identity and Square functions

Hardy-Stein identity and Square functions Hardy-Stein identity and Square functions Daesung Kim (joint work with Rodrigo Bañuelos) Department of Mathematics Purdue University March 28, 217 Daesung Kim (Purdue) Hardy-Stein identity UIUC 217 1 /

More information

2 (Bonus). Let A X consist of points (x, y) such that either x or y is a rational number. Is A measurable? What is its Lebesgue measure?

2 (Bonus). Let A X consist of points (x, y) such that either x or y is a rational number. Is A measurable? What is its Lebesgue measure? MA 645-4A (Real Analysis), Dr. Chernov Homework assignment 1 (Due 9/5). Prove that every countable set A is measurable and µ(a) = 0. 2 (Bonus). Let A consist of points (x, y) such that either x or y is

More information

Lecture 21 Representations of Martingales

Lecture 21 Representations of Martingales Lecture 21: Representations of Martingales 1 of 11 Course: Theory of Probability II Term: Spring 215 Instructor: Gordan Zitkovic Lecture 21 Representations of Martingales Right-continuous inverses Let

More information

Theorem 2.1 (Caratheodory). A (countably additive) probability measure on a field has an extension. n=1

Theorem 2.1 (Caratheodory). A (countably additive) probability measure on a field has an extension. n=1 Chapter 2 Probability measures 1. Existence Theorem 2.1 (Caratheodory). A (countably additive) probability measure on a field has an extension to the generated σ-field Proof of Theorem 2.1. Let F 0 be

More information

Stochastic Differential Equations.

Stochastic Differential Equations. Chapter 3 Stochastic Differential Equations. 3.1 Existence and Uniqueness. One of the ways of constructing a Diffusion process is to solve the stochastic differential equation dx(t) = σ(t, x(t)) dβ(t)

More information

DISTRIBUTION OF THE SUPREMUM LOCATION OF STATIONARY PROCESSES. 1. Introduction

DISTRIBUTION OF THE SUPREMUM LOCATION OF STATIONARY PROCESSES. 1. Introduction DISTRIBUTION OF THE SUPREMUM LOCATION OF STATIONARY PROCESSES GENNADY SAMORODNITSKY AND YI SHEN Abstract. The location of the unique supremum of a stationary process on an interval does not need to be

More information

SUMMARY OF RESULTS ON PATH SPACES AND CONVERGENCE IN DISTRIBUTION FOR STOCHASTIC PROCESSES

SUMMARY OF RESULTS ON PATH SPACES AND CONVERGENCE IN DISTRIBUTION FOR STOCHASTIC PROCESSES SUMMARY OF RESULTS ON PATH SPACES AND CONVERGENCE IN DISTRIBUTION FOR STOCHASTIC PROCESSES RUTH J. WILLIAMS October 2, 2017 Department of Mathematics, University of California, San Diego, 9500 Gilman Drive,

More information

The strictly 1/2-stable example

The strictly 1/2-stable example The strictly 1/2-stable example 1 Direct approach: building a Lévy pure jump process on R Bert Fristedt provided key mathematical facts for this example. A pure jump Lévy process X is a Lévy process such

More information

Math 6810 (Probability) Fall Lecture notes

Math 6810 (Probability) Fall Lecture notes Math 6810 (Probability) Fall 2012 Lecture notes Pieter Allaart University of North Texas September 23, 2012 2 Text: Introduction to Stochastic Calculus with Applications, by Fima C. Klebaner (3rd edition),

More information

Dynkin (λ-) and π-systems; monotone classes of sets, and of functions with some examples of application (mainly of a probabilistic flavor)

Dynkin (λ-) and π-systems; monotone classes of sets, and of functions with some examples of application (mainly of a probabilistic flavor) Dynkin (λ-) and π-systems; monotone classes of sets, and of functions with some examples of application (mainly of a probabilistic flavor) Matija Vidmar February 7, 2018 1 Dynkin and π-systems Some basic

More information

MATH 6605: SUMMARY LECTURE NOTES

MATH 6605: SUMMARY LECTURE NOTES MATH 6605: SUMMARY LECTURE NOTES These notes summarize the lectures on weak convergence of stochastic processes. If you see any typos, please let me know. 1. Construction of Stochastic rocesses A stochastic

More information

Stable Lévy motion with values in the Skorokhod space: construction and approximation

Stable Lévy motion with values in the Skorokhod space: construction and approximation Stable Lévy motion with values in the Skorokhod space: construction and approximation arxiv:1809.02103v1 [math.pr] 6 Sep 2018 Raluca M. Balan Becem Saidani September 5, 2018 Abstract In this article, we

More information

Some SDEs with distributional drift Part I : General calculus. Flandoli, Franco; Russo, Francesco; Wolf, Jochen

Some SDEs with distributional drift Part I : General calculus. Flandoli, Franco; Russo, Francesco; Wolf, Jochen Title Author(s) Some SDEs with distributional drift Part I : General calculus Flandoli, Franco; Russo, Francesco; Wolf, Jochen Citation Osaka Journal of Mathematics. 4() P.493-P.54 Issue Date 3-6 Text

More information

A Concise Course on Stochastic Partial Differential Equations

A Concise Course on Stochastic Partial Differential Equations A Concise Course on Stochastic Partial Differential Equations Michael Röckner Reference: C. Prevot, M. Röckner: Springer LN in Math. 1905, Berlin (2007) And see the references therein for the original

More information

Real Analysis Math 131AH Rudin, Chapter #1. Dominique Abdi

Real Analysis Math 131AH Rudin, Chapter #1. Dominique Abdi Real Analysis Math 3AH Rudin, Chapter # Dominique Abdi.. If r is rational (r 0) and x is irrational, prove that r + x and rx are irrational. Solution. Assume the contrary, that r+x and rx are rational.

More information

On the martingales obtained by an extension due to Saisho, Tanemura and Yor of Pitman s theorem

On the martingales obtained by an extension due to Saisho, Tanemura and Yor of Pitman s theorem On the martingales obtained by an extension due to Saisho, Tanemura and Yor of Pitman s theorem Koichiro TAKAOKA Dept of Applied Physics, Tokyo Institute of Technology Abstract M Yor constructed a family

More information

Convergence of Markov Processes. Amanda Turner University of Cambridge

Convergence of Markov Processes. Amanda Turner University of Cambridge Convergence of Markov Processes Amanda Turner University of Cambridge 1 Contents 1 Introduction 2 2 The Space D E [, 3 2.1 The Skorohod Topology................................ 3 3 Convergence of Probability

More information

GAUSSIAN PROCESSES; KOLMOGOROV-CHENTSOV THEOREM

GAUSSIAN PROCESSES; KOLMOGOROV-CHENTSOV THEOREM GAUSSIAN PROCESSES; KOLMOGOROV-CHENTSOV THEOREM STEVEN P. LALLEY 1. GAUSSIAN PROCESSES: DEFINITIONS AND EXAMPLES Definition 1.1. A standard (one-dimensional) Wiener process (also called Brownian motion)

More information

Lecture 2. We now introduce some fundamental tools in martingale theory, which are useful in controlling the fluctuation of martingales.

Lecture 2. We now introduce some fundamental tools in martingale theory, which are useful in controlling the fluctuation of martingales. Lecture 2 1 Martingales We now introduce some fundamental tools in martingale theory, which are useful in controlling the fluctuation of martingales. 1.1 Doob s inequality We have the following maximal

More information

Stochastic Analysis. Prof. Dr. Andreas Eberle

Stochastic Analysis. Prof. Dr. Andreas Eberle Stochastic Analysis Prof. Dr. Andreas Eberle March 13, 212 Contents Contents 2 1 Lévy processes and Poisson point processes 6 1.1 Lévy processes.............................. 7 Characteristic exponents.........................

More information

6. Brownian Motion. Q(A) = P [ ω : x(, ω) A )

6. Brownian Motion. Q(A) = P [ ω : x(, ω) A ) 6. Brownian Motion. stochastic process can be thought of in one of many equivalent ways. We can begin with an underlying probability space (Ω, Σ, P) and a real valued stochastic process can be defined

More information

The Pedestrian s Guide to Local Time

The Pedestrian s Guide to Local Time The Pedestrian s Guide to Local Time Tomas Björk, Department of Finance, Stockholm School of Economics, Box 651, SE-113 83 Stockholm, SWEDEN tomas.bjork@hhs.se November 19, 213 Preliminary version Comments

More information

Homework #6 : final examination Due on March 22nd : individual work

Homework #6 : final examination Due on March 22nd : individual work Université de ennes Année 28-29 Master 2ème Mathématiques Modèles stochastiques continus ou à sauts Homework #6 : final examination Due on March 22nd : individual work Exercise Warm-up : behaviour of characteristic

More information

EULER MARUYAMA APPROXIMATION FOR SDES WITH JUMPS AND NON-LIPSCHITZ COEFFICIENTS

EULER MARUYAMA APPROXIMATION FOR SDES WITH JUMPS AND NON-LIPSCHITZ COEFFICIENTS Qiao, H. Osaka J. Math. 51 (14), 47 66 EULER MARUYAMA APPROXIMATION FOR SDES WITH JUMPS AND NON-LIPSCHITZ COEFFICIENTS HUIJIE QIAO (Received May 6, 11, revised May 1, 1) Abstract In this paper we show

More information

16.1. Signal Process Observation Process The Filtering Problem Change of Measure

16.1. Signal Process Observation Process The Filtering Problem Change of Measure 1. Introduction The following notes aim to provide a very informal introduction to Stochastic Calculus, and especially to the Itô integral. They owe a great deal to Dan Crisan s Stochastic Calculus and

More information

Reflected Brownian Motion

Reflected Brownian Motion Chapter 6 Reflected Brownian Motion Often we encounter Diffusions in regions with boundary. If the process can reach the boundary from the interior in finite time with positive probability we need to decide

More information

Set-Indexed Processes with Independent Increments

Set-Indexed Processes with Independent Increments Set-Indexed Processes with Independent Increments R.M. Balan May 13, 2002 Abstract Set-indexed process with independent increments are described by convolution systems ; the construction of such a process

More information

The main results about probability measures are the following two facts:

The main results about probability measures are the following two facts: Chapter 2 Probability measures The main results about probability measures are the following two facts: Theorem 2.1 (extension). If P is a (continuous) probability measure on a field F 0 then it has a

More information

On a class of stochastic differential equations in a financial network model

On a class of stochastic differential equations in a financial network model 1 On a class of stochastic differential equations in a financial network model Tomoyuki Ichiba Department of Statistics & Applied Probability, Center for Financial Mathematics and Actuarial Research, University

More information

SMSTC (2007/08) Probability.

SMSTC (2007/08) Probability. SMSTC (27/8) Probability www.smstc.ac.uk Contents 12 Markov chains in continuous time 12 1 12.1 Markov property and the Kolmogorov equations.................... 12 2 12.1.1 Finite state space.................................

More information

STAT 7032 Probability Spring Wlodek Bryc

STAT 7032 Probability Spring Wlodek Bryc STAT 7032 Probability Spring 2018 Wlodek Bryc Created: Friday, Jan 2, 2014 Revised for Spring 2018 Printed: January 9, 2018 File: Grad-Prob-2018.TEX Department of Mathematical Sciences, University of Cincinnati,

More information

Iowa State University. Instructor: Alex Roitershtein Summer Homework #5. Solutions

Iowa State University. Instructor: Alex Roitershtein Summer Homework #5. Solutions Math 50 Iowa State University Introduction to Real Analysis Department of Mathematics Instructor: Alex Roitershtein Summer 205 Homework #5 Solutions. Let α and c be real numbers, c > 0, and f is defined

More information

Exercises. T 2T. e ita φ(t)dt.

Exercises. T 2T. e ita φ(t)dt. Exercises. Set #. Construct an example of a sequence of probability measures P n on R which converge weakly to a probability measure P but so that the first moments m,n = xdp n do not converge to m = xdp.

More information

MATHS 730 FC Lecture Notes March 5, Introduction

MATHS 730 FC Lecture Notes March 5, Introduction 1 INTRODUCTION MATHS 730 FC Lecture Notes March 5, 2014 1 Introduction Definition. If A, B are sets and there exists a bijection A B, they have the same cardinality, which we write as A, #A. If there exists

More information