Jump Processes. Richard F. Bass

© Copyright 2014 Richard F. Bass

Contents

1 Poisson processes 1
1.1 Definitions 1
1.2 Stopping times 3
1.3 Markov properties 4
1.4 A characterization 7
1.5 Martingales 8

2 Lévy processes 17
2.1 Examples 17
2.2 Construction of Lévy processes 20
2.3 Representation of Lévy processes 24
2.4 Symmetric stable processes 31

3 Stochastic calculus 35
3.1 Decomposition of martingales 35
3.2 Stochastic integrals 41
3.3 Itô's formula 43
3.4 The reduction theorem 48
3.5 Semimartingales 52
3.6 The Girsanov theorem 55

4 Stochastic differential equations 57
4.1 Poisson point processes 57
4.2 The Lipschitz case 59
4.3 Analogue of Yamada-Watanabe theorem 61

5 The space D[0, 1] 67
5.1 Convergence of probability measures 67
5.2 The portmanteau theorem 67
5.3 The Prohorov theorem 70
5.4 Metrics for D[0, 1] 72
5.5 Compactness and completeness 76
5.6 The Aldous criterion 80

6 Markov processes 85
6.1 Introduction 85
6.2 Definition of a Markov process 86
6.3 Transition probabilities 88
6.4 The canonical process and shift operators 90
6.5 Enlarging the filtration 91
6.6 The Markov property 92
6.7 Strong Markov property 93

7 Stable-like processes 95
7.1 Martingale problems 95
7.2 Stable-like processes 97
7.3 Some properties 97
7.4 Harnack inequality 101
7.5 Regularity 108

8 Symmetric jump processes 113
8.1 Dirichlet forms 113
8.2 Construction of the semigroup 115
8.3 Symmetric jump processes 120
8.4 The Poincaré and Nash inequalities 121
8.5 Upper bounds on the transition densities 123


Chapter 1

Poisson processes

1.1 Definitions

Let (Ω, F, P) be a probability space. A filtration is a collection of σ-fields F_t contained in F such that F_s ⊂ F_t whenever s < t. A filtration satisfies the usual conditions if it is complete (N ∈ F_t for all t whenever P(N) = 0) and right continuous (F_{t+} = F_t for all t, where F_{t+} = ∩_{ε>0} F_{t+ε}).

Definition 1.1 Let {F_t} be a filtration, not necessarily satisfying the usual conditions. A Poisson process with parameter λ > 0 is a stochastic process X satisfying the following properties:
(1) X_0 = 0, a.s.
(2) The paths of X_t are right continuous with left limits.
(3) If s < t, then X_t − X_s is a Poisson random variable with parameter λ(t − s).
(4) If s < t, then X_t − X_s is independent of F_s.

Define X_{t−} = lim_{s↑t} X_s, the left hand limit at time t, and ΔX_t = X_t − X_{t−}, the size of the jump at time t. We say a function f is increasing if s < t implies f(s) ≤ f(t). We use strictly increasing when s < t implies f(s) < f(t). We have the following proposition.

Proposition 1.2 Let X be a Poisson process. With probability one, the paths of X_t are increasing and are constant except for jumps of size 1. There are only finitely many jumps in each finite time interval.

Proof. For any fixed s < t, we have that X_t − X_s has the distribution of a Poisson random variable with parameter λ(t − s), hence is non-negative, a.s.; let N_{s,t} be the null set of ω's where X_t(ω) < X_s(ω). The set of pairs (s, t) with s and t rational is countable, and so N = ∪_{s,t∈Q⁺} N_{s,t} is also a null set, where we write Q⁺ for the non-negative rationals. For ω ∉ N, X_s ≤ X_t whenever s < t are rational. In view of the right continuity of the paths of X, this shows the paths of X are increasing with probability one.

Similarly, since Poisson random variables only take values in the non-negative integers, X_t is a non-negative integer, a.s. Using this fact for every t rational shows that with probability one, X_t takes values only in the non-negative integers when t is rational, and the right continuity of the paths implies this is also the case for all t. Since the paths have left limits, there can only be finitely many jumps in finite time.

It remains to prove that ΔX_t is either 0 or 1 for all t. Let t_0 > 0. If there were a jump of size 2 or larger at some time t strictly less than t_0, then for each n sufficiently large there exists k_n ≤ 2^n such that X_{(k_n+1)t_0/2^n} − X_{k_n t_0/2^n} ≥ 2. Therefore

    P(∃ s < t_0 : ΔX_s ≥ 2) ≤ P(∃ k ≤ 2^n : X_{(k+1)t_0/2^n} − X_{k t_0/2^n} ≥ 2)        (1.1)
        ≤ 2^n sup_{k≤2^n} P(X_{(k+1)t_0/2^n} − X_{k t_0/2^n} ≥ 2)
        = 2^n P(X_{t_0/2^n} ≥ 2)
        = 2^n (1 − P(X_{t_0/2^n} = 0) − P(X_{t_0/2^n} = 1))
        = 2^n (1 − e^{−λt_0/2^n} − (λt_0/2^n) e^{−λt_0/2^n}).

We used Definition 1.1(3) for the two equalities. By l'Hôpital's rule, (1 − e^{−x} − x e^{−x})/x → 0 as x → 0. We apply this with x = λt_0/2^n, and see that the last line of (1.1) tends to 0 as n → ∞. Since the left hand side of (1.1) does not depend on n, it must be 0. This holds for each t_0.

1.2 Stopping times

Throughout this section we suppose we have a filtration {F_t} satisfying the usual conditions.

Definition 1.3 A random variable T : Ω → [0, ∞] is a stopping time if (T < t) ∈ F_t for all t. We say T is a finite stopping time if T < ∞, a.s. We say T is a bounded stopping time if there exists K ∈ [0, ∞) such that T ≤ K, a.s.

Note that T can take the value infinity. Stopping times are also known as optional times.

Given a stochastic process X, we define X_T(ω) to be equal to X(T(ω), ω); that is, for each ω we evaluate t = T(ω) and then look at X(·, ω) at this time.

Proposition 1.4 Suppose {F_t} satisfies the usual conditions. Then
(1) T is a stopping time if and only if (T ≤ t) ∈ F_t for all t.
(2) If T = t, a.s., then T is a stopping time.
(3) If S and T are stopping times, then so are S ∨ T and S ∧ T.
(4) If T_n, n = 1, 2, …, are stopping times with T_1 ≤ T_2 ≤ ⋯, then so is sup_n T_n.
(5) If T_n, n = 1, 2, …, are stopping times with T_1 ≥ T_2 ≥ ⋯, then so is inf_n T_n.
(6) If s ≥ 0 and S is a stopping time, then so is S + s.

Proof. We will just prove part of (1), leaving the rest as an exercise. Note (T ≤ t) = ∩_{n≥N} (T < t + 1/n) ∈ F_{t+1/N} for each N. Thus (T ≤ t) ∈ ∩_N F_{t+1/N} = F_{t+} = F_t.

It is often useful to be able to approximate stopping times from the right. If T is a finite stopping time, that is, T < ∞, a.s., define

    T_n(ω) = (k + 1)/2^n    if k/2^n ≤ T(ω) < (k + 1)/2^n.        (1.2)
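The approximation (1.2) is concrete enough to compute. A minimal sketch in Python (the function name is ours), illustrating that T_n takes dyadic values strictly above T and decreases to T:

```python
import math

def dyadic_approx(T, n):
    """Return T_n from (1.2): the right endpoint (k+1)/2^n of the
    dyadic interval [k/2^n, (k+1)/2^n) that contains T."""
    k = math.floor(T * 2**n)
    return (k + 1) / 2**n

T = 0.7
approxs = [dyadic_approx(T, n) for n in range(1, 12)]
# Each T_n is strictly larger than T, and the sequence decreases to T.
assert all(a > T for a in approxs)
assert all(x >= y for x, y in zip(approxs, approxs[1:]))
assert approxs[-1] - T < 2 ** -10
```

Approximating from the right is what makes T_n a stopping time: the event (T_n ≤ t) depends only on where T fell among dyadic intervals to the left of t.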

Define

    F_T = {A ∈ F : for each t > 0, A ∩ (T ≤ t) ∈ F_t}.        (1.3)

This definition of F_T, which is supposed to be the collection of events that are known by time T, is not very intuitive. But it turns out that this definition works well in applications.

Proposition 1.5 Suppose {F_t} is a filtration satisfying the usual conditions.
(1) F_T is a σ-field.
(2) If S ≤ T, then F_S ⊂ F_T.
(3) If F_{T+} = ∩_{ε>0} F_{T+ε}, then F_{T+} = F_T.
(4) If X_t has right continuous paths, then X_T is F_T-measurable.

Proof. If A ∈ F_T, then A^c ∩ (T ≤ t) = (T ≤ t) \ [A ∩ (T ≤ t)] ∈ F_t, so A^c ∈ F_T. The rest of the proof of (1) is easy.

Suppose A ∈ F_S and S ≤ T. Then A ∩ (T ≤ t) = [A ∩ (S ≤ t)] ∩ (T ≤ t). We have A ∩ (S ≤ t) ∈ F_t because A ∈ F_S, while (T ≤ t) ∈ F_t because T is a stopping time. Therefore A ∩ (T ≤ t) ∈ F_t, which proves (2).

For (3), if A ∈ F_{T+}, then A ∈ F_{T+ε} for every ε, and so A ∩ (T + ε ≤ t) ∈ F_t for all t. Hence A ∩ (T ≤ t − ε) ∈ F_t for all t, or equivalently A ∩ (T ≤ t) ∈ F_{t+ε} for all t. This is true for all ε, so A ∩ (T ≤ t) ∈ F_{t+} = F_t. This says A ∈ F_T.

(4) Define T_n by (1.2). Note

    (X_{T_n} ∈ B) ∩ (T_n = k/2^n) = (X_{k/2^n} ∈ B) ∩ (T_n = k/2^n) ∈ F_{k/2^n}.

Since T_n only takes values in {k/2^n : k ≥ 1}, we conclude (X_{T_n} ∈ B) ∩ (T_n ≤ t) ∈ F_t, and so (X_{T_n} ∈ B) ∈ F_{T_n} ⊂ F_{T+1/2^n}. Hence X_{T_n} is F_{T+1/2^n} measurable. If n ≥ m, then X_{T_n} is measurable with respect to F_{T+1/2^n} ⊂ F_{T+1/2^m}. Since X_{T_n} → X_T, then X_T is F_{T+1/2^m} measurable for each m. Therefore X_T is measurable with respect to F_{T+} = F_T.

1.3 Markov properties

Let us begin with the Markov property.

Theorem 1.6 Let {F_t} be a filtration, not necessarily satisfying the usual conditions, and let P be a Poisson process with respect to {F_t}. If u is a fixed time, then Y_t = P_{t+u} − P_u is a Poisson process independent of F_u.

Proof. Let G_t = F_{t+u}. It is clear that Y has right continuous paths, is zero at time 0, has jumps of size one, and is adapted to {G_t}. Since Y_t − Y_s = P_{t+u} − P_{s+u}, then Y_t − Y_s is a Poisson random variable with mean λ(t − s) that is independent of F_{s+u} = G_s.

The strong Markov property is the Markov property extended by replacing fixed times u by finite stopping times.

Theorem 1.7 Let {F_t} be a filtration, not necessarily satisfying the usual conditions, and let P be a Poisson process adapted to {F_t}. If T is a finite stopping time, then Y_t = P_{T+t} − P_T is a Poisson process independent of F_T.

Proof. We will first show that whenever m ≥ 1, t_1 < ⋯ < t_m, f is a bounded continuous function on R^m, and A ∈ F_T, then

    E[f(Y_{t_1}, …, Y_{t_m}); A] = E[f(P_{t_1}, …, P_{t_m})] P(A).        (1.4)

Once we have done this, we will then show how (1.4) implies our theorem.

To prove (1.4), define T_n by (1.2). We have

    E[f(P_{T_n+t_1} − P_{T_n}, …, P_{T_n+t_m} − P_{T_n}); A]        (1.5)
        = Σ_{k=1}^∞ E[f(P_{T_n+t_1} − P_{T_n}, …, P_{T_n+t_m} − P_{T_n}); A, T_n = k/2^n]
        = Σ_{k=1}^∞ E[f(P_{t_1+k/2^n} − P_{k/2^n}, …, P_{t_m+k/2^n} − P_{k/2^n}); A, T_n = k/2^n].

Following the usual practice in probability that "," means "and," we use E[ · ; A, T_n = k/2^n] as an abbreviation for E[ · ; A ∩ (T_n = k/2^n)].

Since A ∈ F_T, then A ∩ (T_n = k/2^n) = A ∩ ((T < k/2^n) \ (T < (k − 1)/2^n)) ∈ F_{k/2^n}. We use the independent increments property of the Poisson process and

the fact that P_t − P_s has the same law as P_{t−s} to see that the sum in the last line of (1.5) is equal to

    Σ_{k=1}^∞ E[f(P_{t_1+k/2^n} − P_{k/2^n}, …, P_{t_m+k/2^n} − P_{k/2^n})] P(A, T_n = k/2^n)
        = Σ_{k=1}^∞ E[f(P_{t_1}, …, P_{t_m})] P(A, T_n = k/2^n)
        = E[f(P_{t_1}, …, P_{t_m})] P(A),

which is the right hand side of (1.4). Thus

    E[f(P_{T_n+t_1} − P_{T_n}, …, P_{T_n+t_m} − P_{T_n}); A] = E[f(P_{t_1}, …, P_{t_m})] P(A).        (1.6)

Now let n → ∞. By the right continuity of the paths of P, the boundedness and continuity of f, and the dominated convergence theorem, the left hand side of (1.6) converges to the left hand side of (1.4).

If we take A = Ω in (1.4), we obtain E[f(Y_{t_1}, …, Y_{t_m})] = E[f(P_{t_1}, …, P_{t_m})] whenever m ≥ 1, t_1, …, t_m ∈ [0, ∞), and f is a bounded continuous function on R^m. This implies that the finite dimensional distributions of Y and P are the same. Since Y has right continuous paths, Y is a Poisson process.

Next take A ∈ F_T. By using a limit argument, (1.4) holds whenever f is the indicator of a Borel subset of R^m, and therefore

    P(Y ∈ B, A) = P(Y ∈ B) P(A)        (1.7)

whenever B is a cylindrical set. When we discuss the Skorokhod topology, we will be able to be more precise about the independence argument.

Observe that what was needed for the above proof to work is not that P be a Poisson process, but that the process P have right continuous paths and that P_t − P_s be independent of F_s and have the same distribution as P_{t−s}. We therefore have the following corollary.

Corollary 1.8 Let {F_t} be a filtration, not necessarily satisfying the usual conditions, and let X be a process adapted to {F_t}. Suppose X has paths that are right continuous with left limits, and suppose X_t − X_s is independent of F_s and has the same law as X_{t−s} whenever s < t. If T is a finite stopping time, then Y_t = X_{T+t} − X_T is a process that is independent of F_T, and X and Y have the same law.

1.4 A characterization

Another characterization of the Poisson process is as follows. Let T_1 = inf{t : ΔX_t = 1}, the time of the first jump. Define T_{i+1} = inf{t > T_i : ΔX_t = 1}, so that T_i is the time of the i-th jump.

Proposition 1.9 The random variables T_1, T_2 − T_1, …, T_{i+1} − T_i, … are independent exponential random variables with parameter λ.

Proof. In view of Corollary 1.8 it suffices to show that T_1 is an exponential random variable with parameter λ. If T_1 > t, then the first jump has not occurred by time t, so X_t is still zero. Hence

    P(T_1 > t) = P(X_t = 0) = e^{−λt},

using the fact that X_t is a Poisson random variable with parameter λt.

We can reverse the characterization in Proposition 1.9 to construct a Poisson process. We do one step of the construction, leaving the rest as an exercise. Let U_1, U_2, … be independent exponential random variables with parameter λ and let T_j = Σ_{i=1}^j U_i, with T_0 = 0. Define

    X_t(ω) = k    if T_k(ω) ≤ t < T_{k+1}(ω).        (1.8)

An examination of the densities shows that an exponential random variable has a gamma distribution with parameters λ and r = 1, so T_j is a gamma random variable with parameters λ and j. Thus

    P(X_t < k) = P(T_k > t) = ∫_t^∞ λ e^{−λx} (λx)^{k−1} / Γ(k) dx.
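The construction (1.8) translates directly into a simulation. A minimal sketch (names ours), checking by Monte Carlo that counting exponential interarrival times up to time t produces the Poisson mean and variance λt:

```python
import random

def poisson_path_value(lam, t, rng):
    """Simulate X_t via (1.8): sum i.i.d. exponential interarrival
    times U_i and count how many partial sums T_j land in [0, t]."""
    count, total = 0, 0.0
    while True:
        total += rng.expovariate(lam)  # next interarrival U_i
        if total > t:
            return count
        count += 1

rng = random.Random(0)
lam, t, n = 2.0, 3.0, 20000
samples = [poisson_path_value(lam, t, rng) for _ in range(n)]
mean = sum(samples) / n
var = sum((x - mean) ** 2 for x in samples) / n
# A Poisson(lam * t) variable has mean and variance both equal to lam * t = 6.
assert abs(mean - lam * t) < 0.1
assert abs(var - lam * t) < 0.3
```

The tolerances are loose Monte Carlo bounds for this sample size, not exact statements.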

Performing the integration by parts repeatedly shows that

    P(X_t < k) = Σ_{i=0}^{k−1} e^{−λt} (λt)^i / i!,

and so X_t is a Poisson random variable with parameter λt.

We will use the following proposition later.

Proposition 1.10 Let {F_t} be a filtration satisfying the usual conditions. Suppose X_0 = 0, a.s., X has paths that are right continuous with left limits, X_t − X_s is independent of F_s if s < t, and X_t − X_s has the same law as X_{t−s} whenever s < t. If the paths of X are piecewise constant and increasing, all the jumps of X are of size 1, and X is not identically 0, then X is a Poisson process.

Proof. Let T_0 = 0 and T_{i+1} = inf{t > T_i : ΔX_t = 1}, i = 0, 1, 2, …. We will show that if we set U_i = T_i − T_{i−1}, then the U_i's are i.i.d. exponential random variables. By Corollary 1.8, the U_i's are independent and have the same law. Hence it suffices to show U_1 is an exponential random variable.

We observe

    P(U_1 > s + t) = P(X_{s+t} = 0) = P(X_{s+t} − X_s = 0, X_s = 0)
        = P(X_{s+t} − X_s = 0) P(X_s = 0) = P(X_t = 0) P(X_s = 0)
        = P(U_1 > t) P(U_1 > s).

Setting f(t) = P(U_1 > t), we thus have f(t + s) = f(t)f(s). Since f(t) is decreasing and 0 < f(t) < 1, we conclude P(U_1 > t) = f(t) = e^{−λt} for some λ > 0, so U_1 is an exponential random variable.

1.5 Martingales

We define continuous time martingales. Let {F_t} be a filtration, not necessarily satisfying the usual conditions.

Definition 1.11 M_t is a continuous time martingale with respect to the filtration {F_t} and the probability measure P if
(1) E|M_t| < ∞ for each t;
(2) M_t is F_t measurable for each t;
(3) E[M_t | F_s] = M_s, a.s., if s < t.

Part (2) of the definition can be rephrased as saying M_t is adapted to F_t. If in part (3) "=" is replaced by "≥", then M_t is a submartingale, and if it is replaced by "≤", then we have a supermartingale.

Taking expectations in Definition 1.11(3), we see that if s < t, then E M_s ≤ E M_t if M is a submartingale and E M_s ≥ E M_t if M is a supermartingale. Thus submartingales tend to increase, on average, and supermartingales tend to decrease, on average.

If P_t is a Poisson process with index λ, then P_t − λt is a continuous time martingale. To see this,

    E[P_t − λt | F_s] = E[P_t − P_s | F_s] − λt + P_s
        = E[P_t − P_s] − λt + P_s
        = λ(t − s) − λt + P_s
        = P_s − λs.

We give another example of a martingale.

Example 1.12 Recall that given a filtration {F_t}, each F_t is contained in F, where (Ω, F, P) is our probability space. Let X be an integrable F measurable random variable, and let M_t = E[X | F_t]. Then

    E[M_t | F_s] = E[E[X | F_t] | F_s] = E[X | F_s] = M_s,

and M is a martingale.

We derive the analogs of Doob's inequalities in the stochastic process context.

Theorem 1.13 Suppose M_t is a martingale or non-negative submartingale with paths that are right continuous with left limits. Then

(1) P(sup_{s≤t} M_s ≥ λ) ≤ E|M_t|/λ.
(2) If 1 < p < ∞, then

    E[sup_{s≤t} |M_s|^p] ≤ (p/(p−1))^p E|M_t|^p.

Proof. We will do the case where M_t is a martingale, the submartingale case being nearly identical. Let D_n = {kt/2^n : k ≤ 2^n}. If we set N_k^{(n)} = M_{kt/2^n} and G_k^{(n)} = F_{kt/2^n}, it is clear that {N_k^{(n)}} is a discrete time martingale with respect to {G_k^{(n)}}. Let A_n = {sup_{s≤t, s∈D_n} M_s > λ}. By Doob's inequality for discrete time martingales,

    P(A_n) = P(max_{k≤2^n} N_k^{(n)} > λ) ≤ E|N_{2^n}^{(n)}|/λ = E|M_t|/λ.

Note that the A_n are increasing, and since M_t is right continuous,

    ∪_n A_n = {sup_{s≤t} M_s > λ}.

Then

    P(sup_{s≤t} M_s > λ) = P(∪_n A_n) = lim_n P(A_n) ≤ E|M_t|/λ.

If we apply this with λ replaced by λ − ε and let ε → 0, we obtain (1).

The proof of (2) is similar. By Doob's inequality for discrete time martingales,

    E[sup_{k≤2^n} |N_k^{(n)}|^p] ≤ (p/(p−1))^p E|N_{2^n}^{(n)}|^p = (p/(p−1))^p E|M_t|^p.

Since sup_{k≤2^n} |N_k^{(n)}|^p increases to sup_{s≤t} |M_s|^p by the right continuity of M, (2) follows by Fatou's lemma.

Here is an example. If P_t is a Poisson process of index λ, then P_t − λt is a martingale, so e^{a(P_t−λt)} is a submartingale for any real number a. Then for a > 0,

    P(sup_{s≤t} (P_s − λs) ≥ A) = P(sup_{s≤t} e^{a(P_s−λs)} ≥ e^{aA}) ≤ e^{−aA} E e^{aP_t} e^{−aλt}.
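The substitution and optimization over a described next can be made explicit; a sketch of the standard Chernoff-type computation, using the Poisson moment generating function E e^{aP_t} = exp((e^a − 1)λt):

```latex
% Chernoff-type bound: plug E e^{aP_t} = \exp((e^{a}-1)\lambda t) into the
% maximal inequality, then minimize the exponent over a > 0.
P\Bigl(\sup_{s\le t}(P_s-\lambda s)\ge A\Bigr)
   \le \exp\bigl(-aA+(e^{a}-1)\lambda t-a\lambda t\bigr).
% Setting the a-derivative of the exponent to zero gives
% e^{a}\lambda t = A+\lambda t, i.e. a = \log(1+A/(\lambda t)),
% which is positive for A > 0.  Substituting back:
P\Bigl(\sup_{s\le t}(P_s-\lambda s)\ge A\Bigr)
   \le \exp\Bigl(A-(A+\lambda t)\log\bigl(1+\tfrac{A}{\lambda t}\bigr)\Bigr).
```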

We know

    E e^{aP_t} = exp((e^a − 1)λt).

We substitute this in the above and then optimize over a.

We will need Doob's optional stopping theorem for continuous time martingales.

Theorem 1.14 Let {F_t} be a filtration satisfying the usual conditions. If M_t is a martingale or non-negative submartingale whose paths are right continuous, sup_t E M_t^2 < ∞, and T is a finite stopping time, then E M_T ≥ E M_0.

Proof. We do the submartingale case, the martingale case being very similar. By Doob's inequality (Theorem 1.13(1)),

    E[sup_{s≤t} M_s^2] ≤ 4 E M_t^2.

Letting t → ∞, we have E[sup_t M_t^2] < ∞ by Fatou's lemma.

Let us first suppose that T ≤ K, a.s., for some real number K. Define T_n by (1.2). Let N_k^{(n)} = M_{k/2^n}, G_k^{(n)} = F_{k/2^n}, and S_n = 2^n T_n. By Doob's optional stopping theorem applied to the submartingale N_k^{(n)}, we have

    E M_0 = E N_0^{(n)} ≤ E N_{S_n}^{(n)} = E M_{T_n}.

Since M is right continuous, M_{T_n} → M_T, a.s. The random variables M_{T_n} are bounded by 1 + sup_t M_t^2, so by dominated convergence, E M_{T_n} → E M_T.

For general finite T, we apply the above to the stopping time T ∧ K to get E M_{T∧K} ≥ E M_0. The random variables M_{T∧K} are bounded by 1 + sup_t M_t^2, so by dominated convergence, we get E M_T ≥ E M_0 when we let K → ∞.

We present the continuous time version of Doob's martingale convergence theorem. We will see that not only do we get limits as t → ∞, but also a regularity result. Let D_n = {k/2^n : k ≥ 0}, D = ∪_n D_n.

Theorem 1.15 Let {M_t : t ∈ D} be either a martingale, a submartingale, or a supermartingale with respect to {F_t : t ∈ D} and suppose sup_{t∈D} E|M_t| < ∞. Then

(1) lim_{t→∞} M_t exists, a.s.
(2) With probability one M_t has left and right limits along D.

The second conclusion says that except for a null set, if t_0 ∈ [0, ∞), then both lim_{t∈D, t↑t_0} M_t and lim_{t∈D, t↓t_0} M_t exist and are finite. The null set does not depend on t_0.

Proof. Martingales are also submartingales, and if M_t is a supermartingale, then −M_t is a submartingale, so we may without loss of generality restrict our attention to submartingales. By Doob's inequality,

    P(sup_{t∈D_n, t≤n} |M_t| > λ) ≤ (1/λ) E|M_n|.

Letting n → ∞ and using Fatou's lemma,

    P(sup_{t∈D} |M_t| > λ) ≤ (1/λ) sup_t E|M_t|.

This is true for all λ, so with probability one, {|M_t| : t ∈ D} is a bounded set. Therefore the only way either (1) or (2) can fail is if for some pair of rationals a < b the number of upcrossings of [a, b] by {M_t : t ∈ D} is infinite.

Recall that we define upcrossings as follows. Given an interval [a, b] and a submartingale M, if S_1 = inf{t : M_t ≤ a}, T_i = inf{t > S_i : M_t ≥ b}, and S_{i+1} = inf{t > T_i : M_t ≤ a}, then the number of upcrossings up to time u is sup{k : T_k ≤ u}.

Doob's upcrossing lemma tells us that if V_n is the number of upcrossings by {M_t : t ∈ D_n ∩ [0, n]}, then

    E V_n ≤ E(M_n − a)⁺ / (b − a).

Letting n → ∞ and using Fatou's lemma, the number of upcrossings of [a, b] by {M_t : t ∈ D} has finite expectation, hence is finite, a.s. If N_{a,b} is the null set where the number of upcrossings of [a, b] by {M_t : t ∈ D} is infinite and N = ∪_{a<b, a,b∈Q⁺} N_{a,b}, where Q⁺ is the collection of non-negative rationals, then P(N) = 0. If ω ∉ N, then (1) and (2) hold.

As a corollary we have

Corollary 1.16 Let {F_t} be a filtration satisfying the usual conditions, and let M_t be a martingale with respect to {F_t}. Then M has a version that is also a martingale and that in addition has paths that are right continuous with left limits.

Proof. Let D be as in the above proof. For each integer N ≥ 1, E|M_t| ≤ E|M_N| < ∞ for t ≤ N, since |M_t| is a submartingale by the conditional expectation form of Jensen's inequality. Therefore M_{t∧N} has left and right limits when taking limits along t ∈ D. Since N is arbitrary, M_t has left and right limits when taking limits along t ∈ D, except for a set of ω's that form a null set. Let

    M̃_t = lim_{u∈D, u>t, u→t} M_u.

It is clear that M̃ has paths that are right continuous with left limits. Since F_{t+} = F_t and M̃_t is F_{t+} measurable, then M̃_t is F_t measurable.

Let N be fixed. We will show {M_t : t ≤ N} is a uniformly integrable family of random variables. Let ε > 0. Since M_N is integrable, there exists δ such that if P(A) < δ, then E[|M_N|; A] < ε. If L is large enough, P(|M_t| > L) ≤ E|M_t|/L ≤ E|M_N|/L < δ. Then E[|M_t|; |M_t| > L] ≤ E[|M_N|; |M_t| > L] < ε, since |M_t| is a submartingale and (|M_t| > L) ∈ F_t. Uniform integrability is proved.

Now let t < N. If B ∈ F_t,

    E[M̃_t; B] = lim_{u∈D, u>t, u→t} E[M_u; B] = E[M_t; B].

Here we used the Vitali convergence theorem and the fact that M_t is a martingale. Since M̃_t is F_t measurable, this proves that M̃_t = M_t, a.s. Since N was arbitrary, we have this for all t. We thus have found a version of M that has paths that are right continuous with left limits. That M̃_t is a martingale is easy.

The following technical result will be used in the next chapter. A function f is increasing if s < t implies f(s) ≤ f(t). A process A_t has increasing paths if the function t → A_t(ω) is increasing for almost every ω.

Proposition 1.17 Suppose {F_t} is a filtration satisfying the usual conditions and suppose A_t is an adapted process with paths that are increasing, are right continuous with left limits, and A_∞ = lim_{t→∞} A_t exists, a.s. Suppose X is a non-negative integrable random variable, and M_t is a version of the martingale E[X | F_t] which has paths that are right continuous with left limits. Suppose E[X A_∞] < ∞. Then

    E ∫_0^∞ X dA_s = E ∫_0^∞ M_s dA_s.        (1.9)

Proof. First suppose X and A are bounded. Let n ≥ 1 and let us write E ∫_0^∞ X dA_s as

    Σ_{k=1}^∞ E[X(A_{k/2^n} − A_{(k−1)/2^n})].

Conditioning the k-th summand on F_{k/2^n}, this is equal to

    Σ_{k=1}^∞ E[E[X | F_{k/2^n}](A_{k/2^n} − A_{(k−1)/2^n})].

Given s and n, define s_n to be that value of k/2^n such that (k − 1)/2^n < s ≤ k/2^n. We then have

    E ∫_0^∞ X dA_s = E ∫_0^∞ M_{s_n} dA_s.        (1.10)

For any value of s, s_n ↓ s as n → ∞, and since M has right continuous paths, M_{s_n} → M_s. Since X is bounded, so is M. By dominated convergence, the right hand side of (1.10) converges to E ∫_0^∞ M_s dA_s. This completes the proof when X and A are bounded. We apply this to X ∧ N and A ∧ N, let N → ∞, and use monotone convergence for the general case.

The only reason we assume X is non-negative is so that the integrals make sense. The equation (1.9) can be rewritten as

    E ∫_0^∞ X dA_s = E ∫_0^∞ E[X | F_s] dA_s.        (1.11)

We also have

    E ∫_0^t X dA_s = E ∫_0^t E[X | F_s] dA_s        (1.12)

for each t. This follows either by following the above proof or by applying Proposition 1.17 to A_{s∧t}.
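The discrete-time analogue of identity (1.11) can be verified exactly in a toy two-period model (the model and all names are our illustration, not part of the text): Ω consists of two fair coin flips, F_1 is generated by the first flip, and the increments ΔA_1, ΔA_2 of the increasing process are adapted.

```python
from itertools import product

# Toy model: omega = (w1, w2), two fair coin flips, P(omega) = 1/4 each.
outcomes = list(product([0, 1], repeat=2))
prob = {w: 0.25 for w in outcomes}

def E(f):
    """Expectation over the four outcomes."""
    return sum(f(w) * prob[w] for w in outcomes)

def cond_E_F1(f):
    """E[f | F_1]: average over the second flip with the first fixed."""
    return lambda w: 0.5 * (f((w[0], 0)) + f((w[0], 1)))

X = lambda w: 1.0 + 2.0 * w[0] + 3.0 * w[1]   # F-measurable, non-negative
dA1 = lambda w: 1.0 + w[0]                    # F_1-measurable increment
dA2 = lambda w: 0.5 + w[0] + w[1]             # F_2-measurable increment

# Left side of (1.11): E sum_k X * dA_k.
lhs = E(lambda w: X(w) * (dA1(w) + dA2(w)))
# Right side: E sum_k E[X | F_k] * dA_k (note E[X | F_2] = X here).
M1 = cond_E_F1(X)
rhs = E(lambda w: M1(w) * dA1(w) + X(w) * dA2(w))
assert abs(lhs - rhs) < 1e-12
```

The point, as in the proof above, is that each increment ΔA_k is measurable with respect to F_k, so X may be replaced by E[X | F_k] summand by summand.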


Chapter 2

Lévy processes

A Lévy process is a process with stationary and independent increments whose paths are right continuous with left limits. Having stationary increments means that the law of X_t − X_s is the same as the law of X_{t−s} − X_0 whenever s < t. Saying that X has independent increments means that X_t − X_s is independent of σ(X_r; r ≤ s) whenever s < t.

We want to examine the structure of Lévy processes. We know three examples: the Poisson process, Brownian motion, and the deterministic process X_t = t. It turns out all Lévy processes can be built up out of these as building blocks. We will show how to construct Lévy processes and we will give a representation of an arbitrary Lévy process.

Recall that we use X_{t−} = lim_{s↑t} X_s and ΔX_t = X_t − X_{t−}.

2.1 Examples

Let us begin by looking at some simple Lévy processes. Let P_t^j, j = 1, …, J, be a sequence of independent Poisson processes with parameters λ_j, resp. Each P_t^j is a Lévy process, and the formula for the characteristic function of a Poisson random variable shows that the characteristic function of P_t^j is

    E e^{iuP_t^j} = exp(tλ_j(e^{iu} − 1)).

Therefore the characteristic function of a_j P_t^j is

    E e^{iu a_j P_t^j} = exp(tλ_j(e^{iua_j} − 1)),

and the characteristic function of a_j P_t^j − a_j λ_j t is

    E e^{iu(a_j P_t^j − a_j λ_j t)} = exp(tλ_j(e^{iua_j} − 1 − iua_j)).

If we let m_j be the measure on R defined by m_j(dx) = λ_j δ_{a_j}(dx), where δ_{a_j}(dx) is point mass at a_j, then the characteristic function for a_j P_t^j can be written as

    exp(t ∫_R [e^{iux} − 1] m_j(dx))        (2.1)

and the one for a_j P_t^j − a_j λ_j t as

    exp(t ∫_R [e^{iux} − 1 − iux] m_j(dx)).        (2.2)

Now let

    X_t = Σ_{j=1}^J a_j P_t^j.

It is clear that the paths of X_t are right continuous with left limits, and the fact that X has stationary and independent increments follows from the corresponding property of the P^j's. Moreover the characteristic function of a sum of independent random variables is the product of the characteristic functions, so the characteristic function of X_t is given by

    E e^{iuX_t} = exp(t ∫_R [e^{iux} − 1] m(dx))        (2.3)

with m(dx) = Σ_{j=1}^J λ_j δ_{a_j}(dx).

The process Y_t = X_t − t Σ_{j=1}^J a_j λ_j is also a Lévy process and its characteristic function is

    E e^{iuY_t} = exp(t ∫_R [e^{iux} − 1 − iux] m(dx)),        (2.4)

again with m(dx) = Σ_{j=1}^J λ_j δ_{a_j}(dx).
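Formula (2.3) can be checked by simulation. A sketch (names and parameter values ours, purely illustrative) comparing the empirical characteristic function of X_t = Σ_j a_j P_t^j with the right-hand side of (2.3), which for point masses reduces to exp(t Σ_j λ_j(e^{iua_j} − 1)):

```python
import cmath
import random

rng = random.Random(42)
lams = [1.0, 0.5]      # Poisson parameters lambda_j (illustrative)
jumps = [1.0, -2.0]    # jump sizes a_j (illustrative)
t, u, n = 2.0, 0.7, 40000

def sample_X_t():
    """X_t = sum_j a_j * P^j_t with independent Poisson counts,
    each drawn by counting exponential interarrivals up to time t."""
    total = 0.0
    for lam, a in zip(lams, jumps):
        count, s = 0, rng.expovariate(lam)
        while s <= t:
            count += 1
            s += rng.expovariate(lam)
        total += a * count
    return total

empirical = sum(cmath.exp(1j * u * sample_X_t()) for _ in range(n)) / n
exact = cmath.exp(t * sum(lam * (cmath.exp(1j * u * a) - 1)
                          for lam, a in zip(lams, jumps)))
assert abs(empirical - exact) < 0.05  # loose Monte Carlo tolerance
```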

Remark 2.1 Recall that if φ is the characteristic function of a random variable Z, then φ′(0) = iE[Z] and φ″(0) = −E[Z²]. If Y_t is as in the paragraph above, then clearly E[Y_t] = 0, and calculating the second derivative of E[e^{iuY_t}] at 0, we obtain

E[Y_t²] = t ∫ x² m(dx).

The following lemma is a restatement of Corollary 1.8.

Lemma 2.2 If X_t is a Lévy process and T is a finite stopping time, then X_{T+t} − X_T is a Lévy process with the same law as X_t − X_0 and independent of F_T.

We will need the following lemma:

Lemma 2.3 Suppose X_1, ..., X_n are independent exponential random variables with parameters a_1, ..., a_n, resp.
(1) Then min(X_1, ..., X_n) is an exponential random variable with parameter a_1 + ⋯ + a_n.
(2) The probability that X_i is the smallest of the n exponentials is a_i/(a_1 + ⋯ + a_n).

Proof. (1) Write

P(min(X_1, ..., X_n) > t) = P(X_1 > t, ..., X_n > t) = P(X_1 > t) ⋯ P(X_n > t) = e^{−a_1 t} ⋯ e^{−a_n t} = e^{−(a_1 + ⋯ + a_n)t}.

(2) Without loss of generality we may suppose i = 1. Let's first do the case n = 2. The joint density of (X_1, X_2) is a_1 e^{−a_1 x} a_2 e^{−a_2 y}, and we want to integrate this over x < y. Doing the integration yields a_1/(a_1 + a_2). For the general case of n > 2, apply the above to X_1 and min(X_2, ..., X_n).

If P_1, ..., P_n are independent Poisson processes with parameters λ_1, ..., λ_n, resp., let X_t = Σ_{i=1}^n b_i P_i(t). By the above lemma, the times between jumps

of X are independent exponentials with parameter λ_1 + ⋯ + λ_n. At each jump, X jumps b_i with probability λ_i/(λ_1 + ⋯ + λ_n). Thus another way to construct X is to let U_1, U_2, ... be independent exponentials with parameter λ_1 + ⋯ + λ_n and let Y_1, Y_2, ... be a sequence of i.i.d. random variables independent of the U_i's such that P(Y_k = b_j) = λ_j/(λ_1 + ⋯ + λ_n). We then let X_0 = 0, let X_t be piecewise constant, and at time Σ_{i=1}^m U_i we let X jump by the amount Y_m.

2.2 Construction of Lévy processes

A process X has bounded jumps if there exists a real number K > 0 such that sup_t |ΔX_t| ≤ K, a.s.

Lemma 2.4 If X_t is a Lévy process with bounded jumps and with X_0 = 0, then X_t has moments of all orders, that is, E|X_t|^p < ∞ for all positive integers p.

Proof. Suppose the jumps of X_t are bounded in absolute value by K. Since X_t is right continuous with left limits, there exists M > K such that P(sup_{s≤t} |X_s| ≥ M) ≤ 1/2. Let T_1 = inf{t : |X_t| ≥ M} and T_{i+1} = inf{t > T_i : |X_t − X_{T_i}| ≥ M}. For s < T_1, |X_s| ≤ M, and then |X_{T_1}| ≤ |X_{T_1−}| + |ΔX_{T_1}| ≤ M + K ≤ 2M. We have

P(T_{i+1} < t) ≤ P(T_i < t, T_{i+1} − T_i < t) = P(T_{i+1} − T_i < t) P(T_i < t) = P(T_1 < t) P(T_i < t),

using Lemma 2.2. Now

P(T_1 < t) ≤ P(sup_{s≤t} |X_s| ≥ M) ≤ 1/2,

so P(T_{i+1} < t) ≤ (1/2) P(T_i < t), and then by induction, P(T_i < t) ≤ 2^{−i}. Therefore

P(sup_{s≤t} |X_s| ≥ 2(i + 1)M) ≤ P(T_i < t) ≤ 2^{−i},

and the lemma now follows immediately.

A key lemma is the following.

Lemma 2.5 Suppose I is a finite interval of the form (a, b), [a, b), (a, b], or [a, b] with a > 0 and m is a finite measure on R giving no mass to I^c. Then there exists a Lévy process X_t satisfying (2.3).

Proof. First let us consider the case where I = [a, b). We approximate m by a discrete measure. If n ≥ 1, let z_j = a + j(b − a)/2^n, j = 0, ..., 2^n − 1, and let

m_n(dx) = Σ_{j=0}^{2^n − 1} m([z_j, z_{j+1})) δ_{z_j}(dx),

where δ_{z_j} is point mass at z_j. The measures m_n converge weakly to m as n → ∞ in the sense that ∫ f(x) m_n(dx) → ∫ f(x) m(dx) whenever f is a bounded continuous function on R.

We let U_1, U_2, ... be independent exponential random variables with parameter m(I). Let Y_1, Y_2, ... be i.i.d. random variables independent of the U_i's with P(Y_i ∈ dx) = m(dx)/m(I). We let X_t start at 0 and be piecewise constant with jumps of size Y_m at times Σ_{i=1}^m U_i. If we define X^n_t in the exact same way, except that we replace m by m_n and we let Y^n_i = z_j if Y_i ∈ [z_j, z_{j+1}), then we know from the previous section that X^n_t is a Lévy process with Lévy measure m_n. Moreover each Y^n_i differs from Y_i by at most (b − a)2^{−n}, so

sup_{s≤t} |X^n_s − X_s| ≤ (b − a)2^{−n} N,

where N is the number of jumps of these processes before time t. N is a Poisson random variable with parameter t m(I), so has moments of all orders. It follows that X^n_t converges uniformly to X_t almost surely on each finite interval, and the difference goes to 0 in L^p for each p.

We conclude that the law of X_t − X_s is independent of F_s and has the same law as that of X_{t−s} because these hold for each X^n. Since x → e^{iux} is a bounded continuous function and m_n converges weakly to m, starting with

E[exp(iuX^n_t)] = exp( t ∫ [e^{iux} − 1] m_n(dx) )

and passing to the limit, we obtain that the characteristic function of X under P is given by (2.3).

If now the interval I contains the point b, we follow the above proof, except we let P^{2^n − 1}_t be a Poisson process with parameter m([z_{2^n − 1}, b]). Similarly, if I does not contain the point a, we change P^0_t to be a Poisson process with parameter m((a, z_1)). With these changes, the proof works for intervals I, whether or not they contain either of their endpoints.

Remark 2.6 If X is the Lévy process constructed in Lemma 2.5, then Y_t = X_t − E[X_t] will be a Lévy process satisfying (2.4).

Here is the main theorem of this section.

Theorem 2.7 Suppose m is a measure on R with m({0}) = 0 and ∫ (1 ∧ x²) m(dx) < ∞. Suppose b ∈ R and σ ≥ 0. There exists a Lévy process X_t such that

E[e^{iuX_t}] = exp( t { iub − σ²u²/2 + ∫_R [e^{iux} − 1 − iux 1_{(|x|≤1)}] m(dx) } ).   (2.5)

The above equation is called the Lévy-Khintchine formula. The measure m is called the Lévy measure.

If we let

m(dx) = ((1 + x²)/x²) m′(dx)

and

b′ = b − ∫_{(|x|≤1)} (x³/(1 + x²)) m(dx) + ∫_{(|x|>1)} (x/(1 + x²)) m(dx),

then we can also write

E[e^{iuX_t}] = exp( t { iub′ − σ²u²/2 + ∫_R [ e^{iux} − 1 − iux/(1 + x²) ] ((1 + x²)/x²) m′(dx) } ).

Both expressions for the Lévy-Khintchine formula are in common use.

Proof. Let m(dx) be a measure supported on (0, 1] with ∫ x² m(dx) < ∞. Let m_n(dx) be the measure m restricted to (2^{−n}, 2^{−n+1}]. Let Y^n_t be independent Lévy processes whose characteristic functions are given by (2.4) with m replaced by m_n; see Remark 2.6. Note E[Y^n_t] = 0 for all n by Remark 2.1. By the independence of the Y^n's, if M < N,

E[ ( Σ_{n=M}^N Y^n_t )² ] = Σ_{n=M}^N E[(Y^n_t)²] = Σ_{n=M}^N t ∫ x² m_n(dx) = t ∫_{(2^{−N}, 2^{−M+1}]} x² m(dx).

By our assumption on m, this goes to 0 as M, N → ∞, and we conclude that Σ_{n=0}^N Y^n_t converges in L² for each t. Call the limit Y_t. It is routine to check that Y_t has independent and stationary increments.

Each Y^n_t has independent increments and is mean 0, so E[Y^n_t − Y^n_s | F_s] = E[Y^n_t − Y^n_s] = 0, or Y^n is a martingale. By Doob's inequalities and the L² convergence,

E sup_{s≤t} | Σ_{n=M}^N Y^n_s |² → 0

as M, N → ∞, and hence there exists a subsequence M_k such that Σ_{n=1}^{M_k} Y^n_s converges uniformly over s ≤ t, a.s. Therefore the limit Y_t will have paths that are right continuous with left limits.

If m is a measure supported in (1, ∞) with m(R) < ∞, we do a similar procedure starting with Lévy processes whose characteristic functions are of the form (2.3). We let m_n(dx) be the restriction of m to (2^n, 2^{n+1}], let X^n_t be independent Lévy processes corresponding to m_n, and form X_t = Σ_{n=0}^∞ X^n_t.
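The L² bound in the proof can be made concrete. For the (hypothetical, chosen only for illustration) measure m(dx) = x^{−1/2} dx on (0, 1], the piece variances t ∫ x² m_n(dx) over the dyadic blocks should sum to t ∫_0^1 x² m(dx) = 2t/5. A small check, not part of the notes:

```python
def piece_variance(n, t=1.0):
    """t * integral of x^2 over (2^{-n}, 2^{-n+1}] against m(dx) = x^(-1/2) dx.

    Here m_n is the restriction of m to the n-th dyadic block, n >= 1.
    The antiderivative of x^2 * x^(-1/2) = x^(3/2) is (2/5) x^(5/2).
    """
    hi = 2.0 ** (-n + 1)
    lo = 2.0 ** (-n)
    return t * (2.0 / 5.0) * (hi ** 2.5 - lo ** 2.5)

# Variances of the independent pieces sum (telescope) to t * (2/5).
total = sum(piece_variance(n) for n in range(1, 60))
```

The tail beyond n = 59 is of size 2^{−147.5} and is negligible, which is exactly why the partial sums of the Y^n converge in L².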

Since m(R) < ∞, for each t, the number of times s less than t at which any one of the X^n jumps is finite. This shows X_t has paths that are right continuous with left limits, and it is easy then to see that X_t is a Lévy process.

Finally, suppose ∫ (x² ∧ 1) m(dx) < ∞. Let X^1_t, X^2_t be Lévy processes with characteristic functions given by (2.3) with m replaced by the restriction of m to (1, ∞) and (−∞, −1), resp., let X^3_t, X^4_t be Lévy processes with characteristic functions given by (2.4) with m replaced by the restriction of m to (0, 1] and [−1, 0), resp., let X^5_t = bt, and let X^6_t be σ times a Brownian motion. Suppose the X^i's are all independent. Then their sum will be a Lévy process whose characteristic function is given by (2.5).

A key step in the construction was the centering of the Poisson processes to get Lévy processes with characteristic functions given by (2.4). Without the centering one is forced to work only with characteristic functions given by (2.3).

2.3 Representation of Lévy processes

We now work towards showing that every Lévy process has a characteristic function of the form given by (2.5).

Lemma 2.8 If X_t is a Lévy process and A is a Borel subset of R that is a positive distance from 0, then

N_t(A) = Σ_{s≤t} 1_A(ΔX_s)

is a Poisson process. Saying that A is a positive distance from 0 means that inf{|x| : x ∈ A} > 0.

Proof. Since X_t has paths that are right continuous with left limits and A is a positive distance from 0, there can only be finitely many jumps of X that lie in A in any finite time interval, and so N_t(A) is finite and has paths that are right continuous with left limits. It follows from the fact that X_t has stationary and independent increments that N_t(A) also has stationary and independent increments. We now apply Proposition 1.1.
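Lemma 2.8 can be illustrated by simulation. Below we sample a compound Poisson process whose jump sizes are uniform on [−2, 2] at overall rate λ = 3, and count the jumps landing in A = (1, 2]; these counts should be Poisson with mean 3 · (1/4) = 3/4. The setup (rates, jump law, the set A) is our choice for illustration, not from the notes.

```python
import numpy as np

def n_t_of_a(t, lam, rng):
    """One sample of N_t(A): jumps of a compound Poisson process landing in A.

    Jump sizes are uniform on [-2, 2]; overall jump rate is lam; A = (1, 2].
    """
    num_jumps = rng.poisson(lam * t)
    sizes = rng.uniform(-2.0, 2.0, size=num_jumps)
    return np.count_nonzero(sizes > 1.0)

rng = np.random.default_rng(0)
samples = np.array([n_t_of_a(1.0, 3.0, rng) for _ in range(20_000)])
# A = (1, 2] carries a quarter of the jump law, so N_1(A) ~ Poisson(0.75):
# the sample mean and variance should both be close to 0.75.
```

A Poisson random variable has equal mean and variance, so checking both against 0.75 is a reasonable (if crude) diagnostic.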

Our main result is that N_t(A) and N_t(B) are independent if A and B are disjoint.

Theorem 2.9 Let {F_t} be a filtration satisfying the usual conditions. Suppose that N_t(A) is a Poisson point process with respect to the measure λ. If A_1, ..., A_n are pairwise disjoint measurable subsets of R with E[N_t(A_k)] < ∞ for k = 1, ..., n, then the processes N_t(A_1), ..., N_t(A_n) are mutually independent.

Proof. Define λ(A) = E[N_1(A)]. The previous lemma shows that if λ(A) < ∞, then N_t(A) is a Poisson process, and clearly its parameter is λ(A). We first make the observation that because A_1, A_2, ..., A_n are disjoint, no two of the N_t(A_k) have jumps at the same time.

To prove the theorem, it suffices to let 0 = r_0 < r_1 < ⋯ < r_m and show that the random variables {N_{r_j}(A_k) − N_{r_{j−1}}(A_k) : 1 ≤ j ≤ m, 1 ≤ k ≤ n} are independent. Since for each j and each k, N_{r_j}(A_k) − N_{r_{j−1}}(A_k) is independent of F_{r_{j−1}}, it suffices to show that for each j ≤ m, the random variables {N_{r_j}(A_k) − N_{r_{j−1}}(A_k) : 1 ≤ k ≤ n} are independent. We will do the case j = m = 1 and write r for r_j for simplicity; the case when j, m > 1 differs only in notation.

We will prove this using induction. We start with the case n = 2 and show the independence of N_r(A_1) and N_r(A_2). Each N_t(A_k) is a Poisson process, and so N_t(A_k) has moments of all orders. Let u_1, u_2 ∈ R and set φ_k = λ(A_k)(e^{iu_k} − 1), k = 1, 2. Let

M^k_t = e^{iu_k N_t(A_k) − tφ_k}.

We see that M^k_t is a martingale because E[e^{iu_k N_t(A_k)}] = e^{tφ_k}, and therefore

E[M^k_t | F_s] = M^k_s E[e^{iu_k(N_t(A_k) − N_s(A_k)) − (t−s)φ_k} | F_s] = M^k_s e^{−(t−s)φ_k} E[e^{iu_k(N_t(A_k) − N_s(A_k))}] = M^k_s,

using the independence and stationarity of the increments of a Poisson process.

Now we can write

E[M^1_t M^2_t] = E[M^1_t] + E ∫_0^t M^1_s dM^2_s = 1 + E ∫_0^t M^1_s dM^2_s,

using that M^2_0 = 1, M^1 is a martingale, and Proposition 1.17. (Here M^2 is the difference of two increasing processes; the adjustments needed are easy.) Since we have argued that no two of the N_t(A_k) jump at the same time, the same is true for the M^k_t, and so the above is equal to

1 + E ∫_0^t M^1_{s−} dM^2_s.

It therefore remains to prove that the above integral is equal to 0. If H_s is a process of the form

H_s(ω) = K(ω) 1_{(a,b]}(s),

where K is F_a measurable, then

∫_0^t H_s dM^2_s = K(M^2_{t∧b} − M^2_{t∧a}),

and conditioning on F_a, the expectation is zero:

E[K(M^2_{t∧b} − M^2_{t∧a})] = E[ K E[M^2_{t∧b} − M^2_{t∧a} | F_a] ] = 0,

using that M^2 is a martingale. (We are doing Lebesgue-Stieltjes integrals here, but the argument is similar to one used with stochastic integrals.) The expectation is also 0 for linear combinations of such H_s. Since M^1_{s−} is left continuous, we can approximate it by such H_s, and therefore the integral is 0 as required. We thus have

E[M^1_r M^2_r] = 1.

This implies

E[e^{i(u_1 N_r(A_1) + u_2 N_r(A_2))}] = e^{rφ_1} e^{rφ_2} = E[e^{iu_1 N_r(A_1)}] E[e^{iu_2 N_r(A_2)}].

Since this holds for all u_1, u_2, then N_r(A_1) and N_r(A_2) are independent. We conclude that the processes N_t(A_1) and N_t(A_2) are independent.

To handle the case n = 3, we first show that M^1_t M^2_t is a martingale. We write

E[M^1_t M^2_t | F_s] = M^1_s M^2_s e^{−(t−s)(φ_1 + φ_2)} E[e^{i(u_1(N_t(A_1) − N_s(A_1)) + u_2(N_t(A_2) − N_s(A_2)))} | F_s]
= M^1_s M^2_s e^{−(t−s)(φ_1 + φ_2)} E[e^{i(u_1(N_t(A_1) − N_s(A_1)) + u_2(N_t(A_2) − N_s(A_2)))}]
= M^1_s M^2_s,

using the fact that N_t(A_1) and N_t(A_2) are independent of each other and each has stationary and independent increments. Note that M^3_t = e^{iu_3 N_t(A_3) − tφ_3} has no jumps in common with M^1_t or M^2_t. Therefore if M̄^3_t = M^3_{t∧r}, then

E[M̄^3_t (M^1_t M^2_t)] = 1,

and as before this leads to

E[M^3_r (M^1_r M^2_r)] = 1.

As above this implies that N_r(A_1), N_r(A_2), and N_r(A_3) are independent. The proof of the general induction step is similar.

We will also need the following corollary.

Corollary 2.10 Let F_t and N_t(A_k) be as in Theorem 2.9. Suppose Y_t is a process with paths that are right continuous with left limits such that Y_t − Y_s is independent of F_s whenever s < t and Y_t − Y_s has the same law as Y_{t−s} for each s < t. Suppose moreover that Y has no jumps in common with any of the N_t(A_k). Then the processes N_t(A_1), ..., N_t(A_n), and Y_t are independent.

Proof. The law of Y_0 is the same as that of Y_t − Y_t, so Y_0 = 0, a.s. By the fact that Y has stationary and independent increments,

E[e^{iuY_{s+t}}] = E[e^{iuY_s}] E[e^{iu(Y_{s+t} − Y_s)}] = E[e^{iuY_s}] E[e^{iuY_t}],

which implies that the characteristic function of Y_t is of the form

E[e^{iuY_t}] = e^{tψ(u)}

for some function ψ(u). We fix u ∈ R and define M^Y_t = e^{iuY_t − tψ(u)}. As in the proof of Theorem 2.9, we see that M^Y_t is a martingale. Since M^Y has no jumps in common with any of the M^k_t, if M̄^Y_t = M^Y_{t∧r}, we see as above that

E[M̄^Y_t (M^1_t ⋯ M^n_t)] = 1, or E[M^Y_r M^1_r ⋯ M^n_r] = 1.

This leads as above to the independence of Y from all the N_t(A_k)'s.

Here is the representation theorem for Lévy processes.

Theorem 2.11 Suppose X_t is a Lévy process with X_0 = 0. Then there exists a measure m on R − {0} with ∫ (1 ∧ x²) m(dx) < ∞ and real numbers b and σ such that the characteristic function of X_t is given by (2.5).

Proof. Define m(A) = E[N_1(A)] if A is a bounded Borel subset of (0, ∞) that is a positive distance from 0. Since N_1(∪_{k=1}^∞ A_k) = Σ_{k=1}^∞ N_1(A_k) if the A_k are pairwise disjoint and each is a positive distance from 0, we see that m is a measure on [a, b] for each 0 < a < b < ∞, and m extends uniquely to a measure on (0, ∞).

First we want to show that Σ_{s≤t} ΔX_s 1_{(ΔX_s > 1)} is a Lévy process with characteristic function

exp( t ∫_1^∞ [e^{iux} − 1] m(dx) ).

Since the characteristic function of the sum of independent random variables is equal to the product of the characteristic functions, it suffices to suppose 0 < a < b and to show that

E[e^{iuZ_t}] = exp( t ∫_{(a,b]} [e^{iux} − 1] m(dx) ),

where Z_t = Σ_{s≤t} ΔX_s 1_{(a,b]}(ΔX_s). Let n > 1 and z_j = a + j(b − a)/n. By Lemma 2.8, N_t((z_j, z_{j+1}]) is a Poisson process with parameter l_j = E[N_1((z_j, z_{j+1}])] = m((z_j, z_{j+1}]). Thus Z^n_t = Σ_{j=0}^{n−1} z_j N_t((z_j, z_{j+1}]) has characteristic function

Π_{j=0}^{n−1} exp(t l_j (e^{iuz_j} − 1)) = exp( t Σ_{j=0}^{n−1} (e^{iuz_j} − 1) l_j ),

which is equal to

exp( t ∫ (e^{iux} − 1) m_n(dx) ),   (2.6)

where m_n(dx) = Σ_{j=0}^{n−1} l_j δ_{z_j}(dx). Since Z^n_t converges to Z_t as n → ∞, passing to the limit shows that Z_t has a characteristic function of the desired form.

Next we show that m(1, ∞) < ∞. (We write m(1, ∞) instead of m((1, ∞)) for esthetic reasons.) If not, m(1, K) → ∞ as K → ∞. Then for each fixed L and each fixed t,

lim sup_{K→∞} P(N_t(1, K) ≤ L) = lim sup_{K→∞} Σ_{j=0}^L e^{−t m(1,K)} (t m(1,K))^j / j! = 0.

This implies that N_t(1, ∞) = ∞ for each t. However, this contradicts the fact that X_t has paths that are right continuous with left limits.
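The limsup step above uses only that the Poisson(μ) distribution function at any fixed level L tends to 0 as μ → ∞. This is easy to confirm numerically (an illustration of ours, not part of the notes):

```python
import math

def poisson_cdf(mu, L):
    """P(N <= L) for N ~ Poisson(mu), summing the pmf term by term."""
    pmf = math.exp(-mu)  # P(N = 0)
    total = 0.0
    for j in range(L + 1):
        total += pmf
        pmf *= mu / (j + 1)  # P(N = j+1) from P(N = j)
    return total
```

For a fixed level L = 10, the probability of seeing at most L jumps collapses as the mean grows, which is the contradiction exploited in the proof.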

We define m on (−∞, 0) similarly.

We now look at Y_t = X_t − Σ_{s≤t} ΔX_s 1_{(|ΔX_s| > 1)}. This is again a Lévy process, and we need to examine its structure. This process has bounded jumps, hence has moments of all orders. By subtracting c_1 t for an appropriate constant c_1, we may suppose Y_t has mean 0.

Let I_1, I_2, ... be an ordering of the intervals {[2^{−(m+1)}, 2^{−m}), (−2^{−m}, −2^{−(m+1)}] : m ≥ 0}. Let

X̄^k_t = Σ_{s≤t} ΔX_s 1_{(ΔX_s ∈ I_k)}

and let X^k_t = X̄^k_t − E[X̄^k_t]. By the fact that all the X^k have mean 0 and are independent,

Σ_{k=1}^∞ E[(X^k_t)²] = E[ ( Σ_{k=1}^∞ X^k_t )² ] ≤ E[ ( Y_t − Σ_{k=1}^∞ X^k_t )² ] + E[ ( Σ_{k=1}^∞ X^k_t )² ] = E[(Y_t)²] < ∞.

Hence

E[ ( Σ_{k=M}^N X^k_t )² ] = Σ_{k=M}^N E[(X^k_t)²]

tends to 0 as M, N → ∞, and thus Y_t − Σ_{k=1}^N X^k_t converges in L². The limit, X^c_t, say, will be a Lévy process independent of all the X^k_t. Moreover, X^c has no jumps, i.e., it is continuous. Since all the X^k have mean 0, then E[X^c_t] = 0. By the independence of the increments,

E[X^c_t − X^c_s | F_s] = E[X^c_t − X^c_s] = 0,

and we see X^c is a continuous martingale. Using the stationarity and independence of the increments,

E[(X^c_{s+t})²] = E[(X^c_s)²] + 2E[X^c_s(X^c_{s+t} − X^c_s)] + E[(X^c_{s+t} − X^c_s)²] = E[(X^c_s)²] + E[(X^c_t)²],

which implies that there exists a constant c_2 such that E[(X^c_t)²] = c_2 t. We then have

E[(X^c_t)² − c_2 t | F_s] = (X^c_s)² − c_2 s + E[(X^c_t − X^c_s)² | F_s] − c_2(t − s)
= (X^c_s)² − c_2 s + E[(X^c_t − X^c_s)²] − c_2(t − s)
= (X^c_s)² − c_2 s.

The quadratic variation process of X^c is therefore c_2 t, and by Lévy's theorem, X^c_t/√c_2 is a Brownian motion, so X^c is a constant multiple of Brownian motion.

To complete the proof, it remains to show that ∫_{−1}^1 x² m(dx) < ∞. But by Remark 2.1,

∫_{I_k} x² m(dx) = E[(X^k_1)²],

and we have seen that Σ_k E[(X^k_1)²] ≤ E[Y_1²] < ∞. Combining gives the finiteness of ∫_{−1}^1 x² m(dx).

2.4 Symmetric stable processes

Let α ∈ (0, 2). If

m(dx) = c dx / |x|^{1+α},

we have what is called a symmetric stable process of index α. We see that ∫ (1 ∧ x²) m(dx) is finite. Because |x|^{−1−α} is symmetric, in the Lévy-Khintchine formula we can take iux 1_{(|x|<a)} for any a instead of iux 1_{(|x|<1)}. Then

∫ [e^{iux} − 1 − iux 1_{(|x|<1)}] c dx/|x|^{1+α} = ∫ [e^{iux} − 1 − iux 1_{(|x|<1/|u|)}] c dx/|x|^{1+α}
= |u|^α ∫ [e^{iy} − 1 − iy 1_{(|y|<1)}] c dy/|y|^{1+α}
= −c|u|^α.

In the last line we have the negative sign because the integral of the imaginary part of e^{iy} − 1 − iy 1_{(|y|<1)} is zero and the real part is negative, since cos y − 1 ≤ 0. Therefore if X_t is a symmetric stable process of index α,

E[e^{iuX_t}] = e^{−ct|u|^α}.

An exercise is to show that if a > 0 and X_t is a symmetric stable process of index α, then X_{at} and a^{1/α} X_t have the same law. By Exercise 6.7.4 of Chung's book,

P(X_1 > A) ∼ c A^{−α}, A → ∞,   (2.7)

where f ∼ g means the ratio of the two sides goes to 1.

Since e^{−ct|u|^α} is integrable, X_t has a continuous density function p_t(x). We have

p_t(0) = (1/2π) ∫ e^{−ct|u|^α} du,   (2.8)

and by a change of variables,

p_t(0) = c t^{−1/α}.   (2.9)

If x ≠ 0, then

p_t(x) = (1/2π) ∫ e^{−iux} e^{−ct|u|^α} du = (1/2π) ∫ (cos ux − i sin ux) e^{−ct|u|^α} du.

Since sin ux is an odd function of u, the imaginary term is 0. Since cos ux ≤ 1 and in fact is strictly less than 1 except at countably many values of u, we see that

p_t(x) < p_t(0).   (2.10)

If β < 1, we can take m(dx) = c dx/x^{1+β} for x > 0 and 0 for x < 0. We can also take the Lévy-Khintchine exponent to be just ∫ [e^{iux} − 1] m(dx) if we take the drift term to cancel out the iux 1_{(|x|<1)} term. This reflects that here we do not need to subtract the mean to get convergence of the compound Poisson processes. In this case we get the one-sided stable processes of index β. The paths of such a process only increase.

There is a notion of subordination which is very curious. Suppose that T_t is a one-sided stable process of index β with β < 1. Let W_t be a Brownian

motion independent of T_t. Then Y_t = W_{T_t} is a symmetric stable process of index 2β. Let's see why that is so. That Y is a Lévy process is not hard to see; we must therefore identify its characteristic function.

If P_t is a Poisson process with parameter λ, then

E[e^{−uP_t}] = Σ_{k=0}^∞ e^{−λt} ((λt)^k / k!) e^{−uk} = e^{−λt} e^{e^{−u}λt} = e^{λt(e^{−u} − 1)}.

Using that the moment generating function of a sum of independent random variables is the product of the moment generating functions and taking limits, we see that

E[e^{−uT_t}] = e^{−ctu^β}.

Then

E[e^{iuY_t}] = ∫ E[e^{iuW_s}] P(T_t ∈ ds) = ∫ e^{−u²s/2} P(T_t ∈ ds) = E[e^{−u²T_t/2}] = e^{−ct(u²/2)^β} = e^{−c′t|u|^{2β}}.
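For β = 1/2 subordination can be seen in a simulation. It is a standard fact (an assumption here, not from the notes) that the one-sided stable(1/2) subordinator at time t has the law of the first time a Brownian motion hits level t, which can be sampled as (t/|Z|)² for Z standard normal. Evaluating an independent Brownian motion there gives, conditionally, a normal with variance T, i.e. t·Z′/|Z|: a Cauchy random variable with scale t (index 2β = 1), whose median is 0 and whose quartiles sit at ±t.

```python
import numpy as np

rng = np.random.default_rng(0)
t = 1.0
n = 200_000

z = rng.standard_normal(n)
subordinator = (t / np.abs(z)) ** 2          # T_t: one-sided stable(1/2) samples
y = np.sqrt(subordinator) * rng.standard_normal(n)  # W_{T_t}: should be Cauchy(scale t)
```

With 200,000 samples the empirical median and 75th percentile pin down the claimed Cauchy law quite tightly.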


Chapter 3

Stochastic calculus

In this chapter we investigate the stochastic calculus for processes which may have jumps as well as a continuous component. If X is not a continuous process, it is no longer true that X_{t∧T_N} is a bounded process when T_N = inf{t : |X_t| ≥ N}, since there could be a large jump at time T_N.

We investigate stochastic integrals with respect to square integrable (not necessarily continuous) martingales, Itô's formula, and the Girsanov transformation. We prove the reduction theorem that allows us to look at semimartingales that are not necessarily bounded.

We will need the Doob-Meyer decomposition, which can be found in Chapter 16 of Bass, Stochastic Processes. That in turn depends on the debut and section theorems. A simpler proof than the standard one for the debut and section theorems can be found on the arXiv: http://arxiv.org/abs/1001.3619.

3.1 Decomposition of martingales

We assume throughout this chapter that {F_t} is a filtration satisfying the usual conditions. This means that each F_t contains every P-null set and ∩_{ε>0} F_{t+ε} = F_t for each t.

Let us start with a few definitions and facts. The predictable σ-field is the σ-field of subsets of [0, ∞) × Ω generated by the collection of bounded, left continuous processes that are adapted to {F_t}. A stopping time T is predictable

and predicted by the sequence of stopping times T_n if T_n ↑ T, and T_n < T on the event (T > 0). A stopping time T is totally inaccessible if P(T = S) = 0 for every predictable stopping time S. The graph of a stopping time T is [T, T] = {(t, ω) : t = T(ω) < ∞}.

If X_t is a process that is right continuous with left limits, we set X_{t−} = lim_{s↑t, s<t} X_s and ΔX_t = X_t − X_{t−}. Thus ΔX_t is the size of the jump of X_t at time t.

Let's look at some examples. If W_t is a Brownian motion and T = inf{t : W_t = 1}, then T_n = inf{t : W_t = 1 − (1/n)} are stopping times that predict T. On the other hand, if P_t is a Poisson process (with parameter 1, say, for convenience), then we claim that T = inf{t : P_t = 1} is totally inaccessible. To show this, suppose S is a stopping time and S_n ↑ S are stopping times such that S_n < S on (S > 0). We will show that P(S = T) = 0. To do that, it suffices to show that P(S ∧ N = T) = 0 for each positive integer N. Since P_t − t is a martingale, E[P_{S_n ∧ N}] = E[S_n ∧ N]. Letting n → ∞, we obtain (by monotone convergence) that E[P_{(S∧N)−}] = E[S ∧ N]. We also know that E[P_{S∧N}] = E[S ∧ N]. Therefore E[P_{(S∧N)−}] = E[P_{S∧N}]. Since P has increasing paths, this implies that P_{(S∧N)−} = P_{S∧N}, and we conclude P(S ∧ N = T) = 0.

In this chapter we will assume throughout for simplicity that every jump time of whichever process we are considering is totally inaccessible. The general case is not much harder, but the differences are only technical.

A supermartingale Z is of class D if the family of random variables

{Z_T : T a finite stopping time}

is uniformly integrable.

Theorem 3.1 (Doob-Meyer decomposition) Let {F_t} be a filtration satisfying the usual conditions and let Z be a supermartingale of class D whose paths are right continuous with left limits.
Then Z can be written Z_t = M_t − A_t in one and only one way, where M and A are adapted processes whose paths are right continuous with left limits, A has continuous increasing paths and A_∞ = lim_{t→∞} A_t is integrable, and M is a uniformly integrable martingale.

Suppose A_t is a bounded increasing process whose paths are right continuous with left limits. Recall that a function f is increasing if s < t implies

f(s) ≤ f(t). Then trivially A_t is a submartingale, and by the Doob-Meyer decomposition, there exists a continuous increasing process Ã_t such that A_t − Ã_t is a martingale. We call Ã_t the compensator of A_t. If A_t = B_t − C_t is the difference of two increasing processes B_t and C_t, then we can use linearity to define Ã_t as B̃_t − C̃_t. We can even extend the notion of compensator to the case where A_t is complex valued and has paths that are locally of bounded variation by looking at the real and imaginary parts.

We will use the following lemma. For any increasing process A we let A_∞ = lim_{t→∞} A_t.

Lemma 3.2 Suppose A_t has increasing paths that are right continuous with left limits, A_t ≤ K a.s. for each t, and let B_t be its compensator. Then E[B_∞²] ≤ 2K².

Proof. If M_t = A_t − B_t, then M_t is a martingale, and then E[M_∞ − M_t | F_t] = 0. We then write

E[B_∞²] = 2E ∫_0^∞ (B_∞ − B_t) dB_t = 2E ∫_0^∞ E[B_∞ − B_t | F_t] dB_t = 2E ∫_0^∞ E[A_∞ − A_t | F_t] dB_t ≤ 2K E ∫_0^∞ dB_t = 2K E[B_∞] = 2K E[A_∞] ≤ 2K².

From the lemma we get the following corollary.

Corollary 3.3 If A_t = B_t − C_t, where B_t and C_t are increasing right continuous processes with B_0 = C_0 = 0, a.s., and in addition B and C are bounded, then

E[sup_t Ã_t²] < ∞.
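As a concrete illustration of Lemma 3.2 (not in the notes): take A_t = 1_{(t ≥ τ)} with τ exponential with parameter λ. Then A is increasing and bounded by K = 1, and its compensator is the standard one B_t = λ(t ∧ τ), so B_∞ = λτ and E[B_∞²] = λ² E[τ²] = 2 = 2K²; the bound of the lemma is attained. A quick Monte Carlo check:

```python
import numpy as np

rng = np.random.default_rng(0)
lam = 2.0

# tau ~ Exp(lam); A_t = 1_{t >= tau} has compensator B_t = lam * min(t, tau),
# so B_inf = lam * tau, and lam * tau is a standard exponential.
tau = rng.exponential(1.0 / lam, size=200_000)
b_inf = lam * tau

# E[B_inf^2] = E[(lam * tau)^2] = 2, matching the 2K^2 bound with K = 1.
second_moment = float(np.mean(b_inf ** 2))
```

That the bound is achieved for a single exponential jump suggests the constant 2 in Lemma 3.2 cannot be improved.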