Chapter 1

Poisson processes

1.1 Definitions

Let $(\Omega, \mathcal{F}, \mathbb{P})$ be a probability space. A filtration is a collection of $\sigma$-fields $\mathcal{F}_t$ contained in $\mathcal{F}$ such that $\mathcal{F}_s \subset \mathcal{F}_t$ whenever $s < t$. A filtration satisfies the usual conditions if it is complete: $N \in \mathcal{F}_t$ for all $t$ whenever $\mathbb{P}(N) = 0$; and it is right continuous: $\mathcal{F}_{t+} = \mathcal{F}_t$ for all $t$, where $\mathcal{F}_{t+} = \bigcap_{\varepsilon > 0} \mathcal{F}_{t+\varepsilon}$.

Definition 1.1. Let $\{\mathcal{F}_t\}$ be a filtration, not necessarily satisfying the usual conditions. A Poisson process with parameter $\lambda > 0$ is a stochastic process $X$ satisfying the following properties:
(1) $X_0 = 0$, a.s.
(2) The paths of $X_t$ are right continuous with left limits.
(3) If $s < t$, then $X_t - X_s$ is a Poisson random variable with parameter $\lambda(t-s)$.
(4) If $s < t$, then $X_t - X_s$ is independent of $\mathcal{F}_s$.

Define $X_{t-} = \lim_{s \to t,\, s < t} X_s$, the left hand limit at time $t$, and $\Delta X_t = X_t - X_{t-}$, the size of the jump at time $t$. We say a function $f$ is increasing if $s < t$ implies $f(s) \le f(t)$. We use strictly increasing when $s < t$ implies $f(s) < f(t)$.
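Definition 1.1 can be probed numerically. The sketch below (Python with NumPy; the function names and the values $\lambda = 1.5$, $s = 1$, $t = 3$ are our own illustrative choices, not part of the text) builds a path from i.i.d. exponential interarrival times, anticipating the construction (1.8) below, and checks that the increment $X_t - X_s$ has mean and variance $\lambda(t-s)$, as a Poisson random variable must.

```python
import numpy as np

rng = np.random.default_rng(0)

def poisson_path(lam, horizon, rng):
    """Return the sorted jump times of a simulated Poisson process on [0, horizon]."""
    times = []
    t = rng.exponential(1.0 / lam)
    while t <= horizon:
        times.append(t)
        t += rng.exponential(1.0 / lam)
    return np.array(times)

def value_at(jump_times, t):
    """X_t = number of jumps in [0, t]; the path is right continuous."""
    return int(np.searchsorted(jump_times, t, side="right"))

lam, s, t = 1.5, 1.0, 3.0
incs = []
for _ in range(20000):
    jumps = poisson_path(lam, t, rng)
    incs.append(value_at(jumps, t) - value_at(jumps, s))
incs = np.array(incs)
# Definition 1.1(3): X_t - X_s ~ Poisson(lam*(t-s)), so mean == variance == 3.0.
print(incs.mean(), incs.var(), lam * (t - s))
```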

We have the following proposition.

Proposition 1.2. Let $X$ be a Poisson process. With probability one, the paths of $X_t$ are increasing and are constant except for jumps of size 1. There are only finitely many jumps in each finite time interval.

Proof. For any fixed $s < t$, we have that $X_t - X_s$ has the distribution of a Poisson random variable with parameter $\lambda(t-s)$, hence is non-negative, a.s.; let $N_{s,t}$ be the null set of $\omega$'s where $X_t(\omega) < X_s(\omega)$. The set of pairs $(s,t)$ with $s$ and $t$ rational is countable, and so $N = \bigcup_{s,t \in \mathbb{Q}^+} N_{s,t}$ is also a null set, where we write $\mathbb{Q}^+$ for the non-negative rationals. For $\omega \notin N$, $X_t \ge X_s$ whenever $s < t$ are rational. In view of the right continuity of the paths of $X$, this shows the paths of $X$ are increasing with probability one.

Similarly, since Poisson random variables only take values in the non-negative integers, $X_t$ is a non-negative integer, a.s. Using this fact for every rational $t$ shows that with probability one, $X_t$ takes values only in the non-negative integers when $t$ is rational, and the right continuity of the paths implies this is also the case for all $t$. Since the paths are increasing, integer valued, and have left limits, there can only be finitely many jumps in each finite time interval.

It remains to prove that $\Delta X_t$ is either 0 or 1 for all $t$. Let $t_0 > 0$. If there were a jump of size 2 or larger at some time $t$ strictly less than $t_0$, then for each $n$ sufficiently large there exists $0 \le k_n \le 2^n$ such that $X_{(k_n+1)t_0/2^n} - X_{k_n t_0/2^n} \ge 2$. Therefore

$$\begin{aligned}
\mathbb{P}(\exists s < t_0 : \Delta X_s \ge 2) &\le \mathbb{P}\big(\exists k \le 2^n : X_{(k+1)t_0/2^n} - X_{k t_0/2^n} \ge 2\big) \\
&\le 2^n \sup_{k \le 2^n} \mathbb{P}\big(X_{(k+1)t_0/2^n} - X_{k t_0/2^n} \ge 2\big) \\
&= 2^n\, \mathbb{P}(X_{t_0/2^n} \ge 2) \\
&\le 2^n \big(1 - \mathbb{P}(X_{t_0/2^n} = 0) - \mathbb{P}(X_{t_0/2^n} = 1)\big) \\
&= 2^n \big(1 - e^{-\lambda t_0/2^n} - (\lambda t_0/2^n)\, e^{-\lambda t_0/2^n}\big).
\end{aligned} \tag{1.1}$$

We used Definition 1.1(3) for the two equalities. By l'Hôpital's rule, $(1 - e^{-x} - x e^{-x})/x \to 0$ as $x \to 0$. We apply this with $x = \lambda t_0/2^n$ and see that the last line of (1.1) tends to 0 as $n \to \infty$. Since the left hand side of (1.1) does not depend on $n$, it must be 0. This holds for each $t_0$.
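To see how fast the last line of (1.1) vanishes, note that $1 - e^{-x} - x e^{-x} \sim x^2/2$ as $x \to 0$, so the bound decays like $\lambda^2 t_0^2 / 2^{n+1}$. A quick numerical check (with the illustrative values $\lambda = t_0 = 1$):

```python
import math

lam, t0 = 1.0, 1.0
for n in range(2, 22, 4):
    x = lam * t0 / 2**n
    bound = 2**n * (1 - math.exp(-x) - x * math.exp(-x))
    # 1 - e^{-x} - x e^{-x} ~ x^2/2, so the bound ~ lam^2 * t0^2 / 2^(n+1)
    print(n, bound, lam**2 * t0**2 / 2 ** (n + 1))
```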

1.2 Stopping times

Throughout this section we suppose we have a filtration $\{\mathcal{F}_t\}$ satisfying the usual conditions.

Definition 1.3. A random variable $T : \Omega \to [0, \infty]$ is a stopping time if $(T < t) \in \mathcal{F}_t$ for all $t$. We say $T$ is a finite stopping time if $T < \infty$, a.s. We say $T$ is a bounded stopping time if there exists $K \in [0, \infty)$ such that $T \le K$, a.s. Note that $T$ can take the value infinity. Stopping times are also known as optional times.

Given a stochastic process $X$, we define $X_T(\omega)$ to be equal to $X(T(\omega), \omega)$; that is, for each $\omega$ we evaluate $t = T(\omega)$ and then look at $X(\cdot, \omega)$ at this time.

Proposition 1.4. Suppose $\{\mathcal{F}_t\}$ satisfies the usual conditions. Then
(1) $T$ is a stopping time if and only if $(T \le t) \in \mathcal{F}_t$ for all $t$.
(2) If $T = t$, a.s., then $T$ is a stopping time.
(3) If $S$ and $T$ are stopping times, then so are $S \vee T$ and $S \wedge T$.
(4) If $T_n$, $n = 1, 2, \ldots$, are stopping times with $T_1 \le T_2 \le \cdots$, then so is $\sup_n T_n$.
(5) If $T_n$, $n = 1, 2, \ldots$, are stopping times with $T_1 \ge T_2 \ge \cdots$, then so is $\inf_n T_n$.
(6) If $s \ge 0$ and $S$ is a stopping time, then so is $S + s$.

Proof. We will just prove part of (1), leaving the rest as an exercise. Note $(T \le t) = \bigcap_{n \ge N} (T < t + 1/n) \in \mathcal{F}_{t+1/N}$ for each $N$. Thus $(T \le t) \in \bigcap_N \mathcal{F}_{t+1/N} \subset \mathcal{F}_{t+} = \mathcal{F}_t$.

It is often useful to be able to approximate stopping times from the right. If $T$ is a finite stopping time, that is, $T < \infty$, a.s., define

$$T_n(\omega) = (k+1)/2^n \quad \text{if } k/2^n \le T(\omega) < (k+1)/2^n. \tag{1.2}$$
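The approximation (1.2) is concrete enough to compute. A minimal sketch (the function name is ours): $T_n$ takes values in the dyadic grid, always lies strictly to the right of $T$, and decreases to $T$ as $n \to \infty$.

```python
import math

def dyadic_approx(T, n):
    """Return T_n as in (1.2): (k+1)/2^n where k/2^n <= T < (k+1)/2^n."""
    k = math.floor(T * 2**n)
    return (k + 1) / 2**n

T = 0.7310
for n in range(1, 8):
    print(n, dyadic_approx(T, n))
    # 1.0, 0.75, 0.75, 0.75, 0.75, 0.734375, 0.734375 -- decreasing to T from above
```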

Define

$$\mathcal{F}_T = \{A \in \mathcal{F} : \text{for each } t > 0,\ A \cap (T \le t) \in \mathcal{F}_t\}. \tag{1.3}$$

This definition of $\mathcal{F}_T$, which is supposed to be the collection of events that are known by time $T$, is not very intuitive, but it turns out that this definition works well in applications.

Proposition 1.5. Suppose $\{\mathcal{F}_t\}$ is a filtration satisfying the usual conditions. Then
(1) $\mathcal{F}_T$ is a $\sigma$-field.
(2) If $S \le T$, then $\mathcal{F}_S \subset \mathcal{F}_T$.
(3) If $\mathcal{F}_{T+} = \bigcap_{\varepsilon > 0} \mathcal{F}_{T+\varepsilon}$, then $\mathcal{F}_{T+} = \mathcal{F}_T$.
(4) If $X_t$ has right continuous paths, then $X_T$ is $\mathcal{F}_T$-measurable.

Proof. If $A \in \mathcal{F}_T$, then $A^c \cap (T \le t) = (T \le t) \setminus [A \cap (T \le t)] \in \mathcal{F}_t$, so $A^c \in \mathcal{F}_T$. The rest of the proof of (1) is easy.

Suppose $A \in \mathcal{F}_S$ and $S \le T$. Then $A \cap (T \le t) = [A \cap (S \le t)] \cap (T \le t)$. We have $A \cap (S \le t) \in \mathcal{F}_t$ because $A \in \mathcal{F}_S$, while $(T \le t) \in \mathcal{F}_t$ because $T$ is a stopping time. Therefore $A \cap (T \le t) \in \mathcal{F}_t$, which proves (2).

For (3), if $A \in \mathcal{F}_{T+}$, then $A \in \mathcal{F}_{T+\varepsilon}$ for every $\varepsilon$, and so $A \cap (T + \varepsilon \le t) \in \mathcal{F}_t$ for all $t$. Hence $A \cap (T \le t - \varepsilon) \in \mathcal{F}_t$ for all $t$, or equivalently $A \cap (T \le t) \in \mathcal{F}_{t+\varepsilon}$ for all $t$. This is true for all $\varepsilon$, so $A \cap (T \le t) \in \mathcal{F}_{t+} = \mathcal{F}_t$. This says $A \in \mathcal{F}_T$.

(4) Define $T_n$ by (1.2). Note

$$(X_{T_n} \in B) \cap (T_n = k/2^n) = (X_{k/2^n} \in B) \cap (T_n = k/2^n) \in \mathcal{F}_{k/2^n}.$$

Since $T_n$ only takes values in $\{k/2^n : k \ge 0\}$, we conclude $(X_{T_n} \in B) \cap (T_n \le t) \in \mathcal{F}_t$, and so $(X_{T_n} \in B) \in \mathcal{F}_{T_n} \subset \mathcal{F}_{T+1/2^n}$. Hence $X_{T_n}$ is $\mathcal{F}_{T+1/2^n}$ measurable. If $n \ge m$, then $X_{T_n}$ is measurable with respect to $\mathcal{F}_{T+1/2^n} \subset \mathcal{F}_{T+1/2^m}$. Since $X_{T_n} \to X_T$, then $X_T$ is $\mathcal{F}_{T+1/2^m}$ measurable for each $m$. Therefore $X_T$ is measurable with respect to $\mathcal{F}_{T+} = \mathcal{F}_T$.

1.3 Markov properties

Let us begin with the Markov property.

Theorem 1.6. Let $\{\mathcal{F}_t\}$ be a filtration, not necessarily satisfying the usual conditions, and let $P$ be a Poisson process with respect to $\{\mathcal{F}_t\}$. If $u$ is a fixed time, then $Y_t = P_{t+u} - P_u$ is a Poisson process independent of $\mathcal{F}_u$.

Proof. Let $\mathcal{G}_t = \mathcal{F}_{t+u}$. It is clear that $Y$ has right continuous paths, is zero at time 0, has jumps of size one, and is adapted to $\{\mathcal{G}_t\}$. Since $Y_t - Y_s = P_{t+u} - P_{s+u}$, then $Y_t - Y_s$ is a Poisson random variable with mean $\lambda(t-s)$ that is independent of $\mathcal{F}_{s+u} = \mathcal{G}_s$.

The strong Markov property is the Markov property extended by replacing fixed times $u$ by finite stopping times.

Theorem 1.7. Let $\{\mathcal{F}_t\}$ be a filtration, not necessarily satisfying the usual conditions, and let $P$ be a Poisson process adapted to $\{\mathcal{F}_t\}$. If $T$ is a finite stopping time, then $Y_t = P_{T+t} - P_T$ is a Poisson process independent of $\mathcal{F}_T$.

Proof. We will first show that whenever $m \ge 1$, $t_1 < \cdots < t_m$, $f$ is a bounded continuous function on $\mathbb{R}^m$, and $A \in \mathcal{F}_T$, then

$$\mathbb{E}[f(Y_{t_1}, \ldots, Y_{t_m}); A] = \mathbb{E}[f(P_{t_1}, \ldots, P_{t_m})]\, \mathbb{P}(A). \tag{1.4}$$

Once we have done this, we will show how (1.4) implies our theorem.

To prove (1.4), define $T_n$ by (1.2). We have

$$\begin{aligned}
\mathbb{E}[f(P_{T_n+t_1} - P_{T_n}, \ldots, P_{T_n+t_m} - P_{T_n}); A]
&= \sum_{k=1}^\infty \mathbb{E}[f(P_{T_n+t_1} - P_{T_n}, \ldots, P_{T_n+t_m} - P_{T_n}); A, T_n = k/2^n] \\
&= \sum_{k=1}^\infty \mathbb{E}[f(P_{t_1+k/2^n} - P_{k/2^n}, \ldots, P_{t_m+k/2^n} - P_{k/2^n}); A, T_n = k/2^n].
\end{aligned} \tag{1.5}$$

Following the usual practice in probability that "," means "and", we use $\mathbb{E}[\,\cdot\,; A, T_n = k/2^n]$ as an abbreviation for $\mathbb{E}[\,\cdot\,; A \cap (T_n = k/2^n)]$.

Since $A \in \mathcal{F}_T$, then $A \cap (T_n = k/2^n) = A \cap ((T < k/2^n) \setminus (T < (k-1)/2^n)) \in \mathcal{F}_{k/2^n}$. We use the independent increments property of the Poisson process and

the fact that $P_t - P_s$ has the same law as $P_{t-s}$ to see that the sum in the last line of (1.5) is equal to

$$\begin{aligned}
\sum_{k=1}^\infty \mathbb{E}[f(P_{t_1+k/2^n} - P_{k/2^n}, \ldots, P_{t_m+k/2^n} - P_{k/2^n})]\, \mathbb{P}(A, T_n = k/2^n)
&= \sum_{k=1}^\infty \mathbb{E}[f(P_{t_1}, \ldots, P_{t_m})]\, \mathbb{P}(A, T_n = k/2^n) \\
&= \mathbb{E}[f(P_{t_1}, \ldots, P_{t_m})]\, \mathbb{P}(A),
\end{aligned}$$

which is the right hand side of (1.4). Thus

$$\mathbb{E}[f(P_{T_n+t_1} - P_{T_n}, \ldots, P_{T_n+t_m} - P_{T_n}); A] = \mathbb{E}[f(P_{t_1}, \ldots, P_{t_m})]\, \mathbb{P}(A). \tag{1.6}$$

Now let $n \to \infty$. By the right continuity of the paths of $P$, the boundedness and continuity of $f$, and the dominated convergence theorem, the left hand side of (1.6) converges to the left hand side of (1.4).

If we take $A = \Omega$ in (1.4), we obtain

$$\mathbb{E}[f(Y_{t_1}, \ldots, Y_{t_m})] = \mathbb{E}[f(P_{t_1}, \ldots, P_{t_m})]$$

whenever $m \ge 1$, $t_1, \ldots, t_m \in [0, \infty)$, and $f$ is a bounded continuous function on $\mathbb{R}^m$. This implies that the finite dimensional distributions of $Y$ and $P$ are the same. Since $Y$ has right continuous paths, $Y$ is a Poisson process.

Next take $A \in \mathcal{F}_T$. By a limit argument, (1.4) holds whenever $f$ is the indicator of a Borel subset $B$ of $\mathbb{R}^m$, or in other words,

$$\mathbb{P}(Y \in B, A) = \mathbb{P}(Y \in B)\, \mathbb{P}(A) \tag{1.7}$$

whenever $B$ is a cylindrical set. When we discuss the Skorokhod topology, we will be able to be more precise about the independence argument.

Observe that what was needed for the above proof to work is not that $P$ be a Poisson process, but that the process $P$ have right continuous paths and that $P_t - P_s$ be independent of $\mathcal{F}_s$ and have the same distribution as $P_{t-s}$.
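Theorem 1.7 lends itself to a simulation check. In the hedged sketch below (all parameter choices are ours), $T$ is the time of the $m$th jump, which is a finite stopping time, and we verify that the restarted increment $Y_u = P_{T+u} - P_T$ has mean $\lambda u$, consistent with $Y$ being again a Poisson process with the same parameter.

```python
import numpy as np

rng = np.random.default_rng(1)
lam, m, u = 2.0, 5, 1.0     # rate, level defining T, window length after T
vals = []
for _ in range(20000):
    # T = time of the m-th jump; 50 extra interarrivals comfortably cover (T, T+u].
    jumps = np.cumsum(rng.exponential(1.0 / lam, size=m + 50))
    T = jumps[m - 1]
    # Y_u = P_{T+u} - P_T = number of jumps in (T, T+u].
    vals.append(np.sum((jumps > T) & (jumps <= T + u)))
print(np.mean(vals), lam * u)   # both close to 2.0
```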

We therefore have the following corollary.

Corollary 1.8. Let $\{\mathcal{F}_t\}$ be a filtration, not necessarily satisfying the usual conditions, and let $X$ be a process adapted to $\{\mathcal{F}_t\}$. Suppose $X$ has paths that are right continuous with left limits, and suppose $X_t - X_s$ is independent of $\mathcal{F}_s$ and has the same law as $X_{t-s}$ whenever $s < t$. If $T$ is a finite stopping time, then $Y_t = X_{T+t} - X_T$ is a process that is independent of $\mathcal{F}_T$, and $X$ and $Y$ have the same law.

1.4 A characterization

Another characterization of the Poisson process is as follows. Let $T_1 = \inf\{t : X_t = 1\}$, the time of the first jump. Define $T_{i+1} = \inf\{t > T_i : \Delta X_t = 1\}$, so that $T_i$ is the time of the $i$th jump.

Proposition 1.9. The random variables $T_1, T_2 - T_1, \ldots, T_{i+1} - T_i, \ldots$ are independent exponential random variables with parameter $\lambda$.

Proof. In view of Corollary 1.8 it suffices to show that $T_1$ is an exponential random variable with parameter $\lambda$. If $T_1 > t$, then the first jump has not occurred by time $t$, so $X_t$ is still zero. Hence

$$\mathbb{P}(T_1 > t) = \mathbb{P}(X_t = 0) = e^{-\lambda t},$$

using the fact that $X_t$ is a Poisson random variable with parameter $\lambda t$.

We can reverse the characterization in Proposition 1.9 to construct a Poisson process. We do one step of the construction, leaving the rest as an exercise. Let $U_1, U_2, \ldots$ be independent exponential random variables with parameter $\lambda$ and let $T_j = \sum_{i=1}^j U_i$. Define

$$X_t(\omega) = k \quad \text{if } T_k(\omega) \le t < T_{k+1}(\omega). \tag{1.8}$$

An examination of the densities shows that an exponential random variable has a gamma distribution with parameters $\lambda$ and $r = 1$, so $T_j$ is a gamma random variable with parameters $\lambda$ and $j$. Thus

$$\mathbb{P}(X_t < k) = \mathbb{P}(T_k > t) = \int_t^\infty \lambda e^{-\lambda x}\, \frac{(\lambda x)^{k-1}}{\Gamma(k)}\, dx.$$

Performing the integration by parts repeatedly shows that

$$\mathbb{P}(X_t < k) = \sum_{i=0}^{k-1} e^{-\lambda t}\, \frac{(\lambda t)^i}{i!},$$

and so $X_t$ is a Poisson random variable with parameter $\lambda t$.
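The construction (1.8) and the resulting Poisson marginal can be checked directly. In the sketch below (the values of $\lambda$ and $t$ are illustrative choices of ours), we build $X_t$ from partial sums of exponentials and compare the empirical distribution with $e^{-\lambda t}(\lambda t)^k/k!$.

```python
import math
import numpy as np

rng = np.random.default_rng(2)
lam, t, n_paths = 1.5, 2.0, 100000

# X_t = k on {T_k <= t < T_{k+1}}, with T_j a sum of j i.i.d. exponential(lam)'s.
U = rng.exponential(1.0 / lam, size=(n_paths, 30))   # 30 arrivals amply cover [0, t]
T = np.cumsum(U, axis=1)
X_t = np.sum(T <= t, axis=1)

for k in range(6):
    empirical = np.mean(X_t == k)
    exact = math.exp(-lam * t) * (lam * t) ** k / math.factorial(k)
    print(k, round(empirical, 4), round(exact, 4))   # columns nearly agree
```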

We will use the following proposition later.

Proposition 1.10. Let $\{\mathcal{F}_t\}$ be a filtration satisfying the usual conditions. Suppose $X_0 = 0$, a.s., $X$ has paths that are right continuous with left limits, $X_t - X_s$ is independent of $\mathcal{F}_s$ if $s < t$, and $X_t - X_s$ has the same law as $X_{t-s}$ whenever $s < t$. If the paths of $X$ are piecewise constant and increasing, all the jumps of $X$ are of size 1, and $X$ is not identically 0, then $X$ is a Poisson process.

Proof. Let $T_0 = 0$ and $T_{i+1} = \inf\{t > T_i : \Delta X_t = 1\}$ for $i = 0, 1, 2, \ldots$ We will show that if we set $U_i = T_i - T_{i-1}$, then the $U_i$'s are i.i.d. exponential random variables. By Corollary 1.8, the $U_i$'s are independent and have the same law. Hence it suffices to show $U_1$ is an exponential random variable. We observe

$$\begin{aligned}
\mathbb{P}(U_1 > s + t) &= \mathbb{P}(X_{s+t} = 0) = \mathbb{P}(X_{s+t} - X_s = 0,\ X_s = 0) \\
&= \mathbb{P}(X_{t+s} - X_s = 0)\, \mathbb{P}(X_s = 0) = \mathbb{P}(X_t = 0)\, \mathbb{P}(X_s = 0) \\
&= \mathbb{P}(U_1 > t)\, \mathbb{P}(U_1 > s).
\end{aligned}$$

Setting $f(t) = \mathbb{P}(U_1 > t)$, we thus have $f(t+s) = f(t) f(s)$. Since $f(t)$ is decreasing and $0 < f(t) < 1$, we conclude $\mathbb{P}(U_1 > t) = f(t) = e^{-\lambda t}$ for some $\lambda > 0$; that is, $U_1$ is an exponential random variable.
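The functional equation $f(t+s) = f(t)f(s)$ in this proof is the memoryless property of the exponential distribution; a quick numerical illustration (parameter values ours):

```python
import numpy as np

rng = np.random.default_rng(5)
U = rng.exponential(1.0 / 1.5, size=500000)   # exponential with parameter lam = 1.5
s, t = 0.4, 0.9
print(np.mean(U > s + t))                # f(s + t) = e^{-1.5 * 1.3}
print(np.mean(U > s) * np.mean(U > t))   # f(s) * f(t) -- nearly equal
```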

1.5 Martingales

We now define continuous time martingales. Let $\{\mathcal{F}_t\}$ be a filtration, not necessarily satisfying the usual conditions.

Definition 1.11. $M_t$ is a continuous time martingale with respect to the filtration $\{\mathcal{F}_t\}$ and the probability measure $\mathbb{P}$ if
(1) $\mathbb{E}|M_t| < \infty$ for each $t$;
(2) $M_t$ is $\mathcal{F}_t$ measurable for each $t$;
(3) $\mathbb{E}[M_t \mid \mathcal{F}_s] = M_s$, a.s., if $s < t$.

Part (2) of the definition can be rephrased as saying $M_t$ is adapted to $\mathcal{F}_t$. If in part (3) "$=$" is replaced by "$\ge$", then $M_t$ is a submartingale, and if it is replaced by "$\le$", then we have a supermartingale.

Taking expectations in Definition 1.11(3), we see that if $s < t$, then $\mathbb{E} M_s \le \mathbb{E} M_t$ if $M$ is a submartingale and $\mathbb{E} M_s \ge \mathbb{E} M_t$ if $M$ is a supermartingale. Thus submartingales tend to increase, on average, and supermartingales tend to decrease, on average.

If $P_t$ is a Poisson process with parameter $\lambda$, then $P_t - \lambda t$ is a continuous time martingale. To see this, for $s < t$,

$$\begin{aligned}
\mathbb{E}[P_t - \lambda t \mid \mathcal{F}_s] &= \mathbb{E}[P_t - P_s \mid \mathcal{F}_s] - \lambda t + P_s \\
&= \mathbb{E}[P_t - P_s] - \lambda t + P_s \\
&= \lambda(t-s) - \lambda t + P_s = P_s - \lambda s.
\end{aligned}$$
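The computation above says that, given $\mathcal{F}_s$, the best prediction of $P_t - \lambda t$ is $P_s - \lambda s$. A quick Monte Carlo illustration (our own sketch, using the independent increments of Definition 1.1 to simulate the pair $(P_s, P_t)$ directly):

```python
import numpy as np

rng = np.random.default_rng(3)
lam, s, t, n = 2.0, 1.0, 3.0, 200000

# Simulate (P_s, P_t) via independent increments: P_t = P_s + Poisson(lam*(t-s)).
P_s = rng.poisson(lam * s, size=n)
P_t = P_s + rng.poisson(lam * (t - s), size=n)

# E[P_t - lam*t | F_s] should equal P_s - lam*s: check within each level of P_s.
for j in range(5):
    sel = P_s == j
    print(j, np.mean(P_t[sel] - lam * t), j - lam * s)   # last two columns agree
```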

We give another example of a martingale.

Example 1.12. Recall that given a filtration $\{\mathcal{F}_t\}$, each $\mathcal{F}_t$ is contained in $\mathcal{F}$, where $(\Omega, \mathcal{F}, \mathbb{P})$ is our probability space. Let $X$ be an integrable $\mathcal{F}$ measurable random variable, and let $M_t = \mathbb{E}[X \mid \mathcal{F}_t]$. Then

$$\mathbb{E}[M_t \mid \mathcal{F}_s] = \mathbb{E}[\mathbb{E}[X \mid \mathcal{F}_t] \mid \mathcal{F}_s] = \mathbb{E}[X \mid \mathcal{F}_s] = M_s,$$

and $M$ is a martingale.

We derive the analogs of Doob's inequalities in the stochastic process context.

Theorem 1.13. Suppose $M_t$ is a martingale or non-negative submartingale with paths that are right continuous with left limits. Then
(1) $\mathbb{P}(\sup_{s \le t} |M_s| \ge a) \le \mathbb{E}|M_t|/a$ for each $a > 0$.
(2) If $1 < p < \infty$, then

$$\mathbb{E}\big[\sup_{s \le t} |M_s|^p\big] \le \Big(\frac{p}{p-1}\Big)^p\, \mathbb{E}|M_t|^p.$$

Proof. We will do the case where $M_t$ is a martingale, the submartingale case being nearly identical. Let $D_n = \{kt/2^n : 0 \le k \le 2^n\}$. If we set $N_k^{(n)} = M_{kt/2^n}$ and $\mathcal{G}_k^{(n)} = \mathcal{F}_{kt/2^n}$, it is clear that $\{N_k^{(n)}\}$ is a discrete time martingale with respect to $\{\mathcal{G}_k^{(n)}\}$.

Let $A_n = \{\sup_{s \le t,\, s \in D_n} |M_s| > a\}$. By Doob's inequality for discrete time martingales,

$$\mathbb{P}(A_n) = \mathbb{P}\big(\max_{k \le 2^n} |N_k^{(n)}| > a\big) \le \mathbb{E}|N_{2^n}^{(n)}|/a = \mathbb{E}|M_t|/a.$$

Note that the $A_n$ are increasing, and since $M_t$ is right continuous,

$$\bigcup_n A_n = \{\sup_{s \le t} |M_s| > a\}.$$

Then

$$\mathbb{P}\big(\sup_{s \le t} |M_s| > a\big) = \mathbb{P}\big(\bigcup_n A_n\big) = \lim_{n \to \infty} \mathbb{P}(A_n) \le \mathbb{E}|M_t|/a.$$

If we apply this with $a$ replaced by $a - \varepsilon$ and let $\varepsilon \to 0$, we obtain (1).

The proof of (2) is similar. By Doob's inequality for discrete time martingales,

$$\mathbb{E}\big[\sup_{k \le 2^n} |N_k^{(n)}|^p\big] \le \Big(\frac{p}{p-1}\Big)^p\, \mathbb{E}|N_{2^n}^{(n)}|^p = \Big(\frac{p}{p-1}\Big)^p\, \mathbb{E}|M_t|^p.$$

Since $\sup_{k \le 2^n} |N_k^{(n)}|^p$ increases to $\sup_{s \le t} |M_s|^p$ by the right continuity of $M$, (2) follows by Fatou's lemma.

Here is an example. If $P_t$ is a Poisson process with parameter $\lambda$, then $P_t - \lambda t$ is a martingale, so $e^{a(P_t - \lambda t)}$ is a submartingale for any real number $a$. Then, for $a > 0$ and $A > 0$,

$$\mathbb{P}\big(\sup_{s \le t} (P_s - \lambda s) \ge A\big) = \mathbb{P}\big(\sup_{s \le t} e^{a(P_s - \lambda s)} \ge e^{aA}\big) \le e^{-aA}\, \mathbb{E}\, e^{aP_t}\, e^{-a\lambda t}.$$

We know $\mathbb{E}\, e^{aP_t} = \exp(\lambda(e^a - 1)t)$. We substitute this in the above and then optimize over $a$.
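Carrying out that optimization explicitly (the text leaves this step to the reader; the algebra below is a routine completion, not part of the original): substituting gives

$$\mathbb{P}\big(\sup_{s \le t}(P_s - \lambda s) \ge A\big) \le \exp\big(\lambda t (e^a - 1 - a) - aA\big), \qquad a > 0.$$

Differentiating the exponent in $a$ gives the first-order condition $\lambda t(e^a - 1) = A$, so the minimizing choice is $a = \log(1 + A/\lambda t)$, and substituting back yields

$$\mathbb{P}\big(\sup_{s \le t}(P_s - \lambda s) \ge A\big) \le \exp\Big(A - (\lambda t + A)\log\Big(1 + \frac{A}{\lambda t}\Big)\Big),$$

a Chernoff-type bound for the running maximum of the compensated Poisson process.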

We will need Doob's optional stopping theorem for continuous time martingales.

Theorem 1.14. Let $\{\mathcal{F}_t\}$ be a filtration satisfying the usual conditions. If $M_t$ is a martingale or non-negative submartingale whose paths are right continuous, $\sup_t \mathbb{E} M_t^2 < \infty$, and $T$ is a finite stopping time, then $\mathbb{E} M_T \ge \mathbb{E} M_0$.

Proof. We do the submartingale case, the martingale case being very similar. By Doob's inequality (Theorem 1.13(2) with $p = 2$),

$$\mathbb{E}\big[\sup_{s \le t} M_s^2\big] \le 4\, \mathbb{E} M_t^2.$$

Letting $t \to \infty$, we have $\mathbb{E}[\sup_t M_t^2] < \infty$ by Fatou's lemma.

Let us first suppose that $T < K$, a.s., for some real number $K$. Define $T_n$ by (1.2). Let $N_k^{(n)} = M_{k/2^n}$, $\mathcal{G}_k^{(n)} = \mathcal{F}_{k/2^n}$, and $S_n = 2^n T_n$. By Doob's optional stopping theorem applied to the submartingale $N_k^{(n)}$, we have

$$\mathbb{E} M_0 = \mathbb{E} N_0^{(n)} \le \mathbb{E} N_{S_n}^{(n)} = \mathbb{E} M_{T_n}.$$

Since $M$ is right continuous, $M_{T_n} \to M_T$, a.s. The random variables $|M_{T_n}|$ are bounded by $1 + \sup_t M_t^2$, so by dominated convergence, $\mathbb{E} M_{T_n} \to \mathbb{E} M_T$.

For general finite $T$, we apply the above to the stopping time $T \wedge K$ to get $\mathbb{E} M_{T \wedge K} \ge \mathbb{E} M_0$. The random variables $|M_{T \wedge K}|$ are bounded by $1 + \sup_t M_t^2$, so by dominated convergence, we get $\mathbb{E} M_T \ge \mathbb{E} M_0$ when we let $K \to \infty$.

We present the continuous time version of Doob's martingale convergence theorem. We will see that not only do we get limits as $t \to \infty$, but also a regularity result. Let $D_n = \{k/2^n : k \ge 0\}$ and $D = \bigcup_n D_n$.

Theorem 1.15. Let $\{M_t : t \in D\}$ be either a martingale, a submartingale, or a supermartingale with respect to $\{\mathcal{F}_t : t \in D\}$ and suppose $\sup_{t \in D} \mathbb{E}|M_t| < \infty$. Then

(1) $\lim_{t \to \infty} M_t$ exists, a.s.
(2) With probability one, $M_t$ has left and right limits along $D$.

The second conclusion says that except for a null set, if $t_0 \in [0, \infty)$, then both $\lim_{t \in D,\, t \uparrow t_0} M_t$ and $\lim_{t \in D,\, t \downarrow t_0} M_t$ exist and are finite. The null set does not depend on $t_0$.

Proof. Martingales are also submartingales, and if $M_t$ is a supermartingale, then $-M_t$ is a submartingale, so we may without loss of generality restrict our attention to submartingales. By Doob's inequality,

$$\mathbb{P}\big(\sup_{t \in D_n,\, t \le n} |M_t| > a\big) \le \frac{1}{a}\, \mathbb{E}|M_n|.$$

Letting $n \to \infty$ and using Fatou's lemma,

$$\mathbb{P}\big(\sup_{t \in D} |M_t| > a\big) \le \frac{1}{a}\, \sup_t \mathbb{E}|M_t|.$$

This is true for all $a$, so with probability one, $\{M_t : t \in D\}$ is a bounded set.

Therefore the only way either (1) or (2) can fail is if for some pair of rationals $a < b$ the number of upcrossings of $[a, b]$ by $\{M_t : t \in D\}$ is infinite. Recall that we define upcrossings as follows. Given an interval $[a, b]$ and a submartingale $M$, if $S_1 = \inf\{t : M_t \le a\}$, $T_i = \inf\{t > S_i : M_t \ge b\}$, and $S_{i+1} = \inf\{t > T_i : M_t \le a\}$, then the number of upcrossings up to time $u$ is $\sup\{k : T_k \le u\}$.

Doob's upcrossing lemma tells us that if $V_n$ is the number of upcrossings by $\{M_t : t \in D_n \cap [0, n]\}$, then

$$\mathbb{E} V_n \le \frac{\mathbb{E}|M_n|}{b - a}.$$

Letting $n \to \infty$ and using Fatou's lemma, the number of upcrossings of $[a, b]$ by $\{M_t : t \in D\}$ has finite expectation, hence is finite, a.s. If $N_{a,b}$ is the null set where the number of upcrossings of $[a, b]$ by $\{M_t : t \in D\}$ is infinite and $N = \bigcup_{a < b,\, a, b \in \mathbb{Q}^+} N_{a,b}$, where $\mathbb{Q}^+$ is the collection of non-negative rationals, then $\mathbb{P}(N) = 0$. If $\omega \notin N$, then (1) and (2) hold.
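The upcrossing count in this proof translates directly into code. A minimal sketch (the function name is ours) that mirrors the definition of the times $S_i$ and $T_i$ for a finite sequence of values:

```python
def upcrossings(path, a, b):
    """Count upcrossings of [a, b] by the sequence `path`.

    Mirrors the proof: wait for the path to drop to a or below (time S_i),
    then for it to rise to b or above (time T_i); each such pair is one upcrossing.
    """
    count = 0
    below = False   # True once a value <= a has been seen since the last upcrossing
    for x in path:
        if not below and x <= a:
            below = True
        elif below and x >= b:
            count += 1
            below = False
    return count

print(upcrossings([0, 2, -1, 3, 0, 4], a=0, b=2))   # 3 upcrossings
```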

As a corollary we have the following.

Corollary 1.16. Let $\{\mathcal{F}_t\}$ be a filtration satisfying the usual conditions, and let $M_t$ be a martingale with respect to $\{\mathcal{F}_t\}$. Then $M$ has a version that is also a martingale and that in addition has paths that are right continuous with left limits.

Proof. Let $D$ be as in the above proof. For each integer $N \ge 1$, $\mathbb{E}|M_t| \le \mathbb{E}|M_N| < \infty$ for $t \le N$, since $|M_t|$ is a submartingale by the conditional expectation form of Jensen's inequality. Therefore $M_{t \wedge N}$ has left and right limits when taking limits along $t \in D$. Since $N$ is arbitrary, $M_t$ has left and right limits when taking limits along $t \in D$, except for a set of $\omega$'s that form a null set. Let

$$\widetilde{M}_t = \lim_{u \in D,\, u > t,\, u \to t} M_u.$$

It is clear that $\widetilde{M}$ has paths that are right continuous with left limits. Since $\mathcal{F}_{t+} = \mathcal{F}_t$ and $\widetilde{M}_t$ is $\mathcal{F}_{t+}$ measurable, then $\widetilde{M}_t$ is $\mathcal{F}_t$ measurable.

Let $N$ be fixed. We will show $\{M_t : t \le N\}$ is a uniformly integrable family of random variables. Let $\varepsilon > 0$. Since $M_N$ is integrable, there exists $\delta$ such that if $\mathbb{P}(A) < \delta$, then $\mathbb{E}[|M_N|; A] < \varepsilon$. If $L$ is large enough, $\mathbb{P}(|M_t| > L) \le \mathbb{E}|M_t|/L \le \mathbb{E}|M_N|/L < \delta$. Then

$$\mathbb{E}[|M_t|; |M_t| > L] \le \mathbb{E}[|M_N|; |M_t| > L] < \varepsilon,$$

since $|M_t|$ is a submartingale and $(|M_t| > L) \in \mathcal{F}_t$. Uniform integrability is proved.

Now let $t < N$. If $B \in \mathcal{F}_t$,

$$\mathbb{E}[\widetilde{M}_t; B] = \lim_{u \in D,\, u > t,\, u \to t} \mathbb{E}[M_u; B] = \mathbb{E}[M_t; B].$$

Here we used the Vitali convergence theorem and the fact that $M_t$ is a martingale. Since $\widetilde{M}_t$ is $\mathcal{F}_t$ measurable, this proves that $\widetilde{M}_t = M_t$, a.s. Since $N$ was arbitrary, we have this for all $t$. We have thus found a version of $M$ that has paths that are right continuous with left limits. That $\widetilde{M}_t$ is a martingale is easy.

The following technical result will be used in the next chapter. Recall that a function $f$ is increasing if $s < t$ implies $f(s) \le f(t)$. A process $A_t$ has increasing paths if the function $t \to A_t(\omega)$ is increasing for almost every $\omega$.

Proposition 1.17. Suppose $\{\mathcal{F}_t\}$ is a filtration satisfying the usual conditions and suppose $A_t$ is an adapted process with paths that are increasing and right continuous with left limits, such that $A_\infty = \lim_{t \to \infty} A_t$ exists, a.s. Suppose $X$ is a non-negative integrable random variable, and $M_t$ is a version of the martingale $\mathbb{E}[X \mid \mathcal{F}_t]$ which has paths that are right continuous with left limits. Suppose $\mathbb{E}[X A_\infty] < \infty$. Then

$$\mathbb{E} \int_0^\infty X\, dA_s = \mathbb{E} \int_0^\infty M_s\, dA_s. \tag{1.9}$$

Proof. First suppose $X$ and $A$ are bounded. Let $n \ge 1$ and let us write $\mathbb{E} \int_0^\infty X\, dA_s$ as

$$\sum_{k=1}^\infty \mathbb{E}\big[X (A_{k/2^n} - A_{(k-1)/2^n})\big].$$

Conditioning the $k$th summand on $\mathcal{F}_{k/2^n}$, this is equal to

$$\mathbb{E}\Big[\sum_{k=1}^\infty \mathbb{E}[X \mid \mathcal{F}_{k/2^n}]\,(A_{k/2^n} - A_{(k-1)/2^n})\Big].$$

Given $s$ and $n$, define $s_n$ to be that value of $k/2^n$ such that $(k-1)/2^n < s \le k/2^n$. We then have

$$\mathbb{E} \int_0^\infty X\, dA_s = \mathbb{E} \int_0^\infty M_{s_n}\, dA_s. \tag{1.10}$$

For any value of $s$, $s_n \downarrow s$ as $n \to \infty$, and since $M$ has right continuous paths, $M_{s_n} \to M_s$. Since $X$ is bounded, so is $M$. By dominated convergence, the right hand side of (1.10) converges to

$$\mathbb{E} \int_0^\infty M_s\, dA_s.$$

This completes the proof when $X$ and $A$ are bounded. We apply this to $X \wedge N$ and $A \wedge N$, let $N \to \infty$, and use monotone convergence for the general case.

The only reason we assume $X$ is non-negative is so that the integrals make sense. The equation (1.9) can be rewritten as

$$\mathbb{E} \int_0^\infty X\, dA_s = \mathbb{E} \int_0^\infty \mathbb{E}[X \mid \mathcal{F}_s]\, dA_s. \tag{1.11}$$

We also have

$$\mathbb{E} \int_0^t X\, dA_s = \mathbb{E} \int_0^t \mathbb{E}[X \mid \mathcal{F}_s]\, dA_s \tag{1.12}$$

for each $t$. This follows either by following the above proof or by applying Proposition 1.17 to $A_{s \wedge t}$.
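Identity (1.12) can be illustrated by simulation (a hedged sketch with our own choices: $A = P$ a rate-$\lambda$ Poisson process, horizon $h$, and $X = P_h$, so that $M_s = \mathbb{E}[P_h \mid \mathcal{F}_s] = P_s + \lambda(h - s)$ for $s \le h$). With these choices both sides of (1.12) equal $\lambda h + (\lambda h)^2$.

```python
import numpy as np

rng = np.random.default_rng(4)
lam, h, n_paths = 1.0, 2.0, 100000

lhs, rhs = [], []
for _ in range(n_paths):
    jumps = np.cumsum(rng.exponential(1.0 / lam, size=40))   # covers [0, h] w.h.p.
    jumps = jumps[jumps <= h]
    N = len(jumps)                # P_h = number of jumps by time h
    lhs.append(N * N)             # int_0^h X dP_s = X * P_h with X = P_h
    i = np.arange(1, N + 1)       # at the i-th jump, P_s = i (right continuous)
    rhs.append(np.sum(i + lam * (h - jumps)))   # M_s = P_s + lam*(h - s) at jumps
print(np.mean(lhs), np.mean(rhs), lam * h + (lam * h) ** 2)   # all near 6.0
```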
