Martingale Problems. Abhay G. Bhatt, Theoretical Statistics and Mathematics Unit, Indian Statistical Institute, Delhi


Lectures on Probability and Stochastic Processes III, Indian Statistical Institute, Kolkata, 20-24 November 2008

Outline
1. Introduction: Associated Semigroups and Generators
2.
3. Time-Inhomogeneous Jump Perturbations

Introduction: Associated Semigroups and Generators

Let X be an E-valued process. X satisfies the Markov property if

P(X_{t+r} ∈ Γ | σ(X_s : 0 ≤ s ≤ t)) = P(X_{t+r} ∈ Γ | X_t).

A function P(t, x, Γ) is called a transition function if
- P(t, x, ·) is a probability measure for all (t, x) ∈ [0, ∞) × E
- P(0, x, ·) = δ_x for all x ∈ E
- P(·, ·, Γ) is measurable for all Γ ∈ ℰ (the Borel σ-field on E)

(Interpretation) P(X_t ∈ Γ | X_0 = x) = P(t, x, Γ).

X is a Markov process if it admits a transition function so that

P(X_{t+s} ∈ Γ | F_t^X) = P(s, X_t, Γ) for all t, s ≥ 0, Γ ∈ ℰ.

Equivalently,

E[f(X_{t+s}) | F_t^X] = ∫ f(y) P(s, X_t, dy) for all f ∈ C_b(E).

Associated Semigroups

Define T_t f(x) = ∫ f(y) P(t, x, dy) = E[f(X_t) | X_0 = x]. Then
- T_t T_s = T_{t+s}   (semigroup property)
- T_t f ≥ 0 whenever f ≥ 0   (positivity)
- ‖T_t f‖ ≤ ‖f‖   (contraction)
- E[f(X_{t+s}) | F_t^X] = T_s f(X_t)
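As a concrete illustration (not part of the lecture), these properties can be checked numerically for a two-state Markov chain, where T_t is the matrix exponential exp(tQ) of a generator matrix Q; the rates chosen below are arbitrary.

```python
# Sketch: semigroup, positivity and contraction for a two-state chain.
# Q and the evaluation times are illustrative choices.

def mat_mul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def mat_exp(Q, t, terms=60):
    """exp(tQ) by truncated power series (fine for small matrices and times)."""
    n = len(Q)
    result = [[1.0 if i == j else 0.0 for j in range(n)] for i in range(n)]
    term = [row[:] for row in result]
    for k in range(1, terms):
        term = mat_mul(term, [[t * q / k for q in row] for row in Q])
        result = [[result[i][j] + term[i][j] for j in range(n)] for i in range(n)]
    return result

Q = [[-2.0, 2.0], [1.0, -1.0]]        # jump rates: illustrative
T = lambda t: mat_exp(Q, t)

# semigroup property: T_t T_s = T_{t+s}
lhs, rhs = mat_mul(T(0.3), T(0.5)), T(0.8)
assert all(abs(lhs[i][j] - rhs[i][j]) < 1e-9 for i in range(2) for j in range(2))

# positivity and contraction: each row of T_t is a probability measure
# P(t, x, .), so T_t f >= 0 for f >= 0 and sup-norms cannot grow
Tt = T(0.7)
assert all(p >= 0 for row in Tt for p in row)
assert all(abs(sum(row) - 1.0) < 1e-9 for row in Tt)
```
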

Generator

A contraction semigroup {T_t} satisfying lim_{t→0} T_t f = T_0 f = f is called a strongly continuous contraction semigroup.

The generator L of a strongly continuous contraction semigroup is defined as follows:

D(L) = { f : lim_{t→0} (T_t f − f)/t exists },
Lf = lim_{t→0} (T_t f − f)/t.

Moreover,

T_t f − f = ∫_0^t T_s Lf ds,   f ∈ D(L).   (1)
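For the same kind of finite-state example (illustrative, not from the lecture), the generator limit and identity (1) can be verified directly; here the two-state chain's semigroup is written in closed form, and a, b, f are arbitrary choices.

```python
import math

# Two-state chain with Q = [[-a, a], [b, -b]] (eigenvalues 0 and -(a+b)),
# whose semigroup has the closed form used in Tt below.
a, b = 2.0, 1.0
lam = a + b

def Tt(t, f):
    """T_t f(x) = E[f(X_t) | X_0 = x] for the two-state chain."""
    mean = (b * f[0] + a * f[1]) / lam           # stationary average of f
    e = math.exp(-lam * t)
    return [mean + e * (fx - mean) for fx in f]

def L(f):
    """Generator: Lf = Qf."""
    return [a * (f[1] - f[0]), b * (f[0] - f[1])]

f = [1.0, -0.5]
Lf = L(f)

# Lf as the limit of (T_t f - f)/t as t -> 0
h = 1e-6
num = [(Tt(h, f)[x] - f[x]) / h for x in (0, 1)]
assert all(abs(num[x] - Lf[x]) < 1e-4 for x in (0, 1))

# identity (1): T_t f - f = \int_0^t T_s Lf ds  (midpoint quadrature)
t, n = 0.9, 20000
ds = t / n
integral = [0.0, 0.0]
for i in range(n):
    val = Tt((i + 0.5) * ds, Lf)
    integral[0] += val[0] * ds
    integral[1] += val[1] * ds
assert all(abs(Tt(t, f)[x] - f[x] - integral[x]) < 1e-6 for x in (0, 1))
```
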

Proposition 1
Let X, P, (T_t) and L be as above. Then X is a solution of the martingale problem for L.

Proof. Fix f ∈ D(L) and let M_t = f(X_t) − ∫_0^t Lf(X_s) ds. Then

E[M_{t+s} | F_t^X] = E[f(X_{t+s}) | F_t^X] − ∫_0^{t+s} E[Lf(X_u) | F_t^X] du
= T_s f(X_t) − ∫_t^{t+s} T_{u−t} Lf(X_t) du − ∫_0^t Lf(X_u) du
= T_s f(X_t) − ∫_0^s T_u Lf(X_t) du − ∫_0^t Lf(X_u) du
= f(X_t) − ∫_0^t Lf(X_u) du   (using (1))
= M_t.
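A Monte Carlo sketch of this martingale property (illustrative stand-ins throughout): for a two-state chain with jump rates a and b, the variable M_T = f(X_T) − ∫_0^T Lf(X_s) ds should average to M_0 = f(X_0).

```python
import random

# Simulate the two-state chain exactly via exponential holding times and
# check E[M_T] = f(X_0). All parameters are arbitrary illustrative choices.
random.seed(0)
a, b = 2.0, 1.0
f = [1.0, -0.5]
Lf = [a * (f[1] - f[0]), b * (f[0] - f[1])]     # Lf = Qf

def sample_M(T, x0=0):
    """Simulate one path started at x0 and return M_T along that path."""
    t, x, integral = 0.0, x0, 0.0
    while True:
        rate = a if x == 0 else b
        hold = random.expovariate(rate)
        if t + hold >= T:
            integral += (T - t) * Lf[x]         # last piece before time T
            return f[x] - integral
        integral += hold * Lf[x]
        t += hold
        x = 1 - x                               # the chain jumps

N, T = 40000, 1.0
est = sum(sample_M(T) for _ in range(N)) / N
assert abs(est - f[0]) < 0.05    # E[M_T] = f(X_0) up to Monte Carlo error
```
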


Finite Dimensional Distributions

Lemma 1
A process X is a solution to the martingale problem for A if and only if

E[ ( f(X_{t_{n+1}}) − f(X_{t_n}) − ∫_{t_n}^{t_{n+1}} Af(X_s) ds ) ∏_{k=1}^n h_k(X_{t_k}) ] = 0   (2)

for all f ∈ D(A), 0 ≤ t_1 < t_2 < … < t_{n+1}, h_1, h_2, …, h_n ∈ B(E), and n ≥ 1.

Thus being a solution of the martingale problem is a finite dimensional property. Hence if X is a solution and Y is a modification of X, then Y is also a solution.

The space D([0, ∞), E)

- D([0, ∞), E): the space of all E-valued functions on [0, ∞) which are right continuous and have left limits
- equipped with the Skorokhod topology, D([0, ∞), E) is a complete, separable metric space
- S_E: the Borel σ-field on D([0, ∞), E)
- θ_t(ω) = ω(t): the co-ordinate process
- r.c.l.l. process: a process taking values in D([0, ∞), E)

r.c.l.l. Solutions

Definition 2.1
A probability measure P ∈ P(D([0, ∞), E)) is a solution of the martingale problem for (A, µ) if there exists a D([0, ∞), E)-valued process X with L(X) = P such that X is a solution to the martingale problem for (A, µ).

Equivalently, P ∈ P(D([0, ∞), E)) is a solution if the co-ordinate process θ defined on (D([0, ∞), E), S_E, P) is a solution.

For an r.c.l.l. process X defined on some (Ω, F, P), we will use the two terminologies interchangeably: X is a solution ⇔ P ∘ X^{-1} is a solution.

Well-posedness: Definitions

Definition 2.2
The martingale problem for (A, µ) is well-posed in a class C of processes if there exists a solution X ∈ C of the martingale problem for (A, µ), and if Y ∈ C is also a solution to the martingale problem for (A, µ) then X and Y have the same finite dimensional distributions (i.e. uniqueness holds).

When C is the class of all measurable processes we just say that the martingale problem is well-posed.

Definition 2.3
The martingale problem for A is well-posed in C if the martingale problem for (A, µ) is well-posed for all µ ∈ P(E).

Well-posedness in D([0, ∞), E)

Finite dimensional distributions characterize the probability measures on D([0, ∞), E).

Definition 2.4
The D([0, ∞), E)-martingale problem for (A, µ) is well-posed if there exists a solution P ∈ P(D([0, ∞), E)) of the D([0, ∞), E)-martingale problem for (A, µ), and if Q is any solution to the D([0, ∞), E)-martingale problem for (A, µ) then P = Q.

Bounded-pointwise convergence

Definition 2.5
Let f_k, f ∈ B(E). f_k converges boundedly and pointwise to f (written f_k →bp f) if sup_k ‖f_k‖ ≤ M for some M < ∞ and f_k(x) → f(x) for all x ∈ E.

A class of functions U ⊆ B(E) is bp-closed if f_k ∈ U, f_k →bp f implies f ∈ U.

bp-closure(U): the smallest class of functions in B(E) which contains U and is bp-closed.

Example: let E_0 be a field with σ(E_0) = ℰ, and let H be the class of all E_0-simple functions. Then bp-closure(H) = the class of all bounded, ℰ-measurable functions.

Separability condition

Definition 2.6
The operator A satisfies the separability condition if there exists a countable subset {f_n} ⊆ D(A) such that

bp-closure({(f_n, Af_n) : n ≥ 1}) ⊇ {(f, Af) : f ∈ D(A)}.

Let A_0 = A restricted to {f_n}. Then

X is a solution of the martingale problem for A ⇔ X is a solution of the martingale problem for A_0.

The "⇒" direction is immediate; the "⇐" direction follows from Lemma 1: use the dominated convergence theorem to show that the set of all (g, Ag) satisfying (2) is bp-closed.

Markov Family of Solutions

Theorem 1
Let A be an operator on C_b(E) satisfying the separability condition. Suppose the D([0, ∞), E)-martingale problem for (A, δ_x) is well-posed for each x ∈ E. Then
1. x ↦ P_x(C) is measurable for all C ∈ S_E.
2. For all µ ∈ P(E), the D([0, ∞), E)-martingale problem for (A, µ) is well-posed, with the solution P_µ given by P_µ(C) = ∫_E P_x(C) µ(dx).
3. Under P_µ, θ_t is a Markov process with transition function

P(s, x, F) = P_x(θ_s ∈ F).   (3)

Proof of (1)

Choose a countable M ⊆ C_b(E) such that B(E) ⊆ bp-closure(M). Let

H = { η : η(θ) = ( f_n(θ_{t_{m+1}}) − f_n(θ_{t_m}) − ∫_{t_m}^{t_{m+1}} Af_n(θ_s) ds ) ∏_{k=1}^m h_k(θ_{t_k}) },

where h_1, h_2, …, h_m ∈ M and 0 ≤ t_1 < t_2 < … < t_{m+1} ∈ Q. Then H is countable. By Lemma 1,

M_1 = ∩_{η ∈ H} { P : ∫ η dP = 0 }

is the set of solutions of the martingale problem for A. P ↦ ∫ η dP is continuous, hence M_1 is a Borel set.

Proof of (1) (Contd.)

Define G : P(D([0, ∞), E)) → P(E) by G(P) = P ∘ θ(0)^{-1}. G is continuous, so

M = M_1 ∩ G^{-1}({δ_x : x ∈ E}) = {P_x : x ∈ E}

is Borel. Well-posedness implies that G restricted to M is a one-to-one mapping onto {δ_x : x ∈ E}, with G(P_x) = δ_x. Hence G^{-1} : {δ_x : x ∈ E} → M is Borel, so δ_x ↦ P_x is measurable, and therefore x ↦ P_x (the composition x ↦ δ_x ↦ P_x) is measurable.

Proof of (2)

For F ∈ ℰ,

P_µ ∘ θ_0^{-1}(F) = ∫_E P_x ∘ θ_0^{-1}(F) µ(dx) = ∫_E δ_x(F) µ(dx) = µ(F).

For η ∈ H,

∫ η dP_µ = ∫_E ( ∫ η dP_x ) µ(dx) = 0.

Hence P_µ is a solution to the martingale problem for (A, µ).

Let Q be another solution of the D([0, ∞), E)-martingale problem for (A, µ), and let Q_ω be the regular conditional probability of Q given θ_0.

Proof of (2) (Contd.)

Fix η ∈ H and h ∈ C_b(E), and define η′(θ) = η(θ)h(θ_0); then η′ is again of the form in (2). Thus

E^Q[η(θ)h(θ_0)] = E^Q[η′] = 0.

Since this holds for all h ∈ C_b(E),

E^{Q_ω}[η] = E^Q[η | θ_0] = 0 a.s. [Q].

Since H is countable, there is a single Q-null set N_0 such that E^{Q_ω}[η] = 0 for all η ∈ H and all ω ∉ N_0. Hence Q_ω is a solution of the martingale problem for A with initial distribution δ_{θ_0(ω)}. Well-posedness implies

Q_ω = P_{θ_0(ω)} a.s. [Q].

Hence Q = P_µ.

Proof of (3)

Fix s and let θ̃_t = θ_{t+s}. Let Q_ω be the regular conditional probability distribution of θ̃ (under P_x) given F_s. Q_ω is a solution to the martingale problem for (A, δ_{θ_s(ω)}), so well-posedness implies

Q_ω(θ̃_t ∈ F) = P(t, θ_s(ω), F)   (see (3)).

Hence for f ∈ B(E),

E^{P_x}[f(θ_{t+s})] = E^{P_x}[ E^{P_x}[f(θ_{t+s}) | F_s] ]
= E^{P_x}[ ∫_E f(y) P(t, θ_s(·), dy) ]
= ∫_E ∫_E f(y_2) P(t, y_1, dy_2) P(s, x, dy_1),

and therefore

P(s + t, x, F) = P_x(θ_{t+s} ∈ F) = ∫_E P(t, y, F) P(s, x, dy).
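The Chapman-Kolmogorov identity just derived can be checked numerically in a two-state setting, using the closed-form transition matrix of an illustrative generator (the rates are arbitrary choices).

```python
import math

# Two-state chain with generator Q = [[-a, a], [b, -b]]; rates illustrative.
a, b = 2.0, 1.0
lam = a + b

def P(t):
    """Closed-form transition matrix P(t, x, {y})."""
    e = math.exp(-lam * t)
    return [[(b + a * e) / lam, (a - a * e) / lam],
            [(b - b * e) / lam, (a + b * e) / lam]]

s, t = 0.4, 0.7
# P(s + t, x, {y}) versus the Chapman-Kolmogorov convolution over the
# intermediate state z
max_err = max(abs(P(s + t)[x][y]
                  - sum(P(t)[z][y] * P(s)[x][z] for z in (0, 1)))
              for x in (0, 1) for y in (0, 1))
assert max_err < 1e-12
```
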

One-dimensional equality

Theorem 2
Suppose that for each µ ∈ P(E), any two solutions X and Y (defined respectively on (Ω_1, F_1, P_1) and (Ω_2, F_2, P_2)) of the martingale problem for (A, µ) have the same one-dimensional distributions. Then X and Y have the same finite dimensional distributions, i.e. the martingale problem is well-posed.

Proof. It suffices to show

E^{P_1}[ ∏_{k=1}^m f_k(X_{t_k}) ] = E^{P_2}[ ∏_{k=1}^m f_k(Y_{t_k}) ]   (4)

for all 0 ≤ t_1 < t_2 < … < t_m, f_1, f_2, …, f_m ∈ B(E) and m ≥ 1.

Induction argument

Case m = 1: true by hypothesis. Assume the induction hypothesis (4) is true for m = n. Fix 0 ≤ t_1 < t_2 < … < t_n and f_1, f_2, …, f_n ∈ B(E) with f_k > 0. Define

Q_1(F) = E^{P_1}[ 1_F ∏_{k=1}^n f_k(X_{t_k}) ] / E^{P_1}[ ∏_{k=1}^n f_k(X_{t_k}) ],   F ∈ F_1,
Q_2(F) = E^{P_2}[ 1_F ∏_{k=1}^n f_k(Y_{t_k}) ] / E^{P_2}[ ∏_{k=1}^n f_k(Y_{t_k}) ],   F ∈ F_2.

Let X̃_t = X_{t_n+t} and Ỹ_t = Y_{t_n+t}. Fix 0 ≤ s_1 < s_2 < … < s_{m+1}, h_1, h_2, …, h_m ∈ B(E) and f ∈ D(A).

Induction argument (Contd.)

Let

η(θ) = ( f(θ_{s_{m+1}}) − f(θ_{s_m}) − ∫_{s_m}^{s_{m+1}} Af(θ_s) ds ) ∏_{k=1}^m h_k(θ_{s_k}).

Then

E^{P_1}[ η(X_{t_n+·}) ∏_{k=1}^n f_k(X_{t_k}) ]
= E^{P_1}[ ( f(X_{t_n+s_{m+1}}) − f(X_{t_n+s_m}) − ∫_{t_n+s_m}^{t_n+s_{m+1}} Af(X_u) du ) ∏_{j=1}^m h_j(X_{t_n+s_j}) ∏_{k=1}^n f_k(X_{t_k}) ] = 0.

Induction argument (Contd.)

Hence

E^{Q_1}[η(X̃)] = E^{P_1}[ η(X_{t_n+·}) ∏_{k=1}^n f_k(X_{t_k}) ] / E^{P_1}[ ∏_{k=1}^n f_k(X_{t_k}) ] = 0,

and similarly E^{Q_2}[η(Ỹ)] = 0. So X̃ and Ỹ are solutions of the martingale problems for (A, L(X̃_0)) and (A, L(Ỹ_0)) respectively. Moreover, for all f ∈ B(E),

E^{Q_1}[f(X̃_0)] = E^{P_1}[ f(X_{t_n}) ∏_{k=1}^n f_k(X_{t_k}) ] / E^{P_1}[ ∏_{k=1}^n f_k(X_{t_k}) ]
= E^{P_2}[ f(Y_{t_n}) ∏_{k=1}^n f_k(Y_{t_k}) ] / E^{P_2}[ ∏_{k=1}^n f_k(Y_{t_k}) ] = E^{Q_2}[f(Ỹ_0)],

where the middle equality follows from the induction hypothesis for m = n.

Induction argument (Contd.)

Hence X̃ and Ỹ have the same initial distribution, and one-dimensional uniqueness implies

E^{Q_1}[f(X̃_t)] = E^{Q_2}[f(Ỹ_t)] for all t ≥ 0, f ∈ B(E),

i.e.

E^{P_1}[ f(X_{t_n+t}) ∏_{k=1}^n f_k(X_{t_k}) ] = E^{P_2}[ f(Y_{t_n+t}) ∏_{k=1}^n f_k(Y_{t_k}) ].

Setting t_{n+1} = t_n + t, this is the induction hypothesis (4) for m = n + 1.

Semigroup associated with the martingale problem

Suppose A satisfies the conditions of Theorem 1. Associate with A the Markov semigroup (T_t)_{t≥0}:

T_t f(x) = ∫_E f(y) P(t, x, dy).

The following theorem can be proved exactly as the previous one.

Strong Markov Property

Theorem 3
Suppose that the D([0, ∞), E)-martingale problem for A is well-posed, with associated semigroup (T_t). Let X, defined on (Ω, F, P), be a solution of the martingale problem for A (with respect to (G_t)_{t≥0}), and let τ be a finite stopping time. Then for f ∈ B(E) and t ≥ 0,

E[f(X_{τ+t}) | G_τ] = T_t f(X_τ).

In particular,

P(X_{τ+t} ∈ Γ | G_τ) = P(t, X_τ, Γ),   Γ ∈ ℰ.

r.c.l.l. modification

Definition 2.7
Let D be a class of functions on E.
1. D is measure determining if ∫ f dP = ∫ f dQ for all f ∈ D implies P = Q.
2. D separates points in E if for x ≠ y there exists g ∈ D such that g(x) ≠ g(y).

Theorem 4
Let E be a compact metric space. Let A be an operator on C(E) such that D(A) is measure determining and contains a countable subset that separates points in E. Let X, defined on (Ω, F, P), be a solution to the martingale problem for A. Then X has a modification with sample paths in D([0, ∞), E).

Proof

Let {g_k : k ≥ 1} ⊆ D(A) separate points in E. Define

M_k(t) = g_k(X_t) − ∫_0^t Ag_k(X_s) ds.

Each M_k is a martingale, so for all t

lim_{s↓t, s∈Q} M_k(s) and lim_{s↑t, s∈Q} M_k(s) exist a.s.

Hence there exists Ω′ ⊆ Ω with P(Ω′) = 1 such that

lim_{s↓t, s∈Q} g_k(X_s(ω)) and lim_{s↑t, s∈Q} g_k(X_s(ω)) exist for all ω ∈ Ω′, t ≥ 0, k ≥ 1.

Proof (contd.)

Fix t ≥ 0, {s_n} ⊆ Q with s_n > t and lim_n s_n = t, and ω ∈ Ω′. Since E is compact, there is a subsequence {s_{n_i}} such that lim_i X_{s_{n_i}}(ω) exists. Clearly

g_k( lim_i X_{s_{n_i}}(ω) ) = lim_{s↓t, s∈Q} g_k(X_s(ω)) for all k.

Since {g_k : k ≥ 1} separates points in E, lim_{s↓t, s∈Q} X_s(ω) exists; similarly lim_{s↑t, s∈Q} X_s(ω) exists. Define

Y_t(ω) = lim_{s↓t, s∈Q} X_s(ω).

Proof (contd.)

For ω ∈ Ω′, t ↦ Y_t(ω) is r.c.l.l., with left limits Y_{t−}(ω) = lim_{s↑t, s∈Q} X_s(ω); define Y suitably for ω ∉ Ω′. Then Y has sample paths in D([0, ∞), E). Since X is a solution to the martingale problem for A, for f ∈ D(A),

E[f(Y_t) | F_t^X] = lim_{s↓t, s∈Q} E[f(X_s) | F_t^X] = f(X_t).

Since D(A) is a measure determining set, X = Y a.s.


Time-Inhomogeneous Jump Perturbations

Definitions

For t ≥ 0, let (A_t)_{t≥0} be linear operators on M(E) with a common domain D ⊆ M(E).

Definition 3.1
A measurable process X defined on some probability space (Ω, F, P) is a solution to the martingale problem for (A_t)_{t≥0} with respect to a filtration (G_t)_{t≥0} if for any f ∈ D,

f(X_t) − ∫_0^t A_s f(X_s) ds

is a (G_t)-martingale.

Let µ ∈ P(E). The martingale problem for ((A_t)_{t≥0}, µ) is well-posed if there exists a unique solution to the martingale problem.

Space-Time Process

Let E_0 = [0, ∞) × E and X_t^0 = (t, X_t). Define

D(A^0) = { g(t, x) = Σ_{i=1}^k h_i(t) f_i(x) : h_i ∈ C_c^1([0, ∞)), f_i ∈ D },
A^0 g(t, x) = Σ_{i=1}^k [ f_i(x) h_i′(t) + h_i(t) A_t f_i(x) ].

Theorem 5
X is a solution to the martingale problem for (A_t)_{t≥0} if and only if X^0 is a solution to the martingale problem for A^0.
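A quick numerical check of the space-time generator formula (illustrative; the chain, f and h are arbitrary choices, and A_t is taken time-homogeneous for simplicity): for g(t, x) = h(t)f(x), the derivative of u ↦ E[g(t+u, X_{t+u}) | X_t = x] at u = 0 should equal A^0 g(t, x) = f(x)h′(t) + h(t)Af(x).

```python
import math

# Two-state chain with rates a (0 -> 1) and b (1 -> 0); all values illustrative.
a, b = 2.0, 1.0
lam = a + b
f = [1.0, -0.5]
h, hp = math.sin, math.cos                     # h and its derivative h'
Lf = [a * (f[1] - f[0]), b * (f[0] - f[1])]    # A f = Qf

def Ttf(u, x):
    """E[f(X_u) | X_0 = x], closed form for the two-state chain."""
    mean = (b * f[0] + a * f[1]) / lam
    return mean + math.exp(-lam * u) * (f[x] - mean)

t, du = 0.3, 1e-6
errs = []
for x in (0, 1):
    fd = (h(t + du) * Ttf(du, x) - h(t) * f[x]) / du   # finite difference
    a0g = f[x] * hp(t) + h(t) * Lf[x]                  # A^0 g(t, x)
    errs.append(abs(fd - a0g))
assert max(errs) < 1e-3
```
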

Proof. Let X be a solution (with respect to a filtration (G_t)_{t≥0}) to the martingale problem for (A_t)_{t≥0}, and let fh ∈ D(A^0). For 0 < s < t, let g(t) = E[f(X_t) | G_s]. Then

g(t) − g(s) = ∫_s^t E[A_u f(X_u) | G_s] du,

so

g(t)h(t) − g(s)h(s) = ∫_s^t (d/du)[g(u)h(u)] du
= ∫_s^t { h(u) E[A_u f(X_u) | G_s] + g(u) h′(u) } du
= ∫_s^t E[ A^0(fh)(X_u^0) | G_s ] du.

Proof (Contd.)

Hence

fh(X^0(t)) − ∫_0^t A^0(fh)(X^0(s)) ds

is a martingale, i.e. X^0 is a solution to the martingale problem for A^0. The converse follows by taking h = 1 on [0, T], T > 0.

A more General Result

Take state spaces E_1 and E_2, operators A_1 on M(E_1) and A_2 on M(E_2), with solutions X_1 and X_2. Define

D(A) = { f_1 f_2 : f_1 ∈ D(A_1), f_2 ∈ D(A_2) },
A(f_1 f_2) = (A_1 f_1) f_2 + f_1 (A_2 f_2).

Then (X_1, X_2) is a solution of the martingale problem for A.

Theorem 6
Suppose uniqueness holds for the martingale problems for A_1 and A_2. Then uniqueness holds for the martingale problem for A.

A perturbed operator

Let A be an operator with D(A) ⊆ C_b(E). Let λ > 0 and let η(x, Γ) be a transition function on E × ℰ. Define

Bf(x) = λ ∫_E (f(y) − f(x)) η(x, dy),   f ∈ B(E).

Theorem 7
Suppose that for every µ ∈ P(E) there exists a solution to the D([0, ∞), E)-martingale problem for (A, µ). Then for every µ ∈ P(E) there exists a solution to the martingale problem for (A + B, µ).
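On a finite state space the perturbation B is easy to make concrete (all numbers below are illustrative): η(x, ·) becomes a stochastic matrix, and B acts as the rate matrix λ(η − I), so A + B simply adds η-distributed jumps at rate λ.

```python
# Sketch: Bf(x) = lambda * sum_y (f(y) - f(x)) * eta(x, y) on E = {0, 1}.
# lam and eta are arbitrary illustrative choices.
lam = 1.5
eta = [[0.2, 0.8], [0.6, 0.4]]          # jump kernel eta(x, .), rows sum to 1

def B(f):
    return [lam * sum((f[y] - f[x]) * eta[x][y] for y in range(2))
            for x in range(2)]

f = [1.0, -0.5]
Bf = B(f)

# B written as a rate matrix Q_B = lam * (eta - I): off-diagonal entries
# lam * eta(x, y), rows summing to zero
Q_B = [[lam * eta[x][y] - (lam if x == y else 0.0) for y in range(2)]
       for x in range(2)]
assert all(abs(sum(row)) < 1e-12 for row in Q_B)
assert all(abs(Bf[x] - sum(Q_B[x][y] * f[y] for y in range(2))) < 1e-12
           for x in range(2))
```
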

Proof

For k ≥ 1, let Ω_k = D([0, ∞), E) and Ω_k^0 = [0, ∞), and let Ω = ∏_{k=1}^∞ (Ω_k × Ω_k^0). Let θ_k and ξ_k denote the co-ordinate random variables, with Borel σ-fields F_k and F_k^0, and let F be the product σ-field on Ω. Let G_k be the σ-algebra generated by the cylinder sets C_1 × ∏_{i=k+1}^∞ (Ω_i × Ω_i^0), where C_1 ∈ F_1 × F_1^0 × … × F_k × F_k^0, and let G^k be the σ-algebra generated by the sets ∏_{i=1}^k (Ω_i × Ω_i^0) × C_2, where C_2 belongs to the product σ-field of the remaining co-ordinates.

Perturbed Solution X

X evolves in E as a solution to the martingale problem for A until an exponentially distributed time with parameter λ which is independent of the past. At this time, if the process is at x, it jumps to y with probability η(x, dy) and then continues evolving as a solution to the martingale problem for (A, δ_y).

To put this in a mathematical framework, we consider that between the k-th and the (k+1)-th jump (dictated by B) the process lies in Ω_k; the k-th copy of the exponential time is a random variable in Ω_k^0.
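The construction just described can be sketched in simulation (illustrative stand-ins throughout): here the base process is itself a two-state Markov chain with rates a and b, for which concatenating excursions at Exp(λ) times is equivalent to running competing exponential clocks; the B-jump times then form a Poisson process with rate λ.

```python
import random

# Perturbed two-state chain: base jumps at rate a or b, B-jumps at rate lam
# with kernel eta. All numerical parameters are arbitrary choices.
random.seed(1)
a, b, lam = 2.0, 1.0, 1.5
eta = [[0.2, 0.8], [0.6, 0.4]]

def simulate(T, x0=0):
    """Run the perturbed chain on [0, T]; return (B-jump times, final state)."""
    t, x, b_jumps = 0.0, x0, []
    while True:
        base_rate = a if x == 0 else b
        t += random.expovariate(base_rate + lam)   # next event of either clock
        if t >= T:
            return b_jumps, x
        if random.random() < lam / (base_rate + lam):
            b_jumps.append(t)                      # jump dictated by B ...
            x = 0 if random.random() < eta[x][0] else 1   # ... drawn from eta(x, dy)
        else:
            x = 1 - x                              # jump of the base chain

jumps, xT = simulate(10.0)
assert jumps == sorted(jumps)                      # tau_k are increasing
assert xT in (0, 1)

# the B-jump counting process is Poisson(lam): mean count over [0, 2] is 2*lam
mean_jumps = sum(len(simulate(2.0)[0]) for _ in range(5000)) / 5000
assert abs(mean_jumps - lam * 2.0) < 0.2
```
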

Proof (Contd.)

Let P_x and P_µ be solutions of the martingale problems for (A, δ_x) and (A, µ) respectively, and let γ be the exponential distribution with parameter λ. Fix µ ∈ P(E). Define, for Γ_1 ∈ F_1, …, Γ_k ∈ F_k and F_1 ∈ F_1^0, …, F_k ∈ F_k^0,

P_1(Γ_1) = P_µ(Γ_1);   P_1^0(θ_1, F_1) = γ(F_1);
⋮
P_k(θ_1, ξ_1, …, θ_{k−1}, ξ_{k−1}, Γ_k) = ∫_E P_x(Γ_k) η(θ_{k−1}(ξ_{k−1}), dx);
P_k^0(θ_1, …, ξ_{k−1}, θ_k, F_k) = γ(F_k).

Proof (Contd.)

P_1 ∈ P(Ω_1), and P_1^0, P_2, P_2^0, … are transition probability functions. Hence there is a unique P on (Ω, F) satisfying, for C ∈ G_k and C′ ∈ G^{k+1},

P(C ∩ C′) = E[ 1_C ∫_E P(C′ | θ_{k+1}(0) = x) η(θ_k(ξ_k), dx) ].

Define τ_0 = 0, τ_k = Σ_{i=1}^k ξ_i and N_t = k for τ_k ≤ t < τ_{k+1}. Note that N is a Poisson process with parameter λ.

Proof (Contd.)

Define F_t = F_t^X ∨ F_t^N and X_t = θ_{k+1}(t − τ_k) for τ_k ≤ t < τ_{k+1}. For f ∈ D(A),

f(θ_{k+1}((t − τ_k) ∧ (τ_{k+1} − τ_k))) − f(θ_{k+1}(0)) − ∫_0^{(t−τ_k) ∧ (τ_{k+1}−τ_k)} Af(θ_{k+1}(s)) ds

is an (F_t)_{t≥0} martingale. Summing over k we get that

f(X_t) − f(X_0) − ∫_0^t Af(X_s) ds − Σ_{k=1}^{N_t} ( f(θ_{k+1}(0)) − f(θ_k(ξ_k)) )

is an (F_t)_{t≥0} martingale.

Proof (Contd.)

Also, the following are (F_t)_{t≥0} martingales:

Σ_{k=1}^{N_t} ( f(θ_{k+1}(0)) − ∫_E f(y) η(θ_k(ξ_k), dy) ),
∫_0^t ∫_E (f(y) − f(X_s)) η(X_s, dy) d(N_s − λs).

Hence

f(X_t) − f(X_0) − ∫_0^t ( Af(X_s) + Bf(X_s) ) ds

is an (F_t)_{t≥0} martingale.