LogFeller et Ray Knight

Etienne Pardoux (Marseille), joint work with V. Le and A. Wakolbinger

MANEGE, 18/1/1

Feller's branching diffusion with logistic growth

We consider the diffusion
$$dZ_t = Z_t(\theta - \gamma Z_t)\,dt + 2\sqrt{Z_t}\,dW_t, \qquad t \ge 0, \qquad (1)$$
with $Z_0 = x$, $\theta, \gamma > 0$. $Z_t$ is not a branching process: the quadratic term introduces interactions between the branches. However, there is still an interpretation of the solution $Z_t$ of (1), the "Fellog" (logistic Feller) process, in terms of the evolution of a population; see A. Lambert (2005).

Consider
$$dH_s = \left(\frac{\theta}{2} - \gamma L_s(H_s)\right) ds + dB_s + \frac{1}{2}\,dL_s(0), \qquad s \ge 0, \quad H_0 = 0, \qquad (2)$$
where $B$ is a standard Brownian motion and $L_s(t)$ denotes the local time of $H$ at level $t$ up to time $s$. In this SDE, the term $dL_s(0)/2$ takes care of the reflection of $H$ at the origin.
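
For illustration (not part of the talk), here is a minimal Euler-Maruyama sketch of the SDE (1); the function name simulate_logistic_feller and the parameter values are ours, and the scheme simply truncates at 0, where the diffusion is absorbed.

import numpy as np

def simulate_logistic_feller(x0, theta, gamma, T=10.0, dt=1e-4, seed=0):
    """Euler-Maruyama sketch of dZ = Z(theta - gamma*Z) dt + 2 sqrt(Z) dW, Z_0 = x0."""
    rng = np.random.default_rng(seed)
    n = int(T / dt)
    Z = np.empty(n + 1)
    Z[0] = x0
    for i in range(n):
        z = Z[i]
        dW = rng.normal(0.0, np.sqrt(dt))
        z_next = z + z * (theta - gamma * z) * dt + 2.0 * np.sqrt(max(z, 0.0)) * dW
        Z[i + 1] = max(z_next, 0.0)  # 0 is absorbing; truncate the scheme's overshoots
    return Z

# Example: the quadratic competition term pulls the mass back towards theta/gamma.
path = simulate_logistic_feller(x0=2.0, theta=1.0, gamma=0.5)
print(path[::10000])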

Results

Using Girsanov's theorem, one can show:

Proposition. The SDE (2) has a unique weak solution.

With $H$ the solution of (2) and $L_s(t)$ denoting its local time, let
$$S_x := \inf\{s > 0 : L_s(0) > x\}.$$
Our main result is the following theorem.

Theorem. $\{L_{S_x}(t),\ t \ge 0\}$ solves the Fellog SDE (1).
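
For orientation (this remark is ours, not on the slide): for $\theta = \gamma = 0$, equation (2) is Brownian motion reflected at $0$ and (1) is Feller's critical branching diffusion, so the Theorem specializes to a classical Ray-Knight-type statement, which is why the result is referred to below as an extended Ray-Knight theorem:
$$\bigl(L_{S_x}(t)\bigr)_{t\ge 0} \quad\text{solves}\quad dZ_t = 2\sqrt{Z_t}\,dW_t,\qquad Z_0 = x,$$
i.e. the local time profile of reflected Brownian motion, read at the inverse local time $S_x$, is Feller's branching diffusion started from mass $x$.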

Discrete approximation

Our proof of the above extended Ray-Knight theorem is based on an approximation by a finite population. For $N \in \mathbb{N}$, let $Z^{N,x}$ be the total mass of a population of individuals, each of which has mass $1/N$. The initial mass is $Z^{N,x}_0 = \lfloor Nx \rfloor / N$, and $Z^{N,x}_t$ follows a Markovian jump dynamics: from its current state $k/N$,
$$Z^{N,x} \text{ jumps to } \begin{cases} (k+1)/N & \text{at rate } 2kN + k\theta,\\ (k-1)/N & \text{at rate } 2kN + k(k-1)\gamma/N.\end{cases}$$
For $\gamma = 0$ this is a Galton-Watson process in continuous time: each individual independently spawns a child at rate $2N + \theta$ and dies (childless) at rate $2N$. For $\gamma \neq 0$, the quadratic death rate destroys independence.

Viewing the individuals alive at time $t$ as being arranged from left to right, and decreeing that each of the pairwise fights (which happen at rate $2\gamma/N$ per pair) is won by the individual to the left, we arrive at the additional death rate $2\gamma L_i(t)/N$ for individual $i$, where $L_i(t)$ denotes the number of individuals currently living to the left of $i$ at time $t$.
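
A small Gillespie-type simulation of this jump dynamics (our sketch; the name simulate_ZNx and the parameter choices are illustrative):

import numpy as np

def simulate_ZNx(x, N, theta, gamma, T=10.0, seed=0):
    """From state k/N, jump to (k+1)/N at rate 2kN + k*theta and
    to (k-1)/N at rate 2kN + k*(k-1)*gamma/N."""
    rng = np.random.default_rng(seed)
    k = int(N * x)                       # initial number of individuals, mass [Nx]/N
    t = 0.0
    times, masses = [0.0], [k / N]
    while t < T and k > 0:
        birth = 2 * k * N + k * theta
        death = 2 * k * N + k * (k - 1) * gamma / N
        total = birth + death
        t += rng.exponential(1.0 / total)           # waiting time to the next event
        k += 1 if rng.random() < birth / total else -1
        times.append(t)
        masses.append(k / N)
    return np.array(times), np.array(masses)

# Example: for large N the total mass hovers near theta/gamma before eventual extinction.
t, z = simulate_ZNx(x=2.0, N=50, theta=1.0, gamma=0.5)
print(z[-1])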

The reproduction dynamics just described gives rise to a forest $F^N$ of trees of descent. At any branch point, we imagine the new branch being placed to the right of the mother branch. Because of the asymmetric killing, the trees further to the right have a tendency to stay smaller: they are under attack by the trees to their left.

For a given realization of $F^N$, we read off a continuous, piecewise linear, $\mathbb{R}_+$-valued path $H^N$ (called the exploration path associated with $F^N$) in the following way. Starting from $(0,0)$, $H^N$ goes upwards at speed $2N$ until the top of the first mother branch (the leftmost leaf of the tree) is hit. There $H^N$ turns and goes downwards, again at speed $2N$, until it reaches the closest branch point (the last birth time of a child of the first ancestor before his death). From there it goes upwards into the (yet unexplored) next branch, and proceeds in a similar fashion until it is back at height $0$, which means that the exploration of the leftmost tree is completed. It then explores the next tree, and so on, until the exploration of the forest $F^N$ is completed.

[Figure: a piecewise linear exploration path of a forest, with leaves at heights $M_1, M_2, M_3, M_4$ and branch points at heights $m_1, m_2, m_3$.]

The exploration process $H^N$ in the case $\theta = \gamma = 0$

We have the following SDE ($(H^N, V^N)$ takes values in $\mathbb{R}_+ \times \{-1, 1\}$):
$$\frac{dH^N_s}{ds} = 2N V^N_s, \qquad H^N_0 = 0,$$
$$dV^N_s = 2\,\mathbf{1}_{\{V^N_{s^-} = -1\}}\, dP^N_s - 2\,\mathbf{1}_{\{V^N_{s^-} = 1\}}\, dP^N_s + 2N\, dL^N_s(0), \qquad V^N_0 = 1,$$
where $\{P^N_s,\ s \ge 0\}$ is a Poisson process with intensity $4N^2$, i.e. $M^N_s = P^N_s - 4N^2 s$ is a martingale, and
$$L^N_s(t) := \text{the local time accumulated by } H^N \text{ at level } t \text{ up to time } s = \lim_{\varepsilon \to 0} \frac{1}{\varepsilon} \int_0^s \mathbf{1}_{\{t \le H^N_u < t + \varepsilon\}}\, du.$$
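
A direct simulation sketch of this exploration dynamics (our code; explore_HN and all parameter values are illustrative). The slope of $H^N$ flips at the jump times of the Poisson clock of intensity $4N^2$; each reflection at $0$ adds $1/N$ to the local time at level $0$ (the jump of $V^N$ by $2$ equals $2N\,\Delta L^N_s(0)$); and, since the speed is $2N$, each crossing of a fixed level contributes $1/(2N)$ to the local time at that level, which is used on the next slide.

import numpy as np

def explore_HN(N, n_trees, level=None, seed=0):
    """Simulate the exploration path H^N (theta = gamma = 0) until n_trees excursions
    above 0 are completed.  If `level` is given, crossings of that level are counted,
    each contributing 1/(2N) to the local time of H^N at that level."""
    rng = np.random.default_rng(seed)
    h, v = 0.0, +1                         # current height and slope sign
    s, L0, crossings, trees = 0.0, 0.0, 0, 0
    while trees < n_trees:
        tau = rng.exponential(1.0 / (4 * N ** 2))   # next ring of the Poisson clock
        step = 2 * N * tau
        if v == +1:
            if level is not None and h < level <= h + step:
                crossings += 1             # up-crossing of `level`
            h += step
            v = -1
            s += tau
        elif h - step <= 0.0:
            # H hits 0 before the clock rings: reflect, one tree is fully explored.
            # (The exponential clock is memoryless, so restarting it is legitimate.)
            if level is not None and 0.0 < level <= h:
                crossings += 1             # down-crossing of `level` on the way to 0
            s += h / (2 * N)
            h, v = 0.0, +1
            L0 += 1.0 / N
            trees += 1
        else:
            if level is not None and h - step < level <= h:
                crossings += 1             # down-crossing of `level`
            h -= step
            v = +1
            s += tau
    LN_level = crossings / (2 * N) if level is not None else None
    return s, L0, LN_level                 # s plays the role of S^N_x, L0 = n_trees/N

# Example: explore [Nx] trees; LN_level then approximates Z^{N,x}_t at t = level
# (this is the identity on the next slide).
N, x, t = 50, 2.0, 0.3
SNx, L0, Zt = explore_HN(N, n_trees=int(N * x), level=t)
print(SNx, L0, Zt)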

If we let $S^N_x := \inf\{s > 0 : L^N_s(0) \ge \lfloor Nx \rfloor / N\}$, it is easily seen that
$$Z^{N,x}_t = L^N_{S^N_x}(t), \qquad t \ge 0.$$
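
The reasoning behind "it is easily seen" (our summary): since $H^N$ moves at speed $2N$, each crossing of the strip $[t, t+\varepsilon)$ takes time $\varepsilon/(2N)$, so
$$L^N_s(t) = \lim_{\varepsilon \to 0} \frac{1}{\varepsilon} \int_0^s \mathbf{1}_{\{t \le H^N_u < t+\varepsilon\}}\, du = \frac{\#\{\text{crossings of level } t \text{ by } H^N \text{ before time } s\}}{2N}.$$
Up to time $S^N_x$ the path explores exactly the first $\lfloor Nx \rfloor$ trees, and every individual alive at height $t$ in this forest is traversed by exactly one up-crossing and one down-crossing of level $t$, hence contributes $1/N$; summing over the $k$ individuals alive at height $t$ gives $L^N_{S^N_x}(t) = k/N = Z^{N,x}_t$.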

Taking the limit

As $N \to \infty$, $V^N$ oscillates faster and faster. In order to study the limit of $H^N$, we consider, for $f \in C^2(\mathbb{R})$, the perturbed test function (see e.g. Ethier, Kurtz (1986))
$$f^N(h, v) = f(h) + \frac{v}{4N} f'(h).$$
Implementing this with $f(h) = h$, we get
$$H^N_s + \frac{V^N_s}{4N} = M^{1,N}_s - M^{2,N}_s + \frac{1}{2} L^N_s(0),$$
where
$$M^{1,N}_s = \frac{1}{2N} \int_0^s \mathbf{1}_{\{V^N_r = -1\}}\, dM^N_r \quad \text{and} \quad M^{2,N}_s = \frac{1}{2N} \int_0^s \mathbf{1}_{\{V^N_r = 1\}}\, dM^N_r$$
are two mutually orthogonal martingales.
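
A sketch of the computation behind this identity (ours): write $dP^N_s = dM^N_s + 4N^2\,ds$; the perturbation $v/(4N)$ is tuned so that the drift terms cancel. Indeed,
$$d\Bigl(H^N_s + \tfrac{V^N_s}{4N}\Bigr) = 2N V^N_s\,ds + \tfrac{1}{4N}\Bigl(2\,\mathbf{1}_{\{V^N_{s^-}=-1\}} - 2\,\mathbf{1}_{\{V^N_{s^-}=1\}}\Bigr)\, dP^N_s + \tfrac{1}{2}\, dL^N_s(0),$$
and after substituting $dP^N_s = dM^N_s + 4N^2\,ds$ the $ds$-terms give $2N V^N_s - 2N V^N_s = 0$, while the martingale part is exactly $dM^{1,N}_s - dM^{2,N}_s$ (the initial value $V^N_0/(4N) = 1/(4N)$ is negligible as $N \to \infty$).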

Taking the limit as $N \to \infty$

We obtain the following joint convergence.

Theorem. For any $x > 0$, as $N \to \infty$,
$$\Bigl( \{H^N_s, M^{1,N}_s, M^{2,N}_s,\ s \ge 0\},\ \{L^N_s(t),\ s \ge 0,\ t \ge 0\},\ S^N_x \Bigr) \Rightarrow \Bigl( \{H_s, B^1_s, B^2_s,\ s \ge 0\},\ \{L_s(t),\ s \ge 0,\ t \ge 0\},\ S_x \Bigr),$$
where $B^1$ and $B^2$ are two mutually independent Brownian motions, $B = (\sqrt{2})^{-1}(B^1 - B^2)$, $H$ is $B$ reflected above $0$, $L$ is its local time, and $S_x = \inf\{s > 0 : L_s(0) > x\}$.

A Girsanov transformation 1

Let
$$X^{N,1}_s := \frac{\theta}{2N} \int_0^s \mathbf{1}_{\{V^N_r = -1\}}\, dM^N_r, \qquad X^{N,2}_s := \frac{\gamma}{N} \int_0^s L^N_r(H^N_r)\, \mathbf{1}_{\{V^N_r = 1\}}\, dM^N_r, \qquad X^N := X^{N,1} + X^{N,2},$$
and let $Y^N := \mathcal{E}(X^N)$ denote the Doléans exponential of $X^N$, i.e. $Y^N$ solves the SDE
$$Y^N_s = 1 + \int_0^s Y^N_{r^-}\, dX^N_r, \qquad s \ge 0.$$
We prove:

Proposition. $Y^N$ is a martingale ($E\,Y^N_s = 1$, $s > 0$).
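
For reference (our addition): since $X^N$ is a purely discontinuous local martingale with no continuous martingale part, its Doléans exponential has the explicit form
$$Y^N_s = \mathcal{E}(X^N)_s = e^{X^N_s} \prod_{r \le s} \bigl(1 + \Delta X^N_r\bigr)\, e^{-\Delta X^N_r},$$
with jumps $\Delta X^N_r = \frac{\theta}{2N}\mathbf{1}_{\{V^N_{r^-}=-1\}} + \frac{\gamma}{N} L^N_r(H^N_r)\,\mathbf{1}_{\{V^N_{r^-}=1\}}$ at the jump times of $P^N$. This is the form appearing in the Appendix (Girsanov's theorem for Poisson processes), with
$$\mu^{(1)}_r = 1 + \frac{\theta}{2N}, \qquad \mu^{(2)}_r = 1 + \frac{\gamma}{N}\, L^N_r(H^N_r).$$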

A Girsanov transformation 2

Define new probability measures $\widetilde{P}^N$ such that for all $s > 0$,
$$\frac{d\widetilde{P}^N\big|_{\mathcal{F}_s}}{dP\big|_{\mathcal{F}_s}} = Y^N_s.$$
Consider the 2-variate point process
$$(Q^{1,N}_r, Q^{2,N}_r) = \left( \int_0^r \mathbf{1}_{\{V^N_u = -1\}}\, dP^N_u,\ \int_0^r \mathbf{1}_{\{V^N_u = 1\}}\, dP^N_u \right), \qquad r \ge 0.$$
Under $\widetilde{P}^N$,
$$Q^{1,N}_r \text{ has intensity } (4N^2 + 2\theta N)\, \mathbf{1}_{\{V^N_r = -1\}}\, dr, \qquad Q^{2,N}_r \text{ has intensity } 4\bigl[N^2 + \gamma N L^N_r(H^N_r)\bigr]\, \mathbf{1}_{\{V^N_r = 1\}}\, dr.$$
This means that under $\widetilde{P}^N$, $H^N$ has the requested approximate LogFeller dynamics.
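
A one-line check (ours) that these are exactly the intensities produced by the Appendix theorem, the base intensity of $P^N$ under $P$ being $4N^2$:
$$\Bigl(1 + \tfrac{\theta}{2N}\Bigr) 4N^2\, \mathbf{1}_{\{V^N_r=-1\}} = \bigl(4N^2 + 2\theta N\bigr) \mathbf{1}_{\{V^N_r=-1\}}, \qquad \Bigl(1 + \tfrac{\gamma}{N} L^N_r(H^N_r)\Bigr) 4N^2\, \mathbf{1}_{\{V^N_r=1\}} = 4\bigl[N^2 + \gamma N L^N_r(H^N_r)\bigr] \mathbf{1}_{\{V^N_r=1\}}.$$
Up-flips (births) are thus tilted by the factor $1 + \theta/(2N)$ and down-flips (deaths) by $1 + \gamma L^N_r(H^N_r)/N$, in line with the discrete rates $2N + \theta$ and $2N + 2\gamma L_i(t)/N$ per individual, since $N L^N_r(H^N_r)$ counts (up to boundary terms) the individuals currently to the left of the branch being explored at its height.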

A Girsanov transformation 2

We have that, as $N \to \infty$,
$$X^N \Rightarrow X = \theta B^1 + 2\gamma \int_0^{\cdot} L_r(H_r)\, dB^2_r,$$
$$Y^N \Rightarrow Y, \quad \text{where } Y_s = \exp\left( X_s - \int_0^s \left[ \frac{\theta^2}{4} + \gamma^2 L^2_r(H_r) \right] dr \right).$$
Define the probability $\widetilde{P}$ such that for all $s > 0$,
$$\frac{d\widetilde{P}\big|_{\mathcal{F}_s}}{dP\big|_{\mathcal{F}_s}} = Y_s.$$
Under $\widetilde{P}$,
$$H_s = \frac{\theta}{2}\, s - \gamma \int_0^s L_r(H_r)\, dr + B_s + \frac{1}{2}\, L_s(0), \qquad s \ge 0.$$

Conclusion

Our main theorem follows by taking the limit as $N \to \infty$ in the identity
$$Z^{N,x}_t = L^N_{S^N_x}(t)$$
under the measures $\widetilde{P}^N$ and $\widetilde{P}$, thanks to the following elementary lemma.

Lemma. Let $(\xi^N, \eta^N)$ and $(\xi, \eta)$ be random pairs defined on a probability space $(\Omega, \mathcal{F}, P)$, with $\eta^N, \eta$ nonnegative scalar random variables and $\xi^N, \xi$ taking values in some complete separable metric space $\mathcal{X}$. Assume that $E[\eta^N] = E[\eta] = 1$. Write $(\widetilde{\xi}^N, \widetilde{\eta}^N)$ for the random pair $(\xi^N, \eta^N)$ considered under the probability measure $\widetilde{P}^N$ which has density $\eta^N$ with respect to $P$, and $(\widetilde{\xi}, \widetilde{\eta})$ for the random pair $(\xi, \eta)$ considered under the probability measure $\widetilde{P}$ which has density $\eta$ with respect to $P$. Then $(\widetilde{\xi}^N, \widetilde{\eta}^N)$ converges in distribution to $(\widetilde{\xi}, \widetilde{\eta})$, provided $(\xi^N, \eta^N)$ converges in distribution to $(\xi, \eta)$.

Appendix: Girsanov's theorem for Poisson processes 1

Let $\{(Q^{(1)}_s, \ldots, Q^{(d)}_s),\ s \ge 0\}$ be a $d$-variate point process adapted to some filtration $\mathcal{F}$, and let $\{\lambda^{(i)}_s,\ s \ge 0\}$ be the $(P, \mathcal{F})$-predictable intensity of $Q^{(i)}$, $1 \le i \le d$. In other words,
$$M^{(i)}_s := Q^{(i)}_s - \int_0^s \lambda^{(i)}_r\, dr$$
is a $(P, \mathcal{F})$ martingale, $1 \le i \le d$. Assume that no two of the $Q^{(i)}, Q^{(j)}$, $i \ne j$, jump simultaneously (so that the $M^{(i)}$'s are mutually orthogonal). Let $\{\mu^{(i)}_r,\ r \ge 0\}$, $1 \le i \le d$, be nonnegative $\mathcal{F}$-predictable processes such that for all $s \ge 0$ and all $1 \le i \le d$,
$$\int_0^s \mu^{(i)}_r\, \lambda^{(i)}_r\, dr < \infty \quad P\text{-a.s.}$$

Appendix: Girsanov's theorem for Poisson processes 2

Theorem. With $\{T^i_k,\ k = 1, 2, \ldots\}$ denoting the jump times of $Q^{(i)}$, let, for $s \ge 0$,
$$Y^{(i)}_s := \Bigl( \prod_{k \ge 1:\ T^i_k \le s} \mu^{(i)}_{T^i_k} \Bigr) \exp\Bigl\{ \int_0^s \bigl(1 - \mu^{(i)}_r\bigr)\, \lambda^{(i)}_r\, dr \Bigr\}, \qquad Y_s = \prod_{j=1}^d Y^{(j)}_s.$$
If $E[Y_s] = 1$ for all $s \ge 0$, then for each $1 \le i \le d$ the process $Q^{(i)}$ has the $(\widetilde{P}, \mathcal{F})$-intensity $\widetilde{\lambda}^{(i)}_r = \mu^{(i)}_r\, \lambda^{(i)}_r$, $r \ge 0$, where the probability measure $\widetilde{P}$ is defined by
$$\frac{d\widetilde{P}\big|_{\mathcal{F}_s}}{dP\big|_{\mathcal{F}_s}} = Y_s, \qquad s \ge 0.$$
Note that $Y^{(i)} = \mathcal{E}(X^{(i)})$, with
$$X^{(i)}_s := \int_0^s \bigl(\mu^{(i)}_r - 1\bigr)\, dM^{(i)}_r, \qquad 1 \le i \le d,\ s \ge 0.$$