Filtrations, Markov Processes and Martingales. Lectures on Lévy Processes and Stochastic Calculus, Braunschweig, Lecture 3: The Lévy-Itô Decomposition

David Applebaum, Probability and Statistics Department, University of Sheffield, UK
July 22nd - 24th 2010

We recall the probability space $(\Omega, \mathcal{F}, P)$ which underlies our investigations. $\mathcal{F}$ contains all possible events in $\Omega$. When we introduce the arrow of time, it is convenient to be able to consider only those events which can occur up to and including time $t$. We denote this sub-$\sigma$-algebra of $\mathcal{F}$ by $\mathcal{F}_t$. To be able to consider all time instants on an equal footing, we define a filtration to be an increasing family $(\mathcal{F}_t, t \geq 0)$ of sub-$\sigma$-algebras of $\mathcal{F}$, i.e. $0 \leq s \leq t < \infty \Rightarrow \mathcal{F}_s \subseteq \mathcal{F}_t$.

A stochastic process $X = (X(t), t \geq 0)$ is adapted to the given filtration if each $X(t)$ is $\mathcal{F}_t$-measurable; e.g. any process is adapted to its natural filtration $\mathcal{F}_t^X = \sigma\{X(s); 0 \leq s \leq t\}$.

An adapted process $X = (X(t), t \geq 0)$ is a Markov process if for all $f \in B_b(\mathbb{R}^d)$ and $0 \leq s \leq t < \infty$,
$$E(f(X(t)) \mid \mathcal{F}_s) = E(f(X(t)) \mid X(s)) \quad \text{(a.s.)} \qquad (0.1)$$
(i.e. past and future are independent, given the present). The transition probabilities of a Markov process are $p_{s,t}(x, A) = P(X(t) \in A \mid X(s) = x)$, i.e. the probability that the process is in the Borel set $A$ at time $t$ given that it is at the point $x$ at the earlier time $s$.

Theorem. If $X$ is a Lévy process (adapted to its own natural filtration) wherein each $X(t)$ has law $q_t$, then it is a Markov process with transition probabilities $p_{s,t}(x, A) = q_{t-s}(A - x)$.

Proof. This essentially follows from
$$E(f(X(t)) \mid \mathcal{F}_s) = E(f(X(s) + X(t) - X(s)) \mid \mathcal{F}_s) = \int_{\mathbb{R}^d} f(X(s) + y)\, q_{t-s}(dy). \qquad \Box$$
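To make the theorem concrete, here is a minimal simulation sketch of my own (not from the lecture) for the Lévy process $X(t) = bt + B(t)$, drifting Brownian motion, whose increments have law $q_u = N(bu, u)$: conditioning on $X(s)$ lying near $x$, the probability that $X(t)$ lands in a set $A$ should match $q_{t-s}(A - x)$. All parameter values are arbitrary choices.

```python
import numpy as np
from math import erf, sqrt

# Sketch: check p_{s,t}(x, A) = q_{t-s}(A - x) for X(t) = b*t + B(t).
rng = np.random.default_rng(0)
b, s, t, n = 0.5, 1.0, 2.0, 10**6
X_s = b * s + sqrt(s) * rng.standard_normal(n)
# stationary independent increments: X(t) - X(s) ~ q_{t-s}, independent of F_s
X_t = X_s + b * (t - s) + sqrt(t - s) * rng.standard_normal(n)

x, A = 0.5, (1.0, 2.0)                      # condition near X(s) = x; A = (1, 2)
sel = np.abs(X_s - x) < 0.05                # crude stand-in for {X(s) = x}
lhs = np.mean((X_t[sel] > A[0]) & (X_t[sel] < A[1]))

def q_cdf(y, u):                            # cdf of q_u = N(b*u, u)
    return 0.5 * (1 + erf((y - b * u) / sqrt(2 * u)))

rhs = q_cdf(A[1] - x, t - s) - q_cdf(A[0] - x, t - s)
print(f"simulated {lhs:.4f} vs q_(t-s)(A - x) = {rhs:.4f}")
```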

Now let $X$ be an adapted process defined on a filtered probability space which also satisfies the integrability requirement $E(|X(t)|) < \infty$ for all $t \geq 0$. We say that it is a martingale if for all $0 \leq s < t < \infty$,
$$E(X(t) \mid \mathcal{F}_s) = X(s) \quad \text{a.s.}$$
Note that if $X$ is a martingale, then the map $t \mapsto E(X(t))$ is constant.

An adapted Lévy process with zero mean is a martingale (with respect to its natural filtration), since in this case, for $0 \leq s \leq t < \infty$ and using the convenient notation $E_s(\cdot) := E(\cdot \mid \mathcal{F}_s)$:
$$E_s(X(t)) = E_s(X(s) + X(t) - X(s)) = X(s) + E(X(t) - X(s)) = X(s).$$

Although there is no good reason why a generic Lévy process should be a martingale (or even have finite mean), there are some important examples, e.g. the processes whose values at time $t$ are

- $\sigma B(t)$, where $B(t)$ is a standard Brownian motion and $\sigma$ is an $r \times d$ matrix;
- $\tilde{N}(t)$, where $\tilde{N}$ is a compensated Poisson process with intensity $\lambda$.

Some important martingales associated to Lévy processes include:

- $\exp\{i(u, X(t)) - t\eta(u)\}$, where $u \in \mathbb{R}^d$ is fixed and $\eta$ is the Lévy symbol;
- $|\sigma B(t)|^2 - \operatorname{tr}(a)t$, where $a = \sigma^T \sigma$;
- $\tilde{N}(t)^2 - \lambda t$.

Càdlàg Paths

A function $f : \mathbb{R}_+ \to \mathbb{R}^d$ is càdlàg if it is "continu à droite et limité à gauche", i.e. right continuous with left limits. Such a function has only jump discontinuities. Define $f(t-) = \lim_{s \uparrow t} f(s)$ and $\Delta f(t) = f(t) - f(t-)$. If $f$ is càdlàg, then $\{0 \leq t \leq T : \Delta f(t) \neq 0\}$ is at most countable.

If the filtration satisfies the "usual hypotheses" of right continuity and completion, then every Lévy process has a càdlàg modification which is itself a Lévy process.
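The left-limit and jump notation is easiest to see on a step function. The following toy sketch of my own (made-up jump times and sizes) evaluates a piecewise-constant càdlàg path $f$, its left limit $f(t-)$, and the jump $\Delta f(t)$:

```python
import numpy as np

# A càdlàg step path: f(t) is the sum of the jump sizes with jump time <= t.
jump_times = np.array([0.3, 0.7, 1.1])     # hypothetical jump times
jump_sizes = np.array([1.0, -2.0, 0.5])    # hypothetical jump sizes

def f(t):        # right continuous: includes a jump occurring exactly at t
    return jump_sizes[jump_times <= t].sum()

def f_left(t):   # left limit f(t-): excludes the jump occurring exactly at t
    return jump_sizes[jump_times < t].sum()

t = 0.7
print(f(t), f_left(t), f(t) - f_left(t))   # Δf(0.7) = f(0.7) - f(0.7-) = -2.0
```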

The Jumps of a Lévy Process - Poisson Random Measures

From now on, we will always make the following assumptions:

- $(\Omega, \mathcal{F}, P)$ will be a fixed probability space equipped with a filtration $(\mathcal{F}_t, t \geq 0)$ which satisfies the "usual hypotheses".
- Every Lévy process $X = (X(t), t \geq 0)$ will be assumed to be $\mathcal{F}_t$-adapted and to have càdlàg sample paths.
- $X(t) - X(s)$ is independent of $\mathcal{F}_s$ for all $0 \leq s < t < \infty$.

The jump process $\Delta X = (\Delta X(t), t \geq 0)$ associated to a Lévy process is defined by
$$\Delta X(t) = X(t) - X(t-)$$
for each $t \geq 0$.

Theorem. If $N$ is a Lévy process which is increasing (a.s.) and is such that $(\Delta N(t), t \geq 0)$ takes values in $\{0, 1\}$, then $N$ is a Poisson process.

Proof. Define a sequence of stopping times recursively by $T_0 = 0$ and $T_n = \inf\{t > 0;\, N(t + T_{n-1}) - N(T_{n-1}) \neq 0\}$ for each $n \in \mathbb{N}$. It follows from (L2) that the sequence $(T_1, T_2 - T_1, \ldots, T_n - T_{n-1}, \ldots)$ is i.i.d.

By (L2) again, we have for each $s, t \geq 0$,
$$P(T_1 > s + t) = P(N(s) = 0, N(t + s) - N(s) = 0) = P(T_1 > s)P(T_1 > t).$$
From the fact that $N$ is increasing (a.s.), it follows easily that the map $t \mapsto P(T_1 > t)$ is decreasing, and by a straightforward application of stochastic continuity (L3) we find that this map is continuous at $t = 0$. Hence there exists $\lambda > 0$ such that $P(T_1 > t) = e^{-\lambda t}$ for each $t \geq 0$. So $T_1$ has an exponential distribution with parameter $\lambda$, and
$$P(N(t) = 0) = P(T_1 > t) = e^{-\lambda t}$$
for each $t \geq 0$. Now assume as an inductive hypothesis that $P(N(t) = n) = e^{-\lambda t}\frac{(\lambda t)^n}{n!}$; then
$$P(N(t) = n + 1) = P(T_{n+2} > t, T_{n+1} \leq t) = P(T_{n+2} > t) - P(T_{n+1} > t).$$
But $T_{n+1} = T_1 + (T_2 - T_1) + \cdots + (T_{n+1} - T_n)$ is the sum of $(n+1)$ i.i.d. exponential random variables, and so has a gamma distribution with density
$$f_{T_{n+1}}(s) = e^{-\lambda s}\frac{\lambda^{n+1} s^n}{n!} \quad \text{for } s > 0.$$
The required result follows on integration. $\Box$
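The mechanism in this proof, i.i.d. exponential inter-arrival times building a Poisson process, is easy to test numerically. A sketch of my own (arbitrary $\lambda$ and $t$; 50 inter-arrival times are ample for this horizon):

```python
import numpy as np
from math import exp, factorial

# Build N(t) from i.i.d. Exp(λ) gaps and compare P(N(t) = k) with the Poisson pmf.
rng = np.random.default_rng(2)
lam, t, n_paths = 2.0, 1.5, 10**5
gaps = rng.exponential(1.0 / lam, size=(n_paths, 50))   # T_n - T_{n-1}
arrivals = np.cumsum(gaps, axis=1)                      # T_1, T_2, ...
N_t = np.sum(arrivals <= t, axis=1)                     # N(t) = #{n : T_n <= t}

for k in range(5):
    empirical = np.mean(N_t == k)
    pmf = exp(-lam * t) * (lam * t)**k / factorial(k)   # e^{-λt} (λt)^k / k!
    print(k, round(empirical, 4), round(pmf, 4))
```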

The following result shows that $\Delta X$ is not a straightforward process to analyse.

Lemma. If $X$ is a Lévy process, then for fixed $t > 0$, $\Delta X(t) = 0$ (a.s.).

Proof. Let $(t(n), n \in \mathbb{N})$ be a sequence in $\mathbb{R}_+$ with $t(n) \uparrow t$ as $n \to \infty$; then since $X$ has càdlàg paths, $\lim_{n \to \infty} X(t(n)) = X(t-)$. However, by (L3) the sequence $(X(t(n)), n \in \mathbb{N})$ converges in probability to $X(t)$, and so has a subsequence which converges almost surely to $X(t)$. The result follows by uniqueness of limits. $\Box$

Much of the analytic difficulty in manipulating Lévy processes arises from the fact that it is possible for them to have
$$\sum_{0 \leq s \leq t} |\Delta X(s)| = \infty \quad \text{a.s.},$$
and the way in which these difficulties are overcome exploits the fact that we always have
$$\sum_{0 \leq s \leq t} |\Delta X(s)|^2 < \infty \quad \text{a.s.}$$
We will gain more insight into these ideas as the discussion progresses.

Rather than exploring $\Delta X$ itself further, we will find it more profitable to count jumps of specified size. More precisely, let $0 \leq t < \infty$ and $A \in \mathcal{B}(\mathbb{R}^d - \{0\})$. Define
$$N(t, A) = \#\{0 \leq s \leq t;\, \Delta X(s) \in A\} = \sum_{0 \leq s \leq t} \mathbf{1}_A(\Delta X(s)).$$
Note that for each $\omega \in \Omega$, $t \geq 0$, the set function $A \mapsto N(t, A)(\omega)$ is a counting measure on $\mathcal{B}(\mathbb{R}^d - \{0\})$, and hence
$$E(N(t, A)) = \int N(t, A)(\omega)\, dP(\omega)$$
is a Borel measure on $\mathcal{B}(\mathbb{R}^d - \{0\})$. We write $\mu(\cdot) = E(N(1, \cdot))$.

We say that $A \in \mathcal{B}(\mathbb{R}^d - \{0\})$ is bounded below if $0 \notin \bar{A}$.
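Here is a small sketch of my own showing the counting measure in action on a simulated compound Poisson path (standard normal jump sizes, an arbitrary choice): $N(t, A)$ simply counts the jumps up to time $t$ that land in $A$, here $A = \{x : |x| > 1/2\}$, which is bounded below. The next lemma shows that the finiteness seen here is automatic whenever $A$ is bounded below.

```python
import numpy as np

# Count N(t, A) for a compound Poisson path with N(0,1)-distributed jumps.
rng = np.random.default_rng(3)
lam, t = 5.0, 10.0
jump_sizes = rng.standard_normal(rng.poisson(lam * t))  # all ΔX(s), 0 <= s <= t

in_A = np.abs(jump_sizes) > 0.5        # indicator 1_A, A = {x : |x| > 1/2}
N_t_A = int(np.sum(in_A))              # N(t, A) = Σ 1_A(ΔX(s))
print(len(jump_sizes), N_t_A)          # total jumps vs jumps landing in A
```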

Lemma. If $A$ is bounded below, then $N(t, A) < \infty$ (a.s.) for all $t \geq 0$.

Proof. Define a sequence of stopping times $(T_n^A, n \in \mathbb{N})$ by $T_1^A = \inf\{t > 0;\, \Delta X(t) \in A\}$ and, for $n > 1$, $T_n^A = \inf\{t > T_{n-1}^A;\, \Delta X(t) \in A\}$. Since $X$ has càdlàg paths, we have $T_1^A > 0$ (a.s.) and $\lim_{n \to \infty} T_n^A = \infty$ (a.s.).

Indeed, suppose that $T_1^A = 0$ with non-zero probability, and let $\mathcal{N} = \{\omega \in \Omega : T_1^A(\omega) = 0\}$. Assume that $\omega \in \mathcal{N}$. Then given any $u > 0$, we can find $0 < \delta, \delta' < u$ and $\epsilon > 0$ such that $|X(\delta)(\omega) - X(\delta')(\omega)| > \epsilon$, and this contradicts the (almost sure) right continuity of $X(\cdot)(\omega)$ at the origin.

Similarly, suppose that $\lim_{n \to \infty} T_n^A < \infty$ with non-zero probability, and define $\mathcal{M} = \{\omega \in \Omega : \lim_{n \to \infty} T_n^A(\omega) < \infty\}$. If $\omega \in \mathcal{M}$, then we obtain a contradiction with the fact that $X$ has a left limit (almost surely) at $\lim_{n \to \infty} T_n^A(\omega)$. Hence, for each $t \geq 0$,
$$N(t, A) = \sum_{n \in \mathbb{N}} \mathbf{1}_{\{T_n^A \leq t\}} < \infty \quad \text{a.s.} \qquad \Box$$

Be aware that if $A$ fails to be bounded below, then this lemma may no longer hold, because of the accumulation of large numbers of small jumps.

The following result should at least be plausible, given Theorem 2 and Lemma 4.

Theorem.
1. If $A$ is bounded below, then $(N(t, A), t \geq 0)$ is a Poisson process with intensity $\mu(A)$.
2. If $A_1, \ldots, A_m \in \mathcal{B}(\mathbb{R}^d - \{0\})$ are disjoint, then the random variables $N(t, A_1), \ldots, N(t, A_m)$ are independent.

It follows immediately that $\mu(A) < \infty$ whenever $A$ is bounded below; hence the measure $\mu$ is $\sigma$-finite.

Poisson Integration

The main properties of $N$, which we will use extensively in the sequel, are summarised below:

1. For each $t > 0$, $\omega \in \Omega$, $N(t, \cdot)(\omega)$ is a counting measure on $\mathcal{B}(\mathbb{R}^d - \{0\})$.
2. For each $A$ bounded below, $(N(t, A), t \geq 0)$ is a Poisson process with intensity $\mu(A) = E(N(1, A))$.
3. The compensator $(\tilde{N}(t, A), t \geq 0)$ is a martingale-valued measure, where $\tilde{N}(t, A) = N(t, A) - t\mu(A)$ for $A$ bounded below; i.e. for fixed $A$ bounded below, $(\tilde{N}(t, A), t \geq 0)$ is a martingale.

Let $f$ be a Borel measurable function from $\mathbb{R}^d$ to $\mathbb{R}^d$ and let $A$ be bounded below; then for each $t > 0$, $\omega \in \Omega$, we may define the Poisson integral of $f$ as a random finite sum by
$$\int_A f(x) N(t, dx)(\omega) := \sum_{x \in A} f(x) N(t, \{x\})(\omega).$$
Note that each $\int_A f(x) N(t, dx)$ is an $\mathbb{R}^d$-valued random variable and gives rise to a càdlàg stochastic process as we vary $t$. Now since $N(t, \{x\}) \neq 0 \Leftrightarrow \Delta X(u) = x$ for at least one $0 \leq u \leq t$, we have
$$\int_A f(x) N(t, dx) = \sum_{0 \leq u \leq t} f(\Delta X(u)) \mathbf{1}_A(\Delta X(u)). \qquad (0.2)$$
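By (0.2), a Poisson integral over a set bounded below is nothing more than a finite sum over the path's jumps. Continuing the simulation sketch from above (same compound Poisson setup; $f(x) = x^2$ is an arbitrary choice of integrand):

```python
import numpy as np

# Evaluate the Poisson integral of f over A via the finite sum (0.2).
rng = np.random.default_rng(4)
lam, t = 5.0, 10.0
jumps = rng.standard_normal(rng.poisson(lam * t))   # the ΔX(u), 0 <= u <= t

f = lambda x: x**2                                  # any Borel measurable f
in_A = np.abs(jumps) > 0.5                          # A = {x : |x| > 1/2}
poisson_integral = np.sum(f(jumps[in_A]))           # Σ f(ΔX(u)) 1_A(ΔX(u))
print(poisson_integral)
```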

In the sequel, we will sometimes use $\mu_A$ to denote the restriction to $A$ of the measure $\mu$. In the following theorem, Var stands for variance.

Theorem. Let $A$ be bounded below. Then:
1. $\left(\int_A f(x) N(t, dx), t \geq 0\right)$ is a compound Poisson process with characteristic function
$$E\left(\exp\left\{i\left(u, \int_A f(x) N(t, dx)\right)\right\}\right) = \exp\left[t \int_{\mathbb{R}^d} (e^{i(u,x)} - 1)\, \mu_{f,A}(dx)\right]$$
for each $u \in \mathbb{R}^d$, where $\mu_{f,A}(B) := \mu(A \cap f^{-1}(B))$ for each $B \in \mathcal{B}(\mathbb{R}^d)$.
2. If $f \in L^1(A, \mu_A)$, then
$$E\left(\int_A f(x) N(t, dx)\right) = t \int_A f(x)\, \mu(dx).$$
3. If $f \in L^2(A, \mu_A)$, then
$$\operatorname{Var}\left(\left|\int_A f(x) N(t, dx)\right|\right) = t \int_A |f(x)|^2\, \mu(dx).$$
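Parts (2) and (3) are easy to probe by Monte Carlo. In the sketch below (my own, not from the lecture), the intensity measure is assumed to be $\mu(dx) = \lambda p(x)\,dx$ with $p$ the standard normal density, so $t\int_A f\,d\mu = t\lambda E[f(J)\mathbf{1}_A(J)]$ for a jump $J \sim N(0,1)$; the $\mu$-integrals are themselves estimated by sampling.

```python
import numpy as np

# Check E = t ∫_A f dμ and Var = t ∫_A f² dμ for μ(dx) = λ N(0,1)(dx).
rng = np.random.default_rng(5)
lam, t, n_paths = 5.0, 2.0, 20000
f = lambda x: x**2
in_A = lambda x: np.abs(x) > 0.5                    # A = {x : |x| > 1/2}

vals = np.empty(n_paths)
for i in range(n_paths):                            # one Poisson integral per path
    jumps = rng.standard_normal(rng.poisson(lam * t))
    vals[i] = np.sum(f(jumps[in_A(jumps)]))

J = rng.standard_normal(10**6)                      # samples to evaluate ∫_A · dμ
print(np.mean(vals), t * lam * np.mean(f(J) * in_A(J)))       # means agree
print(np.var(vals),  t * lam * np.mean(f(J)**2 * in_A(J)))    # variances agree
```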

Proof - part of it! (1) For simplicity, we will prove this result in the case where $f \in L^1(A, \mu_A)$. First let $f$ be a simple function and write $f = \sum_{j=1}^n c_j \mathbf{1}_{A_j}$, where each $c_j \in \mathbb{R}^d$. We can assume, without loss of generality, that the $A_j$'s are disjoint Borel subsets of $A$. By Theorem 5, we find that
$$E\left(\exp\left\{i\left(u, \int_A f(x) N(t, dx)\right)\right\}\right) = E\left(\exp\left\{i\left(u, \sum_{j=1}^n c_j N(t, A_j)\right)\right\}\right) = \prod_{j=1}^n E\left(\exp\left\{i(u, c_j) N(t, A_j)\right\}\right) = \prod_{j=1}^n \exp\left[t\left(e^{i(u, c_j)} - 1\right)\mu(A_j)\right] = \exp\left[t \int_A \left(e^{i(u, f(x))} - 1\right)\mu(dx)\right].$$
Now for an arbitrary $f \in L^1(A, \mu_A)$, we can find a sequence of simple functions converging to $f$ in $L^1$, and hence a subsequence which converges to $f$ almost surely. Passing to the limit along this subsequence in the above yields the required result, via dominated convergence. (2) and (3) follow from (1) by differentiation. $\Box$

It follows from Theorem 6 (2) that a Poisson integral will fail to have a finite mean if $f \notin L^1(A, \mu_A)$.

For each $f \in L^1(A, \mu_A)$ and $t \geq 0$, we define the compensated Poisson integral by
$$\int_A f(x) \tilde{N}(t, dx) = \int_A f(x) N(t, dx) - t \int_A f(x)\, \mu(dx).$$
A straightforward argument shows that $\left(\int_A f(x) \tilde{N}(t, dx), t \geq 0\right)$ is a martingale, and we will use this fact extensively in the sequel.

Note that by Theorem 6 (2) and (3), we can easily deduce the following two important facts:
$$E\left(\exp\left\{i\left(u, \int_A f(x) \tilde{N}(t, dx)\right)\right\}\right) = \exp\left[t \int_{\mathbb{R}^d} \left(e^{i(u,x)} - 1 - i(u, x)\right)\mu_{f,A}(dx)\right] \qquad (0.3)$$
for each $u \in \mathbb{R}^d$, and, for $f \in L^2(A, \mu_A)$,
$$E\left(\left|\int_A f(x) \tilde{N}(t, dx)\right|^2\right) = t \int_A |f(x)|^2\, \mu(dx). \qquad (0.4)$$

Processes of Finite Variation

We begin by introducing a useful class of functions. Let $\mathcal{P} = \{a = t_1 < t_2 < \cdots < t_n < t_{n+1} = b\}$ be a partition of the interval $[a, b]$ in $\mathbb{R}$, and define its mesh to be $\delta = \max_{1 \leq i \leq n} |t_{i+1} - t_i|$. We define the variation $\operatorname{Var}_{\mathcal{P}}(g)$ of a càdlàg mapping $g : [a, b] \to \mathbb{R}^d$ over the partition $\mathcal{P}$ by the prescription
$$\operatorname{Var}_{\mathcal{P}}(g) = \sum_{i=1}^n |g(t_{i+1}) - g(t_i)|.$$
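$\operatorname{Var}_{\mathcal{P}}$ is a one-line computation on sampled values. The sketch below (my own) applies it to Brownian motion on $[0, 1]$ over ever finer partitions; the partition variations grow like $\sqrt{n}$ rather than converging, a preview of the fact, used later, that a Lévy process of finite variation can have no Brownian part.

```python
import numpy as np

def var_P(g_vals: np.ndarray) -> float:
    """Var_P(g) = Σ |g(t_{i+1}) - g(t_i)| over consecutive partition points."""
    return float(np.sum(np.abs(np.diff(g_vals))))

# Brownian motion sampled on a mesh-1/n partition of [0,1]: Var_P ~ sqrt(2n/π).
rng = np.random.default_rng(6)
for n in (10**2, 10**4, 10**6):
    B = np.concatenate(([0.0], np.cumsum(rng.standard_normal(n)) / np.sqrt(n)))
    print(n, round(var_P(B), 2))
```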

If $V(g) = \sup_{\mathcal{P}} \operatorname{Var}_{\mathcal{P}}(g) < \infty$, we say that $g$ has finite variation on $[a, b]$. If $g$ is defined on the whole of $\mathbb{R}$ (or $\mathbb{R}_+$), it is said to have finite variation if it has finite variation on each compact interval.

It is a trivial observation that every non-decreasing $g$ is of finite variation. Conversely, if $g$ is of finite variation, then it can always be written as the difference of two non-decreasing functions; to see this, just write
$$g = \frac{V(g) + g}{2} - \frac{V(g) - g}{2},$$
where $V(g)(t)$ is the variation of $g$ on $[a, t]$.

Functions of finite variation are important in integration, for suppose that we are given a function $g$ which we are proposing as an integrator; then as a minimum we will want to be able to define the Stieltjes integral $\int_I f\, dg$ for all continuous functions $f$ (where $I$ is some finite interval). In fact, a necessary and sufficient condition for obtaining such an integral as a limit of Riemann sums is that $g$ has finite variation.

A stochastic process $(X(t), t \geq 0)$ is of finite variation if the paths $(X(t)(\omega), t \geq 0)$ are of finite variation for almost all $\omega \in \Omega$. The following is an important example for us.

Example (Poisson Integrals). Let $N$ be a Poisson random measure with intensity measure $\mu$, and let $f : \mathbb{R}^d \to \mathbb{R}^d$ be Borel measurable. For $A$ bounded below, let $Y = (Y(t), t \geq 0)$ be given by $Y(t) = \int_A f(x) N(t, dx)$; then $Y$ is of finite variation on $[0, t]$ for each $t \geq 0$. To see this, we observe that for all partitions $\mathcal{P}$ of $[0, t]$, we have
$$\operatorname{Var}_{\mathcal{P}}(Y) \leq \sum_{0 \leq s \leq t} |f(\Delta X(s))| \mathbf{1}_A(\Delta X(s)) < \infty \quad \text{a.s.}, \qquad (0.5)$$
where $X(t) = \int_A x N(t, dx)$ for each $t \geq 0$.

In fact, a necessary and sufficient condition for a Lévy process to be of finite variation is that there is no Brownian part (i.e. $a = 0$ in the Lévy-Khintchine formula), and $\int_{|x|<1} |x|\, \nu(dx) < \infty$.
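The decomposition $g = \frac{V(g)+g}{2} - \frac{V(g)-g}{2}$ can be checked directly on a discretely sampled path; a small sketch of my own, on a random walk path (a finite-variation path), with $g$ normalised by its starting value:

```python
import numpy as np

# Jordan-type decomposition: both parts below are nondecreasing and differ by g.
rng = np.random.default_rng(7)
g = np.cumsum(rng.choice([-1.0, 1.0], size=1000))           # finite-variation path
V = np.concatenate(([0.0], np.cumsum(np.abs(np.diff(g)))))  # running variation V(g)(t)

up, down = (V + g - g[0]) / 2, (V - g + g[0]) / 2
assert np.all(np.diff(up) >= 0) and np.all(np.diff(down) >= 0)
print(np.allclose(up - down, g - g[0]))                     # difference recovers g
```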

The Lévy-Itô Decomposition

This is the key result of this lecture.

First, note that for $A$ bounded below and each $t \geq 0$,
$$\int_A x N(t, dx) = \sum_{0 \leq u \leq t} \Delta X(u) \mathbf{1}_A(\Delta X(u))$$
is the sum of all the jumps taking values in the set $A$ up to time $t$. Since the paths of $X$ are càdlàg, this is clearly a finite random sum. In particular, $\int_{|x| \geq 1} x N(t, dx)$ is the sum of all jumps of size bigger than one. It is a compound Poisson process and has finite variation, but may have no finite moments. On the other hand, it can be shown that $X(t) - \int_{|x| \geq 1} x N(t, dx)$ is a Lévy process having finite moments to all orders.

Now let's turn our attention to the small jumps. We study compensated integrals, which we know are martingales. Introduce the notation $M(t, A) := \int_A x \tilde{N}(t, dx)$ for $t \geq 0$ and $A$ bounded below. For each $m \in \mathbb{N}$, let
$$B_m = \left\{x \in \mathbb{R}^d : \frac{1}{m+1} < |x| \leq \frac{1}{m}\right\},$$
and for each $n \in \mathbb{N}$, let $A_n = \bigcup_{m=1}^n B_m$. Define
$$\int_{|x|<1} x \tilde{N}(t, dx) := L^2\text{-}\lim_{n \to \infty} M(t, A_n),$$
which is a martingale. Moreover, on taking limits in (0.3), we get
$$E\left(\exp\left\{i\left(u, \int_{|x|<1} x \tilde{N}(t, dx)\right)\right\}\right) = \exp\left[t \int_{|x|<1} \left(e^{i(u,x)} - 1 - i(u, x)\right)\mu(dx)\right].$$
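Why does the $L^2$ limit exist? By (0.4), $E(|M(t, A_N) - M(t, A_n)|^2) = t\int_{A_N - A_n} |x|^2\, \mu(dx)$, and the intensity measure always integrates $|x|^2$ near the origin. A sketch of my own, with the concrete (assumed) intensity $\mu(dx) = x^{-1-\alpha}\,dx$ on $(0, 1]$ and $\alpha = 1.5$, evaluates these Cauchy differences in closed form:

```python
# For μ(dx) = x^(-1-α) dx on (0,1]: t ∫ x² μ(dx) over (lo, hi] in closed form.
alpha, t = 1.5, 1.0

def l2_gap(lo, hi):          # E|M(t, A_N) - M(t, A_n)|², with A_N - A_n ≈ (lo, hi]
    return t * (hi**(2 - alpha) - lo**(2 - alpha)) / (2 - alpha)

for n, N in [(10, 100), (100, 1000), (1000, 10000)]:
    print(n, N, l2_gap(1.0 / N, 1.0 / n))    # -> 0, so (M(t, A_n)) is L²-Cauchy
# By contrast, ∫ x μ(dx) = ∞ here, so the raw small-jump sum cannot converge absolutely.
```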

Consider
$$B_a(t) = X(t) - bt - \int_{|x|<1} x \tilde{N}(t, dx) - \int_{|x| \geq 1} x N(t, dx),$$
where $b = E\left(X(1) - \int_{|x| \geq 1} x N(1, dx)\right)$. The process $B_a$ is a centred martingale with continuous sample paths. With a little more work, we can show that $\operatorname{Cov}(B_a^i(t), B_a^j(t)) = a_{ij} t$. Using Lévy's characterisation of Brownian motion (see later), we deduce that $B_a$ is a Brownian motion with covariance $a$. Hence we have:

Theorem (The Lévy-Itô Decomposition). If $X$ is a Lévy process, then there exist $b \in \mathbb{R}^d$, a Brownian motion $B_a$ with covariance matrix $a$ in $\mathbb{R}^d$, and an independent Poisson random measure $N$ on $\mathbb{R}_+ \times (\mathbb{R}^d - \{0\})$ such that, for each $t \geq 0$,
$$X(t) = bt + B_a(t) + \int_{|x|<1} x \tilde{N}(t, dx) + \int_{|x| \geq 1} x N(t, dx). \qquad (0.6)$$
Note that the three processes in this decomposition are all independent.

An interesting by-product of the Lévy-Itô decomposition is the Lévy-Khintchine formula, which follows easily by independence in the Lévy-Itô decomposition:

Corollary. If $X$ is a Lévy process, then for each $u \in \mathbb{R}^d$, $t \geq 0$,
$$E\left(e^{i(u, X(t))}\right) = \exp\left(t\left[i(b, u) - \frac{1}{2}(u, au) + \int_{\mathbb{R}^d - \{0\}} \left(e^{i(u,y)} - 1 - i(u, y)\mathbf{1}_{\{|y|<1\}}(y)\right)\mu(dy)\right]\right), \qquad (0.7)$$
so the intensity measure $\mu$ is the Lévy measure for $X$, and from now on we write $\mu$ as $\nu$.

The process $\int_{|x|<1} x \tilde{N}(t, dx)$ is the compensated sum of small jumps. The compensation takes care of the analytic complications in the Lévy-Khintchine formula in a probabilistically pleasing way, since it is an $L^2$-martingale. The process $\int_{|x| \geq 1} x N(t, dx)$ describes the large jumps; it is a compound Poisson process, but may have no finite moments.
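As a numerical illustration of (0.6) and (0.7) together, the following sketch of my own assembles a Lévy process from independent pieces, drift, Brownian motion ($a = 1$) and compound Poisson jumps with $\nu = \lambda \cdot N(0,1)$, and compares its empirical characteristic function with the Lévy-Khintchine prediction. The jump law is symmetric, so the compensator term $\int_{|y|<1} y\, \nu(dy)$ vanishes and no drift correction is needed.

```python
import numpy as np

# Monte Carlo check of E exp(iuX(t)) against the Lévy-Khintchine formula.
rng = np.random.default_rng(8)
b, lam, t, u, n = 0.3, 2.0, 1.0, 1.2, 10**6

K = rng.poisson(lam * t, n)                      # number of jumps per path
cp = np.sqrt(K) * rng.standard_normal(n)         # sum of K N(0,1) jumps ~ N(0, K)
X_t = b * t + np.sqrt(t) * rng.standard_normal(n) + cp

lhs = np.mean(np.exp(1j * u * X_t))
# η(u) = iub - u²/2 + λ ∫ (e^{iuy} - 1) N(0,1)(dy) = iub - u²/2 + λ(e^{-u²/2} - 1)
rhs = np.exp(t * (1j * u * b - u**2 / 2 + lam * (np.exp(-u**2 / 2) - 1)))
print(lhs, rhs, abs(lhs - rhs))
```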

A Lévy process has finite variation iff its Lévy-Itô decomposition takes the form
$$X(t) = \gamma t + \int_{x \neq 0} x N(t, dx) = \gamma t + \sum_{0 \leq s \leq t} \Delta X(s),$$
where $\gamma = b - \int_{|x|<1} x\, \nu(dx)$.

H. Geman, D. Madan and M. Yor have proposed a nice financial interpretation for the jump terms in the Lévy-Itô decomposition. Where the intensity measure is infinite, the stock price manifests "infinite activity", and this is the mathematical signature of the jitter arising from the interaction of pure supply shocks and pure demand shocks. On the other hand, where the intensity measure is finite, we have "finite activity", and this corresponds to sudden shocks that can cause unexpected movements in the market, such as a terrorist atrocity or a major earthquake.

If a pure jump Lévy process (no Brownian part) has finite activity, then it has finite variation. The converse is false.

The first three terms on the rhs of (0.6) have finite moments to all orders, so if a Lévy process fails to have a moment, this is due entirely to the large jumps / finite activity part. In fact,
$$E(|X(t)|^n) < \infty \text{ for all } t > 0 \iff \int_{|x| \geq 1} |x|^n\, \nu(dx) < \infty.$$

A Lévy process is a martingale iff it is integrable and
$$b + \int_{|x| \geq 1} x\, \nu(dx) = 0.$$
A square-integrable Lévy process is a martingale iff it is centred, and then
$$X(t) = B_a(t) + \int_{\mathbb{R}^d - \{0\}} x \tilde{N}(t, dx).$$

Semimartingales

A stochastic process $X$ is a semimartingale if it is an adapted process such that, for each $t \geq 0$,
$$X(t) = X(0) + M(t) + C(t),$$
where $M = (M(t), t \geq 0)$ is a local martingale and $C = (C(t), t \geq 0)$ is an adapted process of finite variation. In particular, every Lévy process is a semimartingale: to see this, use the Lévy-Itô decomposition to write
$$M(t) = B_a(t) + \int_{|x|<1} x \tilde{N}(t, dx) \quad \text{(a martingale)}, \qquad C(t) = bt + \int_{|x| \geq 1} x N(t, dx).$$
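To see the martingale criterion above in action, here is a last sketch of my own with an assumed jump structure: intensity $\nu = \lambda \cdot \mathrm{Exp}(1)$ plus a pure drift. Writing the process as $X(t) = ct + {}$compound Poisson, the criterion $b + \int_{|x| \geq 1} x\, \nu(dx) = 0$, with $b = c + \int_{|x|<1} x\, \nu(dx)$ in the coordinates of (0.6), reduces to $c + \lambda E[J] = 0$, i.e. $c = -\lambda$; the sample mean then stays near 0 at every $t$, as a martingale requires.

```python
import numpy as np

# X(t) = c*t + compound Poisson(λ, Exp(1)); c = -λ makes E X(t) ≡ 0.
rng = np.random.default_rng(9)
lam, n = 2.0, 10**6
c = -lam                                        # drift from the martingale criterion
for t in (0.5, 1.0, 2.0):
    K = rng.poisson(lam * t, n)                 # jump counts
    cp = rng.gamma(np.maximum(K, 1), 1.0) * (K > 0)   # sum of K Exp(1) ~ Gamma(K, 1)
    print(t, np.mean(c * t + cp))               # ~ 0 for every t
```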