Lecture 10: Characterization of Infinitely Divisible Distributions

1 Characterization of Infinitely Divisible Distributions

We have shown that a distribution $\mu$ is infinitely divisible if and only if it is the weak limit of $S_n := X_{n,1} + \cdots + X_{n,n}$ for a uniformly infinitesimal triangular array of independent random variables $(X_{n,i})_{1 \le i \le n}$. Examples include the delta measure supported at a constant, the Gaussian distribution, the compound Poisson distribution, and their convolutions. We will show that every infinitely divisible distribution is essentially the convolution of a delta measure, a Gaussian distribution, and a compound Poisson distribution.

Our starting point is the result (Accompanying Laws) that every infinitely divisible $\mu$ is necessarily the weak limit of a uniformly infinitesimal triangular array $(X_{n,i})_{1 \le i \le n}$, where each $X_{n,i} - a_{n,i}$ is compound Poisson distributed for some $a_{n,i} \in \mathbb{R}$. Let $\varphi_n$ and $\varphi_{n,i}$ denote respectively the characteristic functions of $S_n$ and $X_{n,i} - a_{n,i}$. Since $X_{n,i} - a_{n,i}$ is a compound Poisson random variable, we have

$$\varphi_{n,i}(t) = \exp\Big\{ c_{n,i} \int \big(e^{itx} - 1\big)\, \mu_{n,i}(dx) \Big\} =: \exp\Big\{ \int \big(e^{itx} - 1\big)\, M_{n,i}(dx) \Big\}$$

for some $c_{n,i} \ge 0$ and some probability measure $\mu_{n,i}$ on $\mathbb{R}$, where we have set $M_{n,i}(dx) := c_{n,i}\, \mu_{n,i}(dx)$. Let $A_n := \sum_{i=1}^n a_{n,i}$ and $M_n(dx) := \sum_{i=1}^n M_{n,i}(dx)$. Then

$$\varphi_n(t) = \prod_{i=1}^n e^{i a_{n,i} t}\, \varphi_{n,i}(t) = \exp\Big\{ itA_n + \int \big(e^{itx} - 1\big)\, M_n(dx) \Big\} \longrightarrow \varphi(t).$$

Identifying the limit $\varphi$ is thus reduced to identifying the possible limits of the exponent $itA_n + \int (e^{itx} - 1)\, M_n(dx)$. If $A_n$ converges to some $A \in \mathbb{R}$ and $M_n$ converges weakly to some finite measure $M$ on $\mathbb{R}$, then clearly

$$\varphi_n(t) \to \varphi(t) = e^{itA + \int (e^{itx}-1)\, M(dx)},$$

which is the characteristic function of a compound Poisson random variable shifted by $A$. However, in general $M_n$ need not converge weakly to a finite measure. Recall that a compound Poisson random variable with characteristic function $e^{\int (e^{itx}-1)\, M_n(dx)}$ can be written as $\sum_{i=1}^{N_n} \xi_{n,i}$, where $N_n$ is a Poisson random variable with mean $M_n(\mathbb{R})$, while $(\xi_{n,i})_{i \in \mathbb{N}}$ are i.i.d. random variables with distribution $M_n(dx)/M_n(\mathbb{R})$. It may happen that $M_n(\mathbb{R}) \to \infty$, so that the number of summands $N_n$ tends to infinity in probability; however, this is compensated by $M_n(dx)/M_n(\mathbb{R})$, the distribution of $\xi_{n,i}$, converging weakly to $\delta_0$ as $n \to \infty$. In other words, $M_n$ may have more and more of its mass near the origin. To make sense of such convergence, we write

$$\Psi_n(t) := itA_n + \int \big(e^{itx} - 1\big)\, M_n(dx) = itB_n + \int \big(e^{itx} - 1 - it\theta(x)\big)\, M_n(dx) = itB_n + \int \frac{e^{itx} - 1 - it\theta(x)}{x^2 \wedge 1}\, (x^2 \wedge 1)\, M_n(dx), \qquad (1.1)$$

where $\theta : \mathbb{R} \to \mathbb{R}$ is any bounded continuous function with $\theta(x) = x + O(|x|^3)$ near $0$, and $B_n := A_n + \int \theta(x)\, M_n(dx)$.
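Before turning to the truncation function $\theta$, the compound Poisson formula above can be sanity-checked numerically. The sketch below (my own illustration, not part of the notes) takes a hypothetical two-atom jump measure $M = 2\delta_1 + \delta_{-3}$ and compares $\exp\{\int (e^{itx}-1)\,M(dx)\}$ against $E[e^{itS}]$ computed directly from the Poisson-sum representation $S = \sum_{i=1}^{N} \xi_i$, by summing over the distribution of $N$:

```python
import cmath
import math

# Hypothetical finite jump measure M = 2*delta_1 + 1*delta_{-3}  (atom -> mass).
M = {1.0: 2.0, -3.0: 1.0}
lam = sum(M.values())  # total mass M(R) = mean of the Poisson count N

def phi_exponent(t):
    """The exponent int (e^{itx} - 1) M(dx) for the discrete measure above."""
    return sum(m * (cmath.exp(1j * t * x) - 1) for x, m in M.items())

def phi_direct(t, nmax=60):
    """E[e^{itS}] for S = xi_1 + ... + xi_N, with N ~ Poisson(lam) and
    xi_i i.i.d. with law M/M(R), computed by summing P(N = n) * phi_xi(t)^n."""
    phi_xi = sum(m / lam * cmath.exp(1j * t * x) for x, m in M.items())
    return sum(math.exp(-lam) * lam**n / math.factorial(n) * phi_xi**n
               for n in range(nmax))

for t in (0.0, 0.5, 1.7, -3.2):
    assert abs(cmath.exp(phi_exponent(t)) - phi_direct(t)) < 1e-9
```

The agreement is exact up to the (astronomically small) truncation of the Poisson tail at `nmax`.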

The standard choices for $\theta$ are $\theta(x) = \frac{x}{1+x^2}$ or $\theta(x) = \sin x$. Note that $\frac{e^{itx} - 1 - it\theta(x)}{x^2 \wedge 1}$ is a bounded continuous function on $\mathbb{R}$, equal to $-t^2/2$ at $x = 0$, and hence $\Psi_n$ in (1.1) converges to the well-defined limit

$$\Psi(t) := itB - \frac{\sigma^2 t^2}{2} + \int_{\mathbb{R}\setminus\{0\}} \frac{e^{itx} - 1 - it\theta(x)}{x^2 \wedge 1}\, (x^2 \wedge 1)\, M(dx) \qquad (1.2)$$

if $B_n \to B \in \mathbb{R}$ and $(x^2 \wedge 1)\, M_n(dx)$ converges weakly to a finite measure $\nu$, with $\sigma^2 := \nu(\{0\})$ and $M(dx) := \frac{\nu(dx)}{x^2 \wedge 1}$ on $\mathbb{R}\setminus\{0\}$. We remark that even when $B_n$ converges, $A_n$ and $\int \theta(x)\, M_n(dx)$ may both diverge as $n \to \infty$.

We will make the above heuristics rigorous and show that every infinitely divisible distribution is uniquely determined by a triple $(B, \sigma^2, M)$ as in (1.2). First, we show that $e^{\Psi(t)}$ is indeed a characteristic function. For the following results, we assume that $\theta(\cdot)$ is bounded and continuous, with $\theta(x) = x + O(|x|^3)$ as $x \to 0$; this assumption will be relaxed at the end of our discussion.

Theorem 1.1 [Existence of i.d.d. with exponent $\Psi$] Let $\theta(\cdot)$ be a bounded and continuous function with $\theta(x) = x + O(|x|^3)$ as $x \to 0$. Let $B \in \mathbb{R}$, $\sigma^2 \ge 0$, and let $M(dx)$ be a measure on $\mathbb{R}$ satisfying

$$M(\{0\}) = 0 \qquad \text{and} \qquad \int (x^2 \wedge 1)\, M(dx) < \infty. \qquad (1.3)$$

Then $e^{\Psi(t)}$, defined as in (1.2), is the characteristic function of an infinitely divisible distribution (i.d.d.) $\mu$. We call $(B, \sigma^2, M)$ the Lévy triple for $\mu$ w.r.t. $\theta$, and $M$ the Lévy measure.

Remark. Note that in the Lévy triple, $B$ depends on $\theta(\cdot)$, while $\sigma^2$ and $M$ do not.

Proof. Let $M_n(dx) := 1_{\{|x| \ge 1/n\}}\, M(dx)$, and let

$$\Psi_n(t) := itB - \frac{\sigma^2 t^2}{2} + \int \frac{e^{itx} - 1 - it\theta(x)}{x^2 \wedge 1}\, (x^2 \wedge 1)\, M_n(dx).$$

Since $M_n$ is a finite measure, $e^{\Psi_n(t)}$ is the characteristic function of a probability distribution, namely the convolution of a delta measure, a Gaussian distribution, and a compound Poisson distribution. Note that $\frac{e^{itx} - 1 - it\theta(x)}{x^2 \wedge 1}$ is a bounded continuous function on $\mathbb{R}$ by the choice of $\theta$, while $(x^2 \wedge 1)\, M_n(dx)$ is a sequence of measures increasing to the finite measure $(x^2 \wedge 1)\, M(dx)$. Therefore, by the bounded convergence theorem, $\Psi_n(t) \to \Psi(t)$, and hence $e^{\Psi_n(t)} \to e^{\Psi(t)}$ for each $t \in \mathbb{R}$. Note also that $\Psi(0) = 0$.

To show the continuity of $\Psi$ in $t$, note that in $\Psi(t) = itB - \frac{\sigma^2 t^2}{2} + \int \frac{e^{itx} - 1 - it\theta(x)}{x^2 \wedge 1}\, (x^2 \wedge 1)\, M(dx)$, the integrand $\frac{e^{itx} - 1 - it\theta(x)}{x^2 \wedge 1}$ is continuous in $t$ for each $x \in \mathbb{R}\setminus\{0\}$, and is uniformly bounded in $x \in \mathbb{R}\setminus\{0\}$ and $t \in [-T, T]$ for any $T > 0$, which can be seen by Taylor expanding $e^{itx}$ and $\theta(x)$ in a small neighborhood of $0$. We can then apply the bounded convergence theorem to conclude that $\Psi(t)$ is continuous in $t$, and hence so is $e^{\Psi(t)}$. By Lévy's continuity theorem, $e^{\Psi(t)}$ must be the characteristic function of some distribution $\mu$. The same argument shows that for any $a \ge 0$, $e^{a\Psi(t)}$ is also a characteristic function, and hence $\mu$ is infinitely divisible.

Next we study convergence for a sequence of distributions with Lévy triples $(B_n, \sigma_n^2, M_n)$ w.r.t. $\theta(\cdot)$, which will then be used to show that every infinitely divisible $\mu$ admits a Lévy triple.
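The boundedness and continuity claims for the integrand in the proof above can be checked numerically. The following sketch (my own check, assuming the truncation $x^2 \wedge 1$ and the choice $\theta = \sin$ from the text) evaluates $g_t(x) = \frac{e^{itx} - 1 - it\theta(x)}{x^2 \wedge 1}$ over a range of $x$ and confirms the value $-t^2/2$ at the origin:

```python
import cmath
import math

def g(t, x, theta=math.sin):
    """Integrand (e^{itx} - 1 - i*t*theta(x)) / (x^2 min 1), set to -t^2/2 at x = 0."""
    if x == 0.0:
        return complex(-t * t / 2.0)
    return (cmath.exp(1j * t * x) - 1 - 1j * t * theta(x)) / min(x * x, 1.0)

t = 2.0
# Bounded over both small and large |x| (for this fixed t) ...
vals = [abs(g(t, x)) for x in (1e-4, 1e-2, 0.5, 1.0, 10.0, 1e6, -0.3, -1e5)]
assert max(vals) < 10.0
# ... and continuous at 0 with limiting value -t^2/2.
assert abs(g(t, 1e-4) - g(t, 0.0)) < 1e-3
```

For large $|x|$ the denominator is $1$ and the numerator is bounded by $2 + |t|\,\sup|\theta|$; near $0$ the Taylor expansion $e^{itx} - 1 - it\theta(x) = -\tfrac{t^2 x^2}{2} + O(|x|^3)$ takes over, exactly as the proof argues.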

Theorem 1.2 [Convergence] Let $(\mu_n)_{n \in \mathbb{N}}$ be a sequence of probability distributions with Lévy triples $(B_n, \sigma_n^2, M_n)_{n \in \mathbb{N}}$ w.r.t. a bounded continuous $\theta$ with $\theta(x) = x + O(|x|^3)$ as $x \to 0$. Then $\mu_n$ converges weakly to a limit $\mu$ if and only if the following conditions are satisfied:

(i) $B_n \to B \in \mathbb{R}$ as $n \to \infty$;

(ii) $\nu_n(dx) := \sigma_n^2\, \delta_0(dx) + (x^2 \wedge 1)\, M_n(dx)$ converges weakly to some $\nu(dx) := \sigma^2\, \delta_0(dx) + (x^2 \wedge 1)\, M(dx)$ with $M(\{0\}) = 0$, i.e., $\int f\, d\nu_n \to \int f\, d\nu$ as $n \to \infty$ for all bounded continuous $f : \mathbb{R} \to \mathbb{R}$;

in which case $(B, \sigma^2, M)$ is the Lévy triple for $\mu$ w.r.t. $\theta$.

Proof. Note that for each $t \in \mathbb{R}$, $g_t(x) := \frac{e^{itx} - 1 - it\theta(x)}{x^2 \wedge 1}$ defines a bounded continuous function on $\mathbb{R}$ with $g_t(0) := -t^2/2$. If conditions (i) and (ii) are satisfied, then for each $t \in \mathbb{R}$,

$$\Psi_n(t) := itB_n - \frac{\sigma_n^2 t^2}{2} + \int \big(e^{itx} - 1 - it\theta(x)\big)\, M_n(dx) = itB_n + \int g_t(x)\, \nu_n(dx) \longrightarrow itB + \int g_t(x)\, \nu(dx) = itB - \frac{\sigma^2 t^2}{2} + \int \frac{e^{itx} - 1 - it\theta(x)}{x^2 \wedge 1}\, (x^2 \wedge 1)\, M(dx) =: \Psi(t).$$

The assumption $\nu_n \Rightarrow \nu$ implicitly implies that $\nu(\mathbb{R}) < \infty$, since $\int f\, d\nu$ must be well-defined for all bounded continuous $f$, and hence $M$ satisfies (1.3) and is a Lévy measure. Therefore, by Theorem 1.1, $(B, \sigma^2, M)$ is the Lévy triple for a probability measure $\mu$. Since $e^{\Psi_n(t)} \to e^{\Psi(t)}$ for each $t \in \mathbb{R}$, it follows that $\mu_n \Rightarrow \mu$.

Conversely, assume that $\mu_n \Rightarrow \mu$. Our strategy is to show first that the sequence of measures $(\nu_n)_{n \in \mathbb{N}}$ is tight, and then to show that for every subsequence along which $\nu_n$ converges, the limits in (i) and (ii) exist and are unique. Let $\varphi_n$ and $\varphi$ be respectively the characteristic functions of $\mu_n$ and $\mu$. Then $\varphi_n(t) = e^{\Psi_n(t)}$ converges uniformly to $\varphi(t)$ on bounded intervals. In particular, on $[-T, T]$ for $T$ small enough that $\varphi(t)$ does not vanish there, $\log |\varphi_n(t)|$ converges uniformly to $\log |\varphi(t)|$, which is continuous and vanishes at $t = 0$. For each $\epsilon > 0$, we can first choose $n_0 \in \mathbb{N}$ sufficiently large and then $\delta > 0$ sufficiently small to obtain

$$\sup_{n \ge n_0}\, \sup_{t \in [-\delta,\delta]} \big|\log |\varphi_n(t)|\big| \le \sup_{n \ge n_0}\, \sup_{t \in [-\delta,\delta]} \Big( \big|\log |\varphi_n(t)| - \log |\varphi(t)|\big| + \big|\log |\varphi(t)|\big| \Big) \le \sup_{n \ge n_0}\, \sup_{t \in [-T,T]} \big|\log |\varphi_n(t)| - \log |\varphi(t)|\big| + \sup_{t \in [-\delta,\delta]} \big|\log |\varphi(t)|\big| \le \epsilon.$$

By choosing $\delta$ even smaller, we can extend the uniformity to all $n \in \mathbb{N}$, i.e., for every $\epsilon > 0$ there exists $\delta > 0$ such that

$$\sup_{t \in [-\delta,\delta]}\, \sup_{n \in \mathbb{N}} \big|\log |\varphi_n(t)|\big| \le \epsilon. \qquad (1.4)$$

Note that

$$-\log |\varphi_n(t)| = -\mathrm{Re}\big(\Psi_n(t)\big) = \frac{\sigma_n^2 t^2}{2} + \int \big(1 - \cos tx\big)\, M_n(dx) = \int h_t(x)\, \nu_n(dx),$$

where $h_t(x) := \frac{1 - \cos tx}{x^2 \wedge 1} \ge 0$ for $x \ne 0$ and $h_t(0) := t^2/2$. We can then restate (1.4) as

$$\lim_{\delta \downarrow 0}\, \sup_{t \in [-\delta,\delta]}\, \sup_{n \in \mathbb{N}} \int h_t(x)\, \nu_n(dx) = 0. \qquad (1.5)$$

For each $l > 0$, we can choose $\delta > 0$ sufficiently small such that $\inf_{x : |x| \le l} h_\delta(x) =: C(\delta, l) > 0$, which can be verified by Taylor expansion. Using (1.5) with $t = \delta$, we obtain

$$\sup_{n \in \mathbb{N}} \nu_n([-l, l]) \le \sup_{n \in \mathbb{N}} \int_{[-l,l]} \frac{h_\delta(x)}{C(\delta, l)}\, \nu_n(dx) \le C(\delta, l)^{-1}\, \sup_{n \in \mathbb{N}} \int h_\delta(x)\, \nu_n(dx) < \infty. \qquad (1.6)$$

On the other hand, (1.5) also implies that

$$\lim_{\delta \downarrow 0}\, \sup_{t \in [-\delta,\delta]}\, \sup_{n \in \mathbb{N}} \int_{|x| \ge 1} (1 - \cos tx)\, \nu_n(dx) = 0.$$

We have seen time and again the close link between $\int (1 - \cos tx)\, \nu_n(dx)$ for $t$ close to $0$ and $\nu_n(\{x : |x| > l\})$ for $l$ large. More precisely, if we integrate the above equation over $t \in [-\delta, \delta]$ and divide by $2\delta$, we obtain

$$\lim_{\delta \downarrow 0}\, \sup_{n \in \mathbb{N}} \int_{|x| \ge 1} \Big( 1 - \frac{\sin \delta x}{\delta x} \Big)\, \nu_n(dx) = 0.$$

Since $\inf_{|y| \ge 1} \big(1 - \frac{\sin y}{y}\big) =: C > 0$ (in fact $C = 1 - \sin 1$), if we let $\delta = 1/l$ and restrict the integral to $|x| \ge l$, we obtain

$$\lim_{l \to \infty} C\, \sup_{n \in \mathbb{N}} \nu_n(\{x : |x| \ge l\}) = 0. \qquad (1.7)$$

Together with (1.6), this implies that $\sup_{n \in \mathbb{N}} \nu_n(\mathbb{R}) < \infty$, and the family of finite measures $\{\nu_n\}_{n \in \mathbb{N}}$ on $\mathbb{R}$ is tight. Therefore $\{\nu_n\}_{n \in \mathbb{N}}$ is relatively compact. Let $\nu$ be the weak limit along a subsequence $(\nu_{n_i})_{i \in \mathbb{N}}$. By what we have shown in the first part of the proof, the sequence of probability measures $\tilde\mu_{n_i}$ with Lévy triples $(0, \sigma_{n_i}^2, M_{n_i})$, where we have replaced $B_{n_i}$ by $0$, converges weakly to a limit. Since $\mu_{n_i}$, with Lévy triple $(B_{n_i}, \sigma_{n_i}^2, M_{n_i})$, also converges weakly, the following exercise shows that $B_{n_i}$ must converge to some $B \in \mathbb{R}$.

Exercise 1.3 For each $n \in \mathbb{N}$, let $B_n \in \mathbb{R}$ and let $\mu_n$ be a probability measure on $\mathbb{R}$. If $\mu_n$ converges weakly to a limit, and so does $\mu_n * \delta_{B_n}$, then $B_n \to B$ for some $B \in \mathbb{R}$.

We have thus verified conditions (i) and (ii) along the subsequence $(\mu_{n_i})_{i \in \mathbb{N}}$, and the limit $\mu$ admits a Lévy triple $(B, \sigma^2, M)$, with $\sigma^2 := \nu(\{0\})$ and $M(dx) := 1_{\{x \ne 0\}}\, \frac{\nu(dx)}{x^2 \wedge 1}$. To extend to the full sequence $(\mu_n)_{n \in \mathbb{N}}$, it only remains to show that $(\nu_n)_{n \in \mathbb{N}}$ has a unique subsequential weak limit (this also implies the uniqueness of the limit $B$ by Exercise 1.3), which is equivalent to showing that $\mu$ admits a unique Lévy triple. This is the content of Theorem 1.4 below.

Theorem 1.4 [Uniqueness] Each probability distribution $\mu$ is associated with at most one Lévy triple $(B, \sigma^2, M)$ for a given $\theta(\cdot)$.

Proof. Suppose that $\mu$ has characteristic function $\varphi$, with $\varphi(t) = e^{\Psi_i(t)}$ and

$$\Psi_i(t) = itB_i - \frac{\sigma_i^2 t^2}{2} + \int \big(e^{itx} - 1 - it\theta(x)\big)\, M_i(dx)$$

for two Lévy triples $(B_i, \sigma_i^2, M_i)$, $i = 1, 2$. Since $\varphi$ is continuous and never zero, while $\Psi_1(0) = \Psi_2(0) = 0$ and $\Psi_1, \Psi_2$ are both continuous as shown in the proof of Theorem 1.1, we can take the logarithm of $\varphi$ to conclude that $\Psi_1 = \Psi_2$. We leave it as an exercise to show that:

Exercise 1.5 If $M$ is a Lévy measure, then $\lim_{t \to \infty} \frac{1}{t^2} \int \big(e^{itx} - 1 - it\theta(x)\big)\, M(dx) = 0$.

Therefore, by Exercise 1.5,

$$\lim_{t \to \infty} \frac{\Psi_1(t)}{t^2} = -\frac{\sigma_1^2}{2} \qquad \text{and} \qquad \lim_{t \to \infty} \frac{\Psi_2(t)}{t^2} = -\frac{\sigma_2^2}{2},$$

namely $\sigma_1^2 = \sigma_2^2$, which gives

$$\psi(t) := itB_1 + \int \big(e^{itx} - 1 - it\theta(x)\big)\, M_1(dx) = itB_2 + \int \big(e^{itx} - 1 - it\theta(x)\big)\, M_2(dx).$$

Let us compute the following discrete analogue of the second derivative of $\psi$: for any $s > 0$, the terms linear in $t$ cancel in the second difference, so that

$$\tilde\psi(t) := -\frac{\psi(t+s) + \psi(t-s) - 2\psi(t)}{2} = \int e^{itx}\, (1 - \cos sx)\, M_1(dx) = \int e^{itx}\, (1 - \cos sx)\, M_2(dx).$$

Since $0 \le 1 - \cos sx \le \frac{s^2 x^2}{2} \wedge 2$, $(1 - \cos sx)\, M_1(dx)$ and $(1 - \cos sx)\, M_2(dx)$ are finite measures with the same Fourier transform, and hence they are equal. Therefore $M_1 = M_2$ on $\{x \in \mathbb{R} : 1 - \cos sx \ne 0\}$. Note that $\bigcup_{s > 0} \{x : 1 - \cos sx \ne 0\} = \mathbb{R}\setminus\{0\}$; therefore $M_1 = M_2$ on $\mathbb{R}\setminus\{0\}$, which together with the assumption $M_1(\{0\}) = M_2(\{0\}) = 0$ implies that $M_1 = M_2$. Given $\sigma_1^2 = \sigma_2^2$ and $M_1 = M_2$, it is then immediate that $B_1 = B_2$.

Theorems 1.2 and 1.4 can be used to show that every infinitely divisible distribution $\mu$ admits a unique Lévy triple with respect to any $\theta$ which is bounded and continuous with $\theta(x) = x + O(|x|^3)$ as $x \to 0$. This is known as the Lévy-Khintchine representation of $\mu$. We formulate below a version which relaxes the assumption on $\theta$ so as to accommodate the standard choice $\theta(x) = x\, 1_{\{|x| \le 1\}}$. We emphasize, however, that in Theorem 1.2 we cannot let $\theta$ be discontinuous, because of the weak convergence statement there.

Corollary 1.6 [Lévy-Khintchine representation] Given $\theta : \mathbb{R} \to \mathbb{R}$ which is bounded with $\theta(x) = x + O(x^2)$ as $x \to 0$, every infinitely divisible distribution $\mu$ admits a unique Lévy triple $(B, \sigma^2, M)$ with respect to $\theta(\cdot)$.

Proof. By the theorem on Accompanying Laws, every infinitely divisible $\mu$ necessarily arises as the weak limit of a uniformly infinitesimal triangular array of independent shifted compound Poisson random variables, which clearly admit representations in terms of Lévy triples. Therefore Theorem 1.2 implies that $\mu$ also admits a Lévy triple $(B, \sigma^2, M)$ with respect to $\theta(x) = \sin x$, which is furthermore unique by Theorem 1.4.

For more general $\theta$, we can write

$$iBt - \frac{\sigma^2 t^2}{2} + \int \big(e^{itx} - 1 - it \sin x\big)\, M(dx) = i(B - C)t - \frac{\sigma^2 t^2}{2} + \int \big(e^{itx} - 1 - it\theta(x)\big)\, M(dx),$$

so that $(B, \sigma^2, M)$ is a Lévy triple for $\mu$ w.r.t. $\sin x$ if and only if $(B - C, \sigma^2, M)$ is a Lévy triple for $\mu$ with respect to $\theta(\cdot)$, where $C := \int (\sin x - \theta(x))\, M(dx)$ is finite by our assumption on $\theta$. The uniqueness of the Lévy triple w.r.t. $\theta(\cdot)$ then follows from that for $\sin x$.

Remark 1.7 [Computing Lévy triples] Theorem 1.2 can be used to identify the Lévy triple of an infinitely divisible distribution $\mu$. Given probability measures $\mu_n$ with $\mu_n^{*n} = \mu$, we can use $\mu_n$ to construct an accompanying triangular array of shifted compound Poisson random variables which also converges to $\mu$. The Lévy triples of these shifted compound Poisson random variables are easily identified in terms of $\mu_n$, and we can then apply Theorem 1.2 to identify the Lévy triple of $\mu$.
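As a small numerical illustration of the accompanying-law recipe in Remark 1.7 (my own example, not from the notes): for $\mu = \mathrm{Poisson}(1)$ the convolution roots are $\mu_n = \mathrm{Poisson}(1/n)$, and the accompanying compound Poisson characteristic functions $\exp\{n(\varphi_{\mu_n}(t) - 1)\}$ converge to $\varphi_\mu(t) = \exp\{e^{it} - 1\}$:

```python
import cmath

def phi_poisson(t, lam):
    """Characteristic function of Poisson(lam): exp(lam * (e^{it} - 1))."""
    return cmath.exp(lam * (cmath.exp(1j * t) - 1))

t = 1.3
target = phi_poisson(t, 1.0)
errs = []
for n in (10, 100, 1000):
    phi_n = phi_poisson(t, 1.0 / n)        # char. fn. of the n-th root mu_n
    accomp = cmath.exp(n * (phi_n - 1))    # accompanying compound Poisson char. fn.
    errs.append(abs(accomp - target))
assert errs[0] > errs[1] > errs[2]         # approximation improves as n grows
assert errs[2] < 1e-3
```

The limiting exponent $e^{it} - 1$ read off from this approximation corresponds to the jump measure $M = \delta_1$, as one would expect for the Poisson distribution.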

Exercise 1.8 Show that the mean $1$ exponential distribution is infinitely divisible, and compute its Lévy triple w.r.t. $\theta(x) = x\, 1_{\{|x| \le 1\}}$.

Remark 1.9 [Lévy Processes] A stochastic process $X := (X_t)_{t \ge 0}$ is called a Lévy process if it satisfies:

(i) For any $k \in \mathbb{N}$ and $0 = t_0 < t_1 < \cdots < t_k$, $(X_{t_i} - X_{t_{i-1}})_{1 \le i \le k}$ is a collection of independent random variables;

(ii) For any $0 \le s \le t$, $X_t - X_s$ is equal in distribution to $X_{t-s}$.

These properties imply that for any $t > 0$, the distribution $\mu_t$ of $X_t$ is infinitely divisible, and hence is uniquely characterized by a Lévy triple $(B_t, \sigma_t^2, M_t)$ (let us take $\theta(x) = x\, 1_{\{|x| \le 1\}}$ to fix $B_t$). The characteristic function of $X_t$ is of the form $e^{\Psi_t(u)}$, where

$$\Psi_t(u) := iuB_t - \frac{\sigma_t^2 u^2}{2} + \int \big(e^{iux} - 1 - iu\theta(x)\big)\, M_t(dx)$$

is known as the characteristic exponent of $\mu_t$. Properties (i) and (ii) also imply that for any $s_1, \ldots, s_k > 0$,

$$\Psi_{s_1 + \cdots + s_k} = \sum_{i=1}^k \Psi_{s_i},$$

which implies that in fact $\Psi_t = t\, \Psi_1$. The law of the Lévy process $(X_t)_{t \ge 0}$ is thus uniquely determined by the Lévy exponent $\Psi := \Psi_1$, or equivalently, by its Lévy triple $(B, \sigma^2, M)$ w.r.t. $\theta(x) = x\, 1_{\{|x| \le 1\}}$. If $(B, \sigma^2, M) = (0, 1, 0)$, then $X$ is the standard Brownian motion. If $(B, \sigma^2, M) = (0, 0, \delta_1)$, then $X$ is the Poisson process. Generally, $B$ encodes part of the constant drift of the process (the other part being encoded in $\theta(\cdot)$), $\sigma^2$ encodes the Gaussian (or Brownian motion) component, while $M$ encodes the jumps of the process, analogous to a compound Poisson process. Note that $M$ assigns finite measure to the complement of any ball centered at the origin, which implies that $X$ can only make finitely many jumps of a given size or larger during any finite time interval. However, $M$ may assign infinite measure to every neighborhood of the origin, in which case $X$ makes infinitely many small jumps in any finite time interval.

In such a case, either $\int_{|x| \le 1} |x|\, M(dx) < \infty$, so that the jumps are absolutely summable over any finite time interval; or $\int_{|x| \le 1} |x|\, M(dx) = \infty$, the jumps are not absolutely summable, and it is necessary to introduce an infinite drift (represented by the term $-iu \int \theta(x)\, M(dx)$ in $\Psi(u)$, which on its own is undefined because $\int_{|x| \le 1} |x|\, M(dx) = \infty$), so that after such a centering, the infinitely many small positive and negative jumps balance each other out in such a way that $X_t$ is well-defined for each $t \ge 0$. In particular, for the Lévy process with $B = \sigma^2 = 0$ and $M(dx) = 1_{\{x > 0\}}\, x^{-2}\, dx$, even though the Lévy measure is supported on $(0, \infty)$, so that only positive jumps are possible, the process $X$ can become negative due to the infinite negative drift represented by the term $-iu \int \theta(x)\, M(dx)$ in $\Psi(u)$.
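To connect the triple to sample paths, here is a minimal simulation sketch (a hypothetical illustration, not from the notes, assuming a finite jump measure $M = \lambda \cdot (\text{jump law})$ so that no small-jump compensation is needed): drift plus Gaussian part plus compound Poisson jumps, matching the decomposition described above. The per-step Poisson jump count is approximated by a Bernoulli indicator, which is accurate for small time steps.

```python
import math
import random

def levy_path(B, sigma, jump_rate, jump_law, T=1.0, nsteps=1000, seed=0):
    """Euler-type simulation of X_t = B*t + sigma*W_t + compound Poisson jumps,
    for a *finite* Levy measure M = jump_rate * (law of a single jump)."""
    rng = random.Random(seed)
    dt = T / nsteps
    path = [0.0]
    for _ in range(nsteps):
        dx = B * dt + sigma * math.sqrt(dt) * rng.gauss(0.0, 1.0)
        # Number of jumps in (t, t+dt] is Poisson(jump_rate*dt); for small dt,
        # approximate it by a Bernoulli(jump_rate*dt) indicator.
        if rng.random() < jump_rate * dt:
            dx += jump_law(rng)
        path.append(path[-1] + dx)
    return path

# Example: drift 0.5, Gaussian part sigma = 1, rate-2 jumps of size +1 (M = 2*delta_1).
p = levy_path(B=0.5, sigma=1.0, jump_rate=2.0, jump_law=lambda rng: 1.0)
assert p[0] == 0.0 and len(p) == 1001

# Pure-drift sanity check: with sigma = 0 and no jumps, X_T = B*T.
q = levy_path(B=1.0, sigma=0.0, jump_rate=0.0, jump_law=lambda rng: 0.0)
assert abs(q[-1] - 1.0) < 1e-9
```

With $(B, \sigma^2, M) = (0, 1, 0)$ this reduces to a Brownian motion discretization, and with $(0, 0, \delta_1)$-type parameters to a Poisson-process approximation, mirroring the two special cases in Remark 1.9.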