Poisson random measure: motivation

1 Poisson random measure: motivation

The Lévy measure gives the expected number of jumps per unit of time, i.e. in a time interval of the form $[t, t+1]$, of a given size. Example: $\nu([1,\infty))$ is the expected number of jumps of size greater than 1 occurring per unit of time. To describe the jump behaviour fully, knowing the expected number of jumps per unit of time is not enough: we need the number of jumps occurring during any given time period, together with their sizes, and this for every trajectory. In other words, to study the jump behaviour we need to know WHEN the jumps occur and WHAT SIZE they have.

2 Poisson random measure: motivation

So we need to be able to associate with any state of the world $\omega$ and any pair $(]t_1,t_2], A)$, where $A \subset \mathbb{R}$ is a Borel set, the quantity
$$(\omega, ]t_1,t_2] \times A) \mapsto \#\{\text{jumps on } ]t_1,t_2] \text{ with size in } A \text{ for the trajectory associated with } \omega\}.$$
This number of jumps is clearly stochastic, as it depends on the trajectory. This leads to the introduction of the jump measure, a particular case of a Poisson random measure.

3 Poisson random measure: Definition

Let $(\Omega, \mathcal{F}, \mathbb{P})$ be a probability space, $E \subset \mathbb{R}^2$ and $\mu$ a positive Radon measure on $(E, \mathcal{B})$. A Poisson random measure on $E$ with intensity measure $\mu$ is a mapping
$$M : \Omega \times \mathcal{B} \to \mathbb{N}, \qquad (\omega, B) \mapsto M(\omega, B)$$
such that:
1. For almost all $\omega \in \Omega$, $M(\omega, \cdot)$ is an integer-valued Radon measure on $(E, \mathcal{B})$.
2. For every measurable $B \subset E$, $M(\cdot, B) = M(B)$ is a Poisson random variable with parameter $\mu(B)$:
$$\forall k \in \mathbb{N}: \quad \mathbb{P}[M(B) = k] = e^{-\mu(B)}\,\frac{\mu(B)^k}{k!}$$
3. If $B_1, \ldots, B_n \in \mathcal{B}$ are disjoint, then $M(B_1), \ldots, M(B_n)$ are independent random variables.

In the framework of Lévy processes, $M(B) = M(]t_1,t_2] \times A)$ will be the number of jumps on $]t_1,t_2]$ with size in $A$.
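To make the definition concrete, here is a minimal Python sketch (not part of the original slides) that simulates a Poisson random measure on $[0,T] \times \mathbb{R}$ with intensity $\mu(dt\,dx) = \lambda F(dx)\,dt$, by drawing a Poisson number of atoms and spreading them according to the normalized intensity; the function names and the Gaussian choice for $F$ are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_poisson_random_measure(T, lam, jump_sampler):
    """Atoms of a Poisson random measure on [0,T] x R with intensity lam * F(dx) dt:
    a Poisson(lam*T) number of atoms, uniform in time, with sizes drawn from F."""
    n = rng.poisson(lam * T)                 # M([0,T] x R) ~ Poisson(lam * T)
    times = rng.uniform(0.0, T, size=n)      # atom time coordinates
    sizes = jump_sampler(n)                  # atom size coordinates, distributed as F
    return times, sizes

def M(times, sizes, t1, t2, a, b):
    """Evaluate M(]t1, t2] x [a, b]) = number of atoms falling in the window."""
    return int(np.sum((times > t1) & (times <= t2) & (sizes >= a) & (sizes <= b)))

# Example: intensity lam = 3 jumps per unit time, jump sizes ~ N(0,1) (illustrative F)
times, sizes = sample_poisson_random_measure(T=10.0, lam=3.0,
                                             jump_sampler=lambda n: rng.normal(size=n))
print("M(]0,10] x [1, inf)) =", M(times, sizes, 0.0, 10.0, 1.0, np.inf))
print("total number of atoms:", len(times))
```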

4 Jump measure of a compound Poisson process

To any compound Poisson process $(X_t)$ we associate a random measure on $[0,\infty) \times \mathbb{R}$ that captures the jump behaviour of $(X_t)$:
$$J_X(B) = \#\{(t, X_t - X_{t-}) \in B \text{ s.t. } X_t - X_{t-} \neq 0\}, \qquad B \subset [0,\infty) \times \mathbb{R}.$$
$J_X$ is called the jump measure of the process $X$. Interpretation: $J_X([t_1, t_2] \times A)$ counts the number of jumps of $(X_t)$ between the instants $t_1$ and $t_2$ whose jump size is in $A$ (i.e. such that $\Delta X_t \in A$). The jump measure hence gives information on the frequency and size of the jumps. We have the following result.

Proposition: Let $(X_t)$ be a compound Poisson process of intensity $\lambda$ and jump distribution $F$. Then its jump measure $J_X$ is a Poisson random measure on $[0,\infty) \times \mathbb{R}$, with intensity measure $\mu(dt\,dx) = \nu(dx)\,dt = \lambda F(dx)\,dt$.
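A quick Monte Carlo check of the Proposition (an illustrative sketch, not from the slides): simulate compound Poisson paths with $F = N(0,1)$ and verify that the jump count $J_X(]t_1,t_2] \times A)$ has mean and variance close to $(t_2 - t_1)\,\lambda F(A)$, as a Poisson law should.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)
lam, T = 2.0, 5.0            # jump intensity and horizon (illustrative)
t1, t2 = 1.0, 4.0            # time window ]t1, t2]
threshold = 1.0              # A = [threshold, infinity)

def jump_count_one_path():
    """One compound Poisson path (exponential inter-arrivals, N(0,1) jump sizes):
    return J_X(]t1,t2] x A), the number of jumps in the window with size in A."""
    t, count = 0.0, 0
    while True:
        t += rng.exponential(1.0 / lam)      # next jump instant T_n
        if t > T:
            return count
        if t1 < t <= t2 and rng.normal() >= threshold:
            count += 1

counts = np.array([jump_count_one_path() for _ in range(20_000)])
# Proposition: J_X(]t1,t2] x A) ~ Poisson((t2 - t1) * lam * F(A))
mean_theory = (t2 - t1) * lam * (1.0 - norm.cdf(threshold))
print("empirical mean:", counts.mean(), "  theoretical mean:", mean_theory)
print("empirical var :", counts.var(), "  (equal to the mean for a Poisson law)")
```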

5 Jump measure of a compound Poisson process

We have already interpreted the Lévy measure of a compound Poisson process as the expected number of jumps of a given size per unit of time. Now $J_X$ gives additional information: it tells us when the jumps arrive and what size they have, for each trajectory (not only on average). Moreover, $\nu$ appears in the intensity measure of $J_X$. This leads to a more general definition of the Lévy measure, valid for any Lévy process (not only compound Poisson processes).

Definition: Let $(X_t)$ be a Lévy process on $\mathbb{R}$. The measure on $\mathbb{R}$ defined by
$$\nu(A) = \mathbb{E}\big[\#\{t \in [0,1] : \Delta X_t \neq 0,\ \Delta X_t \in A\}\big], \qquad A \in \mathcal{B},$$
is called the Lévy measure of $X$.

6 Jump measure of a compound Poisson process: computation

Case of a Poisson process $(N_t)$, with intensity $\lambda$ and jump instants $T_n$:
$$J_N([0,t] \times A) = \begin{cases} \#\{n \geq 1 : T_n \in [0,t]\} & \text{if } 1 \in A \\ 0 & \text{otherwise} \end{cases} = \sum_{n \geq 1} \mathbb{1}_{(T_n,1) \in [0,t] \times A}$$
so that
$$J_N = \sum_{n \geq 1} \delta_{(T_n, 1)}.$$

7 Jump measure of a compound Poisson process: computation

Case of a compound Poisson process $X_t = \sum_{n=1}^{N_t} Y_n$, with jump instants denoted by $T_i$, $i \in \mathbb{N}_0$:
$$J_X([0,t] \times A) = \#\{n \geq 1 : T_n \in [0,t] \text{ and } Y_n \in A\}$$
$$J_X = \sum_{n \geq 1} \delta_{(T_n, Y_n)} = \sum_{n \geq 1} \delta_{(T_n, \Delta X_{T_n})}.$$

8 Jump measure of a compound Poisson process: computation

In particular, as $X_t$ is equal to the sum of its jumps (pure jump process), we can write
$$X_t = \sum_{s \in [0,t]} \Delta X_s.$$
On the other hand,
$$\int_{[0,t] \times A} x\, J_X(ds\,dx) = \int_{[0,t] \times A} x \sum_{n \geq 1} \delta_{(T_n, Y_n)}(ds\,dx) = \sum_{n \geq 1} Y_n\, \mathbb{1}_{T_n \leq t,\ Y_n \in A} = \sum_{s \in [0,t]} \Delta X_s\, \mathbb{1}_{\Delta X_s \neq 0,\ \Delta X_s \in A}.$$

9 Jump measure of a compound Poisson process: computation

In particular, if we set $A = \mathbb{R}$ in this formula, we get
$$\int_{[0,t] \times \mathbb{R}} x\, J_X(ds\,dx) = \sum_{s \in [0,t],\ \Delta X_s \neq 0} \Delta X_s = X_t.$$
This is a particular case of the Lévy-Ito decomposition theorem (in the case of a compound Poisson process). More generally, any (measurable) function of the jumps of the process can be expressed as an integral with respect to the jump measure:
$$\int_{[0,t] \times \mathbb{R}} f(x)\, J_X(ds\,dx) = \sum_{s \in [0,t],\ \Delta X_s \neq 0} f(\Delta X_s).$$
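The identity above is easy to see on a simulated path: the atoms of $J_X$ are the pairs $(T_n, Y_n)$, so any integral of $f$ against $J_X$ is just a sum of $f$ over the jumps. A small illustrative sketch (parameter values and jump distribution are assumptions):

```python
import numpy as np

rng = np.random.default_rng(7)
lam, T = 4.0, 2.0                       # intensity and horizon (illustrative)

# Atoms (T_n, Y_n) of the jump measure of one compound Poisson path
n_jumps = rng.poisson(lam * T)
T_n = np.sort(rng.uniform(0, T, n_jumps))
Y_n = rng.normal(size=n_jumps)          # jump sizes Y_n ~ F = N(0,1)

def integral_wrt_jump_measure(f, t):
    """int_{[0,t] x R} f(x) J_X(ds dx) = sum of f(Y_n) over atoms with T_n <= t."""
    return f(Y_n[T_n <= t]).sum()

t = 1.5
# f(x) = x recovers the process value X_t (a compound Poisson path is the sum of its jumps)
X_t = Y_n[T_n <= t].sum()
print("int x J_X(ds dx)   =", integral_wrt_jump_measure(lambda x: x, t), "  X_t =", X_t)
# f(x) = x^2 gives the sum of squared jumps (the discontinuous quadratic variation)
print("int x^2 J_X(ds dx) =", integral_wrt_jump_measure(lambda x: x**2, t))
```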

10 Lévy-Ito decomposition: Introduction

Let us consider a Lévy process $(X_t^0)$ with piecewise constant trajectories. This is necessarily a compound Poisson process, and we have seen that it can be represented as the sum of its jumps, as well as by means of an integral w.r.t. the jump measure:
$$X_t^0 = \sum_{s \in [0,t],\ \Delta X_s^0 \neq 0} \Delta X_s^0 = \int_{[0,t] \times \mathbb{R}} x\, J_X(ds\,dx)$$
where $J_X$ is a Poisson random measure with intensity measure $\nu(dx)\,dt$. The Lévy measure $\nu$ is a finite measure ($\nu(\mathbb{R}) = \lambda < \infty$) defined by
$$\nu(A) = \mathbb{E}\big[\#\{t \in [0,1] : \Delta X_t^0 \neq 0,\ \Delta X_t^0 \in A\}\big].$$

11 Lévy-Ito decomposition: Introduction

Now, given a Brownian motion with drift, $\gamma t + \sigma W_t$, independent of $(X_t^0)$, the sum
$$X_t = X_t^0 + \gamma t + \sigma W_t$$
defines another Lévy process, which can hence be decomposed as
$$X_t = \gamma t + \sigma W_t + \int_{[0,t] \times \mathbb{R}} x\, J_X(ds\,dx) = \gamma t + \sigma W_t + \sum_{s \in [0,t],\ \Delta X_s \neq 0} \Delta X_s.$$
Is this true for any Lévy process?

12 Lévy-Ito decomposition: Introduction

Given a Lévy process $(X_t)$, we can still define its Lévy measure $\nu$ as above (same definition). If $A$ is a compact set that does not contain 0, one can see that $\nu(A)$ is still finite (any cadlag process has, on each bounded time interval, a finite number of jumps of size $\geq \varepsilon$ for any given $\varepsilon > 0$); $\nu$ is a Radon measure on $\mathbb{R}\setminus\{0\}$. However, $\nu$ is not necessarily finite, as it can blow up close to 0: this is the case when $X$ can have an infinite number of arbitrarily small jumps. In that case, the sum $\sum_{s \in [0,t],\ \Delta X_s \neq 0} \Delta X_s$ becomes a series, which does not necessarily converge. So the result cannot be extended as such.

13 Theorem: Lévy-Ito decomposition

Let $X_t$ be a Lévy process on $\mathbb{R}$ and $\nu$ its Lévy measure. Then:

$\nu$ is a Radon measure on $\mathbb{R}\setminus\{0\}$ and verifies
$$\int_{|x|<1} x^2\, \nu(dx) < \infty, \qquad \int_{|x|\geq 1} \nu(dx) = \nu(|x| \geq 1) < \infty.$$
The jump measure of $X$, $J_X$, is a Poisson random measure on $[0,\infty) \times \mathbb{R}$, with intensity measure $\nu(dx)\,dt$.
There exist $\gamma \in \mathbb{R}$ and a Brownian motion $B_t$ with volatility $\sigma \geq 0$ such that
$$X_t = \gamma t + B_t + X_t^{l} + \lim_{\varepsilon \to 0} \tilde{X}_t^{\varepsilon} \quad (1)$$
where
$$X_t^{l} = \int_{|x| \geq 1,\ s \in [0,t]} x\, J_X(ds\,dx) = \sum_{s \in [0,t],\ |\Delta X_s| \geq 1} \Delta X_s$$
$$\tilde{X}_t^{\varepsilon} = \int_{\varepsilon \leq |x| < 1,\ s \in [0,t]} x\, \big(J_X(ds\,dx) - \nu(dx)\,ds\big).$$
The different terms in (1) are independent, and the limit exists in the sense of a.s. convergence on $\Omega$, uniformly in $t \in [0,T]$.
The triplet $(\sigma, \gamma, \nu)$ is called the characteristic triplet or Lévy triplet of the process $(X_t)$.

14 Lévy-Ito decomposition: Comments

The result says that $X_t = \gamma t + \sigma W_t + J_t$, where $\gamma t$ is a (deterministic) drift, $\sigma W_t$ is a Brownian motion, and $J_t$ is a pure jump process governing the jump part of $X$. The first part $\gamma t + \sigma W_t$ is a continuous process (a diffusion). This is the continuous part of the Lévy process, and it is captured by $\gamma$ and $\sigma$. The profile of the jumps is described by the jump measure $J_X$, which is itself a Poisson random measure of intensity measure $\nu(dx)\,dt$. The Lévy measure $\nu$ hence drives the jump part. The triplet $(\sigma, \gamma, \nu)$ therefore entirely characterizes the process.

15 Lévy-Ito decomposition: Comments

The condition $\int_{|y| \geq 1} \nu(dy) < \infty$ means that $X$ has on average a finite number of jumps of size greater than 1 per unit of time. So the second part in the decomposition,
$$X_t^{l} = \sum_{s \in [0,t],\ |\Delta X_s| \geq 1} \Delta X_s,$$
actually has a finite number of terms, and is a compound Poisson process (since $J_X$ is a Poisson measure), which we can denote $J_t^{(1)}$. Remark that there is nothing special about the threshold chosen for the jump size (1 here). Another threshold could have been chosen for expressing this decomposition result (see also later).

16 Lévy-Ito decomposition: Comments

The last term is more complicated, and involves the Lévy measure $\nu$. $\nu$ has no weight at 0 (a jump of size 0 is not a jump...) but might have a singularity in a neighborhood of 0: this corresponds to the case where there is an infinite number of arbitrarily small jumps, which is possible for a cadlag trajectory. This is why in this result we only consider the limit as $\varepsilon \to 0$, and why we compensate the jump measure in this term. $J_X(ds\,dx) - \nu(dx)\,ds$ is by definition the compensated Poisson random measure associated with $J_X$.

17 Lévy-Ito decomposition: Comments

Indeed, if we consider $X_t^{\varepsilon}$ defined by
$$X_t^{\varepsilon} = \sum_{s \in [0,t],\ \varepsilon \leq |\Delta X_s| < 1} \Delta X_s = \int_{\varepsilon \leq |x| < 1,\ 0 \leq s \leq t} x\, J_X(ds\,dx)$$
(i.e. $\tilde{X}_t^{\varepsilon}$, but without the compensation), this is also a compound Poisson process, with jump sizes between $\varepsilon$ and 1. The problem is that we cannot simply pass to the limit of $X_t^{\varepsilon}$ as $\varepsilon \to 0$, due to the potential singularity of $\nu$ close to 0 (making this sum become a series that does not necessarily converge). To get convergence, one has to consider the compensated version of this Poisson process (in order to get zero mean and be able to use a central-limit-type theorem):
$$\tilde{X}_t^{\varepsilon} = \int_{\varepsilon \leq |x| < 1,\ 0 \leq s \leq t} x\, \big(J_X(ds\,dx) - \nu(dx)\,ds\big).$$

18 Lévy-Ito decomposition: Comments

The last term $\lim_{\varepsilon \to 0} \tilde{X}_t^{\varepsilon}$ can be seen as a (possibly infinite) superposition of independent compound Poisson processes: consider for this $Y_t^n = \tilde{X}_t^{\varepsilon_{n+1}} - \tilde{X}_t^{\varepsilon_n}$ for a sequence $\varepsilon_n \downarrow 0$ (so that $\lim_{\varepsilon \to 0} \tilde{X}_t^{\varepsilon} = \lim_n \sum_{i=1}^n Y_t^i$), and use the following result.

Proposition: Let $(X_t, Y_t)$ be a Lévy process. If $(Y_t)$ is a compound Poisson process and $(X_t)$ and $(Y_t)$ never jump together, then they are independent.

Each term $Y^n$ is hence a compound Poisson process (with the support of its jump distribution $F_n$ contained in $[\varepsilon_{n+1}, \varepsilon_n) \cup (-\varepsilon_n, -\varepsilon_{n+1}]$), and hence never jumps simultaneously with another $Y^m$ (for $n \neq m$).

19 Lévy-Ito decomposition: Comments

In summary, the theorem means that a Lévy process can be seen as a combination of a Brownian motion with drift and a sum (or superposition), possibly infinite, of independent compound Poisson processes, but with compensation. Actually, the proof uses a central limit theorem to obtain convergence, applied to this sequence of centered (compensated) random variables (one can see that $\sum_n Y_t^n$ converges a.s., uniformly for $t$ in a compact set, and in distribution, thanks to the compensation present in the $\tilde{X}_t^{\varepsilon_n}$). This also means that any Lévy process can be approximated, with arbitrarily good precision, by the sum of a Brownian motion with drift and a compound Poisson process, i.e. a jump-diffusion process (cf. Monte Carlo simulations, see later). It suffices for that to consider the decomposition for some fixed $\varepsilon > 0$, without passing to the limit $\varepsilon \to 0$.
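As an illustration of this approximation idea (not from the slides), the following sketch truncates the small jumps of a Gamma process at a level $\varepsilon$ and simulates the remaining compound Poisson part; since the Gamma process has finite variation and no Brownian part, no compensation or diffusion correction is added here. The rejection sampler and the parameter values are assumptions made for the example.

```python
import numpy as np
from scipy.special import exp1   # exponential integral E_1

rng = np.random.default_rng(2)
alpha, lam, T = 5.0, 3.0, 1.0    # Gamma-process parameters (illustrative)

def sample_truncated_levy_jump(eps, size):
    """Rejection sampling from the normalized Lévy measure of the Gamma process
    restricted to [eps, inf): density proportional to x^{-1} e^{-lam x}."""
    out = np.empty(size)
    filled = 0
    while filled < size:
        x = eps + rng.exponential(1.0 / lam, size - filled)   # proposal
        keep = x[rng.uniform(size=x.size) < eps / x]          # accept w.p. eps/x
        n = min(keep.size, size - filled)
        out[filled:filled + n] = keep[:n]
        filled += n
    return out

def approx_gamma_increment(eps, n_paths):
    """Jump-diffusion-style approximation of X_T: keep only jumps of size >= eps."""
    lam_eps = alpha * exp1(lam * eps)            # nu([eps, inf)) = expected jumps per unit time
    counts = rng.poisson(lam_eps * T, n_paths)
    return np.array([sample_truncated_levy_jump(eps, k).sum() for k in counts])

for eps in (0.5, 0.1, 0.01):
    x = approx_gamma_increment(eps, 5000)
    print(f"eps={eps:5.2f}  mean={x.mean():.3f}  var={x.var():.3f}")
print(f"exact Gamma(alpha*T, lam): mean={alpha*T/lam:.3f}  var={alpha*T/lam**2:.3f}")
```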

20 Lévy-Ito decomposition: Comments

In summary, the result says that we can write $X_t$ as
$$X_t = \gamma t + \sigma W_t + J_t = \gamma t + \sigma W_t + J_t^{(1)} + J_t^{(0)}$$
where $J_t$ is a pure jump process corresponding to the jump part of $X$, and
$$J_t^{(1)} = J_t\big|_{\Lambda}, \quad \Lambda = \{x : |x| \geq 1\}, \qquad J_t^{(0)} = J_t\big|_{\Lambda_0}, \quad \Lambda_0 = \{x : |x| < 1\}\setminus\{0\}.$$
$J_t^{(1)}$ is a simple compound Poisson process. $J_t^{(0)}$ is a superposition of compensated Poisson processes, which might be infinite (case of an infinite activity process) or not (case of a finite activity process), with or without finite variation (see later), depending on the behaviour of $\nu$ close to 0. In fact, we can also write
$$X_t = \underbrace{J_t^{(0)} + \sigma W_t}_{\text{Martingale}} + \underbrace{\gamma t + J_t^{(1)}}_{\text{Finite variation process}}$$
which implies that $X_t$ is a semi-martingale (see later).

21 Corollary: Lévy-Khinchin representation

Corollary: Let $(X_t)$ be a Lévy process on $\mathbb{R}$ with characteristic triplet $(\sigma, \gamma, \nu)$. Then
$$\Phi_t(z) = \mathbb{E}[e^{izX_t}] = e^{t\psi(z)}, \qquad z \in \mathbb{R},$$
where the characteristic exponent $\psi(z)$ has the form
$$\psi(z) = -\frac{1}{2}\sigma^2 z^2 + i\gamma z + \int_{\mathbb{R}} \big(e^{izx} - 1 - izx\,\mathbb{1}_{|x|<1}\big)\,\nu(dx).$$
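A numerical sanity check of the formula (illustrative, not from the slides): for a jump-diffusion with drift $\gamma_0$, volatility $\sigma$ and $N(m, s^2)$ jumps arriving at rate $\lambda$, compare the empirical characteristic function of simulated values of $X_t$ with $e^{t\psi(z)}$, writing $\psi$ in the compound Poisson form of the next slide; all parameter values are assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)
gamma0, sigma, lam, t = 0.1, 0.4, 2.0, 1.0   # drift, volatility, jump intensity, horizon
m, s = -0.2, 0.3                             # jump sizes Y_n ~ N(m, s^2) (illustrative)

# Simulate X_t = gamma0*t + sigma*W_t + sum of the jumps (a jump-diffusion)
n_paths = 50_000
n_jumps = rng.poisson(lam * t, n_paths)
jump_part = np.array([rng.normal(m, s, k).sum() for k in n_jumps])
X_t = gamma0 * t + sigma * np.sqrt(t) * rng.normal(size=n_paths) + jump_part

def psi(z):
    """Characteristic exponent in the compound Poisson form:
    psi(z) = i*gamma0*z - sigma^2 z^2 / 2 + lam * (phi_Y(z) - 1)."""
    phi_Y = np.exp(1j * z * m - 0.5 * s**2 * z**2)
    return 1j * gamma0 * z - 0.5 * sigma**2 * z**2 + lam * (phi_Y - 1.0)

for z in (0.5, 1.0, 2.0):
    empirical = np.exp(1j * z * X_t).mean()
    print("z =", z, "  empirical:", np.round(empirical, 4),
          "  exp(t*psi(z)):", np.round(np.exp(t * psi(z)), 4))
```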

22 Corollary: Lévy-Khinchin representation

Proof: We have seen that for a compound Poisson process the characteristic exponent $\psi$ equals
$$\psi(z) = \int_{\mathbb{R}} (e^{izx} - 1)\,\nu(dx), \qquad \nu(dx) = \lambda\,dF(x),$$
and for a compensated compound Poisson process ($\tilde{X}_t = X_t - \mathbb{E}[X_t]$):
$$\psi(z) = \int_{\mathbb{R}} (e^{izx} - 1 - izx)\,\nu(dx).$$
By the Lévy-Ito decomposition theorem, for each $t$,
$$\gamma t + \sigma W_t + X_t^{l} + \tilde{X}_t^{\varepsilon} \xrightarrow[\varepsilon \to 0]{} X_t \quad \text{a.s.},$$
which implies convergence in distribution. Now convergence in distribution is equivalent to convergence of the characteristic functions.

23 Corollary: Lévy-Khinchin representation

The characteristic function of $\gamma t + \sigma W_t + X_t^{l} + \tilde{X}_t^{\varepsilon}$ is
$$\mathbb{E}\big[e^{iz(\gamma t + \sigma W_t + X_t^{l} + \tilde{X}_t^{\varepsilon})}\big] = \mathbb{E}\big[e^{iz(\gamma t + \sigma W_t)}\big]\,\mathbb{E}\big[e^{izX_t^{l}}\big]\,\mathbb{E}\big[e^{iz\tilde{X}_t^{\varepsilon}}\big]$$
as the different terms are independent (cf. proof of Lévy-Ito). Now,
$$\mathbb{E}\big[e^{iz(\gamma t + \sigma W_t)}\big] = e^{-\frac{\sigma^2 z^2 t}{2} + it\gamma z} \quad (1)$$
(the characteristic function of $X \sim N(\mu, \sigma^2)$ is $\Phi_X(z) = \exp(-\frac{\sigma^2 z^2}{2} + i\mu z)$), and
$$\mathbb{E}\big[e^{izX_t^{l}}\big] = \exp\Big\{t \int_{|x| \geq 1} (e^{izx} - 1)\,\nu(dx)\Big\} \quad (2)$$
$$\mathbb{E}\big[e^{iz\tilde{X}_t^{\varepsilon}}\big] = \exp\Big\{t \int_{\varepsilon \leq |x| < 1} (e^{izx} - 1 - izx)\,\nu(dx)\Big\} \quad (3)$$
Clearly, the product of (1), (2), (3) converges to the formula of the Lévy-Khinchin theorem.

24 Infinite activity vs finite activity

$\nu(A)$ measures the expected number of jumps per unit of time with size in $A$; $\nu(\mathbb{R})$ is the expected number of jumps per unit of time. For instance, for a Poisson process with intensity parameter $\lambda$, $\nu(\mathbb{R}) = \lambda < \infty$. A process for which $\nu(\mathbb{R}) = \infty$ is said to be of infinite activity. In this case, any (even bounded) time interval, however small, contains an infinite number of jumps on average. One can see that this holds not only on average: the number of jumps in any time interval is infinite a.s. Moreover, one can show that the set of jump instants is countable and dense in $[0,\infty)$.

25 Remark: truncation convention

The truncation of jumps at 1 in the Lévy-Ito decomposition is purely conventional. We could express the result with any other truncation threshold $\varepsilon > 0$. The Lévy-Khinchin formula then becomes
$$\psi(z) = -\frac{1}{2}\sigma^2 z^2 + i\gamma_{\varepsilon} z + \int_{\mathbb{R}} \big(e^{izx} - 1 - izx\,\mathbb{1}_{|x|<\varepsilon}\big)\,\nu(dx)$$
with
$$\gamma_{\varepsilon} = \gamma - \int_{\mathbb{R}} x\,\mathbb{1}_{\varepsilon \leq |x| < 1}\,\nu(dx)$$
(by using the fact that $\mathbb{1}_{|x|<1} = \mathbb{1}_{\varepsilon \leq |x| < 1} + \mathbb{1}_{|x|<\varepsilon}$). We truncate here with the function $\mathbb{1}_{|x|<1}$.

26 Determination of the characteristic triplet (σ, γ, ν): examples

Standard Brownian motion. If $X \sim N(\mu, \sigma^2)$, then $\Phi_X(z) = \exp(-\frac{\sigma^2 z^2}{2} + i\mu z)$. So if $W_t \sim N(0, t)$ is a standard Brownian motion, its characteristic function equals
$$\Phi_{W_t}(u) = e^{-\frac{u^2}{2}t} = e^{t\psi(u)}, \qquad \psi(u) = -\frac{u^2}{2},$$
and hence the characteristic triplet of a standard Brownian motion is $(\sigma, \gamma, \nu) = (1, 0, 0)$. In particular, the Lévy measure is 0 (no jump part). Similarly, the triplet of $X_t = \mu t + \sigma W_t$ is $(\sigma, \mu, 0)$.

27 Determination of the characteristic triplet: examples

Poisson process $N_t$ of intensity $\lambda$. We have seen that $\mathbb{E}[e^{izN_t}] = \exp\{\lambda t (e^{iz} - 1)\}$. This implies $\psi(z) = \lambda(e^{iz} - 1)$. Now we need to rewrite this so as to make the Lévy-Khinchin expression
$$\int_{\mathbb{R}} \big(e^{izx} - 1 - izx\,\mathbb{1}_{|x|<1}\big)\,\nu(dx)$$
appear. From the definition of $\nu$, since all jumps have size 1, we see that $\nu(dx) = \lambda\delta_1$ (Dirac measure at 1). So
$$\int_{\mathbb{R}} \big(e^{izx} - 1 - izx\,\mathbb{1}_{|x|<1}\big)\,\nu(dx) = \lambda(e^{iz} - 1) = \psi(z).$$
So the triplet is simply $(\sigma, \gamma, \nu) = (0, 0, \lambda\delta_1)$.

28 Determination of the characteristic triplet: examples

Compound Poisson process $X_t$ of intensity $\lambda$ and jump distribution $F$. We have seen that $\Phi(z) = \mathbb{E}[e^{izX_t}] = \exp\{t\lambda\int_{\mathbb{R}}(e^{izx}-1)\,dF(x)\}$ and $\nu(dx) = \lambda\,dF(x)$. So
$$\psi(z) = \lambda \int_{\mathbb{R}} (e^{izx} - 1)\,dF(x) = \lambda \int_{\mathbb{R}} \big(e^{izx} - 1 - izx\,\mathbb{1}_{|x|<1}\big)\,dF(x) + iz\,\lambda \int_{\mathbb{R}} x\,\mathbb{1}_{|x|<1}\,dF(x)$$
$$= -\frac{\sigma^2 z^2}{2} + i\gamma z + \lambda \int_{\mathbb{R}} \big(e^{izx} - 1 - izx\,\mathbb{1}_{|x|<1}\big)\,dF(x)$$
provided that
$$\gamma = \lambda \int_{\mathbb{R}} x\,\mathbb{1}_{|x|<1}\,dF(x) = \lambda \int_{|x|<1} x\,dF(x), \qquad \sigma = 0.$$

29 Determination of the characteristic triplet: examples

Gamma process: $X_t \sim \Gamma(\alpha t, \lambda)$, where the density of a Gamma distribution $\Gamma(a, b)$ is
$$f_{\Gamma}(x; a, b) = \begin{cases} \frac{b^a}{\Gamma(a)}\, x^{a-1} e^{-bx} & x > 0 \\ 0 & \text{else} \end{cases}$$
and the Gamma function is defined for $z \in \mathbb{C}$ with $\Re(z) > 0$ by
$$\Gamma(z) = \int_0^{\infty} t^{z-1} e^{-t}\,dt$$
(a function generalizing the factorial: if $z = n \in \mathbb{N}$, $\Gamma(n) = (n-1)!$, and it satisfies the recurrence $\Gamma(z+1) = z\Gamma(z)$).

30 Determination of the characteristic triplet: examples

Gamma process ($X_t \sim \Gamma(\alpha t, \lambda)$): The Gamma law can also be defined by its characteristic function:
$$\Phi_{\Gamma}(z; a, b) = \mathbb{E}[e^{izX}] = \Big(1 - \frac{iz}{b}\Big)^{-a}.$$
From this expression, we easily see that the Gamma law is infinitely divisible, as its characteristic function can be written as the $n$-th power of the characteristic function of another law of the same type:
$$\mathbb{E}[e^{iuX}] = \Big(1 - \frac{iu}{b}\Big)^{-a} = \Big(\Big(1 - \frac{iu}{b}\Big)^{-\frac{a}{n}}\Big)^n = \prod_{k=1}^n \mathbb{E}[e^{iuY_k}] = \mathbb{E}\big[e^{iu(Y_1 + \cdots + Y_n)}\big]$$
where the $Y_i$ are i.i.d. $\Gamma(\frac{a}{n}, b)$. In consequence, it defines a Lévy process.
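The infinite divisibility can be checked by simulation (an illustrative sketch with assumed parameters): the sum of $n$ i.i.d. $\Gamma(a/n, b)$ variables should have exactly the $\Gamma(a, b)$ law.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
a, b, n = 3.0, 2.0, 8          # Gamma(a, b) target and number of i.i.d. summands (illustrative)
n_samples = 100_000

# Sum of n i.i.d. Gamma(a/n, b) variables (numpy's scale parameter is 1/b)
parts = rng.gamma(shape=a / n, scale=1.0 / b, size=(n_samples, n))
summed = parts.sum(axis=1)

# Compare with a direct Gamma(a, b) sample: same law if the Gamma family is infinitely divisible
direct = rng.gamma(shape=a, scale=1.0 / b, size=n_samples)
print("mean:", summed.mean(), "vs", direct.mean())
print("var :", summed.var(), "vs", direct.var())
print("KS test against Gamma(a, b):", stats.kstest(summed, "gamma", args=(a, 0, 1.0 / b)))
```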

31 Determination of the characteristic triplet: examples

Gamma process ($X_t \sim \Gamma(\alpha t, \lambda)$): By using an integral representation of $\psi(z) = -\alpha \ln\big(1 - \frac{iz}{\lambda}\big)$:
$$-\alpha \ln\Big(1 - \frac{iz}{\lambda}\Big) = \alpha \int_0^{\infty} (e^{izx} - 1)\,x^{-1} e^{-\lambda x}\,dx$$
(Frullani integral¹), and using the Lévy-Khinchin representation theorem, we get
$$\nu(dx) = \frac{\alpha}{x}\, e^{-\lambda x}\,\mathbb{1}_{x>0}\,dx, \qquad \gamma = \int_0^1 x\,\frac{\alpha}{x}\, e^{-\lambda x}\,dx = \alpha\,\frac{1 - e^{-\lambda}}{\lambda},$$
and finally that $\sigma = 0$ (no diffusion part!).

¹ If $f$ is $C^1$ and if the integral $\int_0^{\infty} \frac{f(ax) - f(bx)}{x}\,dx$ converges, then this integral equals $(f(0) - f(\infty))\ln\frac{b}{a}$.
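The Frullani representation can be verified numerically (an illustrative sketch; the parameter values are assumptions): integrate $\alpha(e^{izx}-1)x^{-1}e^{-\lambda x}$ over $(0,\infty)$ and compare with $-\alpha\ln(1 - iz/\lambda)$.

```python
import numpy as np
from scipy.integrate import quad

alpha, lam = 2.0, 3.0   # illustrative Gamma-process parameters

def exponent_closed_form(z):
    """psi(z) = -alpha * log(1 - i z / lam)."""
    return -alpha * np.log(1 - 1j * z / lam)

def exponent_frullani(z):
    """alpha * int_0^inf (e^{izx} - 1) x^{-1} e^{-lam x} dx, real and imaginary parts
    integrated separately; the tiny lower bound avoids the removable 0/0 at x = 0."""
    re = quad(lambda x: alpha * (np.cos(z * x) - 1.0) * np.exp(-lam * x) / x, 1e-12, np.inf)[0]
    im = quad(lambda x: alpha * np.sin(z * x) * np.exp(-lam * x) / x, 1e-12, np.inf)[0]
    return re + 1j * im

for z in (0.5, 1.0, 4.0):
    print("z =", z, "  closed form:", np.round(exponent_closed_form(z), 6),
          "  Frullani integral:", np.round(exponent_frullani(z), 6))
```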

32 Determination of the characteristic triplet: examples

Gamma process ($X_t \sim \Gamma(\alpha t, \lambda)$): We see that for any $\varepsilon > 0$,
$$\nu(|x| \geq \varepsilon) = \int_{|x| \geq \varepsilon} \frac{\alpha}{x}\, e^{-\lambda x}\,\mathbb{1}_{x>0}\,dx < \infty,$$
which means that the expected number of jumps of size $\geq \varepsilon$ is finite. However,
$$\nu(\mathbb{R}) = \int_{\mathbb{R}} \nu(dx) = \int_{\mathbb{R}_+} \frac{\alpha}{x}\, e^{-\lambda x}\,dx = \infty.$$
The Gamma process is hence an infinite activity process: the average number of jumps per unit of time is infinite.

33 Infinite activity processes: remark

One can see in general that for any infinite activity process (i.e. a process s.t. $\nu(\mathbb{R}) = \infty$), any time interval contains infinitely many jumps. Moreover, one can see that the set of jump instants is countable and dense in $[0,\infty)$. The countability is due to the cadlag property of the trajectories. The density (as well as the infinite number of jumps) can be proven as follows. Consider a time interval $[a,b]$ and
$$\varepsilon(n) = \sup\Big\{ r : \nu(|x| \geq r) = \int_{|x| \geq r} \nu(dx) \geq n \Big\}.$$
$\nu(|x| \geq r)$ is clearly a decreasing function of $r$; $\varepsilon(n)$ represents a jump size: it is the greatest admissible jump size for which we will find, on average, at least $n$ jumps of that size or greater per unit of time; $\varepsilon(n)$ is a decreasing function of $n$.

34 Infinite activity processes: remark

One then defines
$$Y_n = \int_{\varepsilon(n) \leq |x| < \varepsilon(n-1),\ t \in [a,b]} J_X(dx\,dt) = J_X\big(\big([\varepsilon(n), \varepsilon(n-1)) \cup (-\varepsilon(n-1), -\varepsilon(n)]\big) \times [a,b]\big)$$
which corresponds to the number of jumps on $[a,b]$ with absolute size between $\varepsilon(n)$ and $\varepsilon(n-1)$. By the Lévy-Ito decomposition theorem, the random variables $Y_n$ are independent Poisson distributed random variables. These variables are also identically distributed: they are Poisson distributed, with mean equal to $(b-a)$ times the expected number per unit of time of jumps with size in $[\varepsilon(n), \varepsilon(n-1))$, which is 1 by construction of the $(\varepsilon(n))$. $\sum_{n=1}^{\infty} Y_n$ hence gives the total number of jumps on $[a,b]$, which appears as an infinite sum of i.i.d. variables with positive mean. By the law of large numbers, this series diverges to $\infty$. This is true for any interval $[a,b]$, which implies the density property.

35 Path properties

The following proposition is a consequence of the results seen so far.

Proposition: A Lévy process has piecewise constant trajectories $\iff$ it is a compound Poisson process $\iff$ its characteristic triplet satisfies
$$\sigma = 0, \qquad \int_{\mathbb{R}} \nu(dx) < \infty, \qquad \gamma = \int_{|x|<1} x\,\nu(dx)$$
$\iff$ its characteristic exponent is of the form
$$\psi(z) = \int_{\mathbb{R}} (e^{izx} - 1)\,\nu(dx) \quad \text{with } \nu(\mathbb{R}) < \infty.$$

36 Path properties: finite variation

One defines the total variation of a function $f : [a,b] \to \mathbb{R}$ by
$$TV(f) = \sup \sum_{i=1}^n |f(t_i) - f(t_{i-1})|$$
where the sup is taken over all finite subdivisions/partitions of $[a,b]$: $a = t_0 < t_1 < \ldots < t_n = b$. In particular, any increasing or decreasing function has finite variation, and every finite variation function can be written as the difference of two increasing functions. A Lévy process is said to be of finite variation if its trajectories are of finite variation with probability one.

37 Path properties: finite variation

Proposition: A Lévy process is of finite variation $\iff$ its triplet satisfies $\sigma = 0$ and $\int_{|x| \leq 1} |x|\,\nu(dx) < \infty$.

Idea of the proof ($\Leftarrow$): Use the Lévy-Ito decomposition:
$$X_t = \gamma t + \int_{|x| \geq 1,\ 0 \leq s \leq t} x\, J_X(ds\,dx) + \lim_{\varepsilon \to 0} \tilde{X}_t^{\varepsilon}, \qquad \tilde{X}_t^{\varepsilon} = \int_{\varepsilon \leq |x| < 1,\ 0 \leq s \leq t} x\,\big(J_X(ds\,dx) - \nu(dx)\,ds\big),$$
which can be rewritten here without compensation:
$$X_t = bt + \int_{|x| \geq 1,\ 0 \leq s \leq t} x\, J_X(ds\,dx) + \lim_{\varepsilon \to 0} X_t^{\varepsilon}, \qquad X_t^{\varepsilon} = \int_{\varepsilon \leq |x| < 1,\ 0 \leq s \leq t} x\, J_X(ds\,dx) = \sum_{s \leq t,\ \varepsilon \leq |\Delta X_s| < 1} \Delta X_s,$$
where $b$ is the absolute drift: $b = \gamma - \int_{0<|x|<1} x\,\nu(dx)$.

38 Path properties: finite variation

Indeed, by hypothesis $\int_{|x|<1} x\,\nu(dx)$ exists, so this part can be separated from the limit (and grouped with $\gamma$ to provide a new drift $b$). The first two terms have finite variation. We then consider the variation of the third one on $[0,t]$:
$$TV(X_t^{\varepsilon}) = \sum_{s \leq t,\ \varepsilon \leq |\Delta X_s| < 1} |\Delta X_s| = \int_{\varepsilon \leq |x| < 1,\ 0 \leq s \leq t} |x|\, J_X(ds\,dx).$$
We take the expectation of both sides and use the Fubini theorem:
$$\mathbb{E}[TV(X_t^{\varepsilon})] = t \int_{\varepsilon \leq |x| < 1} |x|\,\nu(dx) \longrightarrow t \int_{|x| < 1} |x|\,\nu(dx) < \infty \quad \text{as } \varepsilon \to 0,$$
so that $\mathbb{E}[TV(X_t^{\varepsilon})]$ converges towards a finite limit.

39 Path properties: finite variation

On the other hand,
$$TV(X_t^{\varepsilon}) \uparrow \sum_{s \leq t,\ 0 < |\Delta X_s| < 1} |\Delta X_s| \quad \text{a.s. as } \varepsilon \to 0.$$
One easily sees that this last series equals
$$TV\Big(\sum_{s \leq t,\ 0 < |\Delta X_s| < 1} \Delta X_s\Big) = TV\Big(\int_{|x| < 1,\ 0 \leq s \leq t} x\, J_X(ds\,dx)\Big).$$
By the monotone convergence theorem,
$$\mathbb{E}[TV(X_t^{\varepsilon})] \longrightarrow \mathbb{E}[\lim TV(X_t^{\varepsilon})] = \mathbb{E}[TV(\lim X_t^{\varepsilon})] = \mathbb{E}\Big[TV\Big(\int_{|x| < 1,\ 0 \leq s \leq t} x\, J_X(ds\,dx)\Big)\Big].$$
Hence $TV(\lim X_t^{\varepsilon})$ is a random variable with a finite expectation, and is hence finite a.s.

40 Path properties: finite variation

($\Rightarrow$): $TV(X_t) \geq TV(\text{pure jump part})$ (i.e. of the part moving only by jumps), and is assumed finite. The total variation of the pure jump part restricted to jumps of size in $[\varepsilon, 1)$ is
$$\int_{\varepsilon \leq |x| < 1,\ s \in [0,t]} |x|\, J_X(ds\,dx) = t \int_{\varepsilon \leq |x| < 1} |x|\,\nu(dx) + \int_{\varepsilon \leq |x| < 1,\ s \in [0,t]} |x|\,\big(J_X(ds\,dx) - \nu(dx)\,ds\big).$$
We then show that the second term converges a.s. to something finite (similar arguments as in the proof of the Lévy-Ito decomposition). This implies that the first term converges to a finite limit. Indeed, if this were not the case, the total variation of the jump part would not be finite, a contradiction.

41 Path properties: finite variation

So the limit of this first term, $t \int_{0<|x|<1} |x|\,\nu(dx)$, is finite, and this implies that $\int_{[0,t] \times \mathbb{R}} x\, J_X(ds\,dx)$ is well defined and has finite variation. So we can decompose $X_t$ as
$$X_t = X_t^c + \int_{[0,t] \times \mathbb{R}} x\, J_X(ds\,dx),$$
where $X_t^c$ is the continuous part of the process $X$, and this implies that $X_t^c$ must also be of finite total variation. This implies that $\sigma = 0$ (as a Brownian motion has infinite variation).
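The contrast used in this argument can be seen numerically (an illustrative sketch, not from the slides; parameter values are assumptions): the discretized total variation of a Brownian path keeps growing as the grid is refined, while that of a compound Poisson path stabilizes at the sum of the absolute jump sizes.

```python
import numpy as np

rng = np.random.default_rng(6)
T, sigma, lam = 1.0, 1.0, 5.0   # horizon, BM volatility, jump intensity (illustrative)

# One compound Poisson path: jump times and N(0,1) jump sizes
n_jumps = rng.poisson(lam * T)
jump_times = np.sort(rng.uniform(0, T, n_jumps))
jump_sizes = rng.normal(size=n_jumps)

# One Brownian path sampled on the finest grid, coarsened for the smaller n
n_max = 2 ** 16
bm_fine = np.concatenate([[0.0], np.cumsum(sigma * np.sqrt(T / n_max) * rng.normal(size=n_max))])

for n in (2 ** 8, 2 ** 12, 2 ** 16):
    grid = np.linspace(0, T, n + 1)
    # Total variation of the discretized paths = sum of absolute increments
    bm = bm_fine[:: n_max // n]
    tv_bm = np.abs(np.diff(bm)).sum()
    cp = np.array([jump_sizes[jump_times <= t].sum() for t in grid])
    tv_cp = np.abs(np.diff(cp)).sum()
    print(f"n={n:6d}   TV(Brownian) = {tv_bm:8.2f}   TV(compound Poisson) = {tv_cp:6.2f}")
print("sum of |jumps| =", np.abs(jump_sizes).sum())
```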

42 Path properties: finite variation

Corollary (Lévy-Ito and Lévy-Khinchin theorems in the finite variation case): Let $(X_t)$ be a Lévy process of finite variation with Lévy triplet $(0, \gamma, \nu)$. Then $X$ can be expressed as the sum of its jumps between 0 and $t$ plus a linear drift:
$$X_t = bt + \int_{[0,t] \times \mathbb{R}} x\, J_X(ds\,dx) = bt + \sum_{s \in [0,t],\ \Delta X_s \neq 0} \Delta X_s$$
where $b = \gamma - \int_{|x|<1} x\,\nu(dx)$. Its characteristic function can be expressed as
$$\Phi_t(z) = e^{t\{ibz + \int_{\mathbb{R}} (e^{izx} - 1)\,\nu(dx)\}}.$$

43 Classification

This leads to the following classification (for the jump part):

Finite activity process: $\nu(\mathbb{R}) < \infty$. This is the jump-diffusion case. It has finite variation if in addition it has no Brownian component (its jump part is in any case of finite variation), and can be expressed as the sum of its jumps between 0 and $t$ (plus a linear drift).

Infinite activity but finite variation process: $\nu(\mathbb{R}) = \infty$ but $\int_{0<|x|<1} |x|\,\nu(dx) < \infty$: infinite number of jumps on any time interval, BUT the process (without drift) can be expressed as the sum of its jumps between 0 and $t$ (no need to compensate to make the series converge).

Infinite activity and infinite variation process: only condition: $\int_{0<|x|<1} x^2\,\nu(dx) < \infty$.

44 Classification

The infinite activity and/or finite variation property depends on the behaviour of $\nu$ in a neighborhood of 0.

45 Path properties: Examples

Compound Poisson process: $\nu(dx) = \lambda\delta_1(dx)$ in the non-compound (Poisson) case, and $\nu(dx) = \lambda\,dF(x)$ in the general case. It is clearly of finite activity, and hence also of finite variation ($\int_{|x|<1} |x|\,\nu(dx) < \infty$ clearly).

Gamma process: we have seen that it is of infinite activity ($\nu(\mathbb{R}) = \infty$) BUT it is of finite variation! Indeed, $\nu(dx) = a x^{-1} e^{-bx}\,\mathbb{1}_{x>0}\,dx$, hence $|x|\,\nu(dx) = a e^{-bx}\,\mathbb{1}_{x>0}\,dx$ and hence
$$\int_{|x|<1} |x|\,\nu(dx) = \int_{0<x<1} a e^{-bx}\,dx < \infty.$$
The same will hold for the Inverse Gaussian process (see later). Before seeing other examples, we introduce the notion of subordinator.
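A small numerical illustration of these two integrals for the Gamma Lévy measure (assumed parameter values): the mass $\nu((\varepsilon,1))$ blows up as $\varepsilon \to 0$ (infinite activity) while $\int_{\varepsilon}^{1} x\,\nu(dx)$ stays bounded (finite variation of the jump part).

```python
import numpy as np
from scipy.integrate import quad

a, b = 2.0, 3.0   # illustrative parameters of nu(dx) = a x^{-1} e^{-bx} dx on (0, inf)

def nu_density(x):
    return a * np.exp(-b * x) / x

for eps in (1e-1, 1e-3, 1e-6, 1e-9):
    mass = quad(nu_density, eps, 1.0, limit=200)[0]                        # nu((eps,1))
    first_moment = quad(lambda x: x * nu_density(x), eps, 1.0, limit=200)[0]  # int x nu(dx)
    print(f"eps={eps:.0e}   nu((eps,1)) = {mass:9.2f}   int_eps^1 x nu(dx) = {first_moment:.4f}")
# nu((eps,1)) grows like a*log(1/eps) (infinite activity), while the first moment
# converges to a*(1 - e^{-b})/b (finite variation of the jump part).
```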

46 Path properties: Subordinators

Definition: A subordinator is an increasing Lévy process.

Subordinators are important in financial modelling as they can be used as a time change within other Lévy processes, in particular Brownian motion:
$$W(t) \rightsquigarrow W(Z_t), \qquad Z_t \text{ a subordinator}.$$
This amounts to observing a Brownian motion on a new, stochastic time scale, which can be interpreted as a kind of business time, i.e. the integrated rate of information arrival.

47 Path properties: Subordinators

Proposition: Let $(X_t)$ be a Lévy process on $\mathbb{R}$. Then the following points are equivalent:
1. $X_t \geq 0$ a.s. for some $t > 0$
2. $X_t \geq 0$ a.s. for all $t > 0$
3. $(X_t)$ is a subordinator: the sample paths of $(X_t)$ are a.s. non-decreasing
4. The characteristic triplet satisfies: $\sigma = 0$, $\nu((-\infty, 0]) = 0$, $\int_0^{\infty} \min(1, x)\,\nu(dx) < \infty$ and $b \geq 0$, where $b = \gamma - \int_{|x|<1} x\,\nu(dx)$.

48 Path properties: Subordinators. Examples (1)

Proposition: Let $X$ be a Lévy process on $\mathbb{R}$ and $f : \mathbb{R} \to [0,\infty)$ such that $f(x) = O(|x|^2)$ as $x \to 0$. Then
$$S_t = \sum_{s \leq t,\ \Delta X_s \neq 0} f(\Delta X_s)$$
is a Lévy process, and a subordinator.

Proof: exercise (hint: show that $\mathbb{E}[S_t] < \infty$ to show that $S_t$ is well defined).

In particular, by choosing $f(x) = x^2$, we get that the sum of the squared jumps of a Lévy process is a subordinator:
$$S_t = \sum_{s \leq t,\ \Delta X_s \neq 0} |\Delta X_s|^2.$$
This new process is called the discontinuous quadratic variation of $X$ (see later).

49 Path properties: Subordinators. Examples (2)

What we have seen about the Gamma process implies that it is a subordinator (the Gamma distribution is positive). We have seen that
$$\nu(dx) = \alpha x^{-1} e^{-\lambda x}\,\mathbb{1}_{x>0}\,dx,$$
so $\nu$ has positive support, as required by the proposition of slide 47. Moreover, we have seen that there is no Brownian component ($\sigma = 0$), and that its (relative) drift $\gamma$ equals
$$\gamma = \alpha\,\frac{1 - e^{-\lambda}}{\lambda}.$$
Hence its absolute drift $b$ equals
$$b = \gamma - \int_0^1 \alpha e^{-\lambda x}\,dx = \gamma + \frac{\alpha}{\lambda}\big[e^{-\lambda x}\big]_0^1 = \gamma + \frac{\alpha}{\lambda}\big(e^{-\lambda} - 1\big) = 0.$$
By subordinating a Brownian motion with drift by a Gamma process, we obtain a new process, the variance gamma process. This is still a Lévy process (see later).
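A minimal sketch of the subordination construction (not from the slides; the mean-one normalization of the Gamma clock and the parameter names theta, sigma, kappa are assumptions): simulate Gamma increments for the business time $Z_t$ and evaluate a Brownian motion with drift on them to obtain variance gamma increments.

```python
import numpy as np

rng = np.random.default_rng(4)

# Variance gamma by subordination: X_t = theta * Z_t + sigma * W(Z_t), with Z a Gamma
# process normalized so that E[Z_t] = t (parameter values are illustrative).
theta, sigma, kappa = -0.1, 0.25, 0.5   # drift, volatility, variance of Z_1
T, n_steps, n_paths = 1.0, 252, 10
dt = T / n_steps

# Gamma subordinator increments: dZ ~ Gamma(shape=dt/kappa, scale=kappa), so E[dZ] = dt
dZ = rng.gamma(shape=dt / kappa, scale=kappa, size=(n_paths, n_steps))
# Brownian motion with drift evaluated on the business-time increments dZ
dX = theta * dZ + sigma * np.sqrt(dZ) * rng.normal(size=(n_paths, n_steps))

X = np.concatenate([np.zeros((n_paths, 1)), dX.cumsum(axis=1)], axis=1)
print("sample terminal values X_T:", np.round(X[:, -1], 4))
```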
