Karhunen-Loève Expansions of Lévy Processes


1 Karhunen-Loève Expansions of Lévy Processes. Daniel Hackmann, June 2016, Barcelona. Supported by the Austrian Science Fund (FWF), Project F5509-N26.

2 Outline: 1 Introduction; 2 Lévy Processes and Infinitely Divisible Random Vectors; 3 Main Results (KLE Components, Simulation); 4 Examples.

3 The main idea We want to expand a continuous-time stochastic process X in a stochastic Fourier series on the interval $[0, T]$: $X_t = \sum_{k \ge 1} Y_k \phi_k(t)$, where $\{\phi_k\}_{k\ge1}$ is an orthonormal basis of $L^2([0,T], \mathbb{R})$, and our stochastic Fourier coefficients are given by $Y_k := \int_0^T X_t \phi_k(t)\,dt$.

4-6 The main idea Things to think about:
- In what sense does $X_t = \sum_{k\ge1} Y_k \phi_k(t)$?
- How should we choose $\{\phi_k\}_{k\ge1}$?
- What are the distribution of, and the dependence among, $\{Y_k\}_{k\ge1}$?

7 The main idea Assumptions: (a) $E[X_t] = 0$, (b) $E[X_t^2] < \infty$, and (c) $\mathrm{Cov}(X_s, X_t)$ is continuous. Basis: the eigenfunctions $\{e_k\}_{k\ge1}$ corresponding to the non-zero eigenvalues $\{\lambda_k\}_{k\ge1}$ of the operator $K : L^2([0,T]) \to L^2([0,T])$, $(Kf)(s) := \int_0^T \mathrm{Cov}(X_s, X_t)\,f(t)\,dt$, constitute a basis for $L^2([0,T])$. We define $Z_k := \int_0^T X_t e_k(t)\,dt$ and order $\{e_k\}_{k\ge1}$, $\{Z_k\}_{k\ge1}$, and $\{\lambda_k\}_{k\ge1}$ so that $\lambda_1 \ge \lambda_2 \ge \lambda_3 \ge \dots$

8 The main idea Theorem (The Karhunen-Loève Theorem (KLT)) (i) $E\left[\left(X_t - \sum_{k=1}^d Z_k e_k(t)\right)^2\right] \to 0$ as $d \to \infty$, uniformly for $t \in [0, T]$. Additionally, the $\{Z_k\}_{k\ge1}$ are uncorrelated and satisfy $E[Z_k] = 0$ and $E[Z_k^2] = \lambda_k$. (ii) For any other basis $\{\phi_k\}_{k\ge1}$ with corresponding Fourier coefficients $\{Y_k\}_{k\ge1}$, and any $d \in \mathbb{N}$, we have $\int_0^T E[(\varepsilon_d(t))^2]\,dt \le \int_0^T E[(\tilde\varepsilon_d(t))^2]\,dt$, where $\varepsilon_d$ and $\tilde\varepsilon_d$ are the remainders $\varepsilon_d(t) := \sum_{k\ge d+1} Z_k e_k(t)$ and $\tilde\varepsilon_d(t) := \sum_{k\ge d+1} Y_k \phi_k(t)$.

9 The main idea Proof in: Ghanem, R. G. and Spanos, P. D. (1991). Stochastic Finite Elements: A Spectral Approach. Springer-Verlag, New York Berlin Heidelberg. They credit: Kac, M. and Siegert, A. (1947). An explicit representation of a stationary Gaussian process. Ann. Math. Stat. 18. Karhunen, K. (1947). Über lineare Methoden in der Wahrscheinlichkeitsrechnung. Ann. Acad. Sci. Fennicae, Ser. A. I 37. Loève, M. (1948). Fonctions aléatoires du second ordre. In Processus stochastiques et mouvement Brownien, ed. P. Lévy. Gauthier-Villars, Paris.

10-12 The main idea In order to use the KLT for a chosen process X in any meaningful way we need to determine:
- The eigenvalues $\{\lambda_k\}_{k\ge1}$ and eigenfunctions $\{e_k\}_{k\ge1}$ of the operator K, i.e. solve $\int_0^T \mathrm{Cov}(X_s, X_t)\,e_k(s)\,ds = \lambda_k e_k(t)$, a homogeneous Fredholm integral equation of the second kind.
- The distribution of the Fourier coefficients $\{Z_k\}_{k\ge1}$.
- If we want to simulate an approximate path of X via a Karhunen-Loève expansion (KLE), we also need to know how to simulate the first d Fourier coefficients, i.e. simulate the random vector $Z^{(d)} = (Z_1, \dots, Z_d)$. Note: the components of this vector need not be independent!

13-19 The main idea We don't have full knowledge for many processes, and of those, we know primarily about Gaussian processes:
- Brownian motion
- Brownian bridge
- Anderson-Darling
More Gaussian references and an interesting result for the generalized Brownian bridge in: Barczy, M. and Lovas, R. L. (2016). Karhunen-Loève expansion for a generalization of the Wiener bridge. Preprint, arXiv, v2.
Few complete results (any?) for non-Gaussian processes. A numerical example in: Phoon, K. K., Huang, H. W. and Quek, S. T. (2005). Simulation of strongly non-Gaussian processes using Karhunen-Loève expansion. Probabilistic Engineering Mechanics 20.

20-23 The main idea Goal: For a one-dimensional Lévy process X satisfying $E[X_t] = 0$ and $E[X_t^2] < \infty$, determine:
- $\{\lambda_k\}_{k\ge1}$ and $\{e_k\}_{k\ge1}$
- The distribution of $\{Z_k\}_{k\ge1}$
- The distribution and dependence structure of $Z^{(d)} = (Z_1, \dots, Z_d)$
- How to simulate $Z^{(d)}$

24 Outline: 1 Introduction; 2 Lévy Processes and Infinitely Divisible Random Vectors; 3 Main Results (KLE Components, Simulation); 4 Examples.

25-26 Quick review The distribution of a d-dimensional Lévy process X or infinitely divisible vector ξ is completely determined by the characteristic exponent: $\Psi_X(z) := -\frac{1}{t}\log E[e^{i\langle z, X_t\rangle}]$, or $\Psi_\xi(z) := -\log E[e^{i\langle z, \xi\rangle}]$. The characteristic exponent always has the form
$\Psi(z) = \frac12 z^{T} Q z - i\langle a, z\rangle - \int_{\mathbb{R}^n\setminus\{0\}}\left(e^{i\langle z, x\rangle} - 1 - i\langle z, x\rangle h(x)\right)\nu(dx).$

27 BV Assumption So Ψ, and therefore the distribution, is determined by the generating triple $(a, Q, \nu)_h$, where $a \in \mathbb{R}^d$ ($a \in \mathbb{R}$), $Q \in \mathbb{R}^{d\times d}$ is positive semi-definite ($\sigma^2 \ge 0$) (Gaussian component), ν is a measure (Lévy measure), and h is a cut-off function needed in general to make the integral converge. BV Assumption: if $\int_{|x|<1}|x|\,\nu(dx) < \infty$ then we can set $h \equiv 0$. (X has bounded variation when additionally $Q = 0$, i.e. $\sigma^2 = 0$.)

28 Two examples A scaled Brownian motion with drift, with $E[X_t] = \mu t$ and $\mathrm{Var}(X_t) = \sigma^2 t$, has characteristic exponent $\Psi(z) = \frac{\sigma^2}{2}z^2 - i\mu z$. Since $\nu \equiv 0$ there are no jumps and we have generating triple $(\mu, \sigma^2, 0)$.

29 Two examples For $c, \rho, \hat\rho > 0$ set $\nu(dx) = \mathbb{I}(x < 0)\,c\,\frac{e^{\hat\rho x}}{|x|}\,dx + \mathbb{I}(x > 0)\,c\,\frac{e^{-\rho x}}{x}\,dx$. Then
$\Psi(z) = -\int_{\mathbb{R}\setminus\{0\}}\left(e^{izx} - 1\right)\nu(dx) = c\left(\log\left(1 - \frac{iz}{\rho}\right) + \log\left(1 + \frac{iz}{\hat\rho}\right)\right)$
is the characteristic exponent of a variance gamma process with generating triple $(0, 0, \nu)$.
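To make the closed form concrete, here is a small numerical cross-check (a sketch in Python; the parameter values c = 1, ρ = 2, ρ̂ = 5 are the ones used in Example 1 later, and the test point z = 1.3 is arbitrary). It compares the quadrature of $-\int (e^{izx}-1)\,\nu(dx)$ with $c(\log(1 - iz/\rho) + \log(1 + iz/\hat\rho))$.

```python
import numpy as np
from scipy.integrate import quad

c, rho, rho_hat = 1.0, 2.0, 5.0       # VG parameters used in Example 1 below
z = 1.3                               # an arbitrary test point

# Closed form: Psi(z) = c*(log(1 - iz/rho) + log(1 + iz/rho_hat))
closed = c * (np.log(1 - 1j * z / rho) + np.log(1 + 1j * z / rho_hat))

# Quadrature of -(e^{izx} - 1) against the Lévy density on each half-line.
def integrand(x):                     # -(e^{izx} - 1) * nu'(x), complex-valued
    dens = c * np.exp(-rho * x) / x if x > 0 else c * np.exp(rho_hat * x) / abs(x)
    return -(np.exp(1j * z * x) - 1.0) * dens

pos = quad(lambda x: integrand(x).real, 0, np.inf)[0] + 1j * quad(lambda x: integrand(x).imag, 0, np.inf)[0]
neg = quad(lambda x: integrand(x).real, -np.inf, 0)[0] + 1j * quad(lambda x: integrand(x).imag, -np.inf, 0)[0]
print(closed, pos + neg)              # the two values agree to quadrature accuracy
```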

30-31 Outline: 1 Introduction; 2 Lévy Processes and Infinitely Divisible Random Vectors; 3 Main Results (KLE Components, Simulation); 4 Examples.

32 Covariance and the operator K Under the KLT assumption it is straightforward to show that $\mathrm{Var}(X_t) = E[X_t^2] = \alpha t$, and $\mathrm{Cov}(X_s, X_t) = \alpha\min(s,t)$, where $\alpha = \Psi''(0)$. For example, for our Brownian motion we have $\alpha = \sigma^2$ and for our variance gamma process $\alpha = c(\rho^{-2} + \hat\rho^{-2})$. But this shows that the eigenvalues/eigenfunctions of the operator K for a Lévy process can be derived in exactly the same manner as those of a Brownian motion. And so we get our first result essentially for free...

33 Basis functions Proposition The eigenvalues and associated eigenfunctions of the operator K defined with respect to a Lévy process X are given by
$\lambda_k = \frac{\alpha T^2}{\pi^2\left(k - \frac12\right)^2}$, and $e_k(t) = \sqrt{\frac{2}{T}}\,\sin\!\left(\frac{\pi}{T}\left(k - \frac12\right)t\right)$,
for $k \in \mathbb{N}$ and $t \in [0, T]$.
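A quick way to sanity-check the Proposition numerically is to discretize the covariance operator on a grid (a Nyström-type approximation) and compare its leading eigenvalues with the closed form. The sketch below is illustrative only; the values α = 2, T = 3 and the grid size are arbitrary choices.

```python
import numpy as np

alpha, T, n = 2.0, 3.0, 1500                      # illustrative parameters and grid size
t = (np.arange(n) + 0.5) * T / n                  # midpoint grid on [0, T]
K = alpha * np.minimum.outer(t, t) * (T / n)      # kernel alpha*min(s,t) times quadrature weight
numeric = np.sort(np.linalg.eigvalsh(K))[::-1][:5]

k = np.arange(1, 6)
exact = alpha * T**2 / (np.pi * (k - 0.5)) ** 2   # lambda_k from the Proposition
print(numeric)
print(exact)                                      # the two lists agree to a few digits
```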

34 Basis functions Idea of the Proof: Differentiate both sides of $\int_0^T \alpha\min(s,t)\,e_k(s)\,ds = \lambda_k e_k(t)$ twice with respect to t to reduce the problem to an ODE. Details for the Brownian motion case in Ash, R. B. and Gardner, M. F. (1975). Topics in Stochastic Processes. Academic Press, New York San Francisco London.

35 Total Variance Define the total variance of X on [0, T] as $v(T) := \int_0^T E[X_t^2]\,dt = \frac{\alpha T^2}{2}$. But we also have $v(T) = \sum_{k\ge1}\lambda_k$ (in general). Therefore, the total variance explained by a d-term approximation is
$\frac{\sum_{k=1}^d \lambda_k}{v(T)} = \frac{2}{\pi^2}\sum_{k=1}^d\left(k - \frac12\right)^{-2}.$
Computation yields: the first 2, 5 and 21 terms already explain 90%, 95%, and 99% of the total variance of the process. This holds independently of X, α, and T.
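The percentages quoted above follow directly from the formula; a minimal sketch that reproduces them:

```python
import numpy as np

def explained_variance(d):
    """Fraction of total variance captured by a d-term KLE: (2/pi^2) * sum_k (k - 1/2)^(-2)."""
    k = np.arange(1, d + 1)
    return (2.0 / np.pi**2) * np.sum((k - 0.5) ** -2.0)

for d in (2, 5, 21):
    print(d, round(explained_variance(d), 4))   # about 0.90, 0.96 and 0.99
```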

36 Gaussian processes A key property of Gaussian processes is that the coefficients $\{Z_k\}_{k\ge1}$ are again Gaussian. For example,
$B_t = \sqrt{2}\sum_{k\ge1} Z_k\,\frac{\sin\!\left(\pi\left(k - \frac12\right)t\right)}{\pi\left(k - \frac12\right)},$
where B is a standard Brownian motion on [0, 1] and the $\{Z_k\}_{k\ge1}$ are i.i.d. random variables with common distribution $\mathcal{N}(0, 1)$.
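As an illustration of how such an expansion is used in practice, here is a minimal sketch simulating an approximate Brownian path on [0, 1] from the truncated series above (the truncation level d = 200 and the grid are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(0)
d = 200                                    # truncation level (arbitrary)
t = np.linspace(0.0, 1.0, 501)             # evaluation grid on [0, 1]

Z = rng.standard_normal(d)                 # i.i.d. N(0,1) KLE coefficients
w = np.pi * (np.arange(1, d + 1) - 0.5)    # frequencies pi*(k - 1/2)
B = np.sqrt(2.0) * np.sin(np.outer(t, w)) @ (Z / w)   # truncated KLE of B on the grid
print(B[:5])
```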

37 Key Lemma Lemma Let X be a Lévy process and $\{f_k\}_{k=1}^d$ be a collection of functions which are in $L^1([0,T])$. Then the vector ξ consisting of elements $\xi_k = \int_0^T X_t f_k(t)\,dt$, $k \in \{1, 2, \dots, d\}$, has an ID distribution with characteristic exponent
$\Psi_\xi(z) = \int_0^T \Psi_X\left(\langle z, u(t)\rangle\right)dt, \quad z \in \mathbb{R}^d,$
where $u : [0,T] \to \mathbb{R}^d$ is the function with k-th component $u_k(t) := \int_t^T f_k(s)\,ds$, $k \in \{1, 2, \dots, d\}$.

38 Main Theorem 1 Theorem If X is a Lévy process with generating triple $(a, \sigma^2, \nu)$ that satisfies the KLT and BV assumptions then $Z^{(d)}$ has generating triple $(\mathbf{a}, Q, \Pi)$ where $\mathbf{a}$ is the vector with entries
$\mathbf{a}_k := a\,\frac{(-1)^{k+1}\sqrt{2}\,T^{3/2}}{\pi^2\left(k - \frac12\right)^2}, \quad k \in \{1, 2, \dots, d\},$

39 Main Theorem 1 Theorem (cont.) Q is a diagonal d × d matrix with entries
$q_{k,k} := \frac{\sigma^2 T^2}{\pi^2\left(k - \frac12\right)^2}, \quad k \in \{1, 2, \dots, d\},$
and Π is the measure
$\Pi(B) := \int_{(\mathbb{R}\setminus\{0\})\times[0,T]} \mathbb{I}(f(v) \in B)\,(\nu\otimes\lambda)(dv), \quad B \in \mathcal{B}_{\mathbb{R}^d\setminus\{0\}},$
where λ is the Lebesgue measure on [0, T] and $f : \mathbb{R}\times[0,T] \to \mathbb{R}^d$ is the function
$(x, t) \mapsto \sqrt{2T}\,x\left(\frac{\cos\!\left(\frac{\pi}{T}\left(1 - \frac12\right)t\right)}{\pi\left(1 - \frac12\right)}, \dots, \frac{\cos\!\left(\frac{\pi}{T}\left(d - \frac12\right)t\right)}{\pi\left(d - \frac12\right)}\right).$

40-43 Main Theorem 1 Idea of Proof:
- From the Lemma: $\Psi_{Z^{(d)}}(z) = \int_0^T \Psi_X(\langle z, u(t)\rangle)\,dt$.
- Evaluate the integral in the components of u:
  $u_k(t) = \sqrt{\frac{2}{T}}\int_t^T \sin\!\left(\frac{\pi}{T}\left(k - \frac12\right)s\right)ds = \frac{\sqrt{2T}\,\cos\!\left(\frac{\pi}{T}\left(k - \frac12\right)t\right)}{\pi\left(k - \frac12\right)}.$
- Evaluation of the integral over [0, T] and the orthogonality of $\{u_k\}_{1\le k\le d}$ give $\mathbf{a}$ and Q.
- Fubini's theorem and a change of variables yield Π.
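The closed form for $u_k$ is easy to verify numerically; a small sketch (the values T = 3, k = 4, t = 1.1 are illustrative assumptions) comparing a trapezoidal approximation of $\int_t^T e_k(s)\,ds$ with the cosine expression:

```python
import numpy as np

T, k, t = 3.0, 4, 1.1                              # illustrative values
s = np.linspace(t, T, 20001)
e_k = np.sqrt(2.0 / T) * np.sin(np.pi / T * (k - 0.5) * s)
numeric = np.sum(0.5 * (e_k[1:] + e_k[:-1]) * np.diff(s))   # trapezoidal approximation of u_k(t)
closed = np.sqrt(2.0 * T) * np.cos(np.pi / T * (k - 0.5) * t) / (np.pi * (k - 0.5))
print(numeric, closed)                             # the two values agree to several digits
```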

44 Dependence Corollary $Z^{(d)}$ has independent entries if, and only if, ν is the zero measure. Idea of Proof: (⇐) known, or because Q is diagonal; (⇒) we show that Π is not supported on the coordinate axes.

45 Outline: 1 Introduction; 2 Lévy Processes and Infinitely Divisible Random Vectors; 3 Main Results (KLE Components, Simulation); 4 Examples.

46 The problem How can we simulate a random vector with dependent entries with only knowledge of the characteristic function? In general, this seems to be a difficult problem. Infinite divisibility makes it possible!

47-50 Shot noise Idea: Write $Z^{(d)}$ as a(n) (infinite) sum of simpler random vectors. We use random sequences $\{V_i\}_{i\ge1}$ and $\{\Gamma_i\}_{i\ge1}$ which are independent of each other and defined on a common probability space:
- $\Gamma_i$ is the sum of i i.i.d. exponential random variables with mean 1.
- The $\{V_i\}_{i\ge1}$ are independent and take values in some measurable space D with common distribution F.
- $H : (0,\infty)\times D \to \mathbb{R}^d$ is measurable, and
  $S_n := \sum_{i=1}^n H(\Gamma_i, V_i), \quad n \in \mathbb{N},$ and
  $\mu(B) := \int_0^\infty\!\!\int_D \mathbb{I}(H(r,v)\in B)\,F(dv)\,dr, \quad B \in \mathcal{B}_{\mathbb{R}^d\setminus\{0\}}.$
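In simulation terms the construction is simple: the $\Gamma_i$ are the arrival times of a unit-rate Poisson process, i.e. cumulative sums of standard exponentials, and $S_n$ is a running sum of $H(\Gamma_i, V_i)$. A minimal generic sketch; the map H and the sampler for the $V_i$ used in the toy call are hypothetical placeholders, not taken from the slides:

```python
import numpy as np

def shot_noise_partial_sum(H, sample_V, n, rng):
    """Return S_n = sum_{i=1}^n H(Gamma_i, V_i) for user-supplied H and V-sampler."""
    gammas = np.cumsum(rng.exponential(1.0, size=n))   # Gamma_1 < Gamma_2 < ... (unit-rate Poisson arrivals)
    return sum(H(g, sample_V(rng)) for g in gammas)

# toy usage with placeholder choices of H and F (here V_i uniform on [0, 1]):
rng = np.random.default_rng(1)
S = shot_noise_partial_sum(lambda r, v: np.exp(-r) * np.array([np.cos(v), np.sin(v)]),
                           lambda rng: rng.uniform(), 1000, rng)
print(S)
```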

51 Shot noise Theorem If µ is a Lévy measure satisfying the BV assumption, then $S_n$ converges almost surely to an ID random vector with generating triple (0, 0, µ) as $n \to \infty$. Theorems 3.1, 3.2, and 3.4 in Rosiński, J. (1990). On series representations of infinitely divisible random vectors. The Annals of Probability 18.

52 Shot noise Idea: Show that Π has a disintegrated form like µ. Simplifying assumptions: X has no Gaussian component; X has only positive jumps ($X_t \stackrel{d}{=} X^+_t - X^-_t + B_t$, so $Z^{(d)}_X = Z^{(d)}_{X^+} - Z^{(d)}_{X^-} + Z^{(d)}_B$). Necessary assumption: ν has a strictly positive density π. We want $g(x) := \int_x^\infty \pi(s)\,ds$ to have a well-defined, non-increasing inverse $g^{-1}$.

53 Main Result 2 Theorem If X is a Lévy process with only positive jumps, satisfying the KLT and BV assumptions, such that X has triple $(a, 0, \pi)$, where π is strictly positive, then
$Z^{(d)} \stackrel{d}{=} \mathbf{a} + \sum_{i\ge1} H(\Gamma_i, U_i),$
where $\{U_i\}_{i\ge1}$ is an i.i.d. sequence of uniform random variables on [0, 1], and
$H(r, v) := f\!\left(g^{-1}(r/T),\, T v\right),$
where f is the function defined previously, i.e.
$f(x, t) = \sqrt{2T}\,x\left(\frac{\cos\!\left(\frac{\pi}{T}\left(1 - \frac12\right)t\right)}{\pi\left(1 - \frac12\right)}, \dots, \frac{\cos\!\left(\frac{\pi}{T}\left(d - \frac12\right)t\right)}{\pi\left(d - \frac12\right)}\right).$

54-59 Discussion Nice/Unique features:
$S^{(d)}_t := \sqrt{\frac{2}{T}}\sum_{k=1}^d Z_k \sin\!\left(\frac{\pi}{T}\left(k - \frac12\right)t\right), \qquad Z_k \stackrel{d}{=} \mathbf{a}_k + \sqrt{2T}\sum_{i\ge1} g^{-1}(\Gamma_i/T)\,\frac{\cos\!\left(\pi\left(k - \frac12\right)U_i\right)}{\pi\left(k - \frac12\right)}.$
- $Z^{(d)}$ is independent of t.
- We can increase d incrementally very easily (without starting a fresh simulation).
- $S^{(d)}$ has smooth paths.
Potentially difficult:
- Each summand of $\sum_{i\ge1} H(\Gamma_i, U_i)$ requires the evaluation of d cosine functions.
- Inverting g.
- Convergence of the sum depends on the decay of $g^{-1}$.
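Putting the pieces together, a hedged sketch of how one might simulate $Z^{(d)}$ (and then evaluate $S^{(d)}$) from the series displayed above. The drift vector a and the inverse tail function g_inv are user-supplied; the truncation rule (stop once the jump size drops below tol) and the default tolerance are placeholder choices, not prescriptions from the slides. A concrete g_inv for the variance gamma example appears with Example 1 below.

```python
import numpy as np

def simulate_Z(d, T, a, g_inv, rng, tol=1e-12, max_terms=100_000):
    """Z_k = a_k + sqrt(2T) * sum_i g_inv(Gamma_i/T) * cos(pi*(k-1/2)*U_i) / (pi*(k-1/2)).

    The series is truncated once g_inv(Gamma_i/T) drops below tol (placeholder rule)."""
    k = np.arange(1, d + 1) - 0.5                 # the values k - 1/2
    Z = np.array(a, dtype=float)                  # start from the drift vector a
    gamma = 0.0
    for _ in range(max_terms):
        gamma += rng.exponential(1.0)             # next Poisson arrival Gamma_i
        jump = g_inv(gamma / T)                   # jump size g^{-1}(Gamma_i / T)
        if jump < tol:
            break
        u = rng.uniform()                         # U_i ~ Uniform[0, 1]
        Z += np.sqrt(2.0 * T) * jump * np.cos(np.pi * k * u) / (np.pi * k)
    return Z

def S_path(Z, T, t):
    """Evaluate S^{(d)}_t = sqrt(2/T) * sum_k Z_k * sin(pi*(k-1/2)*t/T) on a grid t."""
    k = np.arange(1, len(Z) + 1) - 0.5
    return np.sqrt(2.0 / T) * np.sin(np.outer(t, np.pi * k / T)) @ Z
```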

60 Outline: 1 Introduction; 2 Lévy Processes and Infinitely Divisible Random Vectors; 3 Main Results (KLE Components, Simulation); 4 Examples.

61 Example 1 Consider a VG process with c = 1, $\rho = 2$, $\hat\rho = 5$, and T = 3. The process $X^+$ has Lévy measure $\nu(dx) = c\,\frac{e^{-\rho x}}{x}\,dx$ so that we have
$g(x) = c\int_x^\infty \frac{e^{-\rho s}}{s}\,ds = c\,E_1(\rho x), \quad\text{and}\quad g^{-1}\!\left(\frac{\Gamma_i}{T}\right) = \frac{1}{\rho}\,E_1^{-1}\!\left(\frac{\Gamma_i}{T c}\right),$
where $E_1$ is the exponential integral function. We truncate the random series when $\Gamma_i/(T c) > 46$, since at this point $g^{-1}(\Gamma_i/T)$ is negligibly small. We expect to generate $46\,T\,c = 138$ random pairs $(U_i, \Gamma_i)$ per path of $X^+$.
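A concrete $g^{-1}$ for this example can be obtained by inverting the exponential integral numerically. This is a sketch only: scipy.special.exp1 provides $E_1$ but not its inverse, so a root finder (scipy.optimize.brentq) is used, and the bracketing interval and sample size are ad-hoc choices. Passing this g_inv to the generic simulate_Z sketch above produces the coefficients for one path of $X^+$; below it is used only to illustrate the truncation rule and the expected number of pairs.

```python
import numpy as np
from scipy.special import exp1          # exponential integral E_1
from scipy.optimize import brentq

c, rho, T = 1.0, 2.0, 3.0               # Example 1 parameters (positive-jump part only)

def g_inv(y):
    """Inverse of g(x) = c*E_1(rho*x): returns x with g(x) = y (0 for very large y)."""
    if y >= c * exp1(rho * 1e-30):      # beyond the huge value of g near 0: treat as 0
        return 0.0
    return brentq(lambda x: c * exp1(rho * x) - y, 1e-30, 1e6)

rng = np.random.default_rng(2)
gammas = np.cumsum(rng.exponential(1.0, size=200))
kept = gammas[gammas / (T * c) <= 46.0]           # truncation rule from the slide
print(len(kept), "pairs kept; expected about", 46 * T * c)
jumps = np.array([g_inv(gam / T) for gam in kept])
print(jumps[:5])
```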

62 Example 1 Sample paths of $S^{(d)}$ with $d \in \{5, 10, 15, 20, 25, 100, 250, 500, 3000\}$.

63 Sample paths of $S^{(d)}$ A closer look at d = 3000 shows the Gibbs phenomenon...

64 Sample paths of $S^{(d)}$ ... but we can remove this if we want. For example, if we form the Cesàro sums $C^{(d)}_t := \frac{1}{d}\sum_{k=1}^d S^{(k)}_t$, the Gibbs phenomenon disappears.
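Since $S^{(k)}_t = \sum_{j=1}^k Z_j e_j(t)$, the Cesàro sum can be computed directly with Fejér-type weights, $C^{(d)}_t = \sum_{j=1}^d \left(1 - \frac{j-1}{d}\right) Z_j e_j(t)$, rather than by averaging d partial sums. A small sketch of this observation (the coefficients in the toy call are arbitrary):

```python
import numpy as np

def cesaro_path(Z, T, t):
    """C^{(d)}_t = (1/d) * sum_{k=1}^d S^{(k)}_t, written with Fejér-type weights (1 - (j-1)/d)."""
    d = len(Z)
    j = np.arange(1, d + 1)
    weights = 1.0 - (j - 1) / d
    e = np.sqrt(2.0 / T) * np.sin(np.outer(t, np.pi * (j - 0.5) / T))   # e_j(t) on the grid
    return e @ (weights * Z)

# toy usage with arbitrary coefficients and T = 3:
t = np.linspace(0.0, 3.0, 7)
print(cesaro_path(np.array([1.0, -0.5, 0.25, 0.1]), 3.0, t))
```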

65 Sample paths of $C^{(d)}$

66 Monte Carlo test Suppose we want to compute $E[e^{X_t}] = e^{-t\Psi(-i)} = (5/3)^t$ on the interval [1, 2] by simulating $10^6$ paths of $S^{(d)}$. The errors:
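As an aside, the benchmark value itself can be cross-checked without the KLE machinery. This is not the estimator from the slides; it uses the standard representation of this variance gamma process as the difference of two gamma subordinators (an assumption stated here, consistent with the characteristic exponent given earlier), purely to confirm the target $(5/3)^t$.

```python
import numpy as np

# Cross-check of E[exp(X_t)] = (5/3)^t for c = 1, rho = 2, rho_hat = 5 via the
# gamma-difference representation X_t =(d) Gamma(ct, rate=rho) - Gamma(ct, rate=rho_hat).
c, rho, rho_hat = 1.0, 2.0, 5.0
rng = np.random.default_rng(3)
n = 10**6
for t in (1.0, 1.5, 2.0):
    X_t = rng.gamma(c * t, 1.0 / rho, size=n) - rng.gamma(c * t, 1.0 / rho_hat, size=n)
    print(t, np.exp(X_t).mean(), (5.0 / 3.0) ** t)   # MC estimate next to the benchmark
```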

67 Discussion This method could be useful if we have to evaluate $E[f(X_t)]$ at many different points in the interval [0, T], or at previously unknown points, say if we wanted to find the minimum in t. If, for example, d = 100 is deemed good enough, then single-precision floating point numbers (realizations of the $\{Z_k\}_{1\le k\le d}$ for all $10^6$ paths) can be stored in memory: $100 \times 10^6 \times 4$ Bytes = 0.4 GB. Even d = 3000 is not impossible: $3000 \times 10^6 \times 4$ Bytes = 12 GB. If we want to compute $E[f(X_t)]$ at points $t_1 \le t_2 \le \dots \le t_d$ in [0, T] then we can also use a fractional FFT algorithm to compute the realizations of $\{S^{(d)}_{t_k}\}_{1\le k\le d}$ with $O(d\log d)$ operations.

68 Financial Applications Consider a VG process X with c = 5, $\rho = \dots$, $\hat\rho = \dots$, and T = 3. We will also add a drift µ whose value is to be determined. Suppose we want to compute
$e^{-r\tau}E\left[\left(A_0\exp(X_\tau) - K\right)^+\right]$, or $e^{-r\tau}E\left[\left(\frac{A_0}{\tau}\int_0^\tau \exp(X_t)\,dt - K\right)^+\right],$
where $r, A_0, K > 0$. Then µ is determined by the risk-neutral condition $\Psi(-i) = -r$, and the expressions represent the price of a European and an Asian call option respectively.

69 European option Replace X by $S^{(d)}$ and do Monte Carlo with $10^6$ iterations. Errors for $\tau \in [1, 2]$ compared to a benchmark computed using a Fourier transform technique.

70 Asian option Replace X by $C^{(d)}$ and do Monte Carlo with $10^6$ iterations. Errors for $\tau \in [1, 2]$ compared to a benchmark computed using a Fourier transform technique.

71 Smooth paths Let $c^{(d)}$ be a realization of $C^{(d)}$ on [0, T]. Then $t \mapsto \exp(c^{(d)}_t)$ is a smooth function of t. Applying the right change of variables, $\cos(\theta) = \frac{2t}{\tau} - 1$, transforms $\int_0^\tau \exp\left(c^{(d)}_t\right)dt$ into the integral of a periodic function over [0, π]. We expect exponential convergence of the trapezoidal rule for such a function. Numerical evaluation is then possible via Clenshaw-Curtis quadrature.
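A sketch of this change of variables with a stand-in smooth path (a placeholder function, since an actual realization $c^{(d)}$ would come from the simulation above): substituting $t = \frac{\tau}{2}(1 + \cos\theta)$ turns the time integral into $\int_0^\pi \exp\!\left(c^{(d)}_{t(\theta)}\right)\frac{\tau}{2}\sin\theta\,d\theta$, which the trapezoidal rule handles with rapidly decaying error.

```python
import numpy as np

tau = 2.0
path = lambda t: 0.3 * np.sin(t) - 0.1 * t        # stand-in for a smooth realization c^{(d)}_t

def integral_by_substitution(n):
    """Trapezoidal rule for int_0^tau exp(path(t)) dt after cos(theta) = 2t/tau - 1."""
    theta = np.linspace(0.0, np.pi, n + 1)
    t = 0.5 * tau * (1.0 + np.cos(theta))
    f = np.exp(path(t)) * 0.5 * tau * np.sin(theta)
    h = np.pi / n
    return h * (0.5 * f[0] + f[1:-1].sum() + 0.5 * f[-1])

for n in (4, 8, 16, 32):
    print(n, integral_by_substitution(n))          # the values settle very quickly
```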

72-80 Some ideas for the future Can we speed the convergence of $E[f(X_t)]$ or $E[F(\{X_t : t\in[0,T]\})]$ by:
- Expanding X in a different orthonormal basis?
- Expanding X in a wavelet basis? Unser, M. and Tafti, P. D. (2014). An Introduction to Sparse Stochastic Processes. Cambridge University Press, Cambridge.
- Determining the KLE of Y, $Y_t := f(X_t)$, directly?
- Other Lévy generalizations of Gaussian processes, e.g. the Lévy bridge?
- In the discrete case, orthogonal transforms seem to speed the rate of Quasi-Monte Carlo convergence when generating Brownian paths. Is something similar true here as well?
Hackmann, D. (2016). Karhunen-Loève expansions of Lévy processes. Preprint, arXiv.
