Statistical inference and Malliavin calculus


José M. Corcuera and Arturo Kohatsu-Higa

Abstract. The derivative of the log-likelihood function, known as the score function, plays a central role in parametric statistical inference. It can be used to study the asymptotic behavior of likelihood and pseudo-likelihood estimators. For instance, one can deduce the local asymptotic normality property, which leads to various asymptotic properties of these estimators. In this article we apply Malliavin calculus to obtain the score function as a conditional expectation. We then show, through different examples, how this idea can be useful for the asymptotic inference of stochastic processes. In particular, we consider situations where there are jumps driving the data process.

Mathematics Subject Classification (2000). Primary 62Mxx, 60Hxx; Secondary 60Fxx, 62Fxx.

Keywords. Diffusion processes, Malliavin calculus, parametric estimation, Cramér-Rao lower bound, LAN property, LAMN property, jump-diffusion processes, score function.

1. Introduction

In classical statistical theory, the Cramér-Rao lower bound is obtained in two steps: an integration by parts and the Cauchy-Schwarz inequality. It therefore seems natural that the integration by parts formulas of Malliavin calculus should play a role in this context. In recent times, the theory of Malliavin calculus has attracted attention in computational finance, where it is used to derive expressions for the Greeks, which measure the sensitivity of option prices (conditional expectations) with respect to certain parameters; see e.g. [3], [4]. We show in this article that similar techniques can be used to derive expressions for the score function in a parametric statistical model and, consequently, to obtain the Fisher information and the Cramér-Rao bounds.

This work is supported by the MEC Grant No. MTM26-32.

The advantage of these expressions is that they do not require the explicit expression of the likelihood function, and their form is appropriate for studying asymptotic properties of the model and of some estimators. The work of Gobet [5] was the first contribution in this direction. He uses the duality property in a Gaussian space to study the local asymptotic mixed normality of parametric diffusion models when the process is observed discretely. Recently, duality properties have been used to obtain Stein-type estimators in the context of a Gaussian space (see [9] and [10]) and in the context of a Poisson process (see [11]). An interesting discussion of different Cramér-Rao bounds is given in [2]. In this paper we use Malliavin calculus with the aim of giving alternative expressions for the score function as conditional expectations of certain quantities involving Skorohod integrals, and we show how to use them to study the asymptotic properties of the statistical model, to derive Cramér-Rao lower bounds and to obtain expressions for the maximum likelihood estimator.

The paper is organized as follows. In the first section we formulate the statistical model in a way that makes it clear that the Cramér-Rao lower bound can be obtained after integrating by parts and using the Cauchy-Schwarz inequality. In the second section we consider the parametric models associated with diffusion processes where the driving process is a Wiener process. In the last section we consider the case of diffusions with jumps. The basic reference on Malliavin calculus is [7]; in particular, we refer the reader to this textbook for definitions and notation. As our goal is to concentrate on general principles rather than technical details, we only briefly sketch the mathematical framework required and explain the procedure through the examples.

2. The Cramér-Rao lower bound

A parametric statistical model is defined as a triplet $(\mathcal{X}, \mathcal{F}, \{P_\theta, \theta \in \Theta\})$, where $\mathcal{X}$ is the sample space that corresponds to the possible values of a certain $n$-dimensional random vector $X = (X_1, X_2, \dots, X_n)$, $\mathcal{F}$ is the $\sigma$-field of observable events and $\{P_\theta, \theta \in \Theta\}$ is the family of possible probability laws of $X$.

However, when the vector $X$ corresponds to observations of a random process in which a certain parameter $\theta$ is involved, it is better to assume that $X$ itself depends explicitly on $\theta$. Then we define a parametric statistical model as a triplet consisting of a probability space $(\Omega, \mathcal{F}, P)$, a parameter space $\Theta$, an open set in $\mathbb{R}^d$, and a measurable map
\[
X : \Omega \times \Theta \to \mathbb{R}^n, \qquad (\omega, \theta) \mapsto X(\omega, \theta),
\]
where in $\Theta$ we consider its Borel $\sigma$-field. As usual, a statistic is a measurable map
\[
T : \mathcal{X} \to \mathbb{R}^m, \qquad x \mapsto T(x).
\]
For simplicity, take $m = d = 1$ and $\Theta$ an open interval in $\mathbb{R}$. Let us denote $g(\theta) := E_\theta(T) := E\big(T(X(\theta))\big)$; then, under smoothness conditions on $g$, we can evaluate $\partial_\theta E\big(T(X(\theta))\big)$, where we write $\partial_\theta = \frac{\partial}{\partial\theta}$.

Definition 2.1. A square integrable statistic $T \in C^1$ is said to be regular if $\partial_\theta E\big(T(X(\theta))\big) = E\big(\partial_\theta T(X(\theta))\big)$ and $\operatorname{Var}\big(T(X(\theta))\big) < \infty$.

Suppose that the family of random variables $\{X(\theta), \theta \in \Theta\}$ is also regular in the following sense:

i. $X(\theta)$ has a density $p(\cdot\,; \theta) \in C^1$ for all $\theta \in \Theta$, with support $\operatorname{supp}(X)$ independent of $\theta$.

ii. $X(\omega, \cdot) \in C^1$ as a function of $\theta$, $\omega \in \Omega$. Furthermore, $\partial_\theta X^j \in L^2(\Omega)$ and $E\big(\partial_\theta X^j \mid X = x\big) \in C^1$ as a function of $(x, \theta)$, $j = 1, \dots, n$.

iii. $\sum_{j=1}^n \frac{\partial_{x_j}\big( E(\partial_\theta X^j \mid X)\, p(X; \theta) \big)}{p(X; \theta)} \in L^2(\Omega)$; here, for any smooth function $h$, we denote $\partial_{x_j} h(X) := \partial_{x_j} h(x)\big|_{x = X}$.

iv. Any statistic $T \in C^1$ with compact support in the interior of $\operatorname{supp}(X)$ is regular.

Remark 2.2. Note that if $T \in C^1$ has compact support in the interior of $\operatorname{supp}(X)$ and the family $\{X(\theta), \theta \in \Theta\}$ is regular, then
\[
\partial_\theta E\big(T(X(\theta))\big) = E\big(\partial_\theta T(X(\theta))\big)
= E\Big( \sum_{j=1}^n \partial_{x_j} T(X)\, \partial_\theta X^j \Big)
= \int_{\mathbb{R}^n} \sum_{j=1}^n \partial_{x_j} T(x)\, E\big(\partial_\theta X^j \mid X = x\big)\, p(x; \theta)\, dx
= -\int_{\mathbb{R}^n} T(x) \sum_{j=1}^n \partial_{x_j}\Big( E\big(\partial_\theta X^j \mid X = x\big)\, p(x; \theta) \Big)\, dx,
\]
since
\[
\lim_{x \to \bar x} E\big(\partial_\theta X^j \mid X = x\big)\, p(x; \theta)\, T(x) = 0, \qquad j = 1, \dots, n,
\]
for all $\theta \in \Theta$ and all $\bar x \in \partial\operatorname{supp}(X)$, where $\partial\operatorname{supp}(X)$ is the boundary of the support of $X$. So,
\[
\partial_\theta E\big(T(X(\theta))\big)
= -E\bigg( T(X) \sum_{j=1}^n \frac{\partial_{x_j}\big( E(\partial_\theta X^j \mid X)\, p(X; \theta) \big)}{p(X; \theta)} \bigg). \tag{2.1}
\]
Here the quotient is defined as $0$ if $p(x; \theta) = 0$.

The following result is a statement of the Cramér-Rao inequality.

Proposition 2.3. Let $T$ be a regular statistic satisfying i-iii and
\[
\lim_{x \to \bar x} E\big(\partial_\theta X^j \mid X = x\big)\, p(x; \theta) = 0, \qquad j = 1, \dots, n, \tag{2.2}
\]
for all $\theta \in \Theta$ and all $\bar x \in \partial\operatorname{supp}(X)$, including $\bar x = \infty$ if $\operatorname{supp}(X)$ is not compact. Then
\[
\operatorname{Var}\big(T(X(\theta))\big)
\ \ge\
\frac{\big( \partial_\theta E(T(X(\theta))) \big)^2}
{E\bigg( \Big( \sum_{j=1}^n \frac{\partial_{x_j}( E(\partial_\theta X^j \mid X)\, p(X; \theta) )}{p(X; \theta)} \Big)^{2} \bigg)}, \tag{2.3}
\]
provided that the denominator is not zero.

Proof. By the boundary condition (2.2) we have that
\[
E\bigg( \sum_{j=1}^n \frac{\partial_{x_j}\big( E(\partial_\theta X^j \mid X)\, p(X; \theta) \big)}{p(X; \theta)} \bigg) = 0.
\]

Then, by condition iii, and since $\partial_\theta X^j \in L^2$ and $T$ is regular, we also have $E\big( |\partial_\theta X^j\, \partial_{x_j} T(X)| \big) < \infty$ and therefore
\[
\lim_{x \to \bar x} E\big(\partial_\theta X^j \mid X = x\big)\, p(x; \theta)\, T(x) = 0, \qquad j = 1, \dots, n,
\]
for all $\theta \in \Theta$ and all $\bar x \in \partial\operatorname{supp}(X)$. Then
\[
\partial_\theta E\big(T(X(\theta))\big)
= -E\bigg( T(X) \sum_{j=1}^n \frac{\partial_{x_j}\big( E(\partial_\theta X^j \mid X)\, p(X; \theta) \big)}{p(X; \theta)} \bigg) \tag{2.4}
\]
and we can write
\[
\partial_\theta E\big(T(X(\theta))\big)
= -E\bigg( \big( T(X) - g(\theta) \big) \sum_{j=1}^n \frac{\partial_{x_j}\big( E(\partial_\theta X^j \mid X)\, p(X; \theta) \big)}{p(X; \theta)} \bigg).
\]
The result follows from the Cauchy-Schwarz inequality. $\square$

In most classical models the above calculation is straightforward, since the explicit form of the density function $p$ is available. This is carried out in the following examples. Our goal later is to show that in some cases where the explicit density is not known, the above bound can still be written without using the form of $p$ directly. This will be the case of elliptic diffusions.

Example 1. Assume that $X = (X_1, X_2, \dots, X_n)$, where $X_i = \theta U_i$ and the $U_i$ are i.i.d. random variables with $U_i \sim \exp(1)$. This situation corresponds to the usual parametric model of $n$ independent observations, exponentially distributed with parameter $\theta$.

We have that
\[
\partial_\theta X_j = \frac{X_j}{\theta}; \qquad
E\big(\partial_\theta X_j \mid X\big)\, p(X; \theta) = \frac{X_j}{\theta}\, \theta^{-n} \exp\Big\{ -\sum_{k=1}^n \frac{X_k}{\theta} \Big\};
\]
\[
\sum_{j=1}^n \frac{\partial_{x_j}\big( E(\partial_\theta X_j \mid X)\, p(X; \theta) \big)}{p(X; \theta)}
= \frac{n}{\theta} - \sum_{j=1}^n \frac{X_j}{\theta^2};
\qquad
E\bigg( \frac{n}{\theta} - \sum_{j=1}^n \frac{X_j}{\theta^2} \bigg) = 0;
\]
\[
\operatorname{Var}\bigg( \frac{n}{\theta} - \sum_{j=1}^n \frac{X_j}{\theta^2} \bigg)
= \operatorname{Var}\bigg( \sum_{j=1}^n \frac{X_j}{\theta^2} \bigg)
= \operatorname{Var}\big( \partial_\theta \log p(X; \theta) \big) = I(\theta),
\]
where $I(\theta)$ is the Fisher information. So we see that, in this case, (2.3) is the classical Cramér-Rao lower bound.
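The computation of Example 1 is easy to check numerically. The following sketch (illustrative parameter values, NumPy only) simulates the model $X_j = \theta U_j$, evaluates the score through the conditional-expectation formula above, checks that it coincides with the classical score $\partial_\theta \log p(X;\theta)$, and compares its empirical variance with the Fisher information $n/\theta^2$.

```python
# Monte Carlo check of Example 1 (hypothetical parameter values):
# for X_j = theta * U_j with U_j ~ Exp(1), the score from the
# conditional-expectation formula, sum_j X_j/theta**2 - n/theta,
# equals d/dtheta log p(X; theta) and has variance n/theta**2.
import numpy as np

rng = np.random.default_rng(0)
theta, n, n_mc = 2.0, 5, 200_000

U = rng.exponential(1.0, size=(n_mc, n))
X = theta * U                                        # observations X_j = theta * U_j

score_ibp = X.sum(axis=1) / theta**2 - n / theta          # formula from the text
score_classical = -n / theta + X.sum(axis=1) / theta**2   # d/dtheta log p(X; theta)

print(np.allclose(score_ibp, score_classical))            # identical expressions
print(score_ibp.var(), n / theta**2)                      # ~ Fisher information n/theta^2
```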

If we also assume that

v. $p(x; \theta)$ is smooth as a function of $\theta$, for every fixed $x$;

vi. for any smooth statistic $T$ with compact support in the interior of $\operatorname{supp}(X)$,
\[
\partial_\theta \int_{\mathbb{R}^n} T(x)\, p(x; \theta)\, dx = \int_{\mathbb{R}^n} T(x)\, \partial_\theta p(x; \theta)\, dx, \qquad \theta \in \Theta,
\]

then we have the following proposition, which shows that (2.3) is just the usual Cramér-Rao inequality.

Proposition 2.4. Assume conditions i-vi. Then, a.e.,
\[
-\sum_{j=1}^n \frac{\partial_{x_j}\big( E(\partial_\theta X^j \mid X = x)\, p(x; \theta) \big)}{p(x; \theta)}
= \partial_\theta \log p(x; \theta), \qquad \theta \in \Theta.
\]

Proof. We have seen in (2.1) that
\[
\partial_\theta E\big(T(X(\theta))\big)
= -E\bigg( T(X) \sum_{j=1}^n \frac{\partial_{x_j}\big( E(\partial_\theta X^j \mid X)\, p(X; \theta) \big)}{p(X; \theta)} \bigg).
\]
On the other hand,
\[
\partial_\theta E\big(T(X(\theta))\big)
= \partial_\theta \int_{\mathbb{R}^n} T(x)\, p(x; \theta)\, dx
= \int_{\mathbb{R}^n} \partial_\theta \log p(x; \theta)\, T(x)\, p(x; \theta)\, dx
= E\big( T(X)\, \partial_\theta \log p(X; \theta) \big).
\]
Using a density argument with respect to $T$, we have that
\[
-\sum_{j=1}^n \frac{\partial_{x_j}\big( E(\partial_\theta X^j \mid X = x)\, p(x; \theta) \big)}{p(x; \theta)}
= \partial_\theta \log p(x; \theta) \quad \text{a.e.} \qquad \square
\]

We have assumed in the above proof that the support of $X$ does not depend on $\theta$. In the following example we suggest a localization method to treat the case where the support depends on $\theta$.

Example 2. Assume that $X = (X_1, X_2, \dots, X_n)$, where $X_i = \theta U_i$ with $U_i \sim \mathrm{Uniform}(0,1)$ independent random variables, $i = 1, \dots, n$, and $\theta > 0$. Consider a statistic $T = T(X) \in C^1$ and a function $\pi : [0,1]^n \to \mathbb{R}$, belonging to $C^1$ in $(0,1)^n$ and continuous in $[0,1]^n$, such that $\pi(U) = 0$ if there exists $i \in \{1, \dots, n\}$ with $U_i \in \{0, 1\}$.

Then
\[
\partial_\theta E\big( T(X)\, \pi(U) \big)
= E\big( \partial_\theta T(X)\, \pi(U) \big)
= E\Big( \sum_{j=1}^n \partial_{x_j} T(X)\, \partial_\theta X_j\, \pi(U) \Big)
= \int_{[0,1]} \!\!\cdots\! \int_{[0,1]} \sum_{j=1}^n \partial_{x_j} T(\theta u)\, u_j\, \pi(u)\, du
\]
\[
= \frac{1}{\theta} \int_{[0,1]} \!\!\cdots\! \int_{[0,1]} \sum_{j=1}^n \partial_{u_j}\big( T(\theta u) \big)\, u_j\, \pi(u)\, du
= -\frac{1}{\theta} \int_{[0,1]} \!\!\cdots\! \int_{[0,1]} T(\theta u) \sum_{j=1}^n \partial_{u_j}\big( u_j\, \pi(u) \big)\, du
= -\frac{1}{\theta}\, E\Big( T(X) \sum_{j=1}^n \partial_{u_j}\big( u_j\, \pi(u) \big)\big|_{u = U} \Big).
\]
Then we have that
\[
\operatorname{Var}\big( T(X) \big)
\ \ge\
\frac{\big( \partial_\theta E( T(X)\, \pi(U) ) \big)^2}
{E\bigg( \Big( \frac{1}{\theta} \sum_{j=1}^n \partial_{u_j}\big( u_j\, \pi(u) \big)\big|_{u = U} \Big)^{2} \bigg)},
\qquad
\partial_{u_j}\big( u_j\, \pi(u) \big) = \pi(u) + u_j\, \partial_{u_j} \pi(u).
\]
Note that if $\pi(U) \equiv 1$ (this can be considered as a limit case) then the Cramér-Rao bound is of order $\theta^2 n^{-2}$, which corresponds to the order of the variance of $\max(X_1, X_2, \dots, X_n)$, the maximum likelihood estimator of $\theta$.
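A quick simulation illustrates the limiting statement of Example 2: the maximum likelihood estimator $\max(X_1, \dots, X_n)$ has variance of order $\theta^2 n^{-2}$, namely the exact Beta-distribution value $\theta^2 n / ((n+1)^2(n+2))$. The values below are illustrative.

```python
# Small simulation for Example 2 (illustrative values): for X_i = theta*U_i,
# U_i ~ Uniform(0,1), the MLE max(X_1,...,X_n) has variance of order theta^2/n^2,
# matching the rate suggested by the limiting localized Cramer-Rao bound.
import numpy as np

rng = np.random.default_rng(7)
theta, n, n_mc = 2.0, 50, 200_000

X = theta * rng.uniform(size=(n_mc, n))
mle = X.max(axis=1)
print(mle.var(), theta**2 * n / ((n + 1)**2 * (n + 2)))   # empirical vs exact ~ theta^2/n^2
```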

3. Statistical inference in Gaussian spaces

In this section we give an alternative derivation of the Cramér-Rao bound using the integration by parts formula of Malliavin calculus. We start by explaining some of the basic concepts of Malliavin calculus. Consider a probability space $(\Omega, \mathcal{F}, P)$ and a Gaussian subspace $\mathcal{H}_1$ of $L^2(\Omega, \mathcal{F}, P)$ whose elements are zero-mean Gaussian random variables. Let $H$ be a separable Hilbert space with scalar product denoted by $\langle \cdot, \cdot \rangle_H$ and norm $\|\cdot\|_H$. We will assume that there exists an isometry
\[
W : H \to \mathcal{H}_1, \qquad h \mapsto W(h),
\]
in the sense that
\[
E\big( W(h_1)\, W(h_2) \big) = \langle h_1, h_2 \rangle_H .
\]
Let $\mathcal{S}$ be the class of smooth random variables $T = f(W(h_1), W(h_2), \dots, W(h_n))$ such that $f$ and all its derivatives have polynomial growth. Given $T \in \mathcal{S}$, we define its derivative as
\[
DT = \sum_{i=1}^n \partial_i f\big( W(h_1), W(h_2), \dots, W(h_n) \big)\, h_i .
\]
$DT$ can be seen as a random variable with values in $H$. Then we can define the stochastic derivative operator
\[
D : \mathbb{D}^{1,2} \subset L^2(\Omega; \mathbb{R}) \to L^2(\Omega; H), \qquad T \mapsto DT,
\]
where $\mathbb{D}^{1,2}$ is the closure of the class of smooth random variables with respect to the norm
\[
\|T\|_{1,2} = \big( E(T^2) + E(\|DT\|_H^2) \big)^{1/2} .
\]
Let $u$ be an element of $L^2(\Omega; H)$ and assume there is an element $\delta(u)$ belonging to $L^2(\Omega)$ such that
\[
E\big( \langle DT, u \rangle_H \big) = E\big( T\, \delta(u) \big)
\]
for any $T \in \mathbb{D}^{1,2}$. Then we say that $u$ belongs to the domain of $\delta$, denoted by $\operatorname{dom}(\delta)$, and that $\delta$ is the adjoint operator of $D$.

Proposition 3.1. Let $h$ be an element of $H$; then $\delta(h) = W(h)$.

Proof. Without loss of generality we can assume that $\|h\|_H = 1$ and that $T = f(W(h), W(h_2), \dots, W(h_n))$ is in $\mathcal{S}$ with $h_i$, $i = 2, \dots, n$, orthogonal to $h$.

Then
\[
E\big( \langle DT, h \rangle_H \big)
= E\big( \partial_1 f(W(h), W(h_2), \dots, W(h_n)) \big)
= \int_{\mathbb{R}} E\big( \partial_1 f(x, W(h_2), \dots, W(h_n)) \big)\, \frac{1}{\sqrt{2\pi}} e^{-\frac{x^2}{2}}\, dx
\]
\[
= \int_{\mathbb{R}} E\big( f(x, W(h_2), \dots, W(h_n)) \big)\, x\, \frac{1}{\sqrt{2\pi}} e^{-\frac{x^2}{2}}\, dx
= E\big( T\, W(h) \big). \qquad \square
\]

Proposition 3.2. If $u = \sum_j F_j h_j$, where $F_j \in \mathcal{S}$ and the $h_j$ are elements of $H$, then
\[
\delta(u) = \sum_j F_j\, W(h_j) - \sum_j \langle DF_j, h_j \rangle_H .
\]

Proof.
\[
E\big( T\, \delta(u) \big)
= E\Big( T \sum_j F_j W(h_j) \Big) - E\Big( T \sum_j \langle DF_j, h_j \rangle_H \Big)
= \sum_j \Big[ E\big( \langle D(TF_j), h_j \rangle_H \big) - E\big( T \langle DF_j, h_j \rangle_H \big) \Big]
\]
\[
= \sum_j E\big( \langle D(TF_j) - T\, DF_j,\, h_j \rangle_H \big)
= \sum_j E\big( F_j \langle DT, h_j \rangle_H \big)
= E\Big( \Big\langle DT, \sum_j F_j h_j \Big\rangle_H \Big)
= E\big( \langle DT, u \rangle_H \big). \qquad \square
\]
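Propositions 3.1 and 3.2 can be illustrated in the simplest finite-dimensional setting, $H = \mathbb{R}$ and $W(h) = \varepsilon \sim N(0,1)$: for $u = F\,h$ with $F = f(\varepsilon)$, Proposition 3.2 gives $\delta(u) = F\varepsilon - f'(\varepsilon)$, and the duality $E(\langle DT, u \rangle_H) = E(T\,\delta(u))$ can be checked by Monte Carlo. The test functions below are arbitrary illustrative choices, not taken from the paper.

```python
# Monte Carlo illustration of the duality E[<DT, u>_H] = E[T * delta(u)]
# in the one-dimensional setting H = R, W(h) = eps ~ N(0,1),
# with u = F(eps)*h and delta(u) = F(eps)*eps - F'(eps) (Proposition 3.2).
import numpy as np

rng = np.random.default_rng(1)
eps = rng.standard_normal(2_000_000)

F, dF = eps**2, 2 * eps           # F = eps^2, so DF = 2*eps * h
T, dT = np.sin(eps), np.cos(eps)  # T = sin(eps), DT = cos(eps) * h

delta_u = F * eps - dF            # Skorohod integral of u = F*h
lhs = np.mean(dT * F)             # E[<DT, u>_H]
rhs = np.mean(T * delta_u)        # E[T * delta(u)]
print(lhs, rhs)                   # the two estimates agree up to MC error
```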

In the following results we assume that our observations are expressed as a measurable map
\[
X : \Omega \times \Theta \to \mathbb{R}^n, \qquad (\omega, \theta) \mapsto X(\omega, \theta),
\]
with $\Theta$ an open subset of $\mathbb{R}$ endowed with its Borel $\sigma$-field, and where the $\sigma$-field in $\Omega$ is the $\sigma$-field generated by $\mathcal{H}_1$.

Theorem 3.3. Let $X^j \in \mathbb{D}^{1,2}$, $j = 1, \dots, n$, and let $Z = Z(\theta)$ be a random variable with values in $H$, in the domain of $\delta$, such that
\[
\langle Z, DX^j \rangle_H = \partial_\theta X^j . \tag{3.1}
\]
If $T$ is a regular unbiased estimator of $g(\theta)$, then
\[
\operatorname{Var}\big( T(X) \big)\, \operatorname{Var}\big( E(\delta(Z) \mid X) \big) \ \ge\ \big( g'(\theta) \big)^2 .
\]
Furthermore, assume that (i) $X$ has a density $p(x; \theta) \in C^1$ as a function of $\theta$, with support $\operatorname{supp}(X)$ independent of $\theta$, and (ii) any smooth statistic with compact support in the interior of $\operatorname{supp}(X)$ is regular and
\[
\partial_\theta \int_{\mathbb{R}^n} T(x)\, p(x; \theta)\, dx = \int_{\mathbb{R}^n} T(x)\, \partial_\theta p(x; \theta)\, dx
\]
for all $\theta \in \Theta$. Then
\[
E\big( \delta(Z) \mid X \big) = \partial_\theta \log p(X; \theta), \quad \text{a.s. and for all } \theta \in \Theta .
\]

Proof.
\[
\partial_\theta E\big( T(X(\theta)) \big) = E\Big( \sum_k \partial_{x_k} T(X)\, \partial_\theta X^k \Big).
\]
If we have $Z$ with $\langle Z, DX^j \rangle_H = \partial_\theta X^j$, then
\[
\partial_\theta E\big( T(X) \big) = E\Big( \sum_k \partial_{x_k} T(X)\, \langle Z, DX^k \rangle_H \Big).
\]
By the chain rule for the derivative operator $D$, $DT(X) = \sum_k \partial_{x_k} T(X)\, DX^k$, so
\[
\partial_\theta E\big( T(X) \big) = E\big( \langle Z, DT(X) \rangle_H \big) = E\big( T(X)\, \delta(Z) \big).
\]

Finally, since $E\big( T(X)\, \delta(Z) \big) = E\big( T(X)\, E(\delta(Z) \mid X) \big)$ and $E(\delta(Z)) = 0$, the Cauchy-Schwarz inequality gives
\[
\big( \partial_\theta E(T(X)) \big)^2 \le \operatorname{Var}\big( T(X) \big)\, \operatorname{Var}\big( E(\delta(Z) \mid X) \big).
\]
Next, we have
\[
E\big( T(X)\, \delta(Z) \big) = \partial_\theta E\big( T(X) \big)
= \partial_\theta \int_{\mathbb{R}^n} T(x)\, p(x; \theta)\, dx
= \int_{\mathbb{R}^n} T(x)\, \partial_\theta p(x; \theta)\, dx
= \int_{\mathbb{R}^n} T(x)\, \partial_\theta \log p(x; \theta)\, p(x; \theta)\, dx
= E\big( T(X)\, \partial_\theta \log p(X; \theta) \big).
\]
Therefore the result follows from a density argument. $\square$

Remark 3.4. The next goal is to provide a somewhat standard way of finding $Z$. For example, if there exists $U$, an $n$-dimensional random vector with values in $H$, such that $\langle U^k, DX^j \rangle_H = \delta_{kj}$, where $\delta_{kj}$ is Kronecker's delta, then
\[
Z = \sum_k U^k\, \partial_\theta X^k
\]
verifies condition (3.1), provided $Z \in \operatorname{dom}(\delta)$. In particular, if the inverse of the matrix $A_{kj} = \langle DX^k, DX^j \rangle_H$ is well defined, then we can take $U^k = \sum_j (A^{-1})_{kj}\, DX^j$. The matrix $A$ is called the Malliavin covariance matrix, and the property that its inverse is well defined implies (see Theorem 2.1.2 in [7]) that the random vector $X$ has a density function $p(x; \theta)$.

In order to understand the role of each of the elements in Theorem 3.3, we treat some classical examples from time series analysis in the present framework.

Example 3. Let $X_j = \varepsilon_j + \theta$, $1 \le j \le n$, where the $\varepsilon_j$ are independent standard normal random variables. Let $\mathcal{H}_1$ be the linear space generated by the sequence $\varepsilon_1, \varepsilon_2, \dots$; then $DX_j = e_j$, where $W(e_j) = \varepsilon_j$ and $\langle e_j, e_k \rangle_H = E(\varepsilon_j \varepsilon_k) = \delta_{jk}$, so $A_{kj} = \delta_{kj}$, and
\[
Z = \sum_{j,k} \partial_\theta X_k\, (A^{-1})_{kj}\, DX_j = \sum_j e_j,
\]
since $\partial_\theta X_k = 1$. Then
\[
\delta(Z) = \sum_j W(e_j) = \sum_j \varepsilon_j = \sum_j (X_j - \theta).
\]
Therefore we obtain the classical Cramér-Rao lower bound
\[
\operatorname{Var}(T) \ \ge\ \frac{\big( \partial_\theta E(T) \big)^2}{n} .
\]

Example 4. Let $X_j = \theta\, \varepsilon_j$, where the $\varepsilon_j$ are independent standard normal random variables. Let $\mathcal{H}_1$ again be the linear space generated by the sequence $\varepsilon_1, \varepsilon_2, \dots$; then $DX_j = \theta e_j$, where $W(e_j) = \varepsilon_j$, $A_{jk} = \theta^2 \delta_{jk}$, and $\partial_\theta X_k = \varepsilon_k$, so we can take
\[
Z = \sum_{j,k} \partial_\theta X_k\, (A^{-1})_{kj}\, DX_j = \frac{1}{\theta} \sum_j \varepsilon_j\, e_j .
\]
Then, using Proposition 3.2,
\[
\delta(Z) = \frac{1}{\theta} \sum_j \big( \varepsilon_j^2 - \langle e_j, e_j \rangle_H \big)
= \frac{1}{\theta} \sum_j \big( \varepsilon_j^2 - 1 \big)
= \sum_j \frac{X_j^2}{\theta^3} - \frac{n}{\theta}.
\]
Therefore we also obtain the classical Cramér-Rao lower bound
\[
\operatorname{Var}(T) \ \ge\ \frac{\theta^2 \big( \partial_\theta E(T) \big)^2}{2n} .
\]
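Example 4 can be checked numerically. The sketch below (illustrative values) verifies that $\delta(Z) = \frac{1}{\theta}\sum_j(\varepsilon_j^2 - 1)$ coincides with the classical score $\sum_j X_j^2/\theta^3 - n/\theta$ and has variance close to the Fisher information $2n/\theta^2$.

```python
# Monte Carlo check of Example 4: for X_j = theta * eps_j with eps_j ~ N(0,1),
# delta(Z) = sum_j (eps_j^2 - 1)/theta = sum_j X_j^2/theta^3 - n/theta
# reproduces the classical score, whose variance is the Fisher information 2n/theta^2.
import numpy as np

rng = np.random.default_rng(2)
theta, n, n_mc = 1.5, 4, 500_000

eps = rng.standard_normal((n_mc, n))
X = theta * eps

delta_Z = (eps**2 - 1).sum(axis=1) / theta                # expression from the example
score = (X**2).sum(axis=1) / theta**3 - n / theta         # d/dtheta log p(X; theta)

print(np.allclose(delta_Z, score))
print(delta_Z.var(), 2 * n / theta**2)                    # ~ Fisher information
```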

As we mentioned in Remark 3.4, the problem is how to find $Z$. The following elementary proposition can be useful in this sense.

Proposition 3.5. Let $X = (X_1, X_2, \dots, X_n)^{\mathrm{T}}$ and $Y = (Y_1, Y_2, \dots, Y_n)^{\mathrm{T}}$ be two random vectors with components in $\mathbb{D}^{1,2}$, where $Y = h(\theta, X)$ with $h \in C^1$ and $h(\theta, \cdot)$ one-to-one for all $\theta$. Assume that
\[
\langle Z, DY_j \rangle_H = d_\theta Y_j - \partial_\theta h_j(\theta, X).
\]
Then $\langle Z, DX_j \rangle_H = \partial_\theta X_j$. Also we have that
\[
\sum_{r,l} DY_r\, (B^{-1})_{rl}\, \big( d_\theta Y_l - \partial_\theta h_l(\theta, X) \big)
= \sum_{r,l} DX_r\, (A^{-1})_{rl}\, \partial_\theta X_l,
\]
where $B_{kj} = \langle DY_j, DY_k \rangle_H$, $A_{kj} = \langle DX_j, DX_k \rangle_H$, and $d_\theta$ means total derivative with respect to $\theta$, that is, $d_\theta Y_l = \partial_\theta \big[ h_l(\theta, X(\theta)) \big]$.

Proof. It is straightforward. $\square$

Next, we give an application of the above proposition.

Example 5. Let $X_j = \theta X_{j-1} + \varepsilon_j$, $j = 1, \dots, n$, where $X_0$ is a constant and the $\varepsilon_j$ are independent standard normal random variables. Let $\mathcal{H}_1$ be the linear space generated by the sequence $\varepsilon_1, \varepsilon_2, \dots$, and define $Y = h(\theta, X)$, where $h_j(\theta, x) = x_j - \theta x_{j-1}$ with $x_0 = X_0$. Therefore $Y_j = \varepsilon_j$, $d_\theta Y_j = d_\theta \varepsilon_j = 0$ and $\partial_\theta h_j(\theta, X) = -X_{j-1}$, so we have that
\[
\sum_{r,l} DY_r\, (B^{-1})_{rl}\, \big( d_\theta Y_l - \partial_\theta h_l(\theta, X) \big)
= \sum_{r,l} e_r\, \delta_{rl}\, X_{l-1}
= \sum_l e_l\, X_{l-1}.
\]

As $W(e_j) = \varepsilon_j$, then
\[
\delta\Big( \sum_l e_l\, X_{l-1} \Big)
= \sum_l W(e_l)\, X_{l-1} - \sum_l \langle DX_{l-1}, e_l \rangle_H
= \sum_l \varepsilon_l\, X_{l-1}
= \sum_l X_{l-1}\big( X_l - \theta X_{l-1} \big),
\]
and we obtain the score function.

Another way to obtain $Z$ in Theorem 3.3 would be to use the relations
\[
\partial_\theta X_j = X_{j-1} + \theta\, \partial_\theta X_{j-1},
\qquad
DX_j = e_j + \theta\, DX_{j-1},
\]
so that
\[
\Big\langle \sum_l e_l X_{l-1},\, DX_j \Big\rangle_H
= X_{j-1} + \theta \Big\langle \sum_l e_l X_{l-1},\, DX_{j-1} \Big\rangle_H,
\]
and therefore, by uniqueness of solutions of difference equations, we have that
\[
\partial_\theta X_j = \Big\langle \sum_l e_l X_{l-1},\, DX_j \Big\rangle_H .
\]
So, if we define
\[
Z = \sum_l e_l\, X_{l-1},
\]
then
\[
\delta(Z) = \sum_l \varepsilon_l\, X_{l-1} = \sum_l X_{l-1}\big( X_l - \theta X_{l-1} \big),
\]
and consequently
\[
\partial_\theta \log p(X; \theta) = \sum_l X_{l-1}\big( X_l - \theta X_{l-1} \big).
\]
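In Example 5 the score vanishes at the conditional least-squares estimator $\hat\theta = \sum_l X_{l-1} X_l / \sum_l X_{l-1}^2$. The following sketch (illustrative values) simulates the autoregression, evaluates the score at the true parameter and computes $\hat\theta$.

```python
# Sketch for Example 5 (AR(1) with standard normal innovations, X_0 a constant):
# the score is sum_l X_{l-1}(X_l - theta*X_{l-1}), so the MLE solves score = 0,
# i.e. theta_hat = sum X_{l-1} X_l / sum X_{l-1}^2. Values below are illustrative.
import numpy as np

rng = np.random.default_rng(3)
theta, n, x0 = 0.7, 5_000, 1.0

X = np.empty(n + 1)
X[0] = x0
for j in range(1, n + 1):                            # X_j = theta*X_{j-1} + eps_j
    X[j] = theta * X[j - 1] + rng.standard_normal()

score = np.sum(X[:-1] * (X[1:] - theta * X[:-1]))    # score at the true theta
theta_hat = np.sum(X[:-1] * X[1:]) / np.sum(X[:-1] ** 2)
print(score / n, theta_hat)                          # score/n ~ 0, theta_hat ~ theta
```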

The continuous-time version of the previous example is given by the following one.

Example 6. Let $X^n = (X_{t_1}, X_{t_2}, \dots, X_{t_n})$, $0 < t_1 < t_2 < \dots < t_n$, be observations of the Ornstein-Uhlenbeck process
\[
dX_t = -\theta X_t\, dt + dB_t, \qquad t \ge 0, \quad X_0 = 0,
\]
or, by integrating,
\[
X_t = \int_0^t e^{-\theta(t-s)}\, dB_s .
\]
Here $\{B_t, t \ge 0\}$ is a standard one-dimensional Brownian motion. Let $\mathcal{H}_1$ be the closed linear space generated by the random variables $\{B_t, 0 \le t \le T\}$ and $H = L^2([0,T], dx)$. Then the map
\[
W : H \to \mathcal{H}_1, \qquad 1_{[0,t]} \mapsto W\big(1_{[0,t]}\big) = \int_0^T 1_{[0,t]}(s)\, dB_s = B_t,
\]
defines a linear isometry, and $W(h)$ is the stochastic integral of the function $h$. Consequently $D_s X_t = e^{-\theta(t-s)}\, 1_{[0,t]}(s)$. We have
\[
d\big( \partial_\theta X_t \big) = \big( -X_t - \theta\, \partial_\theta X_t \big)\, dt, \qquad t \ge 0, \quad \partial_\theta X_0 = 0,
\]
so
\[
\partial_\theta X_t = -\int_0^t e^{-\theta(t-s)}\, X_s\, ds = -\int_0^T X_s\, D_s X_t\, ds = \langle -X, DX_t \rangle_H,
\]
where we write $(DX_t)(s) = D_s X_t$. Then
\[
\partial_\theta \log p(X^n; \theta) = E\big( \delta(-X) \mid X^n \big)
= -E\Big( \int_0^T X_s\, dB_s \,\Big|\, X^n \Big)
= -E\Big( \int_0^T X_s\, dX_s + \theta \int_0^T X_s^2\, ds \,\Big|\, X^n \Big).
\]
In particular, the maximum likelihood estimator of $\theta$ is given by
\[
\hat\theta = -\frac{E\big( \int_0^T X_s\, dX_s \mid X^n \big)}{E\big( \int_0^T X_s^2\, ds \mid X^n \big)}.
\]
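A discretized version of $\hat\theta$ replaces the stochastic and time integrals by Riemann-Itô sums over the observation grid. The sketch below (Euler simulation, illustrative parameter values) is one way to see the estimator at work in the regime $\Delta_n \to 0$, $n\Delta_n \to \infty$ discussed next.

```python
# Minimal sketch for Example 6: Euler simulation of dX_t = -theta*X_t dt + dB_t
# and the discretized MLE
#   theta_hat = -sum X_{t_{i-1}}(X_{t_i}-X_{t_{i-1}}) / (sum X_{t_{i-1}}^2 * Delta),
# in the regime Delta_n -> 0, n*Delta_n -> infinity. Values are illustrative.
import numpy as np

rng = np.random.default_rng(4)
theta, n, delta = 1.0, 200_000, 0.01            # n*delta = 2000

X = np.empty(n + 1)
X[0] = 0.0
for i in range(n):                              # Euler scheme for the OU SDE
    X[i + 1] = X[i] - theta * X[i] * delta + np.sqrt(delta) * rng.standard_normal()

num = -np.sum(X[:-1] * np.diff(X))              # ~ -int X_s dX_s
den = np.sum(X[:-1] ** 2) * delta               # ~ int X_s^2 ds
theta_hat = num / den
print(theta_hat)                                # close to theta; variance ~ 2*theta/(n*delta)
```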

Take now $t_i = i\Delta_n$ and $n$ observations in such a way that $\Delta_n \to 0$ and $n\Delta_n \to \infty$ as $n$ goes to infinity (so that $T = t_n = n\Delta_n$). Then
\[
\frac{1}{\sqrt{n\Delta_n}}\, \partial_\theta \log p(X^n; \theta)
= -\frac{1}{\sqrt{n\Delta_n}}\, E\Big( \int_0^{n\Delta_n} X_s\, dB_s \,\Big|\, X^n \Big).
\]
By ergodicity,
\[
\frac{1}{n\Delta_n} \int_0^{n\Delta_n} X_s^2\, ds \xrightarrow{\ P\ } \frac{1}{2\theta},
\]
so we have that
\[
\frac{1}{\sqrt{n\Delta_n}}\, \delta(-X) \xrightarrow{\ \mathcal{L}\ } N\Big( 0, \frac{1}{2\theta} \Big),
\]
see [6]. On the other hand,
\[
\frac{1}{\sqrt{n\Delta_n}}\, \delta(-X)
= -\frac{1}{\sqrt{n\Delta_n}} \int_0^{n\Delta_n} X_s\, dB_s
= -\frac{1}{\sqrt{n\Delta_n}} \Big( \int_0^{n\Delta_n} X_s\, dX_s + \theta \int_0^{n\Delta_n} X_s^2\, ds \Big)
= -\frac{1}{\sqrt{n\Delta_n}} \sum_{i=1}^n \Big( X_{t_{i-1}}\big( X_{t_i} - X_{t_{i-1}} \big) + \theta\, X_{t_{i-1}}^2 \Delta_n + R_i \Big),
\]
where
\[
R_i = \int_{t_{i-1}}^{t_i} \big( X_s - X_{t_{i-1}} \big)\, dB_s + \theta \int_{t_{i-1}}^{t_i} \big( X_s^2 - X_{t_{i-1}}^2 \big)\, ds .
\]
By straightforward calculations one can see that $\frac{1}{\sqrt{n\Delta_n}} \sum_{i=1}^n R_i \xrightarrow{L^2} 0$, so
\[
\frac{1}{\sqrt{n\Delta_n}}\, \delta(-X) + \frac{1}{\sqrt{n\Delta_n}} \sum_{i=1}^n \Big( X_{t_{i-1}}\big( X_{t_i} - X_{t_{i-1}} \big) + \theta\, X_{t_{i-1}}^2 \Delta_n \Big) \xrightarrow{\ L^2\ } 0,
\]
and, since the second term is $X^n$-measurable, we have that
\[
\frac{1}{\sqrt{n\Delta_n}} \Big( \delta(-X) - E\big( \delta(-X) \mid X^n \big) \Big) \xrightarrow{\ L^2\ } 0,
\]
so finally
\[
\frac{1}{\sqrt{n\Delta_n}}\, E\big( \delta(-X) \mid X^n \big) \xrightarrow{\ \mathcal{L}\ } N\Big( 0, \frac{1}{2\theta} \Big).
\]
Note that the asymptotic Fisher information is given by $\frac{1}{2\theta}$. Also, consider
\[
Z_n\Big( \theta, \theta + \frac{u}{\sqrt{n\Delta_n}} \Big)
:= \log p\Big( X^n; \theta + \frac{u}{\sqrt{n\Delta_n}} \Big) - \log p(X^n; \theta).
\]

We have
\[
Z_n\Big( \theta, \theta + \frac{u}{\sqrt{n\Delta_n}} \Big)
= \int_\theta^{\theta + u/\sqrt{n\Delta_n}} E_\vartheta\big( \delta(-X) \mid X^n \big)\, d\vartheta
= -\int_\theta^{\theta + u/\sqrt{n\Delta_n}} \sum_{i=1}^n \Big( X_{t_{i-1}}\big( X_{t_i} - X_{t_{i-1}} \big) + \vartheta\, X_{t_{i-1}}^2 \Delta_n \Big)\, d\vartheta + M_n,
\]
where
\[
M_n = -\int_\theta^{\theta + u/\sqrt{n\Delta_n}} \sum_{i=1}^n E_\vartheta\big( R_i \mid X^n \big)\, d\vartheta .
\]
We can show that $E(M_n^2) \to 0$. In fact, by the Cauchy-Schwarz inequality,
\[
E(M_n^2) \le \frac{u}{\sqrt{n\Delta_n}} \int_\theta^{\theta + u/\sqrt{n\Delta_n}}
E\bigg( \Big( \sum_{i=1}^n E_\vartheta( R_i \mid X^n ) \Big)^{2} \bigg)\, d\vartheta,
\]
and
\[
E\bigg( \Big( \sum_{i=1}^n E_\vartheta( R_i \mid X^n ) \Big)^{2} \bigg)
= \sum_{i=1}^n E\Big( E_\vartheta( R_i \mid X^n )^2 \Big)
+ 2 \sum_{i<j} E\Big( E_\vartheta( R_i \mid X^n )\, E_\vartheta( R_j \mid X^n ) \Big). \tag{3.3}
\]
For the diagonal terms we change the measure from $P_\theta$ to $P_\vartheta$ and apply again the Cauchy-Schwarz inequality,
\[
E\Big( E_\vartheta( R_i \mid X^n )^2 \Big)
= E_\vartheta\Big( E_\vartheta( R_i \mid X^n )^2\, \frac{p_i}{\tilde p_i} \Big)
\le \Big( E_\vartheta\big( E_\vartheta( R_i \mid X^n )^4 \big) \Big)^{1/2}
\bigg( E_\vartheta\Big( \Big( \frac{p_i}{\tilde p_i} \Big)^{2} \Big) \bigg)^{1/2}
\le C\, \big( E_\vartheta( R_i^4 ) \big)^{1/2},
\]
where $p_i$ and $\tilde p_i$ denote the joint densities of $(X_{t_{i-1}}, X_{t_i})$ under the parameters $\theta$ and $\vartheta$ respectively, evaluated at $(X_{t_{i-1}}, X_{t_i})$.

Using Burkholder's inequality to bound $E_\vartheta(R_i^4)$, and using the explicit form of the Gaussian density of $(X_{t_{i-1}}, X_{t_i})$ to prove that $E_\vartheta\big( (p_i/\tilde p_i)^2 \big)$ is uniformly bounded, one checks that the diagonal sum in (3.3) gives a contribution to $E(M_n^2)$ that tends to zero. Finally, the cross term in (3.3) also goes to zero, since the covariance function of $X$ goes to zero exponentially fast. Then
\[
Z_n\Big( \theta, \theta + \frac{u}{\sqrt{n\Delta_n}} \Big)
= -\frac{u}{\sqrt{n\Delta_n}} \sum_{i=1}^n \Big( X_{t_{i-1}}\big( X_{t_i} - X_{t_{i-1}} \big) + \theta\, X_{t_{i-1}}^2 \Delta_n \Big)
- \frac{u^2}{2\, n\Delta_n} \sum_{i=1}^n X_{t_{i-1}}^2 \Delta_n + M_n,
\]
and, since $X$ is ergodic, we have that
\[
Z_n\Big( \theta, \theta + \frac{u}{\sqrt{n\Delta_n}} \Big)
\xrightarrow{\ \mathcal{L}\ } u\, N\Big( 0, \frac{1}{2\theta} \Big) - \frac{u^2}{4\theta}.
\]
That is, the model satisfies the LAN (Local Asymptotic Normality) property.

3.1. Elliptic diffusions

Let $X^n = (X_{t_1}, X_{t_2}, \dots, X_{t_n})$ be the vector of observations. Here $t_i = i\Delta_n$ with $n\Delta_n = 1$, and we omit the dependence of the partition on $n$. $X$ is defined as the solution to
\[
X_t = x_0 + \int_0^t b(\theta, s, X_s)\, ds + \int_0^t \sigma(\theta, s, X_s)\, dB_s,
\]
and $B$ is a standard Brownian motion. Assume that $b$, $\sigma$ and their derivatives with respect to $\theta$ and $x$ are $C_b^{1,3}$ as functions of $t$ and $x$, and that $\sigma$ is uniformly bounded and uniformly elliptic. Sometimes, when the arguments are clear, we will write $\sigma_s$ and $b_s$ instead of $\sigma(\theta, s, X_s)$ and $b(\theta, s, X_s)$, respectively. Note that the setting here is different from the previous example, where the time frame of the data satisfied $n\Delta_n \to \infty$, while in this subsection we have $n\Delta_n = 1$.

Define
\[
\beta_t = (\partial_x X_t)^{-1}\, \partial_\theta X_t
\qquad\text{and}\qquad
\dot\beta(t) = \sum_{i=1}^n a(t)\, \big( \beta_{t_i} - \beta_{t_{i-1}} \big)\, \frac{\partial_x X_t}{\sigma(\theta, t, X_t)}\, 1_{\{ t_{i-1} \le t < t_i \}},
\]
where $a \in L^2([0,T])$ with $\int_{t_{i-1}}^{t_i} a(t)\, dt = 1$, $i = 1, \dots, n$. It can be shown (see [7]) that
\[
D_s X_t = \partial_x X_t\, (\partial_x X_s)^{-1}\, \sigma(\theta, s, X_s)\, 1_{[0,t]}(s).
\]
Then it is easy to see that $\langle \dot\beta, DX_{t_i} \rangle_H = \partial_\theta X_{t_i}$. In fact,
\[
\langle \dot\beta, DX_{t_i} \rangle_H
= \int_0^{t_i} \partial_x X_{t_i} \sum_{j} a(t)\, \big( \beta_{t_j} - \beta_{t_{j-1}} \big)\, 1_{\{ t_{j-1} \le t < t_j \}}\, dt
= \partial_x X_{t_i} \sum_{j \le i} \big( \beta_{t_j} - \beta_{t_{j-1}} \big) \int_{t_{j-1}}^{t_j} a(t)\, dt
= \partial_x X_{t_i}\, \beta_{t_i}
= \partial_\theta X_{t_i}
\]
(note that $\beta_0 = \partial_\theta X_0 = 0$). By the assumption of uniform ellipticity and the smoothness of the coefficients, we have that $\dot\beta$ is in the domain of $\delta$ and that the score function is given by $E\big( \delta(\dot\beta) \mid X^n \big)$, where, for $a(t) \equiv n$ (recall $n\Delta_n = 1$),
\[
\delta(\dot\beta)
= n \sum_{i=1}^n \big( \beta_{t_i} - \beta_{t_{i-1}} \big) \int_{t_{i-1}}^{t_i} \frac{\partial_x X_t}{\sigma(\theta, t, X_t)}\, dB_t
- n \sum_{i=1}^n \int_{t_{i-1}}^{t_i} D_t\big( \beta_{t_i} - \beta_{t_{i-1}} \big)\, \frac{\partial_x X_t}{\sigma(\theta, t, X_t)}\, dt .
\]

Conditions i and ii of Theorem 3.3 are fulfilled, since $\operatorname{supp}(X)$ is $\mathbb{R}^n$ and $\partial_\theta p(x; \theta)$ is uniformly bounded with respect to $\theta$ by an integrable function. In addition,
\[
\partial_\theta X_t = \int_0^t \big( \partial_\theta b_s + \partial_x b_s\, \partial_\theta X_s \big)\, ds
+ \int_0^t \big( \partial_\theta \sigma_s + \partial_x \sigma_s\, \partial_\theta X_s \big)\, dB_s
\]
and
\[
\partial_x X_t = 1 + \int_0^t \partial_x b_s\, \partial_x X_s\, ds + \int_0^t \partial_x \sigma_s\, \partial_x X_s\, dB_s,
\]
so, by applying the Itô formula, we have that
\[
\beta_t = \frac{\partial_\theta X_t}{\partial_x X_t}
= \int_0^t \mu_s\, ds + \int_0^t \frac{\partial_\theta \sigma_s}{\partial_x X_s}\, dB_s, \tag{3.4}
\]
for a certain adapted process $\mu$ that can be explicitly calculated. Then, by the central limit theorem for power and bipower variations in [1] and the polarization identity, we have that
\[
\sqrt{n} \sum_{i=1}^n \bigg[ \big( \beta_{t_i} - \beta_{t_{i-1}} \big) \int_{t_{i-1}}^{t_i} \frac{\partial_x X_t}{\sigma_t}\, dB_t
- \int_{t_{i-1}}^{t_i} \frac{\partial_\theta \sigma_t}{\sigma_t}\, dt \bigg]
\xrightarrow{\ \mathcal{L}\ } \sqrt{2} \int_0^1 \frac{\partial_\theta \sigma_s}{\sigma_s}\, dW_s .
\]
Here $W$ is a Brownian motion independent of $B$. Moreover, for $t_{i-1} \le t < t_i$,
\[
D_t \beta_{t_i} = \frac{\partial_\theta \sigma_t}{\partial_x X_t}
+ \int_t^{t_i} D_t \mu_s\, ds
+ \int_t^{t_i} D_t\Big( \frac{\partial_\theta \sigma_s}{\partial_x X_s} \Big)\, dB_s,
\]
and therefore we have
\[
\sqrt{n} \sum_{i=1}^n \bigg[ \int_{t_{i-1}}^{t_i} D_t\big( \beta_{t_i} - \beta_{t_{i-1}} \big)\, \frac{\partial_x X_t}{\sigma_t}\, dt
- \int_{t_{i-1}}^{t_i} \frac{\partial_\theta \sigma_t}{\sigma_t}\, dt \bigg]
\xrightarrow{\ L^2\ } 0 .
\]
Consequently,
\[
\frac{1}{\sqrt{n}}\, \delta(\dot\beta) \xrightarrow{\ \mathcal{L}\ } \sqrt{2} \int_0^1 \frac{\partial_\theta \sigma_s}{\sigma_s}\, dW_s . \tag{3.5}
\]

Moreover, by using the polarization identity and Burkholder's inequality, we can see that
\[
\frac{1}{\sqrt{n}} \bigg( \delta(\dot\beta)
- \sum_{i=1}^n \frac{\partial_\theta \sigma_{t_{i-1}}}{\sigma_{t_{i-1}}^3} \Big( \frac{(X_{t_i} - X_{t_{i-1}})^2}{\Delta_n} - \sigma_{t_{i-1}}^2 \Big) \bigg)
\xrightarrow{\ L^\alpha\ } 0,
\]
for any $\alpha \ge 1$. So, finally,
\[
\frac{1}{\sqrt{n}}\, E\big( \delta(\dot\beta) \mid X^n \big)
\xrightarrow{\ \mathcal{L}\ } \sqrt{2} \int_0^1 \frac{\partial_\theta \sigma_s}{\sigma_s}\, dW_s .
\]
In particular, this implies that the asymptotic Fisher information for $\theta$ is given by
\[
2\, E\bigg[ \int_0^1 \Big( \frac{\partial_\theta \sigma_s}{\sigma_s} \Big)^2 ds \bigg].
\]
If we define
\[
Z_n\big( \theta, \theta + u/\sqrt{n} \big)
:= \log p\big( X^n; \theta + u/\sqrt{n} \big) - \log p(X^n; \theta)
= \int_\theta^{\theta + u/\sqrt{n}} E_\vartheta\big( \delta(\dot\beta) \mid X^n \big)\, d\vartheta,
\]
it can be seen that
\[
E_\vartheta\big( \delta(\dot\beta) \mid X^n \big)
= \sum_{i=1}^n \bigg\{ \frac{\partial_\vartheta \sigma_{t_{i-1}}}{\sigma_{t_{i-1}}^3} \Big( \frac{(X_{t_i} - X_{t_{i-1}})^2}{\Delta_n} - \sigma_{t_{i-1}}^2 \Big)
+ R_i\big( \vartheta, X_{t_{i-1}}, X_{t_i} \big) \bigg\}, \tag{3.6}
\]
where $E_\vartheta\big( |R_i(\vartheta, X_{t_{i-1}}, X_{t_i})|^\alpha \big)^{1/\alpha} = O(1/n)$, uniformly in $i$, for any $\alpha > 1$. Then
\[
E\big| R_i(\vartheta, X_{t_{i-1}}, X_{t_i}) \big|
= E_\vartheta\Big( \big| R_i(\vartheta, X_{t_{i-1}}, X_{t_i}) \big|\, \frac{p_i}{\tilde p_i} \Big)
\le E_\vartheta\big( |R_i|^\alpha \big)^{1/\alpha}\,
E_\vartheta\bigg[ \Big( \frac{p_i}{\tilde p_i} \Big)^{\beta} \bigg]^{1/\beta},
\]
where $1/\alpha + 1/\beta = 1$, $\alpha > 1$, $\beta > 1$, $R_i = R_i(\vartheta, X_{t_{i-1}}, X_{t_i})$, and $p_i$ (resp. $\tilde p_i$) is the joint density of $(X_{t_{i-1}}, X_{t_i})$ under the parameter $\theta$ (resp. $\vartheta$), evaluated at $(X_{t_{i-1}}, X_{t_i})$. The expectation under the law with parameter $\vartheta$ is denoted by $E_\vartheta$.

Now, since $\sigma \in C^{1,3}$ is uniformly elliptic, the transition densities can be bounded from above and below, uniformly, by Gaussian kernels, and then by continuity there exists $\beta > 1$ such that
\[
E_\vartheta\bigg[ \Big( \frac{p_i}{\tilde p_i} \Big)^{\beta} \bigg] < C,
\]
see Proposition 5.1 in [5]. So
\[
\int_\theta^{\theta + u/\sqrt{n}} E\big| R_i(\vartheta, X_{t_{i-1}}, X_{t_i}) \big|\, d\vartheta
\le C \int_\theta^{\theta + u/\sqrt{n}} E_\vartheta\big( |R_i|^\alpha \big)^{1/\alpha}\, d\vartheta
\le C\, u\, n^{-3/2}.
\]
Now we only need to calculate the derivative with respect to $\vartheta$, for fixed $X^n$, of the main term in (3.6). We obtain, after some calculations,
\[
\frac{1}{n}\, \partial_\vartheta E_\vartheta\big( \delta(\dot\beta) \mid X^n \big)
= \frac{1}{n} \sum_{i=1}^n \bigg\{ \Big( \frac{\partial_\vartheta^2 \sigma_{t_{i-1}}}{\sigma_{t_{i-1}}^3} - \frac{3 (\partial_\vartheta \sigma_{t_{i-1}})^2}{\sigma_{t_{i-1}}^4} \Big)
\Big( \frac{(X_{t_i} - X_{t_{i-1}})^2}{\Delta_n} - \sigma_{t_{i-1}}^2 \Big)
- \frac{2 (\partial_\vartheta \sigma_{t_{i-1}})^2}{\sigma_{t_{i-1}}^2} \bigg\} + o_P(1);
\]
the first summand inside the braces has conditional mean close to zero, so this expression converges in probability to $-2 \int_0^1 \big( \partial_\theta \sigma_s / \sigma_s \big)^2 ds$ at $\vartheta = \theta$. Finally, taking into account the continuity of the derivatives with respect to $\theta$, we have
\[
Z_n\big( \theta, \theta + u/\sqrt{n} \big)
\xrightarrow{\ \mathcal{L}\ }
u\, \sqrt{2} \int_0^1 \frac{\partial_\theta \sigma_s}{\sigma_s}\, dW_s
- u^2 \int_0^1 \Big( \frac{\partial_\theta \sigma_s}{\sigma_s} \Big)^2 ds .
\]
That is, the model satisfies the LAMN (Local Asymptotic Mixed Normality) property.
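The discretized score appearing in (3.6) can be used directly as an estimating function. The sketch below simulates a diffusion with a hypothetical coefficient $\sigma(\theta, x) = \theta\sqrt{1+x^2}$ (an illustrative model, not one taken from the paper), evaluates the discretized score at the true parameter and solves it explicitly for $\theta$.

```python
# Hedged sketch of the discretized score suggested by (3.6) for
# dX_t = b(X_t) dt + sigma(theta, X_t) dB_t observed on a grid of mesh 1/n on [0,1].
# The model below (b(x) = -x, sigma(theta,x) = theta*sqrt(1+x^2)) is illustrative.
import numpy as np

rng = np.random.default_rng(5)
theta, n = 0.8, 100_000
delta = 1.0 / n

def sigma(th, x):
    return th * np.sqrt(1.0 + x**2)

def dsigma_dtheta(x):
    return np.sqrt(1.0 + x**2)

X = np.empty(n + 1)
X[0] = 0.0
for i in range(n):                               # Euler scheme
    X[i + 1] = X[i] - X[i] * delta + sigma(theta, X[i]) * np.sqrt(delta) * rng.standard_normal()

dX = np.diff(X)
s_prev = sigma(theta, X[:-1])
# discretized score: sum_i dsigma/sigma^3 * ((Delta X_i)^2/Delta_n - sigma^2)
score = np.sum(dsigma_dtheta(X[:-1]) / s_prev**3 * (dX**2 / delta - s_prev**2))
# solving score = 0 for sigma = theta*s(x) gives an explicit estimator:
theta_hat = np.sqrt(np.mean(dX**2 / (delta * (1.0 + X[:-1]**2))))
print(score / np.sqrt(n), theta_hat)             # score/sqrt(n) ~ 0; theta_hat ~ theta
```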

4. Statistical inference for models with jumps

In this section we show how to treat cases where the observed process has jumps. Essentially, the question reduces to the use of an appropriate extension of Malliavin calculus to processes with jumps, that is, of the integration by parts formulas. First, we consider the case where the vector of observations $Z^n$ has a Gaussian component, cf. [4]. That is, $Z$ is a sequence of $\mathcal{G} := \mathcal{F} \vee \mathcal{H}$-measurable random variables, where $\mathcal{F}$ is the $\sigma$-field generated by the isonormal Gaussian process and $\mathcal{H}$ is a $\sigma$-field independent of $\mathcal{F}$. For instance, $Z_k = F(\theta, X_k, Y_k)$, $k = 1, \dots, n$, where $X^n = (X_1, \dots, X_n)$ is an $\mathcal{F}$-measurable random vector and $Y^n = (Y_1, \dots, Y_n)$ is independent of $\mathcal{F}$. We also set $Z^n = (Z_1, \dots, Z_n)$. Let $T$ be a regular statistic, $T = T(Z^n)$, and $g(\theta) = E_\theta(T)$. Then we have
\[
\partial_\theta E_\theta(T) = \partial_\theta E\big( T(Z^n) \big) = E\big( \partial_\theta T(Z^n) \big)
= E\Big( \sum_i \partial_{z_i} T(Z^n)\, \partial_\theta Z_i \Big).
\]
Now we extend the operator $D$ to the product structure (this is also called partial Malliavin calculus) so that
\[
DZ_k = \partial_x F(\theta, X_k, Y_k)\, DX_k .
\]
Then, using this duality property, we have that
\[
DT = \sum_i \partial_{z_i} T(Z^n)\, DZ_i = \sum_i \partial_{z_i} T(Z^n)\, \partial_x F(\theta, X_i, Y_i)\, DX_i .
\]
As before, if there exists a random variable $V$ with values in $H$ such that $\langle DZ_i, V \rangle_H = \partial_\theta Z_i$, we will have that
\[
\partial_\theta \log p(Z^n; \theta) = E\big( \delta(V) \mid Z^n \big).
\]

4.1. Jump diffusions

Let $X^n = (X_{t_1}, X_{t_2}, \dots, X_{t_n})$, $t_i = i\Delta_n$ with $n\Delta_n = 1$, where
\[
X_t = x_0 + \int_0^t b(\theta, s, X_s)\, ds + \int_0^t \sigma(\theta, s, X_s)\, dB_s
+ \int_0^t \int_{\mathbb{R}_0} c(\theta, s, z, X_{s^-})\, \tilde M(dz, ds),
\]
$\tilde M(dz, ds)$ is a compensated Poisson random measure with intensity $\nu(dz, ds) = \nu_s(dz)\, ds$, and $B$ is an independent standard Brownian motion. Assume that $b$, $\sigma$, $c$ and their derivatives with respect to $\theta$ and $x$ are $C_b^{1,3}$ as functions of $t$ and $x$, and that $\sigma$ is uniformly bounded below by a positive constant. By using the same arguments as in the continuous case, and assuming that $\partial_x c(\theta, s, z, x) > -1$, we have that
\[
D_s X_t = \partial_x X_t\, (\partial_x X_s)^{-1}\, \sigma(\theta, s, X_s)\, 1_{[0,t]}(s).
\]

Then, writing
\[
\beta_t = (\partial_x X_t)^{-1}\, \partial_\theta X_t
\qquad\text{and}\qquad
\dot\beta(t) = \sum_{i=1}^n a(t)\, \big( \beta_{t_i} - \beta_{t_{i-1}} \big)\, \frac{\partial_x X_t}{\sigma(\theta, t, X_t)}\, 1_{\{ t_{i-1} \le t < t_i \}},
\]
where $a \in L^2([0,T])$ with $\int_{t_{i-1}}^{t_i} a(t)\, dt = 1$, $i = 1, \dots, n$, we have
\[
\langle \dot\beta, DX_{t_i} \rangle_H = \partial_\theta X_{t_i} .
\]
Consequently, the score function is $E\big( \delta(\dot\beta) \mid X^n \big)$, where
\[
\delta(\dot\beta)
= \sum_{i=1}^n \big( \beta_{t_i} - \beta_{t_{i-1}} \big) \int_{t_{i-1}}^{t_i} a(t)\, \frac{\partial_x X_t}{\sigma(\theta, t, X_t)}\, dB_t
- \sum_{i=1}^n \int_{t_{i-1}}^{t_i} a(t)\, D_t\big( \beta_{t_i} - \beta_{t_{i-1}} \big)\, \frac{\partial_x X_t}{\sigma(\theta, t, X_t)}\, dt .
\]
Also, we will have that
\[
Z_n\big( \theta, \theta + u/\sqrt{n} \big)
\xrightarrow{\ \mathcal{L}\ }
u\, \sqrt{2} \int_0^1 \frac{\partial_\theta \sigma_s}{\sigma_s}\, dW_s
- u^2 \int_0^1 \Big( \frac{\partial_\theta \sigma_s}{\sigma_s} \Big)^2 ds .
\]
Notice that we would obtain a different asymptotic regime if $\sigma_s$ did not depend on $\theta$, as we have seen in Example 6. This is also the case in the next subsections.

4.2. The pure jump case

In the previous calculation we have used the Brownian motion in order to apply the integration by parts property that leads to an expression for the score. In this section we use the jump structure instead and, for the sake of simplicity, we assume that there is no Brownian component. Clearly, this also leads to different expressions for the same models if we consider models with both Brownian and jump components.

Processes with different intensity. Let $Y_t$ be an observation of a compound Poisson process with parameter $\lambda$ and with jumps given by a random variable $X$. Then
\[
\partial_\lambda E\big( 1_{\{Y_t \in A\}} \big)
= \partial_\lambda \sum_{n \ge 0} P\big( X_1 + \dots + X_n \in A \big)\, e^{-\lambda t} \frac{(\lambda t)^n}{n!}
= \sum_{n \ge 0} P\big( X_1 + \dots + X_n \in A \big)\, e^{-\lambda t} \frac{(\lambda t)^n}{n!} \Big( \frac{n}{\lambda} - t \Big)
\]
\[
= -t\, P(Y_t \in A) + \frac{1}{\lambda} \sum_{n \ge 0} n\, P\big( X_1 + \dots + X_n \in A \big)\, e^{-\lambda t} \frac{(\lambda t)^n}{n!}
= -t\, E\big( 1_{\{Y_t \in A\}} \big) + \frac{1}{\lambda}\, E\big( 1_{\{Y_t \in A\}}\, N_t \big)
= E\Big( 1_{\{Y_t \in A\}} \Big( \frac{N_t}{\lambda} - t \Big) \Big),
\]
where we understand $X_1 + \dots + X_n := 0$ if $n = 0$. Suppose now that we observe $Y_{t_1}, Y_{t_2}, \dots, Y_{t_n}$, $t_1 \le \dots \le t_n = T$. Since the increments are independent, the score function is given by
\[
E\Big( \frac{N_T}{\lambda} - T \,\Big|\, Y_{t_1}, Y_{t_2}, \dots, Y_{t_n} \Big)
= \frac{1}{\lambda} \sum_{i=0}^{n-1} E\big( N_{t_{i+1}} - N_{t_i} \mid Y_{t_{i+1}} - Y_{t_i} \big) - T .
\]
Here
\[
E\big( N_t \mid Y_t \big) = \frac{\sum_{k \ge 1} k\, f^{*k}(Y_t)\, \frac{(\lambda t)^k}{k!}}{\sum_{i \ge 1} f^{*i}(Y_t)\, \frac{(\lambda t)^i}{i!}}
\]
(for $Y_t \neq 0$; if $Y_t = 0$ then $N_t = 0$ when the jump law has a density), where $f^{*i}$ stands for the $i$-th convolution of the density of the random variable $X$. We remark that this is related to Proposition 3.6 in [3], where the authors use Girsanov's theorem.
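The score in $\lambda$ can be evaluated numerically from the convolution formula for $E(N_t \mid Y_t)$. The sketch below (illustrative values; standard normal jumps, so that $f^{*k}$ is the $N(0,k)$ density; SciPy is used for the densities) simulates compound Poisson increments and checks that the score fluctuates around zero at the true intensity.

```python
# Hedged numerical check of the score formula
#   sum_i E(N_{t_i}-N_{t_{i-1}} | Y_{t_i}-Y_{t_{i-1}}) / lambda - T
# for a compound Poisson process observed on a grid, with N(0,1) jumps
# (an illustrative jump law; f^{*k} is then the N(0,k) density).
import numpy as np
from scipy.stats import norm, poisson

rng = np.random.default_rng(6)
lam, T, n = 2.0, 10.0, 100
dt = T / n
kmax = 60                                         # truncation of the sum over k

dN = rng.poisson(lam * dt, size=n)                # number of jumps per increment
dY = np.array([rng.standard_normal(k).sum() for k in dN])

def cond_mean_N(y, lam):
    """E(N_dt | Y-increment = y); E(N|0) = 0 for a continuous jump law."""
    if y == 0.0:
        return 0.0
    k = np.arange(1, kmax)
    w = poisson.pmf(k, lam * dt) * norm.pdf(y, scale=np.sqrt(k))
    return np.sum(k * w) / np.sum(w)

score = sum(cond_mean_N(y, lam) for y in dY) / lam - T
print(score)                                      # fluctuates around 0 at the true lambda
```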

The trajectories of a compound Poisson process can be described by indicating the times and amplitudes of the jumps. To see this, let $\Omega = \bigcup_{k \ge 0} (\mathbb{R}_+ \times \mathbb{R})^k$ be the sample space and denote an element $\omega \in \Omega$ as $\omega = \big( (s_1, x_1), \dots, (s_k, x_k) \big)$ for a certain natural number $k$. In this space we assume that the canonical random variables $s_j - s_{j-1} \sim \exp(\lambda)$ and $x_j \sim f(x)\,dx$ form independent sequences which are also independent of each other. Next, consider the maps
\[
Y_t : \Omega \to \mathbb{R}, \qquad
\omega = \big( (s_1, x_1), \dots, (s_k, x_k) \big) \mapsto Y_t(\omega) = \sum_{i=1}^k x_i\, 1_{[s_i, \infty)}(t);
\]
the family $(Y_t)_{t \ge 0}$ is a compound Poisson process. Then, note that
\[
\partial_\lambda E\big( 1_{\{Y_t \in A\}} \big)
= \sum_{n \ge 0} e^{-\lambda t} \frac{(\lambda t)^n}{n!}\, t\, \big( P(X_1 + \dots + X_{n+1} \in A) - P(X_1 + \dots + X_n \in A) \big)
\]
\[
= \sum_{n \ge 0} e^{-\lambda t} \frac{(\lambda t)^n}{n!}\, t\, E\big( 1_{\{X_1 + \dots + X_n + x \in A\}} - 1_{\{X_1 + \dots + X_n \in A\}} \big)\big|_{x = X_{n+1}}
= \int_0^t \int_{\mathbb{R}} E\big( \Psi_{s,x} 1_{\{Y_t \in A\}} \big)\, f(x)\, dx\, ds,
\]
where, for any random variable $F$, we let $\Psi_{s,x} F(\omega) = F(\omega_{s,x}) - F(\omega)$, and $\omega_{s,x}$ is a trajectory like $\omega$ but with an added jump of amplitude $x$ at time $s$. We can also write
\[
\partial_\lambda E\big( 1_{\{Y_t \in A\}} \big)
= \frac{1}{\lambda}\, E\Big( \int_0^T \int_{\mathbb{R}} \Psi_{s,x} 1_{\{Y_t \in A\}}\, \nu(dx)\, ds \Big),
\]
where $\nu(dx) = \lambda f(x)\, dx$ is the Lévy measure of the compound Poisson process $Y$. If we denote by $\delta$ the adjoint operator of $\Psi$, we deduce that
\[
\delta(1) = N_T - \lambda T .
\]
It can be seen that if $u_{s,x}$ is a predictable random field on $[0,T] \times \mathbb{R}$ belonging to the domain of the operator $\delta$, then
\[
\delta(u) = \int_0^T \int_{\mathbb{R}} u_{s,x}\, \tilde M(dx, ds),
\]
where $\tilde M$ is the compensated random measure associated with the compound Poisson process (we assume here that $\int |x|\, \nu(dx) < \infty$); see [8] for the Poisson process and [12] for the general case, and note that the authors use the operator $\Psi_{s,x}/x$ instead of $\Psi_{s,x}$.

Processes with different amplitude parameter. Let
\[
L_i := Y_{t_i} - Y_{t_{i-1}} = \int_{t_{i-1}}^{t_i} \int_{\mathbb{R}} g(\theta, x)\, M(dx, ds),
\qquad t_i = \frac{iT}{n}, \quad i = 1, \dots, n,
\]
with
\[
M(dx, ds) = \sum_k \delta_{\{\tau_k\}}(ds)\, \delta_{\{X_k\}}(dx),
\]
where $0 < \tau_1 < \tau_2 < \dots$ is the sequence of jump times of a Poisson process with intensity one, $(X_i)_{i \ge 1}$ is an i.i.d. sequence, independent of $(\tau_i)_{i \ge 1}$, with density $f$, and $g(\theta, x)$ is a deterministic function. Note that the intensity of the Lévy process is assumed to be known and the inference is on the amplitude of the jumps. The compound Poisson process associated with $M(dx, ds)$ is
\[
Z_t = \int_0^t \int_{\mathbb{R}} x\, M(dx, ds).
\]
Note that when $Z$ jumps $x$ units, $Y_t = \int_0^t \int_{\mathbb{R}} g(\theta, x)\, M(dx, ds)$ jumps $g(\theta, x)$ units, but with the same frequency as $Z$. In fact, $Y$ is a compound Poisson process with random Poisson measure
\[
M^Y(dx, ds) = \sum_k \delta_{\{\tau_k\}}(ds)\, \delta_{\{g(\theta, X_k)\}}(dx).
\]
In such a situation we have, for a regular statistic $T = T(L_1, \dots, L_n)$,
\[
\partial_\theta E(T) = E\Big( \sum_j \partial_{l_j} T\, \partial_\theta L_j \Big),
\]
and
\[
L_j = \int_{t_{j-1}}^{t_j} \int_{\mathbb{R}} g(\theta, x)\, M(dx, ds) = \sum_k g(\theta, X_k)\, 1_{\{\tau_k \in (t_{j-1}, t_j]\}}.
\]

Therefore
\[
\partial_\theta E(T)
= E\Big( \sum_j \partial_{l_j} T \sum_k \partial_\theta g(\theta, X_k)\, 1_{\{\tau_k \in (t_{j-1}, t_j]\}} \Big)
= E\Big( \sum_j \sum_k \partial_{l_j} T\, \partial_{x_k} L_j\, \frac{\partial_\theta g(\theta, X_k)}{\partial_x g(\theta, X_k)}\, 1_{\{\tau_k \in (t_{j-1}, t_j]\}} \Big),
\]
since we can write $\partial_{x_k} L_j = \partial_x g(\theta, X_k)\, 1_{\{\tau_k \in (t_{j-1}, t_j]\}}$. Hence
\[
\partial_\theta E(T)
= E\Big( \sum_k \partial_{x_k} T\, \frac{\partial_\theta g(\theta, X_k)}{\partial_x g(\theta, X_k)} \Big)
= -E\Bigg( T \sum_k \frac{\partial_x\Big( \frac{\partial_\theta g(\theta, x)\, f(x)}{\partial_x g(\theta, x)} \Big)\Big|_{x = X_k}}{f(X_k)} \Bigg),
\]
where in the last equality we use the independence between $(X_k)_k$ and the jump times, and where we assume that there are no border effects, that is,
\[
\lim_{x \to \bar x} \frac{\partial_\theta g(\theta, x)\, f(x)}{\partial_x g(\theta, x)} = 0, \qquad \bar x \in \partial \operatorname{supp}(f).
\]
Finally,
\[
\partial_\theta E(T)
= -E\Bigg( T \int_{t_0}^{t_n} \int_{\mathbb{R}} \frac{\partial_x\Big( \frac{\partial_\theta g(\theta, x)\, f(x)}{\partial_x g(\theta, x)} \Big)}{f(x)}\, M(dx, ds) \Bigg).
\]

So, the score function is given by
\[
-E\Bigg( \int_{t_0}^{t_n} \int_{\mathbb{R}} \frac{\partial_x\Big( \frac{\partial_\theta g(\theta, x)\, f(x)}{\partial_x g(\theta, x)} \Big)}{f(x)}\, M(dx, ds) \,\Bigg|\, Y_{t_1}, Y_{t_2}, \dots, Y_{t_n} \Bigg).
\]
If we define the random variables $V_k := g(\theta, X_k)$, the previous expression can be written as
\[
-E\Bigg( \int_{t_0}^{t_n} \int_{\mathbb{R}} \frac{\partial_v\big( \partial_\theta v\; f_V(v) \big)}{f_V(v)}\, M^Y(dv, ds) \,\Bigg|\, Y_{t_1}, Y_{t_2}, \dots, Y_{t_n} \Bigg),
\]
where $\partial_\theta v$ stands for $\partial_\theta g(\theta, x)$ expressed as a function of $v = g(\theta, x)$ and $f_V$ is the density of $V$, and we can compare it with expression (2.4).

References

[1] O.E. Barndorff-Nielsen, S.E. Graversen, J. Jacod, M. Podolskij and N. Shephard, A central limit theorem for realised power and bipower variations of continuous semimartingales, in: Yu. Kabanov, R. Liptser and J. Stoyanov (Eds.), From Stochastic Calculus to Mathematical Finance. Festschrift in Honour of A.N. Shiryaev, Springer, Heidelberg, 2006.
[2] B.Z. Bobrovsky, E. Mayer-Wolf and M. Zakai, Some classes of global Cramér-Rao bounds, Annals of Statistics 15 (1987).
[3] M.H.A. Davis and P. Johansson, Malliavin Monte Carlo Greeks for jump diffusions, Stoch. Proc. Appl. 116 (2006), 101-129.
[4] V. Debelley and N. Privault, Sensitivity analysis of European options in jump-diffusion models via Malliavin calculus on the Wiener space, Preprint, 2006.
[5] E. Gobet, Local asymptotic mixed normality property for elliptic diffusion: a Malliavin calculus approach, Bernoulli 7 (2001), no. 6.
[6] Y.A. Kutoyants, Statistical Inference for Ergodic Diffusion Processes, Springer, 2004.
[7] D. Nualart, The Malliavin Calculus and Related Topics, 2nd edition, Springer-Verlag, 2006.
[8] D. Nualart and J. Vives, A duality formula on the Poisson space and some applications, in: R. Dalang, M. Dozzi and F. Russo (Eds.), Seminar on Stochastic Analysis, Random Fields and Applications (Ascona, 1993), Progress in Probability, vol. 36, Birkhäuser, 1995.
[9] N. Privault and A. Réveillac, Superefficient estimation on the Wiener space, C. R. Acad. Sci. Paris Sér. I Math. 343 (2006).
[10] N. Privault and A. Réveillac, Stein estimation for the drift of Gaussian processes using the Malliavin calculus, Preprint, 2007. To appear in the Annals of Statistics.
[11] N. Privault and A. Réveillac, Stein estimation of Poisson process intensities, Preprint, 2007.
[12] J.L. Solé, F. Utzet and J. Vives, Canonical Lévy process and Malliavin calculus, Stochastic Processes and their Applications 117 (2007), 165-187.

José M. Corcuera
Universitat de Barcelona
Gran Via de les Corts Catalanes 585
E-08007 Barcelona
Spain
e-mail: jmcorcuera@ub.edu

Arturo Kohatsu-Higa
Osaka University
Graduate School of Engineering Sciences
Machikaneyama-cho 1-3
Osaka
Japan
e-mail: kohatsu@sigmath.es.osaka-u.ac.jp


More information

ESTIMATES OF DERIVATIVES OF THE HEAT KERNEL ON A COMPACT RIEMANNIAN MANIFOLD

ESTIMATES OF DERIVATIVES OF THE HEAT KERNEL ON A COMPACT RIEMANNIAN MANIFOLD PROCDINGS OF H AMRICAN MAHMAICAL SOCIY Volume 127, Number 12, Pages 3739 3744 S 2-9939(99)4967-9 Article electronically published on May 13, 1999 SIMAS OF DRIVAIVS OF H HA KRNL ON A COMPAC RIMANNIAN MANIFOLD

More information

Harnack Inequalities and Applications for Stochastic Equations

Harnack Inequalities and Applications for Stochastic Equations p. 1/32 Harnack Inequalities and Applications for Stochastic Equations PhD Thesis Defense Shun-Xiang Ouyang Under the Supervision of Prof. Michael Röckner & Prof. Feng-Yu Wang March 6, 29 p. 2/32 Outline

More information

Stochastic Differential Equations.

Stochastic Differential Equations. Chapter 3 Stochastic Differential Equations. 3.1 Existence and Uniqueness. One of the ways of constructing a Diffusion process is to solve the stochastic differential equation dx(t) = σ(t, x(t)) dβ(t)

More information

Generalized fractional Lévy processes 1

Generalized fractional Lévy processes 1 Generalized fractional Lévy processes 1 Claudia Klüppelberg Center for Mathematical Sciences and Institute for Advanced Study Technische Universität München Sandbjerg, January 2010 1 1 joint work with

More information

GAUSSIAN PROCESSES; KOLMOGOROV-CHENTSOV THEOREM

GAUSSIAN PROCESSES; KOLMOGOROV-CHENTSOV THEOREM GAUSSIAN PROCESSES; KOLMOGOROV-CHENTSOV THEOREM STEVEN P. LALLEY 1. GAUSSIAN PROCESSES: DEFINITIONS AND EXAMPLES Definition 1.1. A standard (one-dimensional) Wiener process (also called Brownian motion)

More information

Some Properties of NSFDEs

Some Properties of NSFDEs Chenggui Yuan (Swansea University) Some Properties of NSFDEs 1 / 41 Some Properties of NSFDEs Chenggui Yuan Swansea University Chenggui Yuan (Swansea University) Some Properties of NSFDEs 2 / 41 Outline

More information

THE MALLIAVIN CALCULUS FOR SDE WITH JUMPS AND THE PARTIALLY HYPOELLIPTIC PROBLEM

THE MALLIAVIN CALCULUS FOR SDE WITH JUMPS AND THE PARTIALLY HYPOELLIPTIC PROBLEM Takeuchi, A. Osaka J. Math. 39, 53 559 THE MALLIAVIN CALCULUS FOR SDE WITH JUMPS AND THE PARTIALLY HYPOELLIPTIC PROBLEM ATSUSHI TAKEUCHI Received October 11, 1. Introduction It has been studied by many

More information

Stein s method and weak convergence on Wiener space

Stein s method and weak convergence on Wiener space Stein s method and weak convergence on Wiener space Giovanni PECCATI (LSTA Paris VI) January 14, 2008 Main subject: two joint papers with I. Nourdin (Paris VI) Stein s method on Wiener chaos (ArXiv, December

More information

ELEMENTS OF PROBABILITY THEORY

ELEMENTS OF PROBABILITY THEORY ELEMENTS OF PROBABILITY THEORY Elements of Probability Theory A collection of subsets of a set Ω is called a σ algebra if it contains Ω and is closed under the operations of taking complements and countable

More information

Performance Evaluation of Generalized Polynomial Chaos

Performance Evaluation of Generalized Polynomial Chaos Performance Evaluation of Generalized Polynomial Chaos Dongbin Xiu, Didier Lucor, C.-H. Su, and George Em Karniadakis 1 Division of Applied Mathematics, Brown University, Providence, RI 02912, USA, gk@dam.brown.edu

More information

Simulation of conditional diffusions via forward-reverse stochastic representations

Simulation of conditional diffusions via forward-reverse stochastic representations Weierstrass Institute for Applied Analysis and Stochastics Simulation of conditional diffusions via forward-reverse stochastic representations Christian Bayer and John Schoenmakers Numerical methods for

More information

The Cameron-Martin-Girsanov (CMG) Theorem

The Cameron-Martin-Girsanov (CMG) Theorem The Cameron-Martin-Girsanov (CMG) Theorem There are many versions of the CMG Theorem. In some sense, there are many CMG Theorems. The first version appeared in ] in 944. Here we present a standard version,

More information

On the martingales obtained by an extension due to Saisho, Tanemura and Yor of Pitman s theorem

On the martingales obtained by an extension due to Saisho, Tanemura and Yor of Pitman s theorem On the martingales obtained by an extension due to Saisho, Tanemura and Yor of Pitman s theorem Koichiro TAKAOKA Dept of Applied Physics, Tokyo Institute of Technology Abstract M Yor constructed a family

More information

Ernesto Mordecki 1. Lecture III. PASI - Guanajuato - June 2010

Ernesto Mordecki 1. Lecture III. PASI - Guanajuato - June 2010 Optimal stopping for Hunt and Lévy processes Ernesto Mordecki 1 Lecture III. PASI - Guanajuato - June 2010 1Joint work with Paavo Salminen (Åbo, Finland) 1 Plan of the talk 1. Motivation: from Finance

More information