Joint Parameter Estimation of the Ornstein-Uhlenbeck SDE driven by Fractional Brownian Motion

1 Joint Parameter Estimation of the Ornstein-Uhlenbeck SDE driven by Fractional Brownian Motion. Luis Barboza, Department of Statistics, Purdue University. Probability Seminar, October 23, 2012.

2 Introduction. Main objective: to study the GMM (Generalized Method of Moments) joint estimator of the drift and memory parameters of the Ornstein-Uhlenbeck SDE driven by fBm. Joint work with Prof. Frederi Viens (Department of Statistics, Purdue University).

3 Outline: 1 Preliminaries; 2 Joint estimation of Gaussian stationary processes; 3 fOU Case; 4 Simulation.

5 Fractional Brownian Motion. The fractional Brownian motion (fBm) with Hurst parameter $H \in (0,1)$ is the centered Gaussian process $B^H_t$, continuous a.s., with
$\mathrm{Cov}(B^H_t, B^H_s) = \frac{1}{2}\left(|t|^{2H} + |s|^{2H} - |t-s|^{2H}\right), \quad t, s \in \mathbb{R}.$
Some properties:
Self-similarity: for each $a > 0$, $B^H_{at} \stackrel{d}{=} a^H B^H_t$.
It admits an integral representation with respect to the standard Brownian motion over a finite interval: $B^H_t = \int_0^t K_H(t,s)\,dW(s)$.
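
As a quick numerical aside (not from the talk), the covariance above is all that is needed to draw fBm exactly on a finite grid. A minimal sketch via Cholesky factorization follows; the grid, the value H = 0.7 and the seed are arbitrary choices.

```python
import numpy as np

def fbm_cov(t, s, H):
    """Cov(B^H_t, B^H_s) = 0.5 * (|t|^(2H) + |s|^(2H) - |t - s|^(2H))."""
    return 0.5 * (np.abs(t) ** (2 * H) + np.abs(s) ** (2 * H)
                  - np.abs(t - s) ** (2 * H))

def fbm_path_cholesky(n, T, H, rng):
    """Exact fBm sample at t = T/n, 2T/n, ..., T via Cholesky (O(n^3))."""
    t = np.linspace(T / n, T, n)
    cov = fbm_cov(t[:, None], t[None, :], H)
    L = np.linalg.cholesky(cov + 1e-12 * np.eye(n))  # tiny jitter for stability
    return t, L @ rng.standard_normal(n)

rng = np.random.default_rng(0)
t, bh = fbm_path_cholesky(n=500, T=1.0, H=0.7, rng=rng)
```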

6 Fractional Gaussian noise (fGn). Definition: $N^H_t := B^H_t - B^H_{t-\alpha}$, where $\alpha > 0$. It is a Gaussian and stationary process. Long-memory behavior of the increments when $H > \frac{1}{2}$ (in the sense that $\sum_n \rho(n) = \infty$). Ergodicity.

7 Fractional Gaussian noise (fGn). Autocovariance function:
$\rho_\theta(t) = \frac{1}{2}\left(|t+\alpha|^{2H} + |t-\alpha|^{2H} - 2|t|^{2H}\right).$
Spectral density:
$f_\theta(t) = 2 c_H (1 - \cos t)\,|t|^{-1-2H}$, where $c_H = \frac{\sin(\pi H)\,\Gamma(2H+1)}{2\pi}$ (Beran, 1994).
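
A small helper for these quantities (a sketch, not code from the talk); the final check uses the fact that rho(0) equals the variance alpha^(2H) of the increment.

```python
import numpy as np
from scipy.special import gamma

def fgn_acov(t, alpha, H):
    """Autocovariance of fGn: 0.5*(|t+alpha|^(2H) + |t-alpha|^(2H) - 2|t|^(2H))."""
    t = np.asarray(t, dtype=float)
    return 0.5 * (np.abs(t + alpha) ** (2 * H) + np.abs(t - alpha) ** (2 * H)
                  - 2 * np.abs(t) ** (2 * H))

def c_H(H):
    """Constant c_H = sin(pi*H) * Gamma(2H + 1) / (2*pi)."""
    return np.sin(np.pi * H) * gamma(2 * H + 1) / (2 * np.pi)

# sanity check: rho(0) = alpha^(2H) = 1 when alpha = 1
assert np.isclose(fgn_acov(0.0, alpha=1.0, H=0.7), 1.0)
```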

8 Estimation of H (fBm and fGn). Classical methods: R/S statistic (Hurst, 1951), variance plot (heuristic method). Robinson (1992, 1995): semiparametric Gaussian estimation (spectral information). MLE methods: Whittle's estimator.

9 Estimation of H. Methods using variations (filters): Coeurjolly (2001): consistent estimator of $H \in (0,1)$ based on the asymptotic behavior of discrete variations of the fBm; asymptotic normality for $H < 3/4$. Tudor and Viens (2008): proved consistency and asymptotics of the second-order variational estimator (convergence to a Rosenblatt random variable when $H > 3/4$) using Malliavin calculus.

10 Ornstein-Uhlenbeck SDE driven by fBm. Cheridito (2003): take $\lambda, \sigma > 0$ and $\zeta$ an a.s. bounded r.v. The Langevin equation
$X_t = \zeta - \lambda \int_0^t X_s\,ds + \sigma B^H_t, \quad t \geq 0,$
has a unique strong solution, called the fractional Ornstein-Uhlenbeck process:
$X^{\zeta}_t = e^{-\lambda t}\left(\zeta + \sigma \int_0^t e^{\lambda u}\,dB^H_u\right), \quad 0 \leq t \leq T,$
and this integral exists in the Riemann-Stieltjes sense.

11 Properties. Cheridito (2003): stationary solution (fOU process):
$X_t = \sigma \int_{-\infty}^{t} e^{-\lambda(t-u)}\,dB^H_u, \quad t > 0.$
Autocovariance function (Pipiras and Taqqu, 2000):
$\rho_\theta(t) = 2\sigma^2 c_H \int_0^{\infty} \cos(tx)\,\frac{x^{1-2H}}{\lambda^2 + x^2}\,dx$, where $c_H = \frac{\Gamma(2H+1)\sin(\pi H)}{2\pi}$.
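
The talk evaluates this oscillatory integral with Filon's method (slide 49). As a rough stand-in, the sketch below uses SciPy's Fourier-weighted quadrature (QAWF) on the tail and ordinary quadrature near the integrable singularity at 0; the split point x = 1 is an arbitrary choice of the sketch.

```python
import numpy as np
from scipy.integrate import quad
from scipy.special import gamma

def fou_acov(t, H, lam, sigma=1.0):
    """rho_theta(t) = 2*sigma^2*c_H * int_0^inf cos(t x) x^(1-2H) / (lam^2 + x^2) dx."""
    c_H = gamma(2 * H + 1) * np.sin(np.pi * H) / (2 * np.pi)
    g = lambda x: x ** (1 - 2 * H) / (lam ** 2 + x ** 2)
    t = abs(float(t))
    if t == 0.0:
        head, _ = quad(g, 0, 1)
        tail, _ = quad(g, 1, np.inf)
    else:
        head, _ = quad(lambda x: np.cos(t * x) * g(x), 0, 1)
        tail, _ = quad(g, 1, np.inf, weight='cos', wvar=t)  # QAWF Fourier integral
    return 2 * sigma ** 2 * c_H * (head + tail)
```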

12 Properties. Cheridito (2003): $X_t$ has long memory when $H > \frac{1}{2}$, due to the following approximation when $x$ is large:
$\rho_\theta(x) = \frac{H(2H-1)}{\lambda^2}\,x^{2H-2} + O(x^{2H-4}).$
$X_t$ is ergodic.
$X_t$ is not self-similar, but it exhibits asymptotic self-similarity (Bonami and Estrade, 2003):
$f_\theta(x) = c_H\,|x|^{-1-2H} + O(|x|^{-3-2H}).$

13 Estimation of λ given H (OU-fBm). MLE estimators: Kleptsyna and Le Breton (2002): MLE estimator based on the Girsanov formula for fBm; strong consistency when $H > 1/2$. Tudor and Viens (2006): extended the K&L result to more general drift conditions; strong consistency of the MLE estimator when $H < \frac{1}{2}$ using Malliavin calculus. They work with the non-stationary case.

14 Estimation of λ given H (Least-Squares methods). Hu and Nualart (2010): an estimator of λ which is strongly consistent for $H \geq \frac{1}{2}$:
$\hat{\lambda}_T = \left( \frac{1}{H\,\Gamma(2H)\,T} \int_0^T X_t^2\,dt \right)^{-\frac{1}{2H}}.$
$\hat{\lambda}_T$ is asymptotically normal if $H \in \left(\frac{1}{2}, \frac{3}{4}\right)$. The proofs of these results rely mostly on Malliavin calculus techniques.
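
A minimal sketch of how this estimator could be computed from a discretely observed path, replacing the time integral by a Riemann sum; the array X, the step alpha and the default sigma = 1 are assumptions of the sketch, not prescriptions from the talk.

```python
import numpy as np
from scipy.special import gamma

def hu_nualart_lambda(X, alpha, H, sigma=1.0):
    """lambda_hat = ( (1/(sigma^2 H Gamma(2H) T)) * int_0^T X_t^2 dt )^(-1/(2H)),
    with the integral approximated by a Riemann sum of step alpha."""
    X = np.asarray(X, dtype=float)
    T = alpha * len(X)
    integral = alpha * np.sum(X ** 2)
    return (integral / (sigma ** 2 * H * gamma(2 * H) * T)) ** (-1.0 / (2 * H))
```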

15 Estimation of H and λ. Methods based on variations: Biermé et al. (2011): for fixed T, they use the results in Biermé and Richard (2006) to prove consistency and asymptotic normality of the joint estimator of $(H, \sigma^2)$ for any stationary Gaussian process with asymptotic self-similarity (infill-asymptotics case). Brouste and Iacus (2012): for $T \to \infty$ and $\alpha \to 0$ they proved consistency and asymptotic normality when $\frac{1}{2} < H < \frac{3}{4}$ for the pair $(H, \sigma^2)$ (non-stationary case).

16 Outline: 1 Preliminaries; 2 Joint estimation of Gaussian stationary processes; 3 fOU Case; 4 Simulation.

17 Preliminaries. $X_t$: real-valued, centered, Gaussian, stationary process with spectral density $f_{\theta_0}(x)$. $f_\theta(x)$: continuous with respect to $x$, continuously differentiable with respect to $\theta$. $\theta$ belongs to a compact set $\Theta \subset \mathbb{R}^p$. Bochner's theorem:
$\rho_\theta(s) := \mathrm{Cov}(X_{t+s}, X_t) = \int_{\mathbb{R}} \cos(sx)\,f_\theta(x)\,dx.$

18 Preliminaries. If $\rho_\theta(s)$ is a continuous function of $s$, then the process $X_t$ is ergodic.
Assumption 1. Take $\alpha > 0$ and $L$ a positive integer. Then there exists $k \in \{0, 1, \ldots, L\}$ such that $\rho_\theta(\alpha k)$ is an injective function of $\theta$.

19 Preliminaries. Recall: $a(l) := (a_0(l), \ldots, a_L(l))$ is a discrete filter of length $L+1$ and order $l$, for $L \in \mathbb{Z}_+$ and $l \in \{0, \ldots, L\}$, if
$\sum_{k=0}^{L} a_k(l)\,k^p = 0$ for $0 \leq p \leq l-1$, and $\sum_{k=0}^{L} a_k(l)\,k^p \neq 0$ if $p = l$.
Examples: finite-difference filters, Daubechies filters (wavelets). Assume that we can choose $L$ filters with orders $l_i \in \{1, \ldots, L\}$ for $i = 1, \ldots, L$ and an extra filter with $l_0 = 0$.
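
For concreteness, one standard family satisfying these conditions is the order-l finite-difference filter with coefficients a_k = (-1)^k C(l, k); the sketch below (an illustration, since the talk does not list its exact filters) also checks the vanishing-moment conditions numerically.

```python
import numpy as np
from math import comb

def finite_difference_filter(l):
    """Order-l finite-difference filter: a_k = (-1)^k * C(l, k), k = 0..l."""
    return np.array([(-1) ** k * comb(l, k) for k in range(l + 1)], dtype=float)

def filter_order(a, max_p=10):
    """Smallest p with sum_k a_k * k^p != 0, i.e. the filter's order l."""
    k = np.arange(len(a), dtype=float)
    for p in range(max_p):
        if not np.isclose(np.sum(a * k ** p), 0.0):
            return p
    return None

a2 = finite_difference_filter(2)   # [1, -2, 1]
print(a2, filter_order(a2))        # order 2: annihilates constants and linear trends
```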

20 Preliminaries. Define the filtered process of order $l_i$ and step size $\alpha > 0$ at $t$ as
$\varphi_i(X_t) := \sum_{q=0}^{L} a_q(l_i)\,X_{t - \alpha q}.$
Its second moment is $V_i(\theta_0) := E[\varphi_i(X_t)^2] = \sum_{k=0}^{L} b_k(l_i)\,\rho_{\theta_0}(\alpha k)$.
Define the set of moment equations by $g(X_t, \theta) := (g_0(X_t, \theta), \ldots, g_L(X_t, \theta))$, where
$g_i(X_t, \theta) = \varphi_i(X_t)^2 - V_i(\theta)$, for $0 \leq i \leq L$.
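
Empirical counterparts of these quantities for an equally spaced sample are straightforward; in the sketch below the unit lag of the array X is taken to be the observation step, which is an assumption of the sketch.

```python
import numpy as np

def filtered_series(X, a):
    """phi(X_t) = sum_q a_q X_{t - q steps}; 'valid' drops the boundary terms."""
    return np.convolve(np.asarray(X, dtype=float), np.asarray(a, dtype=float),
                       mode='valid')

def sample_second_moment(X, a):
    """Empirical counterpart of V_i(theta) = E[phi_i(X_t)^2]."""
    phi = filtered_series(X, a)
    return np.mean(phi ** 2)
```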

21 Preliminaries. Assume that we have observed the stationary process $X_t$ at times $0 = t_0 < t_1 < \cdots < t_{N-1} < t_N = T$, with $\alpha := t_i - t_{i-1} > 0$ (fixed). Assume there exists a sequence of symmetric positive-definite random matrices $\{\hat{A}_N\}$ such that $\hat{A}_N \stackrel{p}{\to} A$, with $A > 0$. Define:
$\hat{g}_N(\theta) := \frac{1}{N-L+1} \sum_{i=L}^{N} g(X_{t_i}, \theta)$ (sample moments),
$\hat{Q}_N(\theta) := \hat{g}_N(\theta)^{\top} \hat{A}_N\, \hat{g}_N(\theta)$,
$Q_0(\theta) := E[g(X_t, \theta)]^{\top} A\, E[g(X_t, \theta)]$.

22 GMM estimation. Define the GMM estimator of $\theta_0$:
$\hat{\theta}_N := \operatorname{argmin}_{\theta \in \Theta} \hat{Q}_N(\theta) = \operatorname{argmin}_{\theta \in \Theta} \left( \frac{1}{N} \sum_{i=1}^{N} g(X_{t_i}, \theta) \right)^{\top} \hat{A}_N \left( \frac{1}{N} \sum_{i=1}^{N} g(X_{t_i}, \theta) \right).$
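
A hedged sketch of the whole pipeline for a sampled path X: build the sample moments from a list of filters and a model autocovariance, then minimize the quadratic form numerically. The vectorized interface model_acov(lags, theta), the identity default weight, and the use of Nelder-Mead in scipy.optimize.minimize are choices of the sketch, not of the talk.

```python
import numpy as np
from scipy.optimize import minimize

def sample_moments(X, filters, theta, model_acov, alpha):
    """g_hat_N(theta): empirical filtered second moments minus their model values.
    model_acov(lags, theta) must return the model autocovariance at an array of lags."""
    X = np.asarray(X, dtype=float)
    g = []
    for a in filters:
        a = np.asarray(a, dtype=float)
        phi = np.convolve(X, a, mode='valid')
        # V_i(theta) = sum_{q,r} a_q a_r rho_theta(alpha*(q - r))
        lags = alpha * (np.arange(len(a))[:, None] - np.arange(len(a))[None, :])
        V = np.sum(a[:, None] * a[None, :] * model_acov(lags, theta))
        g.append(np.mean(phi ** 2) - V)
    return np.array(g)

def gmm_estimate(X, filters, model_acov, alpha, theta0, A=None, bounds=None):
    """Minimize g_hat_N(theta)' A g_hat_N(theta); identity weighting by default."""
    if A is None:
        A = np.eye(len(filters))
    def Q(theta):
        g = sample_moments(X, filters, theta, model_acov, alpha)
        return g @ A @ g
    return minimize(Q, x0=np.asarray(theta0, dtype=float),
                    method='Nelder-Mead', bounds=bounds)
```

For the fOU case one would pass, for instance, finite-difference filters of orders 0 to 3, the numerically integrated fOU autocovariance sketched earlier (vectorized over lags), and theta = (H, lambda).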

23 Consistency.
Lemma 1. Under the above assumptions:
(i) $\sup_{\theta \in \Theta} |\hat{Q}_N(\theta) - Q_0(\theta)| \stackrel{a.s.}{\to} 0$;
(ii) $Q_0(\theta) = 0$ if and only if $\theta = \theta_0$.

24 Consistency. Key aspects of the proof: ergodicity of $X_t$; continuity of $\rho_\theta(\cdot)$ over the compact set $\Theta$; injectivity of
$\rho_\theta(\alpha) := (\rho_\theta(\alpha \cdot 0), \ldots, \rho_\theta(\alpha L))^{\top};$
results of Newey and McFadden (1994) on GMM.

25 Consistency. We have all the conditions to apply:
Theorem 1 (Newey and McFadden, 1994). Under the above assumptions, it holds that $\hat{\theta}_N \stackrel{a.s.}{\to} \theta_0$.

26 Asymptotic Normality of the sample moments. Denote $\hat{G}_N(\theta) := \nabla_\theta\, \hat{g}_N(\theta)$ and $G(\theta) := E[\nabla_\theta\, g(X_t, \theta)]$.
Assumption 2. For fixed $\alpha > 0$, assume that the $(L+1) \times p$ matrix $\nabla_\theta\, \rho_\theta(\alpha)$ is a full-column-rank matrix for any $\theta \in \Theta$.
We can prove that $\hat{G}_N(\theta) = G(\theta) = -\nabla_\theta V(\theta)$, where $V(\theta) = (V_0(\theta), \ldots, V_L(\theta))^{\top}$.

27 Asymptotic Normality of the sample moments. Based on the Biermé et al. (2011) article, let us assume:
Assumption 3 (Biermé et al.'s condition). Assume that for any $l, l' \in \{l_0, \ldots, l_L\}$ we have
$R_{l,l'}(u) := \min\!\left(1, |u|^{l+l'}\right) \sum_{p \in \mathbb{Z}} f_{\theta_0}\!\left(\frac{u + 2\pi p}{\alpha}\right) \in L^2((-\pi, \pi)).$

28 Asymptotic Normality of the sample moments.
Lemma 2. Under Assumptions 1-3,
$\sqrt{N}\,\hat{g}_N(\theta_0) \stackrel{d}{\to} N(0, \Omega)$, where
$\Omega_{ij} = 2\alpha^{-2} \int_{-\pi}^{\pi} P_{a(l_i)}[\cos u]^2\, P_{a(l_j)}[\cos u]^2\, \bar{f}_{\theta_0}(u/\alpha)^2\,du$
and $\bar{f}_{\theta_0}(u/\alpha) := \sum_{p \in \mathbb{Z}} f_{\theta_0}\!\left(\frac{u + 2\pi p}{\alpha}\right)$. Note: $P_{a(l)}(x) := \sum_{k=0}^{L} a_k(l)\,x^k$.

29 Sketch of proof. Let $V_D(\theta_0) := \mathrm{Diag}(1/V_j(\theta_0))_{j \in \{0,\ldots,L\}}$. We scale the vector $g(X_t, \theta_0)$ as follows:
$\sqrt{N}\,V_D(\theta_0)\,\hat{g}_N(\theta_0) = \left( \frac{\sqrt{N}}{N-L+1} \sum_{i=L}^{N} H_2(Z_{l_j, t_i}) \right)_{j \in \{0,\ldots,L\}},$
where $Z_{l_j, t_i} := \varphi_j(X_{t_i})/\sqrt{V_j(\theta_0)}$ and $H_2(\cdot)$ is the 2nd-order Hermite polynomial. Use the vector-valued version of the Breuer-Major theorem with spectral-information conditions (Biermé et al., 2011) to deduce the asymptotic behavior of the previous sum.

30 Asymptotic behavior of the error. Let $\varepsilon_N := \hat{\theta}_N - \theta_0$. Using the mean value theorem:
$\varepsilon_N = -\psi_N(\bar{\theta}_N, \hat{\theta}_N)\, \hat{g}_N(\theta_0)$, where $\psi_N(\bar{\theta}_N, \hat{\theta}_N) := [G(\hat{\theta}_N)^{\top} \hat{A}_N\, G(\bar{\theta}_N)]^{-1} G(\hat{\theta}_N)^{\top} \hat{A}_N$.
Also note that $\psi_N(\bar{\theta}_N, \hat{\theta}_N) \stackrel{p}{\to} [G(\theta_0)^{\top} A\, G(\theta_0)]^{-1} G(\theta_0)^{\top} A$; then we can bound $E[\|\psi_N(\bar{\theta}_N, \hat{\theta}_N)\|^{4p}]$ for any $p > 0$.

31 Asymptotic behavior of the error. $X_t$ can be represented as a Wiener-Itô integral with respect to the standard Brownian motion:
$X_{t_k} = I_1(A_k(\cdot\,; \theta_0))$, where $A_k(x; \theta_0) := \cos(\alpha k x)\sqrt{f_{\theta_0}(x)}$.
Using the multiplication rule of Wiener integrals:
$(\hat{g}_N(\theta))_i = I_2[B_i(\cdot\,; \theta)]$, where $B_i(\cdot\,; \theta)$ is a kernel depending on the filter $a$.

32 Asymptotic behavior of the error. We already proved a CLT for $\hat{g}_N(\theta_0)$; hence, for all $i$,
$E[|(\hat{g}_N(\theta_0))_i|^2] = O(N^{-1}).$
Let $p > 0$; we can use the equivalence of $L^p$-norms on a fixed Wiener chaos to get
$E[\|\hat{g}_N(\theta_0)\|^{4p}] \leq \tau_{p,L}\, N^{-2p}.$
By Borel-Cantelli, we conclude the almost-sure rate in Corollary 1 below.

33 Asymptotic behavior of the error.
Corollary 1. Under Assumptions 1-3 we have:
(i) $E[\|\hat{\theta}_N - \theta_0\|^2] = O(N^{-1})$;
(ii) $N^{\gamma}\,\|\hat{\theta}_N - \theta_0\| \stackrel{a.s.}{\to} 0$ for any $\gamma < \frac{1}{2}$.

34 Asymptotic behavior of the error. We can generalize the previous result as follows:
Theorem 2. Let $\hat{\theta}_N$ be the GMM estimator of $\theta_0$. Assume there exists a diagonal $(L+1) \times (L+1)$ matrix $D_N(\theta_0)$ such that
$E[|D^{(i,i)}_N(\theta_0)\,(\hat{g}_N(\theta_0))_i|^2] = O(1)$ for all $i \in \{0, \ldots, L\}$. Then:
(i) $E[\|\hat{\theta}_N - \theta_0\|^2] = O\!\left( \max_{0 \leq i \leq L} \{D^{(i,i)}_N(\theta_0)^{-2}\} \right)$;
(ii) if $\max_{0 \leq i \leq L} \{D^{(i,i)}_N(\theta_0)^{-2}\} = f(N)\,N^{-\nu}$ for some $\nu > 0$ and $f(N) = o(N)$, then $N^{\gamma} \varepsilon_N \stackrel{a.s.}{\to} 0$ for any $\gamma < \frac{\nu}{2}$.

35 Asymptotic Normality.
Theorem 3. Let $X_t$ be a Gaussian stationary process with parameter $\theta_0$. If $\hat{\theta}_N$ is the GMM estimator of $\theta_0$, then under Assumptions 1-3 it holds that
$\sqrt{N}(\hat{\theta}_N - \theta_0) \stackrel{d}{\to} N(0, C(\theta_0)\,\Omega\,C(\theta_0)^{\top})$, where $C(\theta_0) = [G(\theta_0)^{\top} A\, G(\theta_0)]^{-1} G(\theta_0)^{\top} A$.
Key ideas of the proof (Newey and McFadden, 1994; Hansen, 1982): linear behavior of $\varepsilon_N = \hat{\theta}_N - \theta_0$; Slutsky's theorem.

36 Efficiency. The GMM asymptotic variance is minimized by taking $A = \Omega^{-1}$. There are numerical techniques to compute $\hat{A}_N$ such that $\hat{A}_N \stackrel{p}{\to} \Omega^{-1}$, for example (Hansen et al., 1996): two-step estimators; iterative estimator; continuous-updating estimator.
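
One common recipe (a generic sketch of the two-step idea, not the talk's exact implementation): run a first GMM step with the identity weight, form the per-observation moment vectors g(X_{t_i}, theta_hat), estimate their long-run covariance, and use its inverse as A_hat_N in a second step. The Newey-West (Bartlett kernel) estimator and the lag choice below are standard for weakly dependent data; under long memory the bandwidth, and the estimator itself, would need more care.

```python
import numpy as np

def long_run_covariance(g_t, lags=20):
    """Newey-West (Bartlett) estimate of the long-run covariance of the
    per-observation moment vectors g_t, an array of shape (N, L + 1)."""
    g = np.asarray(g_t, dtype=float)
    g = g - g.mean(axis=0)
    N = g.shape[0]
    S = g.T @ g / N
    for k in range(1, lags + 1):
        w = 1.0 - k / (lags + 1.0)
        Gk = g[k:].T @ g[:-k] / N
        S += w * (Gk + Gk.T)
    return S

def two_step_weight(g_t, lags=20):
    """Weighting matrix A_hat_N approximating Omega^{-1} for the second GMM step."""
    return np.linalg.inv(long_run_covariance(g_t, lags))
```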

37 Outline: 1 Preliminaries; 2 Joint estimation of Gaussian stationary processes; 3 fOU Case; 4 Simulation.

38 Preliminaries. Assume that there exists a closed rectangle $\Theta \subset \mathbb{R}^2$ such that $\theta = (H, \lambda) \in \Theta$, and assume that $\sigma = 1$. $\rho_\theta(t)$ and its partial derivatives are continuous functions of $\theta$. Assumptions 1 and 2 are difficult to confirm due to the analytical complexity of the covariance function $\rho_\theta(t)$, so we decided to check these two assumptions at least locally, by checking numerically that
$\det \begin{pmatrix} \frac{\partial \rho_\theta(0)}{\partial H} & \frac{\partial \rho_\theta(0)}{\partial \lambda} \\ \frac{\partial \rho_\theta(\alpha)}{\partial H} & \frac{\partial \rho_\theta(\alpha)}{\partial \lambda} \end{pmatrix} \neq 0$
for different values of $\theta$ and $\alpha$. The injectivity approximately holds for $0 < \lambda < 5$ and $H > 0.3$.
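
A sketch of such a local check, using central finite differences of a user-supplied autocovariance rho(t, H, lam) (for instance the numerically integrated fOU autocovariance); the step eps and the finite-difference scheme are choices of the sketch.

```python
import numpy as np

def injectivity_det(rho, H, lam, alpha, eps=1e-5):
    """Determinant of the 2x2 Jacobian of (rho(0), rho(alpha)) with respect to
    (H, lambda), with partial derivatives taken by central finite differences."""
    def grad(t):
        dH = (rho(t, H + eps, lam) - rho(t, H - eps, lam)) / (2 * eps)
        dL = (rho(t, H, lam + eps) - rho(t, H, lam - eps)) / (2 * eps)
        return np.array([dH, dL])
    J = np.vstack([grad(0.0), grad(alpha)])
    return np.linalg.det(J)
```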

39 Preliminaries.
Lemma 3. Let $X_t$ be the stationary fOU process with parameters $\theta = (H, \lambda)$. Assumption 3 (Biermé et al.'s condition) holds under the following two cases:
Case 1: if $l + l' > 1$, then it holds for all $H \in (0, 1)$.
Case 2: if $l + l' \leq 1$, then it holds if $H \in \left(0, \frac{3}{4}\right)$.
Conclusion: Assumption 3 is not valid for the first component of $\hat{g}_N$ if $H \geq \frac{3}{4}$.

40 Asymptotic Normality of the sample errors. Using the previous lemma, for $H < \frac{3}{4}$:
$\sqrt{N}\,\hat{g}_N(\theta) \stackrel{d}{\to} N(0, \Lambda(\theta))$, where the covariance matrix $\Lambda(\theta)$ has entries
$\Lambda_{ij}(\theta) = 2 c_H^2\, \alpha^{4H} \int_{-\pi}^{\pi} P_{a(l_i)}[\cos u]^2\, P_{a(l_j)}[\cos u]^2 \left( \sum_{p \in \mathbb{Z}} \frac{|u + 2\pi p|^{1-2H}}{(u + 2\pi p)^2 + (\lambda\alpha)^2} \right)^{2} du,$
where the integral is denoted $K_{i,j}(\alpha)$.

41 Asymptotic Normality of the sample errors.
Lemma 4. Assume that $X_t$ is a stationary fOU process with parameters $\theta = (H, \lambda)$ where $H \geq \frac{3}{4}$. Then:
(i) If $H = \frac{3}{4}$: $\sqrt{\frac{N}{\log N}}\,(\hat{g}_N(\theta))_0 \stackrel{d}{\to} N(0, 2\alpha^{-1} c_\theta^2)$, where $c_\theta = \frac{H(2H-1)}{\lambda} = \frac{3}{8\lambda}$.
(ii) If $H > \frac{3}{4}$, $(\hat{g}_N(\theta))_0$ does not converge to a normal law, or even a second-chaos law. However, $E[|N^{2-2H}(\hat{g}_N(\theta))_0|^2] = O(1)$.

42 Sketch of proof. Denote $\hat{g}_{N,0}(\theta) = (\hat{g}_N(\theta))_0$.
For $H = \frac{3}{4}$, as $N \to \infty$:
$E\!\left[\frac{N}{\log N}\,|\hat{g}_{N,0}(\theta)|^2\right] \to 2\alpha^{-1} c_\theta^2 =: c_1$, where $c_\theta := \frac{H(2H-1)}{\lambda}$.
For $H > \frac{3}{4}$:
$E\!\left[|N^{2-2H}\,\hat{g}_{N,0}(\theta)|^2\right] \to \frac{2\alpha^{4H-4} c_\theta^2}{(2H-1)(4H-3)} =: c_2.$

43 Sketch of proof. If
$F_N = \sqrt{\frac{N}{c_1 \log N}}\,\hat{g}_{N,0}(\theta)$ if $H = \frac{3}{4}$, and $F_N = \sqrt{\frac{N^{4-4H}}{c_2}}\,\hat{g}_{N,0}(\theta)$ if $H > \frac{3}{4}$,
then we need to prove that
$\|D F_N\|^2_{\mathcal{H}} \to 2$ in $L^2(\Omega)$, where $\mathcal{H} = L^2((-\pi, \pi))$.
This result is valid only if $H = \frac{3}{4}$; hence we can use Nualart and Ortiz-Latorre (2008) to conclude asymptotic normality of $F_N$.

44 Sketch of proof. In the case $H > \frac{3}{4}$ we can use the same criterion to conclude that $F_N$ does not have a normal limit in law. Furthermore, $F_N$ has this kernel:
$I_N(r, s) := \frac{N^{1-2H}}{\sqrt{c_2}} \sum_{j=1}^{N} (A_j \otimes A_j)(r, s).$
It can be proved that $I_N(r, s)$ is not a Cauchy sequence in $\mathcal{H}^{\otimes 2}$. Then $F_N$ does not have a second-chaos limit.

45 Asymptotic behavior of the error.
Proposition 1. Let $X_t$ be a fOU process with parameters $\theta = (H, \lambda)$. Then the GMM estimate $\hat{\theta}_N$ satisfies:
(i) As $N$ goes to infinity,
$E[\|\hat{\theta}_N - \theta\|^2] = O(N^{-1})$ if $H \in \left(0, \frac{3}{4}\right)$; $= O\!\left(\frac{\log N}{N}\right)$ if $H = \frac{3}{4}$; $= O(N^{4H-4})$ if $H \in \left(\frac{3}{4}, 1\right)$.
(ii) $N^{\gamma}\,\|\hat{\theta}_N - \theta\| \stackrel{a.s.}{\to} 0$ for all $\gamma < \frac{1}{2}$ if $H \leq \frac{3}{4}$; otherwise, for all $\gamma < 2 - 2H$ if $H > \frac{3}{4}$.

46 Asymptotics of $\hat{\theta}_N$.
Proposition 2. Let $X_t$ be the stationary fOU process with parameters $\theta = (H, \lambda)$. Then, for any positive-definite matrix $A$, the GMM estimator of $\theta$ is consistent for any $H \in (0, 1)$, and:
If $H \in \left(0, \frac{3}{4}\right)$: $\sqrt{N}(\hat{\theta}_N - \theta) \stackrel{d}{\to} N(0, C(\theta)\,\Lambda\,C(\theta)^{\top})$, where $C(\theta) = [G(\theta)^{\top} A\, G(\theta)]^{-1} G(\theta)^{\top} A$ and $\Lambda = 2 c_H^2 \alpha^{4H} K(\alpha)$.
If $H = \frac{3}{4}$: $\sqrt{\frac{N}{\log N}}(\hat{\theta}_N - \theta) \stackrel{d}{\to} N\!\left(0, \frac{2 c_\theta^2}{\alpha}\,(C(\theta))_1 (C(\theta))_1^{\top}\right)$, where $(C(\theta))_1$ is the first column of $C(\theta)$.

47 Asymptotics of $\hat{\theta}_N$. If $H > \frac{3}{4}$, the GMM estimator does not converge to a multivariate normal law, or even a second-chaos law.

48 Outline: 1 Preliminaries; 2 Joint estimation of Gaussian stationary processes; 3 fOU Case; 4 Simulation.

49 Numerical Details. Approximation of $\rho_\theta(t) = 2\sigma^2 c_H \int_0^{\infty} \cos(tx)\,\frac{x^{1-2H}}{\lambda^2 + x^2}\,dx$: Filon's method (integration of oscillatory integrands), 4-5 precision digits. Simulation of fBm: Davies and Harte (1987). Approximation of the fOU process: the stationary fOU satisfies
$X_{t_{i+1}} = e^{-\lambda \alpha} X_{t_i} + \sigma \int_{t_i}^{t_{i+1}} e^{-\lambda(t_{i+1} - u)}\,dB^H_u,$
where $X_0 \sim N(0, \rho_\theta(0))$. For comparison purposes we use the yuima R package of Brouste and Iacus (2012). Finite-difference filters.
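
The talk simulates fBm with Davies and Harte and then uses the exact recursion above. A simpler, though O(N^3), way to obtain an exact draw of the stationary fOU on a regular grid is a Cholesky factorization of its Toeplitz covariance matrix built from the numerically computed autocovariance; a hedged sketch, where acov_vals is assumed to hold rho_theta(k*alpha) for k = 0, ..., N-1:

```python
import numpy as np
from scipy.linalg import toeplitz, cholesky

def stationary_gaussian_path(acov_vals, rng):
    """Exact draw of a stationary centered Gaussian vector with autocovariances
    acov_vals = [rho(0), rho(alpha), ..., rho((N-1)*alpha)], via Cholesky of the
    Toeplitz covariance matrix (fine for moderate N)."""
    Sigma = toeplitz(np.asarray(acov_vals, dtype=float))
    L = cholesky(Sigma + 1e-10 * np.eye(len(acov_vals)), lower=True)
    return L @ rng.standard_normal(len(acov_vals))

# e.g. acov_vals[k] = fou_acov(k * alpha, H, lam) from the earlier sketch
```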

50 Asymptotic behavior of $\sqrt{N}(\hat{\theta}_N - \theta)$. Figure: asymptotic variance of $\sqrt{N}(\hat{H}_N - 0.62)$.

51 Asymptotic behavior of $\sqrt{N}(\hat{\theta}_N - \theta)$. Figure: asymptotic variance of $\sqrt{N}(\hat{\lambda}_N - 0.8)$.

52 Numerical Details. $(H, \lambda) = (0.37, 1.45)$; $N = 1000$, $\alpha = 1$; $L = 3$ (maximum filter order); number of repetitions in the sampling distribution: 100; efficient matrix estimation: continuous-updating estimator.

53 Sampling distribution of $\hat{H}_N$ (figure).

54 Sampling distribution of $\hat{\lambda}_N$ (figure).

55 Some statistics: table reporting the empirical MSE of $\hat{\theta}_N$, the asymptotic variance of $\hat{H}_N$ (empirical and theoretical), and the asymptotic variance of $\hat{\lambda}_N$ (empirical and theoretical).

56 Sampling distribution of $\hat{H}_N$, $(H, \lambda) = (0.8, 2)$, with 4 filters (figure).

57 Sampling distribution of $\hat{\lambda}_N$, $(H, \lambda) = (0.8, 2)$, with 4 filters (figure).

58 Future Work. More accurate simulation of the fOU process? Asymptotic law of $\hat{\theta}_N$ when $H > \frac{3}{4}$ (fOU case)?

59 Thanks!
