Maximum Likelihood Drift Estimation for Gaussian Process with Stationary Increments

Austrian Journal of Statistics
April 2017, Volume 46, 67–78.
http://www.ajs.or.at/
doi:10.17713/ajs.v46i3-4.672

Maximum Likelihood Drift Estimation for Gaussian Process with Stationary Increments

Yuliya Mishura (Taras Shevchenko National University of Kyiv)
Kostiantyn Ralchenko (Taras Shevchenko National University of Kyiv)
Sergiy Shklyar (Taras Shevchenko National University of Kyiv)

Abstract

The paper deals with the regression model $X_t = \theta t + B_t$, $t \in [0, T]$, where $B = \{B_t, t \ge 0\}$ is a centered Gaussian process with stationary increments. We study the estimation of the unknown parameter $\theta$ and establish a formula for the likelihood function in terms of a solution to an integral equation. Then we find the maximum likelihood estimator and prove its strong consistency. The results obtained generalize the known results for fractional and mixed fractional Brownian motion.

Keywords: Gaussian process, stationary increments, discrete observations, continuous observations, maximum likelihood estimator, strong consistency, fractional Brownian motion, mixed fractional Brownian motion.

1. Introduction

We study the problem of drift parameter estimation for the stochastic process

    $X_t = \theta t + B_t$,    (1)

where $\theta \in \mathbb{R}$ is an unknown parameter and $B = \{B_t, t \ge 0\}$ is a centered Gaussian process with stationary increments, $B_0 = 0$. In the particular case when $B = B^H$ is a fractional Brownian motion, this model has been studied by many authors. We mention the paper by Norros, Valkeila, and Virtamo (1999), which treats maximum likelihood estimation based on continuous observation of the trajectory of $X$ on the interval $[0, T]$ (see also Le Breton 1998). Further, Hu, Nualart, Xiao, and Zhang (2011) investigate the exact maximum likelihood estimator based on discrete observations at the points $t_k = kh$, $k = 1, 2, \ldots, N$, while Bertin, Torres, and Tudor (2011) consider maximum likelihood estimation in a discrete scheme of observations, where the trajectory of $X$ is observed at the points $t_k = k/N$, $k = 1, 2, \ldots, N^\alpha$, $\alpha > 1$.

For hypothesis testing of the drift parameter sign in the model driven by a fractional Brownian motion, see Stiburek (2017). The paper by Cai, Chigansky, and Kleptsyna (2016) treats the likelihood function for Gaussian processes that do not necessarily have stationary increments. However, our approach is different from theirs: on the one hand, it cannot be deduced from their general formulas, and on the other hand, it gives rather elegant representations. The construction of the maximum likelihood estimator in the case where $B$ is the sum of two fractional Brownian motions was studied in Mishura (2016) and Mishura and Voronov (2015). A similar non-Gaussian model driven by the Rosenblatt process was considered in Bertin et al. (2011).

As already mentioned, we consider the case when $B$ is a centered Gaussian process with stationary increments. We construct the maximum likelihood estimators for both discrete and continuous schemes of observations. The assumptions on the process in the continuous case are formulated in terms of the second mixed derivative of its covariance function, see Assumptions 1 and 2. The exact formula for the maximum likelihood estimator contains a solution of an integral equation whose kernel is obtained by this differentiation. We give sufficient conditions for the strong consistency of the estimators. Several examples of the process $B$ are considered.

The paper is organized as follows. Section 2 is devoted to the case of discrete observations. Maximum likelihood estimation in continuous time is studied in Section 3.

2. Maximum likelihood estimation by discrete observations

We start with the construction of the likelihood function and the maximum likelihood estimator in the case of discrete observations. In the next section these results will be used for the derivation of the likelihood function in the continuous-time case, see the proof of Theorem 3.3.

Let the process $X$ defined by formula (1) be observed at the points $t_k$, $k = 0, 1, \ldots, N$,

    $0 = t_0 < t_1 < \ldots < t_N = T$.    (2)

The problem is to estimate the parameter $\theta$ from the observations $X_{t_k}$, $k = 0, 1, \ldots, N$, of the process $X_t$.

2.1. Likelihood function and construction of the estimator

Denote $X^N = (X_{t_k} - X_{t_{k-1}})_{k=1}^N$ and $B^N = (B_{t_k} - B_{t_{k-1}})_{k=1}^N$. Note that in our model $X_{t_0} = X_0 = 0$, so the $N$-dimensional vector $X^N$ is a one-to-one function of the observations. The vectors $B^N$ and $X^N$ are Gaussian with different means (except in the case $\theta = 0$) and the same covariance matrix, which we denote by $\Sigma_N$.

The following maximum likelihood estimator coincides with the least squares estimator considered in Rao (2002, eq. 4a..5).

Lemma 2.1. Assume that the Gaussian distribution of the vector $(B_{t_k})_{k=1}^N$ is nonsingular. Then one can take

    $L_N(X^N, \theta) = \dfrac{f_\theta(X^N)}{f_0(X^N)} = \exp\left\{\theta z^\top \Sigma_N^{-1} X^N - \dfrac{\theta^2}{2}\, z^\top \Sigma_N^{-1} z\right\}$,    (3)

where $z = (t_k - t_{k-1})_{k=1}^N$, as a likelihood function in the discrete-time model. The MLE is linear with respect to the observations and equals

    $\hat\theta_N = \dfrac{z^\top \Sigma_N^{-1} X^N}{z^\top \Sigma_N^{-1} z}$.    (4)

Proof. The pdf of $B^N + \theta z$ with respect to the Lebesgue measure equals

    $f_\theta(x) = \dfrac{1}{(2\pi)^{N/2} (\det \Sigma_N)^{1/2}} \exp\left\{-\dfrac{1}{2} (x - \theta z)^\top \Sigma_N^{-1} (x - \theta z)\right\}$.

The density of the observations for given $\theta$ with respect to the distribution of the observations for $\theta = 0$ is taken as the likelihood function.
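As a numerical illustration of formulas (3) and (4), the following sketch (not part of the paper) assumes that $B$ is a fractional Brownian motion observed on a regular grid, so that $\Sigma_N$ is the Toeplitz matrix of Remark 2.2 below; the Hurst index, horizon, and grid size are illustrative assumptions.

```python
import numpy as np

def fbm_increment_cov(N, h, H):
    # Toeplitz covariance of fBm increments on a regular grid:
    # cov(B_{kh} - B_{(k-1)h}, B_{lh} - B_{(l-1)h}) for Hurst index H
    d = np.abs(np.subtract.outer(np.arange(N), np.arange(N))).astype(float)
    return 0.5 * h**(2 * H) * ((d + 1)**(2 * H) + np.abs(d - 1)**(2 * H) - 2 * d**(2 * H))

def drift_mle(x_incr, t, cov):
    # eq. (4): theta_hat = z' Sigma^{-1} X^N / (z' Sigma^{-1} z)
    z = np.diff(t)
    w = np.linalg.solve(cov, z)
    return (w @ x_incr) / (w @ z)

t = np.linspace(0.0, 10.0, 101)            # assumed grid: h = 0.1, N = 100
cov = fbm_increment_cov(100, t[1] - t[0], 0.7)

# sanity check: with the noise suppressed, the estimator recovers theta exactly
assert abs(drift_mle(2.0 * np.diff(t), t, cov) - 2.0) < 1e-8

# one simulated path with theta = 2, noise generated via Cholesky factorization
rng = np.random.default_rng(1)
dB = np.linalg.cholesky(cov) @ rng.standard_normal(100)
theta_hat = drift_mle(2.0 * np.diff(t) + dB, t, cov)
```

Since the estimator is linear in the observations, only one linear solve with $\Sigma_N$ is needed; no explicit matrix inversion is required.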

Remark 2.2. Let the process $X$ be observed on a regular grid, i.e., at the points $t_k = kh$, $k = 0, 1, \ldots, N$, where $h > 0$. Then $\Sigma_N$ is a Toeplitz matrix: the entry

    $(\Sigma_N)_{k+l,\,l} = (\Sigma_N)_{l,\,k+l} = \mathrm{E}\left[(B_{(k+l)h} - B_{(k+l-1)h})(B_{lh} - B_{(l-1)h})\right] = \mathrm{E}\left[(B_{(k+1)h} - B_{kh}) B_h\right]$

does not depend on $l$, due to the stationarity of increments. This simplifies the numerical computation of the MLE.

2.2. Properties of the estimator

Since $X^N = B^N + \theta z$, the maximum likelihood estimator (4) equals

    $\hat\theta_N = \theta + \dfrac{z^\top \Sigma_N^{-1} B^N}{z^\top \Sigma_N^{-1} z}$.

Lemma 2.3. Under the assumptions of Lemma 2.1, the estimator $\hat\theta_N$ is unbiased and normally distributed. Its variance equals $\operatorname{var} \hat\theta_N = (z^\top \Sigma_N^{-1} z)^{-1}$.

Proof. The estimator $\hat\theta_N$ is unbiased and normally distributed because $\hat\theta_N - \theta$ is a linear functional of the centered Gaussian vector $B^N$. The variance of the estimator is equal to

    $\operatorname{var} \hat\theta_N = \dfrac{\operatorname{var}(z^\top \Sigma_N^{-1} B^N)}{(z^\top \Sigma_N^{-1} z)^2} = \dfrac{z^\top \Sigma_N^{-1} \operatorname{var}(B^N)\, \Sigma_N^{-1} z}{(z^\top \Sigma_N^{-1} z)^2} = \dfrac{z^\top \Sigma_N^{-1} z}{(z^\top \Sigma_N^{-1} z)^2} = \dfrac{1}{z^\top \Sigma_N^{-1} z}$.

To prove the consistency of the estimator, we need the following technical result.

Lemma 2.4. If $A \in \mathbb{R}^{N \times N}$ is a positive definite matrix and $x \in \mathbb{R}^N$ is a non-zero vector, then

    $x^\top A^{-1} x \ge \dfrac{\|x\|^4}{x^\top A x}$.

Proof. As the matrix $A$ is positive definite, $x^\top A x > 0$, and there exists a positive definite matrix $A^{1/2}$ (which is symmetric and nonsingular) such that $(A^{1/2})^2 = A$. By the Cauchy–Schwarz inequality,

    $\|x\|^4 = \left((A^{1/2} x)^\top (A^{-1/2} x)\right)^2 \le \|A^{1/2} x\|^2 \, \|A^{-1/2} x\|^2 = (x^\top A x)(x^\top A^{-1} x)$,

whence the desired inequality follows.

In the rest of this section we assume that the process $X$ is observed on a regular grid, at the points $t_k = kh$, $k = 0, 1, \ldots, N$, for some $h > 0$. We also assume that for any $N$ the Gaussian distribution of the vector $(B_{kh})_{k=1}^N$ is nonsingular.

Theorem 2.5. Let $h > 0$, and let $\mathrm{E}\left[(B_{(k+1)h} - B_{kh}) B_h\right] \to 0$ as $k \to \infty$. Let $\hat\theta_N$ be the ML estimator of the parameter $\theta$ in model (1) based on the observations $X_{kh}$, $k = 0, 1, \ldots, N$. Then the estimator $\hat\theta_N$ is mean-square consistent, i.e., $\mathrm{E}(\hat\theta_N - \theta)^2 \to 0$ as $N \to \infty$.
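The mechanism behind Theorem 2.5 can be checked numerically for the two-fBm model of Example 2.8; the sketch below (with assumed values $h = 1$, $H_1 = 0.7$, $H_2 = 0.9$, not taken from the paper) verifies that the increment autocovariance decays and that the variance $(z^\top \Sigma_N^{-1} z)^{-1}$ from Lemma 2.3 shrinks as $N$ grows.

```python
import numpy as np

def fbm_incr_cov(d, h, H):
    # E (B_{(k+1)h} - B_{kh}) B_h at lag d = k, for a single fBm
    return 0.5 * h**(2 * H) * ((d + 1)**(2 * H) + np.abs(d - 1)**(2 * H) - 2 * d**(2 * H))

def mle_variance(N, h, H1, H2):
    # 1 / (z' Sigma_N^{-1} z) for the sum of two independent fBms;
    # independence means the increment covariances simply add
    d = np.abs(np.subtract.outer(np.arange(N), np.arange(N))).astype(float)
    cov = fbm_incr_cov(d, h, H1) + fbm_incr_cov(d, h, H2)
    z = np.full(N, h)
    return 1.0 / (z @ np.linalg.solve(cov, z))

h, H1, H2 = 1.0, 0.7, 0.9
lags = np.array([10.0, 100.0, 1000.0])
gamma = fbm_incr_cov(lags, h, H1) + fbm_incr_cov(lags, h, H2)     # decays to 0
variances = [mle_variance(N, h, H1, H2) for N in (25, 50, 100, 200)]
```

The decay of `gamma` is the hypothesis of Theorem 2.5, and the decrease of `variances` is its conclusion, computed exactly from Lemma 2.3 rather than by simulation.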

Proof. By Remark 2.2, the matrix $\Sigma_N$ has Toeplitz structure,

    $(\Sigma_N)_{l,m} = \mathrm{E}\left[(B_{(|l-m|+1)h} - B_{|l-m|h}) B_h\right]$.

Moreover, $(\Sigma_N)_{k,l}$ does not depend on $N$ as soon as $N \ge \max(k, l)$. By the Toeplitz theorem,

    $\dfrac{1}{N^2} \sum_{l,m=1}^{N} (\Sigma_N)_{l,m} = \dfrac{1}{N^2}\left( N\, \mathrm{E} B_h^2 + \sum_{k=2}^{N} 2(N + 1 - k)\, \mathrm{E}\left[(B_{kh} - B_{(k-1)h}) B_h\right] \right) \to \lim_{k \to \infty} \mathrm{E}\left[(B_{kh} - B_{(k-1)h}) B_h\right] = 0$

as $N \to \infty$. For a regular grid we have $z = (h, \ldots, h)^\top$. Hence, in this case,

    $z^\top \Sigma_N z = h^2 \sum_{l,m=1}^{N} (\Sigma_N)_{l,m}, \qquad \|z\| = h \sqrt{N}$.

Finally, with the use of Lemmas 2.3 and 2.4,

    $\mathrm{E}(\hat\theta_N - \theta)^2 = \dfrac{1}{z^\top \Sigma_N^{-1} z} \le \dfrac{z^\top \Sigma_N z}{\|z\|^4} = \dfrac{1}{h^2 N^2} \sum_{l,m=1}^{N} (\Sigma_N)_{l,m} \to 0 \quad \text{as } N \to \infty$.

To prove the strong consistency, we need the following auxiliary statement.

Lemma 2.6. Let $h > 0$, and let $\hat\theta_N$ be the ML estimator of the parameter $\theta$ in model (1) based on the observations $X_{kh}$, $k = 0, 1, \ldots, N$. Then the random process $\hat\theta_N$ has independent increments.

Proof. In what follows, $N_2 \le N_3$ are positive integers, and $I = (I_{N_2}, 0_{N_2 \times (N_3 - N_2)})$ is the $N_2 \times N_3$ matrix with ones on the main diagonal; its transpose is denoted by $I^\top$. The vector $B^{N_2} = (B_h, \ldots, B_{N_2 h} - B_{(N_2 - 1)h})^\top$ is the beginning of the vector $B^{N_3} = (B_h, \ldots, B_{N_3 h} - B_{(N_3 - 1)h})^\top$, and the vector $z_{N_2} = (h, \ldots, h)^\top \in \mathbb{R}^{N_2}$ is the beginning of the vector $z_{N_3} = (h, \ldots, h)^\top \in \mathbb{R}^{N_3}$, so

    $B^{N_2} = I B^{N_3}, \qquad z_{N_2} = I z_{N_3}$.

Then

    $\mathrm{E}\left[B^{N_3} (B^{N_2})^\top\right] = \mathrm{E}\left[B^{N_3} (B^{N_3})^\top\right] I^\top = \Sigma_{N_3} I^\top$.

For $N_2 \le N_3$,

    $\mathrm{E}\left[(\hat\theta_{N_3} - \theta)(\hat\theta_{N_2} - \theta)\right] = \dfrac{z_{N_3}^\top \Sigma_{N_3}^{-1}\, \mathrm{E}[B^{N_3} (B^{N_2})^\top]\, \Sigma_{N_2}^{-1} z_{N_2}}{(z_{N_3}^\top \Sigma_{N_3}^{-1} z_{N_3})(z_{N_2}^\top \Sigma_{N_2}^{-1} z_{N_2})} = \dfrac{z_{N_3}^\top I^\top \Sigma_{N_2}^{-1} z_{N_2}}{(z_{N_3}^\top \Sigma_{N_3}^{-1} z_{N_3})(z_{N_2}^\top \Sigma_{N_2}^{-1} z_{N_2})} = \dfrac{z_{N_2}^\top \Sigma_{N_2}^{-1} z_{N_2}}{(z_{N_3}^\top \Sigma_{N_3}^{-1} z_{N_3})(z_{N_2}^\top \Sigma_{N_2}^{-1} z_{N_2})} = \dfrac{1}{z_{N_3}^\top \Sigma_{N_3}^{-1} z_{N_3}}$;

therefore, for $N_1 \le N_2 \le N_3 \le N_4$,

    $\mathrm{E}\left[(\hat\theta_{N_4} - \hat\theta_{N_3})(\hat\theta_{N_2} - \hat\theta_{N_1})\right] = 0$.

Thus, the Gaussian process $\{\hat\theta_N, N = 1, 2, \ldots\}$ is proved to have uncorrelated increments. Hence its increments are independent.

Theorem 2.7. Under the assumptions of Theorem 2.5, the estimator $\hat\theta_N$ is strongly consistent, i.e., $\hat\theta_N \to \theta$ as $N \to \infty$ almost surely.

Proof. By Theorem 2.5, $\operatorname{var} \hat\theta_N \to 0$ as $N \to \infty$. Since $\operatorname{cov}(\hat\theta_N, \hat\theta_{N_0}) = \operatorname{var} \hat\theta_N$ for $N \ge N_0$ (see the proof of Lemma 2.6),

    $\operatorname{var}(\hat\theta_N - \hat\theta_{N_0}) = \operatorname{var} \hat\theta_N + \operatorname{var} \hat\theta_{N_0} - 2 \operatorname{cov}(\hat\theta_N, \hat\theta_{N_0}) = \operatorname{var} \hat\theta_{N_0} - \operatorname{var} \hat\theta_N \le \operatorname{var} \hat\theta_{N_0}$.

The process $\hat\theta_N$ has independent increments. Therefore, by Kolmogorov's inequality, for $\epsilon > 0$ and $N_0 \ge 1$,

    $P\left(\sup_{N \ge N_0} |\hat\theta_N - \hat\theta_{N_0}| > \dfrac{\epsilon}{2}\right) \le \dfrac{4}{\epsilon^2} \lim_{N \to \infty} \operatorname{var}(\hat\theta_N - \hat\theta_{N_0}) \le \dfrac{4}{\epsilon^2} \operatorname{var} \hat\theta_{N_0}$.

Then, using the unbiasedness of the estimator, we get

    $P\left(\sup_{N \ge N_0} |\hat\theta_N - \theta| \ge \epsilon\right) \le P\left(|\hat\theta_{N_0} - \theta| \ge \dfrac{\epsilon}{2}\right) + P\left(\sup_{N \ge N_0} |\hat\theta_N - \hat\theta_{N_0}| \ge \dfrac{\epsilon}{2}\right) \le \dfrac{4}{\epsilon^2} \operatorname{var} \hat\theta_{N_0} + \dfrac{4}{\epsilon^2} \operatorname{var} \hat\theta_{N_0} = \dfrac{8}{\epsilon^2} \operatorname{var} \hat\theta_{N_0} \to 0$

as $N_0 \to \infty$, whence $\hat\theta_N \to \theta$ as $N \to \infty$ almost surely.

Example 2.8. Let us consider the model with $B_t = B^{H_1}_t + B^{H_2}_t$, where $B^{H_1}_t$ and $B^{H_2}_t$ are two independent fractional Brownian motions with Hurst indices $H_1, H_2 \in (0, 1)$, i.e., centered Gaussian processes with covariance functions

    $\mathrm{E}\, B^{H_i}_t B^{H_i}_s = \dfrac{1}{2}\left(t^{2H_i} + s^{2H_i} - |t - s|^{2H_i}\right), \quad t, s \ge 0, \; i = 1, 2$.

These processes have stationary increments, and

    $\mathrm{E}\left[(B^{H_i}_{(k+1)h} - B^{H_i}_{kh}) B^{H_i}_h\right] \sim h^{2H_i} H_i (2H_i - 1)\, k^{2H_i - 2} \to 0 \quad \text{as } k \to \infty$,

see, e.g., Mishura (2008, Sec. 1.2). Taking into account the independence of the centered processes $B^{H_1}_t$ and $B^{H_2}_t$, we obtain that

    $\mathrm{E}\left[(B_{(k+1)h} - B_{kh}) B_h\right] = \mathrm{E}\left[(B^{H_1}_{(k+1)h} - B^{H_1}_{kh}) B^{H_1}_h\right] + \mathrm{E}\left[(B^{H_2}_{(k+1)h} - B^{H_2}_{kh}) B^{H_2}_h\right] \to 0 \quad \text{as } k \to \infty$.

Thus, the assumptions of Theorem 2.5 are satisfied.

3. Maximum likelihood estimation by continuous observations

Let the process $X$ be observed on the whole interval $[0, T]$. It is required to estimate the unknown parameter $\theta$ from these observations.

3.1. Likelihood function and construction of the estimator

In this section we construct a formula for the continuous-time MLE, similar to formula (4) for the discrete case.

Assumption 1. The covariance function of $B_t$ has a mixed derivative

    $\dfrac{\partial^2}{\partial s\, \partial t}\, \mathrm{E}\, B_t B_s = K(t - s)$,

where $K(t)$ is an even function, $K \in L_1[-T, T]$.

Lemma 3.1. Under Assumption 1, the integral $\int_0^T f(t)\, dB_t$ exists as the mean-square limit of the corresponding Riemann sums for any $f \in L_2[0, T]$. Moreover,

    $\mathrm{E}\left[\int_0^T f(t)\, dB_t \int_0^T g(s)\, dB_s\right] = \int_0^T \int_0^T f(t) K(t - s) g(s)\, ds\, dt$    (5)

for any $f, g \in L_2[0, T]$.

Proof. According to Huang and Cambanis (1978) (see also Cramér and Leadbetter 2004, Sec. 5.3), the integral $\int_0^T f(t)\, dB_t$ exists if and only if the double Riemann integral $\int_0^T \int_0^T f(t) f(s) K(t - s)\, ds\, dt$ exists; moreover, if both integrals $\int_0^T f(t)\, dB_t$ and $\int_0^T g(s)\, dB_s$ exist, then formula (5) holds. Using the properties of a convolution, one can prove that

    $\int_0^T \int_0^T |f(t) f(s) K(t - s)|\, ds\, dt \le \|K\|_{L_1[-T,T]}\, \|f\|^2_{L_2[0,T]} < \infty$.

Define a linear operator $T: L_2[0, T] \to L_2[0, T]$ by

    $(T f)(t) = \int_0^T K(t - s) f(s)\, ds$.    (6)

It follows from (5) and (6) that

    $\mathrm{E}\left[\int_0^T f(t)\, dB_t \int_0^T g(s)\, dB_s\right] = \int_0^T (T f)(t)\, g(t)\, dt$.    (7)

For a fixed set of points $t_0, \ldots, t_N$ satisfying (2), define mutually adjoint linear operators $M: L_2[0, T] \to \mathbb{R}^N$ and $M^*: \mathbb{R}^N \to L_2[0, T]$ as follows. If $f \in L_2[0, T]$, then $M f \in \mathbb{R}^N$ is the vector whose $k$-th element equals $\int_{t_{k-1}}^{t_k} f(s)\, ds$. The adjoint operator $M^*$ is

    $M^* x = \sum_{k=1}^{N} x_k \mathbb{1}_{[t_{k-1}, t_k]}, \quad x \in \mathbb{R}^N$.

The basic properties of the operator $T$ are collected in the following evident lemma.

Lemma 3.2. Let Assumption 1 hold. Then

(i) the operator $T$ is bounded ($\|T\| \le \|K\|_{L_1[-T,T]}$) and self-adjoint;

(ii) the following relation between the operator $T$ and the covariance matrix $\Sigma_N$ from Section 2 holds: $M T M^* = \Sigma_N$, i.e., the matrix of the operator $M T M^*: \mathbb{R}^N \to \mathbb{R}^N$ equals $\Sigma_N$.

Now we are ready to formulate our key assumption on the kernel $K$ in terms of the operator $T$.

Assumption 2. For all $T > 0$, the constant function $\mathbb{1}_{[0,T]}(t) = 1$, $t \in [0, T]$, belongs to the range of the operator $T$, i.e., there exists a function $h_T \in L_2[0, T]$ such that $T h_T = \mathbb{1}_{[0,T]}$.
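The identity $M T M^* = \Sigma_N$ of Lemma 3.2 (ii) can be checked numerically; the sketch below (an illustration with assumed parameters, not code from the paper) does so for an fBm kernel $K(t) = H(2H-1)|t|^{2H-2}$ with $H > 1/2$, comparing the double integral of the kernel over well-separated grid cells with the closed-form Toeplitz entries of $\Sigma_N$. Only non-adjacent cells are tested, so the midpoint quadrature never hits the kernel's singularity at zero.

```python
import numpy as np

H, h, m = 0.7, 1.0, 400          # assumed Hurst index, grid step, quadrature points per cell
K = lambda t: H * (2 * H - 1) * np.abs(t)**(2 * H - 2)

def mtm_entry(k, l):
    # (M T M*)_{kl}: double integral of K(t - s) over [(k-1)h, kh] x [(l-1)h, lh],
    # midpoint quadrature (valid here because the cells do not touch)
    t = (k - 1) * h + (np.arange(m) + 0.5) * h / m
    s = (l - 1) * h + (np.arange(m) + 0.5) * h / m
    return K(t[:, None] - s[None, :]).sum() * (h / m)**2

def sigma_entry(k, l):
    # closed-form fBm increment covariance (the Toeplitz entry of Remark 2.2)
    d = abs(k - l)
    return 0.5 * h**(2 * H) * ((d + 1)**(2 * H) + abs(d - 1)**(2 * H) - 2 * d**(2 * H))

errs = [abs(mtm_entry(1, l) - sigma_entry(1, l)) for l in (3, 4, 6)]
```

The agreement is limited only by quadrature error, which is tiny on separated cells since the kernel is smooth there.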

Theorem 3.3. If all finite-dimensional distributions of the process $\{B_t, t \in (0, T]\}$ are nonsingular and Assumptions 1 and 2 hold, then

    $L(\theta) = \exp\left\{\theta \int_0^T h_T(s)\, dX_s - \dfrac{\theta^2}{2} \int_0^T h_T(s)\, ds\right\}$    (8)

is a likelihood function.

Proof. Let us show that the function $L(\theta)$ defined in (8) is a density function for the distribution of the process $X_t$ for a given $\theta$ with respect to the distribution of the process $B_t$ (which coincides with $X_t$ when $\theta = 0$). In other words, we need to prove that $dP_\theta = L(\theta)\, dP_0$, where $P_\theta$ is the probability measure that corresponds to the value $\theta$ of the parameter. It suffices to show that for all partitions $0 = t_0 < t_1 < \ldots < t_N = T$ of the interval $[0, T]$ and for all cylinder sets $A \in \mathcal{F}_N$ the following equality holds:

    $P_\theta(A) = \int_A L(\theta)\, dP_0$,    (9)

where $\mathcal{F}_N$ is the $\sigma$-algebra generated by the values $B_{t_k}$ of the process $B_t$ at the points $t_k$, $k = 0, \ldots, N$. We have

    $P_\theta(A) = \int_A L_N(\theta)\, dP_0, \qquad \int_A L(\theta)\, dP_0 = \int_A \mathrm{E}\left[L(\theta) \mid \mathcal{F}_N\right] dP_0$,

where $L_N$ is the likelihood function (3) for the discrete-time model and $\mathrm{E}[\cdot \mid \mathcal{F}_N]$ is the conditional expectation corresponding to the probability measure $P_0$. To prove (9), it suffices to show that

    $L_N(\theta) = \mathrm{E}\left[L(\theta) \mid \mathcal{F}_N\right]$.    (10)

Under $P_0$ we have $X_t = B_t$, so

    $\mathrm{E}\left[L(\theta) \mid \mathcal{F}_N\right] = \mathrm{E}\left[\exp\left\{\theta \int_0^T h_T(s)\, dB_s - \dfrac{\theta^2}{2} \int_0^T h_T(s)\, ds\right\} \Bigm| \mathcal{F}_N\right]$.

Due to the joint normality of $\int_0^T h_T(s)\, dB_s$ and $B^N$, the conditional distribution of $\int_0^T h_T(s)\, dB_s$ given $\mathcal{F}_N$ is Gaussian (Anderson 2003, Theorem 2.5.1), and its conditional variance is nonrandom. Let us find its parameters. By the least squares method,

    $\mathrm{E}[B_t \mid \mathcal{F}_N] = \mathrm{E}\left[B_t (B^N)^\top\right] \left(\operatorname{cov}(B^N, B^N)\right)^{-1} B^N = \operatorname{cov}(B_t, B^N)\, \Sigma_N^{-1} B^N$,

since $\operatorname{cov}(B^N, B^N) = \Sigma_N$. Calculate $\operatorname{cov}(B_t, B^N)$:

    $\operatorname{cov}\left(B_t, B_{t_k} - B_{t_{k-1}}\right) = \int_0^t \int_{t_{k-1}}^{t_k} K(s - u)\, du\, ds$,

and for any vector $x = (x_k)_{k=1}^N \in \mathbb{R}^N$,

    $\operatorname{cov}(B_t, B^N)\, x = \sum_{k=1}^{N} \int_0^t \int_{t_{k-1}}^{t_k} K(s - u)\, x_k\, du\, ds = \int_0^T \mathbb{1}_{[0,t]}(s)\, (T M^* x)(s)\, ds = \int_0^T (T \mathbb{1}_{[0,t]})(s)\, (M^* x)(s)\, ds$,

whence we get

    $\operatorname{cov}(B_t, B^N) = (M T \mathbb{1}_{[0,t]})^\top$.

Therefore,

    $\mathrm{E}[B_t \mid \mathcal{F}_N] = (M T \mathbb{1}_{[0,t]})^\top \Sigma_N^{-1} B^N$.

Then

    $\mathrm{E}\left[\int_0^T h_T(t)\, dB_t \Bigm| \mathcal{F}_N\right] = \int_0^T h_T(s)\, (T M^* \Sigma_N^{-1} B^N)(s)\, ds = \int_0^T (T h_T)(s)\, (M^* \Sigma_N^{-1} B^N)(s)\, ds = (M T h_T)^\top \Sigma_N^{-1} B^N$,

where we have used that the operator $T$ is self-adjoint. Further, $M T h_T = M \mathbb{1}_{[0,T]} = z$, where the vector $z$ is defined after (3). Hence,

    $\mathrm{E}\left[\int_0^T h_T(t)\, dB_t \Bigm| \mathcal{F}_N\right] = z^\top \Sigma_N^{-1} B^N$.

In order to calculate the variance, we apply the partition-of-variance equality

    $\operatorname{var}\left(\int_0^T h_T(t)\, dB_t\right) = \operatorname{var}\left(\mathrm{E}\left[\int_0^T h_T(t)\, dB_t \Bigm| \mathcal{F}_N\right]\right) + \operatorname{var}\left(\int_0^T h_T(t)\, dB_t \Bigm| \mathcal{F}_N\right)$.

We have

    $\operatorname{var}\left(\int_0^T h_T(t)\, dB_t\right) = \int_0^T (T h_T)(t)\, h_T(t)\, dt = \int_0^T \mathbb{1}_{[0,T]}(t)\, h_T(t)\, dt = \int_0^T h_T(t)\, dt$

and

    $\operatorname{var}\left(\mathrm{E}\left[\int_0^T h_T(t)\, dB_t \Bigm| \mathcal{F}_N\right]\right) = \operatorname{var}\left(z^\top \Sigma_N^{-1} B^N\right) = z^\top \Sigma_N^{-1} z$.

Hence,

    $\operatorname{var}\left(\int_0^T h_T(t)\, dB_t \Bigm| \mathcal{F}_N\right) = \int_0^T h_T(t)\, dt - z^\top \Sigma_N^{-1} z$.

Applying the formula for the mean of the log-normal distribution, we obtain

    $\mathrm{E}\left[L(\theta) \mid \mathcal{F}_N\right] = \exp\left\{\theta z^\top \Sigma_N^{-1} B^N + \dfrac{\theta^2}{2}\left(\int_0^T h_T(t)\, dt - z^\top \Sigma_N^{-1} z\right) - \dfrac{\theta^2}{2} \int_0^T h_T(s)\, ds\right\} = \exp\left\{\theta z^\top \Sigma_N^{-1} B^N - \dfrac{\theta^2}{2}\, z^\top \Sigma_N^{-1} z\right\} = L_N(\theta)$.

Thus, (9) is proved.

Corollary 3.4. The maximum likelihood estimator of $\theta$ from continuous observations is given by

    $\hat\theta_T = \dfrac{\int_0^T h_T(t)\, dX_t}{\int_0^T h_T(t)\, dt}$.    (11)

3.2. Properties of the estimator

It follows immediately from (11) that the maximum likelihood estimator $\hat\theta_T$ is equal to

    $\hat\theta_T = \theta + \dfrac{\int_0^T h_T(t)\, dB_t}{\int_0^T h_T(t)\, dt}$.    (12)

Proposition 3.5. The estimator $\hat\theta_T$ is unbiased and normally distributed. Its variance is equal to

    $\operatorname{var} \hat\theta_T = \mathrm{E}(\hat\theta_T - \theta)^2 = \left(\int_0^T h_T(t)\, dt\right)^{-1}$.    (13)

Proof. Unbiasedness and normality follow from the fact that $\hat\theta_T - \theta$ is a linear functional of the centered Gaussian process $B$. By (7),

    $\operatorname{var}\left(\int_0^T h_T(t)\, dB_t\right) = \int_0^T (T h_T)(t)\, h_T(t)\, dt = \int_0^T \mathbb{1}_{[0,T]}(t)\, h_T(t)\, dt = \int_0^T h_T(t)\, dt$.

Thus, equation (13) immediately follows from (12).

Corollary 3.6. Let the process $B = \{B_t, t \ge 0\}$ satisfy Assumptions 1 and 2. If

    $\int_0^T h_T(t)\, dt \to \infty$ as $T \to +\infty$,    (14)

then the maximum likelihood estimator $\hat\theta_T$ is mean-square consistent, i.e., $\mathrm{E}(\hat\theta_T - \theta)^2 \to 0$ as $T \to +\infty$.

It can be hard to verify condition (14). The following result gives sufficient conditions for the consistency in terms of the autocovariance function of $B$.

Theorem 3.7. Let the process $B = \{B_t, t \ge 0\}$ satisfy Assumptions 1 and 2. If the covariance of the increments tends to zero, i.e., $\mathrm{E}\left[(B_{N+1} - B_N) B_1\right] \to 0$ as $N \to \infty$, then the maximum likelihood estimator $\hat\theta_T$ is mean-square consistent.

Proof. The estimator $\hat\theta_N$ based on the discrete sample $\{X_1, \ldots, X_N\}$ is mean-square consistent by Theorem 2.5. The estimator based on the continuous-time sample $\{X_t, t \in [0, T]\}$ is unbiased. Now compare the variances of the discrete and continuous-time estimators; the required inequalities are obtained from the proofs of Theorems 2.5 and 3.3. Suppose that $T \ge 1$ and $N$ is the integer such that $N \le T < N + 1$. By (13) and Lemma 2.3 we have

    $\operatorname{var} \hat\theta_T = \left(\int_0^T h_T(t)\, dt\right)^{-1}, \qquad \operatorname{var} \hat\theta_N = \left(z^\top \Sigma_N^{-1} z\right)^{-1}$.    (15)

As $\int_0^T h_T(t)\, dt \ge z^\top \Sigma_N^{-1} z$ (see the proof of Theorem 3.3), we have $\operatorname{var} \hat\theta_T \le \operatorname{var} \hat\theta_N$, and

    $\lim_{T \to +\infty} \mathrm{E}(\hat\theta_T - \theta)^2 \le \lim_{N \to \infty} \mathrm{E}(\hat\theta_N - \theta)^2 = 0$.

To prove the strong consistency of $\hat\theta_T$, we need the following auxiliary result.

Lemma 3.8. Let the process $B$ satisfy the conditions of Theorem 3.3. Then the estimator process $\hat\theta = \{\hat\theta_T, T > 0\}$ has independent increments.

Proof. Let $0 < T_2 \le T_3$. Then, extending $h_{T_2}$ by zero to $[0, T_3]$,

    $\mathrm{E}\left[\int_0^{T_3} h_{T_3}(t)\, dB_t \int_0^{T_2} h_{T_2}(s)\, dB_s\right] = \int_0^{T_3} (T h_{T_3})(t)\, h_{T_2}(t)\, dt = \int_0^{T_3} \mathbb{1}_{[0,T_3]}(t)\, h_{T_2}(t)\, dt = \int_0^{T_2} h_{T_2}(t)\, dt$;

thus, dividing by the normalizers in (12), for $0 < T_b \le T_a$ we get $\mathrm{E}\left[(\hat\theta_{T_a} - \theta)(\hat\theta_{T_b} - \theta)\right] = \left(\int_0^{T_a} h_{T_a}(t)\, dt\right)^{-1}$. Hence, if $0 < T_1 \le T_2 \le T_3 \le T_4$, then

    $\mathrm{E}\left[(\hat\theta_{T_4} - \hat\theta_{T_3})(\hat\theta_{T_2} - \hat\theta_{T_1})\right] = \left(\int_0^{T_4} h_{T_4}\, dt\right)^{-1} - \left(\int_0^{T_4} h_{T_4}\, dt\right)^{-1} - \left(\int_0^{T_3} h_{T_3}\, dt\right)^{-1} + \left(\int_0^{T_3} h_{T_3}\, dt\right)^{-1} = 0$.

Similarly to the proof of Lemma 2.6, the random process $\hat\theta_T$ is Gaussian, and its increments are proved to be uncorrelated, so they are independent.

Theorem 3.9. Under the conditions of Theorem 3.7, the estimator $\hat\theta_T$ is strongly consistent.

Proof. By Kolmogorov's inequality, for any $\epsilon > 0$ and $t > 0$,

    $P\left(\sup_{T > t} |\hat\theta_T - \theta| > \epsilon\right) \le P\left(|\hat\theta_t - \theta| > \dfrac{\epsilon}{2}\right) + P\left(\sup_{T > t} |\hat\theta_T - \hat\theta_t| > \dfrac{\epsilon}{2}\right) \le \dfrac{4}{\epsilon^2} \operatorname{var} \hat\theta_t + \dfrac{4}{\epsilon^2} \lim_{T \to +\infty} \operatorname{var}(\hat\theta_T - \hat\theta_t) \le \dfrac{8}{\epsilon^2} \operatorname{var} \hat\theta_t$.

By Theorem 3.7,

    $\lim_{t \to +\infty} P\left(\sup_{T > t} |\hat\theta_T - \theta| > \epsilon\right) = 0$ for all $\epsilon > 0$,

whence the strong consistency follows.

Remark 3.10. The Brownian motion does not satisfy Assumption 1: for the covariance function $\min(s, t)$ of the Wiener process, the derivative $\frac{\partial}{\partial t} \min(s, t) = \mathbb{1}_{\{t < s\}}$ is not continuous in $s$. So we extend our model so that it can handle the Wiener process. Let the process $B$ be a sum of two independent random processes,

    $B_t = B^C_t + W_t$,    (16)

where $B^C$ satisfies Assumption 1 and $W$ is a standard Wiener process (Assumption 1 for $B$ itself is dropped). Let us trace the changes in the statements when the process $B$ admits representation (16). Lemma 3.1 changes as follows:

    $\mathrm{E}\left[\int_0^T f(t)\, dB_t \int_0^T g(s)\, dB_s\right] = \int_0^T \int_0^T f(t) K(t - s) g(s)\, ds\, dt + \int_0^T f(t) g(t)\, dt$.

Equation (7) remains true if we set

    $(T f)(t) = f(t) + (T^C f)(t) = f(t) + \int_0^T K(t - s) f(s)\, ds$,

where $T^C$ is the operator (6) built from the kernel $K$ of $B^C$. Lemma 3.2 remains true with $\|T\| \le \|K\|_{L_1[-T,T]} + 1$. Theorem 3.3 holds true; only minor changes in the proof are required.
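A discrete counterpart of Remark 3.10 can be sketched as follows (an illustration with assumed parameters, not from the paper): for $B = B^H + W$ with independent components, the covariance matrix of the increments on a regular grid is $\Sigma_N^{\mathrm{fBm}} + h I$, because Wiener increments are independent with variance $h$; this mirrors the operator change from $T^C$ to $I + T^C$. Adding the Wiener noise can only increase the MLE variance $(z^\top \Sigma_N^{-1} z)^{-1}$.

```python
import numpy as np

def fbm_sigma(N, h, H):
    # Toeplitz increment covariance of fBm on a regular grid (Remark 2.2)
    d = np.abs(np.subtract.outer(np.arange(N), np.arange(N))).astype(float)
    return 0.5 * h**(2 * H) * ((d + 1)**(2 * H) + np.abs(d - 1)**(2 * H) - 2 * d**(2 * H))

N, h, H = 50, 0.2, 0.7                     # assumed grid and Hurst index
z = np.full(N, h)
sigma_fbm = fbm_sigma(N, h, H)
sigma_mixed = sigma_fbm + h * np.eye(N)    # increments of W + B^H: the discrete I + T^C
var_fbm = 1.0 / (z @ np.linalg.solve(sigma_fbm, z))
var_mixed = 1.0 / (z @ np.linalg.solve(sigma_mixed, z))
```

The strict inequality `var_mixed > var_fbm` follows from $(\Sigma + hI)^{-1} \prec \Sigma^{-1}$ for positive definite $\Sigma$.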

3.3. Examples

Example 3.11. Let $B$ be a fractional Brownian motion with Hurst index $H \in (1/2, 1)$. Then $K(t) = H(2H - 1)|t|^{2H - 2}$. We denote by $\mathcal{H}_T$ the corresponding operator (6). Then for the function

    $h_T(s) = C_H\, s^{1/2 - H} (T - s)^{1/2 - H}, \qquad C_H = \left(H(2H - 1)\, \mathrm{B}\!\left(H - \tfrac{1}{2}, \tfrac{3}{2} - H\right)\right)^{-1}$,

where $\mathrm{B}(\cdot, \cdot)$ is the beta function, we have $\mathcal{H}_T h_T = \mathbb{1}_{[0,T]}$, see Norros et al. (1999). The maximum likelihood estimator (11) equals

    $\hat\theta_T = \dfrac{T^{2H - 2}}{\mathrm{B}\!\left(\tfrac{3}{2} - H, \tfrac{3}{2} - H\right)} \int_0^T s^{1/2 - H} (T - s)^{1/2 - H}\, dX_s$.

Example 3.12. Consider the following model:

    $X_t = \theta t + W_t + B^H_t$,    (17)

where $W$ is a standard Wiener process, $B^H$ is a fractional Brownian motion with Hurst index $H \in (1/2, 1)$, and the random processes $W_t$ and $B^H_t$ are independent. The process $W_t + B^H_t$ admits representation (16) with $B^C = B^H$. The corresponding operator $T$ is $T = I + \mathcal{H}_T$ (see Example 3.11 for the definition of $\mathcal{H}_T$). The operator $\mathcal{H}_T$ is self-adjoint and positive semi-definite. Hence, the operator $T$ is invertible, and Assumption 2 holds true. The function $h_T = T^{-1} \mathbb{1}_{[0,T]}$ can be evaluated iteratively:

    $h_T = \lim_{k \to \infty} h_T^{(k)}, \qquad h_T^{(k)} = \dfrac{2}{\|\mathcal{H}_T\| + 2} \sum_{j=0}^{k} \left(\dfrac{\|\mathcal{H}_T\| I - 2 \mathcal{H}_T}{\|\mathcal{H}_T\| + 2}\right)^j \mathbb{1}_{[0,T]}$.    (18)

3.4. Simulations

We illustrate the behavior of the maximum likelihood estimator for Example 3.12 by means of simulation experiments. For $T = 1$ and $T = 10$ and various values of $H$, we find $h_T$ iteratively by (18). Then, for $\theta = 2$, we simulate realizations of the process (17) for each $H$ and compute the estimates by (11). The sample means and variances of these estimates are reported in Table 1, together with the theoretical variances calculated by (13). These simulation studies confirm the theoretical properties of $\hat\theta_T$, in particular its unbiasedness and consistency.

Table 1: The means and variances of $\hat\theta_T$ for $H = 0.6, 0.7, 0.8, 0.9$ and $T = 1, 10$ (sample mean, sample variance, theoretical variance).
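The iteration (18) can be sketched numerically for Example 3.12 by discretizing the operator $T = I + \mathcal{H}_T$ on a grid; the grid size, Hurst index, and iteration count below are illustrative assumptions. The diagonal of the discretized kernel uses the exact cell integral of $K$, since $K$ has an integrable singularity at zero. The iterative solution is compared against a direct linear solve, and the theoretical variance (13) is recovered as $1/\int_0^T h_T$.

```python
import numpy as np

def gamma_matrix(T, n, H):
    # Discretization of Gamma = I + H_T on a midpoint grid, for the
    # fBm kernel K(t) = H(2H-1)|t|^{2H-2} with H > 1/2.
    dt = T / n
    t = (np.arange(n) + 0.5) * dt
    d = np.abs(t[:, None] - t[None, :])
    K = np.zeros((n, n))
    off = d > 0
    K[off] = H * (2 * H - 1) * d[off]**(2 * H - 2) * dt
    # exact integral of K over one cell handles the singularity at 0
    np.fill_diagonal(K, 2 * H * (dt / 2)**(2 * H - 1))
    return np.eye(n) + K, dt

T, n, H = 1.0, 200, 0.7
G, dt = gamma_matrix(T, n, H)
h_direct = np.linalg.solve(G, np.ones(n))        # h_T = Gamma^{-1} 1

# the fixed point of this recursion satisfies (I + H_T) h = 1, i.e. eq. (18):
# h <- (2*1 + (||H_T|| I - 2 H_T) h) / (||H_T|| + 2)
Hop = G - np.eye(n)
norm = np.linalg.norm(Hop, 2)
h = np.zeros(n)
for _ in range(500):
    h = (2.0 * np.ones(n) + (norm * h - 2.0 * (Hop @ h))) / (norm + 2.0)

var_theta = 1.0 / (dt * h.sum())                 # theoretical variance (13)
```

The contraction factor of the recursion is at most $\|\mathcal{H}_T\| / (\|\mathcal{H}_T\| + 2) < 1$, since the spectrum of $\mathcal{H}_T$ lies in $[0, \|\mathcal{H}_T\|]$, so the iterates converge to the direct solution.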

References

Anderson TW (2003). An Introduction to Multivariate Statistical Analysis. Wiley, Hoboken, NJ.

Bertin K, Torres S, Tudor CA (2011). "Maximum-Likelihood Estimators and Random Walks in Long Memory Models." Statistics, 45(4), 361–374.

Cai C, Chigansky P, Kleptsyna M (2016). "Mixed Gaussian Processes: A Filtering Approach." Ann. Probab., 44(4), 3032–3075.

Cramér H, Leadbetter MR (2004). Stationary and Related Stochastic Processes. Sample Function Properties and Their Applications. Dover Publications, Mineola, NY. Reprint of the 1967 original.

Hu Y, Nualart D, Xiao W, Zhang W (2011). "Exact Maximum Likelihood Estimator for Drift Fractional Brownian Motion at Discrete Observation." Acta Math. Sci. Ser. B Engl. Ed., 31(5), 1851–1859.

Huang ST, Cambanis S (1978). "Stochastic and Multiple Wiener Integrals for Gaussian Processes." Ann. Probab., 6(4), 585–614.

Le Breton A (1998). "Filtering and Parameter Estimation in a Simple Linear System Driven by a Fractional Brownian Motion." Statist. Probab. Lett., 38(3), 263–274.

Mishura Y (2008). Stochastic Calculus for Fractional Brownian Motion and Related Processes, volume 1929 of Lecture Notes in Mathematics. Springer Science & Business Media.

Mishura Y (2016). "Maximum Likelihood Drift Estimation for the Mixing of Two Fractional Brownian Motions." In Stochastic and Infinite Dimensional Analysis, pp. 263–280. Springer.

Mishura Y, Voronov I (2015). "Construction of Maximum Likelihood Estimator in the Mixed Fractional–Fractional Brownian Motion Model with Double Long-Range Dependence." Mod. Stoch. Theory Appl., 2(2), 147–164.

Norros I, Valkeila E, Virtamo J (1999). "An Elementary Approach to a Girsanov Formula and Other Analytical Results on Fractional Brownian Motions." Bernoulli, 5(4), 571–587.

Rao CR (2002). Linear Statistical Inference and its Applications. Second edition. Wiley Series in Probability and Statistics. Wiley, New York.

Stiburek D (2017). "Statistical Inference on the Drift Parameter in Fractional Brownian Motion with a Deterministic Drift." Comm. Statist. Theory Methods, 46(2), 892–905.

Affiliation:
Yuliya Mishura, Kostiantyn Ralchenko, Sergiy Shklyar
Department of Probability Theory, Statistics and Actuarial Mathematics
Taras Shevchenko National University of Kyiv
64 Volodymyrska, 01601 Kyiv, Ukraine
E-mail: myus@univ.kiev.ua, k.ralchenko@gmail.com, shklyar@univ.kiev.ua

Austrian Journal of Statistics (http://www.ajs.or.at/), published by the Austrian Society of Statistics (http://www.osg.or.at/). Volume 46, April 2017. Submitted 2016; accepted 2017.