Maximum Likelihood Drift Estimation for Gaussian Process with Stationary Increments
Austrian Journal of Statistics
April 2017, Volume 46, 67-78.
doi:10.17713/ajs.v46i

Maximum Likelihood Drift Estimation for Gaussian Process with Stationary Increments

Yuliya Mishura (Taras Shevchenko National University of Kyiv)
Kostiantyn Ralchenko (Taras Shevchenko National University of Kyiv)
Sergiy Shklyar (Taras Shevchenko National University of Kyiv)

Abstract

The paper deals with the regression model $X_t = \theta t + B_t$, $t \in [0, T]$, where $B = \{B_t, t \ge 0\}$ is a centered Gaussian process with stationary increments. We study the estimation of the unknown parameter $\theta$ and establish a formula for the likelihood function in terms of a solution to an integral equation. We then find the maximum likelihood estimator and prove its strong consistency. The results obtained generalize the known results for fractional and mixed fractional Brownian motion.

Keywords: Gaussian process, stationary increments, discrete observations, continuous observations, maximum likelihood estimator, strong consistency, fractional Brownian motion, mixed fractional Brownian motion.

1. Introduction

We study the problem of drift parameter estimation for the stochastic process
$$X_t = \theta t + B_t, \tag{1}$$
where $\theta \in \mathbb{R}$ is an unknown parameter and $B = \{B_t, t \ge 0\}$ is a centered Gaussian process with stationary increments, $B_0 = 0$. In the particular case where $B = B^H$ is a fractional Brownian motion, this model has been studied by many authors. We mention the paper by Norros, Valkeila, and Virtamo (1999), which treats maximum likelihood estimation by continuous observations of the trajectory of $X$ on the interval $[0, T]$ (see also Le Breton 1998). Further, Hu, Nualart, Xiao, and Zhang (2011) investigate the exact maximum likelihood estimator by discrete observations at the points $t_k = kh$, $k = 1, 2, \dots, N$; Bertin, Torres, and Tudor (2011) consider maximum likelihood estimation in a discrete scheme of observations, where the trajectory of $X$ is observed at the points $t_k = k/N$, $k = 1, 2, \dots, N^\alpha$, $\alpha > 1$.
For hypothesis testing of the sign of the drift parameter in the model driven by a fractional Brownian motion, see Stiburek (2017). The paper by Cai, Chigansky, and Kleptsyna (2016) treats the likelihood function for Gaussian processes that do not necessarily have stationary increments. However, on the one hand, our approach is different from theirs and cannot be deduced from their general formulas; on the other hand, it gives rather elegant representations. The construction of the maximum likelihood estimator in the case where $B$ is the sum of two fractional Brownian motions was studied in Mishura (2016) and Mishura and Voronov (2015). A similar non-Gaussian model driven by the Rosenblatt process was considered in Bertin et al. (2011).

As already mentioned, we consider the case where $B$ is a centered Gaussian process with stationary increments. We construct the maximum likelihood estimators for both discrete and continuous schemes of observations. The assumptions on the process in the continuous case are formulated in terms of the second derivative of its covariance function, see Assumptions 1 and 2. The exact formula for the maximum likelihood estimator contains a solution of an integral equation with the kernel obtained after the differentiation. We give sufficient conditions for the strong consistency of the estimators. Several examples of the process $B$ are considered.

The paper is organized as follows. Section 2 is devoted to the case of discrete observations. Maximum likelihood estimation for continuous time is studied in Section 3.

2. Maximum likelihood estimation by discrete observations

We start with the construction of the likelihood function and the maximum likelihood estimator in the case of discrete observations. In the next section these results will be used for the derivation of the likelihood function in the continuous-time case, see the proof of Theorem 3.3.

Let the process $X$ defined by formula (1) be observed at the points $t_k$, $k = 0, 1, \dots, N$,
$$0 = t_0 < t_1 < \dots < t_N = T. \tag{2}$$
The problem is to estimate the parameter $\theta$ from the observations $X_{t_k}$, $k = 0, 1, \dots, N$, of the process $X_t$.

2.1. Likelihood function and construction of the estimator

Denote $X^{(N)} = (X_{t_k} - X_{t_{k-1}})_{k=1}^N$ and $B^{(N)} = (B_{t_k} - B_{t_{k-1}})_{k=1}^N$. Note that in our model $X_{t_0} = X_0 = 0$, so the $N$-dimensional vector $X^{(N)}$ is a one-to-one function of the observations. The vectors $B^{(N)}$ and $X^{(N)}$ are Gaussian with different means (except in the case $\theta = 0$) and the same covariance matrix. We denote this covariance matrix by $\Gamma^{(N)}$.
The next maximum likelihood estimator coincides with the least squares estimator considered in Rao (2002, eq. 4a.1.5).

Lemma 2.1. Assume that the Gaussian distribution of the vector $(B_{t_k})_{k=1}^N$ is nonsingular. Then one can take the function
$$L_N(X^{(N)}) = \frac{f_\theta(X^{(N)})}{f_0(X^{(N)})} = \exp\left\{\theta\, z^\top (\Gamma^{(N)})^{-1} X^{(N)} - \frac{\theta^2}{2}\, z^\top (\Gamma^{(N)})^{-1} z\right\}, \tag{3}$$
where $z = (t_k - t_{k-1})_{k=1}^N$, as a likelihood function in the discrete-time model. The MLE is linear with respect to the observations and equals
$$\hat\theta_N = \frac{z^\top (\Gamma^{(N)})^{-1} X^{(N)}}{z^\top (\Gamma^{(N)})^{-1} z}. \tag{4}$$

Proof. The pdf of $X^{(N)}$ with respect to the Lebesgue measure equals
$$f_\theta(x) = (2\pi)^{-N/2} \big(\det \Gamma^{(N)}\big)^{-1/2} \exp\left\{-\tfrac{1}{2}\,(x - \theta z)^\top (\Gamma^{(N)})^{-1} (x - \theta z)\right\}.$$
The density of the observations for given $\theta$ with respect to the distribution of the observations for $\theta = 0$ is taken as a likelihood function.
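As a numerical illustration (our sketch, not code from the paper, assuming NumPy; the helper name `mle_drift` is hypothetical), formula (4) can be evaluated by solving one linear system with the increment covariance matrix. As a sanity check we take $B$ a standard Brownian motion, for which $\Gamma^{(N)}$ is diagonal and the estimator reduces to $X_T/T$.

```python
import numpy as np

def mle_drift(t, X, Gamma):
    """theta_hat_N = z' Gamma^{-1} X^{(N)} / (z' Gamma^{-1} z), formula (4).
    t: grid 0 = t_0 < ... < t_N; X: observations with X[0] = 0;
    Gamma: covariance matrix of the increment vector B^{(N)}."""
    z = np.diff(t)                     # z_k = t_k - t_{k-1}
    dX = np.diff(X)                    # increment vector X^{(N)}
    w = np.linalg.solve(Gamma, z)      # Gamma^{-1} z
    return (w @ dX) / (w @ z)

rng = np.random.default_rng(0)
theta, t = 2.0, np.linspace(0.0, 1.0, 11)
# Brownian increments are independent N(0, t_k - t_{k-1})
B = np.concatenate([[0.0], np.cumsum(rng.normal(0.0, np.sqrt(np.diff(t))))])
X = theta * t + B
est = mle_drift(t, X, np.diag(np.diff(t)))
assert abs(est - X[-1] / t[-1]) < 1e-12   # Brownian case: theta_hat = X_T / T
```

For a general stationary-increment process only the matrix `Gamma` changes; the estimator itself stays the same linear functional of the increments.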
Remark 2.2. Let the process $X$ be observed on a regular grid, i.e., at the points $t_k = kh$, $k = 0, 1, \dots, N$, where $h > 0$. Then $\Gamma^{(N)}$ is a Toeplitz matrix, that is,
$$\Gamma^{(N)}_{k+l,l} = \Gamma^{(N)}_{l,k+l} = \mathbb{E}\big(B_{(k+l)h} - B_{(k+l-1)h}\big)\big(B_{lh} - B_{(l-1)h}\big) = \mathbb{E}\big(B_{(k+1)h} - B_{kh}\big) B_h$$
does not depend on $l$ due to the stationarity of increments. This simplifies the numerical computation of the MLE.

2.2. Properties of the estimator

Since $X^{(N)} = B^{(N)} + \theta z$, the maximum likelihood estimator (4) equals
$$\hat\theta_N = \theta + \frac{z^\top (\Gamma^{(N)})^{-1} B^{(N)}}{z^\top (\Gamma^{(N)})^{-1} z}.$$

Lemma 2.3. Under the assumptions of Lemma 2.1, the estimator $\hat\theta_N$ is unbiased and normally distributed. Its variance equals
$$\operatorname{var} \hat\theta_N = \frac{1}{z^\top (\Gamma^{(N)})^{-1} z}.$$

Proof. The estimator $\hat\theta_N$ is unbiased and normally distributed because $\hat\theta_N - \theta$ is a linear functional of the centered Gaussian vector $B^{(N)}$. The variance of the estimator equals
$$\operatorname{var}\hat\theta_N = \frac{\operatorname{var}\big(z^\top (\Gamma^{(N)})^{-1} B^{(N)}\big)}{\big(z^\top (\Gamma^{(N)})^{-1} z\big)^2} = \frac{z^\top (\Gamma^{(N)})^{-1}\, \Gamma^{(N)}\, (\Gamma^{(N)})^{-1} z}{\big(z^\top (\Gamma^{(N)})^{-1} z\big)^2} = \frac{1}{z^\top (\Gamma^{(N)})^{-1} z}.$$

To prove the consistency of the estimator, we need the following technical result.

Lemma 2.4. If $A \in \mathbb{R}^{N\times N}$ is a positive definite matrix and $x \in \mathbb{R}^N$ is a non-zero vector, then
$$x^\top A^{-1} x \ge \frac{\|x\|^4}{x^\top A x}.$$

Proof. As the matrix $A$ is positive definite, $x^\top A x > 0$, and there exists a positive definite matrix $A^{1/2}$ (which is symmetric and nonsingular) such that $(A^{1/2})^2 = A$. By the Cauchy-Schwarz inequality,
$$\|x\|^4 = \big(x^\top A^{1/2} A^{-1/2} x\big)^2 \le \big\|A^{1/2}x\big\|^2 \big\|A^{-1/2}x\big\|^2 = (x^\top A x)(x^\top A^{-1} x),$$
whence the desired inequality follows.

In the rest of this section we assume that the process $X$ is observed on a regular grid, at the points $t_k = kh$, $k = 0, 1, \dots, N$, for some $h > 0$. We also assume that for any $N$ the Gaussian distribution of the vector $(B_{kh})_{k=1}^N$ is nonsingular.

Theorem 2.5. Let $h > 0$ and $\mathbb{E}(B_{(k+1)h} - B_{kh}) B_h \to 0$ as $k \to \infty$. Let $\hat\theta_N$ be the maximum likelihood estimator of the parameter $\theta$ in model (1) from the observations $X_{kh}$, $k = 0, \dots, N$. Then the estimator $\hat\theta_N$ is mean-square consistent, i.e., $\mathbb{E}(\hat\theta_N - \theta)^2 \to 0$ as $N \to \infty$.
Proof. By Remark 2.2, the matrix $\Gamma^{(N)}$ has Toeplitz structure,
$$\Gamma^{(N)}_{l,m} = \mathbb{E}\big(B_{(|l-m|+1)h} - B_{|l-m|h}\big) B_h.$$
Moreover, $\Gamma^{(N)}_{k,l}$ does not depend on $N$ as soon as $N \ge \max(k, l)$. By the Toeplitz theorem,
$$\frac{1}{N^2}\sum_{l,m=1}^{N} \Gamma^{(N)}_{l,m} = \frac{1}{N}\,\mathbb{E}B_h^2 + \sum_{k=2}^{N} \frac{2(N+1-k)}{N^2}\, \mathbb{E}\big(B_{kh} - B_{(k-1)h}\big) B_h \to \lim_{k\to\infty} \mathbb{E}\big(B_{kh} - B_{(k-1)h}\big) B_h = 0$$
as $N \to \infty$. For a regular grid we have $z = (h, \dots, h)^\top$, so $\|z\|^2 = h^2 N$. Hence, in this case, by Lemma 2.4,
$$z^\top (\Gamma^{(N)})^{-1} z \ge \frac{\|z\|^4}{z^\top \Gamma^{(N)} z} = \frac{h^4 N^2}{h^2 \sum_{l,m=1}^N \Gamma^{(N)}_{l,m}}.$$
Finally, with the use of Lemma 2.3,
$$\mathbb{E}\big(\hat\theta_N - \theta\big)^2 = \frac{1}{z^\top (\Gamma^{(N)})^{-1} z} \le \frac{z^\top \Gamma^{(N)} z}{\|z\|^4} = \frac{1}{h^2 N^2} \sum_{l,m=1}^N \Gamma^{(N)}_{l,m} \to 0 \quad \text{as } N \to \infty.$$

To prove the strong consistency, we need the following auxiliary statement.

Lemma 2.6. Let $h > 0$, and let $\hat\theta_N$ be the maximum likelihood estimator of the parameter $\theta$ in model (1) from the observations $X_{kh}$, $k = 0, \dots, N$. Then the random process $\{\hat\theta_N, N \ge 1\}$ has independent increments.

Proof. In the next paragraph, $N_2 \le N_3$ are positive integers, $I = (I_{N_2}, 0_{N_2 \times (N_3 - N_2)})$ is the $N_2 \times N_3$ matrix with ones on the diagonal, and $I^\top$ is its transpose. The vector $B^{(N_2)} = (B_h, \dots, B_{N_2 h} - B_{(N_2-1)h})^\top$ is the beginning of the vector $B^{(N_3)} = (B_h, \dots, B_{N_3 h} - B_{(N_3-1)h})^\top$, and the vector $z_{N_2} = (h, \dots, h)^\top \in \mathbb{R}^{N_2}$ is the beginning of the vector $z_{N_3} = (h, \dots, h)^\top \in \mathbb{R}^{N_3}$, so
$$B^{(N_2)} = I B^{(N_3)}, \qquad z_{N_2} = I z_{N_3}.$$
Then
$$\mathbb{E}\big[B^{(N_3)} (B^{(N_2)})^\top\big] = \mathbb{E}\big[B^{(N_3)} (B^{(N_3)})^\top\big] I^\top = \Gamma^{(N_3)} I^\top,$$
and therefore
$$\operatorname{cov}\big(\hat\theta_{N_3}, \hat\theta_{N_2}\big) = \frac{z_{N_3}^\top (\Gamma^{(N_3)})^{-1}\, \Gamma^{(N_3)} I^\top (\Gamma^{(N_2)})^{-1} z_{N_2}}{\big(z_{N_3}^\top (\Gamma^{(N_3)})^{-1} z_{N_3}\big)\big(z_{N_2}^\top (\Gamma^{(N_2)})^{-1} z_{N_2}\big)} = \frac{z_{N_2}^\top (\Gamma^{(N_2)})^{-1} z_{N_2}}{\big(z_{N_3}^\top (\Gamma^{(N_3)})^{-1} z_{N_3}\big)\big(z_{N_2}^\top (\Gamma^{(N_2)})^{-1} z_{N_2}\big)} = \frac{1}{z_{N_3}^\top (\Gamma^{(N_3)})^{-1} z_{N_3}} = \operatorname{var}\hat\theta_{N_3};
$$
therefore, for $N_1 \le N_2 \le N_3 \le N_4$,
$$\mathbb{E}\big(\hat\theta_{N_4} - \hat\theta_{N_3}\big)\big(\hat\theta_{N_2} - \hat\theta_{N_1}\big) - \theta^2\cdot 0 = \operatorname{var}\hat\theta_{N_4} - \operatorname{var}\hat\theta_{N_4} - \operatorname{var}\hat\theta_{N_3} + \operatorname{var}\hat\theta_{N_3} = 0.$$
Thus, the Gaussian process $\{\hat\theta_N, N = 1, 2, \dots\}$ is proved to have uncorrelated increments. Hence its increments are independent.

Theorem 2.7. Under the assumptions of Theorem 2.5, the estimator $\hat\theta_N$ is strongly consistent, i.e., $\hat\theta_N \to \theta$ as $N \to \infty$ almost surely.

Proof. By Theorem 2.5, $\operatorname{var}\hat\theta_N \to 0$ as $N \to \infty$. Since $\operatorname{cov}(\hat\theta_N, \hat\theta_{N_0}) = \operatorname{var}\hat\theta_N$ for $N \ge N_0$ (see the proof of Lemma 2.6),
$$\operatorname{var}\big(\hat\theta_N - \hat\theta_{N_0}\big) = \operatorname{var}\hat\theta_N + \operatorname{var}\hat\theta_{N_0} - 2\operatorname{cov}\big(\hat\theta_N, \hat\theta_{N_0}\big) = \operatorname{var}\hat\theta_{N_0} - \operatorname{var}\hat\theta_N \le \operatorname{var}\hat\theta_{N_0}.$$
The process $\hat\theta_N$ has independent increments. Therefore, by Kolmogorov's inequality, for $\varepsilon > 0$ and $N_0 \ge 1$,
$$P\Big(\sup_{N \ge N_0} \big|\hat\theta_N - \hat\theta_{N_0}\big| > \tfrac{\varepsilon}{2}\Big) \le \frac{4}{\varepsilon^2} \lim_{N\to\infty} \operatorname{var}\big(\hat\theta_N - \hat\theta_{N_0}\big) \le \frac{4}{\varepsilon^2}\operatorname{var}\hat\theta_{N_0}.$$
Then, using the unbiasedness of the estimator, we get
$$P\Big(\sup_{N\ge N_0}\big|\hat\theta_N - \theta\big| \ge \varepsilon\Big) \le P\Big(\big|\hat\theta_{N_0} - \theta\big| \ge \tfrac{\varepsilon}{2}\Big) + P\Big(\sup_{N\ge N_0}\big|\hat\theta_N - \hat\theta_{N_0}\big| \ge \tfrac{\varepsilon}{2}\Big) \le \frac{4}{\varepsilon^2}\operatorname{var}\hat\theta_{N_0} + \frac{4}{\varepsilon^2}\operatorname{var}\hat\theta_{N_0} = \frac{8}{\varepsilon^2}\operatorname{var}\hat\theta_{N_0} \to 0$$
as $N_0 \to \infty$, whence $\hat\theta_N \to \theta$ as $N \to \infty$ almost surely.

Example 2.8. Let us consider the model (1) with $B_t = B^{H_1}_t + B^{H_2}_t$, where $B^{H_1}$ and $B^{H_2}$ are two independent fractional Brownian motions with Hurst indices $H_1, H_2 \in (0, 1)$, i.e., centered Gaussian processes with covariance functions
$$\mathbb{E}\,B^{H_i}_t B^{H_i}_s = \tfrac{1}{2}\big(t^{2H_i} + s^{2H_i} - |t-s|^{2H_i}\big), \quad t, s \ge 0,\ i = 1, 2.$$
These processes have stationary increments, and
$$\mathbb{E}\big(B^{H_i}_{(k+1)h} - B^{H_i}_{kh}\big) B^{H_i}_h \sim h^{2H_i} H_i (2H_i - 1)\, k^{2H_i - 2} \to 0 \quad \text{as } k \to \infty,$$
see, e.g., Mishura (2008, Sec. 1.2). Taking into account the independence of the centered processes $B^{H_1}$ and $B^{H_2}$, we obtain that
$$\mathbb{E}\big(B_{(k+1)h} - B_{kh}\big) B_h = \mathbb{E}\big(B^{H_1}_{(k+1)h} - B^{H_1}_{kh}\big) B^{H_1}_h + \mathbb{E}\big(B^{H_2}_{(k+1)h} - B^{H_2}_{kh}\big) B^{H_2}_h \to 0$$
as $k \to \infty$. Thus, the assumptions of Theorem 2.5 are satisfied.

3. Maximum likelihood estimation by continuous observations

Let the process $X$ be observed on the whole interval $[0, T]$. It is required to estimate the unknown parameter $\theta$ from these observations.

3.1. Likelihood function and construction of the estimator

In this section we construct a formula for the continuous-time MLE, similar to formula (4) for the discrete case.
Assumption 1. The covariance function of $B_t$ has a mixed derivative
$$\frac{\partial^2}{\partial s\,\partial t}\,\mathbb{E}\,B_t B_s = K(t-s),$$
where $K(t)$ is an even function, $K \in L_1[-T, T]$.

Lemma 3.1. Under Assumption 1, the integral $\int_0^T f(t)\,dB_t$ exists as the mean-square limit of the corresponding Riemann sums for any $f \in L_2[0, T]$. Moreover,
$$\mathbb{E}\left[\int_0^T f(t)\,dB_t \int_0^T g(s)\,dB_s\right] = \int_0^T\!\!\int_0^T f(t)\,K(t-s)\,g(s)\,ds\,dt \tag{5}$$
for any $f, g \in L_2[0, T]$.

Proof. According to Huang and Cambanis (1978) (see also Cramér and Leadbetter 2004, Sec. 5.3), the integral $\int_0^T f(t)\,dB_t$ exists if and only if the double Riemann integral $\int_0^T\int_0^T f(t)f(s)K(t-s)\,ds\,dt$ exists; moreover, if both integrals $\int_0^T f(t)\,dB_t$ and $\int_0^T g(s)\,dB_s$ exist, then formula (5) holds. Using the properties of a convolution, one can prove that
$$\int_0^T\!\!\int_0^T \big|f(t)f(s)K(t-s)\big|\,ds\,dt \le \|K\|_{L_1[-T,T]}\, \|f\|^2_{L_2[0,T]} < \infty.$$

Define a linear operator $\mathcal{T}: L_2[0,T] \to L_2[0,T]$ by
$$(\mathcal{T} f)(t) = \int_0^T K(t-s)\, f(s)\,ds. \tag{6}$$
It follows from (5) that
$$\mathbb{E}\left[\int_0^T f(t)\,dB_t \int_0^T g(s)\,dB_s\right] = \int_0^T (\mathcal{T} f)(t)\, g(t)\,dt. \tag{7}$$

For a fixed set of points $t_0, \dots, t_N$ that satisfy (2), define mutually adjoint linear operators $M: L_2[0,T] \to \mathbb{R}^N$ and $M^*: \mathbb{R}^N \to L_2[0,T]$. If $f \in L_2[0,T]$, then $Mf \in \mathbb{R}^N$ is the vector whose $k$-th element equals $\int_{t_{k-1}}^{t_k} f(s)\,ds$. The adjoint operator is
$$M^* x = \sum_{k=1}^N x_k\, \mathbf{1}_{[t_{k-1},\, t_k]}, \qquad x \in \mathbb{R}^N.$$

The basic properties of the operator $\mathcal{T}$ are collected in the following evident lemma.

Lemma 3.2. Let Assumption 1 hold. Then
(i) the operator $\mathcal{T}$ is bounded, $\|\mathcal{T}\| \le \|K\|_{L_1[-T,T]}$, and self-adjoint;
(ii) the following relation between the operator $\mathcal{T}$ and the covariance matrix $\Gamma^{(N)}$ from Section 2 holds: $M \mathcal{T} M^* = \Gamma^{(N)}$, i.e., the matrix of the operator $M \mathcal{T} M^*: \mathbb{R}^N \to \mathbb{R}^N$ equals $\Gamma^{(N)}$.

Now we are ready to formulate our key assumption on the kernel $K$ in terms of the operator $\mathcal{T}$.

Assumption 2. For all $T > 0$, the constant function $\mathbf{1}_{[0,T]}(t) = 1$, $t \in [0, T]$, belongs to the range of the operator $\mathcal{T}$, i.e., there exists a function $h_T \in L_2[0, T]$ such that $\mathcal{T} h_T = \mathbf{1}_{[0,T]}$.
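The identity $M \mathcal{T} M^* = \Gamma^{(N)}$ of Lemma 3.2(ii) can be checked numerically: the $(k,l)$ entry of $M \mathcal{T} M^*$ is the double integral of $K(t-s)$ over $[t_{k-1},t_k]\times[t_{l-1},t_l]$, which must equal the increment covariance. Below is our illustration (not the paper's code), assuming NumPy, for a fractional Brownian motion with $H > 1/2$, whose kernel is $K(u) = H(2H-1)|u|^{2H-2}$; the quadrature uses a plain midpoint rule on a pair of disjoint intervals, where $K$ is smooth.

```python
import numpy as np

H = 0.7
K = lambda u: H * (2*H - 1) * np.abs(u)**(2*H - 2)           # kernel of Assumption 1
R = lambda t, s: 0.5 * (t**(2*H) + s**(2*H) - np.abs(t - s)**(2*H))  # fBm covariance

a, b = 0.0, 1.0     # interval [t_{k-1}, t_k]
c, d = 2.0, 3.0     # interval [t_{l-1}, t_l] (disjoint from the first)
n = 2000
t = a + (np.arange(n) + 0.5) * (b - a) / n    # midpoint grid in t
s = c + (np.arange(n) + 0.5) * (d - c) / n    # midpoint grid in s
# (M T M*)_{k,l}: double integral of K(t - s) over the rectangle
quad = K(np.subtract.outer(t, s)).sum() * (b - a) * (d - c) / n**2
# Gamma_{k,l}: covariance of the two fBm increments
gamma_kl = R(b, d) - R(b, c) - R(a, d) + R(a, c)
assert abs(quad - gamma_kl) < 1e-6
```

The agreement is exact in the limit of the quadrature step; adjacent intervals would require a quadrature adapted to the integrable singularity of $K$ at zero.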
Theorem 3.3. If all finite-dimensional distributions of the process $\{B_t, t \in (0, T]\}$ are nonsingular and Assumptions 1 and 2 hold, then
$$L(\theta) = \exp\left\{\theta \int_0^T h_T(s)\,dX_s - \frac{\theta^2}{2}\int_0^T h_T(s)\,ds\right\} \tag{8}$$
is a likelihood function.

Proof. Let us show that the function $L(\theta)$ defined in (8) is a density of the distribution of the process $X_t$ for a given $\theta$ with respect to the distribution of the process $B_t$ (which coincides with $X_t$ when $\theta = 0$). In other words, we need to prove that $dP_\theta = L(\theta)\,dP_0$, where $P_\theta$ is the probability measure that corresponds to the value $\theta$ of the parameter. It suffices to show that for all partitions $0 = t_0 < t_1 < \dots < t_N = T$ of the interval $[0, T]$ and for all cylinder sets $A \in \mathcal{F}_N$ the following equality holds:
$$\int_A dP_\theta = \int_A L(\theta)\,dP_0, \tag{9}$$
where $\mathcal{F}_N$ is the $\sigma$-algebra generated by the values $B_{t_k}$ of the process $B_t$ at the points $t_k$, $k = 1, \dots, N$. We have
$$\int_A dP_\theta = \int_A L_N(\theta)\,dP_0, \qquad \int_A L(\theta)\,dP_0 = \int_A \mathbb{E}\big[L(\theta)\,\big|\,\mathcal{F}_N\big]\,dP_0,$$
where $L_N$ is the likelihood function (3) of the discrete-time model, and $\mathbb{E}[\,\cdot\,|\,\mathcal{F}_N]$ is the conditional expectation corresponding to the probability measure $P_0$. To prove (9), it suffices to show that $L_N(\theta) = \mathbb{E}[L(\theta)\,|\,\mathcal{F}_N]$.

Under $P_0$ we have $X_t = B_t$, so
$$\mathbb{E}\big[L(\theta)\,\big|\,\mathcal{F}_N\big] = \mathbb{E}\left[\exp\left\{\theta\int_0^T h_T(s)\,dB_s - \frac{\theta^2}{2}\int_0^T h_T(s)\,ds\right\}\,\Big|\,\mathcal{F}_N\right].$$
Due to the joint normality of $\int_0^T h_T(s)\,dB_s$ and $B^{(N)}$, the conditional distribution of $\int_0^T h_T(s)\,dB_s$ with respect to $\mathcal{F}_N$ is Gaussian (Anderson 2003, Theorem 2.5.1), and its conditional variance is nonrandom. Let us find its parameters. By the least squares method,
$$\mathbb{E}\big[B_t\,\big|\,\mathcal{F}_N\big] = \operatorname{cov}\big(B_t, B^{(N)}\big)\, \big(\operatorname{cov}(B^{(N)}, B^{(N)})\big)^{-1} B^{(N)}.$$
We have $\operatorname{cov}(B^{(N)}, B^{(N)}) = \Gamma^{(N)}$. Calculate $\operatorname{cov}(B_t, B^{(N)})$:
$$\operatorname{cov}\big(B_t,\, B_{t_k} - B_{t_{k-1}}\big) = \int_0^t \int_{t_{k-1}}^{t_k} K(s-u)\,du\,ds,$$
and for any vector $x = (x_k)_{k=1}^N \in \mathbb{R}^N$,
$$\operatorname{cov}\big(B_t, B^{(N)}\big)\,x = \int_0^t \sum_{k=1}^N \int_{t_{k-1}}^{t_k} K(s-u)\,x_k\,du\,ds = \int_0^t \big(\mathcal{T} M^* x\big)(s)\,ds = \int_0^T \mathbf{1}_{[0,t]}(s)\,\big(\mathcal{T} M^* x\big)(s)\,ds = \big(M \mathcal{T} \mathbf{1}_{[0,t]}\big)^\top x,$$
whence we get
$$\operatorname{cov}\big(B_t, B^{(N)}\big) = \big(M \mathcal{T} \mathbf{1}_{[0,t]}\big)^\top.$$
Therefore
$$\mathbb{E}\big[B_t\,\big|\,\mathcal{F}_N\big] = \big(M \mathcal{T} \mathbf{1}_{[0,t]}\big)^\top (\Gamma^{(N)})^{-1} B^{(N)}.$$
Then
$$\mathbb{E}\left[\int_0^T h_T(t)\,dB_t\,\Big|\,\mathcal{F}_N\right] = \int_0^T h_T(s)\,\big(\mathcal{T} M^* (\Gamma^{(N)})^{-1} B^{(N)}\big)(s)\,ds = \int_0^T (\mathcal{T} h_T)(s)\,\big(M^* (\Gamma^{(N)})^{-1} B^{(N)}\big)(s)\,ds = \big(M \mathcal{T} h_T\big)^\top (\Gamma^{(N)})^{-1} B^{(N)},$$
where we have used that the operator $\mathcal{T}$ is self-adjoint. Further, $M \mathcal{T} h_T = M \mathbf{1}_{[0,T]} = z$, where the vector $z$ is defined after (3). Hence,
$$\mathbb{E}\left[\int_0^T h_T(t)\,dB_t\,\Big|\,\mathcal{F}_N\right] = z^\top (\Gamma^{(N)})^{-1} B^{(N)}.$$
In order to calculate the variance, we apply the partition-of-variance equality
$$\operatorname{var}\left[\int_0^T h_T(t)\,dB_t\right] = \operatorname{var}\,\mathbb{E}\left[\int_0^T h_T(t)\,dB_t\,\Big|\,\mathcal{F}_N\right] + \mathbb{E}\operatorname{var}\left[\int_0^T h_T(t)\,dB_t\,\Big|\,\mathcal{F}_N\right].$$
We have
$$\operatorname{var}\left[\int_0^T h_T(t)\,dB_t\right] = \int_0^T (\mathcal{T} h_T)(t)\,h_T(t)\,dt = \int_0^T \mathbf{1}_{[0,T]}(t)\,h_T(t)\,dt = \int_0^T h_T(t)\,dt$$
and
$$\operatorname{var}\,\mathbb{E}\left[\int_0^T h_T(t)\,dB_t\,\Big|\,\mathcal{F}_N\right] = \operatorname{var}\big(z^\top (\Gamma^{(N)})^{-1} B^{(N)}\big) = z^\top (\Gamma^{(N)})^{-1} z.$$
Hence,
$$\operatorname{var}\left[\int_0^T h_T(t)\,dB_t\,\Big|\,\mathcal{F}_N\right] = \int_0^T h_T(t)\,dt - z^\top (\Gamma^{(N)})^{-1} z.$$
Applying the formula for the mean of the log-normal distribution, we obtain
$$\mathbb{E}\big[L(\theta)\,\big|\,\mathcal{F}_N\big] = \exp\left\{\theta\, z^\top (\Gamma^{(N)})^{-1} B^{(N)} + \frac{\theta^2}{2}\left(\int_0^T h_T(t)\,dt - z^\top (\Gamma^{(N)})^{-1} z\right) - \frac{\theta^2}{2}\int_0^T h_T(s)\,ds\right\} = L_N(\theta).$$
Thus, (9) is proved.

Corollary 3.4. The maximum likelihood estimator of $\theta$ by continuous observations is given by
$$\hat\theta_T = \frac{\int_0^T h_T(t)\,dX_t}{\int_0^T h_T(t)\,dt}. \tag{11}$$

3.2. Properties of the estimator

It follows immediately from (11) that the maximum likelihood estimator $\hat\theta_T$ is equal to
$$\hat\theta_T = \theta + \frac{\int_0^T h_T(t)\,dB_t}{\int_0^T h_T(t)\,dt}. \tag{12}$$
Proposition 3.5. The estimator $\hat\theta_T$ is unbiased and normally distributed. Its variance is equal to
$$\operatorname{var}\hat\theta_T = \mathbb{E}\big(\hat\theta_T - \theta\big)^2 = \left(\int_0^T h_T(t)\,dt\right)^{-1}. \tag{13}$$

Proof. Unbiasedness and normality follow from the fact that $\hat\theta_T - \theta$ is a linear functional of the centered Gaussian process $B$. By (7),
$$\operatorname{var}\left[\int_0^T h_T(t)\,dB_t\right] = \int_0^T (\mathcal{T} h_T)(t)\,h_T(t)\,dt = \int_0^T \mathbf{1}_{[0,T]}(t)\,h_T(t)\,dt = \int_0^T h_T(t)\,dt.$$
Thus, equation (13) immediately follows from (12).

Corollary 3.6. Let the process $B = \{B_t, t \ge 0\}$ satisfy Assumptions 1 and 2. If
$$\int_0^T h_T(t)\,dt \to \infty \quad \text{as } T \to +\infty, \tag{14}$$
then the maximum likelihood estimator $\hat\theta_T$ is mean-square consistent, i.e., $\mathbb{E}(\hat\theta_T - \theta)^2 \to 0$ as $T \to +\infty$.

It can be hard to verify condition (14). The following result gives sufficient conditions for consistency in terms of the autocovariance function of $B$.

Theorem 3.7. Let the process $B = \{B_t, t \ge 0\}$ satisfy Assumptions 1 and 2. If the covariance of the increments tends to zero, $\mathbb{E}(B_{N+1} - B_N) B_1 \to 0$ as $N \to \infty$, then the maximum likelihood estimator $\hat\theta_T$ is mean-square consistent.

Proof. The estimator $\hat\theta_N$ from the discrete sample $\{X_1, \dots, X_N\}$ (regular grid with $h = 1$) is mean-square consistent by Theorem 2.5. The estimator from the continuous-time sample $\{X_t, t \in [0, T]\}$ is unbiased. Now we compare the variances of the discrete and continuous-time estimators. Suppose that $T \ge 1$ and $N$ is the integer such that $N \le T < N + 1$. By (13) and Lemma 2.3 we have
$$\operatorname{var}\hat\theta_T = \left(\int_0^T h_T(t)\,dt\right)^{-1}, \qquad \operatorname{var}\hat\theta_N = \left(z^\top (\Gamma^{(N)})^{-1} z\right)^{-1}. \tag{15}$$
As $\int_0^T h_T(t)\,dt \ge z^\top (\Gamma^{(N)})^{-1} z$ (the conditional variance in the proof of Theorem 3.3 is nonnegative), we have $\operatorname{var}\hat\theta_T \le \operatorname{var}\hat\theta_N$, and
$$\lim_{T\to+\infty} \mathbb{E}\big(\hat\theta_T - \theta\big)^2 \le \lim_{N\to\infty} \mathbb{E}\big(\hat\theta_N - \theta\big)^2 = 0.$$

To prove the strong consistency of $\hat\theta_T$, we need the following auxiliary result.

Lemma 3.8. Let the process $B$ satisfy the conditions of Theorem 3.3. Then the estimator process $\hat\theta = \{\hat\theta_T, T > 0\}$ has independent increments.

Proof. Let $T_2 \le T_3$. Then
$$\mathbb{E}\left[\int_0^{T_3} h_{T_3}(t)\,dB_t \int_0^{T_2} h_{T_2}(s)\,dB_s\right] = \int_0^{T_2} (\mathcal{T} h_{T_3})(t)\,h_{T_2}(t)\,dt = \int_0^{T_2} \mathbf{1}_{[0,T_3]}(t)\,h_{T_2}(t)\,dt = \int_0^{T_2} h_{T_2}(t)\,dt.$$
Thus, if $0 < T_1 \le T_2 \le T_3 \le T_4$, then for $T_j \le T_i$
$$\operatorname{cov}\big(\hat\theta_{T_i}, \hat\theta_{T_j}\big) = \frac{\int_0^{T_j} h_{T_j}(t)\,dt}{\int_0^{T_i} h_{T_i}(t)\,dt \int_0^{T_j} h_{T_j}(t)\,dt} = \left(\int_0^{T_i} h_{T_i}(t)\,dt\right)^{-1},$$
whence
$$\mathbb{E}\big(\hat\theta_{T_4} - \hat\theta_{T_3}\big)\big(\hat\theta_{T_2} - \hat\theta_{T_1}\big) - \theta^2\cdot 0 = \left(\int_0^{T_4} h_{T_4}\,dt\right)^{-1} - \left(\int_0^{T_4} h_{T_4}\,dt\right)^{-1} - \left(\int_0^{T_3} h_{T_3}\,dt\right)^{-1} + \left(\int_0^{T_3} h_{T_3}\,dt\right)^{-1} = 0.$$
Similarly to the proof of Lemma 2.6, the random process $\hat\theta_T$ is Gaussian and its increments are proved to be uncorrelated, so they are independent.

Theorem 3.9. Under the conditions of Theorem 3.7, the estimator $\hat\theta_T$ is strongly consistent.

Proof. By Kolmogorov's inequality, for any $\varepsilon > 0$ and $t > 0$,
$$P\Big(\sup_{T>t} \big|\hat\theta_T - \theta\big| > \varepsilon\Big) \le P\Big(\big|\hat\theta_t - \theta\big| > \tfrac{\varepsilon}{2}\Big) + P\Big(\sup_{T>t} \big|\hat\theta_T - \hat\theta_t\big| > \tfrac{\varepsilon}{2}\Big) \le \frac{4}{\varepsilon^2}\operatorname{var}\hat\theta_t + \frac{4}{\varepsilon^2}\lim_{T\to+\infty}\operatorname{var}\big(\hat\theta_T - \hat\theta_t\big) \le \frac{8}{\varepsilon^2}\operatorname{var}\hat\theta_t.$$
By Theorem 3.7,
$$\lim_{t\to+\infty} P\Big(\sup_{T>t}\big|\hat\theta_T - \theta\big| > \varepsilon\Big) = 0 \quad \text{for all } \varepsilon > 0,$$
whence the strong consistency follows.

Remark 3.10. The Brownian motion does not satisfy Assumption 1: for the covariance function $\min(s,t)$ of the Wiener process, $\frac{\partial}{\partial t}\min(s,t)$ is not continuous in $s$. So we extend our model so that it can handle the Wiener process. Let the process $B$ be a sum of two independent random processes,
$$B_t = B^C_t + W_t, \tag{16}$$
where $B^C$ satisfies Assumption 1 and $W$ is a standard Wiener process. Let us look at how the statements change if the process $B$ admits representation (16) (Assumption 1 for $B$ itself is dropped). Lemma 3.1 changes as follows:
$$\mathbb{E}\left[\int_0^T f(t)\,dB_t \int_0^T g(s)\,dB_s\right] = \int_0^T\!\!\int_0^T f(t)\,K(t-s)\,g(s)\,ds\,dt + \int_0^T f(t)\,g(t)\,dt.$$
Equation (7) remains true if we set
$$(\mathcal{T} f)(t) = f(t) + \int_0^T K(t-s)\, f(s)\,ds.$$
Lemma 3.2 remains true with $\|\mathcal{T}\| \le \|K\|_{L_1[-T,T]} + 1$. Theorem 3.3 holds true; minor changes in the proof are required.
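On a regular grid the extension (16) has a transparent matrix form: the increment covariance of $B = B^C + W$ is the covariance of the $B^C$ increments plus $h$ times the identity, and the MLE variance $1/(z^\top \Gamma^{-1} z)$ can be computed directly. The following sketch (our illustration, assuming NumPy, with $B^C$ a fractional Brownian motion) checks that the variance exceeds the pure-Wiener value $1/(Nh) = 1/T$ and still decreases with the sample size.

```python
import numpy as np

def gamma_fbm(N, H, h):
    """Toeplitz covariance of fBm increments on a regular grid:
    rho(k) = h^{2H}/2 ((k+1)^{2H} + |k-1|^{2H} - 2 k^{2H})."""
    k = np.arange(N)
    rho = 0.5 * h**(2*H) * ((k + 1.0)**(2*H) + np.abs(k - 1.0)**(2*H) - 2.0 * k**(2*H))
    return rho[np.abs(np.subtract.outer(k, k))]

def var_mle(N, H, h=1.0):
    """var(theta_hat_N) = 1 / (z' Gamma^{-1} z) for B = W + B^H."""
    Gamma = h * np.eye(N) + gamma_fbm(N, H, h)   # Wiener part + fBm part
    z = np.full(N, h)
    return 1.0 / (z @ np.linalg.solve(Gamma, z))

vs = [var_mle(N, H=0.7) for N in (10, 100, 1000)]
# the extra fBm noise pushes the variance above the pure-Wiener level 1/(N h),
# while mean-square consistency still shows as a decreasing variance
assert all(v > 1.0 / N for v, N in zip(vs, (10, 100, 1000)))
assert vs[0] > vs[1] > vs[2]
```

The inequality $\operatorname{var}\hat\theta_N \ge 1/(Nh)$ follows from $\Gamma \succeq hI$, so the numerical check mirrors the comparison argument used for the continuous-time estimator.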
[Table 1: The means and variances of $\hat\theta_T$. Columns: $H$, $T$, sample mean, sample variance, theoretical variance.]

Examples

Example 3.11. Let $B$ be a fractional Brownian motion with Hurst index $H \in (1/2, 1)$. Then $K(t) = H(2H-1)|t|^{2H-2}$. We denote by $\mathcal{T}_H$ the corresponding operator $\mathcal{T}$. Then for the function
$$h_T(s) = C_H\, s^{1/2-H}\,(T-s)^{1/2-H}, \qquad C_H = \Big(H(2H-1)\,\mathrm{B}\big(H - \tfrac{1}{2},\, \tfrac{3}{2} - H\big)\Big)^{-1},$$
where $\mathrm{B}(\cdot,\cdot)$ denotes the beta function, we have $\mathcal{T}_H h_T = \mathbf{1}_{[0,T]}$, see Norros et al. (1999). The maximum likelihood estimator is given by
$$\hat\theta_T = \frac{T^{2H-2}}{\mathrm{B}\big(\tfrac{3}{2}-H,\, \tfrac{3}{2}-H\big)} \int_0^T s^{1/2-H}\,(T-s)^{1/2-H}\,dX_s.$$

Example 3.12. Consider the following model:
$$X_t = \theta t + W_t + B^H_t, \tag{17}$$
where $W$ is a standard Wiener process, $B^H$ is a fractional Brownian motion with Hurst index $H \in (1/2, 1)$, and the random processes $W_t$ and $B^H_t$ are independent. The process $W_t + B^H_t$ admits representation (16) with $B^C = B^H$. The corresponding operator is $\mathcal{T} = I + \mathcal{T}_H$ (see Example 3.11 for the definition of $\mathcal{T}_H$). The operator $\mathcal{T}_H$ is self-adjoint and positive semi-definite. Hence, the operator $\mathcal{T}$ is invertible, and Assumption 2 holds true. The function $h_T = \mathcal{T}^{-1}\mathbf{1}_{[0,T]}$ can be evaluated iteratively:
$$h_T = \frac{2}{\|\mathcal{T}_H\| + 2} \sum_{k=0}^{\infty} \left(\frac{\|\mathcal{T}_H\|\, I - 2\,\mathcal{T}_H}{\|\mathcal{T}_H\| + 2}\right)^{k} \mathbf{1}_{[0,T]}. \tag{18}$$

Simulations

We illustrate the behavior of the maximum likelihood estimator for Example 3.12 by means of simulation experiments. For $T = 10$ and $T = 100$ and various values of $H$ we find $h_T$ iteratively by (18). Then for $\theta = 2$ we simulate realizations of the process (17) for each $H$ and compute the estimates by (11). The means and variances of these estimates are reported in Table 1. The theoretical variances calculated by (13) are also presented. These simulation studies confirm the theoretical properties of $\hat\theta_T$, especially unbiasedness and consistency.
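The series (18) can be summed on a grid. Below is our discretization sketch (not the paper's code, assuming NumPy): the operator $\mathcal{T}_H$ is approximated by a matrix of exact per-cell integrals of the kernel $K$, the norm $\|\mathcal{T}_H\|$ is estimated by the matrix 2-norm, and the series is evaluated in its equivalent fixed-point form; the residual check confirms that the result solves $(I + \mathcal{T}_H) h_T = \mathbf{1}_{[0,T]}$.

```python
import numpy as np

H, T, n = 0.7, 1.0, 400
edges = np.linspace(0.0, T, n + 1)
mid = 0.5 * (edges[:-1] + edges[1:])
# antiderivative of K(u) = H(2H-1)|u|^{2H-2} is G(u) = H sign(u) |u|^{2H-1}
G = lambda u: H * np.sign(u) * np.abs(u)**(2*H - 1)
# W[i, j] = integral of K(t_i - s) over cell j (exact, handles the singularity)
W = G(np.subtract.outer(mid, edges[:-1])) - G(np.subtract.outer(mid, edges[1:]))
lam = np.linalg.norm(W, 2)                 # estimate of ||T_H||
one = np.ones(n)
h = np.zeros(n)
for _ in range(500):                       # fixed-point form of the series (18)
    h = (2.0 * one + lam * h - 2.0 * (W @ h)) / (lam + 2.0)
# the limit satisfies (I + T_H) h = 1 on [0, T]
res = np.max(np.abs(h + W @ h - one))
assert res < 1e-6
```

The fixed point of the iteration satisfies $(2I + 2W)h = 2\cdot\mathbf{1}$, i.e., $(I + W)h = \mathbf{1}$, and the iteration contracts because the spectrum of $(\lambda I - 2W)/(\lambda + 2)$ lies inside $(-1, 1)$ whenever $W$ is positive semi-definite with norm $\lambda$.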
References

Anderson TW (2003). An Introduction to Multivariate Statistical Analysis. Wiley, Hoboken, NJ.

Bertin K, Torres S, Tudor CA (2011). "Maximum-likelihood Estimators and Random Walks in Long Memory Models." Statistics, 45(4).

Cai C, Chigansky P, Kleptsyna M (2016). "Mixed Gaussian Processes: A Filtering Approach." Ann. Probab., 44(4).

Cramér H, Leadbetter MR (2004). Stationary and Related Stochastic Processes: Sample Function Properties and Their Applications. Dover Publications, Inc., Mineola, NY. Reprint of the 1967 original.

Hu Y, Nualart D, Xiao W, Zhang W (2011). "Exact Maximum Likelihood Estimator for Drift Fractional Brownian Motion at Discrete Observation." Acta Math. Sci. Ser. B Engl. Ed., 31(5).

Huang ST, Cambanis S (1978). "Stochastic and Multiple Wiener Integrals for Gaussian Processes." Ann. Probab., 6(4).

Le Breton A (1998). "Filtering and Parameter Estimation in a Simple Linear System Driven by a Fractional Brownian Motion." Statist. Probab. Lett., 38(3).

Mishura Y (2008). Stochastic Calculus for Fractional Brownian Motion and Related Processes, volume 1929 of Lecture Notes in Mathematics. Springer.

Mishura Y (2016). "Maximum Likelihood Drift Estimation for the Mixing of Two Fractional Brownian Motions." In Stochastic and Infinite Dimensional Analysis. Springer.

Mishura Y, Voronov I (2015). "Construction of Maximum Likelihood Estimator in the Mixed Fractional-Fractional Brownian Motion Model with Double Long-range Dependence." Mod. Stoch. Theory Appl., 2(2).

Norros I, Valkeila E, Virtamo J (1999). "An Elementary Approach to a Girsanov Formula and Other Analytical Results on Fractional Brownian Motions." Bernoulli, 5(4).

Rao CR (2002). Linear Statistical Inference and its Applications. Second edition. Wiley, New York.

Stiburek D (2017). "Statistical Inference on the Drift Parameter in Fractional Brownian Motion with a Deterministic Drift." Comm. Statist. Theory Methods, 46.

Affiliation:
Yuliya Mishura, Kostiantyn Ralchenko, Sergiy Shklyar
Department of Probability Theory, Statistics and Actuarial Mathematics
Taras Shevchenko National University of Kyiv
64 Volodymyrska, 01601 Kyiv, Ukraine
E-mail: myus@univ.kiev.ua, k.ralchenko@gmail.com, shklyar@univ.kiev.ua

Austrian Journal of Statistics, Volume 46, April 2017.
Published by the Austrian Society of Statistics.
More informationCOVARIANCE IDENTITIES AND MIXING OF RANDOM TRANSFORMATIONS ON THE WIENER SPACE
Communications on Stochastic Analysis Vol. 4, No. 3 (21) 299-39 Serials Publications www.serialspublications.com COVARIANCE IDENTITIES AND MIXING OF RANDOM TRANSFORMATIONS ON THE WIENER SPACE NICOLAS PRIVAULT
More informationThe concentration of a drug in blood. Exponential decay. Different realizations. Exponential decay with noise. dc(t) dt.
The concentration of a drug in blood Exponential decay C12 concentration 2 4 6 8 1 C12 concentration 2 4 6 8 1 dc(t) dt = µc(t) C(t) = C()e µt 2 4 6 8 1 12 time in minutes 2 4 6 8 1 12 time in minutes
More informationExercises. T 2T. e ita φ(t)dt.
Exercises. Set #. Construct an example of a sequence of probability measures P n on R which converge weakly to a probability measure P but so that the first moments m,n = xdp n do not converge to m = xdp.
More information9 Brownian Motion: Construction
9 Brownian Motion: Construction 9.1 Definition and Heuristics The central limit theorem states that the standard Gaussian distribution arises as the weak limit of the rescaled partial sums S n / p n of
More informationA connection between the stochastic heat equation and fractional Brownian motion, and a simple proof of a result of Talagrand
A connection between the stochastic heat equation and fractional Brownian motion, and a simple proof of a result of Talagrand Carl Mueller 1 and Zhixin Wu Abstract We give a new representation of fractional
More informationConditional Full Support for Gaussian Processes with Stationary Increments
Conditional Full Support for Gaussian Processes with Stationary Increments Tommi Sottinen University of Vaasa Kyiv, September 9, 2010 International Conference Modern Stochastics: Theory and Applications
More informationUNCERTAINTY FUNCTIONAL DIFFERENTIAL EQUATIONS FOR FINANCE
Surveys in Mathematics and its Applications ISSN 1842-6298 (electronic), 1843-7265 (print) Volume 5 (2010), 275 284 UNCERTAINTY FUNCTIONAL DIFFERENTIAL EQUATIONS FOR FINANCE Iuliana Carmen Bărbăcioru Abstract.
More informationSolution for Problem 7.1. We argue by contradiction. If the limit were not infinite, then since τ M (ω) is nondecreasing we would have
362 Problem Hints and Solutions sup g n (ω, t) g(ω, t) sup g(ω, s) g(ω, t) µ n (ω). t T s,t: s t 1/n By the uniform continuity of t g(ω, t) on [, T], one has for each ω that µ n (ω) as n. Two applications
More informationOn pathwise stochastic integration
On pathwise stochastic integration Rafa l Marcin Lochowski Afican Institute for Mathematical Sciences, Warsaw School of Economics UWC seminar Rafa l Marcin Lochowski (AIMS, WSE) On pathwise stochastic
More informationON THE PATHWISE UNIQUENESS OF SOLUTIONS OF STOCHASTIC DIFFERENTIAL EQUATIONS
PORTUGALIAE MATHEMATICA Vol. 55 Fasc. 4 1998 ON THE PATHWISE UNIQUENESS OF SOLUTIONS OF STOCHASTIC DIFFERENTIAL EQUATIONS C. Sonoc Abstract: A sufficient condition for uniqueness of solutions of ordinary
More informationLecture 21 Representations of Martingales
Lecture 21: Representations of Martingales 1 of 11 Course: Theory of Probability II Term: Spring 215 Instructor: Gordan Zitkovic Lecture 21 Representations of Martingales Right-continuous inverses Let
More informationPROBABILITY: LIMIT THEOREMS II, SPRING HOMEWORK PROBLEMS
PROBABILITY: LIMIT THEOREMS II, SPRING 15. HOMEWORK PROBLEMS PROF. YURI BAKHTIN Instructions. You are allowed to work on solutions in groups, but you are required to write up solutions on your own. Please
More informationLecture 12. F o s, (1.1) F t := s>t
Lecture 12 1 Brownian motion: the Markov property Let C := C(0, ), R) be the space of continuous functions mapping from 0, ) to R, in which a Brownian motion (B t ) t 0 almost surely takes its value. Let
More informationHomogeneous Stochastic Differential Equations
WDS'1 Proceedings of Contributed Papers, Part I, 195 2, 21. ISBN 978-8-7378-139-2 MATFYZPRESS Homogeneous Stochastic Differential Equations J. Bártek Charles University, Faculty of Mathematics and Physics,
More informationExpectation. DS GA 1002 Probability and Statistics for Data Science. Carlos Fernandez-Granda
Expectation DS GA 1002 Probability and Statistics for Data Science http://www.cims.nyu.edu/~cfgranda/pages/dsga1002_fall17 Carlos Fernandez-Granda Aim Describe random variables with a few numbers: mean,
More informationA Short Introduction to Diffusion Processes and Ito Calculus
A Short Introduction to Diffusion Processes and Ito Calculus Cédric Archambeau University College, London Center for Computational Statistics and Machine Learning c.archambeau@cs.ucl.ac.uk January 24,
More informationAsymptotic Properties of an Approximate Maximum Likelihood Estimator for Stochastic PDEs
Asymptotic Properties of an Approximate Maximum Likelihood Estimator for Stochastic PDEs M. Huebner S. Lototsky B.L. Rozovskii In: Yu. M. Kabanov, B. L. Rozovskii, and A. N. Shiryaev editors, Statistics
More informationGAUSSIAN PROCESSES; KOLMOGOROV-CHENTSOV THEOREM
GAUSSIAN PROCESSES; KOLMOGOROV-CHENTSOV THEOREM STEVEN P. LALLEY 1. GAUSSIAN PROCESSES: DEFINITIONS AND EXAMPLES Definition 1.1. A standard (one-dimensional) Wiener process (also called Brownian motion)
More informationConvergence rates in weighted L 1 spaces of kernel density estimators for linear processes
Alea 4, 117 129 (2008) Convergence rates in weighted L 1 spaces of kernel density estimators for linear processes Anton Schick and Wolfgang Wefelmeyer Anton Schick, Department of Mathematical Sciences,
More information1 Introduction. 2 Diffusion equation and central limit theorem. The content of these notes is also covered by chapter 3 section B of [1].
1 Introduction The content of these notes is also covered by chapter 3 section B of [1]. Diffusion equation and central limit theorem Consider a sequence {ξ i } i=1 i.i.d. ξ i = d ξ with ξ : Ω { Dx, 0,
More informationMalliavin calculus and central limit theorems
Malliavin calculus and central limit theorems David Nualart Department of Mathematics Kansas University Seminar on Stochastic Processes 2017 University of Virginia March 8-11 2017 David Nualart (Kansas
More informationExpectation. DS GA 1002 Statistical and Mathematical Models. Carlos Fernandez-Granda
Expectation DS GA 1002 Statistical and Mathematical Models http://www.cims.nyu.edu/~cfgranda/pages/dsga1002_fall16 Carlos Fernandez-Granda Aim Describe random variables with a few numbers: mean, variance,
More informationMULTIDIMENSIONAL WICK-ITÔ FORMULA FOR GAUSSIAN PROCESSES
MULTIDIMENSIONAL WICK-ITÔ FORMULA FOR GAUSSIAN PROCESSES D. NUALART Department of Mathematics, University of Kansas Lawrence, KS 6645, USA E-mail: nualart@math.ku.edu S. ORTIZ-LATORRE Departament de Probabilitat,
More informationSupermodular ordering of Poisson arrays
Supermodular ordering of Poisson arrays Bünyamin Kızıldemir Nicolas Privault Division of Mathematical Sciences School of Physical and Mathematical Sciences Nanyang Technological University 637371 Singapore
More informationChapter 3. Point Estimation. 3.1 Introduction
Chapter 3 Point Estimation Let (Ω, A, P θ ), P θ P = {P θ θ Θ}be probability space, X 1, X 2,..., X n : (Ω, A) (IR k, B k ) random variables (X, B X ) sample space γ : Θ IR k measurable function, i.e.
More informationStochastic Processes. M. Sami Fadali Professor of Electrical Engineering University of Nevada, Reno
Stochastic Processes M. Sami Fadali Professor of Electrical Engineering University of Nevada, Reno 1 Outline Stochastic (random) processes. Autocorrelation. Crosscorrelation. Spectral density function.
More informationStochastic Differential Equations.
Chapter 3 Stochastic Differential Equations. 3.1 Existence and Uniqueness. One of the ways of constructing a Diffusion process is to solve the stochastic differential equation dx(t) = σ(t, x(t)) dβ(t)
More informationSome Properties of NSFDEs
Chenggui Yuan (Swansea University) Some Properties of NSFDEs 1 / 41 Some Properties of NSFDEs Chenggui Yuan Swansea University Chenggui Yuan (Swansea University) Some Properties of NSFDEs 2 / 41 Outline
More informationMi-Hwa Ko. t=1 Z t is true. j=0
Commun. Korean Math. Soc. 21 (2006), No. 4, pp. 779 786 FUNCTIONAL CENTRAL LIMIT THEOREMS FOR MULTIVARIATE LINEAR PROCESSES GENERATED BY DEPENDENT RANDOM VECTORS Mi-Hwa Ko Abstract. Let X t be an m-dimensional
More information1. Stochastic Processes and filtrations
1. Stochastic Processes and 1. Stoch. pr., A stochastic process (X t ) t T is a collection of random variables on (Ω, F) with values in a measurable space (S, S), i.e., for all t, In our case X t : Ω S
More informationLecture 22 Girsanov s Theorem
Lecture 22: Girsanov s Theorem of 8 Course: Theory of Probability II Term: Spring 25 Instructor: Gordan Zitkovic Lecture 22 Girsanov s Theorem An example Consider a finite Gaussian random walk X n = n
More informationA Primer on Asymptotics
A Primer on Asymptotics Eric Zivot Department of Economics University of Washington September 30, 2003 Revised: October 7, 2009 Introduction The two main concepts in asymptotic theory covered in these
More informationSolutions to Odd-Numbered End-of-Chapter Exercises: Chapter 14
Introduction to Econometrics (3 rd Updated Edition) by James H. Stock and Mark W. Watson Solutions to Odd-Numbered End-of-Chapter Exercises: Chapter 14 (This version July 0, 014) 015 Pearson Education,
More information1 Brownian Local Time
1 Brownian Local Time We first begin by defining the space and variables for Brownian local time. Let W t be a standard 1-D Wiener process. We know that for the set, {t : W t = } P (µ{t : W t = } = ) =
More informationLarge Deviations for Small-Noise Stochastic Differential Equations
Chapter 21 Large Deviations for Small-Noise Stochastic Differential Equations This lecture is at once the end of our main consideration of diffusions and stochastic calculus, and a first taste of large
More informationGaussian Processes. 1. Basic Notions
Gaussian Processes 1. Basic Notions Let T be a set, and X : {X } T a stochastic process, defined on a suitable probability space (Ω P), that is indexed by T. Definition 1.1. We say that X is a Gaussian
More informationCovariance function estimation in Gaussian process regression
Covariance function estimation in Gaussian process regression François Bachoc Department of Statistics and Operations Research, University of Vienna WU Research Seminar - May 2015 François Bachoc Gaussian
More informationIEOR 4701: Stochastic Models in Financial Engineering. Summer 2007, Professor Whitt. SOLUTIONS to Homework Assignment 9: Brownian motion
IEOR 471: Stochastic Models in Financial Engineering Summer 27, Professor Whitt SOLUTIONS to Homework Assignment 9: Brownian motion In Ross, read Sections 1.1-1.3 and 1.6. (The total required reading there
More informationSystem Identification, Lecture 4
System Identification, Lecture 4 Kristiaan Pelckmans (IT/UU, 2338) Course code: 1RT880, Report code: 61800 - Spring 2016 F, FRI Uppsala University, Information Technology 13 April 2016 SI-2016 K. Pelckmans
More informationSTOCHASTIC GEOMETRY BIOIMAGING
CENTRE FOR STOCHASTIC GEOMETRY AND ADVANCED BIOIMAGING 2018 www.csgb.dk RESEARCH REPORT Anders Rønn-Nielsen and Eva B. Vedel Jensen Central limit theorem for mean and variogram estimators in Lévy based
More information-variation of the divergence integral w.r.t. fbm with Hurst parameter H < 1 2
/4 On the -variation of the divergence integral w.r.t. fbm with urst parameter < 2 EL ASSAN ESSAKY joint work with : David Nualart Cadi Ayyad University Poly-disciplinary Faculty, Safi Colloque Franco-Maghrébin
More informationA Few Notes on Fisher Information (WIP)
A Few Notes on Fisher Information (WIP) David Meyer dmm@{-4-5.net,uoregon.edu} Last update: April 30, 208 Definitions There are so many interesting things about Fisher Information and its theoretical properties
More informationHarmonic Analysis. 1. Hermite Polynomials in Dimension One. Recall that if L 2 ([0 2π] ), then we can write as
Harmonic Analysis Recall that if L 2 ([0 2π] ), then we can write as () Z e ˆ (3.) F:L where the convergence takes place in L 2 ([0 2π] ) and ˆ is the th Fourier coefficient of ; that is, ˆ : (2π) [02π]
More informationExact Simulation of Multivariate Itô Diffusions
Exact Simulation of Multivariate Itô Diffusions Jose Blanchet Joint work with Fan Zhang Columbia and Stanford July 7, 2017 Jose Blanchet (Columbia/Stanford) Exact Simulation of Diffusions July 7, 2017
More informationResearch Statement. Mamikon S. Ginovyan. Boston University
Research Statement Mamikon S. Ginovyan Boston University 1 Abstract My research interests and contributions mostly are in the areas of prediction, estimation and hypotheses testing problems for second
More informationHan-Ying Liang, Dong-Xia Zhang, and Jong-Il Baek
J. Korean Math. Soc. 41 (2004), No. 5, pp. 883 894 CONVERGENCE OF WEIGHTED SUMS FOR DEPENDENT RANDOM VARIABLES Han-Ying Liang, Dong-Xia Zhang, and Jong-Il Baek Abstract. We discuss in this paper the strong
More informationEULER MARUYAMA APPROXIMATION FOR SDES WITH JUMPS AND NON-LIPSCHITZ COEFFICIENTS
Qiao, H. Osaka J. Math. 51 (14), 47 66 EULER MARUYAMA APPROXIMATION FOR SDES WITH JUMPS AND NON-LIPSCHITZ COEFFICIENTS HUIJIE QIAO (Received May 6, 11, revised May 1, 1) Abstract In this paper we show
More information14 - Gaussian Stochastic Processes
14-1 Gaussian Stochastic Processes S. Lall, Stanford 211.2.24.1 14 - Gaussian Stochastic Processes Linear systems driven by IID noise Evolution of mean and covariance Example: mass-spring system Steady-state
More informationDiscretization of SDEs: Euler Methods and Beyond
Discretization of SDEs: Euler Methods and Beyond 09-26-2006 / PRisMa 2006 Workshop Outline Introduction 1 Introduction Motivation Stochastic Differential Equations 2 The Time Discretization of SDEs Monte-Carlo
More informationON THE FIRST TIME THAT AN ITO PROCESS HITS A BARRIER
ON THE FIRST TIME THAT AN ITO PROCESS HITS A BARRIER GERARDO HERNANDEZ-DEL-VALLE arxiv:1209.2411v1 [math.pr] 10 Sep 2012 Abstract. This work deals with first hitting time densities of Ito processes whose
More informationThe Cameron-Martin-Girsanov (CMG) Theorem
The Cameron-Martin-Girsanov (CMG) Theorem There are many versions of the CMG Theorem. In some sense, there are many CMG Theorems. The first version appeared in ] in 944. Here we present a standard version,
More informationReview. DS GA 1002 Statistical and Mathematical Models. Carlos Fernandez-Granda
Review DS GA 1002 Statistical and Mathematical Models http://www.cims.nyu.edu/~cfgranda/pages/dsga1002_fall16 Carlos Fernandez-Granda Probability and statistics Probability: Framework for dealing with
More informationVariations. ECE 6540, Lecture 10 Maximum Likelihood Estimation
Variations ECE 6540, Lecture 10 Last Time BLUE (Best Linear Unbiased Estimator) Formulation Advantages Disadvantages 2 The BLUE A simplification Assume the estimator is a linear system For a single parameter
More informationMultiple points of the Brownian sheet in critical dimensions
Multiple points of the Brownian sheet in critical dimensions Robert C. Dalang Ecole Polytechnique Fédérale de Lausanne Based on joint work with: Carl Mueller Multiple points of the Brownian sheet in critical
More informationWHITE NOISE APPROACH TO THE ITÔ FORMULA FOR THE STOCHASTIC HEAT EQUATION
Communications on Stochastic Analysis Vol. 1, No. 2 (27) 311-32 WHITE NOISE APPROACH TO THE ITÔ FORMULA FOR THE STOCHASTIC HEAT EQUATION ALBERTO LANCONELLI Abstract. We derive an Itô s-type formula for
More informationarxiv: v2 [math.pr] 27 Oct 2015
A brief note on the Karhunen-Loève expansion Alen Alexanderian arxiv:1509.07526v2 [math.pr] 27 Oct 2015 October 28, 2015 Abstract We provide a detailed derivation of the Karhunen Loève expansion of a stochastic
More informationCitation Osaka Journal of Mathematics. 41(4)
TitleA non quasi-invariance of the Brown Authors Sadasue, Gaku Citation Osaka Journal of Mathematics. 414 Issue 4-1 Date Text Version publisher URL http://hdl.handle.net/1194/1174 DOI Rights Osaka University
More informationProperties of the Autocorrelation Function
Properties of the Autocorrelation Function I The autocorrelation function of a (real-valued) random process satisfies the following properties: 1. R X (t, t) 0 2. R X (t, u) =R X (u, t) (symmetry) 3. R
More informationLarge Deviations for Small-Noise Stochastic Differential Equations
Chapter 22 Large Deviations for Small-Noise Stochastic Differential Equations This lecture is at once the end of our main consideration of diffusions and stochastic calculus, and a first taste of large
More informationMAS223 Statistical Inference and Modelling Exercises
MAS223 Statistical Inference and Modelling Exercises The exercises are grouped into sections, corresponding to chapters of the lecture notes Within each section exercises are divided into warm-up questions,
More information