Strictly Stationary Solutions of Autoregressive Moving Average Equations


Peter J. Brockwell    Alexander Lindner

Abstract

Necessary and sufficient conditions for the existence of a strictly stationary solution of the equations defining an ARMA(p, q) process driven by an independent and identically distributed noise sequence are determined. No moment assumptions on the driving noise sequence are made.

Keywords: ARMA process, heavy tails, infinite variance, strict stationarity

1 Introduction

A strict ARMA(p, q) process (autoregressive moving average process of order (p, q)) is defined as a complex-valued strictly stationary solution Y = (Y_t)_{t ∈ Z} of the difference equations

    Φ(B)Y_t = Θ(B)Z_t,    t ∈ Z,    (1.1)

where B denotes the backward shift operator, (Z_t)_{t ∈ Z} is an independent and identically distributed (i.i.d.) complex-valued sequence, and Φ(z) and Θ(z) are the polynomials

    Φ(z) := 1 − φ_1 z − ... − φ_p z^p,    Θ(z) := 1 + θ_1 z + ... + θ_q z^q,    z ∈ C,

with φ_1, ..., φ_p ∈ C, θ_1, ..., θ_q ∈ C, φ_0 := 1, θ_0 := 1, φ_p ≠ 0 and θ_q ≠ 0.

A great deal of attention has been devoted to weak ARMA(p, q) processes, which are defined as weakly stationary solutions of the difference equations (1.1) with (Z_t)_{t ∈ Z} assumed only to be white noise, i.e. an uncorrelated sequence of random variables with constant

Colorado State University, Fort Collins, Colorado. pjbrock@stat.colostate.edu. Support of this work by NSF Grant DMS is gratefully acknowledged.
Institut für Mathematische Stochastik, TU Braunschweig, Pockelsstraße 14, D-38106 Braunschweig, Germany. a.lindner@tu-bs.de

mean µ and variance σ² > 0. It is easy to show, see e.g. Brockwell and Davis [1], that if (Y_t)_{t ∈ Z} is a weakly stationary solution, then it has a spectral density f_Y which satisfies

    |Φ(e^{−iω})|² f_Y(ω) = |Θ(e^{−iω})|² σ²/(2π),    ω ∈ [−π, π].    (1.2)

Since f_Y must be integrable on [−π, π], this shows that any zero of Φ(·) on the unit circle must be removable, i.e. if |λ| = 1 and λ is a zero of order m of Φ(·), then λ must also be a zero of Θ(·) of order at least m. Conversely, if all the zeroes of Φ(z) on the unit circle are removable and (Z_t)_{t ∈ Z} is white noise, then there exists a weakly stationary solution of (1.1), namely the unique weakly stationary solution of the equation obtained from (1.1) by cancelling the factors of Φ(B) corresponding to the zeroes on the unit circle with the corresponding factors of Θ(B). If Φ(z) has no zeroes on the unit circle then the weakly stationary solution is unique. Provided the probability space on which (Z_t)_{t ∈ Z} is defined is rich enough to support a random variable which is uniformly distributed on [0, 1] and uncorrelated with (Z_t)_{t ∈ Z}, there is a unique weakly stationary solution of (1.1) only if Φ(z) has no zeroes on the unit circle. (The last statement can be proved using a slight modification of the argument in Lemma 2 below.)

In recent years the study of heavy-tailed and asymmetric time series models has become of particular importance, especially in mathematical finance, where series of daily log returns on assets (the daily log return at time t is defined as log P_t − log P_{t−1}, where P_t is the closing price of the asset on day t) show clear evidence of dependence, heavy tails and asymmetry. Daily realized volatility series also exhibit these features. The interest in ARMA processes as defined in the first paragraph has increased accordingly, especially those driven by i.i.d. noise with possible asymmetry and heavy tails. Although sufficient conditions for the existence of strictly stationary solutions of (1.1) under the assumption of i.i.d.
(Z_t)_{t ∈ Z} with not necessarily finite variance have been given in the past (see e.g. Cline and Brockwell [3], where it is assumed that the tails of the distribution of Z_t are regularly varying and that neither Φ nor Θ has a zero on the unit circle), necessary conditions are much more difficult to prescribe, since the simple argument based on the spectral density in (1.2) does not apply. Yohai and Maronna [7], in their study of estimation for heavy-tailed autoregressive processes, make the assumption that

    E log^+ |Z_t| < ∞    (1.3)

to ensure stationarity, and remark that it seems difficult to relax it. More recently Mikosch et al. [6] and Davis [4] have investigated parameter estimation for ARMA processes for which Z_t is in the domain of attraction of a stable law. In spite of the considerable interest in heavy-tailed ARMA processes there appears, however, to have been no

systematic investigation of precise conditions under which the equation (1.1) has a strictly stationary solution, and of conditions under which such a solution is unique. The purposes of this note are to establish necessary and sufficient conditions on both the i.i.d. noise and the zeroes of the defining polynomials in (1.1) under which a strictly stationary solution (Y_t)_{t ∈ Z} of the equations (1.1) exists, to specify a solution when these conditions are satisfied, and to give necessary and sufficient conditions for its uniqueness. Such conditions are given in Theorem 1, which shows in particular that the condition (1.3) cannot be relaxed except in the trivial case when all the singularities of Θ(z)/Φ(z) are removable. Furthermore, if (Z_t)_{t ∈ Z} is non-deterministic, a strictly stationary solution of (1.1) cannot exist unless all zeroes of Φ on the unit circle are also zeroes of Θ of at least the same multiplicity. Under either of these conditions a strictly stationary solution is given by the expression (familiar from the finite variance case)

    Y_t = Σ_{k=−∞}^{∞} ψ_k Z_{t−k},    (1.4)

where ψ_k is the coefficient of z^k in the Laurent expansion of Θ(z)/Φ(z). If Φ(z) ≠ 0 for all z ∈ C such that |z| = 1, the solution (1.4) is the unique strictly stationary solution of (1.1). If Φ(z) = 0 for some z ∈ C such that |z| = 1 and there exists a strictly stationary solution of (1.1), then we show in Lemma 2 that it is not the only one (assuming that the probability space on which (Z_t)_{t ∈ Z} is defined is rich enough to support a random variable uniformly distributed on [0, 1] and independent of (Z_t)_{t ∈ Z}). Lemma 1, which justifies the cancellation of common factors of Φ(z) and Θ(z) under appropriate conditions, plays a key role in the proof of Theorem 1. A version with less restrictive conditions is established as Theorem 2. If (Z_t)_{t ∈ Z} is deterministic, i.e. if there exists a constant c ∈ C such that P(Z_t = c) = 1 for all t ∈ Z, the conditions are slightly different. They are specified in Theorem 3.
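The moving-average representation Y_t = Σ_k ψ_k Z_{t−k} above is easy to experiment with numerically. The following sketch is my own illustration, not part of the paper: the helper name psi_coefficients and the parameter values Φ(z) = 1 − 0.5z, Θ(z) = 1 + 0.4z are ad hoc choices. It computes the coefficients ψ_k for this causal case, drives the (truncated) sum with i.i.d. standard Cauchy noise, which has infinite variance but satisfies the log-moment condition E log^+ |Z| < ∞, and checks that the simulated path satisfies the defining ARMA difference equation:

```python
import numpy as np

def psi_coefficients(phi, theta, n):
    """First n coefficients psi_0, ..., psi_{n-1} of the expansion of
    Theta(z)/Phi(z), valid when all zeroes of Phi lie outside the unit
    circle (so psi_k = 0 for k < 0), via the standard causal recursion
    psi_k = theta_k + phi_1 psi_{k-1} + ... + phi_p psi_{k-p}."""
    p, q = len(phi), len(theta)
    psi = np.zeros(n)
    for k in range(n):
        th = 1.0 if k == 0 else (theta[k - 1] if k <= q else 0.0)
        psi[k] = th + sum(phi[j - 1] * psi[k - j] for j in range(1, min(p, k) + 1))
    return psi

rng = np.random.default_rng(0)
phi, theta = [0.5], [0.4]        # Phi(z) = 1 - 0.5 z, Theta(z) = 1 + 0.4 z
K = 200                          # truncation point for the infinite sum
psi = psi_coefficients(phi, theta, K + 1)

# Standard Cauchy noise: infinite variance, but E log+ |Z| is finite.
Z = rng.standard_cauchy(1200)
# Truncated version of Y_t = sum_k psi_k Z_{t-k}
Y = np.array([psi @ Z[t - K:t + 1][::-1] for t in range(K, len(Z))])

# The truncated series satisfies the ARMA equation up to an error of
# order |psi_K|, which is geometrically small here.
lhs = Y[1:] - 0.5 * Y[:-1]            # Phi(B) Y_t
rhs = Z[K + 1:] + 0.4 * Z[K:-1]       # Theta(B) Z_t
assert np.max(np.abs(lhs - rhs)) < 1e-8
```

The recursion used here is the standard one for causal ARMA processes; a root of Φ inside the unit circle would instead contribute coefficients ψ_k for negative k, requiring a two-sided truncation of the sum.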
2 Existence of a strictly stationary solution

The following theorem, which is the main result of the paper, gives necessary and sufficient conditions for the existence of a strictly stationary solution of (1.1) with non-deterministic i.i.d. noise (Z_t)_{t ∈ Z}.

Theorem 1. Suppose that (Z_t)_{t ∈ Z} is a non-deterministic i.i.d. sequence. Then the ARMA equation (1.1) admits a strictly stationary solution (Y_t)_{t ∈ Z} if and only if

(i) all singularities of Θ(z)/Φ(z) on the unit circle are removable and E log^+ |Z_1| < ∞, or
(ii) all singularities of Θ(z)/Φ(z) in C are removable.

If (i) or (ii) above holds, then a strictly stationary solution of (1.1) is given by

    Y_t = Σ_{k=−∞}^{∞} ψ_k Z_{t−k},    t ∈ Z,    (2.1)

where

    Σ_{k=−∞}^{∞} ψ_k z^k = Θ(z)/Φ(z),    1 − δ < |z| < 1 + δ, for some δ ∈ (0, 1),

is the Laurent expansion of Θ(z)/Φ(z). The sum in (2.1) converges absolutely almost surely. If Φ does not have a zero on the unit circle, then (2.1) is the unique strictly stationary solution of (1.1).

Before proving Theorem 1 we need to establish conditions under which common factors of Φ(z) and Θ(z) can be cancelled. This is done in the proof of the following lemma.

Lemma 1 [Reduction Lemma]. Suppose that Y = (Y_t)_{t ∈ Z} is a strict ARMA(p, q) process satisfying (1.1). Suppose that λ ∈ C is such that Φ(λ) = Θ(λ) = 0, and define

    Φ_1(z) := Φ(z)/(1 − λ^{−1} z),    Θ_1(z) := Θ(z)/(1 − λ^{−1} z),    z ∈ C.

If |λ| = 1, suppose further that all finite dimensional distributions of Y are symmetric, that Z_0 is symmetric and that Φ_1(λ) = 0, i.e. the multiplicity of the zero λ of Φ is at least 2. Then Y is a strict ARMA(p − 1, q − 1) process with autoregressive polynomial Φ_1 and moving average polynomial Θ_1, i.e.

    Φ_1(B)Y_t = Θ_1(B)Z_t,    t ∈ Z.    (2.2)

(As will be shown in Theorem 2 below, the symmetry conditions imposed in the case |λ| = 1 can be eliminated.)

Proof of Lemma 1. Define

    W_t := Φ_1(B)Y_t,    t ∈ Z.    (2.3)

Then (W_t)_{t ∈ Z} is strictly stationary and, since Φ(z) = (1 − λ^{−1} z)Φ_1(z), we have

    W_t − λ^{−1} W_{t−1} = Θ(B)Z_t,    t ∈ Z.

Iterating gives, for n ∈ N,

    W_t = λ^{−n} W_{t−n} + Σ_{j=0}^{n−1} λ^{−j} Θ(B)Z_{t−j},    t ∈ Z.
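The factorization Θ(z) = (1 − λ^{−1}z)Θ_1(z) that drives this proof, together with the explicit coefficient identities for Θ_1 used in the next step, can be checked numerically. The sketch below is my own illustration, not part of the paper's argument; the value of λ and the coefficients of Θ_1 are arbitrary choices:

```python
import numpy as np

lam = 2.0 + 1.0j                   # hypothetical common zero of Phi and Theta
c = np.array([1.0, 2.0, 3.0])      # coefficients of Theta_1(z), ascending powers
# Build Theta(z) = (1 - z/lam) * Theta_1(z); np.convolve multiplies
# polynomials given as ascending coefficient arrays.
theta = np.convolve(np.array([1.0, -1.0 / lam]), c)
q = len(theta) - 1                 # degree of Theta; by construction Theta(lam) = 0

# Identity used in the proof: the j-th coefficient of Theta_1 equals
# sum_{k=0}^{j} theta_k lam^{k-j} ...
c_fwd = np.array([sum(theta[k] * lam ** (k - j) for k in range(j + 1))
                  for j in range(q)])
# ... and, because Theta(lam) = 0, it also equals
# -sum_{k=j+1}^{q} theta_k lam^{k-j}.
c_bwd = np.array([-sum(theta[k] * lam ** (k - j) for k in range(j + 1, q + 1))
                  for j in range(q)])

assert np.allclose(c_fwd, c) and np.allclose(c_bwd, c)
```

Both sums recover the coefficients of Θ_1, which is exactly the cancellation that turns the iterated recursion for W_t into a statement about Θ_1(B)Z_t.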

Writing Θ(B)Z_t = Σ_{k=0}^{q} θ_k Z_{t−k} (with θ_0 := 1), it follows that for n ≥ q

    W_t − λ^{−n} W_{t−n}
      = Σ_{j=0}^{n−1} λ^{−j} Σ_{k=0}^{q} θ_k Z_{t−j−k}
      = Σ_{j=0}^{q−1} ( Σ_{k=0}^{j} θ_k λ^{k−j} ) Z_{t−j} + Θ(λ) Σ_{j=q}^{n−1} λ^{−j} Z_{t−j}
        + λ^{−n} Σ_{j=0}^{q−1} ( Σ_{k=j+1}^{q} θ_k λ^{k−j} ) Z_{t−n−j}.    (2.4)

Introducing the assumption that Θ(λ) = 0, we easily find that

    − Σ_{j=0}^{q−1} ( Σ_{k=j+1}^{q} θ_k λ^{k−j} ) z^j = Σ_{j=0}^{q−1} ( Σ_{k=0}^{j} θ_k λ^{k−j} ) z^j = Θ_1(z),    z ∈ C,

so that (2.4) can be written in the form

    W_t − λ^{−n} W_{t−n} = Θ_1(B)Z_t − λ^{−n} Θ_1(B)Z_{t−n},    n ≥ q,  t ∈ Z.    (2.5)

Now if |λ| > 1, then the strict stationarity of (W_t)_{t ∈ Z} and an application of Slutsky's lemma show that λ^{−n} W_{t−n} converges in probability to 0 as n → ∞, and similarly for λ^{−n} Θ_1(B)Z_{t−n}, so that for |λ| > 1 we have

    Φ_1(B)Y_t = W_t = Θ_1(B)Z_t,    t ∈ Z,

which is (2.2). If |λ| < 1, multiplying (2.5) by λ^n and substituting u = t − n gives

    W_u − λ^n W_{u+n} = Θ_1(B)Z_u − λ^n Θ_1(B)Z_{u+n},    n ≥ q,  u ∈ Z.

Letting n → ∞ shows again that W_u = Θ_1(B)Z_u, so that (2.2) is true also in this case.

Now let |λ| = 1 and assume that Φ_1(λ) = 0. Define

    Φ_2(z) := Φ_1(z)/(1 − λ^{−1} z) = Φ(z)/(1 − λ^{−1} z)²,    z ∈ C,

and

    X_t := Φ_2(B)Y_t,    t ∈ Z.

Then (X_t)_{t ∈ Z} is strictly stationary, and

    X_t − λ^{−1} X_{t−1} = Φ_1(B)Y_t = W_t.    (2.6)

Defining

    C_t := W_t − Θ_1(B)Z_t,    t ∈ Z,    (2.7)

it follows from (2.5) that

    C_t = λ^{−n} C_{t−n} = λ^{−n} W_{t−n} − λ^{−n} Θ_1(B)Z_{t−n},    n ≥ q,  t ∈ Z.    (2.8)

Summing this over n from q to N for N ≥ q and inserting (2.6) gives

    (N − q + 1) C_t = Σ_{n=q}^{N} λ^{−n} ( X_{t−n} − λ^{−1} X_{t−n−1} ) − Σ_{n=q}^{N} λ^{−n} Θ_1(B)Z_{t−n},

so that for t ∈ Z and N ≥ q we get

    C_t − (N − q + 1)^{−1} ( λ^{−q} X_{t−q} − λ^{−(N+1)} X_{t−N−1} )
      = − (N − q + 1)^{−1} Σ_{n=q}^{N} λ^{−n} Θ_1(B)Z_{t−n}.    (2.9)

But since (X_t)_{t ∈ Z} is strictly stationary and since |λ| = 1, Slutsky's lemma shows that the left hand side of (2.9) converges in probability to C_t as N → ∞. Hence the right hand side of (2.9) also converges in probability to C_t. But it is easy to see that the probability limit as N → ∞ of the right hand side must be measurable with respect to the tail σ-algebra ∩_{n ∈ N} σ( ∪_{k ≥ n} σ(Z_{t−k}) ), which by Kolmogorov's zero–one law is P-trivial. Hence C_t is independent of itself, so that C_t must be deterministic. Using the assumed symmetry of Y and Z_0, it follows that W_t must be symmetric for each t ∈ Z, as is W_t − C_t = Θ_1(B)Z_t. But this is only possible if the deterministic C_t is zero. Equation (2.7) then shows that Φ_1(B)Y_t = W_t = Θ_1(B)Z_t, t ∈ Z, which is (2.2).

Proof of Theorem 1. Sufficiency of condition (ii) can be established by writing equation (1.1) as Φ(B)Y_t = Φ(B)Θ_1(B)Z_t, where Θ_1(z) := Θ(z)/Φ(z) is a polynomial. It is then clear that Y_t := Θ_1(B)Z_t, t ∈ Z, is a strictly stationary solution of (1.1) regardless of the distribution of Z_0.

To establish the sufficiency of condition (i), define (Y_t)_{t ∈ Z} as in (2.1). To see that the defining sum converges absolutely with probability one, observe that there are constants c, d > 0 such that |ψ_k| ≤ d e^{−c|k|} for all k ∈ Z. For c_1 ∈ (0, c) this gives

    Σ_{k ∈ Z} P( |ψ_k Z_{t−k}| > e^{−c_1 |k|} ) ≤ Σ_{k ∈ Z} P( log^+ (d |Z_{t−k}|) > (c − c_1) |k| )
      = Σ_{k ∈ Z} P( log^+ (d |Z_0|) > (c − c_1) |k| ) < ∞,

the last inequality being due to the condition E log^+ |Z_0| < ∞. The Borel–Cantelli lemma then shows that the event { |ψ_k Z_{t−k}| > e^{−c_1 |k|} infinitely often } has probability zero, giving

the almost sure absolute convergence of the series. It is clear that (Y_t)_{t ∈ Z} as defined by (2.1) is strictly stationary and, from the absolute convergence of the series, that it satisfies (1.1). The uniqueness of the strictly stationary solution under either of the conditions (i) or (ii) when Φ has no zeroes on the unit circle is guaranteed by Lemma 3.1 of Brockwell and Lindner [2].

To show that the conditions are necessary, let (Y_t)_{t ∈ Z} be a strictly stationary solution of Φ(B)Y_t = Θ(B)Z_t. We shall mimic the proofs of Proposition 2.5 and Theorem 4.2 in [2]. Let λ be a zero of Φ of multiplicity m_Φ ≥ 1, and denote by m_Θ ∈ N_0 the multiplicity of λ as a zero of Θ. We shall assume that the singularity of Θ/Φ at λ is not removable, i.e. that m_Φ > m_Θ, and show that this implies E log^+ |Z_0| < ∞ in each of the cases (a) |λ| > 1 and (b) |λ| < 1, while we derive a contradiction in the case (c) |λ| = 1.

(a) Let |λ| > 1. By Lemma 1, we may assume without loss of generality that m_Θ = 0 and m_Φ ≥ 1. Define W_t = Φ_1(B)Y_t as in (2.3) with Φ_1(z) = Φ(z)/(1 − λ^{−1} z). Then it follows from (2.4) that

    W_t − λ^{−n} W_{t−n} − Σ_{j=0}^{q−1} ( Σ_{k=0}^{j} θ_k λ^{k−j} ) Z_{t−j}
      − λ^{−n} Σ_{j=0}^{q−1} ( Σ_{k=j+1}^{q} θ_k λ^{k−j} ) Z_{t−n−j} = Θ(λ) Σ_{j=q}^{n−1} λ^{−j} Z_{t−j}.    (2.10)

Due to the stationarity of (W_t)_{t ∈ Z} and of (Z_t)_{t ∈ Z}, the left hand side of (2.10) converges in probability as n → ∞. Hence so does the right hand side. But since m_Θ = 0 we have Θ(λ) ≠ 0, so that Σ_{j=q}^{∞} λ^{−j} Z_{t−j} converges in probability, and hence almost surely as a sum with independent summands. Since (Z_t)_{t ∈ Z} is i.i.d., it follows from the Borel–Cantelli lemma that

    Σ_{n=0}^{∞} P( log^+ |Z_0| > n log |λ| ) = Σ_{n=0}^{∞} P( |Z_0| > |λ|^n ) = Σ_{n=0}^{∞} P( |λ^{−n} Z_{−n}| > 1 ) < ∞,

so that E log^+ |Z_0| < ∞ as claimed.

(b) The case |λ| < 1 can be treated similarly, multiplying (2.10) by λ^n, substituting u = t − n, and letting n → ∞ with u fixed.

(c) Let |λ| = 1. We shall show that the assumption m_Φ > m_Θ leads to a contradiction. By taking an independent copy (Y′_t, Z′_t)_{t ∈ Z} of (Y_t, Z_t)_{t ∈ Z}, we see that Φ(B)Y′_t = Θ(B)Z′_t

and hence that the symmetrized versions Y^sym_t := Y_t − Y′_t and Z^sym_t := Z_t − Z′_t satisfy

    Φ(B)Y^sym_t = Θ(B)Z^sym_t,    t ∈ Z.

Hence we may assume without loss of generality that all finite dimensional distributions of Y are symmetric and that Z_0 is symmetric, with P(Z_0 ≠ 0) > 0 due to the assumption that Z_0 is not deterministic. By Lemma 1 we may hence assume that m_Φ ≥ 1 and m_Θ = 0, so that Θ(λ) ≠ 0. Defining W_t as in (2.3), it follows from the stationarity of (W_t)_{t ∈ Z} and of (Z_t)_{t ∈ Z} and from |λ| = 1 that there exists a constant K > 0 such that the probability that the modulus of the left hand side of (2.10) is less than K is greater than or equal to 1/2 for all n ≥ q, so that by (2.10) also

    P( | Θ(λ) Σ_{j=q}^{n−1} λ^{−j} Z_{t−j} | < K ) ≥ 1/2,    n ≥ q.

In particular, |Θ(λ) Σ_{j=q}^{n−1} λ^{−j} Z_{t−j}| does not converge in probability to +∞ as n → ∞. Since Θ(λ) ≠ 0 and since Σ_{j=q}^{n−1} λ^{−j} Z_{t−j} is a sum of independent symmetric terms, this implies that Σ_{j=q}^{∞} λ^{−j} Z_{t−j} converges almost surely (see Kallenberg [5], Theorem 4.7). The Borel–Cantelli lemma then shows that

    Σ_{j=q}^{∞} P( |Z_0| > r ) = Σ_{j=q}^{∞} P( |λ^{−j} Z_{t−j}| > r ) < ∞

for each r > 0. Hence it follows that P(|Z_0| > r) = 0 for each r > 0, so that Z_0 = 0 a.s., which is impossible since Z_0 (and hence its symmetrization) was assumed to be non-deterministic.

3 Uniqueness, order reduction and deterministic Z

In Theorem 1 we gave necessary and sufficient conditions for the existence of a strictly stationary solution of (1.1) and a sufficient condition, namely Φ(z) ≠ 0 for all z ∈ C such that |z| = 1, for uniqueness of the strictly stationary solution. In the following lemma we show that the latter condition is also necessary for uniqueness if the probability space (Ω, F, P) on which (Z_t)_{t ∈ Z} is defined is sufficiently rich.

Lemma 2. If equation (1.1) has a strictly stationary solution, if Φ(z) has a zero λ on the unit circle, and if there exists a random variable U on the probability space (Ω, F, P) which is uniformly distributed on the interval [0, 1] and independent of (Z_t)_{t ∈ Z}, then the strictly stationary solution is not unique.
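The non-uniqueness asserted in Lemma 2 can be made concrete with a short simulation. The sketch below is my own illustration; the choice Φ(z) = Θ(z) = 1 + z (so that λ = −1 lies on the unit circle) and the Cauchy noise are assumptions made for the example. It builds two distinct strictly stationary solutions of the same ARMA equation, the second obtained by adding a process of the form λ^{−t} e^{2πiU}, which Φ(B) annihilates:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500
Z = rng.standard_cauchy(n)              # heavy-tailed i.i.d. noise
U = rng.uniform()                       # U ~ Uniform[0, 1], independent of Z

# Phi(z) = Theta(z) = 1 + z: the zero lambda = -1 lies on the unit circle.
t = np.arange(n)
Y1 = Z.astype(complex)                  # one strictly stationary solution
Ytilde = (-1.0) ** t * np.exp(2j * np.pi * U)   # lambda^{-t} e^{2 pi i U}; Phi(B) kills it
Y2 = Y1 + Ytilde                        # a second, distinct strictly stationary solution

# Both processes solve Y_t + Y_{t-1} = Z_t + Z_{t-1} ...
lhs1 = Y1[1:] + Y1[:-1]
lhs2 = Y2[1:] + Y2[:-1]
rhs = Z[1:] + Z[:-1]
assert np.allclose(lhs1, rhs) and np.allclose(lhs2, rhs)
# ... yet their sample paths differ.
assert not np.allclose(Y1, Y2)
```

Since Ỹ_t + Ỹ_{t−1} = 0 identically, adding Ỹ to any solution leaves the ARMA equation untouched, which is exactly why the unit-circle zero destroys uniqueness.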

Proof. Define

    Ỹ_t := λ^{−t} exp(2πiU),    t ∈ Z.

It is clear that (Ỹ_t)_{t ∈ Z} is strictly stationary since |λ| = 1, and Ỹ_t − λ^{−1} Ỹ_{t−1} = 0, so that Φ(B)Ỹ_t = 0. Consequently, if (Y_t)_{t ∈ Z} denotes the solution of (1.1) given by (2.1), then (Y_t + Ỹ_t)_{t ∈ Z} is another solution of (1.1), and it is strictly stationary since it is the sum of two independent strictly stationary processes.

The following theorem is a strengthened version of the Reduction Lemma (Lemma 1), which was used in the proof of Theorem 1. The proof makes use of Theorem 1 to extend the lemma so as to allow cancellation of factors corresponding to autoregressive and moving average roots on the unit circle under less restrictive conditions.

Theorem 2 [Order Reduction]. Let Λ denote the set of distinct zeroes of the autoregressive polynomial Φ(z) in (1.1). For each λ ∈ Λ denote by m_Φ(λ) ∈ N the multiplicity of λ as a zero of Φ, and by m_Θ(λ) ∈ N_0 the multiplicity of λ as a zero of Θ(z) (with m_Θ(λ) := 0 if Θ(λ) ≠ 0). Let (Z_t)_{t ∈ Z} be a non-deterministic i.i.d. sequence and Y = (Y_t)_{t ∈ Z} be a strictly stationary solution of the ARMA(p, q) equations (1.1). Then (by Theorem 1) m_Θ(λ) ≥ m_Φ(λ) for all λ ∈ Λ such that |λ| = 1. If we define the reduced polynomials

    Φ_red(z) := Φ(z) / [ Π_{λ ∈ Λ: |λ| ≠ 1} (1 − λ^{−1} z)^{min(m_Φ(λ), m_Θ(λ))} · Π_{λ ∈ Λ: |λ| = 1} (1 − λ^{−1} z)^{m_Φ(λ) − 1} ],
    Θ_red(z) := Θ(z) / [ Π_{λ ∈ Λ: |λ| ≠ 1} (1 − λ^{−1} z)^{min(m_Φ(λ), m_Θ(λ))} · Π_{λ ∈ Λ: |λ| = 1} (1 − λ^{−1} z)^{m_Φ(λ) − 1} ],    z ∈ C,

then Y is also a strictly stationary solution of the reduced ARMA equations

    Φ_red(B)Y_t = Θ_red(B)Z_t,    t ∈ Z.    (3.1)

Conversely, if m_Φ(λ) − 1 ≤ m_Θ(λ) for all λ ∈ Λ such that |λ| = 1 and Y is a strictly stationary solution of (3.1), then Y is also a strictly stationary solution of (1.1).

Proof. Let Y = (Y_t)_{t ∈ Z} be a strictly stationary solution of (1.1). Then m_Θ(λ) ≥ m_Φ(λ) for all λ ∈ Λ such that |λ| = 1 by Theorem 1, so that Θ_red is indeed a polynomial. To show that Y satisfies (3.1), it is sufficient to show that Lemma 1 is true without the symmetry assumptions on Y and Z_0 if |λ| = 1, since (3.1) will then follow by induction (removing one factor at a time).
Suppose therefore that λ ∈ Λ with |λ| = 1 and m_Θ(λ) ≥ 1, m_Φ(λ) ≥ 2. With W_t and C_t as defined in (2.3) and (2.7), respectively, it follows from the proof of Lemma 1 that C_t is deterministic for every t ∈ Z (this conclusion did not use the symmetry). From (2.8) we have C_t = λ^{−n} C_{t−n} for each t ∈ Z and n ≥ q, from which it follows that

    C_t = λ^{−t} C_0,    t ∈ Z,

for some complex number C_0, and the proof of (2.2), and hence of (3.1), is finished once C_0 is established to be 0.

To show that C_0 = 0, observe from (2.7) that W_t = Θ_1(B)Z_t + λ^{−t} C_0, where (W_t)_{t ∈ Z} and (Θ_1(B)Z_t)_{t ∈ Z} are both strictly stationary. Hence λ^{−t} C_0 must be independent of t, which is possible only if λ = 1 or C_0 = 0.

To show that C_0 = 0 in either case, suppose that λ = 1, in which case (2.6) and (2.7) give

    X_t − X_{t−1} = Θ_1(B)Z_t + C_0,    t ∈ Z.

But since m_Θ(1) ≥ m_Φ(1) ≥ 2, we can define the polynomial Θ_2(z) := (1 − z)^{−2} Θ(z) and

    V_t := Θ_2(B)Z_t,    t ∈ Z.

Then (V_t)_{t ∈ Z} is strictly stationary, and

    X_t − X_{t−1} = Θ_1(B)Z_t + C_0 = V_t − V_{t−1} + C_0,    t ∈ Z.

This implies that

    X_t − X_{t−n} = V_t − V_{t−n} + C_0 n,    n ∈ N.

If we divide this equation by n and let n tend to infinity, we see that the left hand side converges in probability to 0, and the right hand side to C_0. Consequently C_0 must be zero regardless of the value of λ. This completes the proof of (2.2) and hence of (3.1). The converse, that strictly stationary solutions of (3.1) are strictly stationary solutions of (1.1), is clear.

Theorem 2 shows that, for the determination of stationary solutions of (1.1), common zeroes λ of Φ and Θ can be factored out without restrictions if |λ| ≠ 1, and that if m_Θ(λ) ≥ m_Φ(λ) ≥ 2 and |λ| = 1, then (1 − λ^{−1} z)^{m_Φ(λ) − 1} can be factored out from Φ and Θ. That in general one cannot factor out (1 − λ^{−1} z)^{m_Φ(λ)}, i.e. all common zeroes with multiplicity on the unit circle, is a consequence of the non-uniqueness associated with an autoregressive root on the unit circle. For example, if Θ(z) = Φ(z) = (1 + z)² and (Z_t)_{t ∈ Z} is non-deterministic and i.i.d., then Theorem 2 shows that every strictly stationary solution Y of Φ(B)Y_t = Θ(B)Z_t is also a strictly stationary solution of Y_t + Y_{t−1} = Z_t + Z_{t−1} and vice versa, but is not necessarily a solution of Y_t = Z_t, as shown in Lemma 2.

If (Z_t)_{t ∈ Z} is deterministic, the conditions for existence of a strictly stationary solution of (1.1) are a little different from those which apply when (Z_t)_{t ∈ Z} is i.i.d.
and non-deterministic, as has been assumed until now. The deterministic case is covered by the following theorem.

Theorem 3. Suppose that (Z_t)_{t ∈ Z} is deterministic, i.e. there is a constant c ∈ C such that Z_t = c with probability 1 for every t. Then if (Y_t)_{t ∈ Z} is a strictly stationary solution of (1.1), it must be expressible as

    Y_t = K + Σ_{λ: |λ| = 1} A_{λ,0} λ^{−t},    (3.2)

where the sum is over the distinct zeroes of Φ(z) on the unit circle, the coefficients A_{λ,0} are complex-valued random variables, and

    K = Θ(1) c / Φ(1)  if m_Φ(1) = 0,
    K = 0              if m_Φ(1) > 0,    (3.3)

where m_Φ(1) denotes the multiplicity of 1 as a zero of Φ(z). A strictly stationary solution of (1.1) exists in this case if and only if

    m_Φ(1) Θ(1) c = 0.    (3.4)

Proof. If Z_t = c with probability 1 for every t, the defining equation (1.1) becomes

    Φ(B)Y_t = Θ(1) c,    (3.5)

and every random sequence satisfying (3.5) can be written as

    Y_t = f_0(t) + Σ_λ Σ_{i=0}^{m_Φ(λ)−1} A_{λ,i} t^i λ^{−t},    (3.6)

where Σ_λ denotes the sum over the distinct zeroes λ of Φ(z), m_Φ(λ) denotes the multiplicity of the zero λ, the coefficients A_{λ,i} are random variables, and the deterministic complex-valued function f_0 is the particular solution of (3.5) defined by

    f_0(t) = ( Θ(1) c / ( Φ_1(1) m_Φ(1)! ) ) t^{m_Φ(1)},  where  Φ_1(z) = Φ(z) / (1 − z)^{m_Φ(1)}.    (3.7)

(The validity of (3.6) for each fixed ω in the underlying probability space (Ω, F, P) is a standard result from the theory of linear difference equations; see, e.g., Brockwell and Davis [1].) The required measurability of the functions A_{λ,i} follows from that of the random variables Y_t.

In order for the sequence (Y_t)_{t ∈ Z} specified by (3.6) to be strictly stationary, it is clear (on considering the behaviour of the solution as t → −∞) that the terms corresponding

to the zero or zeroes with largest absolute value greater than 1 must have zero coefficients. Having set these coefficients to zero, it is then clear that the coefficients of terms corresponding to the zero or zeroes with next largest absolute value greater than 1 must also be zero. Continuing with this argument, we find that all terms in the sum (3.6) corresponding to zeroes with absolute values greater than 1 must vanish. In the same way, by letting t → +∞, we conclude that all the terms corresponding to zeroes with absolute values less than 1 must vanish. The same argument can also be applied to eliminate all terms of the form A_{λ,i} t^i λ^{−t} with |λ| = 1 and i > 0, and to see that Θ(1) c = 0 is necessary for a strictly stationary solution to exist if m_Φ(1) > 0. Thus f_0(t) reduces to K as defined in (3.3), Y has the representation (3.2), and it is apparent that the condition (3.4) is necessary for the existence of a strictly stationary solution. On the other hand, if this condition is satisfied, then Y_t = K is in fact a strictly stationary solution. As in the non-deterministic case, the strictly stationary solution is unique if Φ(z) has no zeroes on the unit circle, while the argument of Lemma 2 shows that in general Y_t = K is not the only strictly stationary solution if Φ(z) has zeroes on the unit circle.

References

[1] Brockwell, P.J. and Davis, R.A. (1991) Time Series: Theory and Methods, 2nd ed., Springer-Verlag, New York.

[2] Brockwell, P.J. and Lindner, A. (2009) Existence and uniqueness of stationary Lévy-driven CARMA processes. Stoch. Proc. Appl. 119.

[3] Cline, D.B.H. and Brockwell, P.J. (1985) Linear prediction of ARMA processes with infinite variance. Stoch. Proc. Appl. 19.

[4] Davis, R.A. (1996) Gauss–Newton and M-estimation for ARMA processes with infinite variance. Stoch. Proc. Appl. 63.

[5] Kallenberg, O. (2002) Foundations of Modern Probability, 2nd ed., Springer, New York.

[6] Mikosch, T., Gadrich, T., Klüppelberg, C. and Adler, R. (1995) Parameter estimation for ARMA models with infinite variance innovations. Ann. Statist. 23.

[7] Yohai, V.J. and Maronna, R.A. (1977) Asymptotic behavior of least-squares estimates for autoregressive processes with infinite variances. Ann. Statist. 5.


On moving-average models with feedback

On moving-average models with feedback Submitted to the Bernoulli arxiv: math.pr/0000000 On moving-average models with feedback DONG LI 1,*, SHIQING LING 1,** and HOWELL TONG 2 1 Department of Mathematics, Hong Kong University of Science and

More information

Lesson 9: Autoregressive-Moving Average (ARMA) models

Lesson 9: Autoregressive-Moving Average (ARMA) models Lesson 9: Autoregressive-Moving Average (ARMA) models Dipartimento di Ingegneria e Scienze dell Informazione e Matematica Università dell Aquila, umberto.triacca@ec.univaq.it Introduction We have seen

More information

Applied time-series analysis

Applied time-series analysis Robert M. Kunst robert.kunst@univie.ac.at University of Vienna and Institute for Advanced Studies Vienna October 18, 2011 Outline Introduction and overview Econometric Time-Series Analysis In principle,

More information

7. MULTIVARATE STATIONARY PROCESSES

7. MULTIVARATE STATIONARY PROCESSES 7. MULTIVARATE STATIONARY PROCESSES 1 1 Some Preliminary Definitions and Concepts Random Vector: A vector X = (X 1,..., X n ) whose components are scalar-valued random variables on the same probability

More information

Regular Variation and Extreme Events for Stochastic Processes

Regular Variation and Extreme Events for Stochastic Processes 1 Regular Variation and Extreme Events for Stochastic Processes FILIP LINDSKOG Royal Institute of Technology, Stockholm 2005 based on joint work with Henrik Hult www.math.kth.se/ lindskog 2 Extremes for

More information

Time Series 3. Robert Almgren. Sept. 28, 2009

Time Series 3. Robert Almgren. Sept. 28, 2009 Time Series 3 Robert Almgren Sept. 28, 2009 Last time we discussed two main categories of linear models, and their combination. Here w t denotes a white noise: a stationary process with E w t ) = 0, E

More information

Ch. 14 Stationary ARMA Process

Ch. 14 Stationary ARMA Process Ch. 14 Stationary ARMA Process A general linear stochastic model is described that suppose a time series to be generated by a linear aggregation of random shock. For practical representation it is desirable

More information

Identifiability, Invertibility

Identifiability, Invertibility Identifiability, Invertibility Defn: If {ǫ t } is a white noise series and µ and b 0,..., b p are constants then X t = µ + b 0 ǫ t + b ǫ t + + b p ǫ t p is a moving average of order p; write MA(p). Q:

More information

The autocorrelation and autocovariance functions - helpful tools in the modelling problem

The autocorrelation and autocovariance functions - helpful tools in the modelling problem The autocorrelation and autocovariance functions - helpful tools in the modelling problem J. Nowicka-Zagrajek A. Wy lomańska Institute of Mathematics and Computer Science Wroc law University of Technology,

More information

Covariances of ARMA Processes

Covariances of ARMA Processes Statistics 910, #10 1 Overview Covariances of ARMA Processes 1. Review ARMA models: causality and invertibility 2. AR covariance functions 3. MA and ARMA covariance functions 4. Partial autocorrelation

More information

γ 0 = Var(X i ) = Var(φ 1 X i 1 +W i ) = φ 2 1γ 0 +σ 2, which implies that we must have φ 1 < 1, and γ 0 = σ2 . 1 φ 2 1 We may also calculate for j 1

γ 0 = Var(X i ) = Var(φ 1 X i 1 +W i ) = φ 2 1γ 0 +σ 2, which implies that we must have φ 1 < 1, and γ 0 = σ2 . 1 φ 2 1 We may also calculate for j 1 4.2 Autoregressive (AR) Moving average models are causal linear processes by definition. There is another class of models, based on a recursive formulation similar to the exponentially weighted moving

More information

GARCH processes continuous counterparts (Part 2)

GARCH processes continuous counterparts (Part 2) GARCH processes continuous counterparts (Part 2) Alexander Lindner Centre of Mathematical Sciences Technical University of Munich D 85747 Garching Germany lindner@ma.tum.de http://www-m1.ma.tum.de/m4/pers/lindner/

More information

{ ϕ(f )Xt, 1 t n p, X t, n p + 1 t n, { 0, t=1, W t = E(W t W s, 1 s t 1), 2 t n.

{ ϕ(f )Xt, 1 t n p, X t, n p + 1 t n, { 0, t=1, W t = E(W t W s, 1 s t 1), 2 t n. Innovations, Sufficient Statistics, And Maximum Lielihood In ARM A Models Georgi N. Boshnaov Institute of Mathematics, Bulgarian Academy of Sciences 1. Introduction. Let X t, t Z} be a zero mean ARMA(p,

More information

STAT 443 Final Exam Review. 1 Basic Definitions. 2 Statistical Tests. L A TEXer: W. Kong

STAT 443 Final Exam Review. 1 Basic Definitions. 2 Statistical Tests. L A TEXer: W. Kong STAT 443 Final Exam Review L A TEXer: W Kong 1 Basic Definitions Definition 11 The time series {X t } with E[X 2 t ] < is said to be weakly stationary if: 1 µ X (t) = E[X t ] is independent of t 2 γ X

More information

Time Series Analysis -- An Introduction -- AMS 586

Time Series Analysis -- An Introduction -- AMS 586 Time Series Analysis -- An Introduction -- AMS 586 1 Objectives of time series analysis Data description Data interpretation Modeling Control Prediction & Forecasting 2 Time-Series Data Numerical data

More information

Random Bernstein-Markov factors

Random Bernstein-Markov factors Random Bernstein-Markov factors Igor Pritsker and Koushik Ramachandran October 20, 208 Abstract For a polynomial P n of degree n, Bernstein s inequality states that P n n P n for all L p norms on the unit

More information

1 Linear Difference Equations

1 Linear Difference Equations ARMA Handout Jialin Yu 1 Linear Difference Equations First order systems Let {ε t } t=1 denote an input sequence and {y t} t=1 sequence generated by denote an output y t = φy t 1 + ε t t = 1, 2,... with

More information

Parametric Signal Modeling and Linear Prediction Theory 1. Discrete-time Stochastic Processes (cont d)

Parametric Signal Modeling and Linear Prediction Theory 1. Discrete-time Stochastic Processes (cont d) Parametric Signal Modeling and Linear Prediction Theory 1. Discrete-time Stochastic Processes (cont d) Electrical & Computer Engineering North Carolina State University Acknowledgment: ECE792-41 slides

More information

Multivariate Markov-switching ARMA processes with regularly varying noise

Multivariate Markov-switching ARMA processes with regularly varying noise Multivariate Markov-switching ARMA processes with regularly varying noise Robert Stelzer 26th February 2007 Abstract The tail behaviour of stationary R d -valued Markov-Switching ARMA processes driven

More information

Lecture 2: Univariate Time Series

Lecture 2: Univariate Time Series Lecture 2: Univariate Time Series Analysis: Conditional and Unconditional Densities, Stationarity, ARMA Processes Prof. Massimo Guidolin 20192 Financial Econometrics Spring/Winter 2017 Overview Motivation:

More information

Covariance Stationary Time Series. Example: Independent White Noise (IWN(0,σ 2 )) Y t = ε t, ε t iid N(0,σ 2 )

Covariance Stationary Time Series. Example: Independent White Noise (IWN(0,σ 2 )) Y t = ε t, ε t iid N(0,σ 2 ) Covariance Stationary Time Series Stochastic Process: sequence of rv s ordered by time {Y t } {...,Y 1,Y 0,Y 1,...} Defn: {Y t } is covariance stationary if E[Y t ]μ for all t cov(y t,y t j )E[(Y t μ)(y

More information

SEPARABLE TERM STRUCTURES AND THE MAXIMAL DEGREE PROBLEM. 1. Introduction This paper discusses arbitrage-free separable term structure (STS) models

SEPARABLE TERM STRUCTURES AND THE MAXIMAL DEGREE PROBLEM. 1. Introduction This paper discusses arbitrage-free separable term structure (STS) models SEPARABLE TERM STRUCTURES AND THE MAXIMAL DEGREE PROBLEM DAMIR FILIPOVIĆ Abstract. This paper discusses separable term structure diffusion models in an arbitrage-free environment. Using general consistency

More information

Convergence rates in weighted L 1 spaces of kernel density estimators for linear processes

Convergence rates in weighted L 1 spaces of kernel density estimators for linear processes Alea 4, 117 129 (2008) Convergence rates in weighted L 1 spaces of kernel density estimators for linear processes Anton Schick and Wolfgang Wefelmeyer Anton Schick, Department of Mathematical Sciences,

More information

Refining the Central Limit Theorem Approximation via Extreme Value Theory

Refining the Central Limit Theorem Approximation via Extreme Value Theory Refining the Central Limit Theorem Approximation via Extreme Value Theory Ulrich K. Müller Economics Department Princeton University February 2018 Abstract We suggest approximating the distribution of

More information

Chapter 1. Basics. 1.1 Definition. A time series (or stochastic process) is a function Xpt, ωq such that for

Chapter 1. Basics. 1.1 Definition. A time series (or stochastic process) is a function Xpt, ωq such that for Chapter 1 Basics 1.1 Definition A time series (or stochastic process) is a function Xpt, ωq such that for each fixed t, Xpt, ωq is a random variable [denoted by X t pωq]. For a fixed ω, Xpt, ωq is simply

More information

5: MULTIVARATE STATIONARY PROCESSES

5: MULTIVARATE STATIONARY PROCESSES 5: MULTIVARATE STATIONARY PROCESSES 1 1 Some Preliminary Definitions and Concepts Random Vector: A vector X = (X 1,..., X n ) whose components are scalarvalued random variables on the same probability

More information

Claudia Klüppelberg Technische Universität München Zentrum Mathematik. Oslo, September 2012

Claudia Klüppelberg Technische Universität München Zentrum Mathematik. Oslo, September 2012 Claudia Klüppelberg Technische Universität München Zentrum Mathematik Oslo, September 2012 Collaborators Fred E. Benth, Oslo U Christine Bernhard, former Diploma student TUM Peter Brockwell Vicky Fasen,

More information

Time Series 2. Robert Almgren. Sept. 21, 2009

Time Series 2. Robert Almgren. Sept. 21, 2009 Time Series 2 Robert Almgren Sept. 21, 2009 This week we will talk about linear time series models: AR, MA, ARMA, ARIMA, etc. First we will talk about theory and after we will talk about fitting the models

More information

A COMPLETE ASYMPTOTIC SERIES FOR THE AUTOCOVARIANCE FUNCTION OF A LONG MEMORY PROCESS. OFFER LIEBERMAN and PETER C. B. PHILLIPS

A COMPLETE ASYMPTOTIC SERIES FOR THE AUTOCOVARIANCE FUNCTION OF A LONG MEMORY PROCESS. OFFER LIEBERMAN and PETER C. B. PHILLIPS A COMPLETE ASYMPTOTIC SERIES FOR THE AUTOCOVARIANCE FUNCTION OF A LONG MEMORY PROCESS BY OFFER LIEBERMAN and PETER C. B. PHILLIPS COWLES FOUNDATION PAPER NO. 1247 COWLES FOUNDATION FOR RESEARCH IN ECONOMICS

More information

Time Series Examples Sheet

Time Series Examples Sheet Lent Term 2001 Richard Weber Time Series Examples Sheet This is the examples sheet for the M. Phil. course in Time Series. A copy can be found at: http://www.statslab.cam.ac.uk/~rrw1/timeseries/ Throughout,

More information

Autoregressive and Moving-Average Models

Autoregressive and Moving-Average Models Chapter 3 Autoregressive and Moving-Average Models 3.1 Introduction Let y be a random variable. We consider the elements of an observed time series {y 0,y 1,y2,...,y t } as being realizations of this randoms

More information

ADVANCED CALCULUS - MTH433 LECTURE 4 - FINITE AND INFINITE SETS

ADVANCED CALCULUS - MTH433 LECTURE 4 - FINITE AND INFINITE SETS ADVANCED CALCULUS - MTH433 LECTURE 4 - FINITE AND INFINITE SETS 1. Cardinal number of a set The cardinal number (or simply cardinal) of a set is a generalization of the concept of the number of elements

More information

Computing the Autocorrelation Function for the Autoregressive Process

Computing the Autocorrelation Function for the Autoregressive Process Rose-Hulman Undergraduate Mathematics Journal Volume 18 Issue 1 Article 1 Computing the Autocorrelation Function for the Autoregressive Process Omar Talib Modern College of Business and Science, Muscat,

More information

Generalised AR and MA Models and Applications

Generalised AR and MA Models and Applications Chapter 3 Generalised AR and MA Models and Applications 3.1 Generalised Autoregressive Processes Consider an AR1) process given by 1 αb)x t = Z t ; α < 1. In this case, the acf is, ρ k = α k for k 0 and

More information

SAMPLE CHAPTERS UNESCO EOLSS STATIONARY PROCESSES. E ξ t (1) K.Grill Institut für Statistik und Wahrscheinlichkeitstheorie, TU Wien, Austria

SAMPLE CHAPTERS UNESCO EOLSS STATIONARY PROCESSES. E ξ t (1) K.Grill Institut für Statistik und Wahrscheinlichkeitstheorie, TU Wien, Austria STATIONARY PROCESSES K.Grill Institut für Statistik und Wahrscheinlichkeitstheorie, TU Wien, Austria Keywords: Stationary process, weak sense stationary process, strict sense stationary process, correlation

More information

Autoregressive Moving Average (ARMA) Models and their Practical Applications

Autoregressive Moving Average (ARMA) Models and their Practical Applications Autoregressive Moving Average (ARMA) Models and their Practical Applications Massimo Guidolin February 2018 1 Essential Concepts in Time Series Analysis 1.1 Time Series and Their Properties Time series:

More information

11. Further Issues in Using OLS with TS Data

11. Further Issues in Using OLS with TS Data 11. Further Issues in Using OLS with TS Data With TS, including lags of the dependent variable often allow us to fit much better the variation in y Exact distribution theory is rarely available in TS applications,

More information

Zeros of lacunary random polynomials

Zeros of lacunary random polynomials Zeros of lacunary random polynomials Igor E. Pritsker Dedicated to Norm Levenberg on his 60th birthday Abstract We study the asymptotic distribution of zeros for the lacunary random polynomials. It is

More information

Ch 4. Models For Stationary Time Series. Time Series Analysis

Ch 4. Models For Stationary Time Series. Time Series Analysis This chapter discusses the basic concept of a broad class of stationary parametric time series models the autoregressive moving average (ARMA) models. Let {Y t } denote the observed time series, and {e

More information

Time Series Analysis. Asymptotic Results for Spatial ARMA Models

Time Series Analysis. Asymptotic Results for Spatial ARMA Models Communications in Statistics Theory Methods, 35: 67 688, 2006 Copyright Taylor & Francis Group, LLC ISSN: 036-0926 print/532-45x online DOI: 0.080/036092050049893 Time Series Analysis Asymptotic Results

More information

NATIONAL UNIVERSITY OF SINGAPORE Department of Mathematics MA4247 Complex Analysis II Lecture Notes Part II

NATIONAL UNIVERSITY OF SINGAPORE Department of Mathematics MA4247 Complex Analysis II Lecture Notes Part II NATIONAL UNIVERSITY OF SINGAPORE Department of Mathematics MA4247 Complex Analysis II Lecture Notes Part II Chapter 2 Further properties of analytic functions 21 Local/Global behavior of analytic functions;

More information

Stochastic processes: basic notions

Stochastic processes: basic notions Stochastic processes: basic notions Jean-Marie Dufour McGill University First version: March 2002 Revised: September 2002, April 2004, September 2004, January 2005, July 2011, May 2016, July 2016 This

More information

A Primer on Asymptotics

A Primer on Asymptotics A Primer on Asymptotics Eric Zivot Department of Economics University of Washington September 30, 2003 Revised: October 7, 2009 Introduction The two main concepts in asymptotic theory covered in these

More information

2 markets. Money is gained or lost in high volatile markets (in this brief discussion, we do not distinguish between implied or historical volatility)

2 markets. Money is gained or lost in high volatile markets (in this brief discussion, we do not distinguish between implied or historical volatility) HARCH processes are heavy tailed Paul Embrechts (embrechts@math.ethz.ch) Department of Mathematics, ETHZ, CH 8092 Zürich, Switzerland Rudolf Grübel (rgrubel@stochastik.uni-hannover.de) Institut für Mathematische

More information

Nonparametric regression with martingale increment errors

Nonparametric regression with martingale increment errors S. Gaïffas (LSTA - Paris 6) joint work with S. Delattre (LPMA - Paris 7) work in progress Motivations Some facts: Theoretical study of statistical algorithms requires stationary and ergodicity. Concentration

More information

The sample autocorrelations of financial time series models

The sample autocorrelations of financial time series models The sample autocorrelations of financial time series models Richard A. Davis 1 Colorado State University Thomas Mikosch University of Groningen and EURANDOM Abstract In this chapter we review some of the

More information

Introduction to Stochastic processes

Introduction to Stochastic processes Università di Pavia Introduction to Stochastic processes Eduardo Rossi Stochastic Process Stochastic Process: A stochastic process is an ordered sequence of random variables defined on a probability space

More information

Asymptotic Tail Probabilities of Sums of Dependent Subexponential Random Variables

Asymptotic Tail Probabilities of Sums of Dependent Subexponential Random Variables Asymptotic Tail Probabilities of Sums of Dependent Subexponential Random Variables Jaap Geluk 1 and Qihe Tang 2 1 Department of Mathematics The Petroleum Institute P.O. Box 2533, Abu Dhabi, United Arab

More information

Lecture 1: Fundamental concepts in Time Series Analysis (part 2)

Lecture 1: Fundamental concepts in Time Series Analysis (part 2) Lecture 1: Fundamental concepts in Time Series Analysis (part 2) Florian Pelgrin University of Lausanne, École des HEC Department of mathematics (IMEA-Nice) Sept. 2011 - Jan. 2012 Florian Pelgrin (HEC)

More information

Natural boundary and Zero distribution of random polynomials in smooth domains arxiv: v1 [math.pr] 2 Oct 2017

Natural boundary and Zero distribution of random polynomials in smooth domains arxiv: v1 [math.pr] 2 Oct 2017 Natural boundary and Zero distribution of random polynomials in smooth domains arxiv:1710.00937v1 [math.pr] 2 Oct 2017 Igor Pritsker and Koushik Ramachandran Abstract We consider the zero distribution

More information

Inference for Lévy-Driven Continuous-Time ARMA Processes

Inference for Lévy-Driven Continuous-Time ARMA Processes Inference for Lévy-Driven Continuous-Time ARMA Processes Peter J. Brockwell Richard A. Davis Yu Yang Colorado State University May 23, 2007 Outline Background Lévy-driven CARMA processes Second order properties

More information

Nonlinear time series

Nonlinear time series Based on the book by Fan/Yao: Nonlinear Time Series Robert M. Kunst robert.kunst@univie.ac.at University of Vienna and Institute for Advanced Studies Vienna October 27, 2009 Outline Characteristics of

More information

Temporal aggregation of stationary and nonstationary discrete-time processes

Temporal aggregation of stationary and nonstationary discrete-time processes Temporal aggregation of stationary and nonstationary discrete-time processes HENGHSIU TSAI Institute of Statistical Science, Academia Sinica, Taipei, Taiwan 115, R.O.C. htsai@stat.sinica.edu.tw K. S. CHAN

More information

STA205 Probability: Week 8 R. Wolpert

STA205 Probability: Week 8 R. Wolpert INFINITE COIN-TOSS AND THE LAWS OF LARGE NUMBERS The traditional interpretation of the probability of an event E is its asymptotic frequency: the limit as n of the fraction of n repeated, similar, and

More information

The Behavior of Multivariate Maxima of Moving Maxima Processes

The Behavior of Multivariate Maxima of Moving Maxima Processes The Behavior of Multivariate Maxima of Moving Maxima Processes Zhengjun Zhang Department of Mathematics Washington University Saint Louis, MO 6313-4899 USA Richard L. Smith Department of Statistics University

More information

FE570 Financial Markets and Trading. Stevens Institute of Technology

FE570 Financial Markets and Trading. Stevens Institute of Technology FE570 Financial Markets and Trading Lecture 5. Linear Time Series Analysis and Its Applications (Ref. Joel Hasbrouck - Empirical Market Microstructure ) Steve Yang Stevens Institute of Technology 9/25/2012

More information

HEAVY-TRAFFIC EXTREME-VALUE LIMITS FOR QUEUES

HEAVY-TRAFFIC EXTREME-VALUE LIMITS FOR QUEUES HEAVY-TRAFFIC EXTREME-VALUE LIMITS FOR QUEUES by Peter W. Glynn Department of Operations Research Stanford University Stanford, CA 94305-4022 and Ward Whitt AT&T Bell Laboratories Murray Hill, NJ 07974-0636

More information

6 NONSEASONAL BOX-JENKINS MODELS

6 NONSEASONAL BOX-JENKINS MODELS 6 NONSEASONAL BOX-JENKINS MODELS In this section, we will discuss a class of models for describing time series commonly referred to as Box-Jenkins models. There are two types of Box-Jenkins models, seasonal

More information

Riemann sphere and rational maps

Riemann sphere and rational maps Chapter 3 Riemann sphere and rational maps 3.1 Riemann sphere It is sometimes convenient, and fruitful, to work with holomorphic (or in general continuous) functions on a compact space. However, we wish

More information

Adjusted Empirical Likelihood for Long-memory Time Series Models

Adjusted Empirical Likelihood for Long-memory Time Series Models Adjusted Empirical Likelihood for Long-memory Time Series Models arxiv:1604.06170v1 [stat.me] 21 Apr 2016 Ramadha D. Piyadi Gamage, Wei Ning and Arjun K. Gupta Department of Mathematics and Statistics

More information

Bootstrapping continuous-time autoregressive processes

Bootstrapping continuous-time autoregressive processes Ann Inst Stat Math (2014) 66:75 92 DOI 10.1007/s10463-013-0406-0 Bootstrapping continuous-time autoregressive processes Peter J. Brockwell Jens-Peter Kreiss Tobias Niebuhr Received: 4 October 2012 / Revised:

More information

Properties of Zero-Free Spectral Matrices Brian D. O. Anderson, Life Fellow, IEEE, and Manfred Deistler, Fellow, IEEE

Properties of Zero-Free Spectral Matrices Brian D. O. Anderson, Life Fellow, IEEE, and Manfred Deistler, Fellow, IEEE IEEE TRANSACTIONS ON AUTOMATIC CONTROL, VOL 54, NO 10, OCTOBER 2009 2365 Properties of Zero-Free Spectral Matrices Brian D O Anderson, Life Fellow, IEEE, and Manfred Deistler, Fellow, IEEE Abstract In

More information

Statistics of stochastic processes

Statistics of stochastic processes Introduction Statistics of stochastic processes Generally statistics is performed on observations y 1,..., y n assumed to be realizations of independent random variables Y 1,..., Y n. 14 settembre 2014

More information

Time Series Analysis Fall 2008

Time Series Analysis Fall 2008 MIT OpenCourseWare http://ocw.mit.edu 14.384 Time Series Analysis Fall 008 For information about citing these materials or our Terms of Use, visit: http://ocw.mit.edu/terms. Introduction 1 14.384 Time

More information

NOTES AND PROBLEMS IMPULSE RESPONSES OF FRACTIONALLY INTEGRATED PROCESSES WITH LONG MEMORY

NOTES AND PROBLEMS IMPULSE RESPONSES OF FRACTIONALLY INTEGRATED PROCESSES WITH LONG MEMORY Econometric Theory, 26, 2010, 1855 1861. doi:10.1017/s0266466610000216 NOTES AND PROBLEMS IMPULSE RESPONSES OF FRACTIONALLY INTEGRATED PROCESSES WITH LONG MEMORY UWE HASSLER Goethe-Universität Frankfurt

More information

Moving Average (MA) representations

Moving Average (MA) representations Moving Average (MA) representations The moving average representation of order M has the following form v[k] = MX c n e[k n]+e[k] (16) n=1 whose transfer function operator form is MX v[k] =H(q 1 )e[k],

More information

Class 1: Stationary Time Series Analysis

Class 1: Stationary Time Series Analysis Class 1: Stationary Time Series Analysis Macroeconometrics - Fall 2009 Jacek Suda, BdF and PSE February 28, 2011 Outline Outline: 1 Covariance-Stationary Processes 2 Wold Decomposition Theorem 3 ARMA Models

More information

A BOREL SOLUTION TO THE HORN-TARSKI PROBLEM. MSC 2000: 03E05, 03E20, 06A10 Keywords: Chain Conditions, Boolean Algebras.

A BOREL SOLUTION TO THE HORN-TARSKI PROBLEM. MSC 2000: 03E05, 03E20, 06A10 Keywords: Chain Conditions, Boolean Algebras. A BOREL SOLUTION TO THE HORN-TARSKI PROBLEM STEVO TODORCEVIC Abstract. We describe a Borel poset satisfying the σ-finite chain condition but failing to satisfy the σ-bounded chain condition. MSC 2000:

More information

On 1.9, you will need to use the facts that, for any x and y, sin(x+y) = sin(x) cos(y) + cos(x) sin(y). cos(x+y) = cos(x) cos(y) - sin(x) sin(y).

On 1.9, you will need to use the facts that, for any x and y, sin(x+y) = sin(x) cos(y) + cos(x) sin(y). cos(x+y) = cos(x) cos(y) - sin(x) sin(y). On 1.9, you will need to use the facts that, for any x and y, sin(x+y) = sin(x) cos(y) + cos(x) sin(y). cos(x+y) = cos(x) cos(y) - sin(x) sin(y). (sin(x)) 2 + (cos(x)) 2 = 1. 28 1 Characteristics of Time

More information