On Linear Processes with Dependent Innovations


The University of Chicago, Department of Statistics, TECHNICAL REPORT SERIES. On Linear Processes with Dependent Innovations. Wei Biao Wu, Wanli Min. TECHNICAL REPORT NO. 549, May 24, 2004. S. University Avenue, Chicago, IL 60637

On linear processes with dependent innovations

Wei Biao Wu, Wanli Min¹

May 24, 2004

Department of Statistics, The University of Chicago, Chicago, IL 60637, USA

Abstract. We investigate asymptotic properties of partial sums and sample covariances for linear processes whose innovations are dependent. Central limit theorems and invariance principles are established under fairly mild conditions. Our results go beyond earlier ones by allowing a quite wide class of innovations which includes many important non-linear time series models. Applications to linear processes with GARCH innovations and other non-linear time series models are discussed.

1 Introduction

Let {ε_t}_{t∈Z} be independent and identically distributed (iid) random elements and F be a measurable function such that

a_t = F(..., ε_{t−1}, ε_t)   (1)

is a well-defined random variable. Then {a_t}_{t∈Z} is a stationary and ergodic infinite-order nonlinear moving average process. Assume that a_t has mean 0 and finite variance, and that {ψ_i}_{i≥0} is a sequence of real numbers such that Σ_{i,j=0}^∞ ψ_i ψ_j E(a_0 a_{|i−j|}) < ∞. Then the linear process

X_t = Σ_{i=0}^∞ ψ_i a_{t−i}   (2)

exists and E(X_t²) < ∞. In this paper we consider asymptotic distributions of the properly normalized partial sum process S_n = Σ_{i=1}^n X_i and the sample covariances γ̂_h = Σ_t X_t X_{t+h}/n, h ≥ 0. We will establish an invariance principle for the former and a central limit theorem (CLT) for γ̂_h. Such results are important and useful in statistical inference for stationary processes.

¹E-mail addresses: wbwu, wmin@galton.uchicago.edu
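The objects studied here are straightforward to simulate. Below is a minimal sketch (assuming numpy; the ARCH(1) innovation recursion from Remark 3 and the geometric coefficients are illustrative choices) generating a linear process with dependent innovations and computing the normalized partial sum and a lag-1 sample covariance:

```python
import numpy as np

rng = np.random.default_rng(0)

def arch1_innovations(n, theta1=1.0, theta2=0.5, burn=500):
    """ARCH(1) innovations a_t = eps_t * sqrt(theta1^2 + theta2^2 * a_{t-1}^2)."""
    eps = rng.standard_normal(n + burn)
    a = np.zeros(n + burn)
    for t in range(1, n + burn):
        a[t] = eps[t] * np.sqrt(theta1**2 + theta2**2 * a[t - 1]**2)
    return a[burn:]

def linear_process(a, psi):
    """X_t = sum_i psi_i a_{t-i}, truncated to len(psi) coefficients."""
    X = np.convolve(a, psi)[:len(a)]
    return X[len(psi):]              # drop start-up values

psi = 0.8 ** np.arange(30)           # absolutely summable coefficients: SRD case
a = arch1_innovations(20000)
X = linear_process(a, psi)

S = X.sum() / np.sqrt(len(X))        # normalized partial sum S_n / sqrt(n)
gamma1 = np.mean(X[:-1] * X[1:])     # sample covariance at lag h = 1
```

With summable ψ_i and these dependent (but martingale-difference) innovations, both quantities are covered by the theorems of Section 2.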

In classical time series analysis, the innovations {a_t}_{t∈Z} in the linear process X_t are often assumed to be iid; see for example Brockwell and Davis (1991) and Box and Jenkins (1976). In this case the asymptotic behavior of sample means and partial sum processes has been extensively studied in the literature. It would be hard to compile a complete list. Here we only mention some representatives: Davydov (1970), Gorodetskii (1977), Hall and Heyde (1980), Phillips and Solo (1992), Yokoyama (1995) and Hosking (1996). See the references therein for further background. There are basically two types of results. If the coefficients ψ_t are absolutely summable, then the covariances of X_t are summable and we say that X_t is short-range dependent (SRD). Under SRD, the normalizing constant for the sum Σ_{i=1}^n X_i is √n, which is of the same order as in the classical CLT for iid observations. When the coefficients ψ_t are not summable, X_t is long-range dependent (LRD) and the normalizing constant for Σ_{i=1}^n X_i is typically larger than √n. The fractional ARIMA model (Hosking, 1981) is an important class which may exhibit LRD. Asymptotic normality of sample covariances has also been widely discussed; see for example Box and Jenkins (1976), Hannan (1976), Hall and Heyde (1980), Giraitis and Surgailis (1990), Brockwell and Davis (1991), Phillips and Solo (1992) and Hosking (1996). The asymptotic problem for partial sums and sample covariances becomes more difficult if dependence among the a_t is allowed. Recently, fractional ARIMA (FARIMA) processes with GARCH innovations have been proposed to model some econometric time series. The former feature allows LRD, and the latter allows the conditional variance to change over time, namely heteroscedasticity. Financial time series often exhibit these two features.
Hence FARIMA models with GARCH innovations provide a natural vehicle for modelling processes with such features; see Baillie, Chung and Tieslau (1995), Hauser and Kunst (2001) and Lien and Tse (1999). Romano and Thombs (1996) point out that traditional large-sample inference on autocorrelations under the assumption of iid innovations is misleading if the underlying {a_t} are actually dependent. Results so far obtained in this direction require that the a_t be m-dependent (Diananda, 1953) or martingale differences (Hannan and Heyde, 1972). Recently, Wang, Lin and Gulati (2002) considered invariance principles for iid or martingale-difference a_t. However, the proof in the latter paper contains a serious error [cf. Remark 1]. Chung (2002) considers long-memory processes under the assumption that the a_t are martingale differences satisfying a prohibitively strong restriction [cf. (18)] that excludes the

widely used ARCH models. In this paper, we shall apply the central limit theory and the idea of martingale approximation developed in Woodroofe (1992) and Wu and Woodroofe (2003) to obtain asymptotic distributions of the sample means and covariances of (2) under sufficiently mild conditions. Our results go beyond earlier ones by allowing a large class of important nonlinear processes a_t, which substantially weakens the iid or martingale-difference assumptions. In particular, our conditions are satisfied if a_t follows a GARCH, random coefficient AR, bilinear AR or threshold AR model, etc. The goal of the paper is twofold: the first goal is to obtain asymptotic distributions of X̄_n and γ̂_h, which are certainly needed for statistical inference on such processes. The second major goal is to introduce another type of dependence structure as well as to provide a unified methodology, based on martingale approximations, for asymptotic problems in econometric models. With our dependence structure, martingales can be constructed which efficiently approximate the original sequences. Based on such martingale approximations, we can apply celebrated results in martingale theory such as martingale central limit theorems (MCLT) and martingale inequalities. The proposed dependence structure only involves the computation of conditional moments and is easily verifiable. This feature is quite different from strong mixing conditions, which might be too restrictive and hard to verify. On the other hand, our condition is sufficiently mild as well. Recently Wu and Mielniczuk (2002), Hsing and Wu (2003) and Wu (2003a, 2003b) applied the idea of martingale approximation to solve certain open asymptotic problems. The paper is organized as follows. The main theorems are stated in Section 2, where comparisons with earlier results are elaborated. Proofs are given in Section 4. In Section 3 we present applications to some nonlinear processes.
2 Results

Let X_t be the linear process defined by (2) and recall E(a_n) = 0; let F_t = (..., ε_{t−1}, ε_t) be the shift process. Then F_t is a Markov chain. For a random variable ξ define the projections P_t ξ = E(ξ|F_t) − E(ξ|F_{t−1}) and the L^p (p ≥ 1) norm ‖ξ‖_p = [E(|ξ|^p)]^{1/p}. Let ‖·‖ = ‖·‖_2 and write ξ ∈ L^p if ‖ξ‖_p < ∞. We say c_n ~ d_n for two sequences {c_n} and {d_n} if lim_{n→∞} c_n/d_n = 1. For n ≥ 0 write A_n = Σ_{i=n}^∞ ψ_i², Ψ_n = Σ_{i=0}^n ψ_i and B_n² = Σ_{i=0}^{n−1} Ψ_i².

Let Ψ_n = 0 for n < 0 and define

σ_n² = Σ_{i=−n}^∞ (Ψ_{n+i} − Ψ_i)².   (3)

Before we state our main results, we introduce the basic concept of L^p weak dependence (cf. Definition 1), which, unlike strong mixing conditions, only involves estimating the decay rate of conditional moments. On the other hand, it appears to be very mild. In Section 3 we argue that many nonlinear time series models satisfy this condition. More importantly, it provides a natural vehicle for the central limit theory of stationary processes; see Woodroofe (1992) and Lemma 3 in Wu (2003a) (Lemma 1 below). An interesting feature of Lemma 1 is that there is a closed form for the asymptotic variance of the limiting normal distribution.

Definition 1. The process Y_n = g(F_n), where g is a measurable function, is said to be L^p weakly dependent with order r (p ≥ 1 and r ≥ 0) if E(|Y_n|^p) < ∞ and

Σ_{n=1}^∞ n^r ‖P_1 Y_n‖_p < ∞.   (4)

If (4) holds with r = 0, then Y_n is said to be L^p weakly dependent.

Lemma 1 (Wu (2003a)). Let Y_n = g(F_n) be L² weakly dependent. Then Σ_{i=1}^n (Y_i − EY_1)/√n ⇒ N(0, ξ²), where ξ = ‖Σ_{j=1}^∞ P_1 Y_j‖, the series Σ_{j=1}^∞ P_1 Y_j converging in L², and ⇒ denotes convergence in distribution.

The intuition behind the definition of weak dependence is that the projection of the future Y_n onto the space M_1 ⊖ M_0 = {Z ∈ L^p : Z is F_1-measurable and E(Z|F_0) = 0} has a small magnitude. So the future depends weakly on the current states. If the Y_n are martingale differences, then (4) is automatically satisfied for all r ≥ 0 provided Y_0 ∈ L^p. Lemma 1 readily entails the following corollary for the process (2).

Corollary 1. Assume that a_n is L² weakly dependent and

Σ_{i=0}^∞ |ψ_i| < ∞.   (5)

Then Σ_{t=1}^∞ ‖P_1 X_t‖ < ∞ and Σ_{t=1}^n X_t/√n ⇒ N(0, ξ²), where ξ = ‖Σ_{t=1}^∞ P_1 X_t‖.

2.1 Invariance Principles

Let W_n(t), 0 ≤ t ≤ 1, be the continuous, piecewise linear function such that W_n(t) = S_k/‖S_n‖ at t = k/n, k = 0, ..., n; that is, W_n(t) = S_k/‖S_n‖ + (nt − k)X_{k+1}/‖S_n‖ when k/n ≤ t ≤ (k+1)/n. Let C[0,1] be the collection of all continuous functions on [0,1], with the distance between two functions f, g ∈ C[0,1] given by ρ(f,g) = sup_{0≤t≤1} |f(t) − g(t)|. Billingsley (1968) contains an extensive treatment of convergence theory on C[0,1]. Recall Ψ_n = Σ_{i=0}^n ψ_i for n ≥ 0, Ψ_n = 0 for n < 0 and B_n² = Σ_{i=0}^{n−1} Ψ_i². Theorem 1 asserts that the limiting distribution is the standard Brownian motion under (6). The norming sequence has the form ‖S_n‖ = l*(n)√n for some slowly varying function l*(n). If ψ_n decays sufficiently slowly, then (6) is violated and we may have fractional Brownian motions (Mandelbrot and Van Ness, 1968) as limits. In Theorem 2 we assume that the coefficients have the form ψ_n = l(n)/n^β for n ≥ 1, where 1/2 < β < 1 and l is a slowly varying function. Then ψ_n is not summable and the norming sequence for S_n is different.

Theorem 1. Assume that for some α > 2, {a_n} is L^α weakly dependent with order 1,

Σ_{i=0}^∞ (Ψ_{n+i} − Ψ_i)² = o(B_n²)   (6)

and B_n → ∞. Then l*(n) := ‖S_n‖/√n is a slowly varying function, B_n/√n ~ l*(n)/η ~ |Σ_{i=0}^{n−1} Ψ_i|/n with η = ‖Σ_{t=1}^∞ P_1 a_t‖, and {W_n(t), 0 ≤ t ≤ 1} converges in C[0,1] to the standard Brownian motion W(t) on [0,1].

Theorem 2. Let ψ_n = l(n)/n^β for n ≥ 1, where 1/2 < β < 1. Assume that a_n is L² weakly dependent with order 1. Then {W_n(t), 0 ≤ t ≤ 1} converges in C[0,1] to the standard fractional Brownian motion {W_H(t), 0 ≤ t ≤ 1} with Hurst index H = 3/2 − β, and ‖S_n‖ ~ σ_n η, with η = ‖Σ_{t=1}^∞ P_1 a_t‖ and σ_n defined in Lemma 3.

The invariance principle is a powerful tool in statistical inference for econometric time series, for instance in unit root testing problems, and it enables one to obtain limiting distributions for many statistics. The problem has a substantial history. The celebrated Donsker theorem asserts the invariance principle for iid sequences X_n.
For dependent sequences, see the surveys by Bradley (1986) and Peligrad (1986) for strong mixing processes, McLeish (1975, 1977) for mixingale sequences, and Billingsley (1968) and Hall and Heyde (1980).
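The construction of W_n(t) can be written out directly. A sketch (assuming numpy; on a single realization the norming constant ‖S_n‖ is not observable, so √n times the sample standard deviation is used as a stand-in, and the iid case is taken for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)

def partial_sum_process(X, grid=200):
    """Piecewise-linear W_n(t): equals S_k / norm at t = k/n and is linearly
    interpolated in between; 'norm' stands in for ||S_n||."""
    n = len(X)
    S = np.concatenate([[0.0], np.cumsum(X)])
    norm = np.sqrt(n) * X.std()          # stand-in for the L2 norm ||S_n||
    t = np.linspace(0.0, 1.0, grid)
    return t, np.interp(t * n, np.arange(n + 1), S / norm)

X = rng.standard_normal(5000)            # iid case: Donsker's theorem applies
t, W = partial_sum_process(X)
```

Under the conditions of Theorem 1 the path (t, W) approximates a standard Brownian motion path as n grows.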

In the classical theory of invariance principles for linear processes, it is often assumed that the innovations {a_n} are iid or martingale differences; see Davydov (1970), Gorodetskii (1977), Hall and Heyde (HH, 1980, p. 146) and the recent work by Wang, Lin and Gulati (WLG, 2002), among others. We believe that, even in the special case where {a_n} are martingale differences, Theorem 1 imposes the weakest sufficient conditions on the coefficients {ψ_n}.

Remark 1. WLG (2002) attempted to generalize previous results on the invariance principle by establishing {S_{k_n(t)}/σ_n, 0 ≤ t ≤ 1} ⇒ {W(t), 0 ≤ t ≤ 1} in D[0,1], where X_n is defined by (2) with iid a_t, D[0,1] is the collection of all right-continuous functions with left limits on [0,1], and k_n(t) = sup{m : B_m² ≤ tB_n²} (cf. Theorem 2.1 in the latter paper). However, there is a serious error in their argument. A key step in their paper is to use their equation (36), namely the distributional identity

Σ_{k=1}^{k_n(t)} a_k Ψ_{k_n(t)−k} =_D Σ_{k=1}^{k_n(t)} a_k Ψ_{k−1}, 0 ≤ t ≤ 1,   (7)

to establish the invariance principle

{S_{k_n(t)}/B_n, 0 ≤ t ≤ 1} ⇒ {W(t), 0 ≤ t ≤ 1}   (8)

via

{B_n^{−1} Σ_{k=1}^{k_n(t)} a_k Ψ_{k−1}, 0 ≤ t ≤ 1} ⇒ {W(t), 0 ≤ t ≤ 1}.   (9)

The claim (7) holds only for a single t; it fails to be valid jointly over 0 ≤ t ≤ 1. To see this, choose t_1 and t_2 such that k_n(t_1) = 1 and k_n(t_2) = 2. In this case it is easily seen that (7) fails, since the random vectors (a_1ψ_0, a_1(ψ_0 + ψ_1) + a_2ψ_0) and (a_1ψ_0, a_1ψ_0 + a_2(ψ_0 + ψ_1)) generally have different distributions (even though the marginal distributions are the same). The invariance principle certainly requires the joint behavior over 0 ≤ t ≤ 1.

Remark 2. It is interesting to compare our result with previous ones, including WLG's, even though the latter paper contains an error. Our Theorem 1 differs from HH (1980, p. 146) and WLG (2002) in several important aspects. First, we allow a fairly general class of a_n which includes many nonlinear time series models.
If a_n are iid or martingale differences, then a_n are automatically L^α weakly

dependent with any order provided a_0 ∈ L^α. Our moment condition is slightly stronger, since α > 2 is required. In HH (1980) and WLG (2002), the a_n are assumed to be iid or martingale differences and only the existence of E(a_n²) is required. Second, in WLG (2002) the conditions on the ψ_k

B_n^{−1} max_{1≤j≤n} |Ψ_j| → 0 and Σ_{j=0}^n A_j^{1/2} = o(B_n)   (10)

are imposed, which are stronger than (6). To see this, assume that {a_n} are stationary martingale differences with ‖a_n‖ = 1. Then ‖E(S_n|F_0)‖² = Σ_{i=0}^∞ (Ψ_{n+i} − Ψ_i)², and for j ≥ 0, ‖E(X_j|F_0)‖ = A_j^{1/2}. Observe that ‖E(S_n|F_0)‖ ≤ Σ_{j=1}^n ‖E(X_j|F_0)‖. Thus (10) implies (6). In HH (1980), two sufficient conditions for the invariance principle are presented. WLG's (2002) assumption (10) weakens one of them (cf. Inequality (5.38) of HH (1980)),

Σ_{j=1}^∞ A_j^{1/2} < ∞,   (11)

when one-sided linear processes are considered. However, the other sufficient condition, (5.37) in HH (1980),

Σ_{n=1}^∞ (Σ_{l=n}^∞ ψ_l)² < ∞,   (12)

cannot be derived from WLG's (10). For example, let ψ_n = (−1)^n n^{−2/3}, n ≥ 1, and ψ_0 = 1. Then Σ_{l=n}^∞ ψ_l = O(n^{−2/3}) and (12) holds. However, WLG's (10) does not hold, since A_n = Σ_{m=n}^∞ ψ_m² ~ 3n^{−1/3} and Σ_{j=0}^n A_j^{1/2} ~ 1.2√3 n^{5/6}. Interestingly, (12) does imply our condition (6), by noting that Ψ_{n+i} − Ψ_i = Σ_{l=i+1}^∞ ψ_l − Σ_{l=i+1+n}^∞ ψ_l. Thus (6) unifies (10), (11) and (12). Third, WLG (2002) posed the open problem of whether B_n can be replaced by σ_n. Theorem 1 provides an affirmative answer to this problem. Let {a_n} be martingale differences with ‖a_n‖ = 1. Since E(S_n|F_0) and S_n − E(S_n|F_0) are orthogonal, ‖S_n‖² = ‖E(S_n|F_0)‖² + ‖S_n − E(S_n|F_0)‖², which implies that σ_n² = Σ_{i=0}^∞ (Ψ_{n+i} − Ψ_i)² + B_n² ~ B_n² by (6). Moreover, our Theorem 1 asserts that σ_n necessarily has the form l*(n)√n and reveals the inner relations B_n/√n ~ l*(n) ~ |Σ_{i=0}^{n−1} Ψ_i|/n.
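The counterexample separating (10) from (12) can be checked numerically. A sketch (assuming numpy; all series are truncated, so the computed quantities are finite-sample approximations): the partial sums for (12) stay bounded, while Σ_j A_j^{1/2} grows like n^{5/6} against B_n = O(√n):

```python
import numpy as np

N = 200000
n = np.arange(1, N + 1)
psi = np.concatenate([[1.0], (-1.0) ** n * n ** (-2.0 / 3.0)])

# tails sum_{l>=n} psi_l are O(n^{-2/3}) (alternating series), so the
# partial sums of condition (12) converge
tails = np.cumsum(psi[::-1])[::-1]
lhs12 = np.sum(tails[1:] ** 2)

# A_n = sum_{m>=n} psi_m^2 ~ 3 n^{-1/3}, so sum_j A_j^{1/2} ~ 1.2*sqrt(3)*n^{5/6},
# while Psi_n stays bounded and B_n = O(sqrt(n)); hence (10) fails
A = np.cumsum((psi ** 2)[::-1])[::-1]
Psi = np.cumsum(psi)
B = np.sqrt(np.cumsum(Psi ** 2))
ratio = np.cumsum(np.sqrt(A)) / B    # should diverge like n^{1/3}
```

The array `ratio` keeps growing with n, confirming that Σ_{j=0}^n A_j^{1/2} = o(B_n) is violated, whereas `lhs12` stabilizes at a finite value.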

Finally, the form of our result, {W_n(t), 0 ≤ t ≤ 1} ⇒ {W(t), 0 ≤ t ≤ 1} in C[0,1], is the typical one for invariance principles. It is slightly different from WLG's (8). Our form seems more convenient for applications, and it actually implies the latter. To see this, let Ŵ_n(t) and Ŵ(t) be defined on a (possibly richer) probability space such that Ŵ_n =_D W_n, Ŵ =_D W and sup_{0≤t≤1} |Ŵ_n(t) − Ŵ(t)| →_P 0. Here η_1 =_D η_2 denotes that η_1 and η_2 have the same distribution, and →_P means convergence in probability. Since B_n²/n is a slowly varying function, by Lemma 4 with γ = 1,

lim_{n→∞} max_{1≤k≤n} |B_k²/B_n² − k/n| = 0,   (13)

and since Ŵ has a version with continuous paths, max_{1≤k≤n} |Ŵ(B_k²/B_n²) − Ŵ(k/n)| →_P 0. Hence max_{1≤k≤n} |Ŵ_n(k/n) − Ŵ(B_k²/B_n²)| →_P 0, which implies the invariance principle of WLG (2002).

Example 1. Let ψ_k = l(k)/k, k ≥ 1, where l is a slowly varying function such that Σ_{k=1}^∞ ψ_k = ∞. By Lemma 3, Ψ_n is slowly varying, Ψ_n ~ Σ_{k=0}^n |ψ_k|, σ_n² ~ nΨ_n² and lim_{n→∞} (1/σ_n²) Σ_{j=n}^∞ (Ψ_j − Ψ_{j−n})² = 0. Hence (6) is satisfied.

2.2 Sample Covariances

Covariances play a fundamental role in the study of stationary processes. In this section we present asymptotic results for sample covariances.

Theorem 3. Assume that {a_n} is L⁴ weakly dependent, and that

Σ_{i=0}^∞ |ψ_i| A_{i+1}^{1/2} < ∞   (14)

and

Σ_{i,j=0}^∞ ‖P_1(a_i a_j)‖ < ∞.   (15)

Then Σ_{t=1}^∞ ‖P_1 X_t²‖ < ∞; namely, X_t² is L² weakly dependent.

Theorem 3 is useful in proving Corollary 2 concerning asymptotic normality of sample covariances.

Proposition 1. A sufficient condition for (14) is

Σ_{t=1}^∞ √t ψ_t² < ∞.   (16)

Corollary 2. Let the conditions of Theorem 3 be satisfied. For a fixed integer h ≥ 0 let X_{t,h} = (X_{t−h}, ..., X_t)^T be a column random vector, where T stands for transpose, and let Γ(h) = E(X_0 X_{h,h}). Then

(1/√n) Σ_{t=1}^n [X_t X_{t+h,h} − Γ(h)] ⇒ N(0, Σ_h),   (17)

where Σ_h = E(ξ_h ξ_h^T) and ξ_h = Σ_{t=−∞}^∞ P_1(X_t X_{t+h,h}) ∈ L².

In Section 3.1 we will illustrate how to apply Theorem 3 and Corollary 2 to GARCH models. Notice that (14) and (16) allow some non-summable sequences ψ_k. Consider the fractional ARIMA(0, d, 0) model (1 − B)^d X_n = a_n, where B is the back-shift operator (BX_n = X_{n−1}) and −1/2 < d < 1/2. Then X_n = (1 − B)^{−d} a_n = Σ_{i=0}^∞ ψ_i a_{n−i} and, as j → ∞, ψ_j = Γ(j + d)/[Γ(j + 1)Γ(d)] ~ j^{d−1}/Γ(d). If 0 < d < 1/4, then (16) holds. The asymptotic distribution of sample correlations can easily be obtained from Corollary 2.

Remark 3. In Theorem 6.7 of HH (1980, p. 188), a CLT for sample correlations is derived under condition (16), the assumption that {a_t}_{t∈Z} are martingale differences, and

E(a_t²|F_{t−1}) = a positive constant almost surely.   (18)

Relation (18) generalizes the case in which {a_n} are iid. Chung (2002) recently derived various limit theorems for martingale differences {a_t}_{t∈Z} satisfying (18). Unfortunately, the latter assumption appears too restrictive and excludes many important models. One of the most interesting cases is the ARCH model. To see this, let a_t = ε_t (θ_1² + θ_2² a²_{t−1})^{1/2} be the ARCH(1) model, where {ε_t}_{t∈Z} are iid with mean 0 and variance 1 and θ_1 and θ_2 are parameters. Then E(a_t²|F_{t−1}) = θ_1² + θ_2² a²_{t−1}, which cannot be almost surely constant unless θ_2 = 0. Thus the limit theorems of Chung (2002) and HH (1980) cannot be directly applied to linear processes with ARCH innovations. Our results avoid this severe limitation.

Remark 4. If the a_n are martingale differences, then the L^p weak dependence condition trivially holds if a_n ∈ L^p. Since P_1(a_i a_j) = 0 for i > j ≥ 1, condition (15) reduces to

Σ_{i=1}^∞ ‖P_1 a_i²‖ < ∞.
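The FARIMA(0, d, 0) claims can be verified numerically via the standard recursion ψ_j = ψ_{j−1}(j − 1 + d)/j for the coefficients of (1 − B)^{−d}. A sketch (assuming numpy; d = 0.2 is an illustrative value in the range 0 < d < 1/4 where (16) holds):

```python
import numpy as np
from math import gamma

d = 0.2                          # 0 < d < 1/4, so condition (16) should hold
J = 100000
psi = np.empty(J)
psi[0] = 1.0
for j in range(1, J):            # psi_j = psi_{j-1} * (j - 1 + d) / j
    psi[j] = psi[j - 1] * (j - 1 + d) / j

j = np.arange(1, J)
approx = j ** (d - 1) / gamma(d)             # asymptotics psi_j ~ j^{d-1}/Gamma(d)

# condition (16): sum sqrt(t) * psi_t^2; the partial sums should flatten out
s16 = np.cumsum(np.sqrt(np.arange(J)) * psi ** 2)
```

The ratio `psi[-1] / approx[-1]` is close to 1, and the tail increments of `s16` become negligible, consistent with √t ψ_t² ~ t^{2d−3/2} being summable for d < 1/4.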

On the other hand, if (18) holds, then for i ≥ 2, E(a_i²|F_1) = E[E(a_i²|F_{i−1})|F_1] is almost surely a constant and hence P_1 a_i² = 0 almost surely.

3 Applications

The class of non-linear time series models that (1) represents is clearly huge. For example, it includes the threshold autoregressive model (TAR; Tong, 1990), the bilinear model (Meyn and Tweedie, 1994), the generalized autoregressive conditional heteroscedastic model (GARCH; Bollerslev, 1986), the random coefficient autoregressive model (RCA; Nicholls and Quinn, 1982), etc. Here we argue that these widely used non-linear time series models are L^p weakly dependent and, moreover, that ‖P_1 a_n‖_p decays to 0 exponentially fast; namely, there exist C > 0 and ρ ∈ (0,1) such that ‖P_1 a_n‖_p ≤ Cρ^n for all n ≥ 0. In this section C > 0 and ρ ∈ (0,1) stand for constants which may vary from line to line. Let {ε′_t}_{t∈Z} be an iid copy of {ε_t}_{t∈Z} and let a′_n = F(..., ε′_{−1}, ε′_0, ε_1, ..., ε_n) be a coupled version of a_n. We say that {a_t} is geometrically moment contracting (GMC) if there exists α > 0 such that for all n ≥ 0,

E(|a_n − a′_n|^α) ≤ Cρ^n.   (19)

The GMC condition (19) implies that the process {a_t} forgets the past F_0 at an exponential rate, as measured by the distance between a_n and its coupled version a′_n. Recently, Hsing and Wu (2003) obtained asymptotic theory for U-statistics of processes satisfying (19).

Proposition 2. (i) If (19) holds with some α ≥ 1, then ‖E(a_n|F_0)‖_α ≤ Cρ^n and hence ‖P_1 a_n‖_α ≤ Cρ^n. (ii) Assume that {a_n} satisfies (19) for some α_0 > 0 and a_n ∈ L^q for some q > 1. Then (19) holds for all α ∈ (0, q).

Proposition 3. Assume that {a_t} ⊂ L⁴ and that (19) holds with α = 4. Then there exist C > 0 and ρ ∈ (0,1) such that for all t, k ≥ 0,

‖P_1(a_t a_{t+k})‖ ≤ Cρ^{t+k}.   (20)

Hence {a_t} satisfies (15).

Propositions 2 and 3 make Theorems 1 and 2 and Corollary 2 applicable by providing easily verifiable and mild conditions. An important special class of (1) is the so-called
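The GMC property (19) is easy to probe by simulation: run two copies of a recursion from different pasts, driven by the same innovations from time 1 onward, and watch E|a_n − a′_n| decay geometrically. A sketch for a TAR(1) recursion (assuming numpy; the parameters are illustrative and satisfy the contraction condition of Example 2 below):

```python
import numpy as np

rng = np.random.default_rng(2)

def tar1_step(a, eps, phi1=0.6, phi2=-0.4):
    """One step of TAR(1): a_n = phi1*max(a,0) + phi2*max(-a,0) + eps."""
    return phi1 * max(a, 0.0) + phi2 * max(-a, 0.0) + eps

reps, horizon = 2000, 30
gap = np.zeros(horizon)
for _ in range(reps):
    a = rng.standard_normal()       # state built from the "true" past
    ap = rng.standard_normal()      # state built from an iid copy of the past
    for n in range(horizon):
        e = rng.standard_normal()   # shared innovations eps_1, eps_2, ...
        a, ap = tar1_step(a, e), tar1_step(ap, e)
        gap[n] += abs(a - ap)
gap /= reps                         # Monte Carlo estimate of E|a_n - a'_n|
```

Since the shared innovation cancels, each step contracts |a − a′| by at least max(|φ_1|, |φ_2|) = 0.6, so `gap` decays at least geometrically, exactly the behavior (19) quantifies.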

iterated random functions. Let G(·, ·) be a bivariate measurable function with Lipschitz constant L_ε = sup_{x≠x′} |G(x, ε) − G(x′, ε)|/|x − x′| and let Z_n be defined recursively by

Z_n = G(Z_{n−1}, ε_n).   (21)

Diaconis and Freedman (1999) show that {Z_n} has a unique stationary distribution if

E(log L_ε) < 0, E(L_ε^α) < ∞ and E[|z_0 − G(z_0, ε)|^α] < ∞   (22)

hold for some α > 0 and z_0. The same set of conditions actually also implies the GMC property (19) (cf. Lemma 3 in Wu and Woodroofe, 2000).

Example 2. For the TAR(1) model a_n = φ_1 max(a_{n−1}, 0) + φ_2 max(−a_{n−1}, 0) + ε_n, condition (22) is satisfied when L_ε = max(|φ_1|, |φ_2|) < 1 and E(|ε_0|^α) < ∞ for some α > 0. For the bilinear model a_n = (α_1 + β_1 ε_n)a_{n−1} + ε_n, where α_1 and β_1 are real parameters and E(|ε_0|^α) < ∞ for some α > 0, we have a closed form for the Lipschitz constant, L_ε = |α_1 + β_1 ε|, and (22) holds if E(L_ε^α) < 1. Similarly, for the RCA model a_n = (φ_1 + η_n)a_{n−1} + ε_n, where the η_n are iid, the Lipschitz constant is L_ε = |φ_1 + η_n|.

3.1 GARCH Models

Let ε_t, t ∈ Z, be iid random variables with mean 0 and variance 1; let a_t = √(h_t) ε_t and

h_t = α_0 + α_1 a²_{t−1} + ... + α_q a²_{t−q} + β_1 h_{t−1} + ... + β_p h_{t−p}   (23)

be the generalized autoregressive conditional heteroscedastic model GARCH(p, q), where α_0 > 0, α_j ≥ 0 for 1 ≤ j ≤ q and β_i ≥ 0 for 1 ≤ i ≤ p. Chen and An (1998) show that if

Σ_{j=1}^q α_j + Σ_{i=1}^p β_i < 1,   (24)

then a_t is strictly stationary. Notice that the a_t form martingale differences. Hence P_1 a_t = 0 for t ≥ 2, P_1 a_1 = a_1, and (4) holds for any r ≥ 0 if a_1 ∈ L^p. The existence of moments for GARCH models has been widely studied; see He and Teräsvirta (1999), Ling (1999), Ling and McAleer (2002) and the references therein.
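A GARCH(1,1) simulation illustrates the two facts used above: a_t is a martingale difference sequence (so a_t itself is uncorrelated), while a_t² is autocorrelated. A sketch (assuming numpy; the parameters are illustrative and satisfy (24)):

```python
import numpy as np

rng = np.random.default_rng(3)

def garch11(n, alpha0=0.1, alpha1=0.15, beta1=0.8, burn=1000):
    """GARCH(1,1): a_t = sqrt(h_t)*eps_t, h_t = alpha0 + alpha1*a_{t-1}^2 + beta1*h_{t-1}.
    Strictly stationary when alpha1 + beta1 < 1, i.e. condition (24)."""
    assert alpha1 + beta1 < 1
    eps = rng.standard_normal(n + burn)
    h = np.empty(n + burn)
    a = np.empty(n + burn)
    h[0] = alpha0 / (1 - alpha1 - beta1)   # start at the stationary mean of h_t
    a[0] = np.sqrt(h[0]) * eps[0]
    for t in range(1, n + burn):
        h[t] = alpha0 + alpha1 * a[t - 1] ** 2 + beta1 * h[t - 1]
        a[t] = np.sqrt(h[t]) * eps[t]
    return a[burn:]

a = garch11(50000)
m = a.mean()                               # should be near 0
r1 = np.mean(a[:-1] * a[1:])               # lag-1 autocovariance of a_t: near 0
r1_sq = np.corrcoef(a[:-1] ** 2, a[1:] ** 2)[0, 1]   # but a_t^2 is correlated
```

The contrast between `r1` (negligible) and `r1_sq` (clearly positive) is precisely why (18) fails for ARCH/GARCH innovations while the weak dependence condition (4) still holds.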

Let Y_t = (a_t², ..., a²_{t−q+1}, h_t, ..., h_{t−p+1})^T and b_t = (α_0 ε_t², 0, ..., 0, α_0, 0, ..., 0)^T. It is well known that GARCH models admit the representation (see, for example, Taniguchi and Kakizawa (2000))

Y_t = M_t Y_{t−1} + b_t,

where M_t is the (p+q) × (p+q) random matrix whose first row is (α_1 ε_t², ..., α_{q−1} ε_t², α_q ε_t², β_1 ε_t², ..., β_{p−1} ε_t², β_p ε_t²), whose (q+1)-st row is (α_1, ..., α_{q−1}, α_q, β_1, ..., β_{p−1}, β_p), and whose remaining rows are shift rows of the identity pattern, so that the remaining components of Y_t simply copy entries of Y_{t−1}.

For a square matrix M let ρ(M) be its largest eigenvalue. Let ⊗ be the usual Kronecker product, and let |Y| be the Euclidean length of the vector Y. Assume E(ε_t⁴) < ∞. Ling (1999) shows that if ρ[E(M_t ⊗ M_t)] < 1, then a_t has a stationary distribution and E(a_t⁴) < ∞. Ling and McAleer (2002) argue that the condition ρ[E(M_t ⊗ M_t)] < 1 is also necessary for the finiteness of the fourth moment. Our Proposition 4 states that the same condition actually implies (19) as well.

Proposition 4. Assume that the ε_t are iid with mean 0 and variance 1, E(ε_t⁴) < ∞ and

ρ[E(M_t ⊗ M_t)] < 1.   (25)

Then E(|a_n − a′_n|⁴) ≤ Cρ^n for some C < ∞ and ρ ∈ (0,1); namely, (19) holds.

Under the conditions of Proposition 4, it is clear that Proposition 3, and hence Theorem 3, is applicable, since ‖P_1(a_i a_j)‖ and ‖P_1 a_i‖ decay to zero exponentially fast. To derive asymptotic distributions related to stationary processes, traditional approaches normally require strong mixing conditions. However, it is a challenging problem to obtain strong mixing properties of GARCH models; see Carrasco and Chen (2002) for a recent attempt. Our approach, by contrast, provides a framework that completely avoids strong mixing conditions.
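For GARCH(1,1) the condition ρ[E(M_t ⊗ M_t)] < 1 can be evaluated explicitly. A sketch (assuming numpy and Gaussian ε_t, so E ε_t⁴ = 3; the interpolation trick below simply recovers the moments of ε_t² entering the Kronecker product):

```python
import numpy as np

alpha1, beta1 = 0.15, 0.8        # illustrative GARCH(1,1) parameters

def EMkronM(alpha1, beta1, m1=1.0, m2=3.0):
    """E(M_t kron M_t) for GARCH(1,1), where M_t = [[a1*e, b1*e], [a1, b1]]
    with e = eps_t^2, E e = m1, E e^2 = m2 (m2 = 3 for Gaussian eps_t)."""
    def M(e):
        return np.array([[alpha1 * e, beta1 * e], [alpha1, beta1]])
    # entries of kron(M(e), M(e)) are polynomials in e of degree <= 2;
    # recover the coefficients by quadratic interpolation at e = 0, 1, 2
    K0, K1, K2 = (np.kron(M(e), M(e)) for e in (0.0, 1.0, 2.0))
    c0 = K0                      # constant term
    c2 = (K2 - 2 * K1 + K0) / 2  # e^2 coefficient
    c1 = K1 - K0 - c2            # e coefficient
    return c0 + c1 * m1 + c2 * m2

rho = max(abs(np.linalg.eigvals(EMkronM(alpha1, beta1))))
```

For these parameters the largest eigenvalue equals 3α_1² + 2α_1β_1 + β_1² = 0.9475 < 1, which is the classical fourth-moment condition for Gaussian GARCH(1,1); so E(a_t⁴) < ∞ and, by Proposition 4, (19) holds.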

4 Proofs

We assume E(a_n) = 0 throughout. Recall A_n = Σ_{i=n}^∞ ψ_i², Ψ_n = Σ_{i=0}^n ψ_i, B_n² = Σ_{i=0}^{n−1} Ψ_i² and (3) for σ_n².

Lemma 2. Let {D_i, i ∈ N} be a martingale difference sequence in L^p for some p ≥ 2. Then ‖Σ_{i=1}^∞ D_i‖_p ≤ C_p [Σ_{i=1}^∞ ‖D_i‖_p²]^{1/2}, where C_p = 18 p^{3/2}/(p − 1)^{1/2}.

Proof of Lemma 2. It is a straightforward consequence of Burkholder's inequality (cf. Chow and Teicher, 1978) and Minkowski's inequality:

‖Σ_{i=1}^∞ D_i‖_p^p ≤ C_p^p E[(Σ_{i=1}^∞ D_i²)^{p/2}] ≤ C_p^p [Σ_{i=1}^∞ ‖D_i²‖_{p/2}]^{p/2},

since ‖·‖_{p/2} is a norm when p/2 ≥ 1. ∎

Part (i) of Lemma 3 is also used in WLG (2002). For the sake of completeness, we provide a proof here. Part (ii) is well known and is an easy consequence of Karamata's theorem.

Lemma 3. (i) Let ψ_k = l(k)/k, k ≥ 1, and assume that Σ_{k=1}^∞ ψ_k = ∞. Then Ψ_n is slowly varying, l(n)/Ψ_n → 0, Ψ_n ~ Σ_{k=0}^n |ψ_k|, σ_n² ~ nΨ_n² and lim_{n→∞} (1/σ_n²) Σ_{j=n}^∞ (Ψ_j − Ψ_{j−n})² = 0. (ii) Let ψ_k = l(k)/k^β, k ≥ 1, with 1/2 < β < 1. Then Ψ_n ~ n^{1−β} l(n)/(1 − β) and σ_n ~ n^{3/2−β} l(n) c_β, where c_β² = ∫_0^∞ [x^{1−β} − max(x − 1, 0)^{1−β}]² dx/(1 − β)².

Proof of Lemma 3. (i) Since l is slowly varying, there exists N_0 ∈ N such that either l(n) > 0 for all n ≥ N_0 or l(n) < 0 for all n ≥ N_0. Without loss of generality we assume the former. Then Σ_{k=1}^∞ ψ_k = ∞ implies that lim_{n→∞} Ψ_n/Σ_{k=0}^n |ψ_k| = 1. For any 0 < δ < 1 and G > 1,

0 ≤ (Ψ_n − Ψ_{⌊δn⌋})/(Ψ_{⌊δn⌋} − Ψ_{⌊δn/G⌋}) ≤ [Σ_{k=⌊δn⌋}^n ψ_k]/[Σ_{k=⌊δn/G⌋}^{⌊δn⌋−1} ψ_k] ~ [l(n) Σ_{k=⌊δn⌋}^n 1/k]/[l(δn) Σ_{k=⌊δn/G⌋}^{⌊δn⌋−1} 1/k] → log δ^{−1}/log G,

which approaches 0 as G → ∞. Thus, by definition, Ψ_n is a slowly varying function. The same argument also implies that lim_{n→∞} l(n)/Ψ_n = 0. By Karamata's theorem, σ_n² ~ Σ_{j=0}^{n−1} Ψ_j² ~ nΨ_n² since Ψ_n is slowly varying. For any fixed δ > 0,

(1/(nΨ_n²)) Σ_{j=⌊(1+δ)n⌋}^∞ (Ψ_j − Ψ_{j−n})² = (1/(nΨ_n²)) Σ_{j=⌊(1+δ)n⌋}^∞ O(nψ_j)² → 0

15 since l(n)/ψ n 0 and both l(n) and Ψ n are slowly varying. Thus 1 nψ 2 n j=n (Ψ j Ψ j n ) 2 = 1 nψ 2 n 1 nψ 2 n (1+δ)n j=n (1+δ)n j=n (Ψ j Ψ j n ) 2 2(Ψ 2 j + Ψ 2 j n) 4δ which completes the proof of (i) since δ > 0 is arbitrarily chosen. Lemma 4. Let γ > 0 and l(n) be a slowly varying function. Then In particular, lim max 1 k n (k/n) γ l(k)/l(n) = 1. Proof of Lemma 4. lim max 1 k n (k/n)γ l(k)/l(n) 1 = 0. (26) Let l(m) have the representation l(m) = c m e m 1 η(u)/udu (Bingham, Goldie and Teugels, 1987) with c m c > 0. Choose K 0 N be sufficiently large such that i γ/4 l(i) i γ/4 holds for all i K 0. Then max K 0 k n and for any δ (0, 1), max (k/n) γ l(k)/l(n) 1 = 0, (27) 1 k K 0 ( k n )γ l(k) l(n) max (k nδ k n n )γ l(k) l(n) 1 = 1 By (27), (28) and (29), for sufficiently large n, max K 0 k n ( k n )γ k γ/4 n γ/4 = 0, (28) max l(k) 1 = 0. (29) nδ k n l(n) max 1 k n (k/n)γ l(k)/l(n) 1 max (k/n) γ l(k)/l(n) 1 n k δn δ γ + max (k/n) γ c k /c n e n k η(u)/udu n k δn δ γ + max (k/n) γ e max n u δn η(u) n k 1/udu δ γ + δ γ/2. n k δn Thus the lemma follows since max n u δn η(u) γ/2 for sufficiently large n and δ is arbitrarily chosen. 14

Lemma 5. Let {η_k, k ∈ Z} be a stationary and ergodic process with mean 0 and E|η_1| < ∞, and define Δ_n = Σ_{j=0}^∞ (1/σ_n²)(Ψ_j − Ψ_{j−n})² η_j, where {ψ_k, k ≥ 0} satisfies the conditions of Theorem 2 (1/2 < β < 1) or of (i) of Lemma 3 (the case β = 1). Then lim_{n→∞} E|Δ_n| = 0.

Proof of Lemma 5. Let T_n = Σ_{k=0}^{n−1} η_k and δ_k = sup_{n≥k} E|T_n|/n. By the ergodic theorem, δ_k → 0. We first consider the case 1/2 < β < 1. For any fixed M > 1 write

Δ_n = [Σ_{j=0}^{n−1} + Σ_{j=n}^{(1+M)n} + Σ_{j=(1+M)n+1}^∞] (1/σ_n²)(Ψ_j − Ψ_{j−n})² η_j =: Δ_{n,1} + Ω_{n,M} + Ξ_{n,M}.   (30)

In the sequel we will show that lim E|Δ_{n,1}| = 0 and lim E|Ω_{n,M}| = 0. Using the Abelian summation technique, Δ_{n,1} = Ψ²_{n−1} T_n/σ_n² − Σ_{j=1}^{n−1} (1/σ_n²)(Ψ_j² − Ψ²_{j−1}) T_j. Thus

E|Δ_{n,1}| ≤ Ψ²_{n−1} E|T_n|/σ_n² + Σ_{j=1}^{n−1} |Ψ_j² − Ψ²_{j−1}| E|T_j|/σ_n².   (31)

Since E|T_n| = o(n) and nΨ²_{n−1}/σ_n² = O(1) by Lemma 3, the first term in the preceding display vanishes. For the second one, let K ≥ 1 be a fixed integer. Then

E|Δ_{n,1}| ≤ o(1) + Σ_{j=1}^K |Ψ_j² − Ψ²_{j−1}| E|T_j|/σ_n² + δ_K Σ_{j=K}^{n−1} j|Ψ_j² − Ψ²_{j−1}|/σ_n².   (32)

Observe that, as j → ∞, |Ψ_j² − Ψ²_{j−1}| = |ψ_j||2Ψ_{j−1} + ψ_j| ~ 2j^{1−2β} l²(j)/(1 − β). Hence

Σ_{j=1}^{n−1} j|Ψ_j² − Ψ²_{j−1}|/σ_n² = Σ_{j=1}^{n−1} O[j^{2−2β} l²(j)]/σ_n² = O[n^{3−2β} l²(n)]/σ_n² = O(1)   (33)

by Karamata's theorem. So (32) implies lim E|Δ_{n,1}| = 0, since K is arbitrarily chosen and δ_K → 0 as K → ∞. The claim lim E|Ω_{n,M}| = 0 can be proved similarly. Actually, by the same arguments as in (31) and (32), it suffices to show the analogue of (33):

(1/σ_n²) Σ_{j=1}^{nM} j |(Ψ_{n+j−1} − Ψ_{j−1})² − (Ψ_{n+j} − Ψ_j)²| = O(1).   (34)

Simple algebra gives

|(Ψ_{n+j−1} − Ψ_{j−1})² − (Ψ_{n+j} − Ψ_j)²| ≤ 2|ψ_{n+j} − ψ_j||Ψ_{n+j−1} − Ψ_{j−1}| + |ψ_{n+j} − ψ_j|².   (35)

Since |ψ_{n+j} − ψ_j| = O[j^{−β} l(j)] and |Ψ_{n+j−1} − Ψ_{j−1}| = O[n^{1−β} l(n)] for 1 ≤ j ≤ nM, (34) follows from Σ_{j=1}^{nM} j·O[j^{−β} l(j)]² = O[n^{2−2β} l²(n)] = O(σ_n²/n) and

(1/σ_n²) Σ_{j=1}^{nM} j|ψ_{n+j} − ψ_j||Ψ_{n+j−1} − Ψ_{j−1}| = Σ_{j=1}^{nM} O[j^{1−β} l(j) n^{1−β} l(n)]/σ_n² = O[n^{3−2β} l²(n)]/σ_n² = O(1)

by Karamata's theorem. Let M ≥ 1. Since l is slowly varying, there exists a constant c > 0 such that, for all sufficiently large n, |Ψ_j − Ψ_{j−n}| ≤ c n j^{−β} l(j) holds for all j ≥ j_n = (1+M)n + 1. Therefore, by (30) and Karamata's theorem,

E|Ξ_{n,M}| ≤ (1/σ_n²) Σ_{j=(1+M)n+1}^∞ [c n j^{−β} l(j)]² E|η_1| ≤ c² n² j_n^{1−2β} l²(j_n)/[(2β − 1)σ_n²] E|η_1| ≤ c_1 M^{1−2β}

for some c_1 < ∞. Hence lim E|Δ_n| = 0 by letting M → ∞, since 1 − 2β < 0. For the case β = 1, by (i) of Lemma 3 and (30), the result follows as in the proof of the case 1/2 < β < 1 from

Σ_{j=1}^{n−1} j|Ψ_j² − Ψ²_{j−1}|/σ_n² = Σ_{j=1}^{n−1} O(j|ψ_j|Ψ_j)/σ_n² = Σ_{j=1}^{n−1} O(|l(j)|Ψ_j)/σ_n² = o(1),

since l(j)Ψ_j is also a slowly varying function and nl(n)Ψ_n/σ_n² → 0 by Lemma 3. ∎

Lemma 6. Assume that {a_n} is L^p (p ≥ 2) weakly dependent with order 1. Then for S_n = Σ_{i=1}^n X_i there exists a constant C, independent of n, such that for all n ∈ N,

‖S_n‖_p ≤ Cσ_n.   (36)

Proof of Lemma 6. Observe that E(a_t|F_0) = Σ_{k=−∞}^0 P_k a_t. Then

Σ_{t=0}^∞ ‖E(a_t|F_0)‖_p ≤ Σ_{t=0}^∞ Σ_{k=−∞}^0 ‖P_k a_t‖_p = Σ_{t=0}^∞ Σ_{j=t+1}^∞ ‖P_1 a_j‖_p = Σ_{t=1}^∞ t‖P_1 a_t‖_p < ∞.   (37)

Hence b_k := Σ_{t=k}^∞ E(a_t|F_k) ∈ L^p, and the Poisson equation b_k = a_k + E(b_{k+1}|F_k) holds. Let d_k = b_k − E(b_k|F_{k−1}), which by definition are stationary and ergodic martingale differences. Let X*_t = Σ_{j=0}^∞ ψ_j d_{t−j}, X#_t = X_t − X*_t, S*_n = Σ_{t=1}^n X*_t and S#_n = Σ_{t=1}^n X#_t. Then

S*_n = Σ_{j=0}^∞ (Ψ_j − Ψ_{j−n}) d_{n−j} and S#_n = Σ_{j=0}^∞ ψ_j [E(b_{1−j}|F_{−j}) − E(b_{n+1−j}|F_{n−j})].   (38)

The essence of our approach is to approximate S_n by S*_n, which admits a martingale structure. By Lemma 2, ‖S*_n‖_p ≤ C_p σ_n ‖d_0‖_p. Inequality (37) implies that ‖d_0‖_p ≤ 2‖b_0‖_p < ∞. To establish (36), it then remains to verify that Σ_{j=0}^∞ ψ_j E(b_{1−j}|F_{−j}) ∈ L^p, which entails ‖S#_n‖_p = O(1). To this end, for k ≥ 0 let V_k = Σ_{i=−∞}^0 ψ_{−i} P_{i−k} b_{1+i}. Then Σ_{j=0}^∞ ψ_j E(b_{1−j}|F_{−j}) = Σ_{k=0}^∞ V_k. Since the P_{i−k} b_{1+i} form martingale differences in i, by Lemma 2,

‖V_k‖_p² ≤ C_p² Σ_{i=−∞}^0 ψ²_{−i} ‖P_{i−k} b_{1+i}‖_p² = C_p² (Σ_{j=0}^∞ ψ_j²) ‖P_1 b_{1+k}‖_p².

Therefore,

‖Σ_{j=0}^∞ ψ_j E(b_{1−j}|F_{−j})‖_p ≤ Σ_{k=0}^∞ ‖V_k‖_p ≤ C_p A_0^{1/2} Σ_{k=0}^∞ ‖P_1 b_{1+k}‖_p,

which is finite in view of

Σ_{k=0}^∞ ‖P_1 b_{1+k}‖_p ≤ Σ_{k=0}^∞ Σ_{t=k+1}^∞ ‖P_1 E(a_t|F_{k+1})‖_p = Σ_{k=0}^∞ Σ_{t=k+1}^∞ ‖P_1 a_t‖_p = Σ_{t=1}^∞ t‖P_1 a_t‖_p < ∞.

Here we have applied P_1 E(a_t|F_{k+1}) = P_1 a_t for t ≥ k + 1. Thus ‖S#_n‖_p = O(1). ∎

Proof of Corollary 1. It follows from Lemma 1 in view of

Σ_{t=1}^∞ ‖P_1 X_t‖ ≤ Σ_{t=1}^∞ Σ_{i=0}^∞ |ψ_i| ‖P_1 a_{t−i}‖ = Σ_{i=0}^∞ |ψ_i| Σ_{k=1}^∞ ‖P_1 a_k‖ < ∞,

since P_1 a_k = 0 for k ≤ 0. ∎

Proof of Theorem 1. By the weak convergence theory of random functions, it suffices to establish (i) finite-dimensional convergence and (ii) tightness of the random function sequence W_n(t). To show the finite-dimensional convergence, let b_k = Σ_{t=k}^∞ E(a_t|F_k) ∈ L² and d_k = b_k − E(b_k|F_{k−1}), which are in L² by the proof of Lemma 6. Recall (38) for S#_n and S*_n. Then (6) implies that ‖E(S*_n|F_0)‖ = o(‖S*_n‖) and ‖S*_n‖ ~ B_n‖d_1‖. By Theorem 1 in

Wu and Woodroofe (2003), l_1(n) := ‖S*_n‖/√n is a slowly varying function, and moreover, for H_{ni} = Ψ̄_n d_i, where Ψ̄_n = Σ_{j=0}^{n−1} Ψ_j/n, we have

max_{1≤k≤n} ‖S*_k − Σ_{i=1}^k H_{ni}‖ = o(‖S*_n‖).   (39)

Then ‖S*_n‖² ~ ‖Σ_{i=1}^n H_{ni}‖² = n‖H_{n1}‖² and, since ‖S#_n‖ = O(1),

l*(n) := ‖S_n‖/√n ~ ‖S*_n‖/√n = l_1(n) ~ ‖d_1‖|Ψ̄_n|.

Therefore, the finite-dimensional convergence trivially follows from (39), since for 0 < t < 1,

S_{⌊nt⌋}/(√n Ψ̄_n) = Σ_{i=1}^{⌊nt⌋} H_{ni}/(√n Ψ̄_n) + o_P(1) = Σ_{i=1}^{⌊nt⌋} d_i/√n + o_P(1) ⇒ N(0, t‖d_1‖²).

For the tightness, by Theorem 12.3 in Billingsley (1968), we need to show that there exist a constant C < ∞ and τ > 1 such that for all 1 ≤ k ≤ n,

E[|W_n(k/n)|^α] ≤ C(k/n)^τ.   (40)

We claim that (40) holds for τ = (2 + α)/4 > 1. By Lemma 6, (40) is reduced to

max_{1≤k≤n} (n/k)^{(2+α)/(4α)} ‖S_k‖/‖S_n‖ = max_{1≤k≤n} (n/k)^{(2+α)/(4α)} [√k l*(k)]/[√n l*(n)] < ∞.

By Lemma 4, the limit of the preceding display is actually 1. ∎

Proof of Theorem 2. As in the proof of Theorem 1, we need to verify the finite-dimensional convergence and the tightness. Since {a_n} is L² weakly dependent with order 1, from the proof of Lemma 6 we can define b_k = Σ_{t=k}^∞ E(a_t|F_k) ∈ L² and d_k = b_k − E(b_k|F_{k−1}). Recall (38) for S#_n and S*_n. Since ‖S#_n‖ = O(1), it suffices to show that S*_n/σ_n ⇒ N(0, ‖d_1‖²). To this end, we shall apply the martingale central limit theorem. Let ζ_{n,i} = (Ψ_i − Ψ_{i−n})/σ_n. Then Σ_{t=1}^n X*_t/σ_n = Σ_{i=0}^∞ ζ_{n,i} d_{n−i} and Σ_{i=0}^∞ ζ²_{n,i} = 1 for each n. The Lindeberg condition is obvious in view of

sup_{i≥0} ζ_{n,i} ≤ sup_{0≤i≤2n} ζ_{n,i} + sup_{i>2n} ζ_{n,i} = O(n sup_{j≥n} |ψ_j|/σ_n) + O(n^{1−β} l(n)/σ_n) = O(1/√n).

Let $\eta_{n,i} = E(d_{n-i}^2 \mid \mathcal F_{n-i-1}) - E(d_{n-i}^2)$. Then the convergence of the conditional variance easily follows from Lemma 6, which asserts that $\lim_{n\to\infty} E|\sum_{i=0}^\infty \zeta_{n,i}^2 \eta_{n,i}| = 0$. For the tightness, we shall show that (40) holds with $\alpha = 2$ and $\tau = H + 1/2$. By (ii) of Lemma 3, $\ell^*(n) = \sigma_n/n^H$ is slowly varying. Then (40) is equivalent to
$$\max_{1 \le k \le n} \Big( \frac{k}{n} \Big)^{H - 1/2} \Big[ \frac{\ell^*(k)}{\ell^*(n)} \Big]^2 < \infty,$$
which is an easy consequence of Lemma 4.

Proof of Theorem 3. Let $z_{t,1} = \sum_{i=0}^{t-1} \psi_i a_{t-i}$ and $z_{t,0} = \sum_{i=t}^\infty \psi_i a_{t-i}$. Then $X_t = z_{t,1} + z_{t,0}$. Since $z_{t,0}$ is measurable with respect to $\mathcal F_0$, $P_1 z_{t,0}^2 = 0$, and by the triangle inequality,
$$\|P_1 X_t^2\| \le \|P_1 z_{t,1}^2\| + 2 \|z_{t,0} P_1 z_{t,1}\| \le \|P_1 z_{t,1}^2\| + 2 \|z_{t,0}\|_4 \|P_1 z_{t,1}\|_4.$$
We will show that $\sum_{t=1}^\infty \|P_1 z_{t,1}^2\| < \infty$ and $\sum_{t=1}^\infty \|z_{t,0}\|_4 \|P_1 z_{t,1}\|_4 < \infty$. For the former,
$$\frac12 \sum_{t=1}^\infty \|P_1 z_{t,1}^2\| \le \sum_{t=1}^\infty \sum_{0 \le i \le i' < t} |\psi_i \psi_{i'}| \, \|P_1 a_{t-i} a_{t-i'}\| \le \sum_{i=0}^\infty \sum_{k=1}^\infty \sum_{j=0}^\infty |\psi_i \psi_{i+j}| \, \|P_1 a_k a_{k+j}\| \le A_0 \sum_{k=1}^\infty \sum_{j=0}^\infty \|P_1 a_k a_{k+j}\|,$$
where the last inequality is due to $\sum_{i=0}^\infty |\psi_i \psi_{i+j}| \le \sum_{i=0}^\infty \psi_i^2 = A_0$. Hence (15) entails the former inequality. By (14), the latter one holds if we can show that $\|z_{t,0}\|_4 \le C A_t^{1/2}$ for some constant $C > 0$, in view of
$$\sum_{t=1}^\infty A_t^{1/2} \|P_1 z_{t,1}\|_4 \le \sum_{t=1}^\infty A_t^{1/2} \sum_{j=0}^{t-1} |\psi_j| \, \|P_1 a_{t-j}\|_4 = \sum_{j=0}^\infty |\psi_j| \sum_{t=j+1}^\infty A_t^{1/2} \|P_1 a_{t-j}\|_4 \le \sum_{j=0}^\infty |\psi_j| A_{j+1}^{1/2} \Big( \sum_{t=j+1}^\infty \|P_1 a_{t-j}\|_4 \Big) < \infty.$$
To prove $\|z_{t,0}\|_4 \le C A_t^{1/2}$, define $U_k = \sum_{j=t}^\infty \psi_j E(a_{t-j} \mid \mathcal F_{t-j-k})$, $k \ge 0$. Then $U_0 = z_{t,0}$ and $U_k - U_{k+1} = \sum_{j=t}^\infty \psi_j P_{t-j-k} a_{t-j} = \sum_{i=-\infty}^{-t} \psi_{-i} P_{t+i-k} a_{t+i}$. Observe that the summands $\psi_{-i} P_{t+i-k} a_{t+i}$ form martingale differences in $i$. By Lemma 2,
$$\|U_k - U_{k+1}\|_4 \le C_4 \Big\{ \sum_{i=-\infty}^{-t} \|\psi_{-i} P_{t+i-k} a_{t+i}\|_4^2 \Big\}^{1/2} = C_4 A_t^{1/2} \|P_0 a_k\|_4,$$

which implies that
$$\|z_{t,0}\|_4 = \Big\| \sum_{k=0}^\infty (U_k - U_{k+1}) \Big\|_4 \le \sum_{k=0}^\infty \|U_k - U_{k+1}\|_4 = O(A_t^{1/2})$$
since $\{a_k\}$ are $L^4$ weakly dependent, namely $\sum_{k=1}^\infty \|P_0 a_k\|_4 < \infty$.

Proof of Proposition 1. By the Cauchy inequality, the proposition follows from
$$\Big[ \sum_{t=1}^\infty |\psi_t| A_{t+1}^{1/2} \Big]^2 \le \Big[ \sum_{t=1}^\infty t^{1/2} \psi_t^2 \Big] \Big[ \sum_{t=1}^\infty t^{-1/2} A_{t+1} \Big] \le \Big[ \sum_{t=1}^\infty t^{1/2} \psi_t^2 \Big] \Big[ 2 \sum_{i=2}^\infty i^{1/2} \psi_i^2 \Big]$$
in view of $\sum_{t=1}^{i-1} t^{-1/2} \le 2 i^{1/2}$ for $i \ge 2$.

Proof of Corollary 2. We first consider $h = 1$. Let $\tilde\psi_0 = \psi_0$, $\tilde\psi_k = \psi_k + \psi_{k-1}$ for $k \ge 1$, and $\tilde A_k = \sum_{i=k}^\infty \tilde\psi_i^2$. Since the $\tilde\psi_k$ are square summable, it is easily seen that (14) implies $\sum_{i=0}^\infty |\tilde\psi_i| \tilde A_{i+1}^{1/2} < \infty$, which by Theorem 3 entails $\sum_{t=1}^\infty \|P_1 (X_t + X_{t+1})^2\| < \infty$ in view of $X_t + X_{t+1} = \sum_{k=0}^\infty \tilde\psi_k a_{t+1-k}$. Similarly, $\sum_{t=1}^\infty \|P_1 (X_t - X_{t+1})^2\| < \infty$. Hence, by the elementary identity $X_t X_{t+1} = [(X_t + X_{t+1})^2 - (X_t - X_{t+1})^2]/4$ and the triangle inequality, $\sum_{t=1}^\infty \|P_1 (X_t X_{t+1})\| < \infty$. Therefore, $\sum_{t=1}^\infty \|P_1 (X_t X_{t+h})\| < \infty$ easily follows for general $h$, and (17) holds by the Cramer-Wold device.

Proof of Proposition 3. We shall show that $\|P_1 a_t a_{t+k}\| \le C \rho^k$ and $\|P_1 a_t a_{t+k}\| \le C \rho^t$ hold respectively, which imply that $\|P_1 a_t a_{t+k}\| \le C \min(\rho^k, \rho^t) \le C \rho^{(k+t)/2}$. In this proof the constants $C > 0$ and $\rho \in (0,1)$ may change from line to line. For the former, for $t, k \ge 0$, by Cauchy's inequality,
$$\|E(a_t a_{t+k} \mid \mathcal F_0)\| = \|E(E(a_t a_{t+k} \mid \mathcal F_t) \mid \mathcal F_0)\| \le \|E(a_t a_{t+k} \mid \mathcal F_t)\| = \|E(a_0 a_k \mid \mathcal F_0)\| = \|E(a_0 (a_k - a_k') \mid \mathcal F_0)\| \le \|a_0\|_4 \|a_k - a_k'\|_4 \le C \rho^k.$$
Thus $\|P_1 a_t a_{t+k}\| \le \|E(a_t a_{t+k} \mid \mathcal F_1)\| + \|E(a_t a_{t+k} \mid \mathcal F_0)\| \le C \rho^k$. For the latter, observe that $\gamma_k = E(a_t a_{t+k}) = E(a_t' a_{t+k}' \mid \mathcal F_0)$, and
$$\|E(a_t a_{t+k} \mid \mathcal F_0) - \gamma_k\| = \|E(a_t a_{t+k} - a_t' a_{t+k}' \mid \mathcal F_0)\| \le \|a_t a_{t+k} - a_t' a_{t+k}'\| \le \|a_t (a_{t+k} - a_{t+k}')\| + \|(a_t - a_t') a_{t+k}'\| \le \|a_t\|_4 \|a_{t+k} - a_{t+k}'\|_4 + \|a_t - a_t'\|_4 \|a_{t+k}'\|_4 \le C \rho^{t+k} + C \rho^t \le C \rho^t,$$

which, combined with the similar inequality $\|E(a_t a_{t+k} \mid \mathcal F_1) - \gamma_k\| \le C \rho^t$, yields $\|P_1 a_t a_{t+k}\| \le C \rho^t$ via the triangle inequality.

Proof of Proposition 2. (i) Since $E(a_n' \mid \mathcal F_0) = 0$, it easily follows from $\|E(a_n - a_n' \mid \mathcal F_0)\|_\alpha \le \|a_n - a_n'\|_\alpha$. (ii) Let $\lambda_n = \rho^{n/(2\alpha)}$. By Markov's inequality, (19) implies $P(|a_n - a_n'| \ge \lambda_n) \le C \lambda_n^\alpha$. By Hölder's inequality,
$$E\big[ |a_n - a_n'|^p \big( \mathbf 1_{|a_n - a_n'| \ge \lambda_n} + \mathbf 1_{|a_n - a_n'| < \lambda_n} \big) \big] \le \|a_n - a_n'\|_q^p \big[ E\big( \mathbf 1_{|a_n - a_n'| \ge \lambda_n} \big) \big]^{1 - p/q} + \lambda_n^p,$$
which entails $E(|a_n - a_n'|^p) \le C \rho^n$ since $a_n \in L^q$. Observe that $E(a_n' \mid \mathcal F_0) = 0$. Again by Hölder's inequality, $\|E(a_n \mid \mathcal F_0)\|_p = \|E(a_n - a_n' \mid \mathcal F_0)\|_p \le \|a_n - a_n'\|_p$. Finally, $\|E(a_n \mid \mathcal F_1) - E(a_n \mid \mathcal F_0)\|_p \le \|E(a_{n-1} \mid \mathcal F_0)\|_p + \|E(a_n \mid \mathcal F_0)\|_p$ implies $\|P_1 a_n\|_p \le C \rho^n$.

Proof of Proposition 4. Let $Y_0'$, independent of $\{\epsilon_t, t \in \mathbb Z\}$, be an iid copy of $Y_0$, and define recursively $Y_t' = M_t Y_{t-1}' + b_t$, $t \ge 1$; let $Y_n^* = Y_n - Y_n'$. Then $Y_n^* = M_n Y_{n-1}^*$, whence
$$(Y_n^*)^2 = M_n^2 (Y_{n-1}^*)^2 = \cdots = M_n^2 \cdots M_1^2 (Y_0^*)^2.$$
Hence $E[(Y_n^*)^2] = [E(M_1^2)]^n E[(Y_0^*)^2]$, and (19) easily follows since $E(M_1^2) < 1$.

REFERENCES

Baillie, R.T., C.F. Chung & M.A. Tieslau (1996) Analysing inflation by the fractionally integrated ARFIMA-GARCH model. Journal of Applied Econometrics 11.
Billingsley, P. (1968) Convergence of Probability Measures. New York: Wiley.
Bingham, N.H., C.M. Goldie & J.L. Teugels (1987) Regular Variation. Cambridge, UK: Cambridge University Press.
Bollerslev, T. (1986) Generalized autoregressive conditional heteroscedasticity. Journal of Econometrics 31.
Box, G.E.P. & G.M. Jenkins (1976) Time Series Analysis: Forecasting and Control. San Francisco: Holden-Day.
Bradley, R. (1986) Basic properties of strong mixing conditions. In E. Eberlein & M. Taqqu (eds.), Dependence in Probability and Statistics: A Survey of Recent Results. Boston, MA: Birkhäuser.

Brockwell, P.J. & R.A. Davis (1991) Time Series: Theory and Methods. New York: Springer-Verlag.
Carrasco, M. & X. Chen (2002) Mixing and moment properties of various GARCH and stochastic volatility models. Econometric Theory 18.
Chen, M. & H. An (1998) A note on the stationarity and existence of moments of the GARCH models. Statistica Sinica 8.
Chow, Y.S. & H. Teicher (1988) Probability Theory, 2nd ed. New York: Springer-Verlag.
Chung, C.F. (2002) Sample means, sample autocovariances, and linear regression of stationary multivariate long memory processes. Econometric Theory 18.
Davydov, Yu.A. (1970) The invariance principle for stationary processes. Theory of Probability and Its Applications 15.
Diaconis, P. & D. Freedman (1999) Iterated random functions. SIAM Review 41.
Diananda, P.H. (1953) Some probability limit theorems with statistical applications. Proceedings of the Cambridge Philosophical Society.
Giraitis, L. & D. Surgailis (1990) A central limit theorem for quadratic forms in strongly dependent linear variables and its application to asymptotic normality of Whittle's estimate. Probability Theory and Related Fields 86.
Gorodetskii, V.V. (1977) Convergence to semi-stable Gaussian processes. Theory of Probability and Its Applications 22.
Haggan, V. & T. Ozaki (1981) Modelling nonlinear random vibrations using an amplitude-dependent autoregressive time series model. Biometrika 68.
Hall, P. & C.C. Heyde (1980) Martingale Limit Theory and Its Application. New York: Academic Press.
Hannan, E.J. (1976) The asymptotic distribution of serial covariances. Annals of Statistics 4.
Hannan, E.J. & C.C. Heyde (1972) On limit theorems for quadratic functions of discrete time series. Annals of Mathematical Statistics 43.
Hauser, M.A. & R.M. Kunst (2001) Forecasting high-frequency financial data with the ARFIMA-ARCH model. Journal of Forecasting 20.
He, C. & T. Teräsvirta (1999) Fourth moment structure of the GARCH(p,q) process. Econometric Theory 15.
Hosking, J.R.M. (1981) Fractional differencing. Biometrika 68.

Hosking, J.R.M. (1996) Asymptotic distributions of the sample mean, autocovariances, and autocorrelations of long-memory time series. Journal of Econometrics 73.
Hsing, T. & W.B. Wu (2003) On weighted U-statistics for stationary processes. To appear, Annals of Probability.
Lien, D. & Y.K. Tse (1999) Forecasting the Nikkei spot index with fractional cointegration. Journal of Forecasting 18.
Lien, D. & Y.K. Tse (1999) Fractional cointegration and futures hedging. Journal of Futures Markets 19.
Ling, S. & W.K. Li (1997) On fractionally integrated autoregressive moving-average time series models with conditional heteroscedasticity. Journal of the American Statistical Association 92.
Ling, S. & M. McAleer (2002) Necessary and sufficient moment conditions for the GARCH(r,s) and asymmetric power GARCH(r,s) models. Econometric Theory 18.
Mandelbrot, B.B. & J.W. Van Ness (1968) Fractional Brownian motions, fractional noises and applications. SIAM Review 10.
McLeish, D.L. (1975) Invariance principles for dependent variables. Zeitschrift für Wahrscheinlichkeitstheorie und verwandte Gebiete 32.
McLeish, D.L. (1977) On the invariance principle for nonstationary mixingales. Annals of Probability 5.
Meyn, S.P. & R.L. Tweedie (1994) Markov Chains and Stochastic Stability. Springer-Verlag.
Nicholls, D.F. & B.G. Quinn (1982) Random Coefficient Autoregressive Models: An Introduction. New York: Springer.
Peligrad, M. (1986) Recent advances in the central limit theorem and its weak invariance principle for mixing sequences of random variables (a survey). In E. Eberlein & M. Taqqu (eds.), Dependence in Probability and Statistics: A Survey of Recent Results. Boston, MA: Birkhäuser.
Phillips, P.C.B. & V. Solo (1992) Asymptotics for linear processes. Annals of Statistics 20.
Romano, J.P. & L.A. Thombs (1996) Inference for autocorrelations under weak assumptions. Journal of the American Statistical Association 91.
Taniguchi, M. & Y. Kakizawa (2000) Asymptotic Theory of Statistical Inference for Time Series. New York: Springer.
Tong, H. (1990) Non-linear Time Series: A Dynamical System Approach. Oxford: Oxford University Press.
Wang, Q.Y., Y.X. Lin & C.M. Gulati (2002) The invariance principle for linear processes with applications. Econometric Theory 18.
Woodroofe, M. (1992) A central limit theorem for functions of a Markov chain with applications to shifts. Stochastic Processes and their Applications 41.
Wu, W.B. (2002) Central limit theorems for functionals of linear processes and their applications. Statistica Sinica 12.
Wu, W.B. (2003a) Empirical processes of long-memory sequences. Bernoulli 9.
Wu, W.B. (2003b) Additive functionals of infinite-variance moving averages. Statistica Sinica 13.
Wu, W.B. & M. Woodroofe (2000) A central limit theorem for iterated random functions. Journal of Applied Probability 37.
Wu, W.B. & M. Woodroofe (2004) Martingale approximations for sums of stationary processes. Annals of Probability 32.
Wu, W.B. & J. Mielniczuk (2002) Kernel density estimation for linear processes. Annals of Statistics 30.
Yokoyama, R. (1995) On the central limit theorem and law of the iterated logarithm for stationary processes with applications to linear processes. Stochastic Processes and their Applications 59.
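The limit theory above can be illustrated by simulation. The following Python sketch is an illustration under stated assumptions rather than a construction taken from the report: the filter $\psi_i = 2^{-i}$, the GARCH(1,1) parameters $(\omega, \alpha, \beta) = (0.1, 0.1, 0.8)$, and the sample sizes are all hypothetical choices. For these absolutely summable (short-range dependent) coefficients the GARCH innovations are uncorrelated martingale differences, so the standardized partial sums $S_n/\sqrt n$ should have mean near $0$ and variance near the long-run variance $(\sum_i \psi_i)^2 \,\mathrm{Var}(a_0) = 4$.

```python
import numpy as np

rng = np.random.default_rng(0)

def garch_innovations(T, omega=0.1, alpha=0.1, beta=0.8):
    """GARCH(1,1) innovations a_t = sigma_t * eps_t: martingale differences
    with unconditional variance omega / (1 - alpha - beta) = 1 here."""
    eps = rng.standard_normal(T)
    a = np.empty(T)
    s2 = omega / (1.0 - alpha - beta)  # start from the stationary variance
    for t in range(T):
        a[t] = np.sqrt(s2) * eps[t]
        s2 = omega + alpha * a[t] ** 2 + beta * s2
    return a

# Linear process X_t = sum_{i>=0} psi_i a_{t-i} with psi_i = 2^{-i}, so
# sum_i psi_i = 2 and the long-run variance is (sum_i psi_i)^2 Var(a_0) = 4.
psi = 0.5 ** np.arange(60)            # MA(inf) filter truncated at 60 lags
n, reps, burn = 500, 1000, 200
sums = np.empty(reps)
for r in range(reps):
    a = garch_innovations(n + burn)
    X = np.convolve(a, psi)[burn:burn + n]  # X_t with full 60-lag history
    sums[r] = X.sum() / np.sqrt(n)

print(round(np.mean(sums), 2), round(np.var(sums), 2))  # mean ~ 0, var ~ 4
```

Note that the conditional heteroscedasticity of the innovations alters neither the $\sqrt n$ normalization nor the limit variance, since GARCH innovations are uncorrelated; it affects only higher-order properties such as the fourth-moment structure, which is what the $L^4$ weak dependence conditions in the proofs above control.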


More information

Limit Theorems for Exchangeable Random Variables via Martingales

Limit Theorems for Exchangeable Random Variables via Martingales Limit Theorems for Exchangeable Random Variables via Martingales Neville Weber, University of Sydney. May 15, 2006 Probabilistic Symmetries and Their Applications A sequence of random variables {X 1, X

More information

Multivariate Normal-Laplace Distribution and Processes

Multivariate Normal-Laplace Distribution and Processes CHAPTER 4 Multivariate Normal-Laplace Distribution and Processes The normal-laplace distribution, which results from the convolution of independent normal and Laplace random variables is introduced by

More information

Local Whittle Likelihood Estimators and Tests for non-gaussian Linear Processes

Local Whittle Likelihood Estimators and Tests for non-gaussian Linear Processes Local Whittle Likelihood Estimators and Tests for non-gaussian Linear Processes By Tomohito NAITO, Kohei ASAI and Masanobu TANIGUCHI Department of Mathematical Sciences, School of Science and Engineering,

More information

Location Multiplicative Error Model. Asymptotic Inference and Empirical Analysis

Location Multiplicative Error Model. Asymptotic Inference and Empirical Analysis : Asymptotic Inference and Empirical Analysis Qian Li Department of Mathematics and Statistics University of Missouri-Kansas City ql35d@mail.umkc.edu October 29, 2015 Outline of Topics Introduction GARCH

More information

Krzysztof Burdzy University of Washington. = X(Y (t)), t 0}

Krzysztof Burdzy University of Washington. = X(Y (t)), t 0} VARIATION OF ITERATED BROWNIAN MOTION Krzysztof Burdzy University of Washington 1. Introduction and main results. Suppose that X 1, X 2 and Y are independent standard Brownian motions starting from 0 and

More information

Econ 423 Lecture Notes: Additional Topics in Time Series 1

Econ 423 Lecture Notes: Additional Topics in Time Series 1 Econ 423 Lecture Notes: Additional Topics in Time Series 1 John C. Chao April 25, 2017 1 These notes are based in large part on Chapter 16 of Stock and Watson (2011). They are for instructional purposes

More information

Continuous Time Approximations to GARCH(1, 1)-Family Models and Their Limiting Properties

Continuous Time Approximations to GARCH(1, 1)-Family Models and Their Limiting Properties Communications for Statistical Applications and Methods 214, Vol. 21, No. 4, 327 334 DOI: http://dx.doi.org/1.5351/csam.214.21.4.327 Print ISSN 2287-7843 / Online ISSN 2383-4757 Continuous Time Approximations

More information

5: MULTIVARATE STATIONARY PROCESSES

5: MULTIVARATE STATIONARY PROCESSES 5: MULTIVARATE STATIONARY PROCESSES 1 1 Some Preliminary Definitions and Concepts Random Vector: A vector X = (X 1,..., X n ) whose components are scalarvalued random variables on the same probability

More information

Statistical signal processing

Statistical signal processing Statistical signal processing Short overview of the fundamentals Outline Random variables Random processes Stationarity Ergodicity Spectral analysis Random variable and processes Intuition: A random variable

More information

Understanding Regressions with Observations Collected at High Frequency over Long Span

Understanding Regressions with Observations Collected at High Frequency over Long Span Understanding Regressions with Observations Collected at High Frequency over Long Span Yoosoon Chang Department of Economics, Indiana University Joon Y. Park Department of Economics, Indiana University

More information

Stochastic Processes

Stochastic Processes Stochastic Processes Stochastic Process Non Formal Definition: Non formal: A stochastic process (random process) is the opposite of a deterministic process such as one defined by a differential equation.

More information

SMSTC (2007/08) Probability.

SMSTC (2007/08) Probability. SMSTC (27/8) Probability www.smstc.ac.uk Contents 12 Markov chains in continuous time 12 1 12.1 Markov property and the Kolmogorov equations.................... 12 2 12.1.1 Finite state space.................................

More information

GARCH processes probabilistic properties (Part 1)

GARCH processes probabilistic properties (Part 1) GARCH processes probabilistic properties (Part 1) Alexander Lindner Centre of Mathematical Sciences Technical University of Munich D 85747 Garching Germany lindner@ma.tum.de http://www-m1.ma.tum.de/m4/pers/lindner/

More information

Brownian Motion. 1 Definition Brownian Motion Wiener measure... 3

Brownian Motion. 1 Definition Brownian Motion Wiener measure... 3 Brownian Motion Contents 1 Definition 2 1.1 Brownian Motion................................. 2 1.2 Wiener measure.................................. 3 2 Construction 4 2.1 Gaussian process.................................

More information

GAUSSIAN PROCESSES; KOLMOGOROV-CHENTSOV THEOREM

GAUSSIAN PROCESSES; KOLMOGOROV-CHENTSOV THEOREM GAUSSIAN PROCESSES; KOLMOGOROV-CHENTSOV THEOREM STEVEN P. LALLEY 1. GAUSSIAN PROCESSES: DEFINITIONS AND EXAMPLES Definition 1.1. A standard (one-dimensional) Wiener process (also called Brownian motion)

More information

Statistics 612: L p spaces, metrics on spaces of probabilites, and connections to estimation

Statistics 612: L p spaces, metrics on spaces of probabilites, and connections to estimation Statistics 62: L p spaces, metrics on spaces of probabilites, and connections to estimation Moulinath Banerjee December 6, 2006 L p spaces and Hilbert spaces We first formally define L p spaces. Consider

More information

The sample autocorrelations of financial time series models

The sample autocorrelations of financial time series models The sample autocorrelations of financial time series models Richard A. Davis 1 Colorado State University Thomas Mikosch University of Groningen and EURANDOM Abstract In this chapter we review some of the

More information

Regular Variation and Extreme Events for Stochastic Processes

Regular Variation and Extreme Events for Stochastic Processes 1 Regular Variation and Extreme Events for Stochastic Processes FILIP LINDSKOG Royal Institute of Technology, Stockholm 2005 based on joint work with Henrik Hult www.math.kth.se/ lindskog 2 Extremes for

More information

Establishing Stationarity of Time Series Models via Drift Criteria

Establishing Stationarity of Time Series Models via Drift Criteria Establishing Stationarity of Time Series Models via Drift Criteria Dawn B. Woodard David S. Matteson Shane G. Henderson School of Operations Research and Information Engineering Cornell University January

More information

Robust Backtesting Tests for Value-at-Risk Models

Robust Backtesting Tests for Value-at-Risk Models Robust Backtesting Tests for Value-at-Risk Models Jose Olmo City University London (joint work with Juan Carlos Escanciano, Indiana University) Far East and South Asia Meeting of the Econometric Society

More information

Lecture 3 Stationary Processes and the Ergodic LLN (Reference Section 2.2, Hayashi)

Lecture 3 Stationary Processes and the Ergodic LLN (Reference Section 2.2, Hayashi) Lecture 3 Stationary Processes and the Ergodic LLN (Reference Section 2.2, Hayashi) Our immediate goal is to formulate an LLN and a CLT which can be applied to establish sufficient conditions for the consistency

More information

LECTURE 10 LINEAR PROCESSES II: SPECTRAL DENSITY, LAG OPERATOR, ARMA. In this lecture, we continue to discuss covariance stationary processes.

LECTURE 10 LINEAR PROCESSES II: SPECTRAL DENSITY, LAG OPERATOR, ARMA. In this lecture, we continue to discuss covariance stationary processes. MAY, 0 LECTURE 0 LINEAR PROCESSES II: SPECTRAL DENSITY, LAG OPERATOR, ARMA In this lecture, we continue to discuss covariance stationary processes. Spectral density Gourieroux and Monfort 990), Ch. 5;

More information

Time Series Analysis. James D. Hamilton PRINCETON UNIVERSITY PRESS PRINCETON, NEW JERSEY

Time Series Analysis. James D. Hamilton PRINCETON UNIVERSITY PRESS PRINCETON, NEW JERSEY Time Series Analysis James D. Hamilton PRINCETON UNIVERSITY PRESS PRINCETON, NEW JERSEY PREFACE xiii 1 Difference Equations 1.1. First-Order Difference Equations 1 1.2. pth-order Difference Equations 7

More information

Adjusted Empirical Likelihood for Long-memory Time Series Models

Adjusted Empirical Likelihood for Long-memory Time Series Models Adjusted Empirical Likelihood for Long-memory Time Series Models arxiv:1604.06170v1 [stat.me] 21 Apr 2016 Ramadha D. Piyadi Gamage, Wei Ning and Arjun K. Gupta Department of Mathematics and Statistics

More information

Empirical Processes: General Weak Convergence Theory

Empirical Processes: General Weak Convergence Theory Empirical Processes: General Weak Convergence Theory Moulinath Banerjee May 18, 2010 1 Extended Weak Convergence The lack of measurability of the empirical process with respect to the sigma-field generated

More information

ARTICLE IN PRESS Statistics and Probability Letters ( )

ARTICLE IN PRESS Statistics and Probability Letters ( ) Statistics and Probability Letters ( ) Contents lists available at ScienceDirect Statistics and Probability Letters journal homepage: www.elsevier.com/locate/stapro The functional central limit theorem

More information

Estimation of the functional Weibull-tail coefficient

Estimation of the functional Weibull-tail coefficient 1/ 29 Estimation of the functional Weibull-tail coefficient Stéphane Girard Inria Grenoble Rhône-Alpes & LJK, France http://mistis.inrialpes.fr/people/girard/ June 2016 joint work with Laurent Gardes,

More information

BERNOULLI DYNAMICAL SYSTEMS AND LIMIT THEOREMS. Dalibor Volný Université de Rouen

BERNOULLI DYNAMICAL SYSTEMS AND LIMIT THEOREMS. Dalibor Volný Université de Rouen BERNOULLI DYNAMICAL SYSTEMS AND LIMIT THEOREMS Dalibor Volný Université de Rouen Dynamical system (Ω,A,µ,T) : (Ω,A,µ) is a probablity space, T : Ω Ω a bijective, bimeasurable, and measure preserving mapping.

More information

Temporal aggregation of stationary and nonstationary discrete-time processes

Temporal aggregation of stationary and nonstationary discrete-time processes Temporal aggregation of stationary and nonstationary discrete-time processes HENGHSIU TSAI Institute of Statistical Science, Academia Sinica, Taipei, Taiwan 115, R.O.C. htsai@stat.sinica.edu.tw K. S. CHAN

More information

Bootstrapping Long Memory Tests: Some Monte Carlo Results

Bootstrapping Long Memory Tests: Some Monte Carlo Results Bootstrapping Long Memory Tests: Some Monte Carlo Results Anthony Murphy and Marwan Izzeldin University College Dublin and Cass Business School. July 2004 - Preliminary Abstract We investigate the bootstrapped

More information

Documents de Travail du Centre d Economie de la Sorbonne

Documents de Travail du Centre d Economie de la Sorbonne Documents de Travail du Centre d Economie de la Sorbonne A note on self-similarity for discrete time series Dominique GUEGAN, Zhiping LU 2007.55 Maison des Sciences Économiques, 106-112 boulevard de L'Hôpital,

More information