Empirical likelihood confidence intervals for the mean of a long-range dependent process


Daniel J. Nordman
Dept. of Mathematics
University of Wisconsin - La Crosse
La Crosse, WI

Philipp Sibbertsen
Dept. of Statistics
University of Dortmund
Dortmund, Germany

Soumendra N. Lahiri
Dept. of Statistics
Iowa State University
Ames, IA

Abstract: This paper considers blockwise empirical likelihood for real-valued time processes which may exhibit either short- or long-range dependence. Empirical likelihood approaches intended for weakly dependent time series can often break down in the presence of strong dependence. However, a modified blockwise method is proposed for confidence interval estimation of the process mean, which is valid for various dependence structures including long-range dependence. The finite-sample performance of the method is evaluated through a simulation study.

Key Words: blocking, confidence interval, empirical likelihood, FARIMA, long-range dependence

Contact address: Dan Nordman, University of Wisconsin-La Crosse, 1725 State Street, La Crosse, Wisconsin 54601, USA

1 Introduction

Empirical likelihood (EL), proposed by Owen (1988, 1990), makes possible likelihood-based statistical inference without a specified distribution for the data. For independent data, the method generates a nonparametric likelihood that shares many qualities associated with parametric likelihood, such as limiting chi-square distributions for log-likelihood ratios [cf. Hall and La Scala (1990), Qin and Lawless (1994)]. Owen (2001) describes further properties and applications of EL, where much research has focused on EL with independent observations. The aim of this paper is to extend EL inference to dependent data, which could exhibit either weak or potentially strong forms of dependence. We show that EL inference on the process mean is possible for a large class of stationary time series, which we next detail.

Let Z denote the set of integers. Suppose that {Y_t}, t ∈ Z, is a sequence of stationary random variables with mean µ and integrable spectral density function f(λ), |λ| ≤ π, that satisfies

    f(λ) ∼ C_d |λ|^(−2d),  as λ → 0,                                (1)

for some d ∈ (−1/2, 1/2) and positive constant C_d > 0 (where ∼ denotes that the ratio of the two terms equals one in the limit). When d = 0, we refer to the process {Y_t} as weakly or short-range dependent (SRD). The process {Y_t} will be called strongly or long-range dependent (LRD) if d > 0. For d < 0, the process shall be termed anti-persistent. This classification of {Y_t} as SRD or LRD depending on the value of d is common [cf. Hosking (1981)], in which long-range dependence (LRD) entails a pole of f at the origin [cf. Beran (1994)]. Formulations of weak dependence based on mixing assumptions often imply the d = 0 case of short-range dependence (SRD).

For weakly dependent time series that satisfy a mixing condition to ensure SRD, Kitamura (1997) has proposed a blockwise EL method. This method applies the Owen (1990) formulation of EL to observational blocks rather than individual observations. As described in Kitamura (1997), blockwise EL involves data blocking techniques used successfully by other nonparametric likelihood methods for handling weak dependence, such as the block bootstrap and subsampling procedures [cf. Carlstein (1986), Künsch (1989), Politis and Romano (1993)].

However, blocking methods developed for SRD can often fail under LRD and require careful modification. For example, the moving block bootstrap of Künsch (1989) and Liu and Singh (1992) is known to break down under forms of LRD [cf. Lahiri (1993)]. The block-based subsampling methods of Politis and Romano (1993) and Hall and Jing (1996), also developed for weak dependence, must be significantly reformulated for variance estimation with LRD [cf. Hall, Jing and Lahiri (1998), Nordman and Lahiri (2004)]. Since the blockwise EL of Kitamura (1997) also involves data blocking intended for SRD, one might expect strong dependence to create complications with this version of EL as well. Indeed, our results indicate that this blockwise EL formulation can break down under either LRD or anti-persistence (i.e., d < 0), partially because the usual standardization √n(Ȳ_n − µ) of a size n sample mean Ȳ_n might fail to produce a non-degenerate limit.

This paper finds that an adapted version of Kitamura's (1997) EL is applicable for confidence intervals with time processes exhibiting very different kinds of dependence structures. We give a modified blockwise EL method for interval estimation of the process mean that is valid for SRD or LRD time series satisfying (1). This EL approach also has an advantage over confidence intervals based on the sample mean Ȳ_n in that direct estimation of the variance Var(Ȳ_n) is not required. In the weak dependence case (i.e., d = 0), we establish results on blockwise EL for the mean in a manner alternative to Kitamura (1997), which does not involve mixing conditions to quantify the decay rate of the covariances of the process {Y_t}.

The rest of the paper is organized as follows. Section 2 details assumptions and other background for describing the behavior of EL under strong and weak forms of dependence. A blockwise EL ratio for the mean and its limiting distribution under LRD are given in Section 3. For processes satisfying (1), a method of setting EL confidence intervals for the process mean is proposed. Section 4 provides a simulation study of these EL confidence intervals. Section 5 contains proofs of the main results.

2 Background

2.1 Time process assumptions

We assume that the data (Y_1, ..., Y_n) represent a realization from a (strictly) stationary process {Y_t}, t ∈ Z, with spectral density satisfying (1).

There then exists a process {ε_j} of uncorrelated random variables with mean E(ε_j) = 0 and variance 0 < E(ε_j²) = σ² < ∞ such that {Y_t} has a moving average representation of the form [cf. Chapter 16.7, Ibragimov and Linnik (1971)]:

    Y_t = µ + Σ_{j∈Z} b_j ε_{t−j},  t ∈ Z,

where the constants {b_j} fulfill 0 < Σ_{j∈Z} b_j² < ∞ and f(λ) = σ²|b(λ)|²/(2π), |λ| ≤ π, with b(λ) = Σ_{j∈Z} b_j e^{jλ√−1}. We require two fairly mild assumptions for proving the results.

Assumptions:

(A.1) The strictly stationary process {Y_t}, t ∈ Z, has a spectral density as in (1). For 0 < |d| < 1/2, the process autocovariances r(k) = Cov(Y_t, Y_{t+k}), k ∈ Z, satisfy

    r(k) ∼ C_d R_d k^{−(1−2d)},  as k → ∞,                          (2)

for some real R_d ≠ 0; if d = 0, then f is bounded on every compact subinterval of (0, π].

(A.2) The random variables {ε_j} are independent and identically distributed (iid).

The process assumptions encompass many types of time series used for describing both SRD and LRD. In particular, our framework includes two popular models for LRD: the fractional Gaussian processes of Mandelbrot and Van Ness (1968) and the fractional autoregressive integrated moving average (FARIMA) models of Adenstedt (1974), Granger and Joyeux (1980) and Hosking (1981). When d > 0, we assume the covariances of the LRD process decay very slowly so that the sum of autocovariances Σ_{k=1}^∞ r(k) diverges, whereas the process covariances sum to zero, Σ_{k∈Z} r(k) = 0, when d < 0.

Under Assumption (A.1), the behavior of f in (1) and the autocovariance decay in (2) are closely related when d ≠ 0. For d > 0, both (1) and (2) are often used to formulate LRD [cf. Beran (1994)], although the conditions are not necessarily equivalent [Robinson (1995a), p. 1634]. The autocovariance representation (2) implies the limit of f in (1) for the case d < 0 if R_d = 2Γ(1 − 2d) sin(dπ) [cf. (2.6.V), (2.24.V) of Zygmund (1968)]; additionally for d > 0, if f is of bounded variation on compact subintervals of (0, π], then the pole of f in (1) implies that (2) holds. If the covariances r(k) are decreasing [Bingham et al. (1987), p. 240] or quasi-monotonically convergent to zero [Robinson (1995a), p. 1634], then (1) and (2) are equivalent for prescribing LRD, that is, for d > 0.

By our condition for SRD under (A.1) (i.e., d = 0), we assume that a finite, positive limit lim_{n→∞} n^{−1} Σ_{k=1}^{n} (n − k) r(k) exists, but we do not explicitly require the absolute summability of the covariances, Σ_{k=1}^{∞} |r(k)| < ∞, which is a slightly stronger assumption that is often used to categorize SRD. The innovation assumption (A.2) could perhaps be weakened to allow strictly stationary innovations that satisfy a mixing condition [cf. Lahiri (2003)]. But to facilitate our exposition and proofs, we do not pursue this further.

2.2 Asymptotic distribution of the sample mean and related issues

To help motivate a theory for EL under LRD, we review some differences in the behavior of the sample mean Ȳ_n = Σ_{i=1}^n Y_i/n of SRD and LRD data (Y_1, ..., Y_n). Theorem 1 describes how the correct scaling a_n for the standardized sample mean (Ȳ_n − µ)/a_n to have a limit distribution depends crucially on the dependence parameter d in (1). In the following, denote convergence in distribution by →_d.

Theorem 1. Let {Y_t}, t ∈ Z, be a process which satisfies Assumption (A.1), d ∈ (−1/2, 1/2).
(a) As n → ∞, n^{1−2d} Var(Ȳ_n) → V_d > 0, where V_d = C_d R_d/{d(1 + 2d)} if d ≠ 0 and V_0 = 2πC_0 for d = 0.
(b) Let a_n² = n^{2d−1} V_d. If Assumption (A.2) holds in addition, then (Ȳ_n − µ)/a_n →_d Z as n → ∞, where Z denotes a standard normal random variable.

Under SRD, Var(Ȳ_n) decays at the usual O(1/n) rate associated with iid data. However, under LRD or anti-persistence, the decay rate of Var(Ȳ_n) may be slower or faster than O(1/n), which implies that the usual √n scaling of the sample mean, √n(Ȳ_n − µ), may fail to produce a limit distribution. This aspect of LRD complicates inference on the process mean E(Y_t) = µ based on either the sample mean Ȳ_n or data blocking techniques developed for SRD, such as the block bootstrap, subsampling and even the blockwise EL in Kitamura (1997) [cf. Lahiri (1993), Hall, Jing and Lahiri (1998)].
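To make the rate in Theorem 1(a) concrete, the short Python sketch below (ours, not part of the paper) computes the exact variance of a sample mean from a toy autocovariance sequence obeying the decay (2) and checks that n^{1−2d} Var(Ȳ_n) settles near a constant; the values d = 0.25 and C_d R_d = 0.5 are made up purely for illustration, and the toy r(k) is not claimed to come from an actual process.

```python
import numpy as np

def var_sample_mean(r, n):
    """Exact Var(Ybar_n) from autocovariances r(0), ..., r(n-1):
    Var(Ybar_n) = (1/n) * sum_{|k| < n} (1 - |k|/n) * r(|k|)."""
    k = np.arange(1, n)
    return (r[0] + 2.0 * np.sum((1.0 - k / n) * r[k])) / n

# toy covariances with the long-range decay of (2): r(k) ~ 0.5 * k^(2d-1), d = 0.25
d = 0.25
kmax = 20000
r = np.empty(kmax + 1)
r[0] = 1.0
ks = np.arange(1, kmax + 1)
r[1:] = 0.5 * ks ** (2 * d - 1)

for n in (500, 2000, 8000, 20000):
    print(n, n ** (1 - 2 * d) * var_sample_mean(r, n))
# the rescaled variances stabilize near 0.5/(d*(1+2d)) = 1.33, matching Theorem 1(a)
```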

To recognize potential problems with blockwise EL under LRD, it helps to understand why the method is successful for SRD. For inference on the process mean E(Y_t) = µ, Kitamura (1997) explains that the non-blocking version of EL fails under SRD by neglecting the data dependence structure (i.e., using individual observations), which causes the variance Var(Ȳ_n) of a sample mean to be improperly estimated. However, the blockwise EL of Kitamura (1997) permits correct variance estimation under weak dependence (e.g., d = 0) by exploiting the following property of SRD processes:

    Var(Ȳ_n) / Var(Ȳ_l) ∼ l/n                                      (3)

for time series lengths l, n → ∞, l/n → 0. Hence, in the inner mechanics of the blockwise EL under SRD, a collection of smaller length l < n data blocks from the available series (Y_1, ..., Y_n) can be used to estimate the variance Var(Ȳ_l) of a size l sample mean and this estimate can be rescaled for inference on the full size n sample mean by (3): Var(Ȳ_n) ≈ (l/n)Var(Ȳ_l). Consequently, blockwise EL statistics in Kitamura (1997) require adjustments involving n/l [cf. Kitamura (1997), p. 2089]. Note that, under SRD, subsampling techniques provide similar block-based variance estimates of Var(Ȳ_n) from (3) [cf. Politis and Romano (1993)]. However, we see from Theorem 1 that (3) may fail outside of the SRD case, which implies the blockwise EL of Kitamura (1997) must be modified for LRD.

3 Blockwise empirical likelihood under long-range dependence

3.1 Construction of blockwise empirical likelihood for the mean

We give an EL function for the process mean E(Y_t) = µ, following the blockwise formulation given in Kitamura (1997). Let 1 ≤ l ≤ n be the block length. Under a maximally overlapping (OL) version of blocking, we define blocks B^l_{i,OL} = (Y_i, ..., Y_{i+l−1}), 1 ≤ i ≤ N_OL ≡ n − l + 1. In this case, we use the maximal number of length l blocks among the observed time sequence (Y_1, ..., Y_n). We can also consider another common blocking scheme involving non-overlapping (NOL) blocks given by B^l_{i,NOL} = B^l_{1+l(i−1),OL} for 1 ≤ i ≤ N_NOL ≡ ⌊n/l⌋, where ⌊n/l⌋ represents the largest integer not exceeding n/l. In the following, let N denote either N_OL or N_NOL and write the associated blocks B^l_{i,OL} or B^l_{i,NOL} as B^l_i, 1 ≤ i ≤ N. Let M_{l,i} denote the sample mean of the l variables in B^l_i, 1 ≤ i ≤ N (e.g., M_{l,i} = Σ_{j=i}^{i+l−1} Y_j / l if B^l_i = B^l_{i,OL}). To assess the plausibility of a value for the mean parameter, we assign probabilities {p_i}_{i=1}^N to each block sample mean {M_{l,i}}_{i=1}^N to develop a multinomial likelihood function, Π_{i=1}^N p_i.
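As a small computational illustration (ours, not the authors'), the block sample means M_{l,i} for both blocking schemes can be obtained as follows; the function name block_means is a hypothetical helper used again in later sketches.

```python
import numpy as np

def block_means(y, l, overlapping=True):
    """Sample means of the length-l blocks of y.

    Overlapping:     B_i = (Y_i, ..., Y_{i+l-1}),        i = 1, ..., n - l + 1.
    Non-overlapping: B_i = (Y_{1+l(i-1)}, ..., Y_{l*i}),  i = 1, ..., floor(n/l).
    """
    y = np.asarray(y, dtype=float)
    n = len(y)
    if overlapping:
        # a moving average of window length l yields all N = n - l + 1 block means
        csum = np.concatenate(([0.0], np.cumsum(y)))
        return (csum[l:] - csum[:-l]) / l
    n_blocks = n // l
    return y[: n_blocks * l].reshape(n_blocks, l).mean(axis=1)
```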

The profile blockwise EL function for µ is given by

    L_n(µ) = sup { Π_{i=1}^N p_i : p_i > 0, Σ_{i=1}^N p_i = 1, Σ_{i=1}^N p_i M_{l,i} = µ },    (4)

where we maximize the product in (4) over distributions or probabilities {p_i}_{i=1}^N on {M_{l,i}}_{i=1}^N with expectation µ. The function L_n(µ) is positive for µ ∈ (min_{1≤i≤N} M_{l,i}, max_{1≤i≤N} M_{l,i}) and outside of this interval we may define L_n(µ) = −∞. When positive, L_n(µ) represents a maximum realized at unique weights p_i = N^{−1}{1 + λ_µ(M_{l,i} − µ)}^{−1}, where λ_µ ∈ R is determined by

    0 = Σ_{i=1}^N (M_{l,i} − µ) / {1 + λ_µ(M_{l,i} − µ)} ≡ g_µ(λ_µ).    (5)

Owen (1988, 1990) discusses these and further computational aspects of EL. Without the linear mean-µ constraint in (4), the product Π_{i=1}^N p_i is maximized when each p_i = N^{−1}, corresponding to the empirical distribution of {M_{l,i}}_{i=1}^N. The profile empirical likelihood ratio for the mean µ is then given by

    R_n(µ) = L_n(µ) / N^{−N} = Π_{i=1}^N {1 + λ_µ(M_{l,i} − µ)}^{−1}    (6)

and a confidence interval for the mean is determined by those values of µ with relatively high EL, namely, intervals of the form

    {µ : R_n(µ) ≥ A},    (7)

where A > 0 is chosen to obtain a desired confidence level [cf. Hall and La Scala (1990), Theorem 2.2]. We next discuss the asymptotic distribution of the EL ratio in (6) required for calibrating the confidence intervals in (7).

3.2 Limit distribution of empirical likelihood ratio

A key feature of EL with independent data is that it allows a nonparametric casting of Wilks' theorem for constructing confidence regions [Wilks (1938)], meaning that the logarithm of the EL ratio has an asymptotic chi-square distribution when evaluated at a true parameter value [cf. Owen (1988, 1990)]. Kitamura (1997) showed that a similar result applies to the blockwise EL with weakly dependent mixing processes.
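Before turning to the limit distribution, here is a minimal sketch (ours) of how log R_n(µ) in (6) can be computed: the one-dimensional equation (5) is solved for λ_µ by bisection over the interval on which every weight 1 + λ(M_{l,i} − µ) stays positive. The input m is the vector of block means from the block_means helper above.

```python
import numpy as np

def log_el_ratio(m, mu, tol=1e-10, max_iter=200):
    """log R_n(mu) = -sum_i log(1 + lambda_mu*(M_{l,i} - mu)), cf. (6),
    with lambda_mu solving g_mu(lambda) = 0 in (5) by bisection.
    Returns -inf when mu falls outside (min_i M_{l,i}, max_i M_{l,i})."""
    d = np.asarray(m, dtype=float) - mu
    if d.min() >= 0.0 or d.max() <= 0.0:
        return -np.inf                        # constraint in (4) is infeasible
    # lambda must keep every weight 1 + lambda*d_i strictly positive
    lo, hi = -1.0 / d.max(), -1.0 / d.min()
    eps = 1e-10 * (hi - lo)
    lo, hi = lo + eps, hi - eps
    g = lambda lam: np.sum(d / (1.0 + lam * d))   # g_mu(lambda) from (5)
    for _ in range(max_iter):
        mid = 0.5 * (lo + hi)
        if g(mid) > 0.0:                      # g is strictly decreasing in lambda
            lo = mid
        else:
            hi = mid
        if hi - lo < tol:
            break
    lam = 0.5 * (lo + hi)
    return -np.sum(np.log1p(lam * d))
```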

We now give the asymptotic distribution of the blockwise EL ratio in (6) for SRD or LRD time processes fulfilling (1). As in the EL framework of Kitamura (1997), we require a block correction factor to ensure our EL ratio has a non-degenerate limit. Write B_n = N^{−1}(n/l)^{1−2d} for the block adjustment term.

Theorem 2. Suppose {Y_t}, t ∈ Z, satisfies Assumptions (A.1)-(A.2) and l^{−1} + l/n = o(1), (n/l²)^d = O(1). If E(Y_t) = µ_0 denotes the true process mean, then as n → ∞,

    −2 B_n log R_n(µ_0) →_d χ²_1,    (8)

where χ²_1 denotes a chi-square random variable with 1 degree of freedom.

In the case of SRD with d = 0 in (1), the block adjustment factor B_n reduces to the one, (n/l)N^{−1}, used by Kitamura (1997) [p. 2089] for weakly dependent processes. However, we find that a blocking adjustment B_n that is appropriate for SRD or LRD must generally incorporate the dependence exponent d from (1).

We also note one important implication about the block size l in Theorem 2. Under LRD, we found the block condition (n/l²)^d = O(1) to be necessary in deriving the distribution of the blockwise log-EL ratio. This represents an additional constraint over the usual l^{−1} + l/n = o(1) block stipulation under weak dependence (i.e., d = 0). However, a block size choice of l = C√n, C > 0, fulfills all requirements of Theorem 2 without knowledge of the exact value of d in (1). Interestingly, Hall, Jing and Lahiri (1998) proposed similar block lengths l = C√n (e.g., C = 1, 3) for a subsampling method under LRD and in a sense the distributional result in Theorem 2 supports this blocking intuition.
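For concreteness, a hedged sketch of the calibrated statistic in (8), built on the hypothetical helpers block_means and log_el_ratio above; the default block length l = √n reflects the discussion after Theorem 2, and in practice d would be replaced by an estimate d̂_n as discussed in the next subsection.

```python
import numpy as np

def el_statistic(y, mu0, d, l=None, overlapping=True):
    """-2 * B_n * log R_n(mu0) with B_n = N^{-1} (n/l)^{1-2d}, as in Theorem 2."""
    n = len(y)
    if l is None:
        l = max(2, int(round(np.sqrt(n))))    # l = C*sqrt(n) with C = 1
    m = block_means(y, l, overlapping)
    N = len(m)
    B_n = (n / l) ** (1.0 - 2.0 * d) / N
    return -2.0 * B_n * log_el_ratio(m, mu0)

# calibration against the chi-square(1) limit in (8), e.g. at the 5% level:
# reject H0: mu = mu0 when the statistic exceeds the 0.95 quantile of a
# chi-square(1) distribution (about 3.84)
```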

3.3 Empirical likelihood confidence intervals for the mean

When the dependence structure is unknown, we require a consistent estimator of d in the block adjustment B_n without rigid model assumptions, and several such estimators d̂_n are available based on the periodogram (see Section 3.4). Provided with a consistent estimator d̂_n of d, we use (8) to set an EL confidence interval for the mean µ of a time process satisfying (1). Namely, an approximate two-sided 100(1 − α)% confidence interval for µ, of the form (7), is given by

    I_n(1 − α) = { µ ∈ R : −2 B̂_n log R_n(µ) ≤ χ²_{1;1−α} },    B̂_n = N^{−1}(n/l)^{1 − 2d̂_n},    (9)

where χ²_{1;1−α} represents the 1 − α percentile of a χ²_1 random variable, 0 < α < 1. One-sided approximate 100(1 − α)% lower/upper confidence bounds for µ correspond to the lower/upper endpoints of the interval I_n(1 − 2α). By choosing a block length l = C√n, C > 0, the above EL interval has asymptotically correct coverage for any dependence type d under (1), as detailed in Theorem 3. Let →_p denote convergence in probability in the following.

Theorem 3. Suppose {Y_t}, t ∈ Z, satisfies Assumptions (A.1)-(A.2), the estimator d̂_n of d ∈ (−1/2, 1/2) fulfills (d̂_n − d) log n →_p 0, and l = C√n for some C > 0. If E(Y_t) = µ_0 denotes the true process mean, then as n → ∞,

    P( µ_0 ∈ I_n(1 − α) ) → 1 − α.

EL confidence intervals for the mean µ have an advantage over intervals set immediately with Ȳ_n under Theorem 1 because direct variance estimation of Var(Ȳ_n) is not needed. That is, confidence intervals based on a normal approximation of (Ȳ_n − µ)/a_n would require at least estimation of the constant C_d in (1) in addition to an estimate of the exponent d. However, semi-parametric estimates for C_d, which do not require strong assumptions on f, are known to be less accurate (i.e., slower convergence rates) than corresponding estimators for d, as in the case of log-periodogram regression [cf. Robinson (1995b)].

3.4 Estimation of d

We describe two possible frequency domain estimators for the dependence parameter d required in the adjustment B_n from (8). Define the periodogram I_n(λ) = (2πn)^{−1} |Σ_{t=1}^n Y_t e^{λt√−1}|², λ ∈ [0, π], and let m be a bandwidth parameter such that m → ∞, m/n → 0 as n → ∞. The popular Geweke-Porter-Hudak (GPH) estimator of d is found by a regression of the log-periodogram against the first m Fourier frequencies:

    d̂^GPH_{m,n} = Σ_{j=1}^m (g_j − ḡ) log(I_n(λ_j)) / Σ_{j=1}^m (g_j − ḡ)²,

where ḡ = Σ_{j=1}^m g_j/m with g_j = 2 log|1 − e^{λ_j√−1}|^{−1}, λ_j = 2πj/n, j = 1, ..., m [Geweke and Porter-Hudak (1983)]. Consistency of the estimator, with a rate of convergence (d̂^GPH_{m,n} − d) = O_p(m^{−1/2}), can be obtained without knowledge of the distribution of the data generating process [cf. Robinson (1995b) or Hurvich et al. (1998)]. However, because the representation (1) of the spectral density holds only near the origin, a trade-off between the bias and variance of d̂^GPH_{m,n} must be made in choosing the optimal bandwidth m. Whereas Geweke and Porter-Hudak (1983) proposed m = n^{1/2}, which is still used in many applications, Hurvich et al. (1998) showed that a bandwidth m = O(n^{4/5}) is mean squared error (MSE)-optimal.
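A minimal Python sketch (ours) of the log-periodogram regression above; the FFT-based periodogram and the default bandwidth m = n^{4/5}/4 used later in the simulations are implementation choices under stated assumptions, not prescriptions from the paper.

```python
import numpy as np

def gph_estimate(y, m=None):
    """GPH estimate of d: regress log I_n(lambda_j) on g_j = 2*log|1 - exp(i*lambda_j)|^{-1}
    over the first m Fourier frequencies lambda_j = 2*pi*j/n."""
    y = np.asarray(y, dtype=float)
    n = len(y)
    if m is None:
        m = max(2, int(n ** 0.8 / 4))
    j = np.arange(1, m + 1)
    lam = 2.0 * np.pi * j / n
    # periodogram I_n(lambda_j) = |sum_t Y_t exp(i*lambda_j*t)|^2 / (2*pi*n) via the FFT
    I = np.abs(np.fft.fft(y)[1 : m + 1]) ** 2 / (2.0 * np.pi * n)
    g = -2.0 * np.log(np.abs(1.0 - np.exp(1j * lam)))
    gc = g - g.mean()
    return float(np.sum(gc * np.log(I)) / np.sum(gc * gc))
```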

A second possibility for d̂, proposed by Künsch (1987), is the so-called Gaussian semiparametric estimate (GSE) or local Whittle estimate

    d̂^GSE_{m,n} = argmin_{d ∈ [−1/2+ɛ, 1/2−ɛ]} { log( (1/m) Σ_{j=1}^m λ_j^{2d} I_n(λ_j) ) − (2d/m) Σ_{j=1}^m log(λ_j) },

involving a small fixed ɛ > 0. Robinson (1995a) established particularly mild conditions for the consistency of d̂^GSE_{m,n} at the rate m^{−1/2} in probability. Compared to the GPH estimator that only requires a linear regression, the GSE estimator is more complicated to compute but, under certain conditions, its asymptotic variance Var(d̂^GSE_{m,n}) ≈ 1/(4m) is smaller relative to Var(d̂^GPH_{m,n}) ≈ π²/(24m), as n → ∞. The MSE-optimal bandwidth for d̂^GSE_{m,n} is also known to be m = O(n^{4/5}) [cf. Andrews and Sun (2004)]. See Moulines and Soulier (2003) for further theoretical properties of d̂^GPH_{m,n}, d̂^GSE_{m,n} and for other possible estimators of d.

4 A numerical study

We conducted a simulation study of the finite-sample coverage accuracy of blockwise EL confidence intervals. The design and results of the study are described in the following.

4.1 Simulation parameters

Let {Ỹ_t}, t ∈ Z, represent a FARIMA(0, d, 0) series

    Ỹ_t = Σ_{j=0}^{∞} [Γ(j + d) / (Γ(j + 1)Γ(d))] ε_{t−j},    (10)

involving iid (mean zero) innovations {ε_t}, t ∈ Z, and the gamma function Γ(·). To examine the performance of the EL method for several time series fulfilling (1), we considered FARIMA processes Y_t = ϕY_{t−1} + Ỹ_t + ϑỸ_{t−1}, t ∈ Z, formed with one of the following ARMA filters, innovation distributions and d values:

- ARMA filter: ϕ = ϑ = 0; ϕ = 0.3, ϑ = 0.1; or ϕ = 0.7, ϑ = 0.3;
- innovations: {ε_t} in (10) has a standard normal or a t distribution with 3 degrees of freedom;
- dependence parameter: d = −0.1, 0, 0.1, 0.25, 0.4.

The processes {Y_t} above fulfill (1) with various decay rates d based on innovations that may be Gaussian or non-Gaussian with heavy tails. Each process also has mean zero, E(Y_t) = 0. We obtained size n time stretches (Y_1, ..., Y_n) from each FARIMA series above by using a sample Ỹ_n = (Ỹ_1, ..., Ỹ_n) from the process (10) as the innovations in the appropriate ARMA model. Gaussian samples Ỹ_n were simulated by the circulant embedding method of Wood and Chan (1994) with FARIMA(0, d, 0) covariances [cf. Beran (1994)], whereas non-Gaussian Ỹ_n were generated by truncating the moving average expression in (10) after the first M = 1000 terms and then using n + M independent t variables to build an approximate truncated series [see Bardet et al. (2003), p. 590 for details]. The sample sizes considered were n = 100, 500, 1000.

For each series type and sample size, we calculated lower and upper approximate 95% one-sided confidence intervals (or bounds) with the blockwise EL method from (9), using both overlapping (OL) and non-overlapping (NOL) blocks of length l = C_1 n^{1/2}, C_1 ∈ {1/2, 1, 2}, as well as the two estimators d̂^GPH_{m,n} and d̂^GSE_{m,n} of d from Section 3.4 with bandwidths m = C_2 n^{4/5}, C_2 ∈ {1/4, 1/2, 1}. Coverage probabilities for the EL confidence bounds for the process mean µ = 0 were approximated by an average over 1000 simulation runs for each process and sample size n.
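To illustrate the data-generating step, here is a rough Python sketch of the truncated moving-average construction described above (truncation after M = 1000 terms); unlike the study, it uses this truncation for the Gaussian case as well, rather than the circulant embedding of Wood and Chan (1994), so it is an approximation throughout and the function name is hypothetical.

```python
import numpy as np

def simulate_farima0d0(n, d, M=1000, dist="normal", rng=None):
    """Approximate FARIMA(0, d, 0) sample of length n via the truncated form of (10)."""
    rng = np.random.default_rng(rng)
    # coefficients b_j = Gamma(j + d) / (Gamma(j + 1) * Gamma(d)) via the recursion
    # b_0 = 1, b_j = b_{j-1} * (j - 1 + d) / j
    b = np.empty(M + 1)
    b[0] = 1.0
    for k in range(1, M + 1):
        b[k] = b[k - 1] * (k - 1 + d) / k
    size = n + M
    eps = rng.standard_t(df=3, size=size) if dist == "t3" else rng.standard_normal(size)
    # Y_t = sum_{j=0}^{M} b_j * eps_{t-j}, t = 1, ..., n
    return np.convolve(eps, b, mode="full")[M : M + n]

# e.g. an ARMA filter Y_t = 0.3*Y_{t-1} + X_t + 0.1*X_{t-1} can then be applied to
# X = simulate_farima0d0(n, d) to reproduce one of the designs above.
```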

4.2 Coverage accuracy of EL intervals

To report the simulation results, we firstly note that the block type had little effect on coverage accuracy of the EL intervals. Under weak dependence, similarities in the finite-sample efficacy of OL and NOL blocks have also been observed with other block-based methods [cf. Hall and Jing (1996)]. Focusing on OL blocks, it suffices to summarize numerical findings only for the upper confidence bounds, because the lower bounds closely matched the upper bounds in performance at each level combination in the study. Figures 1-4 display the coverage accuracy, expressed as a percentage, of upper 95% EL confidence bounds for the mean. These graphics are differentiated by the innovation type (Gaussian/t) and the estimator of d (GPH/GSE) used; each figure also involves the same two bandwidths m = n^{4/5}/4 or n^{4/5}/2. Based on the figures, we make some observations on the effects of the following factors:

- d estimator: There are few differences in the EL coverage accuracies across the GPH and GSE estimators. Perhaps the performance with the GPH estimator is slightly better when the memory parameter d is larger.

- innovation type: The EL intervals perform slightly better for t-distributed innovations than for Gaussian innovations, which appear to have somewhat lower coverage probabilities for d = 0.4 in particular.

- bandwidth m: Performance differences between the bandwidths m = n^{4/5}/4 and n^{4/5}/2 appear only with the filter ϕ = 0.7, ϑ = 0.3, which introduces a high positive autoregressive component. For this filter, the EL confidence bounds become conservative when m = n^{4/5}/2 and perform better with the shorter bandwidth m = n^{4/5}/4. Because a process with a high positive autoregressive coefficient can mimic long memory and can be difficult to distinguish from LRD, a shorter bandwidth may help to refine estimation of the memory exponent d in this situation. The EL coverage accuracies with bandwidths m = n^{4/5} and m = n^{4/5}/2 were comparable throughout the study.

- block length l: The blockwise EL confidence intervals tend to become more anti-conservative as the block sizes increase. This behavior is most apparent with the smallest sample size n = 100 and also under the strongest form of LRD, d = 0.4. The smallest block size considered, l = n^{0.5}/2, appears to yield the most accurate EL intervals.

- sample size n: With the smallest sample size n = 100, the coverage accuracy of the EL intervals quickly deteriorates as the memory parameter d increases. This type of behavior is well-known in location estimation with strong LRD processes due to the persistence of the series [cf. Beran (1994)], so that a size n = 100 sample may be too small for inference in these cases. However, the coverage probabilities of EL intervals become closer to the nominal level as the sample sizes increase and the accuracy is often quite good even for a sample size of n = 500.

The simulation evidence here indicates that a block size l = n^{0.5}/2, along with the GPH estimator of d with bandwidth m = n^{0.8}/4, may be recommended for the coverage accuracy of the blockwise EL confidence bounds for the process mean. The performance of the EL method generally depends on the strength of the underlying dependence structure, but improves with increasing sample sizes. It is presently unclear if an optimal block size l exists for computing the blockwise EL intervals (9). Under weak dependence, optimal block lengths (and corresponding estimators) are available for implementing some block-based methods, such as the block bootstrap and subsampling [cf. Hall and Jing (1996), Lahiri (2003), Chapter 7]. However, more research on block lengths is required both with blockwise EL and with strong dependence.

5 Proofs

For the proofs, we denote the positive integers as N and use C to denote generic positive constants that do not depend on n. For two sequences {s_n}, {t_n} of nonzero numbers, we write s_n ∼ t_n to denote lim_{n→∞} s_n/t_n = 1.

5.1 Proof of Theorem 1

Theorem 1(b) follows from part (a) because the process {Y_t} is linear with iid innovations and Var(nȲ_n) → ∞ [cf. Davydov (1970) and the corresponding theorem of Ibragimov and Linnik (1971)]. Theorem 1(a) is next briefly justified. When d > 0, the result is standard and follows from the covariance decay in (2) [cf. Corollary A2 of Taqqu (1977)]. When d = 0, we use the nonnegative Fejér kernel K_n(λ) = (2πn)^{−1} Σ_{j=−n}^{n} (n − |j|) e^{jλ√−1}, |λ| ≤ π, to write

    nVar(Ȳ_n) − 2πC_0 = 2π ∫_{−π}^{π} K_n(λ) [f(λ) − C_0] dλ,

since ∫_{−π}^{π} K_n(λ) dλ = 1. For a fixed 0 < δ ≤ π, it holds that f(λ) − C_0 is bounded on δ ≤ |λ| ≤ π and ∫_{δ≤|λ|≤π} K_n(λ) dλ → 0 as n → ∞ [cf. Brockwell and Davis (1987), Section 2.11], so that

    lim sup_{n→∞} |nVar(Ȳ_n) − 2πC_0| ≤ 2π sup_{0<|λ|≤δ} |f(λ) − C_0|.    (11)

Letting δ → 0, the right-hand side above converges to zero by (1), establishing the case d = 0.

When d < 0, we use (2) and Σ_{k∈Z} r(k) = 0 [which follows from Σ_{k∈Z} |r(k)| < ∞ and (1) by a corollary of Brockwell and Davis (1987)] to deduce that

    A_{1n} ≡ Σ_{k=−n}^{n} r(k) = −2 Σ_{k=n+1}^{∞} r(k) ∼ (C_d R_d / d) n^{2d},    A_{2n} ≡ 2n^{−1} Σ_{k=1}^{n} k r(k) ∼ (2C_d R_d / (1 + 2d)) n^{2d},

as n → ∞, by Corollary A2 of Taqqu (1977). Since nVar(Ȳ_n) = A_{1n} − A_{2n}, Theorem 1(a) now follows for d < 0.

5.2 Proof of Theorem 2

To prove Theorem 2, we require some preliminary results. For LRD and SRD linear processes with bounded innovations, Proposition 1 implies that overlapping block means at distant lags are asymptotically uncorrelated, based on a condition suggested by Hall, Jing and Lahiri (1998), Theorem 2.4.

Proposition 1. Suppose l^{−1} + l/n = o(1) and {Y_t}, t ∈ Z, satisfies the conditions of Theorem 1(b) with bounded innovations, i.e., P(|ε_t| ≤ C) = 1 for some C > 0. Then, for any integers a, b ∈ N ∪ {0} and 0 < ε < 1,

    max_{nε≤i≤n} | E[ a_l^{−a} (M_{l,1,OL} − µ)^a · a_l^{−b} (M_{l,i,OL} − µ)^b ] − E(Z^a) E(Z^b) | = o(1),

as n → ∞, where M_{l,i,OL} = Σ_{j=i}^{i+l−1} Y_j / l, i ≥ 1, and Z is a standard normal variable.

Proof of Proposition 1. The case d > 0 is treated in Nordman and Lahiri (2004, Lemma 2), which requires Var(Ȳ_n)/Var(Ȳ_l) = (l/n)^{1−2d}{1 + o(1)} and n²Var(Ȳ_n) → ∞. The cases d = 0 and d < 0 follow with a modification to show (essentially for the a = b = 1 situation) that

    Δ_{n,ε} ≡ max_{nε≤i≤n} |Cov(M_{l,1,OL}, M_{l,i,OL})| / a_l² = o(1)

holds for a fixed 0 < ε < 1. When d < 0, we may use (2) and Theorem 1(a) to show

    Δ_{n,ε} ≤ C l^{1−2d} max_{nε≤i≤n} max_{|j|≤l} |r(i + j)| = O((l/n)^{1−2d}) = o(1).

When d = 0 and a_l² = 2πC_0 l^{−1} by Theorem 1(a), we have with the Fejér kernel K_l that

    Cov(M_{l,1,OL}, M_{l,i,OL}) / a_l² = (1/C_0) ∫_{−π}^{π} K_l(λ) e^{(i−1)λ√−1} [f(λ) − C_0] dλ,  i > l,

holds since ∫_{−π}^{π} K_l(λ) e^{(i−1)λ√−1} dλ = 0 for i > l; then, Δ_{n,ε} = o(1) follows from the same argument as in (11) as l → ∞, using |e^{(i−1)λ√−1}| = 1, |λ| ≤ π, i ∈ N.

Proposition 2. Suppose l^{−1} + l/n = o(1) and {Y_t}, t ∈ Z, satisfies the conditions of Theorem 1(b). Let M̄_{nµ} = N^{−1} Σ_{i=1}^N (M_{l,i} − µ) and â²_{lµ} = N^{−1} Σ_{i=1}^N (M_{l,i} − µ)², where N represents either N_OL or N_NOL (i.e., either block type). Then, as n → ∞,
(a) M̄_{nµ}/a_n →_d Z, a standard normal variable;
(b) â²_{lµ}/a_l² →_p 1;
(c) P( min_{1≤i≤N} M_{l,i} < µ < max_{1≤i≤N} M_{l,i} ) → 1.

Proof of Proposition 2. We consider only the overlapping block case N = N_OL = n − l + 1. The proof with N = N_NOL = ⌊n/l⌋ is very similar and hence omitted. Noting that the first and last l observations in (Y_1, ..., Y_n) contribute less to M̄_{nµ}, we may write N·M̄_{nµ} = n(Ȳ_n − µ) − H_n, where

    l·H_n = Σ_{j=1}^{l} (l − j)(Y_j + Y_{n−j+1} − 2µ) = Σ_{i=1}^{l−1} Σ_{j=1}^{i} (Y_j + Y_{n−j+1} − 2µ).

Using Hölder's inequality and n²a_n² = n^{1+2d}V_d, n ∈ N, we find

    E(H_n²) ≤ (4/l) Σ_{i=1}^{l−1} Var( Σ_{j=1}^{i} Y_j ) = O( l^{−1} Σ_{i=1}^{l−1} i² a_i² ) = O(l^{1+2d}) = o(n² a_n²),

so that (n a_n)^{−1} H_n →_p 0 follows. Then applying Slutsky's theorem with Theorem 1(b) and N/n → 1, we have the distributional result for M̄_{nµ}/a_n in Proposition 2(a).

Let E(ε_0²) = σ² and denote the indicator function as I{·}. For each c ∈ N and t ∈ Z, define variables ε_{t,c} = ε_t I{|ε_t| ≤ c} − E(ε_t I{|ε_t| ≤ c}) and Y_{t,c} = µ + Σ_{j∈Z} b_j (σ ε_{t−j,c}/σ_c), where w.l.o.g. σ_c² = E(ε²_{0,c}) > 0. For each n, c, i ∈ N, write analogs M_{l,i,c} = Σ_{j=i}^{i+l−1} Y_{j,c}/l and â²_{lµ,c} = N^{−1} Σ_{i=1}^N (M_{l,i,c} − µ)² with respect to Y_{1,c}, ..., Y_{n,c}. The processes {Y_t} and {Y_{t,c}} have identical means µ and covariances because each series involves the same linear filter with iid innovations of mean 0 and variance σ². Since

    E|â²_{lµ} − â²_{lµ,c}| ≤ E| {(M_{l,1} − µ) + (M_{l,1,c} − µ)}(M_{l,1} − M_{l,1,c}) |

holds, we may apply Hölder's inequality to show that for each c ∈ N,

    E|â²_{lµ} − â²_{lµ,c}| ≤ 4 Var(Ȳ_l) {1 − σ_c^{−1} σ^{−1} E(ε_0 ε_{0,c})}^{1/2},    (12)

using Var(Ȳ_l) = Var(M_{l,1}) = Var(M_{l,1,c}) and Var(M_{l,1} − M_{l,1,c}) = Var(Ȳ_l) Var(σ^{−1}ε_0 − σ_c^{−1}ε_{0,c}) by the iid property of the innovations.

For each c ∈ N, we find that E(â²_{lµ,c}) = Var(Ȳ_l) ∼ a_l² and, through applying Proposition 1, that Var(â²_{lµ,c}) = o(a_l⁴) follows as in the proof of Theorem 2.5 of Hall, Jing and Lahiri (1998). From this and (12), we deduce that

    lim sup_{n→∞} E| â²_{lµ}/a_l² − 1 | ≤ lim sup_{n→∞} a_l^{−2} { E|â²_{lµ} − â²_{lµ,c}| + [Var(â²_{lµ,c})]^{1/2} + |E(â²_{lµ,c}) − a_l²| } ≤ 4 {1 − σ_c^{−1} σ^{−1} E(ε_0 ε_{0,c})}^{1/2}

for each c ∈ N. Because lim_{c→∞} σ_c^{−1} E(ε_0 ε_{0,c}) = σ, it follows now that â²_{lµ}/a_l² →_p 1 in Proposition 2(b).

For x ∈ R, let Φ(x) = P(Z ≤ x) denote the distribution function of a standard normal Z and define Φ_n(x) = N^{−1} Σ_{i=1}^N I{(M_{l,i} − µ)/a_l ≤ x}. Fix δ > 0. To show Proposition 2(c), it suffices to prove that, for δ* = ±δ, Φ_n(δ*) →_p Φ(δ*) as n → ∞; such convergence implies

    P( min_{1≤i≤N} (M_{l,i} − µ) ≤ −δ a_l < 0 < δ a_l ≤ max_{1≤i≤N} (M_{l,i} − µ) ) → 1

because 0 < Φ(±δ) < 1. We next establish that lim_{n→∞} E|Φ_n(δ*) − Φ(δ*)| = 0, for δ* = ±δ, which will follow from (13)-(14) below. For x ∈ R and c ∈ N, define Φ_{n,c}(x) = N^{−1} Σ_{i=1}^N I{(M_{l,i,c} − µ)/a_l ≤ x}. Since the process {Y_{t,c}} has all moments finite, the convergence in Proposition 1 applies, and (M_{l,1,c} − µ)/a_l converges to a standard normal by Theorem 1(b), we have that

    lim_{n→∞} E|Φ_{n,c}(δ*) − Φ(δ*)| = 0    (13)

holds for any fixed c ∈ N by Theorem 2.4 of Hall, Jing and Lahiri (1998) [see the proof of their Theorem 2.4]. Using the standard normal limits of (M_{l,1} − µ)/a_l and (M_{l,1,c} − µ)/a_l under Theorem 1 and that lim_{n→∞} E(M_{l,1,c} − M_{l,1})²/a_l² = 2{1 − σ_c^{−1} σ^{−1} E(ε_0 ε_{0,c})} for each c ∈ N, we can show

    lim_{c→∞} lim sup_{n→∞} E|Φ_n(δ*) − Φ_{n,c}(δ*)|² ≤ lim_{c→∞} lim sup_{n→∞} E( I{(M_{l,1} − µ)/a_l ≤ δ*} − I{(M_{l,1,c} − µ)/a_l ≤ δ*} )² = 0,    (14)

as in the proof of Theorem 1 of Nordman and Lahiri (2004).

Proof of Theorem 2. To save space, we consider only the overlapping version of blockwise EL from Section 3.1 with N = N_OL = n − l + 1. The proof with non-overlapping blocks is analogous but less involved.

Applying Proposition 2(c) with µ = µ_0, we find P(R_n(µ_0) > 0) → 1 as n → ∞ so that, with probability approaching 1 as n → ∞, R_n(µ_0) can be written as in (6) with probabilities p_i = N^{−1}{1 + λ_{µ_0}(M_{l,i} − µ_0)}^{−1}, 1 ≤ i ≤ N, where g_{µ_0}(λ_{µ_0}) = 0 in (5). Let Z_{nµ_0} = max_{1≤i≤N} |M_{l,i} − µ_0|. By the finiteness of E|Y_t|², the Borel-Cantelli lemma and stationarity may be used to show max_{1≤i≤n} |Y_i − µ_0| = o_p(n^{1/2}), so that Z_{nµ_0} = o_p(l^{−1} n^{1/2}) as n → ∞ [cf. Lemma 3 of Owen (1990)]. Then, we find that

    a_l^{−2} a_n Z_{nµ_0} = O( n^d l^{−2d} ) O_p( n^{−1/2} l Z_{nµ_0} ) = O(1) o_p(1) = o_p(1),    (15)

where n^d l^{−2d} = O(1) by assumption. Let M̄_{nµ_0} = N^{−1} Σ_{i=1}^N (M_{l,i} − µ_0) and â²_{lµ_0} = N^{−1} Σ_{i=1}^N (M_{l,i} − µ_0)². Similar to step (2.12) of Owen (1990), it follows from g_{µ_0}(λ_{µ_0}) = 0 in (5) that

    λ_{µ_0} = O_p(a_n / a_l²),    (16)

using M̄_{nµ_0}/â²_{lµ_0} = O_p(a_n/a_l²) from Proposition 2 and (15).

Define θ_i = λ_{µ_0}(M_{l,i} − µ_0) for 1 ≤ i ≤ N. The equation 0 = g_{µ_0}(λ_{µ_0}) can be solved for λ_{µ_0} = (M̄_{nµ_0} + I_{nµ_0})/â²_{lµ_0}, where I_{nµ_0} = N^{−1} Σ_{i=1}^N θ_i²(M_{l,i} − µ_0)/(1 + θ_i) satisfies

    |I_{nµ_0}| ≤ |λ_{µ_0}|² Z_{nµ_0} â²_{lµ_0} max_{1≤i≤N} |1 + θ_i|^{−1} = o_p(a_n),

by (15)-(16), Proposition 2(b), and max_{1≤i≤N} |θ_i| ≤ |λ_{µ_0}| Z_{nµ_0} = o_p(1). When |λ_{µ_0}| Z_{nµ_0} < 1, a Taylor expansion gives log(1 + θ_i) = θ_i − θ_i²/2 + ν_i for each 1 ≤ i ≤ N, where

    |ν_i| ≤ |λ_{µ_0}|³ Z_{nµ_0} (M_{l,i} − µ_0)² (1 − |λ_{µ_0}| Z_{nµ_0})^{−3}.    (17)

Using this expansion of log(1 + θ_i), 1 ≤ i ≤ N, in (6) along with the expression for λ_{µ_0} and B_n = N^{−1} a_l² a_n^{−2}, we now write

    −2 B_n log R_n(µ_0) = Q_{1n} − Q_{2n} + Q_{3n},

where Q_{1n} = (a_n^{−1} M̄_{nµ_0})² / (a_l^{−2} â²_{lµ_0}) →_d χ²_1 by Proposition 2; Q_{2n} = (a_n^{−1} I_{nµ_0})² / (a_l^{−2} â²_{lµ_0}) = o_p(1), applying I_{nµ_0} = o_p(a_n); and for Q_{3n} = 2B_n Σ_{i=1}^N ν_i, we may bound

    |Q_{3n}| ≤ 2 B_n Σ_{i=1}^N |ν_i| ≤ 2 a_l² a_n^{−2} |λ_{µ_0}|³ Z_{nµ_0} â²_{lµ_0} (1 − |λ_{µ_0}| Z_{nµ_0})^{−3} = o_p(1)

by (17). Theorem 2 follows now by applying Slutsky's theorem.

Proof of Theorem 3. From Theorem 2 and l = C√n in (9) by assumption, it suffices to show B̂_n/B_n = (C^{−2} n)^{(d − d̂_n)} →_p 1, which follows from (d − d̂_n) log(C^{−2} n) →_p 0.

Acknowledgments

This research was partially supported by two US National Science Foundation (DMS) grants and by the Deutsche Forschungsgemeinschaft (SFB 475).

References

Adenstedt, R. K. (1974) On large-sample estimation for the mean of a stationary sequence. Ann. Statist. 2.

Andrews, D. W. K. and Sun, Y. (2004) Adaptive local polynomial Whittle estimation of long-range dependence. Econometrica 72.

Bardet, J., Lang, G., Oppenheim, G., Philippe, S. and Taqqu, M. S. (2003) Generators of long-range dependent processes: a survey. In P. Doukhan, G. Oppenheim and M. S. Taqqu (eds.), Theory and Applications of Long-range Dependence. Birkhäuser, Boston.

Bingham, N. H., Goldie, C. M. and Teugels, J. L. (1987) Regular variation. Cambridge University Press, Cambridge.

Beran, J. (1994) Statistical methods for long memory processes. Chapman & Hall, London.

Brockwell, P. J. and Davis, R. A. (1987) Time series: theory and methods. Springer-Verlag, New York.

Carlstein, E. (1986) The use of subseries methods for estimating the variance of a general statistic from a stationary time series. Ann. Statist. 14.

Davydov, Y. A. (1970) The invariance principle for stationary processes. Theor. Prob. Appl. 15.

Geweke, J. and Porter-Hudak, S. (1983) The estimation and application of long-memory time series models. J. Time Series Anal. 4.

Granger, C. W. J. and Joyeux, R. (1980) An introduction to long-memory time series models and fractional differencing. J. Time Series Anal. 1.

Hall, P. and La Scala, B. (1990) Methodology and algorithms of empirical likelihood. Internat. Statist. Rev. 58.

Hall, P. and Jing, B.-Y. (1996) On sample reuse methods for dependent data. J. R. Stat. Soc., Ser. B 58.

Hall, P., Jing, B.-Y. and Lahiri, S. N. (1998) On the sampling window method for long-range dependent data. Statist. Sinica 8.

Hosking, J. R. M. (1981) Fractional differencing. Biometrika 68.

Hurvich, C. M., Deo, R. and Brodsky, J. (1998) The mean square error of Geweke and Porter-Hudak's estimator of the memory parameter of a long memory time series. J. Time Series Anal. 19.

Ibragimov, I. A. and Linnik, Y. V. (1971) Independent and stationary sequences of random variables. Wolters-Noordhoff, Groningen.

Kitamura, Y. (1997) Empirical likelihood methods with weakly dependent processes. Ann. Statist. 25.

Künsch, H. R. (1987) Statistical aspects of self-similar processes. Probability Theory and Applications 1.

Künsch, H. R. (1989) The jackknife and bootstrap for general stationary observations. Ann. Statist. 17.

Lahiri, S. N. (1993) On the moving block bootstrap under long range dependence. Statist. Probab. Lett. 18.

Lahiri, S. N. (2003) Resampling methods for dependent data. Springer, New York.

Liu, R. Y. and Singh, K. (1992) Moving blocks jackknife and bootstrap capture weak dependence. In Exploring the Limits of Bootstrap, R. LePage and L. Billard (editors). John Wiley & Sons, New York.

Mandelbrot, B. B. and Van Ness, J. W. (1968) Fractional Brownian motions, fractional noises and applications. SIAM Rev. 10.

Moulines, E. and Soulier, P. (2003) Semiparametric spectral estimation for fractional processes. In P. Doukhan, G. Oppenheim and M. S. Taqqu (eds.), Theory and Applications of Long-range Dependence. Birkhäuser, Boston.

Nordman, D. J. and Lahiri, S. N. (2004) Validity of the sampling window method for long-range dependent linear processes. Preprint. Department of Statistics, University of Dortmund, Germany.

Owen, A. B. (1988) Empirical likelihood ratio confidence intervals for a single functional. Biometrika 75.

Owen, A. B. (1990) Empirical likelihood confidence regions. Ann. Statist. 18.

Owen, A. B. (2001) Empirical likelihood. Chapman & Hall, London.

Politis, D. N. and Romano, J. P. (1993) On the sample variance of linear statistics derived from mixing sequences. Stochastic Processes Appl. 45.

Robinson, P. M. (1995a) Gaussian semiparametric estimation of long range dependence. Ann. Statist. 23.

Robinson, P. M. (1995b) Log-periodogram regression of time series with long range dependence. Ann. Statist. 23.

Qin, J. and Lawless, J. (1994) Empirical likelihood and general estimating equations. Ann. Statist. 22.

Taqqu, M. S. (1977) Law of the iterated logarithm for sums of non-linear functions of Gaussian variables that exhibit a long range dependence. Z. Wahr. Verw. Gebiete 40.

Wilks, S. S. (1938) The large-sample distribution of the likelihood ratio for testing composite hypotheses. Ann. Math. Statist. 9.

Wood, A. T. A. and Chan, G. (1994) Simulation of stationary Gaussian processes in [0, 1]^d. J. Comput. Graph. Statist. 3.

Zygmund, A. (1968) Trigonometric Series, Volumes I, II. Cambridge University Press, Cambridge, U.K.

Figure 1: Gaussian innovations: Coverage percentages for approximate 95% one-sided upper EL confidence bounds using the GPH estimator with bandwidth m = C_2 n^{4/5}, C_2 ∈ {1/4, 1/2}. (Panels: OL block lengths n^{0.5}/2, n^{0.5}, 2n^{0.5}; sample sizes n = 100, 500, 1000.)

Figure 2: t-innovations: Coverage percentages for approximate 95% one-sided upper EL confidence bounds using the GPH estimator with bandwidth m = C_2 n^{4/5}, C_2 ∈ {1/4, 1/2}. (Panels: OL block lengths n^{0.5}/2, n^{0.5}, 2n^{0.5}; sample sizes n = 100, 500, 1000.)

Figure 3: Gaussian innovations: Coverage percentages for approximate 95% one-sided upper EL confidence bounds using the GSE estimator with bandwidth m = C_2 n^{4/5}, C_2 ∈ {1/4, 1/2}. (Panels: OL block lengths n^{0.5}/2, n^{0.5}, 2n^{0.5}; sample sizes n = 100, 500, 1000.)

Figure 4: t-innovations: Coverage percentages for approximate 95% one-sided upper EL confidence bounds using the GSE estimator with bandwidth m = C_2 n^{4/5}, C_2 ∈ {1/4, 1/2}. (Panels: OL block lengths n^{0.5}/2, n^{0.5}, 2n^{0.5}; sample sizes n = 100, 500, 1000.)


More information

On block bootstrapping areal data Introduction

On block bootstrapping areal data Introduction On block bootstrapping areal data Nicholas Nagle Department of Geography University of Colorado UCB 260 Boulder, CO 80309-0260 Telephone: 303-492-4794 Email: nicholas.nagle@colorado.edu Introduction Inference

More information

Estimation of Parameters in ARFIMA Processes: A Simulation Study

Estimation of Parameters in ARFIMA Processes: A Simulation Study Estimation of Parameters in ARFIMA Processes: A Simulation Study Valderio Reisen Bovas Abraham Silvia Lopes Departamento de Department of Statistics Instituto de Estatistica and Actuarial Science Matematica

More information

Memory properties of transformations of linear processes

Memory properties of transformations of linear processes Memory properties of transformations of linear processes February 13, 2016 Hailin Sang and Yongli Sang Department of Mathematics, University of Mississippi, University, MS 38677, USA. E-mail address: sang@olemiss.edu,

More information

Supplement to Quantile-Based Nonparametric Inference for First-Price Auctions

Supplement to Quantile-Based Nonparametric Inference for First-Price Auctions Supplement to Quantile-Based Nonparametric Inference for First-Price Auctions Vadim Marmer University of British Columbia Artyom Shneyerov CIRANO, CIREQ, and Concordia University August 30, 2010 Abstract

More information

A COMPARISON OF POISSON AND BINOMIAL EMPIRICAL LIKELIHOOD Mai Zhou and Hui Fang University of Kentucky

A COMPARISON OF POISSON AND BINOMIAL EMPIRICAL LIKELIHOOD Mai Zhou and Hui Fang University of Kentucky A COMPARISON OF POISSON AND BINOMIAL EMPIRICAL LIKELIHOOD Mai Zhou and Hui Fang University of Kentucky Empirical likelihood with right censored data were studied by Thomas and Grunkmier (1975), Li (1995),

More information

Asymptotic Nonequivalence of Nonparametric Experiments When the Smoothness Index is ½

Asymptotic Nonequivalence of Nonparametric Experiments When the Smoothness Index is ½ University of Pennsylvania ScholarlyCommons Statistics Papers Wharton Faculty Research 1998 Asymptotic Nonequivalence of Nonparametric Experiments When the Smoothness Index is ½ Lawrence D. Brown University

More information

RESIDUAL-BASED BLOCK BOOTSTRAP FOR UNIT ROOT TESTING. By Efstathios Paparoditis and Dimitris N. Politis 1

RESIDUAL-BASED BLOCK BOOTSTRAP FOR UNIT ROOT TESTING. By Efstathios Paparoditis and Dimitris N. Politis 1 Econometrica, Vol. 71, No. 3 May, 2003, 813 855 RESIDUAL-BASED BLOCK BOOTSTRAP FOR UNIT ROOT TESTING By Efstathios Paparoditis and Dimitris N. Politis 1 A nonparametric, residual-based block bootstrap

More information

LARGE DEVIATION PROBABILITIES FOR SUMS OF HEAVY-TAILED DEPENDENT RANDOM VECTORS*

LARGE DEVIATION PROBABILITIES FOR SUMS OF HEAVY-TAILED DEPENDENT RANDOM VECTORS* LARGE EVIATION PROBABILITIES FOR SUMS OF HEAVY-TAILE EPENENT RANOM VECTORS* Adam Jakubowski Alexander V. Nagaev Alexander Zaigraev Nicholas Copernicus University Faculty of Mathematics and Computer Science

More information

Covariance function estimation in Gaussian process regression

Covariance function estimation in Gaussian process regression Covariance function estimation in Gaussian process regression François Bachoc Department of Statistics and Operations Research, University of Vienna WU Research Seminar - May 2015 François Bachoc Gaussian

More information

Generalized Neyman Pearson optimality of empirical likelihood for testing parameter hypotheses

Generalized Neyman Pearson optimality of empirical likelihood for testing parameter hypotheses Ann Inst Stat Math (2009) 61:773 787 DOI 10.1007/s10463-008-0172-6 Generalized Neyman Pearson optimality of empirical likelihood for testing parameter hypotheses Taisuke Otsu Received: 1 June 2007 / Revised:

More information

Asymptotically Efficient Nonparametric Estimation of Nonlinear Spectral Functionals

Asymptotically Efficient Nonparametric Estimation of Nonlinear Spectral Functionals Acta Applicandae Mathematicae 78: 145 154, 2003. 2003 Kluwer Academic Publishers. Printed in the Netherlands. 145 Asymptotically Efficient Nonparametric Estimation of Nonlinear Spectral Functionals M.

More information

AN EMPIRICAL COMPARISON OF BLOCK BOOTSTRAP METHODS: TRADITIONAL AND NEWER ONES

AN EMPIRICAL COMPARISON OF BLOCK BOOTSTRAP METHODS: TRADITIONAL AND NEWER ONES Journal of Data Science 14(2016), 641-656 AN EMPIRICAL COMPARISON OF BLOCK BOOTSTRAP METHODS: TRADITIONAL AND NEWER ONES Beste H. Beyaztas a,b, Esin Firuzan b* a Department of Statistics, Istanbul Medeniyet

More information

Block Bootstrap Prediction Intervals for Vector Autoregression

Block Bootstrap Prediction Intervals for Vector Autoregression Department of Economics Working Paper Block Bootstrap Prediction Intervals for Vector Autoregression Jing Li Miami University 2013 Working Paper # - 2013-04 Block Bootstrap Prediction Intervals for Vector

More information

A Practical Guide to Measuring the Hurst Parameter

A Practical Guide to Measuring the Hurst Parameter A Practical Guide to Measuring the Hurst Parameter Richard G. Clegg June 28, 2005 Abstract This paper describes, in detail, techniques for measuring the Hurst parameter. Measurements are given on artificial

More information

Inference in VARs with Conditional Heteroskedasticity of Unknown Form

Inference in VARs with Conditional Heteroskedasticity of Unknown Form Inference in VARs with Conditional Heteroskedasticity of Unknown Form Ralf Brüggemann a Carsten Jentsch b Carsten Trenkler c University of Konstanz University of Mannheim University of Mannheim IAB Nuremberg

More information

Temporal aggregation of stationary and nonstationary discrete-time processes

Temporal aggregation of stationary and nonstationary discrete-time processes Temporal aggregation of stationary and nonstationary discrete-time processes HENGHSIU TSAI Institute of Statistical Science, Academia Sinica, Taipei, Taiwan 115, R.O.C. htsai@stat.sinica.edu.tw K. S. CHAN

More information

Some functional (Hölderian) limit theorems and their applications (II)

Some functional (Hölderian) limit theorems and their applications (II) Some functional (Hölderian) limit theorems and their applications (II) Alfredas Račkauskas Vilnius University Outils Statistiques et Probabilistes pour la Finance Université de Rouen June 1 5, Rouen (Rouen

More information

Wavelet domain test for long range dependence in the presence of a trend

Wavelet domain test for long range dependence in the presence of a trend Wavelet domain test for long range dependence in the presence of a trend Agnieszka Jach and Piotr Kokoszka Utah State University July 24, 27 Abstract We propose a test to distinguish a weakly dependent

More information

EXTENDED TAPERED BLOCK BOOTSTRAP

EXTENDED TAPERED BLOCK BOOTSTRAP Statistica Sinica 20 (2010), 000-000 EXTENDED TAPERED BLOCK BOOTSTRAP Xiaofeng Shao University of Illinois at Urbana-Champaign Abstract: We propose a new block bootstrap procedure for time series, called

More information

large number of i.i.d. observations from P. For concreteness, suppose

large number of i.i.d. observations from P. For concreteness, suppose 1 Subsampling Suppose X i, i = 1,..., n is an i.i.d. sequence of random variables with distribution P. Let θ(p ) be some real-valued parameter of interest, and let ˆθ n = ˆθ n (X 1,..., X n ) be some estimate

More information

Convergence rates in weighted L 1 spaces of kernel density estimators for linear processes

Convergence rates in weighted L 1 spaces of kernel density estimators for linear processes Alea 4, 117 129 (2008) Convergence rates in weighted L 1 spaces of kernel density estimators for linear processes Anton Schick and Wolfgang Wefelmeyer Anton Schick, Department of Mathematical Sciences,

More information

Asymptotic efficiency of simple decisions for the compound decision problem

Asymptotic efficiency of simple decisions for the compound decision problem Asymptotic efficiency of simple decisions for the compound decision problem Eitan Greenshtein and Ya acov Ritov Department of Statistical Sciences Duke University Durham, NC 27708-0251, USA e-mail: eitan.greenshtein@gmail.com

More information

A Primer on Asymptotics

A Primer on Asymptotics A Primer on Asymptotics Eric Zivot Department of Economics University of Washington September 30, 2003 Revised: October 7, 2009 Introduction The two main concepts in asymptotic theory covered in these

More information

Nonlinear time series

Nonlinear time series Based on the book by Fan/Yao: Nonlinear Time Series Robert M. Kunst robert.kunst@univie.ac.at University of Vienna and Institute for Advanced Studies Vienna October 27, 2009 Outline Characteristics of

More information

Habilitation à diriger des recherches. Resampling methods for periodic and almost periodic processes

Habilitation à diriger des recherches. Resampling methods for periodic and almost periodic processes Habilitation à diriger des recherches Resampling methods for periodic and almost periodic processes Anna Dudek Université Rennes 2 aedudek@agh.edu.pl diriger des recherches Resampling methods for periodic

More information

Asymptotic distribution of GMM Estimator

Asymptotic distribution of GMM Estimator Asymptotic distribution of GMM Estimator Eduardo Rossi University of Pavia Econometria finanziaria 2010 Rossi (2010) GMM 2010 1 / 45 Outline 1 Asymptotic Normality of the GMM Estimator 2 Long Run Covariance

More information

Single Equation Linear GMM with Serially Correlated Moment Conditions

Single Equation Linear GMM with Serially Correlated Moment Conditions Single Equation Linear GMM with Serially Correlated Moment Conditions Eric Zivot October 28, 2009 Univariate Time Series Let {y t } be an ergodic-stationary time series with E[y t ]=μ and var(y t )

More information

A nonparametric test for seasonal unit roots

A nonparametric test for seasonal unit roots Robert M. Kunst robert.kunst@univie.ac.at University of Vienna and Institute for Advanced Studies Vienna To be presented in Innsbruck November 7, 2007 Abstract We consider a nonparametric test for the

More information

A Single-Pass Algorithm for Spectrum Estimation With Fast Convergence Han Xiao and Wei Biao Wu

A Single-Pass Algorithm for Spectrum Estimation With Fast Convergence Han Xiao and Wei Biao Wu 4720 IEEE TRANSACTIONS ON INFORMATION THEORY, VOL 57, NO 7, JULY 2011 A Single-Pass Algorithm for Spectrum Estimation With Fast Convergence Han Xiao Wei Biao Wu Abstract We propose a single-pass algorithm

More information

Prof. Dr. Roland Füss Lecture Series in Applied Econometrics Summer Term Introduction to Time Series Analysis

Prof. Dr. Roland Füss Lecture Series in Applied Econometrics Summer Term Introduction to Time Series Analysis Introduction to Time Series Analysis 1 Contents: I. Basics of Time Series Analysis... 4 I.1 Stationarity... 5 I.2 Autocorrelation Function... 9 I.3 Partial Autocorrelation Function (PACF)... 14 I.4 Transformation

More information

Invariance of the first difference in ARFIMA models

Invariance of the first difference in ARFIMA models Computational Statistics DOI 10.1007/s00180-006-0005-0 ORIGINAL PAPER Invariance of the first difference in ARFIMA models Barbara P. Olbermann Sílvia R.C. Lopes Valdério A. Reisen Physica-Verlag 2006 Abstract

More information

Differencing Revisited: I ARIMA(p,d,q) processes predicated on notion of dth order differencing of a time series {X t }: for d = 1 and 2, have X t

Differencing Revisited: I ARIMA(p,d,q) processes predicated on notion of dth order differencing of a time series {X t }: for d = 1 and 2, have X t Differencing Revisited: I ARIMA(p,d,q) processes predicated on notion of dth order differencing of a time series {X t }: for d = 1 and 2, have X t 2 X t def in general = (1 B)X t = X t X t 1 def = ( X

More information

Journal of Statistical Research 2007, Vol. 41, No. 1, pp Bangladesh

Journal of Statistical Research 2007, Vol. 41, No. 1, pp Bangladesh Journal of Statistical Research 007, Vol. 4, No., pp. 5 Bangladesh ISSN 056-4 X ESTIMATION OF AUTOREGRESSIVE COEFFICIENT IN AN ARMA(, ) MODEL WITH VAGUE INFORMATION ON THE MA COMPONENT M. Ould Haye School

More information

Adaptive Local Polynomial Whittle Estimation of Long-range Dependence

Adaptive Local Polynomial Whittle Estimation of Long-range Dependence Adaptive Local Polynomial Whittle Estimation of Long-range Dependence Donald W. K. Andrews 1 Cowles Foundation for Research in Economics Yale University Yixiao Sun Department of Economics University of

More information

Bahadur representations for bootstrap quantiles 1

Bahadur representations for bootstrap quantiles 1 Bahadur representations for bootstrap quantiles 1 Yijun Zuo Department of Statistics and Probability, Michigan State University East Lansing, MI 48824, USA zuo@msu.edu 1 Research partially supported by

More information

TIME SERIES AND FORECASTING. Luca Gambetti UAB, Barcelona GSE Master in Macroeconomic Policy and Financial Markets

TIME SERIES AND FORECASTING. Luca Gambetti UAB, Barcelona GSE Master in Macroeconomic Policy and Financial Markets TIME SERIES AND FORECASTING Luca Gambetti UAB, Barcelona GSE 2014-2015 Master in Macroeconomic Policy and Financial Markets 1 Contacts Prof.: Luca Gambetti Office: B3-1130 Edifici B Office hours: email:

More information

STA205 Probability: Week 8 R. Wolpert

STA205 Probability: Week 8 R. Wolpert INFINITE COIN-TOSS AND THE LAWS OF LARGE NUMBERS The traditional interpretation of the probability of an event E is its asymptotic frequency: the limit as n of the fraction of n repeated, similar, and

More information

THE K-FACTOR GARMA PROCESS WITH INFINITE VARIANCE INNOVATIONS. 1 Introduction. Mor Ndongo 1 & Abdou Kâ Diongue 2

THE K-FACTOR GARMA PROCESS WITH INFINITE VARIANCE INNOVATIONS. 1 Introduction. Mor Ndongo 1 & Abdou Kâ Diongue 2 THE K-FACTOR GARMA PROCESS WITH INFINITE VARIANCE INNOVATIONS Mor Ndongo & Abdou Kâ Diongue UFR SAT, Universit Gaston Berger, BP 34 Saint-Louis, Sénégal (morndongo000@yahoo.fr) UFR SAT, Universit Gaston

More information

LONG TERM DEPENDENCE IN STOCK RETURNS

LONG TERM DEPENDENCE IN STOCK RETURNS LONG TERM DEPENDENCE IN STOCK RETURNS John T. Barkoulas Department of Economics Boston College Christopher F. Baum Department of Economics Boston College Keywords: Stock returns, long memory, fractal dynamics,

More information

Unit Root Tests Using the ADF-Sieve Bootstrap and the Rank Based DF Test Statistic: Empirical Evidence

Unit Root Tests Using the ADF-Sieve Bootstrap and the Rank Based DF Test Statistic: Empirical Evidence Unit Root Tests Using the ADF-Sieve Bootstrap and the Rank Based DF Test Statistic: Empirical Evidence Valderio A. Reisen 1 and Maria Eduarda Silva 2 1 Departamento de Estatística, CCE, UFES, Vitoria,

More information

LECTURE 10 LINEAR PROCESSES II: SPECTRAL DENSITY, LAG OPERATOR, ARMA. In this lecture, we continue to discuss covariance stationary processes.

LECTURE 10 LINEAR PROCESSES II: SPECTRAL DENSITY, LAG OPERATOR, ARMA. In this lecture, we continue to discuss covariance stationary processes. MAY, 0 LECTURE 0 LINEAR PROCESSES II: SPECTRAL DENSITY, LAG OPERATOR, ARMA In this lecture, we continue to discuss covariance stationary processes. Spectral density Gourieroux and Monfort 990), Ch. 5;

More information

Robust Backtesting Tests for Value-at-Risk Models

Robust Backtesting Tests for Value-at-Risk Models Robust Backtesting Tests for Value-at-Risk Models Jose Olmo City University London (joint work with Juan Carlos Escanciano, Indiana University) Far East and South Asia Meeting of the Econometric Society

More information

An Evaluation of Errors in Energy Forecasts by the SARFIMA Model

An Evaluation of Errors in Energy Forecasts by the SARFIMA Model American Review of Mathematics and Statistics, Vol. 1 No. 1, December 13 17 An Evaluation of Errors in Energy Forecasts by the SARFIMA Model Leila Sakhabakhsh 1 Abstract Forecasting is tricky business.

More information

Multivariate Analysis and Likelihood Inference

Multivariate Analysis and Likelihood Inference Multivariate Analysis and Likelihood Inference Outline 1 Joint Distribution of Random Variables 2 Principal Component Analysis (PCA) 3 Multivariate Normal Distribution 4 Likelihood Inference Joint density

More information

On the robustness of cointegration tests when series are fractionally integrated

On the robustness of cointegration tests when series are fractionally integrated On the robustness of cointegration tests when series are fractionally integrated JESUS GONZALO 1 &TAE-HWYLEE 2, 1 Universidad Carlos III de Madrid, Spain and 2 University of California, Riverside, USA

More information

Research Statement. Mamikon S. Ginovyan. Boston University

Research Statement. Mamikon S. Ginovyan. Boston University Research Statement Mamikon S. Ginovyan Boston University 1 Abstract My research interests and contributions mostly are in the areas of prediction, estimation and hypotheses testing problems for second

More information

5: MULTIVARATE STATIONARY PROCESSES

5: MULTIVARATE STATIONARY PROCESSES 5: MULTIVARATE STATIONARY PROCESSES 1 1 Some Preliminary Definitions and Concepts Random Vector: A vector X = (X 1,..., X n ) whose components are scalarvalued random variables on the same probability

More information