Uniform confidence bands for kernel regression estimates with dependent data IN MEMORY OF F. MARMOL


1. Uniform confidence bands for kernel regression estimates with dependent data. J. Hidalgo, London School of Economics. IN MEMORY OF F. MARMOL

2. AIMS AND ESTIMATION. METHODOLOGY. CONDITIONS. RESULTS. BOOTSTRAP.

3. AIMS AND ESTIMATION: Given a set of data $\{y_t\}_{t=1}^n$, the object of interest $f(\cdot)$ may be a regression function, a density function or a spectral density. Questions of interest can be: is $f(x)$ smooth? What is its functional form?

4. Here we consider a regression function $f(\cdot) = r(\cdot)$, that is,
$y_t = r(x_t) + u_t$, $t = 1, \dots, n$, (1.1)
where the errors $u_t$ are assumed to follow a covariance stationary linear process and where, for simplicity, we shall consider henceforth that $x_t = t/n$. Our interests are:

5. Test for a particular functional form of $r(x)$: $H_0\colon r(x) = r_0(x)$ for all $x \in [0,1]$. Test for the continuity of $r(x)$: $H_0\colon r_+(x) = r_-(x)$ for all $x \in [0,1]$, where $r_+(x) = \lim_{z \to x+} r(z)$ and $r_-(x) = \lim_{z \to x-} r(z)$.

6. We now describe the kernel estimator of $r(x)$ at a point $x_q = q/n$, $1 + \check n \le q \le n - \check n - 1$:
$\hat r_a(q) := \hat r_a(x_q) = \frac{1}{\check n} \sum_{t=1}^n y_t K\!\left(\frac{t-q}{\check n}\right)$, (2.1)
where $K(x)$ is a symmetric kernel on $[-1,1]$, $\check n = [na]$, and $a \to 0$, $\check n \to \infty$ as $n \to \infty$.
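As a concrete sketch of the estimator in (2.1) (a Priestley-Chao-type smoother with equally spaced design points $x_t = t/n$); the Epanechnikov kernel below is an illustrative choice of a symmetric $K$ on $[-1,1]$ integrating to 1, not one singled out by the slides:

```python
def kernel_epanechnikov(x):
    """An illustrative symmetric kernel on [-1, 1] that integrates to 1."""
    return 0.75 * (1.0 - x * x) if abs(x) <= 1.0 else 0.0

def r_hat(y, q, a, K=kernel_epanechnikov):
    """Estimator (2.1): r_hat_a(q) = ncheck^{-1} sum_t y_t K((t - q)/ncheck),
    with ncheck = [n a] and design points x_t = t/n."""
    n = len(y)
    ncheck = int(n * a)
    return sum(y[t - 1] * K((t - q) / ncheck) for t in range(1, n + 1)) / ncheck
```

Since the weights $K((t-q)/\check n)/\check n$ form a Riemann sum of $\int K = 1$, feeding in a constant series returns approximately that constant.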

7. To test for the continuity of $r(x)$, we employ the one-sided kernels of Rice (1984). These kernels were introduced to obtain consistent estimates at the boundary of the kernel's compact support; in our context they are useful because the implementation of the test requires the estimation of $r_+(x)$ and $r_-(x)$. Let $K_+(x)$ and $K_-(x)$ be one-sided kernels. Our kernel estimators at the point $x_q$ are
$\hat r_{a,\pm}(q) = \frac{1}{\check n} \sum_{t=1}^n y_t K_\pm\!\left(\frac{t-q}{\check n}\right)$. (2.2)

8. METHODOLOGY: Tests are based on uniform confidence bands for $\hat r(x)$, that is, on
$\sup_{x \in [0,1]} |\hat r(x) - r(x)|$.
The main result is to obtain its limiting distribution.

9. For the kernel density estimator, say $\hat f(x)$, of $f(x)$, Woodroofe (1967) showed that
$\Pr\{ b_n ( \sup_{x \in \Xi_n} |\hat f(x) - f(x)| - c_n ) < x \} \to_d \exp(-2e^{-x})$, (1.2)
where $x \in \mathbb{R}_+$, $b_n$ and $c_n$ are normalization constants and $\Xi_n$ is a grid of points in $[0,1]$. This result was extended by Bickel and Rosenblatt (1973) to the supremum over all $x \in [0,1]$.

10. In regression models, these results were extended, among others, by Johnston (1982) for iid errors $u_t$. However, we wish to relax the independence of $u_t$ and, in particular, to drop any mixing dependence condition on $u_t$. For instance, we allow for strong dependence as in Csörgő and Mielniczuk (1995). Recall that when the data are not weakly dependent, mixing conditions are incompatible with this; see Ibragimov and Rozanov (1981).

11. However, contrary to Csörgő and Mielniczuk (1995), we take the supremum not on $\Xi_n$ but on $[0,1]$. The possibility of strong dependence in $u_t$ implies that the normalization constants $b_n$ and $c_n$ in (1.2) depend on the degree of dependence of $u_t$, measured by the long memory parameter $d$, in contrast to the case where $u_t$ is strong mixing.

12. CONDITIONS: C1. $\{u_t\}_{t \in \mathbb{Z}}$ is a covariance stationary linear process defined as
$u_t = \sum_{j=0}^\infty \theta_j \varepsilon_{t-j}$, $\sum_{j=0}^\infty \theta_j^2 < \infty$, with $\theta_0 = 1$,
where $\{\varepsilon_t\}_{t \in \mathbb{Z}}$ is a zero mean iid sequence with $E(\varepsilon_t^2) = \sigma_\varepsilon^2$ and $E|\varepsilon_t|^\mu < \infty$ for some $\mu > 4$. Also, for $j \ge 0$,
$\theta_j = \sum_{k=0}^j \vartheta_k b_{j-k}$, $\vartheta_0 = b_0 = 1$,
where, for $k \ge 1$,
$\vartheta_k = \ell(k)\, k^{d-1}$, $d \in [0, 1/2)$; $|\ell(k) - \ell(k+1)| < \ell_0(k)\, k^{-1}$ with $\ell_0(k) > 0$,
and $\sum_{k=0}^\infty k^2 |b_k| < \infty$.

13. Condition C1 implies that
$u_t = \sum_{j=0}^\infty \vartheta_j \varepsilon'_{t-j}$, with $\varepsilon'_t = \sum_{j=0}^\infty b_j \varepsilon_{t-j}$. (2.3)
One model satisfying (2.3) is the FARIMA$(p,d,q)$ process
$(1-L)^d \Phi_p(L) u_t = \Theta_q(L) \varepsilon_t$,
where $(1-L)^{-d} = \sum_{k=0}^\infty \vartheta_k L^k$ with $\vartheta_k = \Gamma(k+d)/(\Gamma(d)\Gamma(k+1))$, and $\Phi_p(L)$ and $\Theta_q(L)$ are the autoregressive and moving average polynomials with no common roots and all roots outside the unit circle. The latter implies that $\Phi_p^{-1}(L)\Theta_q(L) = \sum_{j=0}^\infty b_j L^j$ with $b_j = O(j^{-a})$ for any $a > 0$.

14. Denoting
$B(\lambda) = \sum_{j=0}^\infty b_j e^{ij\lambda}$, $\lambda \in (-\pi, \pi]$,
the decomposition in (2.3) implies that the spectral density function of $\{u_t\}_{t \in \mathbb{Z}}$ is
$f(\lambda) = \frac{\sigma_\varepsilon^2}{2\pi}\, g(\lambda)\, h(\lambda)$,
where $g(\lambda) = |\sum_{j=0}^\infty \vartheta_j e^{ij\lambda}|^2$ and $h(\lambda) = |B(\lambda)|^2$. The condition $\sum_{k=0}^\infty k^2 |b_k| < \infty$ implies that $h(\lambda)$ is twice continuously differentiable for all $\lambda \in [0, \pi]$. An example of the function $\ell(k)$ is $\ell(k) = \log^\xi k$ for any $\xi > 0$.

15. Moreover, because of C1, $\vartheta_k$ is of bounded variation, that is $\sum_{k=0}^\infty |\vartheta_k - \vartheta_{k+1}| < \infty$, and $\vartheta_k$ is also a quasi-monotonic sequence (see Yong, 1974, p. 4). We then have that
$g(\lambda) \sim D \lambda^{-2d}$ as $\lambda \to 0+$, $0 < D < \infty$,
by Yong's (1974) Theorems III-11 and III-12. Finally, we have that
$\gamma_j = \int_{-\pi}^{\pi} g(\lambda) \cos(j\lambda)\, d\lambda \sim \zeta(d)\, j^{2d-1}$ as $j \to \infty$, for $d \in (0, \frac{1}{2})$, (2.4)
and $\gamma_j = 2\pi \delta_{j0}$ for $d = 0$, where $\delta_{ab}$ is the Kronecker delta and $\zeta(d) = 2\Gamma(1-2d)\cos(\pi(\frac{1}{2}-d))$.

16. We could allow for more than one spectral singularity when examining the asymptotic properties of the supremum of $\hat r_a(q)$. However, confining ourselves to a single singularity eases the arguments for the validity of the bootstrap.

17. C2. For all $x \in [0,1]$, the regression function $r(x)$ satisfies
$|r(y) - r(x) - Q(y - x)| / |x - y|^\tau \to 0$ as $y \to x$,
where $0 < \tau \le 2$ and $Q(\cdot)$ is a polynomial of degree $[\tau - 1]$ (with coefficients possibly depending on $x$), with $[z]$ denoting the integer part of $z$.
C3. $K\colon [-1,1] \to \mathbb{R}$ is a symmetric, continuously differentiable function on $(-1,1)$ which integrates to 1.
C4. $K_+\colon [0,1] \to \mathbb{R}$ and $K_-\colon [-1,0] \to \mathbb{R}$, where $K_-(x) = K_+(-x)$ (so that $K_-^{(1)}(x) = -K_+^{(1)}(-x)$), $K_+(0) = K_+(1) = 0$, $\int_0^1 K_+(x)\,dx = 1$ and $\int_0^1 x K_+(x)\,dx = 0$.

18. C5. As $n \to \infty$, (i) $\check n^{-1} \to 0$ and (ii) $n^{1/2-d}\, a^{1/2-d+\tau} \to D < \infty$, with $\tau$ as in Condition C2. If $d = 0$ and $\tau = 2$ in C2, then the optimal choice of $a$ satisfies $a = D n^{-1/5}$ for some finite positive constant $D$, which corresponds to the choice of the bandwidth parameter obtained by, say, cross-validation. Also, note that for a given degree of smoothness of $r(x)$, that is $\tau$ in C2, the bandwidth parameter $a$ converges to zero more slowly as $d$ increases.

19. We shall give a proposition before the main results.
Proposition 1. Under Conditions C1 and C3,
$\sup_{1 \le s \le n} \big| \sum_{t=1}^s K_t u_t - \sum_{t=1}^s K_t \tilde u_t \big| = o_p(n^{d+1/4})$, (2.5)
where $\tilde u_t = \sum_{j=0}^\infty \vartheta_j \tilde\varepsilon_{t-j}$ and $\{\tilde\varepsilon_t\}_{t \in \mathbb{Z}}$ is a zero mean iid sequence of standard normal random variables.

20. RESULTS. We shall first examine the covariance of $\hat r_a(q)$ at two different points $x_{q_1} \le x_{q_2}$. Define $b(q_1, q_2) = (q_2 - q_1)/\check n$ and $b := \lim_{n\to\infty} b(q_1, q_2)$.

21. Proposition 2. Assuming C1-C3 and C5, for any $\check n < q_1 \le q_2 < n - \check n$, as $n \to \infty$,
$\check n^{1-2d}\, \mathrm{Cov}(\hat r_a(q_1), \hat r_a(q_2)) \to \rho(b; d)$, (3.1)
where
(a) if $0 < d < \frac{1}{2}$,
$\rho(b; d) = H \theta(d) \int_{-1}^{1} \int_{b-1}^{b+1} |v - w|^{2d-1}\, K(v)\, K(w - b)\, dv\, dw$; (3.2)
(b) if $d = 0$,
$\rho(b; d) = H \int_{-1}^{1} K(v)\, K(v - b)\, dv$.

22. The next proposition deals with the correlation structure of $\hat r_a(q)$ as $b \to 0$ and as $b \to \infty$ with $n$. This will play an important role when studying the limit distribution of $T_r$ given in (3.3).
Proposition 3. Under C1-C3 and C5, for some $\alpha \in (0, 2]$, as $n \to \infty$,
(a) $\rho(b(q_1,q_2); d) / \rho(b(q_1,q_1); d) - 1 = -D |b(q_1,q_2)|^\alpha + o(|b(q_1,q_2)|^\alpha)$ as $b(q_1,q_2) \to 0$;
(b) $\rho(b(q_1,q_2); d) \log(b(q_1,q_2)) = o(1)$ as $b(q_1,q_2) \to \infty$.

23. Proposition 4. Assuming C1-C3 and C5, for any finite collection $q_j$, $j = 1, \dots, p$, such that $\check n < q_j < n - \check n$ and $|q_{j_1} - q_{j_2}| \ge nz > 0$, as $n \to \infty$,
$\check n^{1/2-d}\, \rho^{-1/2}(0; d) \big( \hat r_a(q_j) - r(q_j) \big)_{j=1,\dots,p} \to_d N(0, \mathrm{diag}(1, \dots, 1))$.

24. These results are key to obtaining the asymptotic distribution of
$T_r = \sup_{q = \check n + 1, \dots, n - \check n} | \hat r_a(q) - r(q) |$. (3.3)

25. Theorem 5. Let $\upsilon_n = (-2 \log a)^{1/2}$. Assuming that C1-C3 and C5 hold,
$\Pr\{ \upsilon_n ( \check n^{1/2-d}\, \rho^{-1/2}(0; d)\, T_r - \xi_n ) \le x \} \to \exp(-2e^{-x})$, for $x > 0$,
where
(a) if $0 < d < 1/2$, then
$\xi_n = \upsilon_n + \upsilon_n^{-1} \big\{ (\tfrac{1}{\alpha} - \tfrac{1}{2}) \log\log a^{-1} + \log\big( (2\pi)^{-1/2}\, 2^{(2-\alpha)/(2\alpha)}\, E^{1/\alpha}\, J_\alpha \big) \big\}$
for some $0 < E < \infty$, where $\alpha$ is as given in Proposition 3 and
$0 < J_\alpha = \lim_{a \to 0} a \int_0^\infty e^{s} \Pr\big\{ \sup_{0 \le t \le a^{-1}} Y(t) > s \big\}\, ds < \infty$,

26. and $Y(t)$ is a mean zero Gaussian process with covariance structure $\mathrm{Cov}(Y(t_1), Y(t_2)) = |t_1|^\alpha + |t_2|^\alpha - |t_2 - t_1|^\alpha$.
(b) If $d = 0$, then
$\xi_n = \upsilon_n + \upsilon_n^{-1} \log\big\{ \tfrac{1}{2\pi} \big( \int_{-1}^{1} K^{(1)}(x)^2\, dx \big)^{1/2} \big\}$.

27. A test for $H_0\colon r(x) = r_0(x)$, $x \in (0,1)$, at significance level $\kappa$ is based on rejecting if
$\upsilon_n \big( \check n^{1/2-d}\, \rho^{-1/2}(0; d)\, T_r - \xi_n \big)$
is greater than $x_\kappa$, the $(1-\kappa)$th quantile of the Gumbel limiting distribution.
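The critical value $x_\kappa$ solves $\exp(-2e^{-x_\kappa}) = 1 - \kappa$ and so is available in closed form; a minimal sketch:

```python
import math

def gumbel_critical_value(kappa):
    """Solve exp(-2 exp(-x)) = 1 - kappa for x: the (1 - kappa)th quantile
    of the Gumbel-type limit in Theorem 5."""
    return -math.log(-0.5 * math.log(1.0 - kappa))
```

For instance, $\kappa = 0.05$ gives $x_\kappa \approx 3.66$.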

28. We now describe the test for continuity versus structural breaks, that is,
$H_0\colon r_+(x) = r_-(x)$, $x \in (0,1)$. (3.5)
Because under $H_0$, $r_+(x) = r_-(x)$ for all $x \in [0,1]$, we should expect that $\hat r_{a,+}(q) \simeq \hat r_{a,-}(q)$ for all $q = \check n + 1, \dots, n - \check n - 1$. This suggests that a test for breaks can be based on
$T_r^b = \sup_{\check n < q < n - \check n} | \hat r_{a,+}(q) - \hat r_{a,-}(q) |$. (3.6)

29. Before we examine the properties of $T_r^b$, we first examine the covariance of $\hat r_{a,+}(q) - \hat r_{a,-}(q)$ at two different points $x_{q_1} \le x_{q_2}$.

30. Proposition 6. Assuming C1, C2, C4 and C5, under $H_0$, for any $\check n < q_1 \le q_2 < n - \check n$, as $n \to \infty$,
$\check n^{1-2d}\, \mathrm{Cov}\big( \hat r_{a,+}(q_1) - \hat r_{a,-}(q_1),\; \hat r_{a,+}(q_2) - \hat r_{a,-}(q_2) \big) \to \tilde\rho(b; d) =: \rho_+(b; d) + \rho_-(b; d) - \rho_\pm(b; d) - \rho_\mp(b; d)$,
where
(a) if $0 < d < \frac{1}{2}$,
$\rho_+(b; d) = H\theta(d) \int_0^1 \int_b^{1+b} |v - w|^{2d-1}\, K_+(v)\, K_+(w - b)\, dv\, dw$,

31. $\rho_-(b; d) = H\theta(d) \int_{-1}^0 \int_{b-1}^{b} |v - w|^{2d-1}\, K_-(v)\, K_-(w - b)\, dv\, dw$,
$\rho_\pm(b; d) = H\theta(d) \int_0^1 \int_{b-1}^{b} |v - w|^{2d-1}\, K_+(v)\, K_-(w - b)\, dv\, dw$,
$\rho_\mp(b; d) = H\theta(d) \int_{-1}^0 \int_b^{1+b} |v - w|^{2d-1}\, K_-(v)\, K_+(w - b)\, dv\, dw$;
(b) if $d = 0$, $\rho_+(b; d) = \rho_-(b; d) = 4^{-1} H \int_0^1 K_+(v)\, K_+(v - b)\, dv$ and $\rho_\pm(b; d) = \rho_\mp(b; d) = 0$.

32. The next proposition deals with the correlation structure of $\hat r_{a,+}(q) - \hat r_{a,-}(q)$ as $b(q_1,q_2) \to 0$.
Proposition 7. Under C1, C2, C4 and C5, for some $\alpha \in (0, 2]$, as $n \to \infty$,
(a) $\tilde\rho(b(q_1,q_2); d) / \tilde\rho(b(q_1,q_1); d) - 1 = -D |b(q_1,q_2)|^\alpha + o(|b(q_1,q_2)|^\alpha)$ as $b(q_1,q_2) \to 0$;
(b) $\tilde\rho(b(q_1,q_2); d) \log(b(q_1,q_2)) = o(1)$ as $b(q_1,q_2) \to \infty$.
Proposition 8. Assuming C1, C2, C4 and C5, for any finite collection $q_j$, $j = 1, \dots, p$, such that $\check n < q_j < n - \check n$ and $|q_{j_1} - q_{j_2}| \ge nz > 0$, as $n \to \infty$,
$\check n^{1/2-d}\, \tilde\rho^{-1/2}(0; d) \big( \hat r_{a,+}(q_j) - \hat r_{a,-}(q_j) \big)_{j=1,\dots,p} \to_d N(0, \mathrm{diag}(1, \dots, 1))$.

33. Theorem 9. Let $\upsilon_n = (-2 \log a)^{1/2}$. Assuming that C1, C2, C4 and C5 hold,
$\Pr\{ \upsilon_n ( \check n^{1/2-d}\, \tilde\rho^{-1/2}(0; d)\, T_r^b - \xi_n ) \le x \} \to \exp(-2e^{-x})$, for $x > 0$,
where
(a) if $0 < d < 1/2$, then
$\xi_n = \upsilon_n + \upsilon_n^{-1} \big\{ (\tfrac{1}{\alpha} - \tfrac{1}{2}) \log\log a^{-1} + \log\big( (2\pi)^{-1/2}\, 2^{(2-\alpha)/(2\alpha)}\, E^{1/\alpha}\, J_\alpha \big) \big\}$,
where $E$, $J_\alpha$ and $Y(t)$ are as in Theorem 5;
(b) if $d = 0$, then
$\xi_n = \upsilon_n + \upsilon_n^{-1} \log\big\{ \tfrac{1}{2\pi} \big( \int_0^1 K_+^{(1)}(x)^2\, dx \big)^{1/2} \big\}$.

34. Theorems 5 and 9 give asymptotic justification for $T_r$ and $T_r^b$ given in (3.3) and (3.6), respectively, and hence for the hypothesis testing in, say, (3.5) under $H_0$. We observe that one of the normalization constants needed to achieve the asymptotic distribution depends on $d$. Also, and perhaps more importantly, $\xi_n$ depends not only on $d$ but also on $J_\alpha$, which may be very difficult to compute except for the special cases $\alpha = 1$ or $2$. In particular, we have that, for $\alpha = 2$, $\xi_n = \upsilon_n + \upsilon_n^{-1} \log( \pi^{-1} (E/2)^{1/2} )$, and, for $\alpha = 1$, $\xi_n = \upsilon_n + \upsilon_n^{-1} \log\{ (E/\pi)^{1/2} \log\log a^{-1} \}$, where $E$ is a constant which depends on $K$ and is easy to obtain.

35. More specifically, although $d$ can be estimated, we face one potential difficulty in obtaining the uniform bounds for $\hat r_a(q)$. From the proof of Proposition 3, $\alpha$ depends on $K$ as well as on $d$, so that obtaining $J_\alpha$ does not seem an easy task. Under these circumstances, bootstrap algorithms appear to be a sensible way forward.

36. BOOTSTRAP
- Of major interest for making inferences when the quantiles of the limiting distribution are difficult or impossible to obtain.
- The bootstrap statistic, say $T_r^*$, consistently estimates the distribution of $T_r$ under $H_0 \cup H_1$; that is, $T_r^* \to_{d^*} T_r$.
- A second requirement (to have good power) is that, under $H_1$, $T_r^*$ also converges in bootstrap distribution.

37. Motivation
(a) The rate of convergence to the Gumbel distribution is logarithmic; see for example Hall (1979, 1991) or Resnick (1987). Konakov and Piterbarg (1984) and Hall (1991) suggest this rate is the best possible one.
(b) The normalization constants needed to achieve the asymptotic distribution of $\sup_{x\in[0,1]} |\hat r(x) - r(x)|$ are model dependent and difficult to obtain.

38. Solution: Edgeworth expansions and bootstrap techniques.
Edgeworth expansions: Konakov and Piterbarg (1984) showed that the accuracy is $O(n^{-\delta})$ for any arbitrarily small $\delta > 0$. Hall (1991) showed that bootstrap methods do a better job, in that the accuracy is $O(\check n^{-1/2} \log^2 n)$.

39. Bootstrap for time series:
- Time Domain.
- Frequency Domain.

40. TIME DOMAIN
1. Moving block bootstrap (MBB) of Künsch (1989).
2. Subsampling of Politis and Romano (1994).
3. Sieve-bootstrap of Bühlmann (1997), following ideas in Kreiss (1988).

41. FREQUENCY DOMAIN. Basically, these procedures are based on resampling the periodogram or the (modulus of the) DFT of $\{u_t\}_{t=1}^n$. Specifically, the bootstrap approximates $|\sum_{j=0}^\infty \vartheta_j e^{ij\lambda}|^2$. Examples of the former are Franke and Härdle (1992) or Dahlhaus and Janas (1996), whereas among the latter see Theiler et al. (1992), Prichard and Theiler (1994) or, more recently, Hidalgo (2003). One problem with the latter bootstraps is that they may not be valid in some circumstances.

42. One example is when bootstrapping the sample mean, and the same happens to be true in our context. See also Dahlhaus and Janas (1996) for some remarks and other examples of non-validity.

43. - Validity: a common assumption is that the data satisfy some mixing dependence condition, such as strong mixing.
- For data exhibiting strong dependence, there is no justification for the validity of the sieve-bootstrap.
- However, some results are given in Hall et al. (1998) for the subsampling algorithm. Requirement: $v_t = H(u_t)$, with $u_t$ Gaussian and $H(u) = u$ or $u^2 - E(u^2)$.
- The MBB requires that $H(u) = u$; see Lahiri (1993).

44. - When performing hypothesis testing, methods such as subsampling may have less power than bootstrap schemes whose (asymptotic) distribution is the same under both the null and the alternative hypotheses.

45. Bootstrap proposed:
(a) Valid for data which do not satisfy the standard mixing conditions, such as strong mixing.
(b) Similarities with the sieve-bootstrap. For example, it approximates the whole transfer function $\sum_{j=0}^\infty \vartheta_j e^{ij\lambda}$ instead of its modulus, as in, say, Theiler et al. (1992) or Hidalgo (2003). It also copes in a natural form with missing observations or unequally spaced data.

46. - The bootstrap data are not a subset of the original data.
- The bootstrap data, say $\{u_t^*\}_{t \ge 1}$, are covariance stationary in the sense that $\mathrm{Cov}^*(u_t^*, u_s^*)$ is a function of $t - s$.

47. We now describe the bootstrap algorithm. Suppose the interest is to bootstrap $\bar u = n^{-1} \sum_{t=1}^n u_t$. For a generic sequence $\{a_t\}_{t=1}^n$, let
$w_a(\lambda_j) = \frac{1}{n^{1/2}} \sum_{t=1}^n a_t e^{it\lambda_j}$.
Suppose that in C1 $\vartheta_k = I(k = 0)$, so $u_t = \sum_{k=0}^\infty b_k \varepsilon_{t-k}$. Now, we use the identity
$u_t = \frac{1}{n^{1/2}} \sum_{j=1}^n e^{-it\lambda_j}\, w_u(\lambda_j)$. (4.1)
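Identity (4.1) is exact over the full set of Fourier frequencies $\lambda_j = 2\pi j/n$. A small numerical check (a brute-force $O(n^2)$ transform, for illustration only; the function names are ours):

```python
import cmath
import math

def dft(a):
    """w_a(lambda_j) = n^{-1/2} sum_t a_t exp(i t lambda_j), lambda_j = 2 pi j / n."""
    n = len(a)
    return [sum(a[t - 1] * cmath.exp(1j * t * 2.0 * math.pi * j / n)
                for t in range(1, n + 1)) / math.sqrt(n) for j in range(1, n + 1)]

def idft(w):
    """Identity (4.1): u_t = n^{-1/2} sum_j exp(-i t lambda_j) w_u(lambda_j)."""
    n = len(w)
    return [sum(w[j - 1] * cmath.exp(-1j * t * 2.0 * math.pi * j / n)
                for j in range(1, n + 1)) / math.sqrt(n) for t in range(1, n + 1)]
```

Applying `idft` after `dft` recovers the original series, since $\sum_{j=1}^n e^{i(s-t)\lambda_j} = n$ if $s = t$ and $0$ otherwise.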

48. Using Bartlett's approximation $w_u(\lambda_j) \approx B(e^{i\lambda_j})\, w_\varepsilon(\lambda_j)$, we obtain
$u_t \approx \frac{1}{n^{1/2}} \sum_{j=1}^n e^{-it\lambda_j}\, B(e^{i\lambda_j})\, w_\varepsilon(\lambda_j)$.

49. Now suppose that
$u_t = (1 - L)^{-d} \sum_{k=0}^\infty b_k \varepsilon_{t-k}$, $b_0 = 1$.
The previous arguments might induce us to employ Bartlett's approximation together with
$w_u(\lambda_j) \approx (1 - e^{i\lambda_j})^{-d}\, B(e^{i\lambda_j})\, w_\varepsilon(\lambda_j)$, (4.2)
$u_t \approx \frac{1}{n^{1/2}} \sum_{j=1}^n e^{-it\lambda_j}\, (1 - e^{i\lambda_j})^{-d}\, B(e^{i\lambda_j})\, w_\varepsilon(\lambda_j)$. (4.3)

50. However, the lack of smoothness of $(1 - e^{i\lambda})^{-d}$ around $\lambda_j = 0$, together with the results given in Robinson's (1995a) Theorem 1 at frequencies $\lambda_j$ for fixed $j$, indicates that the approximation in (4.2) does not appear to be appropriate at those frequencies. Observe that these frequencies are precisely those which are most relevant when examining the asymptotic behaviour of $\bar u$, or of $\hat r_a(x)$, $\hat r_{a,+}(x)$ and $\hat r_{a,-}(x)$. Recall that the asymptotic variance of $\hat r_a(x)$ depends only on $f(\lambda)$ for $\lambda$ in a neighbourhood of zero. For that reason, we replace $(1 - e^{i\lambda_j})^{-d}$ by
$\tilde g^{1/2}(\lambda_j; d) = \Big| \sum_{\ell=-n+1}^{n-1} \gamma(\ell; d)\, e^{i\ell\lambda_j} \Big|^{1/2}$,

51. where
$\gamma(\ell; d) = \frac{(-1)^\ell\, \Gamma(1 - 2d)}{\Gamma(\ell - d + 1)\, \Gamma(1 - \ell - d)}$.
So $\tilde g(\lambda_j; d)$ can be regarded as a truncated version of $|1 - e^{i\lambda_j}|^{-2d}$.
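The coefficients $\gamma(\ell; d)$ are the autocovariances of a pure FARIMA$(0,d,0)$ process with unit innovation variance; they obey the recursion $\gamma(\ell) = \gamma(\ell-1)(\ell-1+d)/(\ell-d)$, which is numerically preferable to evaluating the gamma functions at large arguments. A sketch:

```python
import math

def gamma_arfima(d, L):
    """gamma(l; d) = (-1)^l Gamma(1 - 2d) / (Gamma(l - d + 1) Gamma(1 - l - d)),
    computed from gamma(0; d) = Gamma(1 - 2d) / Gamma(1 - d)^2 via the recursion
    gamma(l) = gamma(l - 1) * (l - 1 + d) / (l - d)."""
    g = [math.gamma(1.0 - 2.0 * d) / math.gamma(1.0 - d) ** 2]
    for l in range(1, L + 1):
        g.append(g[-1] * (l - 1 + d) / (l - d))
    return g
```

For example, $\gamma(1; d)/\gamma(0; d) = d/(1-d)$.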

52. Therefore, we modify (4.3) as
$u_t \approx \tilde u_t := \frac{1}{n^{1/2}} \sum_{j=1}^n e^{-it\lambda_j}\, \tilde g^{1/2}(\lambda_j; d)\, B(e^{i\lambda_j})\, w_\varepsilon(\lambda_j)$, (4.4)
which is to be used to obtain the bootstrap sample $u_1^*, \dots, u_n^*$. The right side of (4.4) preserves (asymptotically) the covariance structure of $u_t$. Thus, once we replace $d$ and $B(e^{i\lambda_j})$ on the right of (4.4) by consistent estimators, say $\hat d$ and $\hat B(e^{i\lambda_j})$, the problem of how to obtain a bootstrap sample $u_t^*$, $t = 1, \dots, n$, becomes that of how to perform a valid bootstrap algorithm for $w_\varepsilon(\lambda_j)$, $j = 1, \dots, n$.

53. The previous arguments suggest the following bootstrap algorithm for $T_r$ in (3.3), which is described in the following 7 steps.

54. STEP 1: Obtain the centred residuals $\hat u_t = \tilde u_t - n^{-1} \sum_{t=1}^n \tilde u_t$, $t = 1, \dots, n$, where
$\tilde u_t = y_t - \hat r_a(t)$, $t = 1, \dots, n$,
and $\hat r_a(t)$ is given in (2.1).
STEP 2: Let $(\check u_1^*, \dots, \check u_n^*)'$ be a random sample with replacement from the empirical distribution of the standardized residuals $\check u_t = \hat\sigma_{\hat u}^{-1} \hat u_t$, $\hat\sigma_{\hat u}^2 = n^{-1} \sum_{t=1}^n \hat u_t^2$, and obtain its discrete Fourier transform
$\eta_j := w_\varepsilon^*(\lambda_j) = \frac{1}{n^{1/2}} \sum_{t=1}^n \check u_t^* e^{it\lambda_j}$, $j = 1, \dots, n$.

55. STEP 3: Estimate $d$ by $\hat d$, say Robinson's (1995b) Gaussian semiparametric estimator (GSE), defined as
$\hat d = \arg\min_{d \in [0, \Delta_1]} \tilde R(d)$, (4.5)
$\tilde R(d) = \log\Big( \frac{1}{m} \sum_{j=1}^m \lambda_j^{2d}\, I_{\hat u \hat u}(\lambda_j) \Big) - \frac{2d}{m} \sum_{j=1}^m \log \lambda_j$,
where $0 < \Delta_1 < 1/2$, $m \in [1, \tilde n = [n/2])$ is an integer with $m^{-1} + m n^{-1} \to 0$, and $I_{\hat u \hat u}(\lambda) = |w_{\hat u}(\lambda)|^2 / (2\pi)$ is the periodogram of $\hat u_t$.
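The objective in (4.5) is easy to code. The sketch below minimizes it over a grid on $[0, 1/2)$; the grid search is our stand-in for the arg min, and the choice of $m$ is left open. On an idealized periodogram $I(\lambda_j) = \lambda_j^{-2d_0}$ (an assumption for the check, not data) the minimizer is exactly $d_0$:

```python
import math

def gse_objective(d, periodogram, lambdas):
    """R(d) in (4.5): log( m^{-1} sum_j lambda_j^{2d} I(lambda_j) )
                      - 2 d m^{-1} sum_j log lambda_j."""
    m = len(lambdas)
    avg = sum(lam ** (2.0 * d) * I for lam, I in zip(lambdas, periodogram)) / m
    return math.log(avg) - 2.0 * d * sum(math.log(lam) for lam in lambdas) / m

def gse_estimate(periodogram, lambdas, grid=500):
    """arg min of R(d) over the grid {0, 0.001, ..., 0.499} (a simple stand-in)."""
    return min((gse_objective(k * 0.5 / grid, periodogram, lambdas), k * 0.5 / grid)
               for k in range(grid))[1]
```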

56. Let our estimator of $\sigma_\varepsilon^2 h(\lambda)/2\pi$ be
$\hat h(\lambda) = \frac{1}{2m+1} \sum_{j=-m}^{m} |1 - e^{i(\lambda + \lambda_j)}|^{2\hat d}\, I_{\hat u \hat u}(\lambda + \lambda_j)$.
Remark 1. In general, $g(\lambda) \ne |1 - e^{i\lambda}|^{-2d}$. However, this is not a problem because, by C1 (see also (2.4)), $|1 - e^{i\lambda}|^{2d}\, g(\lambda)\, h(\lambda)$ satisfies the same conditions as $h(\lambda)$.

57. STEP 4: Consider $M = [n/4m]$ and compute
$u_t^* = \frac{1}{n^{1/2}} \sum_{j=1}^n e^{-it\lambda_j} \Big| \sum_{\ell=-n+1}^{n-1} \gamma(\ell; \hat d)\, e^{i\ell\lambda_j} \Big|^{1/2} \hat B(e^{i\lambda_j})\, \eta_j$, $t = 1, \dots, n$,
where
$\hat B(e^{i\lambda_j}) = 1 + \hat b_1 e^{i\lambda_j} + \dots + \hat b_M e^{iM\lambda_j}$ (4.6)
and
$\hat b_\ell = \frac{1}{n} \sum_{j=-\tilde n + 1}^{\tilde n} \hat A(e^{i\lambda_j})\, e^{i\ell\lambda_j}$, $\ell = 1, \dots, M$,

58. with
$\hat c_r = \frac{1}{\tilde n} \sum_{\ell=1}^{\tilde n} \log( \hat h(\lambda_\ell) ) \cos(r\lambda_\ell)$, $r = 0, \dots, M$, and $\hat A(\lambda) = \exp\Big( \sum_{r=1}^M \hat c_r e^{ir\lambda} \Big)$.
The estimator $\hat A(\lambda)$ of $A(\lambda)$ comes from the canonical spectral decomposition of $h(\lambda)$; see Brillinger (1981). Also, notice that $|\hat B(e^{i\lambda_j})|^2$ is an estimator of $h(\lambda_j)$.
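The coefficients $\hat c_r$ are Riemann-sum versions of the cosine (cepstral) coefficients of $\log h$. As a sanity check on a case where they are known in closed form: for an MA(1) spectrum $h(\lambda) = |1 + \theta e^{i\lambda}|^2$ one has $c_0 = 0$ and $c_r = (-1)^{r+1}\theta^r/r$ for $r \ge 1$ (this worked example is ours, not from the slides):

```python
import math

def cepstral_coeffs(log_spec, R, N):
    """c_r = N^{-1} sum_l log h(lambda_l) cos(r lambda_l), lambda_l = pi l / N,
    approximating pi^{-1} times the integral of log h(lambda) cos(r lambda) on [0, pi]."""
    lams = [math.pi * l / N for l in range(1, N + 1)]
    return [sum(log_spec(lam) * math.cos(r * lam) for lam in lams) / N
            for r in range(R + 1)]
```

With $\theta = 0.5$, so $\log h(\lambda) = \log(1.25 + \cos\lambda)$, this gives $c_1 \approx 0.5$ and $c_2 \approx -0.125$.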

59. STEP 5: Compute
$\tilde r_e(t) = \hat r_{e,+}(t)$ for $t = 1, \dots, \check n$; $\tilde r_e(t) = \hat r_e(t)$ for $t = \check n + 1, \dots, n - \check n - 1$; $\tilde r_e(t) = \hat r_{e,-}(t)$ for $t = n - \check n, \dots, n$;
and
$y_t^* = \tilde r_e(t) + (2\pi)^{1/2} \exp(\tfrac{1}{2}\hat c_0)\, u_t^*$, $t = 1, \dots, n$,
where $\hat r_e(t)$, $\hat r_{e,+}(t)$ and $\hat r_{e,-}(t)$ are defined as in (2.1) and (2.2) but with a bandwidth parameter $e$ such that $e \to 0$ and $a = o(e)$. Note that, by the well-known Kolmogorov formula, $2\pi \exp(\hat c_0)$ is an estimator of the variance of $\varepsilon_t$, that is, of the one-step mean square prediction error. Moreover, the requirement that $a = o(e)$ is typical when bootstrapping nonparametric regression

60. functions; see for instance Härdle and Marron (1991). Basically, this is needed to make sure that the bias of the bootstrap nonparametric estimator converges to zero in probability. Recall that, in our setup, the bias of the nonparametric estimator of $r_+(\cdot) - r_-(\cdot)$ converges to zero.
STEP 6: Compute $\hat r_a^*(q)$, $q = \check n + 1, \dots, n - \check n - 1$, as in (2.1) but with $y_t$ replaced by $y_t^*$ and with the same bandwidth parameter $a$ employed in STEP 1 to compute $\hat r_a(t)$.
Our final step is:
STEP 7: Compute the bootstrap version of $T_r$ as
$T_r^* = \sup_{\check n < q < n - \check n} | \hat r_a^*(q) - \hat r_a(q) |$. (4.7)

61. Conditions on $m$ and the bandwidth parameters $a$ and $e$:
C6. As $n \to \infty$, (i) $(ne)^{-1} \to 0$, (ii) $n^{1/2-d}\, e^{5/2-d} \to D < \infty$ (where we have taken $\tau = 2$ in C2), (iii) $D^{-1} n^{-1/3} < a < D n^{-1/4}$ and (iv) $D^{-1} n^{3/5} < m < D n^{3/4}$.
Observe that C6 implies that $a = o(e)$. However, the bandwidth $e$ satisfies the same conditions as the generic bandwidth $a$ in C5.

62. We modify Condition C1 to:
C1'. C1 holds, where the coefficients $\vartheta_k$ are given by $\vartheta_k = \Gamma(k+d)/(\Gamma(d)\Gamma(k+1))$.
The modification of C1 to C1' implies that the function $g(\lambda)$ in (2.4) is $|1 - e^{i\lambda}|^{-2d}$.

63. As was mentioned after STEP 5, the motivation for using a pilot bandwidth parameter $e$ such that $a = o(e)$ is to provide a correct adjustment of the bias and, as the next proposition indicates, this is the case in our context. To that end, recall that in the proof of Proposition 3.3 we obtain that
$\varphi_a(q) = E(\hat r_a(q)) - r(q) = o(\check n^{d - 1/2})$,
whereas the bootstrap bias constructed from the resampled data is
$\varphi_a^*(q) = E^*(\hat r_a^*(q)) - \hat r_a(q)$.

64. Proposition 10. Assuming C1', C2, C3 (with $\tau = 2$ there) and C5, under $H_0 \cup H_1$, as $n \to \infty$, $\varphi_a^*(q) = o_p(\check n^{d - 1/2})$.
Proposition 11. Assuming C1', C2, C3 (with $\tau = 2$ there) and C5, we have that, for $0 \le q < n - \check n$,
$\frac{1}{\check n^{2d}} \sum_{v=q+1}^{\check n + q} \big( E^*( u_t^* u_{t+v}^* ) - \delta_v \big) = o_p(1)$. (4.8)
Theorem 12. Under the same conditions as Proposition 4.2 and under $H_0 \cup H_1$, as $n \to \infty$,
$\Pr^*\{ \upsilon_n ( \check n^{1/2 - \hat d}\, \rho^{-1/2}(0; \hat d)\, T_r^* - \xi_n ) \le x \} \to_P \exp(-2e^{-x})$, for $x > 0$,
where $\upsilon_n$ and $\xi_n$ were defined in Theorem 3.4.

65. We now comment on how to obtain the bootstrap analogue of $T_r^b$ given in (3.6), that is, when the interest is in testing (3.5).
STEP 1: Obtain the centred residuals $\hat u_t = \tilde u_t - n^{-1} \sum_{t=1}^n \tilde u_t$, $t = 1, \dots, n$, where
$\tilde u_t = y_t - \hat r_{a,+}(t)$ for $t = 1, \dots, \check n$; $\tilde u_t = y_t - \frac{1}{2}( \hat r_{a,+}(t) + \hat r_{a,-}(t) )$ for $t = \check n + 1, \dots, n - \check n - 1$; $\tilde u_t = y_t - \hat r_{a,-}(t)$ for $t = n - \check n, \dots, n$;
and $\hat r_{a,+}(t)$ and $\hat r_{a,-}(t)$ are given in (2.2).
STEPS 2 TO 4: As Steps 2 to 4 above.

66. STEP 5: As in STEP 5 above, but with $\tilde r_e(t)$ replaced by $\frac{1}{2}( \hat r_{e,+}(t) + \hat r_{e,-}(t) )$ for $t = \check n + 1, \dots, n - \check n - 1$, and
$y_t^* = \tilde r_e(t) + (2\pi)^{1/2} \exp(\hat c_0 / 2)\, u_t^*$, $t = 1, \dots, n$,
where $\hat r_{e,+}(t)$ and $\hat r_{e,-}(t)$ are defined as in (2.2) but with a bandwidth parameter $e$ such that $e \to 0$ and $a = o(e)$.
STEP 6: Compute $\hat r_{a,+}^*(q)$ and $\hat r_{a,-}^*(q)$, $q = \check n + 1, \dots, n - \check n - 1$, as in (2.2) but with $y_t$ replaced by $y_t^*$ and the same bandwidth parameter $a$ employed in STEP 1 to compute $\hat r_{a,+}(t)$ and $\hat r_{a,-}(t)$.

67. STEP 7: Compute the bootstrap version of $T_r^b$ as
$T_r^{b*} = \sup_{\check n < q < n - \check n} | \hat r_{a,+}^*(q) - \hat r_{a,-}^*(q) |$.

68. Denote the bootstrap bias $\varphi_a^*(q) = E^*( \hat r_{a,+}^*(q) - \hat r_{a,-}^*(q) )$.
Proposition 13. Assuming C1', C2 (with $\tau = 2$ there), C4 and C6, under $H_0 \cup H_1$, as $n \to \infty$, $\varphi_a^*(q) = o_p(\check n^{d - 1/2})$.
Theorem 14. Under the same conditions as Proposition 4.2 and under $H_0 \cup H_1$, as $n \to \infty$,
$\Pr^*\{ \upsilon_n ( \check n^{1/2 - \hat d}\, \tilde\rho^{-1/2}(0; \hat d)\, T_r^{b*} - \xi_n ) \le x \} \to_P \exp(-2e^{-x})$, for $x > 0$,
where $\upsilon_n$ and $\xi_n$ were defined in Theorem 3.4.

69. Remark 2. To obtain a critical value $x(\beta)$ for which $\exp(-2e^{-x(\beta)}) = 1 - \beta$ is the same as to find the value, say $z_n^*$, such that
$\lim_{n\to\infty} \Pr^*\{ T_r^* \le z_n^* \} = 1 - \beta$.
- Now, $z_n^*$ depends on $\xi_n$ and $\upsilon_n$, and thus on the choice of the bandwidth parameter $a$.
- In fact, since $\xi_n$ and $\upsilon_n$ cannot be computed, in practice we would compute $z_n^*$ directly.
- The latter implies that, for the bootstrap to be valid, we need $z_n^*$ to satisfy $z_n - z_n^* \to_p 0$, where $z_n$ is such that $\lim_{n\to\infty} \Pr\{ T_r \le z_n \} = 1 - \beta$, i.e. the corresponding value with the original data.
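In practice, $z_n^*$ is taken as the empirical $(1-\beta)$ quantile of the bootstrap replicates $T_r^{*(1)}, \dots, T_r^{*(B)}$; a minimal sketch (the quantile convention is one common choice):

```python
import math

def bootstrap_critical_value(t_star, beta):
    """z_n^*: the ceil(B (1 - beta))-th order statistic of the bootstrap replicates."""
    s = sorted(t_star)
    k = math.ceil((1.0 - beta) * len(s)) - 1  # 0-based index of that order statistic
    return s[k]
```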

70. - But, for $z_n - z_n^* \to_p 0$ to hold true, it is obvious that we need the normalization constants $\xi_n$ and $\upsilon_n$ to be the same for both $T_r$ and $T_r^*$, or at least their difference to converge to zero.
- This is obviously possible only if the bandwidth parameter $a$ is the same when estimating the regression function with both the original data $y_t$ and the bootstrap data $y_t^*$.

Related paper: TESTING FOR BREAKS IN REGRESSION MODELS WITH DEPENDENT DATA, by J. Hidalgo and V. Dalla. The paper examines a test for smoothness/breaks in a nonparametric regression model with dependent data.


More information

Model Specification Testing in Nonparametric and Semiparametric Time Series Econometrics. Jiti Gao

Model Specification Testing in Nonparametric and Semiparametric Time Series Econometrics. Jiti Gao Model Specification Testing in Nonparametric and Semiparametric Time Series Econometrics Jiti Gao Department of Statistics School of Mathematics and Statistics The University of Western Australia Crawley

More information

Statistical inference on Lévy processes

Statistical inference on Lévy processes Alberto Coca Cabrero University of Cambridge - CCA Supervisors: Dr. Richard Nickl and Professor L.C.G.Rogers Funded by Fundación Mutua Madrileña and EPSRC MASDOC/CCA student workshop 2013 26th March Outline

More information

Goodness-of-fit tests for the cure rate in a mixture cure model

Goodness-of-fit tests for the cure rate in a mixture cure model Biometrika (217), 13, 1, pp. 1 7 Printed in Great Britain Advance Access publication on 31 July 216 Goodness-of-fit tests for the cure rate in a mixture cure model BY U.U. MÜLLER Department of Statistics,

More information

Time series models in the Frequency domain. The power spectrum, Spectral analysis

Time series models in the Frequency domain. The power spectrum, Spectral analysis ime series models in the Frequency domain he power spectrum, Spectral analysis Relationship between the periodogram and the autocorrelations = + = ( ) ( ˆ α ˆ ) β I Yt cos t + Yt sin t t= t= ( ( ) ) cosλ

More information

SMOOTHED BLOCK EMPIRICAL LIKELIHOOD FOR QUANTILES OF WEAKLY DEPENDENT PROCESSES

SMOOTHED BLOCK EMPIRICAL LIKELIHOOD FOR QUANTILES OF WEAKLY DEPENDENT PROCESSES Statistica Sinica 19 (2009), 71-81 SMOOTHED BLOCK EMPIRICAL LIKELIHOOD FOR QUANTILES OF WEAKLY DEPENDENT PROCESSES Song Xi Chen 1,2 and Chiu Min Wong 3 1 Iowa State University, 2 Peking University and

More information

A New Test in Parametric Linear Models with Nonparametric Autoregressive Errors

A New Test in Parametric Linear Models with Nonparametric Autoregressive Errors A New Test in Parametric Linear Models with Nonparametric Autoregressive Errors By Jiti Gao 1 and Maxwell King The University of Western Australia and Monash University Abstract: This paper considers a

More information

Nonparametric regression with martingale increment errors

Nonparametric regression with martingale increment errors S. Gaïffas (LSTA - Paris 6) joint work with S. Delattre (LPMA - Paris 7) work in progress Motivations Some facts: Theoretical study of statistical algorithms requires stationary and ergodicity. Concentration

More information

Political Science 236 Hypothesis Testing: Review and Bootstrapping

Political Science 236 Hypothesis Testing: Review and Bootstrapping Political Science 236 Hypothesis Testing: Review and Bootstrapping Rocío Titiunik Fall 2007 1 Hypothesis Testing Definition 1.1 Hypothesis. A hypothesis is a statement about a population parameter The

More information

Introduction to Empirical Processes and Semiparametric Inference Lecture 02: Overview Continued

Introduction to Empirical Processes and Semiparametric Inference Lecture 02: Overview Continued Introduction to Empirical Processes and Semiparametric Inference Lecture 02: Overview Continued Michael R. Kosorok, Ph.D. Professor and Chair of Biostatistics Professor of Statistics and Operations Research

More information

Small ball probabilities and metric entropy

Small ball probabilities and metric entropy Small ball probabilities and metric entropy Frank Aurzada, TU Berlin Sydney, February 2012 MCQMC Outline 1 Small ball probabilities vs. metric entropy 2 Connection to other questions 3 Recent results for

More information

TFT-Bootstrap: Resampling time series in the frequency domain to obtain replicates in the time domain

TFT-Bootstrap: Resampling time series in the frequency domain to obtain replicates in the time domain F-Bootstrap: Resampling time series in the frequency domain to obtain replicates in the time domain Claudia Kirch and Dimitris N. Politis December 23, 200 Abstract A new time series bootstrap scheme, the

More information

Negative Association, Ordering and Convergence of Resampling Methods

Negative Association, Ordering and Convergence of Resampling Methods Negative Association, Ordering and Convergence of Resampling Methods Nicolas Chopin ENSAE, Paristech (Joint work with Mathieu Gerber and Nick Whiteley, University of Bristol) Resampling schemes: Informal

More information

SF2943: TIME SERIES ANALYSIS COMMENTS ON SPECTRAL DENSITIES

SF2943: TIME SERIES ANALYSIS COMMENTS ON SPECTRAL DENSITIES SF2943: TIME SERIES ANALYSIS COMMENTS ON SPECTRAL DENSITIES This document is meant as a complement to Chapter 4 in the textbook, the aim being to get a basic understanding of spectral densities through

More information

STAT 461/561- Assignments, Year 2015

STAT 461/561- Assignments, Year 2015 STAT 461/561- Assignments, Year 2015 This is the second set of assignment problems. When you hand in any problem, include the problem itself and its number. pdf are welcome. If so, use large fonts and

More information

Maximum Non-extensive Entropy Block Bootstrap

Maximum Non-extensive Entropy Block Bootstrap Overview&Motivation MEB Simulation of the MnEBB Conclusions References Maximum Non-extensive Entropy Block Bootstrap Jan Novotny CEA, Cass Business School & CERGE-EI (with Michele Bergamelli & Giovanni

More information

OPTIMAL POINTWISE ADAPTIVE METHODS IN NONPARAMETRIC ESTIMATION 1

OPTIMAL POINTWISE ADAPTIVE METHODS IN NONPARAMETRIC ESTIMATION 1 The Annals of Statistics 1997, Vol. 25, No. 6, 2512 2546 OPTIMAL POINTWISE ADAPTIVE METHODS IN NONPARAMETRIC ESTIMATION 1 By O. V. Lepski and V. G. Spokoiny Humboldt University and Weierstrass Institute

More information

A Note on Data-Adaptive Bandwidth Selection for Sequential Kernel Smoothers

A Note on Data-Adaptive Bandwidth Selection for Sequential Kernel Smoothers 6th St.Petersburg Workshop on Simulation (2009) 1-3 A Note on Data-Adaptive Bandwidth Selection for Sequential Kernel Smoothers Ansgar Steland 1 Abstract Sequential kernel smoothers form a class of procedures

More information

Nonparametric Indepedence Tests: Space Partitioning and Kernel Approaches

Nonparametric Indepedence Tests: Space Partitioning and Kernel Approaches Nonparametric Indepedence Tests: Space Partitioning and Kernel Approaches Arthur Gretton 1 and László Györfi 2 1. Gatsby Computational Neuroscience Unit London, UK 2. Budapest University of Technology

More information

Adjusted Empirical Likelihood for Long-memory Time Series Models

Adjusted Empirical Likelihood for Long-memory Time Series Models Adjusted Empirical Likelihood for Long-memory Time Series Models arxiv:1604.06170v1 [stat.me] 21 Apr 2016 Ramadha D. Piyadi Gamage, Wei Ning and Arjun K. Gupta Department of Mathematics and Statistics

More information

On variable bandwidth kernel density estimation

On variable bandwidth kernel density estimation JSM 04 - Section on Nonparametric Statistics On variable bandwidth kernel density estimation Janet Nakarmi Hailin Sang Abstract In this paper we study the ideal variable bandwidth kernel estimator introduced

More information

Nonparametric Inference via Bootstrapping the Debiased Estimator

Nonparametric Inference via Bootstrapping the Debiased Estimator Nonparametric Inference via Bootstrapping the Debiased Estimator Yen-Chi Chen Department of Statistics, University of Washington ICSA-Canada Chapter Symposium 2017 1 / 21 Problem Setup Let X 1,, X n be

More information

AN ALTERNATIVE BOOTSTRAP TO MOVING BLOCKS FOR TIME SERIES REGRESSION MODELS *

AN ALTERNATIVE BOOTSTRAP TO MOVING BLOCKS FOR TIME SERIES REGRESSION MODELS * AN ALERNAIVE BOOSRAP O MOVING BLOCKS FOR IME SERIES REGRESSION MODELS * by Javier Hidalgo London School of Economics and Political Science Contents: Abstract 1. Introduction 2. Conditions and Main Results

More information

Nonparametric Inference In Functional Data

Nonparametric Inference In Functional Data Nonparametric Inference In Functional Data Zuofeng Shang Purdue University Joint work with Guang Cheng from Purdue Univ. An Example Consider the functional linear model: Y = α + where 1 0 X(t)β(t)dt +

More information

University of California San Diego and Stanford University and

University of California San Diego and Stanford University and First International Workshop on Functional and Operatorial Statistics. Toulouse, June 19-21, 2008 K-sample Subsampling Dimitris N. olitis andjoseph.romano University of California San Diego and Stanford

More information

GARCH Models Estimation and Inference. Eduardo Rossi University of Pavia

GARCH Models Estimation and Inference. Eduardo Rossi University of Pavia GARCH Models Estimation and Inference Eduardo Rossi University of Pavia Likelihood function The procedure most often used in estimating θ 0 in ARCH models involves the maximization of a likelihood function

More information

ON THE RANGE OF VALIDITY OF THE AUTOREGRESSIVE SIEVE BOOTSTRAP

ON THE RANGE OF VALIDITY OF THE AUTOREGRESSIVE SIEVE BOOTSTRAP ON THE RANGE OF VALIDITY OF THE AUTOREGRESSIVE SIEVE BOOTSTRAP JENS-PETER KREISS, EFSTATHIOS PAPARODITIS, AND DIMITRIS N. POLITIS Abstract. We explore the limits of the autoregressive (AR) sieve bootstrap,

More information

11. Bootstrap Methods

11. Bootstrap Methods 11. Bootstrap Methods c A. Colin Cameron & Pravin K. Trivedi 2006 These transparencies were prepared in 20043. They can be used as an adjunct to Chapter 11 of our subsequent book Microeconometrics: Methods

More information

Computation Of Asymptotic Distribution. For Semiparametric GMM Estimators. Hidehiko Ichimura. Graduate School of Public Policy

Computation Of Asymptotic Distribution. For Semiparametric GMM Estimators. Hidehiko Ichimura. Graduate School of Public Policy Computation Of Asymptotic Distribution For Semiparametric GMM Estimators Hidehiko Ichimura Graduate School of Public Policy and Graduate School of Economics University of Tokyo A Conference in honor of

More information

Monte-Carlo MMD-MA, Université Paris-Dauphine. Xiaolu Tan

Monte-Carlo MMD-MA, Université Paris-Dauphine. Xiaolu Tan Monte-Carlo MMD-MA, Université Paris-Dauphine Xiaolu Tan tan@ceremade.dauphine.fr Septembre 2015 Contents 1 Introduction 1 1.1 The principle.................................. 1 1.2 The error analysis

More information

Massachusetts Institute of Technology Department of Economics Time Series Lecture 6: Additional Results for VAR s

Massachusetts Institute of Technology Department of Economics Time Series Lecture 6: Additional Results for VAR s Massachusetts Institute of Technology Department of Economics Time Series 14.384 Guido Kuersteiner Lecture 6: Additional Results for VAR s 6.1. Confidence Intervals for Impulse Response Functions There

More information

Estimation and Inference of Quantile Regression. for Survival Data under Biased Sampling

Estimation and Inference of Quantile Regression. for Survival Data under Biased Sampling Estimation and Inference of Quantile Regression for Survival Data under Biased Sampling Supplementary Materials: Proofs of the Main Results S1 Verification of the weight function v i (t) for the lengthbiased

More information

Single Index Quantile Regression for Heteroscedastic Data

Single Index Quantile Regression for Heteroscedastic Data Single Index Quantile Regression for Heteroscedastic Data E. Christou M. G. Akritas Department of Statistics The Pennsylvania State University SMAC, November 6, 2015 E. Christou, M. G. Akritas (PSU) SIQR

More information

Supplement to Quantile-Based Nonparametric Inference for First-Price Auctions

Supplement to Quantile-Based Nonparametric Inference for First-Price Auctions Supplement to Quantile-Based Nonparametric Inference for First-Price Auctions Vadim Marmer University of British Columbia Artyom Shneyerov CIRANO, CIREQ, and Concordia University August 30, 2010 Abstract

More information

Long memory and changing persistence

Long memory and changing persistence Long memory and changing persistence Robinson Kruse and Philipp Sibbertsen August 010 Abstract We study the empirical behaviour of semi-parametric log-periodogram estimation for long memory models when

More information

LECTURE 10 LINEAR PROCESSES II: SPECTRAL DENSITY, LAG OPERATOR, ARMA. In this lecture, we continue to discuss covariance stationary processes.

LECTURE 10 LINEAR PROCESSES II: SPECTRAL DENSITY, LAG OPERATOR, ARMA. In this lecture, we continue to discuss covariance stationary processes. MAY, 0 LECTURE 0 LINEAR PROCESSES II: SPECTRAL DENSITY, LAG OPERATOR, ARMA In this lecture, we continue to discuss covariance stationary processes. Spectral density Gourieroux and Monfort 990), Ch. 5;

More information

Nonparametric Function Estimation with Infinite-Order Kernels

Nonparametric Function Estimation with Infinite-Order Kernels Nonparametric Function Estimation with Infinite-Order Kernels Arthur Berg Department of Statistics, University of Florida March 15, 2008 Kernel Density Estimation (IID Case) Let X 1,..., X n iid density

More information

Brownian Motion. 1 Definition Brownian Motion Wiener measure... 3

Brownian Motion. 1 Definition Brownian Motion Wiener measure... 3 Brownian Motion Contents 1 Definition 2 1.1 Brownian Motion................................. 2 1.2 Wiener measure.................................. 3 2 Construction 4 2.1 Gaussian process.................................

More information

F9 F10: Autocorrelation

F9 F10: Autocorrelation F9 F10: Autocorrelation Feng Li Department of Statistics, Stockholm University Introduction In the classic regression model we assume cov(u i, u j x i, x k ) = E(u i, u j ) = 0 What if we break the assumption?

More information

Asymptotic Statistics-VI. Changliang Zou

Asymptotic Statistics-VI. Changliang Zou Asymptotic Statistics-VI Changliang Zou Kolmogorov-Smirnov distance Example (Kolmogorov-Smirnov confidence intervals) We know given α (0, 1), there is a well-defined d = d α,n such that, for any continuous

More information

Understanding Regressions with Observations Collected at High Frequency over Long Span

Understanding Regressions with Observations Collected at High Frequency over Long Span Understanding Regressions with Observations Collected at High Frequency over Long Span Yoosoon Chang Department of Economics, Indiana University Joon Y. Park Department of Economics, Indiana University

More information

Least Squares Estimation-Finite-Sample Properties

Least Squares Estimation-Finite-Sample Properties Least Squares Estimation-Finite-Sample Properties Ping Yu School of Economics and Finance The University of Hong Kong Ping Yu (HKU) Finite-Sample 1 / 29 Terminology and Assumptions 1 Terminology and Assumptions

More information

Inverse problems in statistics

Inverse problems in statistics Inverse problems in statistics Laurent Cavalier (Université Aix-Marseille 1, France) Yale, May 2 2011 p. 1/35 Introduction There exist many fields where inverse problems appear Astronomy (Hubble satellite).

More information

Advanced Statistics II: Non Parametric Tests

Advanced Statistics II: Non Parametric Tests Advanced Statistics II: Non Parametric Tests Aurélien Garivier ParisTech February 27, 2011 Outline Fitting a distribution Rank Tests for the comparison of two samples Two unrelated samples: Mann-Whitney

More information

Stability of optimization problems with stochastic dominance constraints

Stability of optimization problems with stochastic dominance constraints Stability of optimization problems with stochastic dominance constraints D. Dentcheva and W. Römisch Stevens Institute of Technology, Hoboken Humboldt-University Berlin www.math.hu-berlin.de/~romisch SIAM

More information

BOOTSTRAPPING DIFFERENCES-IN-DIFFERENCES ESTIMATES

BOOTSTRAPPING DIFFERENCES-IN-DIFFERENCES ESTIMATES BOOTSTRAPPING DIFFERENCES-IN-DIFFERENCES ESTIMATES Bertrand Hounkannounon Université de Montréal, CIREQ December 2011 Abstract This paper re-examines the analysis of differences-in-differences estimators

More information

SYSM 6303: Quantitative Introduction to Risk and Uncertainty in Business Lecture 4: Fitting Data to Distributions

SYSM 6303: Quantitative Introduction to Risk and Uncertainty in Business Lecture 4: Fitting Data to Distributions SYSM 6303: Quantitative Introduction to Risk and Uncertainty in Business Lecture 4: Fitting Data to Distributions M. Vidyasagar Cecil & Ida Green Chair The University of Texas at Dallas Email: M.Vidyasagar@utdallas.edu

More information

On the Uniform Asymptotic Validity of Subsampling and the Bootstrap

On the Uniform Asymptotic Validity of Subsampling and the Bootstrap On the Uniform Asymptotic Validity of Subsampling and the Bootstrap Joseph P. Romano Departments of Economics and Statistics Stanford University romano@stanford.edu Azeem M. Shaikh Department of Economics

More information

Lecture 13: Subsampling vs Bootstrap. Dimitris N. Politis, Joseph P. Romano, Michael Wolf

Lecture 13: Subsampling vs Bootstrap. Dimitris N. Politis, Joseph P. Romano, Michael Wolf Lecture 13: 2011 Bootstrap ) R n x n, θ P)) = τ n ˆθn θ P) Example: ˆθn = X n, τ n = n, θ = EX = µ P) ˆθ = min X n, τ n = n, θ P) = sup{x : F x) 0} ) Define: J n P), the distribution of τ n ˆθ n θ P) under

More information

Can we do statistical inference in a non-asymptotic way? 1

Can we do statistical inference in a non-asymptotic way? 1 Can we do statistical inference in a non-asymptotic way? 1 Guang Cheng 2 Statistics@Purdue www.science.purdue.edu/bigdata/ ONR Review Meeting@Duke Oct 11, 2017 1 Acknowledge NSF, ONR and Simons Foundation.

More information

ECO Class 6 Nonparametric Econometrics

ECO Class 6 Nonparametric Econometrics ECO 523 - Class 6 Nonparametric Econometrics Carolina Caetano Contents 1 Nonparametric instrumental variable regression 1 2 Nonparametric Estimation of Average Treatment Effects 3 2.1 Asymptotic results................................

More information

where x i and u i are iid N (0; 1) random variates and are mutually independent, ff =0; and fi =1. ff(x i )=fl0 + fl1x i with fl0 =1. We examine the e

where x i and u i are iid N (0; 1) random variates and are mutually independent, ff =0; and fi =1. ff(x i )=fl0 + fl1x i with fl0 =1. We examine the e Inference on the Quantile Regression Process Electronic Appendix Roger Koenker and Zhijie Xiao 1 Asymptotic Critical Values Like many other Kolmogorov-Smirnov type tests (see, e.g. Andrews (1993)), the

More information

Learning Theory. Ingo Steinwart University of Stuttgart. September 4, 2013

Learning Theory. Ingo Steinwart University of Stuttgart. September 4, 2013 Learning Theory Ingo Steinwart University of Stuttgart September 4, 2013 Ingo Steinwart University of Stuttgart () Learning Theory September 4, 2013 1 / 62 Basics Informal Introduction Informal Description

More information

Detection of structural breaks in multivariate time series

Detection of structural breaks in multivariate time series Detection of structural breaks in multivariate time series Holger Dette, Ruhr-Universität Bochum Philip Preuß, Ruhr-Universität Bochum Ruprecht Puchstein, Ruhr-Universität Bochum January 14, 2014 Outline

More information

Bootstrapping high dimensional vector: interplay between dependence and dimensionality

Bootstrapping high dimensional vector: interplay between dependence and dimensionality Bootstrapping high dimensional vector: interplay between dependence and dimensionality Xianyang Zhang Joint work with Guang Cheng University of Missouri-Columbia LDHD: Transition Workshop, 2014 Xianyang

More information

Testing Statistical Hypotheses

Testing Statistical Hypotheses E.L. Lehmann Joseph P. Romano Testing Statistical Hypotheses Third Edition 4y Springer Preface vii I Small-Sample Theory 1 1 The General Decision Problem 3 1.1 Statistical Inference and Statistical Decisions

More information

Stochastic Convergence, Delta Method & Moment Estimators

Stochastic Convergence, Delta Method & Moment Estimators Stochastic Convergence, Delta Method & Moment Estimators Seminar on Asymptotic Statistics Daniel Hoffmann University of Kaiserslautern Department of Mathematics February 13, 2015 Daniel Hoffmann (TU KL)

More information

Estimation of the functional Weibull-tail coefficient

Estimation of the functional Weibull-tail coefficient 1/ 29 Estimation of the functional Weibull-tail coefficient Stéphane Girard Inria Grenoble Rhône-Alpes & LJK, France http://mistis.inrialpes.fr/people/girard/ June 2016 joint work with Laurent Gardes,

More information

Economics 536 Lecture 7. Introduction to Specification Testing in Dynamic Econometric Models

Economics 536 Lecture 7. Introduction to Specification Testing in Dynamic Econometric Models University of Illinois Fall 2016 Department of Economics Roger Koenker Economics 536 Lecture 7 Introduction to Specification Testing in Dynamic Econometric Models In this lecture I want to briefly describe

More information

Weak convergence of dependent empirical measures with application to subsampling in function spaces

Weak convergence of dependent empirical measures with application to subsampling in function spaces Journal of Statistical Planning and Inference 79 (1999) 179 190 www.elsevier.com/locate/jspi Weak convergence of dependent empirical measures with application to subsampling in function spaces Dimitris

More information

6.435, System Identification

6.435, System Identification System Identification 6.435 SET 3 Nonparametric Identification Munther A. Dahleh 1 Nonparametric Methods for System ID Time domain methods Impulse response Step response Correlation analysis / time Frequency

More information

A Primer on Asymptotics

A Primer on Asymptotics A Primer on Asymptotics Eric Zivot Department of Economics University of Washington September 30, 2003 Revised: October 7, 2009 Introduction The two main concepts in asymptotic theory covered in these

More information

Change detection problems in branching processes

Change detection problems in branching processes Change detection problems in branching processes Outline of Ph.D. thesis by Tamás T. Szabó Thesis advisor: Professor Gyula Pap Doctoral School of Mathematics and Computer Science Bolyai Institute, University

More information

Brownian motion. Samy Tindel. Purdue University. Probability Theory 2 - MA 539

Brownian motion. Samy Tindel. Purdue University. Probability Theory 2 - MA 539 Brownian motion Samy Tindel Purdue University Probability Theory 2 - MA 539 Mostly taken from Brownian Motion and Stochastic Calculus by I. Karatzas and S. Shreve Samy T. Brownian motion Probability Theory

More information

Evgeny Spodarev WIAS, Berlin. Limit theorems for excursion sets of stationary random fields

Evgeny Spodarev WIAS, Berlin. Limit theorems for excursion sets of stationary random fields Evgeny Spodarev 23.01.2013 WIAS, Berlin Limit theorems for excursion sets of stationary random fields page 2 LT for excursion sets of stationary random fields Overview 23.01.2013 Overview Motivation Excursion

More information

EMPIRICAL EDGEWORTH EXPANSION FOR FINITE POPULATION STATISTICS. I. M. Bloznelis. April Introduction

EMPIRICAL EDGEWORTH EXPANSION FOR FINITE POPULATION STATISTICS. I. M. Bloznelis. April Introduction EMPIRICAL EDGEWORTH EXPANSION FOR FINITE POPULATION STATISTICS. I M. Bloznelis April 2000 Abstract. For symmetric asymptotically linear statistics based on simple random samples, we construct the one-term

More information

HAR Inference: Recommendations for Practice

HAR Inference: Recommendations for Practice HAR Inference: Recommendations for Practice Eben Lazarus, Harvard Daniel Lewis, Harvard Mark Watson, Princeton & NBER James H. Stock, Harvard & NBER JBES Invited Session Sunday, Jan. 7, 2018, 1-3 pm Marriott

More information